26-person startup Arcee released Trinity Large Thinking, a 400B-parameter open-weight reasoning model under Apache 2.0, claiming it's the most capable open-weight model from a non-Chinese company.
Arcee, a 26-person U.S. startup, released Trinity Large Thinking — a 400B-parameter open-weight reasoning model built on a $20M budget. The model is available for download, self-hosting, or via API, and is licensed under Apache 2.0 with no usage restrictions. Arcee's CEO claims it's the most capable open-weight reasoning model released by a non-Chinese company. The release coincides with growing enterprise anxiety around Chinese models (DeepSeek) and closed-source dependency risks highlighted by Anthropic's recent OpenClaw policy change.
Trinity Large Thinking is a drop-in alternative to closed-source reasoning models with zero licensing restrictions: no usage caps, no API rate limits dictated by someone else's business decisions, and no sudden policy pivots like the one Anthropic just imposed on OpenClaw users. The Apache 2.0 license means you can modify, redistribute, and commercialize without legal risk. At 400B parameters with reasoning capabilities, it competes directly with the models your stack currently pays per-token for.
Pull Trinity Large Thinking via the OpenRouter API this week and run it against your existing Claude or GPT-4o reasoning benchmark. Measure cost-per-call and latency against your current setup to quantify the switch cost.
Go to openrouter.ai and create a free account, then grab your API key from the dashboard
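A minimal benchmarking harness for the comparison above might look like the sketch below. The model slugs and per-million-token prices are placeholders, not OpenRouter's actual listings (check openrouter.ai/models for real rates), and `call_fn` is a hypothetical hook you would replace with a real API request that returns the token counts from the response.

```python
import time

# Assumed placeholder prices in USD per million tokens; substitute the
# real rates from the OpenRouter model listing before drawing conclusions.
PRICES = {
    "trinity-large-thinking": {"in": 0.60, "out": 2.40},
    "current-model": {"in": 3.00, "out": 15.00},
}

def cost_usd(model, tokens_in, tokens_out):
    """Estimate the cost of one call from token counts and per-million prices."""
    p = PRICES[model]
    return (tokens_in * p["in"] + tokens_out * p["out"]) / 1_000_000

def benchmark(call_fn, prompts):
    """Time call_fn over a prompt set and tally estimated spend.

    call_fn(prompt) must issue one completion request and return
    (tokens_in, tokens_out, model_name) taken from the API response.
    Returns (mean latency in seconds, total estimated cost in USD).
    """
    latencies, total_cost = [], 0.0
    for prompt in prompts:
        start = time.perf_counter()
        tokens_in, tokens_out, model = call_fn(prompt)
        latencies.append(time.perf_counter() - start)
        total_cost += cost_usd(model, tokens_in, tokens_out)
    return sum(latencies) / len(latencies), total_cost
```

Run it once with `call_fn` pointed at your current model and once pointed at Trinity Large Thinking over the same prompt set; the two (latency, cost) pairs are the switch-cost numbers the action item asks for.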