
Anthropic is in advanced talks to buy Stainless for at least $300 million in what would be its first acquisition. The four-year-old New York startup's AI-powered compiler generates the official Python, TypeScript, Go, Kotlin, and Java SDKs for OpenAI, Google, Cloudflare, Meta, Runway, Groq, and Cerebras. If you import openai or @google/genai anywhere in production, the company that just got paid to maintain those libraries is about to become a wholly owned subsidiary of their largest competitor.
May 12, 2026 – The Information broke the story, citing a person with knowledge of the deal. Price tag: at least $300 million, with some portion likely paid in Anthropic stock. Sources describe this as Anthropic's first acquisition, which should make you sit up. They're not buying a model team or a research lab. They're buying the plumbing.
The markup is the other tell. Stainless raised a Series A in December 2024 at a $150M valuation. Seventeen months later, Anthropic is paying roughly 2x that. The same day the deal leaked, Bloomberg reported Anthropic is targeting a $900B valuation in a new $30B+ round. Talks are advanced but not closed. Terms could still move.
Stainless is an AI-powered compiler that takes an OpenAPI spec and emits production-ready SDKs in TypeScript, Python, Go, Kotlin, Java, Ruby, PHP, and C#, with Terraform in beta and Rust and Swift on the roadmap. The generated SDKs ship with retries, streaming, pagination, and typed error handling already wired up. The customer list is one a four-year-old startup has no business having: OpenAI, Anthropic, Google, Cloudflare, Meta, Runway, Groq, Cerebras, Lithic, and Modern Treasury. Downloads are measured in tens of millions per week across those libraries. The github.com/stainless-sdks org publicly maintains a slice of it; the rest lives in customer-owned repos that Stainless writes commits to.
So what do OpenAI, Google, and the rest of Stainless's customers do now? There are four scenarios. Each is bad in a different way.
Scenario A – they stay on Stainless under contract. Cheapest in the short term. Untenable past renewal. Every SDK release routes through a vendor owned by the largest competitor, and you trust that competitor not to look at the OpenAPI specs it now compiles. I don't think OpenAI signs that renewal. Google might, for a quarter, while they spin up the alternative.
Scenario B – they fork the generator. Stainless's value is the compiler IP, not the output. Forking the latest generated SDK is one git clone. Forking the compiler that produces the next ten years of SDKs is a multi-quarter engineering project that must land before the next major API change ships. Hard, expensive, but defensible. I'd put OpenAI here within six months.
Scenario C – they pull SDK generation in-house from scratch. OpenAI has the headcount. Expect 6–12 months of churn: breaking changes, missing language support, the long tail of edge cases Stainless already solved. Kotlin and Go SDKs are thinner than Python and TypeScript and will regress first.
Scenario D – quiet status quo for twelve months while Anthropic proves it won't gut the team or slow-roll competitor releases. Possible. Not betable. You don't structure a production dependency around the good behavior of a competitor.
The operator implication: pin SDK versions today. Audit which official libraries you depend on. Note which languages have the thinnest test coverage in the libraries you ship to production, because that's where regression risk lands first.
I run six AI providers in one Cloudflare Workers stack – Anthropic, Google AI Studio, xAI, OpenAI, DataForSEO, Browser Rendering. I don't import any of them as SDKs. Every paid call goes through fetch() against the AI Gateway endpoint with a JSON body that mirrors each provider's REST shape.
I made that decision three weeks ago for a different reason: every paid call needed to flow through a single callAi() wrapper that enforces a $20 daily cost cap and a $1 per-instance cap. SDKs hide that chokepoint behind a method call. To put the cap on the SDK path, you have to monkey-patch the client or wrap every call site. To put it on fetch(), you write one function.
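A minimal sketch of that single chokepoint. The cap values come from the setup above; everything else – the in-memory ledger, the function names, the cost estimate being passed in by the caller – is illustrative. In a real Workers deployment the running spend would live in KV or a Durable Object, not a module-level variable.

```typescript
const DAILY_CAP_USD = 20;
const PER_CALL_CAP_USD = 1;

type BudgetDecision = { allowed: boolean; reason?: string };

// Pure budget check, separated out so it can be tested without a network.
function checkBudget(spentTodayUsd: number, estCostUsd: number): BudgetDecision {
  if (estCostUsd > PER_CALL_CAP_USD) {
    return { allowed: false, reason: "per-call cap exceeded" };
  }
  if (spentTodayUsd + estCostUsd > DAILY_CAP_USD) {
    return { allowed: false, reason: "daily cap exceeded" };
  }
  return { allowed: true };
}

let spentTodayUsd = 0;

// Every paid call flows through here; no SDK method can route around the cap.
async function callAi(url: string, body: unknown, estCostUsd: number): Promise<Response> {
  const decision = checkBudget(spentTodayUsd, estCostUsd);
  if (!decision.allowed) throw new Error(decision.reason);
  spentTodayUsd += estCostUsd;
  return fetch(url, {
    method: "POST",
    headers: { "content-type": "application/json" },
    body: JSON.stringify(body),
  });
}
```

That's the whole argument against SDKs in one function: the cap is enforced at the only place a request can leave the Worker.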
The Cloudflare AI Gateway BYOK passthrough means one auth header (cf-aig-authorization) and one URL pattern across Anthropic, OpenAI, Google, xAI, and OpenAI image generation. The shape is roughly:
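A sketch in TypeScript – ACCOUNT_ID, GATEWAY_ID, and the provider path segments are placeholders you'd verify against your own gateway config and Cloudflare's AI Gateway docs:

```typescript
// One base URL for every provider behind the gateway.
const GATEWAY_BASE = "https://gateway.ai.cloudflare.com/v1/ACCOUNT_ID/GATEWAY_ID";

function gatewayUrl(provider: string, path: string): string {
  return `${GATEWAY_BASE}/${provider}/${path}`;
}

function gatewayHeaders(gatewayToken: string): Record<string, string> {
  return {
    // One auth header across every provider; with BYOK the gateway holds
    // the provider API keys and injects them on the way through.
    "cf-aig-authorization": `Bearer ${gatewayToken}`,
    "content-type": "application/json",
  };
}

// Same pattern everywhere – only the provider segment and body shape change:
//   fetch(gatewayUrl("anthropic", "v1/messages"), { method: "POST", headers, body })
//   fetch(gatewayUrl("openai", "chat/completions"), { method: "POST", headers, body })
```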
This week's accidental side effect: zero exposure to whatever Stainless ownership changes do to release cadence, breaking changes, or telemetry hooks. The cost of this architecture: you write the request payloads yourself, which is exactly the kind of detail SDKs were invented to hide. It's worth it when (a) you run three or more providers, (b) you need a single cost chokepoint, and (c) you are now also hedging vendor-ownership risk. The third reason didn't exist on Monday.
If you want the longer version of how this stack runs on Cloudflare Workers without exploding the budget, I wrote it up in Cursor Cloud Agent environments vs Cloudflare Workers.
Audit your package.json and requirements.txt for SDKs Stainless ships. The Anthropic and OpenAI Python and Node clients are obvious. Google's new @google/genai is also Stainless – check the github.com/stainless-sdks org for the full list before you guess.
Pin every Stainless-generated SDK to the current minor version. Take patch updates, hold the minors for review. This is 15 minutes of work and it buys you a quarter of optionality.
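Concretely, that's a tilde range in npm and a compatible-release pin in pip. Version numbers here are illustrative – use whatever minor you're currently on:

```
// package.json – "~" accepts patch releases, holds the minor
"dependencies": {
  "openai": "~4.52.0",
  "@anthropic-ai/sdk": "~0.24.0",
  "@google/genai": "~0.14.0"
}

# requirements.txt – "~=" does the same for pip
openai~=1.30.0
anthropic~=0.25.0
```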
For any new code path touching a model API, write the raw fetch first and the SDK call second. If the raw version is under 40 lines including types, ship the raw version. Most chat-completion and message endpoints clear that bar easily. Streaming and tool-use add another 20 lines. Still under the threshold for a path you'll own for years.
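For scale, here is what the raw version looks like against Anthropic's Messages endpoint, well under the 40-line bar including types. The field names follow the public REST API, but treat it as a sketch to check against current docs; the model id in the usage comment is illustrative.

```typescript
type ChatMessage = { role: "user" | "assistant"; content: string };

// Builds the request instead of sending it, so the payload shape is testable
// without a network call; pass the result straight to fetch().
function buildMessagesRequest(apiKey: string, model: string, messages: ChatMessage[]) {
  return {
    url: "https://api.anthropic.com/v1/messages",
    init: {
      method: "POST",
      headers: {
        "x-api-key": apiKey,
        "anthropic-version": "2023-06-01",
        "content-type": "application/json",
      },
      body: JSON.stringify({ model, max_tokens: 1024, messages }),
    },
  };
}

// Usage:
//   const { url, init } = buildMessagesRequest(key, "claude-sonnet-4", [{ role: "user", content: "hi" }]);
//   const res = await fetch(url, init);
```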
If you run a gateway – Cloudflare AI Gateway, OpenRouter, Portkey, LiteLLM – route every model call through it. One auth, one observability surface, one place to swap providers when the SDK underneath turns into a political liability.
Add a 90-day calendar reminder to recheck which SDKs OpenAI and Google maintain in-house versus through Stainless. The fork-or-stay decision will land before year-end.
Model competition has moved from weights to plumbing. The model itself is increasingly interchangeable; the integration layer is sticky. That's the only reading of this deal that makes the price tag rational.
Anthropic raising $30B at $900B while spending $300M on SDK infrastructure rather than a flashier model launch is the tell. Both companies are reaching past the model into the deployment surface. The model is becoming the cheap part. The integration is becoming the moat. You can see the same pattern in how Anthropic is selling Claude into mid-market – the $50K automation stack write-up is one example of the enterprise reach that justifies owning the SDK layer.
For founders, the next 12 months are about owning your integration layer the way you'd own your data layer. Don't optimize for SDK ergonomics. Optimize for substitutability. The model you're calling today is not the model you'll be calling in 2027. Build the wrapper that survives the swap.