Anthropic Is Buying the SDK Generator Behind OpenAI's and Google's Libraries. Here's What That Means for Anyone Building on More Than One Model.

Omid Saffari

Founder & CEO, AI Entrepreneur




Anthropic is in advanced talks to buy Stainless for at least $300 million in what would be its first acquisition. Stainless is the four-year-old New York startup whose AI-powered compiler generates the official Python, TypeScript, Go, Kotlin, and Java SDKs for OpenAI, Google, Cloudflare, Meta, Runway, Groq, and Cerebras. If you import openai or @google/genai anywhere in production, the company paid to maintain those libraries is about to become a wholly owned subsidiary of their largest competitor.

What actually got reported on Tuesday

May 12, 2026 – The Information broke the story, citing a person with knowledge of the deal. Price tag: at least $300 million, with some portion likely paid in Anthropic stock. Sources describe this as Anthropic's first acquisition, which should make you sit up. They're not buying a model team or a research lab. They're buying the plumbing.

The markup is the other tell. Stainless raised a Series A in December 2024 at a $150M valuation. Roughly a year and a half later, Anthropic is paying 2x that. The same day the deal leaked, Bloomberg reported Anthropic is targeting a $900B valuation in a new $30B+ round. Talks are advanced but not closed. Terms could still move.

What Stainless actually builds, in one paragraph

Stainless is an AI-powered compiler that takes an OpenAPI spec and emits production-ready SDKs in TypeScript, Python, Go, Kotlin, Java, Ruby, PHP, and C#, with Terraform in beta and Rust and Swift on the roadmap. The generated SDKs ship with retries, streaming, pagination, and typed error handling already wired up. The customer list a four-year-old startup has no business having: OpenAI, Anthropic, Google, Cloudflare, Meta, Runway, Groq, Cerebras, Lithic, and Modern Treasury. Downloads are measured in tens of millions per week across those libraries. The github.com/stainless-sdks org publicly maintains a slice of it; the rest lives in customer-owned repos that Stainless writes commits to.

The realistic scenarios for OpenAI and Google

There are four. Each is bad in a different way.

Scenario A – they stay on Stainless under contract. Cheapest in the short term. Untenable past renewal. Every SDK release routes through a vendor owned by the largest competitor, and you trust that competitor not to look at the OpenAPI specs it now compiles. I don't think OpenAI signs that renewal. Google might, for a quarter, while they spin up the alternative.

Scenario B – they fork the generator. Stainless's value is the compiler IP, not the output. Forking the latest generated SDK is one git clone. Forking the compiler that produces the next ten years of SDKs is a multi-quarter engineering project that must land before the next major API change ships. Hard, expensive, but defensible. I'd put OpenAI here within six months.

Scenario C – they pull SDK generation in-house from scratch. OpenAI has the headcount. Expect 6–12 months of churn: breaking changes, missing language support, the long tail of edge cases Stainless already solved. Kotlin and Go SDKs are thinner than Python and TypeScript and will regress first.

Scenario D – quiet status quo for twelve months while Anthropic proves it won't gut the team or slow-roll competitor releases. Possible. Not betable. You don't structure a production dependency around the good behavior of a competitor.

The operator implication: pin SDK versions today. Audit which official libraries you depend on. Note which languages have the thinnest test coverage in the libraries you ship to production, because that's where regression risk lands first.

Why raw fetch through an AI gateway looks different this week

I run six AI providers in one Cloudflare Workers stack – Anthropic, Google AI Studio, xAI, OpenAI, DataForSEO, Browser Rendering. I don't import any of them as SDKs. Every paid call goes through fetch() against the AI Gateway endpoint with a JSON body that mirrors each provider's REST shape.

I made that decision three weeks ago for a different reason: every paid call needed to flow through a single callAi() wrapper that enforces a $20 daily cost cap and a $1 per-instance cap. SDKs hide that chokepoint behind a method call. To put the cap on the SDK path, you have to monkey-patch the client or wrap every call site. To put it on fetch(), you write one function.

The Cloudflare AI Gateway BYOK passthrough means one auth header (cf-aig-authorization) and one URL pattern across Anthropic, OpenAI, Google, xAI, and OpenAI image generation. The shape is roughly:

```ts
export async function callAi(env: Env, ctx: Ctx, runner: () => Promise<Response>) {
  await assertUnderCostCap(env, ctx);
  const started = Date.now();
  const res = await runner();
  await recordCost(env, ctx, res, Date.now() - started);
  return res;
}

// usage — one provider, no SDK in the dep tree
const res = await callAi(env, ctx, () => fetch(
  `https://gateway.ai.cloudflare.com/v1/${env.ACCOUNT}/${env.GATEWAY}/anthropic/v1/messages`,
  {
    method: "POST",
    headers: {
      "cf-aig-authorization": `Bearer ${env.AI_GATEWAY_TOKEN}`,
      "x-api-key": env.ANTHROPIC_API_KEY,
      "anthropic-version": "2023-06-01",
      "content-type": "application/json",
    },
    body: JSON.stringify({ model, max_tokens, messages }),
  },
));
```

This week's accidental side effect: zero exposure to whatever Stainless ownership changes do to release cadence, breaking changes, or telemetry hooks. The cost of this architecture: you write the request payloads yourself, which is exactly the kind of detail SDKs were invented to hide. It's worth it when (a) you run three or more providers, (b) you need a single cost chokepoint, and (c) you are now also hedging vendor-ownership risk. The third reason didn't exist on Monday.

If you want the longer version of how this stack runs on Cloudflare Workers without exploding the budget, I wrote it up in Cursor Cloud Agent environments vs Cloudflare Workers.

What to do this week

Audit your package.json and requirements.txt for SDKs Stainless ships. The Anthropic and OpenAI Python and Node clients are obvious. Google's new @google/genai is also Stainless-generated – check the github.com/stainless-sdks org for the full list before you guess.
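That audit is a few lines of script. A sketch in TypeScript – the package list is my guess at the obvious candidates, not an authoritative registry, so verify it against the github.com/stainless-sdks org before acting on it:

```ts
// Flag dependencies that are (or are likely) Stainless-generated.
// SUSPECT_PACKAGES is illustrative — confirm against github.com/stainless-sdks.
const SUSPECT_PACKAGES = new Set([
  "openai",
  "@anthropic-ai/sdk",
  "@google/genai",
  "cloudflare",
  "groq-sdk",
]);

function auditDependencies(pkg: {
  dependencies?: Record<string, string>;
  devDependencies?: Record<string, string>;
}): string[] {
  const names = [
    ...Object.keys(pkg.dependencies ?? {}),
    ...Object.keys(pkg.devDependencies ?? {}),
  ];
  return names.filter((name) => SUSPECT_PACKAGES.has(name));
}
```

Run it over the parsed contents of each package.json in your repos; anything it flags goes on the pin list.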

Pin every Stainless-generated SDK to the current minor version. Take patch updates, hold the minors for review. This is 15 minutes of work and it buys you a quarter of optionality.
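In npm terms that means a tilde range, which accepts patch releases and holds minors. Version numbers here are illustrative:

```json
{
  "dependencies": {
    "openai": "~4.52.0",
    "@anthropic-ai/sdk": "~0.24.0"
  }
}
```

The pip equivalent is a bounded range in requirements.txt, e.g. `openai>=1.30,<1.31`, again with versions adjusted to whatever you currently ship.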

For any new code path touching a model API, write the raw fetch first and the SDK call second. If the raw version is under 40 lines including types, ship the raw version. Most chat-completion and message endpoints clear that bar easily. Streaming and tool-use add another 20 lines. Still under the threshold for a path you'll own for years.
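For scale, here is what the raw version looks like against an OpenAI-style chat-completion endpoint – a hedged sketch with the base URL, key, and response shape as assumptions, comfortably under the 40-line bar including types:

```ts
// Raw chat-completion call, no SDK in the dependency tree.
// Endpoint shape follows the OpenAI-compatible REST convention;
// baseUrl and apiKey are whatever your gateway hands you.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

async function chat(
  baseUrl: string,
  apiKey: string,
  model: string,
  messages: ChatMessage[],
): Promise<string> {
  const res = await fetch(`${baseUrl}/chat/completions`, {
    method: "POST",
    headers: {
      authorization: `Bearer ${apiKey}`,
      "content-type": "application/json",
    },
    body: JSON.stringify({ model, messages }),
  });
  if (!res.ok) throw new Error(`upstream ${res.status}: ${await res.text()}`);
  const data = (await res.json()) as { choices: { message: { content: string } }[] };
  return data.choices[0].message.content;
}
```

Everything a typed SDK client would give you on this path – retries and streaming aside – is right there, and you own every line of it.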

If you run a gateway – Cloudflare AI Gateway, OpenRouter, Portkey, LiteLLM – route every model call through it. One auth, one observability surface, one place to swap providers when the SDK underneath turns into a political liability.
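The "one place to swap providers" claim is concrete: with a gateway like Cloudflare's, the provider is a path segment, not a package. A sketch – the provider slugs follow Cloudflare AI Gateway's naming as I understand it, so verify them against current docs before relying on this:

```ts
// Provider becomes a string in a URL template rather than an SDK import.
// Slug strings are assumptions based on Cloudflare AI Gateway naming.
type Provider = "anthropic" | "openai" | "google-ai-studio" | "grok";

function gatewayUrl(
  account: string,
  gateway: string,
  provider: Provider,
  path: string,
): string {
  return `https://gateway.ai.cloudflare.com/v1/${account}/${gateway}/${provider}/${path}`;
}
```

Swapping Anthropic for OpenAI becomes a change to one argument and the request body shape, with nothing touched in the dependency tree.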

Add a 90-day calendar reminder to recheck which SDKs OpenAI and Google maintain in-house versus through Stainless. The fork-or-stay decision will land before year-end.

The bigger frame

Model competition has moved from weights to plumbing. The model itself is increasingly interchangeable; the integration layer is sticky. That's the only reading of this deal that makes the price tag rational.

Anthropic raising $30B at $900B while spending $300M on SDK infrastructure rather than a flashier model launch is the tell. Both companies are reaching past the model into the deployment surface. The model is becoming the cheap part. The integration is becoming the moat. You can see the same pattern in how Anthropic is selling Claude into mid-market – the $50K automation stack write-up is one example of the enterprise reach that justifies owning the SDK layer.

For founders, the next 12 months are about owning your integration layer the way you'd own your data layer. Don't optimize for SDK ergonomics. Optimize for substitutability. The model you're calling today is not the model you'll be calling in 2027. Build the wrapper that survives the swap.

Key Takeaways

  • Anthropic is in advanced talks to buy Stainless for at least $300M – the company that generates official SDKs for OpenAI, Google, Cloudflare, Meta, and a dozen others.
  • The realistic 12-month path is OpenAI forks the generator or pulls SDK generation in-house; Google probably follows. Expect breaking changes in Kotlin and Go first.
  • Pin every Stainless-generated SDK to its current minor version this week. Audit your imports.
  • For new code paths, write the raw fetch through a gateway first. Under 40 lines, ship the raw version.
  • Model competition has moved from weights to plumbing. Build for substitutability, not SDK ergonomics.
Last Updated: May 14, 2026

Category: News
