Nobody's locked in anymore

In the space of 48 hours, OpenAI broke free from Microsoft exclusivity to land on AWS, GM replaced Google Assistant with Gemini in 4 million cars, and Harvard dropped ChatGPT Edu in favour of Claude. The pattern is unmistakable: switching AI providers has become trivially easy at every level of the stack — cloud infrastructure, car dashboards, campus IT. For anyone building on top of AI, the implication is stark: your vendor relationship is a lease, not a mortgage, and the tenant can move out overnight.

GeekWire

OpenAI models land on AWS Bedrock one day after Microsoft exclusivity ends

GPT-5.5, Codex, and Managed Agents are now available on Amazon Bedrock as part of a $50 billion Amazon investment in OpenAI, ending seven years of Microsoft exclusivity and reshaping cloud AI distribution.

geekwire.com

Twenty-four hours. That's how long it took for OpenAI's models to appear on a rival cloud after seven years of Microsoft exclusivity ended. Not weeks of integration work. Not months of enterprise sales cycles. GeekWire reported that GPT-5.5, Codex, and Managed Agents landed on Amazon Bedrock on 28 April, backed by $50 billion in Amazon investment and 2 gigawatts of Trainium capacity. The plumbing was ready before the ink was dry.

That speed tells you something the individual headlines don't. In the same 48-hour window, GM announced it's replacing Google Assistant with Gemini across roughly 4 million Cadillac, Chevrolet, Buick, and GMC vehicles via over-the-air update. And The Harvard Crimson reported that Harvard's Faculty of Arts and Sciences is phasing out subsidised ChatGPT Edu access after June and rolling out Anthropic's Claude instead. A cloud hyperscaler, a car manufacturer, and a university all swapped AI providers in the same news cycle. The common thread isn't who they switched to. It's how little friction the switching involved.

The commoditisation nobody planned for

The conventional reading is that these are competitive wins for Amazon, Google, and Anthropic respectively. And they are, in the narrow sense. But the bigger story is what the ease of switching reveals about the product itself. When GM can swap out one conversational AI for another with an OTA update to millions of cars, and when Harvard can move its entire faculty from one provider to another between semesters, the underlying technology has reached a level of interchangeability that none of the providers set out to create.

There's a useful parallel in electricity markets. In the early days of electrification, customers were locked to their local utility. The infrastructure was proprietary, the switching costs were enormous, and the supplier had pricing power. Deregulation and standardised grid interconnection changed that. Power became fungible. The generator still mattered, but the customer relationship became a contract, not a marriage. AI inference is arriving at the same structural moment, decades faster.

The way I see it, the AI providers have accidentally commoditised their own product by optimising for the same interface. OpenAI, Anthropic, and Google all converged on similar API shapes and pricing models, with comparable capability at the frontier. That convergence was inevitable (customers demanded it), but it means the moat isn't the model anymore. Harvard's FAS said it expects to "continually evaluate" AI platforms given the pace of change. That's not a vote of confidence in Claude. That's a declaration that every provider is on a rolling audition.
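To see how far that convergence goes, here is a minimal sketch (not from the original article) contrasting the request bodies for OpenAI's Chat Completions API and Anthropic's Messages API. The endpoint paths are real; the model names are placeholders, and the builder functions are hypothetical helpers for illustration. The point is that the core `messages` structure is identical, so "switching providers" can shrink to a dictionary lookup:

```python
# Hypothetical payload builders illustrating how similar the two APIs are.
# Model names below are placeholders, not recommendations.

def build_openai_request(prompt: str, model: str = "gpt-4o") -> dict:
    """Request body shape for OpenAI's /v1/chat/completions endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def build_anthropic_request(prompt: str, model: str = "claude-sonnet") -> dict:
    """Request body shape for Anthropic's /v1/messages endpoint."""
    return {
        "model": model,
        "max_tokens": 1024,  # required by Anthropic; optional elsewhere
        "messages": [{"role": "user", "content": prompt}],
    }

# Swapping vendors becomes a one-line change in the caller:
PROVIDERS = {
    "openai": build_openai_request,
    "anthropic": build_anthropic_request,
}

request = PROVIDERS["anthropic"]("Summarise this contract.")
```

The `messages` list — role-tagged turns of conversation — is byte-for-byte the same in both payloads; the differences are confined to auth headers, endpoint URLs, and a handful of parameters like `max_tokens`. That is the interchangeability the paragraph above describes.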

For anyone building products on top of these models, the implication cuts both ways. If you're a platform, your relationship with enterprise customers is a lease, not a mortgage, and the tenant can move out overnight. If you're a builder, the good news is you're never truly trapped. The bad news is neither are your customers.

The companies that win in a low-switching-cost world won't be the ones with the best model on any given Tuesday. They'll be the ones who make switching away feel like losing something the replacement can't replicate — whether that's a deeply integrated workflow, a data flywheel, or institutional trust built over years. The race to be the smartest model in the room is giving way to the race to be the hardest one to leave.


Read the original on GeekWire
