Media reports that OpenAI had been in discussions with Google for several months and finalized a cooperation agreement last month, under which Google Cloud will provide computing power for OpenAI to train and run its models. The cooperation could not be established earlier because of OpenAI's exclusivity agreement with Microsoft: before January of this year, Microsoft was OpenAI's exclusive cloud service provider.
Recent news shows that OpenAI is both reaching out to Alphabet, the old rival of its major backer Microsoft, and actively stoking industry infighting with a fierce price war.
On Tuesday, June 10, Eastern Time, OpenAI CEO Sam Altman announced on social media that the price of the o3 reasoning model would be cut by 80%, saying he looked forward to seeing people's reactions and believed they would be satisfied with the performance and pricing of o3-pro.

This significant price cut is another step by OpenAI to intensify the 'internal competition' among large models that began with DeepSeek's emergence.
After DeepSeek released a groundbreaking, highly cost-effective open-source model in January this year, OpenAI launched its most cost-effective reasoning model, o3-mini, at the end of that month, opening a reasoning model to free users for the first time: ChatGPT free users can try o3-mini by selecting 'Reason' in the message composer or by regenerating a response.
On the same Tuesday, shortly after Altman announced the o3 price cut, media reported that OpenAI had reached a cloud service agreement with Alphabet, under which OpenAI will draw on Alphabet's computing resources to support its own business. This unprecedented cooperation not only signals OpenAI's further loosening of its reliance on Microsoft, but also exposes the harsh reality behind the AI arms race: survival needs outweigh everything, and yesterday's enemies can become today's allies.
The 'impossible alliance' driven by the hunger for computing power.
According to media reports, the seemingly improbable collaboration between OpenAI and Alphabet was officially finalized in May this year, after several months of discussions. For OpenAI, it is the latest move to reduce over-reliance on Microsoft; for Alphabet, it is a major win for its cloud service business, even though it also means supplying ammunition to a strong competitor of its own AI business.
Media reports cite sources saying that Google Cloud will provide new computing capacity for OpenAI to train and run AI models. OpenAI's ChatGPT is widely regarded as the most serious threat in years to Google's dominant search business.
This collaboration highlights a harsh reality in the AI industry: massive computing demands are reshaping the competitive landscape. On Monday, an OpenAI spokesperson said the company's annual recurring revenue (ARR) had reached 10 billion dollars, nearly doubling from 5.5 billion dollars in the same period last year. However, this rapid growth is driven by an insatiable thirst for computing power.
Reports from last September indicated that OpenAI expects model-training computing costs to rise sharply in the coming years, reaching as much as 9.5 billion dollars per year by 2026, excluding amortized pre-training costs for large language model (LLM) research. A report from this February said OpenAI would 'burn money' even faster in the coming years, as it pours all its revenue into the computing demands of running existing models and developing new ones; OpenAI expects total computing costs from this year through 2030 to exceed 320 billion dollars.
Microsoft is no longer 'exclusive'.
On January 21 of this year, Microsoft announced that it would no longer serve as OpenAI's exclusive cloud service provider but would retain preferential purchasing rights. Reports at the time attributed the adjustment to dissatisfaction among OpenAI's top management with the slow progress of Microsoft's construction of new data centers.
Despite no longer being the exclusive cloud provider, Microsoft retains the exclusive right to resell OpenAI's models on its Azure cloud platform and can continue to use OpenAI's intellectual property in its products. Microsoft has stated that the current cooperation agreement between the two parties runs until 2030. In addition, Microsoft receives 25% of OpenAI's revenue and has the right to share in the profits of its future products.
Media reports this Tuesday indicate that, prior to Microsoft's official announcement in January, Microsoft's Azure cloud service had been OpenAI's exclusive data-center infrastructure provider. Previously, Google was unable to reach a partnership with OpenAI because of that exclusivity agreement with Microsoft. Currently, Microsoft and OpenAI are renegotiating the terms of Microsoft's multi-billion-dollar investment agreement, including Microsoft's future equity stake in OpenAI.
OpenAI's diversified strategy also includes the 500 billion dollar 'Stargate' infrastructure project in collaboration with SoftBank and Oracle, as well as a multi-billion-dollar computing power agreement with CoreWeave.
The Wall Street Journal reported in February that most of OpenAI's power and data-center computing capacity is currently provided by Microsoft, and that OpenAI is shifting to relying on SoftBank as the main investor in the Stargate project. OpenAI expects that by 2030, the Stargate data centers will supply three-quarters of the computing power it needs to run and develop its AI models.
In addition, OpenAI reportedly plans to finalize the design of its first in-house chip this year to reduce dependence on external hardware suppliers.