① According to reports on Friday, Amin Vahdat, head of Google Cloud's AI infrastructure, disclosed the AI giant's computing power targets at an internal meeting; ② CEO Sundar Pichai also warned at the meeting that 2026 would be an "extremely tight year," with AI competition, cloud demand, and computing pressure simultaneously accelerating the company's investment pace.
At Friday's open, $Alphabet-A (GOOGL.US)$ shares surged more than 4%, with the latest price at $301.065.

Following the unveiling of its groundbreaking new Gemini large model this week, Google's ability to meet a surge in computing power demand has become the latest focal point.
According to multiple sources familiar with the matter, Amin Vahdat, head of Google Cloud's AI infrastructure, told an all-hands meeting on November 6 that Google must "double computing capacity every six months," with an overall goal of achieving "a 1000-fold increase in capability within four to five years" ("Now we must double every 6 months... the next 1000x in 4-5 years").
Meeting materials showed that in a presentation titled "AI Infrastructure," Vahdat stated plainly that "the competition for AI infrastructure is the most critical and expensive part of the entire AI race." He emphasized that Google's mission is "not about who spends more," but about building more reliable, higher-performing, and more scalable infrastructure.
This presentation took place one week after Google’s parent company Alphabet reported better-than-expected Q3 financial results. In its latest earnings report, Alphabet raised its capital expenditure forecast for the second time this year, setting the range at $91 billion to $93 billion, and noted that 2026 would see "significant growth."
By that measure, the combined capital expenditures of the four major AI giants (Google, Microsoft, Amazon, and Meta) will reach $380 billion this year alone.
Vahdat said Google's advantage lies in DeepMind's long-term research, which lets the company anticipate future AI model architectures, as well as in its self-developed chips. When Google launched its seventh-generation Ironwood Tensor Processing Unit (TPU) last week, the company claimed the chip is nearly 30 times more energy efficient than the first-generation TPU introduced in 2018.
He said: "We must deliver 1000 times the computing power, storage, and networking capabilities at the same cost and equivalent energy consumption."
Company CEO Sundar Pichai further warned at the meeting that 2026 would be an "extremely tight year," with AI competition, cloud demand, and computing pressure simultaneously accelerating the company’s investment pace.
In response to internal concerns about an "AI bubble," Pichai acknowledged that such discussions do exist in the market, but said that "at this moment, the risk of underinvestment far outweighs that of overinvestment." Pichai noted that Google Cloud's revenue grew 34% year over year this quarter to more than $15 billion, but added that "with more computing power, growth could have been even faster."
Pichai also said that computing power remains the company's biggest bottleneck in the short term.
Pichai cited the example of Veo, the video-generation tool upgraded last month, which could not be made available to more users because of insufficient computing power, limiting the pace of its rollout. Even so, he emphasized that amid rising capital expenditures, the company still needs to maintain healthy free cash flow.
Editor: Rice