Last week, Microsoft lost $357 billion in market value in a single day, its biggest drop since March 2020. On the same day, Meta rose 10%. On Thursday, February 5th, Google raised the stakes: it announced up to $185 billion in investments for 2026, 97% more than in 2025 and more than 50% above what analysts expected. Its shares fell 7% in after-hours trading before recovering.
All of these companies are spending tens of billions of dollars on artificial intelligence (AI). So why are some rewarded and others punished? The answer reveals that we are in the midst of one of the greatest value transfers in economic history, and most are looking in the wrong place.
Microsoft reported results that exceeded expectations, with $81 billion in revenue and record profits, yet its shares suffered their biggest drop in five years. The company spent $37.5 billion on capital investments, 66% more than a year earlier, and still cannot build data centers fast enough.
Chief Financial Officer Amy Hood admitted that capacity constraints are already limiting Azure. Microsoft is stuck: it needs to spend more and more to serve customers who, in turn, have not yet proven that they can monetize the AI they consume.
Meta has already announced investments of up to US$135 billion for 2026 — almost double that of 2025. Mark Zuckerberg was explicit: he is focused on building a "personal superintelligence." Every dollar spent on AI directly improves ad targeting, recommendations, and content moderation.
Google presented an intermediate case: revenue and profit above expectations, Google Cloud growing 48% and outpacing Azure's growth for the first time in years, and a contract backlog reaching US$240 billion. But the CapEx guidance was alarming: up to US$185 billion, when the market expected US$119 billion.
CEO Sundar Pichai justified the spending as necessary "to meet customer demand and capitalize on opportunities." There is an important technical detail: Google reduced the cost of running Gemini by 78% over the course of 2025. Even so, it will need to nearly double its investments. Efficiency is improving, but demand is growing faster.
The lesson from all three cases is clear: when you spend on AI for yourself, the return is yours. When you spend to sell AI to third parties, the return depends on those third parties being able to monetize it—and most are failing.
The seven blind spots of the market
There are seven dynamics that, together, paint a very different picture from the optimistic narrative:
The Inference Trap: Everyone talks about training models, but few talk about the cost of running them. Training is a one-time event; inference is an ongoing cost that grows with each user. OpenAI spent $8.67 billion on inference in the first nine months of 2025, almost double its revenue. Sam Altman has publicly admitted that the $200-a-month Pro subscription is unprofitable.
The math is perverse: the more successful the product, the bigger the losses. The "AI agents" everyone celebrates, like Anthropic's Claude Code, consume 10 to 25 times more resources than simple conversations. A programming session with an agent can use 500,000 tokens; a normal conversation uses about 20,000. And there is an additional paradox that few notice: the cost per token is falling roughly 200-fold per year, according to Epoch AI, but new reasoning models consume 50 to 100 times more tokens per task. It is like building more efficient engines and using them to power ever-larger trucks. The cost per complex task may be rising, not falling, as the sketch below illustrates.
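A minimal back-of-the-envelope sketch of that arithmetic, using the figures above as inputs. The starting token price and the variable names are arbitrary, and treating the longer agent sessions and the reasoning overhead as compounding multipliers is an illustrative assumption, not a reported result:

```python
# Toy model of the inference-cost arithmetic above. All inputs are the figures
# cited in the text, treated here as rough assumptions rather than measured data.

def cost_per_task(tokens: float, price_per_million: float) -> float:
    """Dollar cost of one task given its token count and the $/1M-token price."""
    return tokens / 1_000_000 * price_per_million

price_old = 10.0              # hypothetical $ per 1M tokens at the start of the year
price_new = price_old / 200   # per-token price falls ~200x per year (Epoch AI figure above)

chat_tokens = 20_000          # a normal conversation (from the text)
agent_tokens = 500_000        # an agent coding session, ~25x larger (from the text)
reasoning_overhead = 75       # reasoning models use ~50-100x more tokens (midpoint)

baseline = cost_per_task(chat_tokens, price_old)

# Scenario A: the same simple chat one year later -- cost collapses with the token price.
print(f"chat, new prices:  {cost_per_task(chat_tokens, price_new) / baseline:.3f}x baseline")

# Scenario B: a reasoning agent session, assuming the longer session and the reasoning
# overhead compound: 25x * 75x = 1,875x more tokens against a 200x price drop,
# so the cost per complex task rises roughly 9x even as the cost per token collapses.
print(f"agent, new prices: {cost_per_task(agent_tokens * reasoning_overhead, price_new) / baseline:.1f}x baseline")
```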
The Data Wall: According to Elon Musk, "the cumulative sum of human knowledge has been exhausted" for training AI. There is no more high-quality human text available at the necessary scale. Worse: 74% of new web pages already contain AI-generated text. Models trained on synthetic data degrade like a photocopy of a photocopy, a phenomenon researchers call "model collapse," which a study presented at ICLR 2025 argues is mathematically inevitable.
The Scaling Plateau: For five years, larger models meant better models. That era is over. Researchers at HEC Paris concluded that "for over a year, frontier models seem to have reached their ceiling." Jensen Huang of Nvidia confirmed this: the demand for inference is growing faster than for training. The world is shifting from building models to running models. This fundamentally changes where the money goes—and favors infrastructure over applications.
The 95% Failure Rate: MIT's NANDA Project revealed that, despite approximately $35 billion invested, 95% of enterprise AI projects fail to deliver measurable returns. S&P Global reported that 42% of companies abandoned most of their AI initiatives in 2025, up from 17% in 2024. PwC surveyed 4,500 global CEOs: 56% saw no significant financial benefit. When CFOs demand proof, the money will dry up.
The Talent Bottleneck: There are only 40,000 AI scientists in the US, 40 times fewer than software developers. Global demand exceeds supply by a factor of 3.2. Meta offered over $250 million to hire a single researcher. Energy is an engineering problem that takes years to solve; talent is a problem that takes decades.
The most dramatic case is Apple. This week, Bloomberg reported that roughly a dozen key researchers and executives left the company in 2025 alone. The head of the AI modeling team and the executive who was to lead the new virtual assistant both went to Meta. The result: Apple will pay $1 billion a year to use Google's Gemini because it cannot build an equivalent internally.
The contrast with investments is revealing: while Amazon, Google, and Meta increase spending by 60% to 97%, Apple barely moves—only 2% more than the previous year, totaling US$13 billion compared to Amazon's US$200 billion. It's an implicit admission that it arrived too late.
Circular Financing: Nvidia invests $100 billion in OpenAI. OpenAI buys processors from Nvidia. Microsoft and Nvidia invest $15 billion in Anthropic. Anthropic commits $30 billion to computing. This is not independent capital; it is a circular flow. Goldman Sachs and Yale analysts have flagged it as a systemic risk. When the music stops, everyone stops at the same time.
The Software-as-a-Service Spiral: This week, the software sector entered a bear market. ServiceNow fell 11%, SAP fell 16%, and Salesforce fell 7%, even after results that exceeded expectations. The market is saying that execution does not matter: the business model itself is under threat. AI agents like Claude and ChatGPT can execute entire workflows that previously required specialized software. Traditional software has near-zero marginal cost; AI-powered software incurs a real compute cost for every interaction. This economic inversion is only just beginning.
Infrastructure wins regardless of who wins
This week's numbers are historic. Adding up the announced guidance, the seven largest technology groups are expected to invest more than US$700 billion in 2026 alone: Amazon (US$200 billion), Google (US$180 billion), Meta (US$125 billion), Microsoft (US$117 billion), Oracle (US$55 billion), Tesla (US$20 billion), and Apple (US$13 billion). This is equivalent to the GDP of Switzerland, spent in a single year by seven companies on a single technology. The increase compared to 2025 approaches US$300 billion—almost 1% of the American GDP.
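A quick tally of those guidance figures, treating US GDP as a round US$30 trillion (an assumption used only for the comparison; the capex numbers are the ones listed above):

```python
# Sum of the announced 2026 capex guidance (all figures in US$ billions, from the text above).
capex_2026 = {
    "Amazon": 200, "Google": 180, "Meta": 125, "Microsoft": 117,
    "Oracle": 55, "Tesla": 20, "Apple": 13,
}
total = sum(capex_2026.values())
print(f"Combined 2026 guidance: ${total} billion")  # 710, i.e. "more than US$700 billion"

# The ~US$300 billion year-over-year increase against a US GDP assumed at ~US$30 trillion.
increase_vs_2025 = 300   # US$ billions, as stated in the text
us_gdp = 30_000          # US$ billions, rough assumption
print(f"Increase as a share of US GDP: {increase_vs_2025 / us_gdp:.1%}")  # ~1.0%
```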
This capital will be directed to various segments, regardless of which AI model prevails.
Energy emerges as the ultimate bottleneck in this cycle. Data centers already consume 8.9% of American electricity and account for almost half of the demand growth by 2030. The International Energy Agency projects that global data center consumption will double to 945 terawatt-hours by 2030—equivalent to Japan's total consumption.
Critical transmission equipment has waiting lists of up to 143 weeks. Elon Musk was blunt: "AI is fundamentally limited by energy." That is why he merged SpaceX with xAI: the combined company is going to space because it cannot solve its energy problem on Earth fast enough.
There's a geopolitical dimension that few discuss. In 2024, China added 429 gigawatts of electrical capacity. The US added 29—fifteen times less. China plans to add seven times more capacity than the US in 2025, with 80% from renewable sources. China's State Grid will invest US$574 billion in transmission infrastructure by 2030. While American executives treat energy as an "extreme bottleneck," Chinese executives treat it as a "solved problem." The AI race is, at its core, an energy race—and the US is lagging behind in this chapter.
Cooling is another critical point. While a traditional rack consumes between 10 and 20 kilowatts, Nvidia's new AI chips—like the Blackwell—exceed 1,000 watts per component. Air cooling has reached its physical limit. Liquid cooling has gone from a luxury to a necessity. Data centers consume up to 5 million gallons of water per day; projections indicate 68 billion gallons per year by 2028 in the US, a 300% increase. Two-thirds of new data centers since 2022 are located in water-scarce regions.
Connectivity acts as the invisible glue that enables the ecosystem. Training models requires thousands of processors operating in parallel. The demand for ever-increasing speeds is growing faster than in any other segment.
Infrastructure semiconductors play a vital role. Although Nvidia dominates graphics processors, the unsustainable economics of inference are forcing customers to migrate. Midjourney cut its monthly costs from $2.1 million to $700,000 by moving to Google chips, with payback in 11 days. Anthropic closed the largest custom chip contract in Google's history. OpenAI is diversifying: it has signed contracts worth $350 billion with Broadcom and $90 billion with AMD.
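For readers who want to see how an 11-day payback works out, here is a rough sketch. The monthly costs are the figures cited above; the implied one-time migration cost is a derived estimate under a simple linear model, not a reported number:

```python
# Payback arithmetic for the Midjourney migration cited above.
old_monthly_cost = 2_100_000   # US$ per month before migrating (from the text)
new_monthly_cost = 700_000     # US$ per month after migrating (from the text)
payback_days = 11              # reported payback period

daily_savings = (old_monthly_cost - new_monthly_cost) / 30
implied_migration_cost = daily_savings * payback_days  # cost recouped within the payback window

print(f"Daily savings:          ${daily_savings:,.0f}")           # about $46,700
print(f"Implied migration cost: ${implied_migration_cost:,.0f}")  # about $513,000
```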
All this diversification strengthens those in the layer below. TSMC manufactures 90% of the world's advanced chips—for Nvidia, for AMD, for Google's and Amazon's custom chips. Broadcom designs the specialized circuits that everyone is ordering. Fragmentation in the upper layer benefits those who collect tolls in the lower layer.
The lesson of history
Those who invested in railroads in the 1860s saw most of their companies fail—but the infrastructure they built transformed the economy for a century. Those who invested in internet companies in 1999 lost almost everything—but the broadband they created remains essential.
The pattern is always the same: infrastructure comes first, productivity later. The gap between the two is where fortunes are lost and made. Infrastructure operates on the timescale of engineering and physics. Productivity operates on the timescale of human organization, governance, and learning. These two clocks are out of sync—and this lag is the greatest investment opportunity of the decade.
AI is not limited by intelligence itself. It is limited by economics, energy, data, and people. Whoever understands this first, wins.
This week proved every element of the thesis: $700 billion in announced investments, Microsoft and Google admitting capacity constraints, Apple losing its AI team to Meta and outsourcing to Google, software as a service entering a bear market because AI threatens to replace entire workflows.
The value goes to those who build the tracks, not to those who promise the destination. The next chapter of this story is already being written. The question is: will you be on the right side?
Walter Maciel is the CEO of AZ Quest.