Global demand for computing power is rising so fast that it could create an entirely new type of financial market, according to Larry Fink.
Speaking at the Milken Institute Global Conference, the BlackRock CEO said traders may one day buy and sell futures tied to computing capacity — similar to how markets already price commodities like energy or agriculture.
"A new asset class will be buying futures of compute," Fink said during a panel discussion Tuesday, adding that current supply is far from enough. "We just don’t have enough compute power right now."
Fink’s comments come as demand for artificial intelligence infrastructure continues to outpace supply.
He pointed to multiple bottlenecks, including shortages in chips, memory and power, which are slowing the pace of expansion despite rising demand.
"We're short power, we're short compute, we're short chips," Fink said, warning that the industry is not scaling quickly enough to keep up.
"I don't believe we're moving fast enough."
He also pushed back on concerns that the rapid growth in AI could be forming a bubble.
“There is not an AI bubble,” Fink said.
“There is the opposite. We have supply shortages. Demand is growing much faster than anyone has ever anticipated.”
To address these gaps, BlackRock is committing significant capital to the sector.
The firm is investing tens of billions of dollars in data centers and energy companies through partnerships with Microsoft, Nvidia and MGX, an investment vehicle based in the United Arab Emirates.
A consortium led by BlackRock’s Global Infrastructure Partners agreed to acquire Aligned Data Centers for about $40 billion in October 2025. The group is also working with private equity firm EQT to purchase power provider AES Corp. for $10.7 billion in cash.
The Aligned deal is part of a broader push to expand AI infrastructure.
The investment group, which includes Microsoft, Nvidia and others, aims to deploy $30 billion in equity capital to support the buildout of data centers and related systems.
The surge in investment reflects how central data centers have become to the global economy.
These facilities house the hardware needed to train AI models and run large-scale computing workloads. As demand grows, so does the need for energy and infrastructure to support them.
At least $3 trillion is expected to flow into data center-related investments over the next five years, according to Moody’s Ratings. The firm said this capital will support spending across servers, computing equipment, facilities and power capacity.
Large technology companies are already leading that push. Six US hyperscalers — Microsoft, Amazon, Alphabet, Oracle, Meta and CoreWeave — are on track to spend $500 billion on data centers this year alone, Moody’s said.
Other industry leaders see the same long-term transformation underway.
At the same panel, Brookfield Asset Management CEO Bruce Flatt said the global economy is being reshaped around data centers, cloud computing and artificial intelligence.
“For the next 10 years, we will be rewiring the global economy,” Flatt said.
Fink believes the growing importance of computing power could eventually change how companies manage costs and risk.
Just as businesses hedge against fluctuations in commodities like fuel or crops, a futures market for computing could allow firms to lock in pricing for the resources needed to run AI systems.
While such a market does not yet exist, the scale of demand — combined with persistent supply shortages — is already pushing investors to think of computing power as a tradable asset in its own right.
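The mechanics of such a hedge would mirror any commodity futures position. A minimal sketch of the arithmetic, using entirely hypothetical prices for a notional "GPU-hour" contract (no such market or pricing exists today):

```python
# Hypothetical sketch of a compute futures hedge.
# All prices and units are illustrative assumptions, not market data.

def hedged_cost(spot_at_expiry: float, futures_price: float, quantity: float) -> float:
    """Effective total cost when compute purchases are fully hedged with futures.

    The firm buys `quantity` units (say, GPU-hours) at the spot price, while a
    long futures position pays out (spot - futures_price) per unit. The payoff
    offsets any spot move, so the net cost is locked at the futures price.
    """
    spot_cost = spot_at_expiry * quantity
    futures_payoff = (spot_at_expiry - futures_price) * quantity
    return spot_cost - futures_payoff

# Whether spot compute prices spike or fall, the net cost stays the same:
locked = 2.50  # assumed futures price, $ per GPU-hour
print(hedged_cost(4.00, locked, 1_000))  # spot spikes: net cost is still 2500.0
print(hedged_cost(1.75, locked, 1_000))  # spot drops:  net cost is still 2500.0
```

This is the same logic an airline uses when hedging jet fuel: the futures leg converts an uncertain spot cost into a fixed one, at the price of giving up any benefit from falling prices.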
For crypto investors, the idea of turning compute into a tradable asset may sound familiar. Blockchain networks already rely on computation as a core input, with miners and validators effectively monetizing it. Extending that logic to AI infrastructure could create new ways to price and trade access to computing resources, bringing traditional finance closer to models already seen in crypto markets.