The Unyielding Demand for Power

Jan 31, 2024

Gaining a deep understanding of a trend — and therefore the magnitude of its growth — rarely comes from reading the headlines printed by the media.

They might contribute to the hype, or give the reader a sense that something big is happening…

But they lack specifics.

Let’s take a headline from yesterday’s Wall Street Journal as an example:

“Microsoft Earnings Jump as AI Demand Boosts Cloud Unit”

Sounds great. Microsoft’s quarterly net income rose 33% to a whopping $21.9 billion. Revenue was up 18% to $62 billion. And Microsoft Cloud revenue jumped 24% to $33.7 billion.

Not surprisingly, the narrative was all about artificial intelligence (AI) — the collective buzzword used by companies everywhere, whether they are using AI or not.

Microsoft CEO Satya Nadella quipped, “We’ve moved from talking about AI to applying AI at scale.”

These are big numbers, no doubt. The hype around AI and Microsoft’s earnings announcement actually lifted Microsoft above a $3 trillion valuation last week.

Microsoft is only the second company after Apple to reach such lofty heights.

But what the financial journalists tend to miss are the underlying drivers behind the headline news.

And without understanding that, it’s easy to miss the real story — the bigger story, in this case.

Because it’s not just that Microsoft’s cloud business is growing at a rapid pace…

By the time we’ve read the headline, that story has come and gone. It’s old news. Microsoft’s stock actually dropped after the earnings results were out.

The real story is about the underlying drivers behind not only Microsoft’s cloud growth…

But what’s fueling that growth, and what’s required to power that growth… literally.

Racks and Racks of Servers

But let’s stick with Microsoft for a moment as an example.

Over the last decade, one of the most interesting strategies that Microsoft has employed has been to acquire large software-based technology companies.

Every acquisition I’ve seen Microsoft tout has been all about “synergies” and “complementary” services that “add value” to its enterprise customers and “improve” human productivity. The typical, tiresome string of corporate buzzwords so common today.

This narrative hides the real purpose of acquisitions like these, which is to increase the utilization of one of Microsoft’s largest assets, its Azure cloud services business. 

Acquiring software companies that host their services on Azure increases the revenue of Microsoft’s cloud services business.

It also utilizes Microsoft’s excess cash on its balance sheet. Either Microsoft spends it on acquisitions, or returns it to shareholders through stock buybacks or dividends.

Its acquisition of LinkedIn is a perfect example. 

Acquired in 2016 for $26.2 billion in cash, it was a real stretch to claim some kind of synergy with Microsoft’s enterprise business.

But it would drive material revenues to Microsoft’s cloud business.

The same was true of its Nuance Communications deal for $18.8 billion in 2022. Or its GitHub deal for $7.5 billion in 2018. Or more recently, its Activision Blizzard deal for $68.7 billion, which just closed last quarter.

All of these deals are designed to drive utilization of Microsoft’s data centers and increase revenues for its cloud services business.

Its investment strategy has the same motivations, as well.

Take its investments in OpenAI for example.

Microsoft invested $1 billion into OpenAI back in 2019. At the time, it was well known that Microsoft had fallen well behind in the development of AI technology. So naturally, the deal was positioned as “strategic” for that reason.

But the key part of the deal was that OpenAI was required to use Microsoft’s cloud services for its AI product.

Said another way, Microsoft gave OpenAI $1 billion so that it had the money to pay Microsoft back for cloud services at healthy gross margins to Microsoft. Out of one pocket, into another.

It clearly worked, as Microsoft has since stepped back up and invested a total of $13 billion into OpenAI, taking a major stake in the private company.

After having seen OpenAI’s product development as an insider — and after having gauged the sheer volume of computation required to run OpenAI’s ChatGPT and to train OpenAI’s large language models (LLMs) — Microsoft wanted more…

Hence the additional $12 billion investment.

While the above examples are specific to Microsoft, these deals are happening everywhere. 

If you’re in the cloud services business, you’re in the business of building and managing massive data centers around the world.

And it’s the kind of business that requires massive scale to be successful. 

After all, capital expenditures are in the billions, with heavy operational costs, so maximizing utilization is critical. 

And scale brings strong purchasing power to reduce costs. This also enables cloud services providers to offer competitive pricing in what has largely become a commoditized marketplace.

This is what drives these acquisitions and investment strategies.

After all, rack after rack of computer servers — required for computational power and storage — need to be procured to build out these football field-sized data centers.

Miles of cabling, millions of semiconductors, massive cooling systems, and incredible amounts of electricity are required to power these facilities.

Microsoft Cloud Data Center, Cheyenne, WY | Source: Microsoft

And it’s that last point that’s particularly interesting…

Because it’s how we come to understand the scale of this impending, rapid proliferation of cloud services and data center facilities.

An Unyielding Surge

Data centers will consume anywhere from 10 to 50 times more electricity per square foot than an equivalent commercial office building.
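To make that power-density range concrete, here is some rough arithmetic. The office baseline of roughly 15 kWh per square foot per year and the football-field floor area are assumptions for illustration, not figures from this issue:

```python
# Rough, illustrative arithmetic only. The 10x and 50x multipliers come
# from the range above; the office baseline is an assumed ballpark figure.

OFFICE_KWH_PER_SQFT_YEAR = 15    # assumed commercial-office baseline
FLOOR_AREA_SQFT = 57_600         # roughly a football field (360 ft x 160 ft)

for multiplier in (10, 50):
    annual_kwh = OFFICE_KWH_PER_SQFT_YEAR * multiplier * FLOOR_AREA_SQFT
    avg_megawatts = annual_kwh / (8_760 * 1_000)   # kWh/year -> average MW
    print(f"{multiplier}x office density: "
          f"{annual_kwh / 1e6:.1f} GWh/year, ~{avg_megawatts:.1f} MW average draw")
```

Even under these toy assumptions, a single football-field-sized facility lands in the multi-megawatt range of continuous draw, which is why siting decisions revolve around power.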

They consume electricity like nothing else. It is literally the most important operational input and cost required in operating a data center. 

And yet, unless you’re in the business, no one thinks about it.

Many complain about cryptocurrencies… and how they require huge amounts of electricity and emit CO2 to run.

And yet I’ve never heard a complaint about the CO2 emissions required to power the software behind Facebook, Instagram, TikTok, Netflix, or your choice of mobile game.

The reality is that there is no effort whatsoever to reduce electricity consumption when it comes to software services. The opposite is true. The goal is to increase consumption, which naturally requires increased production of electricity.

The forecasts for energy demand from data centers, recently published by the International Energy Agency (IEA), really highlight this point…

The electricity required to power these data centers is expected to double in just the next three years.

And not surprisingly, the unyielding surge in AI applications is the dominant catalyst for this incredible doubling in such a short period of time.

It is the definition of exponential growth — like a Moore’s Law for electricity demand, instead of for packing more transistors into smaller and smaller semiconductors.

To state the obvious, AI won’t run without electricity. 

And it won’t run without scaling up the computational power of data centers around the world. 

AI is a different beast.

It’s not like running Slack, or LinkedIn, or Google Mail.

Hitting Hypergrowth

I’ve been chuckling quite a bit at the pundits who proclaim that “true” AI hasn’t arrived yet. Or that the current AI software isn’t really AI, it’s just an “illusion” — it’s just software code.

The opposite is true.

Today’s AI technology has the ability to train multi-billion-parameter models on massive data sets, learn from that data, and generate meaningful outputs.

Whether those outputs are paragraphs, entire books, documents, photos, 3D images, or videos — the AI is creating something from the knowledge that it gained from its learning.

And the kind of technology developed by Tesla for autonomous driving is based on deep neural networks. The software has the ability to “think on the fly,” like us humans do when driving a car.

From that training, a Tesla has the ability to infer the correct actions based on what it has learned from billions of real-world miles driven by Teslas.

Tesla’s full self-driving (FSD) software has the ability to ingest real time data taken from its cameras around the car, analyze that data, and “decide” what to do. 

This all happens in a split second. It’s extraordinary. I test the technology every week in my own Tesla, and the improvements from month to month are incredible.

Source: Whole Mars Catalog

This is very different from the large language models (LLMs), like ChatGPT, that “train” on a large fixed data set… then generate outputs based on that “learned” knowledge when prompted. ChatGPT is unable to ingest real-time information, so its outputs are naturally limited.

Robotic arms and humanoid robots are now using a form of AI called reinforcement learning to learn by trial and error how to perform a task. 

We explored an example of this in my recent January 25, 2024 issue of Outer Limits, which you can find here.

In this case, an AI — manifested in a robotic form — is given a task to master without instructions on how to do so. It has to learn by itself by making a series of mistakes until it learns the best way to complete the task.

Said another way, the AI is not pre-programmed on how to perform the task by humans.
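The trial-and-error loop described above can be sketched in a few lines. This is a toy stand-in, not software any robotics company actually runs: a simple three-armed bandit learner, where the payoff values, exploration rate, and learning rate are all illustrative assumptions:

```python
# A minimal sketch of trial-and-error (reinforcement) learning: the agent
# is never told which action is "correct" -- it discovers the best one by
# trying actions, observing rewards, and updating its own estimates.

import random

random.seed(0)

TRUE_PAYOFFS = [0.2, 0.5, 0.9]   # hidden from the agent; action 2 is best
q_values = [0.0, 0.0, 0.0]       # the agent's learned estimate per action
EPSILON, ALPHA = 0.1, 0.1        # exploration rate, learning rate

for _ in range(5_000):
    # Explore occasionally; otherwise exploit the current best estimate.
    if random.random() < EPSILON:
        action = random.randrange(3)
    else:
        action = q_values.index(max(q_values))
    reward = 1.0 if random.random() < TRUE_PAYOFFS[action] else 0.0
    # Nudge the estimate toward the observed reward (the "learning" step).
    q_values[action] += ALPHA * (reward - q_values[action])

print("Learned action values:", [round(q, 2) for q in q_values])
print("Best action found:", q_values.index(max(q_values)))
```

Nothing in the loop encodes which arm pays off; the estimates converge toward the hidden payoffs purely through repeated mistakes and corrections, which is the essence of the approach.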

None of these advancements can happen without copious consumption of electricity.

For perspective, the additional electricity required by this doubling over the next three years is equivalent to the total electricity consumption of Japan.

And just imagine: the increase in data centers’ electricity requirements over the three years that follow will be even larger still. That’s the definition of exponential growth.
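The compounding at work here is easy to make concrete. A minimal sketch, assuming a baseline of roughly 460 TWh for 2022 (in line with the IEA estimates the article cites) and a clean doubling every three years; the out-year totals are illustration, not a forecast:

```python
# The mechanics of repeated doubling, using an assumed 2022 baseline.
# Note how each three-year increment is larger than the one before it.

baseline_twh = 460                    # assumed 2022 data-center consumption
for period in range(4):               # 2022, 2025, 2028, 2031
    year = 2022 + 3 * period
    total = baseline_twh * 2 ** period
    growth = total - baseline_twh * 2 ** (period - 1) if period else 0
    print(f"{year}: ~{total:>5} TWh/year  (increase over prior 3 years: {growth} TWh)")
```

The first doubling adds 460 TWh, the next adds 920, and so on. That widening gap between successive increments is exactly the point the article is making about exponential growth.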

Electricity consumption is one of the best confirmations of the scale and growth of this unyielding need for computational power and electricity to fuel artificial intelligence. It’s the canary in the coal mine.

For cloud service providers, the availability and cost of electricity are the most important factors in choosing a location for a new data center.

If they get this wrong, they will have either scaling problems, or operational cost problems… or both.

We’ve now entered a period of hypergrowth in data centers driven by the explosion of artificial intelligence.

We can see it in the data. The electricity consumption — and the demand for AI-specific semiconductors — tells the tale.

And not surprisingly, this is where we’ll find some of our best investment opportunities over the next few years.

What do you think of this issue of Outer Limits? As always, we welcome your feedback and questions, and look forward to them. We read each and every email and address common questions in the Friday AMA issues. Please write to us by clicking here.
