Wallstreetcn
2024.03.27 17:32

Missed out on US AI stocks? Morgan Stanley offers a new idea: buy power stocks

With the computing cost of generative artificial intelligence falling rapidly, a significant mismatch is emerging between fast-growing AI demand and slow-growing electricity infrastructure. Morgan Stanley compares this mismatch to a "tortoise and hare race": although AI is currently sprinting ahead like the hare, the growth in AI demand ultimately depends on electricity supply. More data centers may pay a premium on electricity prices to secure power sooner, so the slow-growing electricity infrastructure sector, the tortoise, may have the better prospects.

Morgan Stanley recently released a research report predicting that, as the computing cost of generative artificial intelligence falls rapidly, a significant mismatch will open up between fast-growing AI demand and slow-growing electricity infrastructure. Morgan Stanley likened this mismatch to a "tortoise and hare race": although AI is currently booming like the hare leading the race, the growth in AI demand depends on electricity supply. More data centers may pay a premium on electricity prices in order to secure power sooner. The outlook for the electricity sector, the tortoise, may therefore be more favorable, and Morgan Stanley raised its price targets on a number of electricity companies accordingly.

Sharp Drop in Computing Cost of Next-Generation AI Chips May Stimulate Upgrade Demand

The research report indicates that the computing cost of generative artificial intelligence will fall rapidly, something the market has somewhat underestimated. Morgan Stanley's data center model shows that in moving from NVIDIA's H100 Hopper GPU to the B100 Blackwell GPU, the capital cost per teraFLOPS (one trillion floating-point operations per second) of data center compute falls by approximately 50%. Morgan Stanley stated that this number, the ratio of a data center's total capital cost to its floating-point operations per second, may help determine whether generative AI business models will generate higher returns on investment.

The research found that in the data center economic model using the Hopper GPU, this figure is approximately $14 per teraFLOPS, while in the Blackwell data center model it falls to $7 per teraFLOPS. This rapid decline in computing cost is made possible by the rapid improvement in NVIDIA GPU power efficiency, so demand from data centers to upgrade to the new technology is expected to increase.
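The metric above can be sketched as a simple ratio. The $14 and $7 figures are from the report; the helper function and its example inputs are illustrative, not Morgan Stanley's model.

```python
# Back-of-envelope check of the capex-per-teraFLOPS figures cited above.
# The metric is a data center's total capital cost divided by its
# delivered compute in teraFLOPS. Dollar figures are from the report;
# the function itself is an illustrative sketch.

def capex_per_teraflops(total_capex_usd: float, compute_teraflops: float) -> float:
    """Capital cost per teraFLOPS of data center compute."""
    return total_capex_usd / compute_teraflops

hopper_cost = 14.0    # $/teraFLOPS, Hopper-based data center (report figure)
blackwell_cost = 7.0  # $/teraFLOPS, Blackwell-based data center (report figure)

decline = 1 - blackwell_cost / hopper_cost
print(f"Generational capex decline: {decline:.0%}")  # 50%
```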

AI Electricity Demand Could Double by 2027

In light of this, Morgan Stanley has revised its estimates and now predicts that global electricity demand from generative AI may double by 2027. Morgan Stanley expects the utilization rate of GPUs/custom chips to rise from 60% to 70%, and anticipates that renewable energy will account for a smaller share of data center electricity, with traditional energy playing a larger role.

Morgan Stanley forecasts that, in its base case, global data center electricity demand in 2024 and 2027 will be approximately 430 and 748 terawatt-hours (TWh) respectively, equivalent to about 2% and 4% of global electricity demand in 2022. The compound annual growth rate (CAGR) of electricity demand from generative AI from 2023 to 2027 is estimated at around 105%, while the CAGR of global data center electricity demand (including generative AI) over the same period is approximately 20%.
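The ~20% figure can be cross-checked against the base-case endpoints: 430 TWh in 2024 growing to 748 TWh in 2027 is three years of compounding.

```python
# Sanity check on the ~20% CAGR for global data center electricity
# demand, using the base-case figures cited above (430 TWh in 2024,
# 748 TWh in 2027, i.e. three years of compounding).

def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two values."""
    return (end / start) ** (1 / years) - 1

growth = cagr(430, 748, 3)
print(f"Implied CAGR 2024-2027: {growth:.1%}")  # ~20%
```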

In an optimistic scenario (reflecting 90% chip utilization), Morgan Stanley predicts global data center electricity demand of around 446 TWh in 2024 and 820 TWh in 2027, while in a pessimistic scenario (reflecting 50% utilization) the figures are around 415 and 677 TWh. Converted to gigawatts (GW), total data center power capacity in 2024 and 2027 is expected to be around 70 GW and 122 GW, respectively.
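The TWh-to-GW conversion is standard: annual energy divided by 8,760 hours gives average load, and dividing by the ~70% chip utilization assumed in the base case recovers roughly the capacity figures above. The utilization input is taken from the report; the conversion logic itself is a sketch.

```python
# Converting annual energy demand (TWh) into installed power capacity (GW).
# Average load in GW = TWh * 1000 / 8760 hours; dividing by utilization
# (the ~70% base-case chip utilization mentioned above) approximates the
# installed capacity needed to deliver that energy.

HOURS_PER_YEAR = 8760

def twh_to_gw(annual_twh: float, utilization: float) -> float:
    """Installed capacity (GW) implied by annual demand (TWh) at a given utilization."""
    avg_load_gw = annual_twh * 1000 / HOURS_PER_YEAR
    return avg_load_gw / utilization

print(f"2024: {twh_to_gw(430, 0.70):.0f} GW")  # ~70 GW
print(f"2027: {twh_to_gw(748, 0.70):.0f} GW")  # ~122 GW
```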

Slow Growth in Power Infrastructure, Data Centers Willing to Pay Premium for Electricity

While the demand for AI power is surging, the growth of power infrastructure is facing challenges. Morgan Stanley pointed out that one of the main concerns is the availability of grid connections to support the new data center capacity, with key issues including limited power line capacity, delays in planning and permitting for new transmission and distribution projects, and supply chain bottlenecks.

For example, according to the Lawrence Berkeley National Laboratory in the United States, upgrading existing transmission lines can take three years or longer due to regulatory obstacles, and the queue time for new projects to connect to the grid has increased from less than 2 years in 2008 to 5 years in 2022. Morgan Stanley therefore sees adding generation and grid capacity in smaller secondary markets as a suitable strategy.

Given the significant capital deployed in these very large new data centers, as well as the rapid pace of AI chip innovation, connecting to the grid as soon as possible carries tremendous value for data centers. To offset this time cost, Morgan Stanley believes more data center developers may be willing to pay a premium for electricity.

The research report gives an example: if a data center developer can secure power two years earlier than its competitors, then assuming a GPU economic life of six years, the developer would be willing to pay a premium of about 101% on the electricity price; assuming a 10-year economic life, a premium of about 61%.
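Both premium figures are consistent with a simple amortization model: the value of starting two years earlier is fixed, and the premium that pays for it is spread over the asset's economic life, so the premium scales with years_early / life (note 101% × 6/10 ≈ 61%). This model, and the implied value multiple of about 3, are inferences from the two quoted numbers, not stated in the report.

```python
# A back-of-envelope model consistent with the 101% / 61% figures:
# premium = years_early * value_multiple / economic_life, where
# value_multiple is the annual value of operating the data center
# relative to its annual electricity bill. The multiple is backed out
# from the 6-year case; this is an inference, not Morgan Stanley's
# published methodology.

def implied_premium(years_early: float, economic_life: float,
                    value_multiple: float) -> float:
    """Electricity price premium a developer would pay to connect early."""
    return years_early * value_multiple / economic_life

VALUE_MULTIPLE = 1.01 * 6 / 2  # ~3.03, backed out from the 6-year, 101% case

print(f"6-year life:  {implied_premium(2, 6, VALUE_MULTIPLE):.0%}")   # 101%
print(f"10-year life: {implied_premium(2, 10, VALUE_MULTIPLE):.0%}")  # ~61%
```

The same multiple reproduces both quoted premiums, which suggests the report's figures come from amortizing a fixed early-start value over the asset's life.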

Ensuring Power? Data Centers Best Located Inside Nuclear Power Plant Fences

Accordingly, Morgan Stanley believes the best place to build data centers is inside the fence of US nuclear power plants, pointing to the partnership between Amazon and Talen Energy, which plans to build a data center of around 960 megawatts inside the "fence" of a nuclear power plant in Pennsylvania.

The report states that nuclear power plants are the most suitable sites because they can supply power to data centers sooner, and nuclear plant sites are usually very large, providing the space needed to build large data centers. Nuclear plants also already have substantial power infrastructure in place, reducing the cost of developing the infrastructure a data center needs. In addition, nuclear plants can supply large amounts of cooling water, which may be an advantage given that newer, hotter-running data centers may require liquid cooling. Dual-unit nuclear plants also provide redundancy if any single unit runs into operational issues. Finally, the nuclear power industry has a very good operational record and is unlikely to experience unplanned shutdowns.

However, Morgan Stanley also acknowledges in the report that if the bottleneck in power infrastructure growth can be resolved, making it easier to develop new data centers, this could in turn reduce the value of existing data centers. But the analysis shows that even for the most promising solution, nuclear power plants, the growth in power supply remains limited relative to data center growth. At the same time, if NVIDIA and other chip makers keep rapidly increasing chip computing capabilities, existing data centers may need frequent upgrades to remain competitive with newly built ones. If data center owners are not fully compensated for these frequent renovations, they may find themselves on a "capital expenditure treadmill." Upgrading old data centers may also pose practical challenges, especially for liquid cooling (water supply may be a problem, although some new liquid cooling technologies use closed-loop systems). And if the business model of generative AI fails to generate the expected profit margins, data center customers may face default risks.