
Without China's State Grid, There Would Be No "Domestic Lobsters"

This article examines China's electricity cost competitiveness against the backdrop of surging demand for AI computing power. The rapid rise of the OpenClaw project has raised concerns about compute and electricity consumption, with global data center electricity use expected to account for 1.5%-2% of total electricity consumption by 2025. China's power advantage is considered a key factor in the AI competition, but it brings both opportunities and challenges.
In early 2026, OpenClaw (commonly known as "Little Lobster") quickly became popular on GitHub, and by March, it became one of the fastest-growing open-source projects in GitHub's history. From geek players to ordinary users, from programmers to retired seniors, "raising lobsters" swept across major platforms almost overnight, even becoming a "fashion item" for young people, sparking a new trend of eager experimentation.
Having Agents do work for people does not come cheap. Completing a single task involves dozens of consecutive reasoning and decision-making steps, so the compute consumed can be dozens of times that of an ordinary conversation. For example, GPT-5.4 requires $80 to respond with a "Hi." The more widespread Agents become, the greater the demand for computing power, and electricity consumption rises sharply with it.
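The "dozens of times" claim can be illustrated with simple arithmetic. The sketch below uses purely hypothetical numbers (not measured figures from any model): each agent step pays its own tokens plus a context that grows as the agent re-reads prior tool outputs.

```python
# Back-of-envelope sketch with hypothetical numbers: why a multi-step
# agent task can consume dozens of times the tokens of a single chat.

CHAT_TOKENS = 1_500  # assumed tokens in one ordinary Q&A exchange


def agent_tokens(steps: int, tokens_per_step: int, context_growth: int) -> int:
    """Each step costs its own tokens plus a steadily growing context."""
    total, context = 0, 0
    for _ in range(steps):
        total += tokens_per_step + context
        context += context_growth
    return total


task = agent_tokens(steps=20, tokens_per_step=2_000, context_growth=100)
print(task, round(task / CHAT_TOKENS))  # 59000 tokens, ~39x a single chat
```

Under these assumed parameters, a 20-step task consumes roughly forty times the tokens of one chat turn; longer tasks or faster context growth push the multiple higher still.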
A senior AI researcher stated that the consumption of tokens will increasingly be embedded in the underlying logic of economic operations. In the future, the vast majority of economic activities will be completed in the form of token consumption. Meanwhile, electricity is becoming an increasingly critical variable in this AI competition.
Historically, China's electricity costs have maintained strong competitiveness among major economies, with even Elon Musk and Jensen Huang publicly stating multiple times that "China may win the AI race due to its energy and electricity advantages."
However, the "electricity advantage" is just a vague statement; behind this phrase lies a unique energy and electricity landscape in China, with both opportunities and challenges.

AI Computational Power is Racing, Global Electricity Demand is Rising
With the explosion of generative AI, it is consuming global computational resources at an unprecedented speed.
According to data from the China Academy of Information and Communications Technology, by June 2025 the total scale of global computing power had reached 4,495 EFlops, up roughly 117% year on year. Within this, the share of intelligent computing power used for AI training and inference rose from about 70% in 2024 to 85%, becoming the main driving force behind compute growth.
This rapid growth in compute demand is translating directly into electricity consumption. According to the International Energy Agency (IEA) report "Energy and Artificial Intelligence," global data center electricity consumption is expected to reach around 415-650 terawatt-hours (TWh) in 2025, approximately 1.5%-2% of total global electricity consumption. AI data centers already account for 30%-40% of that figure, and their share is still rising rapidly.
The IEA pointed out that the United States, China, and Europe are the regions with the highest concentration of global data center electricity consumption, with the United States accounting for about 45%, China about 25%, and Europe about 15%.
The widespread application of Agents like "Little Lobster" is pushing this consumption to a new order of magnitude. Taking OpenClaw as an example, each step consumes computing power and a significant amount of real money. Overseas users have reported that a misconfigured automated task burned through $200 in API fees in a single day.
Moreover, even when running on relatively low-cost models, the continuous invocation of Agents can incur substantial costs: the daily cost of running Kimi is approximately between $5 and $10, with a monthly token budget typically ranging from $150 to $300; if using the Claude API and allowing OpenClaw to run 24/7, the monthly cost could reach between $800 and $1,500.
Gartner predicts that by the end of 2026, about 40% of enterprise applications will embed task-oriented AI agents, a figure that was less than 5% in 2025. As the use of Agents becomes a part of daily enterprise infrastructure, the demand for computing power will also change, gradually transforming into a continuous operational baseline that continuously drives electricity demand from the global power grid.
This rate of growth has already begun to put pressure on the energy industry. Morgan Stanley estimates that from 2025 to 2028, the cumulative power shortfall for data centers in the United States will reach 47 gigawatts, roughly the electricity demand of nine cities the size of Miami. Behind these numbers is a visible competition for energy.
At the same time, the AI computing power ecosystem in different countries is also influencing the cost structure of this energy consumption.
A senior AI researcher stated that the current global large model training system is primarily built on NVIDIA's CUDA software ecosystem, and many large-scale model trainings still heavily rely on NVIDIA GPUs and mature software frameworks. In contrast, domestic GPUs are still in the rapid improvement phase regarding software ecosystems, and are currently more used in inference scenarios.
This also makes open-weight models more practically significant in China. As long as the model weights are open, enterprises can complete inference deployment on local servers or domestic GPUs, thus not having to rely entirely on overseas cloud vendors.
In this model, the cost structure of AI will change: the cost of model inference will no longer be just GPU and cloud service fees, but will increasingly convert to server and electricity costs.
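One way to see this shift: once open weights run on local hardware, the marginal cost of a token is largely the server's power draw. The sketch below makes that concrete; every input (server power, throughput, tariff) is an assumed round number, not a measured figure.

```python
# Sketch: electricity cost per million tokens for self-hosted inference.
# Server power, throughput, and tariff are assumed round numbers.

def electricity_cost_per_million_tokens(server_kw: float,
                                        tokens_per_second: float,
                                        usd_per_kwh: float) -> float:
    """Hours of server time per million tokens, converted to kWh and USD."""
    hours_per_million = 1e6 / (tokens_per_second * 3600)
    kwh_per_million = server_kw * hours_per_million
    return kwh_per_million * usd_per_kwh


# An 8-GPU server drawing ~6 kW, serving ~5,000 tokens/s,
# at an assumed industrial rate of $0.08/kWh:
cost = electricity_cost_per_million_tokens(6.0, 5_000, 0.08)
print(round(cost, 4))  # ≈ $0.0267 per million tokens from electricity alone
```

Under these assumptions the electricity share per million tokens is small in absolute terms, but it scales linearly with the tariff, which is why cheap, stable power becomes a structural input once inference moves in-house.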
For this reason, the importance of electricity costs in China's AI computing power system is further amplified.
The Energy Competition Behind AI Is Not Just About Cheap Electricity
With the popularity of Agents and continuous multi-step reasoning, token consumption has increased sharply, and behind the computing power is a continuous draw on electricity. In this context, can China's advantage of low-cost tokens (referring to the API call prices offered by domestic AI large models) be maintained in the future?
Xiong Yuxuan, an assistant professor at the School of Artificial Intelligence Education at Central China Normal University, said: "From the perspective of energy and electricity, there is a structural cost advantage behind China's low-cost tokens. This advantage comes primarily from the scale and cost of China's overall power system."
From a scale perspective, China already has the largest power system in the world. By 2025, China's cumulative installed power generation capacity had reached 3.89 billion kilowatts, and total electricity consumption exceeded 10 trillion kilowatt-hours for the first time, firmly holding the world's top position. That 10 trillion kilowatt-hours is more than twice the annual electricity consumption of the United States and exceeds the combined annual consumption of the European Union, Russia, India, and Japan. This large-scale generation capacity provides a solid supply foundation for energy-intensive AI data centers.
On March 5, 2026, the "2026 State Council Government Work Report" proposed to "implement new infrastructure projects such as ultra-large-scale intelligent computing clusters and computing-energy synergy, strengthen national integrated computing power monitoring and scheduling, and support the development of public clouds." Under the policy framework of deep synergy between computing power and energy, China's data centers and AI computing power infrastructure can more effectively match with the power system.
At the same time, China has long ranked among the top in the world in terms of installed capacity in clean energy fields such as hydropower, wind power, and photovoltaics. In some regions, the abundant energy supply also allows data centers to obtain relatively low electricity costs.
"While reducing electricity costs, this energy structure also helps to reduce carbon emissions," said Xiong Yuxuan. "In the future, China's low-cost token will still have certain competitiveness."
A senior power grid expert stated that what truly constitutes a structural moat is not only the scale of energy but also China's systematic capabilities in power system scheduling and long-distance transmission systems.
China's power energy distribution has a natural structural problem: resources are in the west, while demand is in the east, separated by thousands of kilometers. The core project to solve this contradiction is the ultra-high voltage (UHV) transmission network.
Currently, China has built 44 cross-regional transmission channels and possesses the highest voltage levels and the longest ultra-high voltage transmission lines in the world. By the end of 2025, west-to-east power transmission capacity had exceeded 340 million kilowatts, supporting about one-fifth of the electricity demand in the eastern and central regions. Low-cost wind and solar power from Xinjiang, Gansu, and Inner Mongolia can be transmitted efficiently to the compute-intensive eastern regions, allowing data centers to obtain stable and sufficient power without relying on high-priced urban grids.
This "East Data West Computing" model is innovative on a global scale, and currently, only China is systematically promoting it as a national project.
Therefore, China has unique advantages in AI energy: a power system large enough to absorb substantial loads, strong ultra-high voltage capabilities for cross-regional scheduling, and the ability to build out infrastructure quickly. The combination of these factors constitutes a genuine strategic barrier at the level of AI computing infrastructure.
The Real Threshold of AI Computing Power: Power Quality and Stability
In addition to scale and distribution, as well as ecological capabilities, there is another factor that is often overlooked: power quality.
Behind China's low electricity prices is a distinctly different energy structure. Li Guanghui, CEO of BraneMatrix AI, stated that the formation mechanism of China's electricity prices is in fact very complex, and that the cost advantage of AI computing power depends not only on electricity prices but is also closely tied to the energy structure and the capabilities of the power system.
The reality is that there are significant differences in the power structure across different regions of China.
For example, some regions are rich in coal power resources, resulting in relatively low electricity prices; some regions have a large installed capacity of renewable energy, but due to limited absorption capacity, there can be periods of electricity surplus; and some regions face challenges in integrating renewable energy into the grid and ensuring stable power supply due to limitations in grid dispatch capabilities. These differences in energy structure give rise to distinct regional characteristics in China's power resources.
Moreover, for AI data centers, regional differences in power resources are not just a matter of price; more important is whether the power supply environment is stable.
Li Guanghui pointed out that GPU clusters have extremely stringent requirements for their power environment. Significant fluctuations in voltage or current can, at the least, degrade equipment performance and, at worst, damage hardware or increase failure rates. This means that even if low-cost electricity is secured, substandard power quality can lead to greater losses. In addition, in regions where power infrastructure is underdeveloped, directly using fluctuating renewable energy can also increase the operational risks to equipment.
Therefore, some large AI data centers are building more complete power infrastructure systems, such as dedicated power dispatch systems, stable power supply equipment, and software-driven power management capabilities, to ensure that GPU clusters can operate in a stable environment for the long term. For instance, data centers of cloud providers like Google and Microsoft often come with independent power systems, which have become an important component of AI computing infrastructure.
From this perspective, in the Agent era, the key issue facing AI computing power is not just the price of electricity, but whether the power system can stably support the long-term operation of large-scale GPU clusters.
Energy Concerns Hanging Over AI Computing Power: Geopolitics
The scale and stability of the power system determine whether AI computing power can operate long-term. However, on a more macro level, the stability of energy supply itself will also affect the operation of this system.
Recently, tensions between the United States and Iran have escalated, drawing attention to the security situation in the Middle East.
Among them, the Strait of Hormuz, an important waterway connecting the Persian Gulf and the Gulf of Oman, is one of the world's most critical energy transportation routes. Most of the oil exported by Middle Eastern oil-producing countries needs to pass through this strait to reach Asia, Europe, and other regions. The crude oil imported by China from Middle Eastern countries such as Saudi Arabia, Iraq, and the UAE is also primarily transported via this route.
If regional conflicts escalate or maritime transport routes are obstructed, it could not only drive up global oil prices but also transmit through the international energy market to affect prices of natural gas and other energy sources, thereby impacting the overall energy cost structure.
As of 2025, China's dependence on imported natural gas is about 40%-45%, with major sources including Russian pipeline gas, Central Asian natural gas pipelines, and LNG (liquefied natural gas) from countries like Qatar and Australia. Although the proportion of natural gas in China's overall power generation structure is not high, it plays an important role in peak shaving within the power system.
If China's energy structure changes in the future while demand for AI computing power continues to grow rapidly, whether the power supply system will face new pressures is a major concern for the industry. Senior power grid experts note that although China's energy system is vast, external dependence remains for certain resources, such as oil and natural gas, which must still be imported through international markets; some of these are used for power generation and other energy purposes. With changes in the international geopolitical landscape and energy markets, the stability of these external energy sources may also fluctuate.
The energy issues behind this AI competition still have many unresolved aspects.
Risk Warning and Disclaimer
The market has risks, and investment requires caution. This article does not constitute personal investment advice and does not take into account the specific investment goals, financial situation, or needs of individual users. Users should consider whether any opinions, views, or conclusions in this article align with their specific circumstances. Investment based on this article is at one's own risk.
