NVIDIA's AI dominance stands unshaken: why can no giant topple it from its throne?

Zhitong
2024.08.12 07:02

NVIDIA is renowned for its outstanding performance in AI chip manufacturing, but its core design is a business barrier built on the tight integration of software and hardware, which effectively keeps competitors out. Through its software platform, CUDA, NVIDIA solved the problem of running non-graphics software on its specialized chips and has thereby maintained a dominant position in the AI market. Competitors are racing to develop software that can bypass the barriers NVIDIA has set. In the long run, NVIDIA's market dominance will rely more on its coding capabilities than on its circuit design.

According to the financial news app Zhitong Finance, NVIDIA (NVDA.US) is renowned for its outstanding performance in AI chip manufacturing. The company's core design, however, is a business barrier built on the tight integration of software and hardware, one that keeps customers in and competitors out.

Over the past twenty years, NVIDIA has carefully built a "walled garden" in the tech industry, similar to the ecosystem created by Apple (AAPL.US). Apple's ecosystem mainly targets consumers, while NVIDIA focuses on serving developers who use its chips to build artificial intelligence systems and other software.

This closed-system design explains why NVIDIA has been able to maintain its dominant position in the AI market despite fierce competition from other chip manufacturers and tech giants such as Google (GOOGL.US) and Amazon (AMZN.US), and why it is unlikely to lose much market share in the coming years.

Looking ahead, the competition for NVIDIA's market dominance will increasingly focus on the company's coding capabilities, not just circuit design. Competitors are racing to develop software that can bypass the software barriers set by NVIDIA.

CUDA: The Cornerstone of the "Walled Garden"

The key to understanding NVIDIA's "walled garden" is its software platform, CUDA. Since its launch in 2007, CUDA has solved a problem that other companies failed to crack: how to run non-graphics software, such as encryption algorithms and cryptocurrency mining, on NVIDIA's specialized chips, which were designed for compute-intensive applications like 3D graphics and video games.

CUDA not only supports diverse computing tasks on these Graphics Processing Units (GPUs), but also enables AI software to run on NVIDIA chips. The explosive growth of AI software in recent years has propelled NVIDIA to become one of the most valuable companies in the world.

More importantly, CUDA is still evolving. Year after year, NVIDIA meets the needs of software developers by releasing specialized code libraries that make tasks run dramatically faster on NVIDIA GPUs than on traditional general-purpose processors such as those from Intel (INTC.US) and AMD (AMD.US).
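To make the idea concrete, here is a minimal sketch (illustrative only, not drawn from NVIDIA's libraries or from the article) of what general-purpose GPU code looks like under CUDA: a kernel that adds two arrays in parallel across thousands of GPU threads, the kind of non-graphics workload CUDA first made practical on chips built for games.

```cuda
#include <cuda_runtime.h>
#include <cstdio>

// Each GPU thread handles one array element; the same data-parallel pattern
// underlies far larger workloads such as the matrix math behind AI training.
__global__ void vector_add(const float* a, const float* b, float* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;                        // about one million elements
    float *a, *b, *out;
    cudaMallocManaged(&a, n * sizeof(float));     // unified memory keeps the sketch short
    cudaMallocManaged(&b, n * sizeof(float));
    cudaMallocManaged(&out, n * sizeof(float));
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    vector_add<<<blocks, threads>>>(a, b, out, n);   // launch across thousands of threads
    cudaDeviceSynchronize();

    printf("out[0] = %.1f\n", out[0]);               // expected: 3.0
    cudaFree(a); cudaFree(b); cudaFree(out);
    return 0;
}
```

NVIDIA's specialized libraries, such as cuBLAS, cuDNN, and TensorRT, layer highly tuned routines on top of this programming model, which is where much of the speed advantage over general-purpose CPUs comes from.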

The Importance of Full-Stack Computing and Software Platforms

The importance of NVIDIA's software platform also explains why the company has for years put more resources into recruiting software engineers than hardware engineers. CEO Jensen Huang recently emphasized that NVIDIA focuses on "full-stack computing," combining hardware and software across everything from the chips themselves to the software used to build AI.

Whenever a competitor announces an AI chip intended to compete with NVIDIA, it is effectively competing against a system that has been in use for more than 15 years and for which an enormous amount of code has already been written. That software is difficult to port to competitors' systems, and this is where NVIDIA's coding capabilities truly shine.
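To illustrate why porting is painful, here is a hypothetical snippet (the matrix sizes and values are invented for illustration) of the kind of application code that accumulates inside the "walled garden": a matrix multiplication written directly against NVIDIA's cuBLAS library. Moving it to another vendor means rewriting this section against a different library and re-validating results and performance.

```cuda
#include <cublas_v2.h>
#include <cuda_runtime.h>
#include <vector>
#include <cstdio>

// Build with: nvcc example.cu -lcublas
int main() {
    const int n = 512;                                    // illustrative square matrices
    std::vector<float> hA(n * n, 1.0f), hB(n * n, 2.0f), hC(n * n);

    float *dA, *dB, *dC;
    cudaMalloc(&dA, n * n * sizeof(float));
    cudaMalloc(&dB, n * n * sizeof(float));
    cudaMalloc(&dC, n * n * sizeof(float));
    cudaMemcpy(dA, hA.data(), n * n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(dB, hB.data(), n * n * sizeof(float), cudaMemcpyHostToDevice);

    // The cuBLAS handle and Sgemm call exist only in NVIDIA's stack; on other
    // hardware this code has to be rewritten against a different library.
    cublasHandle_t handle;
    cublasCreate(&handle);
    const float alpha = 1.0f, beta = 0.0f;
    cublasSgemm(handle, CUBLAS_OP_N, CUBLAS_OP_N, n, n, n,
                &alpha, dA, n, dB, n, &beta, dC, n);      // C = A * B, column-major

    cudaMemcpy(hC.data(), dC, n * n * sizeof(float), cudaMemcpyDeviceToHost);
    printf("C[0] = %.1f\n", hC[0]);                       // expected: 1024.0

    cublasDestroy(handle);
    cudaFree(dA); cudaFree(dB); cudaFree(dC);
    return 0;
}
```

Multiply this by hundreds of such library calls spread across a large codebase, and the switching cost for an established CUDA user becomes clear.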

At its shareholder meeting in June, NVIDIA announced that CUDA now includes more than 300 code libraries and 600 AI models, supporting 3,700 GPU-accelerated applications used by more than 5 million developers at roughly 40,000 companies.

Market Forecast and Competitive Situation

The vast scale of the AI computing market has prompted multiple companies to join forces against NVIDIA. Atif Malik, a semiconductor and networking equipment analyst at Citigroup Research, predicts that the AI chip market will reach $400 billion annually by 2027. By comparison, NVIDIA's revenue for its fiscal year ended in January was approximately $61 billion.

Bill Pearson, Vice President of AI Cloud Customers at Intel, stated that industry collaborations are mainly focused on developing open-source alternatives to CUDA. Intel engineers are involved in two such projects, one of which involves collaboration with companies such as ARM, Google, Samsung, and Qualcomm. OpenAI, the company behind ChatGPT, is also developing its own open-source projects.

Investors are flocking to startups dedicated to developing CUDA alternatives. Part of the rationale behind these investments is the prospect that engineers across the world's tech giants will collectively make it possible for companies to use whichever chips they want, without paying what some in the industry call the "CUDA tax."

Intensifying Competition

In AI chips, NVIDIA still holds a commanding lead, but the wave of competition is intensifying. Startup Groq recently raised $640 million at a valuation of $2.8 billion to develop chips that can rival NVIDIA's. Alongside the rise of open-source software, challengers like this are bringing new vitality and possibilities to the industry.

Not only startups, but tech giants are also actively positioning themselves. Google and Amazon are independently developing AI training and deployment chips, while Microsoft (MSFT.US) announced its entry into this field in 2023. These initiatives not only challenge NVIDIA's market position but also drive industry innovation.

In this competition, AMD has become one of NVIDIA's strongest rivals with its Instinct line of AI chips. Andrew Dieckman, an executive vice president at AMD, said that although AMD still trails NVIDIA in market share, the company is investing heavily in software engineers to expand its software resources and narrow the gap. Last month, AMD announced the $665 million acquisition of Silo AI, further strengthening its AI research and development capabilities.

Two of NVIDIA's major customers, Microsoft and Meta Platforms, have also begun buying AMD's AI chips, reflecting large buyers' desire to diversify suppliers and foster competition at the high end of the market.

However, NVIDIA's market barriers are not insurmountable. Babak Pahlavan, CEO of startup NinjaTech AI, says he would have preferred to use NVIDIA's hardware and software had the cost allowed it. But faced with the shortage and high price of NVIDIA's H100 chips, NinjaTech AI turned to Amazon, which offers its own AI training chip, Trainium. After months of effort and collaboration, NinjaTech AI successfully trained its AI models on Trainium and in May launched an AI "agent" that now has more than 1 million monthly active users, all supported by models trained and run on Amazon's chips.

The transition was not easy. Pahlavan admits there were many challenges and mistakes along the way, and Amazon Web Services director Gadi Hutt acknowledges that both sides made mistakes early in the collaboration, though things are now on the right track. Amazon's AI chip customer base is expanding and includes companies such as Anthropic, Airbnb, Pinterest, and Snap. Although Amazon gives customers the option of using NVIDIA chips, those chips cost more, and converting customers to Amazon's own silicon takes time.

NinjaTech AI's experience illustrates one of the main reasons why startups like it face challenges in developing AI outside of NVIDIA's "walled garden": cost. Pahlavan stated that to support over a million users per month, NinjaTech's monthly cloud service costs on Amazon are around $250,000. In contrast, running the same AI on NVIDIA chips would cost between $750,000 and $1.2 million.

NVIDIA's Response and Future

Faced with these competitive pressures, NVIDIA is fully aware of the high costs of purchasing and running its chips. CEO Jensen Huang has pledged that NVIDIA's next-generation artificial intelligence chips will focus on reducing the cost of training AI on the company's hardware.

Citi Research's Malik expects that in the next two to three years, NVIDIA's market share in the AI-related chip market will remain around 90%. This indicates that despite facing competition, NVIDIA's leading position remains solid.

For the foreseeable future, NVIDIA's fate will rest on the same inertia that has historically kept companies and customers locked in: the "walled garden" effect.