Amazon's annual cloud computing conference opens, with the CEO taking the stage tonight. Can the new AI chip "catch up with Google's TPU"?

Wallstreetcn
2025.12.02 02:22

Tonight's keynote by Amazon's CEO is the highlight of the conference. The market expects the release of the Trainium 3 chip, which reportedly delivers a roughly 40% performance improvement, along with a new self-developed multimodal Nova model. Facing pressure from a cloud market share that has slipped from 49.7% to 45.1%, AWS is shoring up its position by expanding its computing-power collaboration with Anthropic and through a $38 billion deal with OpenAI.

Amazon's annual cloud computing event, the AWS re:Invent conference, opened on December 1 (Monday) in Las Vegas.

Although a series of AI applications and partnerships were announced on the first day, the highlight of the conference is the CEO keynote that follows. The market is holding its breath for the release of the next-generation Nova AI model and the latest developments around Amazon's self-developed chip, Trainium 3, watching closely whether these in-house products are strong enough to compete with fierce rivals like Google and Microsoft and potentially reshape the current market landscape.

These releases are crucial for Amazon. According to a JP Morgan report, despite strong recent revenue growth, AWS's cloud market share has slipped from 49.7% to 45.1%. The company's stock has also pulled back recently, partly on investor concerns about its AI strategy and the timeline for the new chip launch. The performance of Trainium 3 and Nova will be key to determining whether Amazon can maintain its lead among the top AI infrastructure providers.

New Chip Trainium 3: 40% Performance Improvement, But Facing Intense Competition

Whether Trainium 3 can become the "deciding factor" for AWS to remain competitive in the AI era depends on its performance, cost, and market acceptance. The market has high hopes for Trainium 3, but remains cautious.

JP Morgan predicts that Trainium 3 will improve cost-performance by about 40% over its predecessor, Trainium 2. The report also notes market doubts about whether Amazon's self-developed chips can catch up with competitors: Trainium is only in its second to third generation, while Google's TPU is in its seventh generation and NVIDIA's GPUs are roughly in their tenth.

At the same time, the report cites media accounts that some customers have run into technical difficulties when adopting Trainium, and that even engineers at Anthropic, Amazon's key AI partner, lean toward using Google's TPU. The market therefore expects Amazon to provide more specific data on Trainium 3's performance gains at re:Invent and to clarify its go-to-market strategy for expanding the chip's customer base.

In terms of release timing, JP Morgan expects Trainium 3 to launch a preview version by the end of 2025 and achieve larger-scale deployment in early 2026. However, there are also concerns that it may be delayed until the second quarter of 2026. This conference is expected to provide a clearer timeline.

Despite these concerns, the Trainium business has shown strong growth momentum. Amazon CEO Andy Jassy revealed on last month's earnings call that Trainium has become a multi-billion-dollar business, with revenue growing 150% in the latest quarter, driven mainly by Anthropic's usage.

New Model Nova: An "All-Rounder" to Challenge Gemini?

In addition to chips, Amazon's self-developed Nova large model family is another focus of the conference. According to The Information, citing people familiar with the matter, AWS is expected to release a brand-new Nova model: a multimodal model that can process text, speech, images, and video, and generate text and images.

Reports indicate that AWS is positioning the new model as an "all-in-one" product aimed squarely at top models such as Google's Gemini, a key step toward addressing Amazon's shortcomings in frontier models. An AWS salesperson told The Information that the existing Nova models are not yet a "mainstream choice" for complex reasoning tasks in the way OpenAI's or Anthropic's models are.

However, Amazon has disputed claims that its AI models lag behind. A company spokesperson said Nova is the second most popular model family on its Bedrock AI service platform (as of Q2 2025), with more than 10,000 customers including Siemens and Coinbase.

Expanding Cooperation with Anthropic and OpenAI

While developing its own technology, Amazon is also actively managing its complex relationships with external AI giants.

Anthropic is one of Amazon's most important AI allies, and its Claude models lead sales on AWS's Bedrock platform. To support the collaboration, the jointly built AI supercomputer "Project Rainier" is expanding rapidly, with the number of Trainium 2 chips expected to double from roughly 500,000 to more than 1 million by the end of the year. JP Morgan estimates the project could bring AWS roughly $9 billion in annualized revenue by 2026.

However, Anthropic is also actively diversifying its sources of computing power and has struck large-scale deals with AWS's competitors: it agreed to rent cloud servers powered by Google's chips in a deal reportedly worth tens of billions of dollars, and last month announced it would rent Nvidia-powered servers from Microsoft Azure. JP Morgan believes this reflects Anthropic's enormous demand for compute and current supply-chain constraints rather than a move away from AWS.

To hedge its risks and capture market opportunities, Amazon has also extended an olive branch to another AI giant, OpenAI, signing a seven-year, $38 billion agreement under which OpenAI will run and scale its workloads on AWS infrastructure, including hundreds of thousands of Nvidia GPUs provisioned through AWS.

JP Morgan's report describes the commitment as an unambiguous victory for Amazon in the AI computing power market.

Market Dynamics: A Battle to Defend Share?

From an investor's perspective, Amazon's AI story mixes growth opportunities with uncertainty. JP Morgan's report points out that although AWS posted its fastest growth in 11 quarters in the third quarter, Amazon's stock has fallen back to pre-earnings levels, partly on market concerns over Trainium 3's launch timing and Anthropic's deals with other cloud providers.

According to IDC data, AWS's global cloud infrastructure market share has decreased from approximately 49.7% in 2022 to 45.1% in the first half of 2025.

Nevertheless, Wall Street analysts remain optimistic about its prospects. JP Morgan believes that clearer signals on AI strategy at re:Invent should help lift the cloud hanging over Amazon's stock price. The bank's analysts note that demand for AWS remains healthy: its order backlog grew 22% year-on-year to $200 billion in the third quarter, and the backlog added in October alone exceeded that of the entire third quarter. The bank expects AWS's growth to accelerate in 2026.

A Flurry of Day-One Launches, Rolling Out the AI Ecosystem

Before the CEO keynote, a series of announcements on the first day of re:Invent had already demonstrated AWS's determination to push its AI capabilities into a wide range of industries.

According to TechRepublic, these announcements include:

  • Multicloud Networking: AWS has partnered with Google Cloud to launch a service called "AWS Interconnect – multicloud," allowing customers to establish private, high-bandwidth connections between different cloud platforms.

  • Financial Services: BlackRock confirmed that its Aladdin investment technology platform will begin operating on AWS infrastructure for U.S. clients starting in the second half of 2026. Visa also announced a partnership with AWS to enable AI agents to securely complete multi-step transactions.

  • Industry Applications: Lyft is collaborating with AWS and Anthropic to provide AI support for drivers using the Claude model, reducing problem-solving time by 87%. Nissan is accelerating the development of its software-defined vehicle platform on AWS.

These intensive announcements show that AWS is trying to solidify its leadership in the cloud computing market by building a broad and deep AI ecosystem and translating it into tangible business growth.