Track Hyper | Will Samsung be able to take the lead in the CXL market as planned?
An intense battle, with Samsung at a slight disadvantage
Author: Zhou Yuan / Wall Street News
The rapid rise of generative artificial intelligence (GenAI) applications has not only propelled NVIDIA past Samsung Electronics to become the world's second-largest semiconductor maker by revenue, but has also filled SK Hynix's coffers through its supply of HBM (High Bandwidth Memory) for NVIDIA's AI accelerator cards.
Thanks to a correct 2021 bet on HBM's future market potential, SK Hynix now dominates the HBM market with the world's largest share. Compared with second-place Samsung Electronics, its lead is commanding.
Having realized it had fallen behind SK Hynix in the HBM market, Samsung Electronics launched a tactical counterattack, hoping to reclaim the global number-one position SK Hynix now occupies.
Samsung Electronics' tactics in the GenAI field run in two directions: first, developing the Mach series of AI chips (Mach-1 is in development, with Mach-2 planned) to compete with NVIDIA; second, promoting the CXL (Compute Express Link) memory standard, so that both NVIDIA's accelerators and its own Mach series can pair high memory capacity with improved data transfer rates.
SK Hynix Leads: Samsung's Countermeasures?
NVIDIA is the world's largest buyer of HBM; this memory has become a key component of AI graphics processing units. SK Hynix has been the sole supplier of HBM3, the current-generation memory chip for NVIDIA's cards, and holds a 75%-80% share of the global HBM3E market.
SK Hynix has already begun mass production of the next-generation HBM3E (fifth-generation HBM), and Micron Technology has joined it; both companies recently announced that their entire HBM3E output for this year has been booked by NVIDIA. By comparison, Samsung's HBM3E supply to NVIDIA is still in qualification testing.
HBM is a stack of multiple DRAM dies interconnected vertically through fine conductive channels called Through-Silicon Vias (TSVs). The first HBM chip was introduced in 2014.
HBM capacity has grown from 1GB to 24GB per stack, bandwidth from 128GB/s to 819GB/s, and per-pin data rates from 1Gbps to 6.4Gbps.
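The bandwidth figures above follow directly from the per-pin data rate and the width of the memory interface; a minimal sketch, assuming the 1024-bit bus width that HBM stacks are known for:

```python
def hbm_peak_bandwidth_gbs(pin_rate_gbps: float, bus_width_bits: int = 1024) -> float:
    """Peak bandwidth (GB/s) = per-pin data rate (Gbps) x bus width (bits) / 8 bits per byte."""
    return pin_rate_gbps * bus_width_bits / 8

# First-generation HBM: 1 Gbps per pin on a 1024-bit interface
print(hbm_peak_bandwidth_gbs(1.0))  # 128.0 GB/s
# HBM3: 6.4 Gbps per pin on the same 1024-bit interface
print(hbm_peak_bandwidth_gbs(6.4))  # 819.2 GB/s
```

The jump from 128GB/s to 819GB/s thus comes entirely from raising the per-pin rate, since the 1024-bit stack interface has stayed constant across HBM generations.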
As packing more transistors onto a limited die area has become ever harder, the industry has, since around 2010, adopted stacking to push the performance of various chips (including 3D NAND) beyond planar limits.
As of March 2024, only three suppliers worldwide can stably supply HBM3 products: South Korea's SK Hynix, Samsung Electronics, and Micron Technology; of these, SK Hynix and Micron can stably supply HBM3E. Before 2020, Samsung Electronics led this technology, launching the HBM2 chip globally in 2015. But in 2019 Samsung disbanded its HBM business and technology team, and in 2021 SK Hynix overtook it to become the world's first manufacturer to offer HBM3. SK Hynix seized the opportunity and became the global leader in HBM.
ChatGPT's debut on November 30, 2022 made HBM the most important high-bandwidth memory chip for NVIDIA's AI accelerator cards. That year, SK Hynix accounted for 50% of global HBM shipments, Samsung 40%, and Micron 10%.
In the HBM3E sub-market, however, SK Hynix holds a global share of 75%-80%, while Samsung Electronics has yet to qualify as a supplier of this next-generation chip to NVIDIA.
As a result, SK Hynix has achieved a significant strategic victory.
Samsung Electronics responded forcefully. In January and March of this year it set up two HBM technology teams in succession, and on March 19, at Memcon 2024, a conference of global chipmakers, Samsung Vice President and DRAM product and technology chief Hwang Sang-joong revealed that Samsung will mass-produce 12-layer fifth-generation HBM (HBM3E).
At the same time, Hwang unveiled an HBM technology roadmap projecting that HBM shipments in 2026 will reach 13.8 times the 2023 level, rising further to 23.1 times by 2028.
According to the roadmap, Samsung's main HBM products in 2026 will be the fifth-generation HBM3E and the sixth-generation HBM4, with the former significantly increasing the number of stacked layers. At the conference, Samsung showcased the HBM3E 12H chip, the industry's first 12-layer HBM3E stack and the highest-capacity HBM in the technology's history.
Public information shows that Samsung Electronics is providing NVIDIA with HBM3E 12H chip samples and plans to start mass production in the first half of 2024.
Furthermore, with the longest and most complete semiconductor industry chain in the world, Samsung Electronics is mounting a tactical counterattack in the GenAI market aimed not only at SK Hynix but also, more indirectly, at stepping up competitive pressure on NVIDIA.
Kyung Kye-hyun, head of Samsung's semiconductor business, stated in late March that Samsung is developing the next-generation AI chip, Mach-1, with the aim of taking a slice of NVIDIA's AI accelerator card market.
According to Samsung, Mach-1 is an AI accelerator built as a System-on-Chip (SoC) that eases the data-transfer bottleneck between the GPU and HBM. Naver, the Korean search giant and a downstream NVIDIA customer, has adopted the product in a contract worth as much as $752 million.
CXL Market: Uncertain Future
Given its strategic lag in HBM, Samsung Electronics can only count on a technological breakthrough to retake the global number-one position from SK Hynix.
Samsung Electronics' ace in the hole is CXL technology.
At Memcon 2024, Samsung's Executive Vice President Han Jin-man revealed Samsung's CXL technology and vision.
CXL is an open standard for high-speed, high-capacity connections between central processing units (CPUs) and devices, and between CPUs and memory, designed for high-performance data-center computing.
Because CXL is technically complex (it builds on PCIe, the Peripheral Component Interconnect Express), only its function is discussed here: memory attached via CXL can escape the performance and slot-packaging limits of conventional DIMMs while reaching high capacities. Its applications span both hardware and software, covering servers, storage products, and accompanying solutions.
In early 2023, AMD's fourth-generation EPYC (codenamed Genoa) and Intel's fourth-generation Xeon Scalable (codenamed Sapphire Rapids) processor platforms laid the hardware foundation for CXL applications. Hardware players include Samsung, Micron, Intel, Astera Labs, and Taiwan's SMART Modular Technologies; on the software side are mainly cloud-service and system-software companies such as Elastics.cloud, IntelliProp, and Israel's UnifabriX.
Among them, Samsung Electronics is at the forefront of promoting the application of CXL technology.
On May 11, 2021, Samsung announced the industry's first memory module supporting the CXL interconnect standard (based on 128GB DDR5). Combined with Samsung's DDR5 technology, the module expands server memory capacity to the terabyte level and raises bandwidth, accelerating AI and high-performance computing (HPC) workloads in data centers (IDCs).
Samsung's CXL memory module also integrates various controllers and software technologies such as memory mapping, interface conversion, and error management, enabling CPUs or GPUs to recognize CXL-based memory and use it as main memory.
Samsung subsequently released the world's first CXL memory module equipped with 512GB of DDR5 DRAM, quadrupling capacity and cutting latency by 20% compared with the previous generation.
On May 12, 2023, Samsung announced the industry's first 128GB DRAM supporting CXL 2.0, with a PCIe 5.0 x8 interface delivering up to 35GB/s of bandwidth; the product reached a milestone by running on Intel's Xeon platform. Compared with HBM, CXL addresses not only bandwidth but also capacity expansion, much as chiplet technology resolves the transistor-count bottleneck on a limited die area.
In general, CXL-compatible memory has three defining characteristics: high bandwidth, low latency, and scalability. So although HBM remains the mainstay for now, its limited scalability will constrain its longer-term prospects.
In fact, SK Hynix and Micron also see CXL's potential, but both trail Samsung in this technology: SK Hynix showed its first CXL 2.0 product in September 2023, about four months after Samsung, while Micron was roughly two months ahead of SK Hynix yet still about two months behind Samsung.
Samsung aims to overtake SK Hynix through CXL technology, but its current lead is slim, and it is still hard to say who will ultimately dominate this new market.