SK hynix took center stage at CES 2026 this Tuesday to reveal its latest weapons for the artificial intelligence era. The star of the show is the world’s first 16-high 48GB HBM4 chip. This massive memory breakthrough sets a new bar for how much data AI systems can handle at once.
This new HBM4 chip replaces the previous 12-high 36GB version, which already moved data at a blistering 11.7 gigabits per second. Building a 16-layer stack is incredibly difficult. To make it work, engineers had to squeeze the layers closer together, stop the silicon wafers from warping, and line everything up with perfect precision. They also had to solve complex cooling problems, as stacking more layers typically creates more heat.
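Those per-pin speeds add up to enormous aggregate bandwidth. Here is a rough back-of-the-envelope sketch; it assumes the quoted 11.7 gigabits per second is a per-pin rate and that each HBM4 stack exposes a 2048-bit interface (per the JEDEC HBM4 direction), neither of which is stated in the announcement itself:

```python
# Back-of-the-envelope HBM4 bandwidth estimate.
# Assumptions (not from the article): 11.7 Gb/s is a per-pin rate,
# and one HBM4 stack has a 2048-bit interface.
PIN_RATE_GBPS = 11.7       # gigabits per second, per pin
BUS_WIDTH_BITS = 2048      # pins (interface width) per stack

bandwidth_gbits = PIN_RATE_GBPS * BUS_WIDTH_BITS   # gigabits/s per stack
bandwidth_gbytes = bandwidth_gbits / 8             # gigabytes/s per stack

print(f"~{bandwidth_gbytes / 1000:.2f} TB/s per stack")  # ~3.00 TB/s
```

Under those assumptions, a single stack approaches 3 terabytes per second, which is why vendors fight so hard for every extra fraction of a gigabit per pin.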
Beyond the new HBM4, SK hynix showed off its current heavy hitter: the 12-high 36GB HBM3E. They displayed this chip inside a GPU module built for Nvidia’s newest AI servers, giving visitors a real-world view of how memory powers the massive computations behind modern AI workloads.
The company also introduced several other tools for the AI industry. They debuted the SOCAMM2, a specialized memory module designed to reduce power consumption in AI servers. For smartphones and laptops, they brought out LPDDR6, which speeds up on-device AI tasks while drawing less battery power than previous generations.
Data centers also got a significant boost. SK hynix unveiled a 2-terabit NAND chip with a staggering 321 layers. This chip offers some of the highest storage density available today. It uses less power and fits more data into smaller spaces, making it a perfect fit for the giant data centers that run the world’s most popular AI programs.
Under its new theme, “Innovative AI, Sustainable Tomorrow,” SK hynix aims to prove it can make AI faster and greener at the same time. The company says it will work closely with global partners to push the boundaries of what these chips can do.