High bandwidth memory (HBM) has been attracting users across a growing number of data-intensive computing markets ever since the second-generation HBM2 version of the technology was approved as an industry standard in January 2016. Samsung started manufacturing HBM2 dynamic random-access memory (DRAM) that same month, and we are excited by the broad support we’re seeing for this groundbreaking memory technology.
HBM2 comes in memory cubes containing up to eight vertically stacked 8-gigabit (Gb) DRAM chips connected internally by as many as 40,000 tiny “through silicon via” (TSV) data paths. A wide 1024-bit data interface provides unprecedented memory bandwidth, with each DRAM stack capable of transferring up to 256 gigabytes (GB) of data per second. The compact, energy-efficient architecture also takes up far less circuit board space than traditional memory modules, making it attractive for space-constrained designs.
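The 256 GB/s figure follows directly from the interface width. As a back-of-the-envelope check, the short sketch below derives it, assuming a 2 Gbps per-pin data rate (only the 1024-bit width and the 256 GB/s result come from the text above; the per-pin rate is an assumption):

```python
# Rough check of the per-stack HBM2 peak bandwidth quoted above.
# The 1024-bit interface width is from the text; the 2 Gbps per-pin
# data rate is an assumed HBM2 speed grade, not stated in the post.

BUS_WIDTH_BITS = 1024   # HBM2 data interface width per stack
PIN_RATE_GBPS = 2       # assumed transfer rate per pin, in Gbit/s

# Multiply width by per-pin rate, then convert bits to bytes.
peak_gb_per_s = BUS_WIDTH_BITS * PIN_RATE_GBPS / 8

print(f"Peak bandwidth per stack: {peak_gb_per_s:.0f} GB/s")
```

Under those assumptions the arithmetic lands on the 256 GB/s per stack cited above.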
Early adopters
High-performance computing (HPC) applications, which are always hungry for more data, were among the first to adopt HBM2 for compute acceleration. Nvidia quickly chose to use HBM2 in its Tesla P100 accelerators to power data centers in need of supercharged performance. AMD chose to use HBM2 in its Radeon Instinct accelerators for the data center, as well as in its high-end graphics cards.
In November, we saw the announcement of the first client-based solution using HBM2. Intel has embraced the technology, leveraging HBM2 to introduce high-performance, power-efficient graphics solutions for mobile PCs. The new Intel chipset will make it easier to build thinner, lighter notebooks.
HBM2 technology is also finding its way into networking applications. Rambus and Northwest Logic, for instance, recently teamed up to introduce HBM2-compatible memory controller and physical layer (PHY) technology for use in high-performance networking chips. Other companies developing products that combine HBM2 storage with various networking capabilities include Mobiveil, eSilicon, Open-Silicon and Wave Computing.
AI’s growing HBM appetite
Finally, artificial intelligence (AI) is starting to look like one of the most promising new markets for HBM2. It turns out that GPUs, which were originally developed for graphics processing, are remarkably effective at helping AI software learn how to identify the complex patterns gleaned from large volumes of data. IDC is forecasting that global AI revenue will multiply from about $8 billion in 2016 to more than $47 billion in 2020.
And as AI technology makes increasing inroads into areas such as health care, home automation, and voice, image and text recognition, demand for high bandwidth memory that can help users squeeze more performance out of their AI applications appears to have nowhere to go but up.
Published by TIEN SHIAH
Tien Shiah is Product Marketing Manager for High Bandwidth Memory at Samsung Semiconductor, Inc. In this capacity, he serves as the company’s product consultant, market expert and evangelist for HBM in the Americas, focused on providing a clear understanding of the tremendous benefits offered by HBM in the enterprise and client marketplaces.