From ExtremeTech: At long last, HBM4 is officially here—at least as a specification. JEDEC released standard JESD270-4, supplying high bandwidth memory (HBM) makers with a complete specification for what will likely be a massively lucrative product for the usual suspects: Micron, Samsung, and SK Hynix. The specification sets the stage for a gradual transition from HBM3 to HBM4 as semiconductor companies like AMD and Nvidia build the next generations of AI GPUs and other hardware.
The HBM4 standard allows for 48GB of capacity in a stack of 16 DRAM dies. According to JEDEC, HBM4 supports stacks of 4, 8, 12, or 16 DRAM dies, and each stack provides 32 independent channels (or 64 pseudo channels). The memory should be more power-efficient than its HBM3 and HBM3E predecessors. And, like the earlier versions, HBM4 will probably show up only in GPUs designed for AI and other data center applications, as opposed to gaming graphics processors.
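The capacity figures above hang together arithmetically: 48GB across 16 dies implies 3GB (24Gb) per die. A minimal sketch, assuming that 24Gb die density (the standard also permits other densities, so these numbers are illustrative only):

```python
# Illustrative sketch: per-stack capacity implied by the article's figures.
# Assumption: 48 GB / 16 dies -> 3 GB (24 Gb) per DRAM die. Other die
# densities are possible under the spec; this only checks the quoted math.

GBIT_PER_DIE = 24                 # assumed die density in gigabits
STACK_HEIGHTS = [4, 8, 12, 16]    # stack heights named by the standard

def stack_capacity_gb(dies: int, gbit_per_die: int = GBIT_PER_DIE) -> int:
    """Capacity of one HBM stack in gigabytes (8 bits per byte)."""
    return dies * gbit_per_die // 8

for h in STACK_HEIGHTS:
    print(f"{h:2d}-high stack: {stack_capacity_gb(h)} GB")
# 16-high stack -> 48 GB, matching the article's headline figure
```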
View: Full Article