
Along with unveiling its first HBM3 memory products yesterday, Micron also published a fresh DRAM roadmap for its AI customers for the coming years. As one of the world's largest memory manufacturers, Micron has a number of interesting things planned, including high-capacity DDR5 memory devices and modules, GDDR7 chips for graphics cards and other bandwidth-hungry devices, as well as HBMNext for artificial intelligence and high-performance computing applications.

32 Gb DDR5 ICs

Everyone loves cheap high-capacity memory modules, and it looks like Micron has us covered. Sometime in the late first half of 2024, the company plans to roll out its first 32 Gb DDR5 memory dies, which will be produced on the company's 1β (1-beta) manufacturing process. This is Micron's latest process node, and it does not use extreme ultraviolet lithography, instead relying on multipatterning.

32 Gb DRAM dies will enable Micron to build 32 GB DDR5 modules using just eight memory devices on one side of the module. Such modules can be made today with Micron's current 16 Gb dies, but this requires either placing 16 DRAM packages across both sides of a memory module – driving up production costs – or placing two 16 Gb dies inside a single DRAM package, which incurs its own costs due to the packaging required. 32 Gb ICs, by comparison, are easier to use, so 32 GB modules based on the denser DRAM dies should eventually lead to lower costs than today's 32 GB memory sticks.
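The capacity math behind those three options can be sketched as follows (a back-of-the-envelope check of the figures above, not anything from Micron's roadmap):

```python
def module_capacity_gb(die_gbit: int, dies_per_package: int, packages: int) -> float:
    """Total module capacity in GB (gigabytes), given die density in Gb (gigabits)."""
    return die_gbit * dies_per_package * packages / 8

# Today's two ways to reach 32 GB with 16 Gb dies:
assert module_capacity_gb(16, 1, 16) == 32  # 16 single-die packages, both sides populated
assert module_capacity_gb(16, 2, 8) == 32   # 8 dual-die packages, one side only

# With 32 Gb dies, eight single-die packages on one side suffice:
assert module_capacity_gb(32, 1, 8) == 32
```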

But desktop matters aside, Micron's initial focus with its higher-density dies will be building even higher-capacity data center-class parts, including RDIMMs, MRDIMMs, and CXL modules. Current high-performance AI models tend to be very large and memory constrained, so larger memory pools open the door both to even larger models, and to bringing down inference costs by being able to run additional instances on a single server.

For 2024, Micron is planning to release 128 GB DDR5 modules based on these new dies. In addition, the company announced plans for 192+ GB and 256+ GB DDR5 modules for 2025, albeit without disclosing which chips those are set to use.

Meanwhile, Micron's capacity-focused roadmap doesn't have much to say about bandwidth. While it would be unusual for newer DRAM dies to not clock at least somewhat higher, memory manufacturers as a whole haven't offered much guidance about future DDR5 memory speeds. Especially with MRDIMMs in the pipeline, the focus is more on gaining additional speed through parallelism rather than running individual DRAM cells faster. Though with this roadmap in particular, it's clear that Micron is more focused on promoting DDR5 capacity than promoting DDR5 performance.

GDDR7 in 1H 2024

Micron was the first major memory maker to announce plans to roll out its GDDR7 memory in the first half of 2024. And following up on that, the new roadmap has the company prepping 16 Gb and 24 Gb GDDR7 chips for late Q2 2024.

As with Samsung, Micron's plans for its first-generation GDDR7 modules don't have them reaching the spec's highest transfer rate (36 GT/sec) right away; instead, Micron is aiming for a more modest and practical 32 GT/sec. That is still good enough to enable upwards of 50% greater bandwidth for next-generation graphics processors from AMD, Intel, and NVIDIA. And perhaps especially NVIDIA, since this roadmap also implies that we won't be seeing a GDDR7X from Micron, meaning that for the first time since 2018, NVIDIA won't have access to a specialty GDDR DRAM from Micron.
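The "upwards of 50%" figure checks out if the comparison baseline is today's 21 GT/s GDDR6X (my assumption; the article doesn't name the baseline), with the standard 32-bit interface per chip:

```python
def chip_bandwidth_gbps(transfer_rate_gt: float, bus_width_bits: int = 32) -> float:
    """Peak bandwidth per memory chip in GB/s."""
    return transfer_rate_gt * bus_width_bits / 8

gddr6x = chip_bandwidth_gbps(21)  # 84 GB/s per chip at 21 GT/s
gddr7 = chip_bandwidth_gbps(32)   # 128 GB/s per chip at 32 GT/s
print(f"uplift: {gddr7 / gddr6x - 1:.0%}")  # ~52%
```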

HBMNext in 2026

In addition to GDDR7, which will be used by graphics cards, game consoles, and lower-end high-bandwidth applications like accelerators and networking equipment, Micron is also working on the forthcoming generations of its HBM memory for heavy-duty artificial intelligence (AI) and high-performance computing (HPC) applications.

Micron expects its HBMNext (HBM4?) to be available in 36 GB and 64 GB capacities, which points to a variety of possible configurations, such as 12-Hi 24 Gb stacks (36 GB) or 16-Hi 32 Gb stacks (64 GB), though these are pure speculation at this point. As for performance, Micron is touting 1.5 TB/s – 2+ TB/s of bandwidth per stack, which points to data transfer rates in excess of 11.5 GT/s/pin.
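A quick sanity check of those figures (the stack configurations are the speculation above; the 1024-bit interface width is my own assumption, carried over from current-generation HBM):

```python
def stack_capacity_gb(layers: int, die_gbit: int) -> int:
    """Stack capacity in GB from layer count and per-die density in Gb."""
    return layers * die_gbit // 8

assert stack_capacity_gb(12, 24) == 36  # 12-Hi stack of 24 Gb dies
assert stack_capacity_gb(16, 32) == 64  # 16-Hi stack of 32 Gb dies

def per_pin_rate_gts(stack_bw_tbps: float, interface_bits: int = 1024) -> float:
    """Per-pin data rate in GT/s from total stack bandwidth in TB/s."""
    return stack_bw_tbps * 1000 * 8 / interface_bits

print(f"{per_pin_rate_gts(1.5):.1f} GT/s/pin")  # ~11.7 GT/s/pin at 1.5 TB/s
```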

