At its GPU Technology Conference (GTC), NVIDIA announced several new projects built around its latest AI supercomputer chip, Vera Rubin, including data centers that will be launched into space.
Recently, two memory manufacturers, Samsung and Micron, confirmed that they have begun producing high-capacity HBM4 (High Bandwidth Memory) chips for use with the Vera Rubin AI chip.
According to a statement issued by Samsung, its sixth-generation HBM4 chip has now entered mass production and has been built specifically for use with NVIDIA's Vera Rubin AI processors.
Samsung says each pin on its current HBM4 memory chip transfers data at 11.7 Gbps, a figure that will rise to 13 Gbps in a future update. The follow-on HBM4E memory is expected to deliver a total data transfer rate of 4.0 TB/s.
The company also showed off new copper-based stacking technology that it says can support stacks of up to 16 dies, allowing a single HBM4 chip to offer significantly higher memory capacity and faster transfer rates.
Micron says its 36 GB 12-high (12H) HBM4 memory chip delivers pin speeds of over 11 Gb/s and an overall data transfer rate of up to 2.8 TB/s, a 2.3x increase over the previous generation. The chip is also said to be 20 percent more power efficient than the company's earlier HBM3E offering.
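The per-stack bandwidth figures quoted above follow directly from pin speed multiplied by interface width. A quick sanity check is sketched below, assuming the 2048-bit-per-stack interface defined by the JEDEC HBM4 standard (a detail not stated in the article itself):

```python
# Sanity-check HBM4 bandwidth claims: total bandwidth = pin speed x bus width.
# Assumes a 2048-bit interface per stack, as defined by the JEDEC HBM4
# standard; this width is not stated in the article itself.

HBM4_BUS_WIDTH_BITS = 2048  # data pins per stack under JEDEC HBM4

def stack_bandwidth_tbs(pin_speed_gbps: float,
                        bus_width_bits: int = HBM4_BUS_WIDTH_BITS) -> float:
    """Total stack bandwidth in TB/s from per-pin speed in Gb/s."""
    # Gb/s per pin * pins = Gb/s total; /8 converts bits to bytes,
    # /1000 converts GB/s to TB/s.
    return pin_speed_gbps * bus_width_bits / 8 / 1000

# Micron's quoted 11 Gb/s pin speed lines up with its 2.8 TB/s figure:
print(round(stack_bandwidth_tbs(11.0), 1))   # -> 2.8
# Samsung's 11.7 Gb/s and planned 13 Gb/s pins imply roughly:
print(round(stack_bandwidth_tbs(11.7), 1))   # -> 3.0
print(round(stack_bandwidth_tbs(13.0), 1))   # -> 3.3
```

Under the same assumption, HBM4E's quoted 4.0 TB/s would correspond to pin speeds of roughly 15.6 Gb/s.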

