Processing chips launched today, whether for home users, data centers, or graphics rendering farms, almost all come equipped with artificial intelligence capabilities, because that is what the market is demanding.
Amazon Web Services (AWS) has launched Trainium3, its latest AI chip built for machine learning workloads. It can also handle generative AI tasks, such as generating text, images, and videos, at a faster and more efficient rate.
The Trainium3 chip is built on TSMC's 3nm process, delivers 2.5 petaflops of computing power, and comes with 144GB of HBM3e memory. AWS also says that Trainium3 is 4 times more power efficient than its predecessors, reducing electricity consumption considerably.
The chip was also designed and tested in AWS's own lab in Austin, Texas, before being mass-produced.
Alongside the Trainium3 announcement, AWS is also updating its AI server offering, now called the AWS Trn3 UltraServer. Each cluster comes with up to 144 Trainium3 chips, giving it 4.4 times the processing power of the previously launched AWS Trn2 UltraServer.
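For a rough sense of scale, here is a minimal back-of-the-envelope sketch using only the figures quoted above (2.5 petaflops per chip, up to 144 chips per UltraServer). The variable names are illustrative, and the result is a theoretical ceiling that ignores interconnect, memory, and utilization overheads, not a measured or AWS-quoted number.

```python
# Back-of-the-envelope aggregate compute for a fully populated Trn3 UltraServer,
# based solely on the per-chip figure and chip count cited in this article.
PER_CHIP_PFLOPS = 2.5          # quoted Trainium3 compute per chip (petaflops)
CHIPS_PER_ULTRASERVER = 144    # quoted maximum chips per Trn3 UltraServer cluster

peak_pflops = PER_CHIP_PFLOPS * CHIPS_PER_ULTRASERVER
print(f"Theoretical peak: {peak_pflops:.0f} petaflops per UltraServer")
# -> Theoretical peak: 360 petaflops per UltraServer
```

Sustained throughput in real training runs would land well below that figure once chip-to-chip communication and scheduling losses are accounted for.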
AWS also said that the Trainium3 chip and the Trn3 UltraServer have begun testing and will be sold to AI development companies and service providers in the future.
