
AWS' Trainium2 chips for building LLMs are now generally available, with Trainium3 coming in late 2025 | TechCrunch

At its re:Invent conference, AWS today announced the general availability of its Trainium2 (T2) chips for training and deploying large language models (LLMs). These chips, which AWS first announced a year ago, are four times as fast as their predecessors, with a single Trainium2-powered EC2 instance with 16 T2 chips providing up to 20.8 petaflops of compute performance. In practice, that means running inference for Meta’s massive Llama 405B model as part of Amazon’s Bedrock LLM platform will be...

Read more at techcrunch.com
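
As a rough back-of-envelope check (a minimal sketch; the per-chip figure below is inferred by dividing the stated instance total across its 16 chips, and is not a number AWS quotes):

    # Back-of-envelope: per-chip compute implied by the stated instance specs.
    # Assumes the 20.8 petaflops is the aggregate across all 16 Trainium2 chips
    # in a single EC2 instance (an inference from the article, not an AWS figure).
    instance_petaflops = 20.8
    chips_per_instance = 16

    per_chip_petaflops = instance_petaflops / chips_per_instance
    print(f"~{per_chip_petaflops:.2f} petaflops per Trainium2 chip")  # ~1.30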
