News Score: Score the News, Sort the News, Rewrite the Headlines

Training AI models might not need enormous data centres

Eventually, models could be trained without any dedicated hardware at all

Once, the world’s richest men competed over yachts, jets and private islands. Now, the size-measuring contest of choice is clusters. Just 18 months ago, OpenAI trained GPT-4, its then state-of-the-art large language model (LLM), on a network of around 25,000 then state-of-the-art graphics processing units (GPUs) made by Nvidia. Now Elon Musk and Mark Zuckerberg, bosses of X and Meta respectively, are waving their chips in t...

Read more at economist.com
