News Score: Score the News, Sort the News, Rewrite the Headlines

JetMoE: Reaching LLaMA2 Performance with 0.1M Dollars

Key Messages

- JetMoE-8B is trained at a cost of under $0.1 million, yet it outperforms LLaMA2-7B from Meta AI, which has multi-billion-dollar training resources. LLM training can be much cheaper than generally thought.
- JetMoE-8B is very open and academia-friendly because:
  - It uses only public datasets for training, and the code is open-sourced. No proprietary resources are needed.
  - It can be fine-tuned on a very limited compute budget (e.g., a consumer-grade GPU) that most labs can afford.
- JetMoE-8B only...

Read more at research.myshell.ai
