🦅 Eagle 7B : Soaring past Transformers with 1 Trillion Tokens Across 100+ Languages
*An eagle, flying past a transformer-looking robot*

Eagle 7B is a 7.52B parameter model that:

- Is built on the RWKV-v5 architecture (a linear transformer with 10-100x+ lower inference cost)
- Ranks as the world's greenest 7B model (per token)
- Was trained on 1.1 trillion tokens across 100+ languages
- Outperforms all 7B-class models in multi-lingual benchmarks
- Approaches Falcon (1.5T), LLaMA2 (2T), and Mistral (>2T?) levels of performance in English evals
- Trades blows with MPT-7B (1T) in English evals
- All while being an "Att...
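To see where the lower inference cost comes from, here is a minimal sketch of a generic linear-attention recurrence (not the exact RWKV-v5 formulation, and the feature map and dimensions are illustrative assumptions): each new token updates a fixed-size running state instead of re-attending over the whole history, so per-token decoding cost is constant in sequence length.

```python
import numpy as np

d = 8  # head dimension (illustrative)
rng = np.random.default_rng(0)

# Running summaries replace the growing key/value cache of a
# standard transformer: memory and per-step compute stay O(d^2).
state = np.zeros((d, d))  # running key-value outer-product sum
norm = np.zeros(d)        # running key sum, used for normalization

def step(q, k, v, state, norm):
    """Process one token; cost is independent of how many tokens came before."""
    k = np.exp(k)                      # positive feature map (assumption)
    state = state + np.outer(k, v)     # accumulate key-value association
    norm = norm + k                    # accumulate normalizer
    out = (q @ state) / (q @ norm + 1e-9)
    return out, state, norm

# Token-by-token decoding with constant memory, however long the sequence.
for _ in range(5):
    q, k, v = rng.normal(size=(3, d))
    q = np.exp(q)                      # same positive feature map for queries
    out, state, norm = step(q, k, v, state, norm)

print(out.shape)
```

In a quadratic-attention transformer the equivalent step would scan all previous keys and values, so decoding the t-th token costs O(t); the gap between O(t) and O(1) per token is the source of the "10-100x+ lower inference cost" class of claims for linear architectures.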
Read more at blog.rwkv.com