8.5
"Megalodon: New Neural Architecture Outperforms Transformers with Unlimited Context Length, Achieving Greater Efficiency in Sequence Modeling"
arxiv.org