RWKV Language Model
RWKV (pronounced RwaKuv) is an RNN with great LLM performance, which can also be trained directly like a GPT transformer (parallelizable). The current generation is RWKV-7 "Goose".
It combines the best of RNNs and transformers: great performance, linear time, constant space (no kv-cache), fast training, unbounded context length, and free text embeddings. It is 100% attention-free, and a Linux Foundation AI project.
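The "linear time, constant space" claim can be illustrated with a toy linear-recurrence layer. This is only a generic sketch of how an RNN-style model streams tokens through a fixed-size state instead of a growing kv-cache; the sizes, decay, and update rule here are illustrative placeholders, not RWKV-7's actual formulation.

```python
import numpy as np

D = 8  # hidden size (hypothetical, for illustration)
rng = np.random.default_rng(0)
# random projections standing in for learned weights
Wk, Wv, Wq = (rng.standard_normal((D, D)) * 0.1 for _ in range(3))
decay = 0.9  # a real model would learn per-channel decays

state = np.zeros((D, D))  # fixed-size state, independent of sequence length
outputs = []
for t in range(16):                        # stream tokens one at a time
    x = rng.standard_normal(D)
    k, v, q = Wk @ x, Wv @ x, Wq @ x
    state = decay * state + np.outer(k, v)  # O(D^2) update per token, O(1) in t
    outputs.append(state.T @ q)             # read out with the query

# memory use never grows with context length
print(len(outputs), state.shape)
```

Each step costs the same regardless of how many tokens came before, which is why inference runs in linear time overall with constant memory.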
v6 7B Demo
v7 0.1B Demo
WebGPU Demo
Read more at rwkv.com