So RWKV 7B v5 is 60% trained now. Saw that the multilingual parts are better than Mistral now, and the English capabilities are close to Mistral, except for HellaSwag and ARC, where it's a little behind. All the benchmarks are on the RWKV Discord, and you can google the pros/cons of RWKV, though most of them are about v4.
Thoughts?
SIGNIFICANTLY less - it is not a transformer, so the cost doesn't go quadratic with sequence length.
It is not a transformer?
Nope, RNN without attention, with some tricks for enabling parallel training.
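To make the "RNN without attention" point concrete, here's a toy sketch of recurrent inference with a fixed-size state, loosely in the spirit of how RWKV generates tokens. Everything here (the weights `W`, the scalar `decay`, the dimensions) is a made-up placeholder for illustration, not the actual RWKV time-mixing kernel:

```python
import numpy as np

def rnn_style_inference(tokens, d=8, seed=0):
    """Toy sketch: per-token state update with constant memory.
    Weights are random placeholders, NOT the real RWKV formulation."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((d, d)) * 0.1  # hypothetical mixing weights
    decay = 0.9                            # hypothetical time-decay factor
    state = np.zeros(d)                    # fixed-size state: O(1) in sequence length
    outputs = []
    for x in tokens:                       # O(T) total work; no T x T attention matrix
        state = decay * state + W @ x      # fold the new input into the decaying state
        outputs.append(np.tanh(state))
    return outputs

xs = [np.ones(8) for _ in range(16)]
ys = rnn_style_inference(xs)
print(len(ys), ys[0].shape)
```

The point of the sketch: at generation time each new token only touches the fixed-size `state`, instead of attending over the whole history, which is why the per-token cost stays flat as context grows.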
It's basically… 0?
From GitHub: