There has been a lot of movement around and below the 13b parameter bracket in the last few months, but it’s wild to think the best 70b models are still Llama 2 based. Why is that?
We have 13b models like the 8-bit bartowski/Orca-2-13b-exl2 quant approaching or even surpassing the best 70b models now.
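For anyone who hasn’t tried one of those exl2 quants yet, here is a rough sketch of what loading and running one looks like with exllamav2’s Python API. This is based on the example scripts in the turboderp/exllamav2 repo; the model path is a placeholder and the exact class names and arguments may differ between versions:

```python
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

# Placeholder path: point this at a local download of the exl2 quant
model_dir = "/models/Orca-2-13b-exl2"

config = ExLlamaV2Config()
config.model_dir = model_dir
config.prepare()

model = ExLlamaV2(config)
model.load()                      # load the quantized weights onto the GPU

tokenizer = ExLlamaV2Tokenizer(config)
cache = ExLlamaV2Cache(model)     # KV cache used during generation
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.7
settings.top_p = 0.9

output = generator.generate_simple(
    "Why are the best 70b models still Llama 2 based?",
    settings,
    num_tokens=200,
)
print(output)
```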
Qwen 72b is coming in 2 days 👍 Will be a real beast.
If it actually comes out, it might finally be worth exllama adding support for it. I heard the 14b was fairly strong.
Yes, I also hope it gets exllamav2 support. Here is an issue tracking it: (Qwen model not supported) · Issue #160 · turboderp/exllamav2 (github.com)
I can’t seem to find anything about Qwen 72b except two tweets from a month ago that said it was coming out. Who makes it? What’s it trained on? Any details?
Curiously, none of the people who upvoted the previous comment have provided an answer to your question.
2 days? Bro, if they said November and haven’t released it by now, it’s not two days.