I agree, it’s my favourite 7B model too. I use it mainly to help me with bot personalities. It’s too bad it’s not really fine-tuned for roleplay, otherwise it would wreck. And yes, 16K context is broken for me too.
In general I think it would be nice if people tried mixing several Mistral models more often, as was done with Mistral-11B-CC-Air-RP. Yes, it has serious problems with understanding the context and the characters go into psychosis, but if you use a less lossy quant (like Q5 or Q6) and the Min-P sampler, it improves the situation a bit. Apparently something just went wrong during the merge. Otherwise, this model is really the most unique I’ve tried. Characters talk similarly to the early Character AI.
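If anyone wants to try that setup, here’s roughly how I’d load it with llama-cpp-python and enable Min-P. This is just a sketch under my own assumptions: the filename is whichever quant you grab from TheBloke’s repo below, and the exact sampler values are a starting point, not gospel (Min-P is exposed as `min_p` in recent llama-cpp-python releases):

```python
from llama_cpp import Llama

# Hypothetical local filename; use whichever quant you downloaded.
llm = Llama(
    model_path="mistral-11b-cc-air-rp.Q5_K_M.gguf",
    n_ctx=4096,  # the merge reportedly degrades at long contexts, so stay modest
)

out = llm(
    "You are a tavern keeper in a fantasy city. Greet the traveller.",
    max_tokens=200,
    temperature=1.0,
    min_p=0.05,  # Min-P: drop tokens below 5% of the top token's probability
    top_p=1.0,   # disable Top-P so Min-P does the filtering
)
print(out["choices"][0]["text"])
```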
https://huggingface.co/TheBloke/Mistral-11B-CC-Air-RP-GGUF/tree/main?not-for-all-audiences=true