I’m curious whether that would work; someone might have already tried it. They are both fine-tunes of Mistral, so I would imagine it could. I have a feeling this frankenmerge could produce a very good small model, maybe better than any current <=14B one.
Your wish has apparently been granted:
https://huggingface.co/Weyaxi/OpenHermes-2.5-neural-chat-7b-v3-1-7B
https://huggingface.co/TheBloke/OpenHermes-2.5-neural-chat-7B-v3-1-7B-GGUF
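For anyone who wants to try this kind of merge themselves, here's a minimal sketch of a naive 50/50 weight average using `transformers`. The parent model IDs (`teknium/OpenHermes-2.5-Mistral-7B` and `Intel/neural-chat-7b-v3-1`) are my assumption about which fine-tunes went into the linked merge, and the uniform 0.5 ratio is purely illustrative; the actual repo may well use a different method (e.g. SLERP via mergekit).

```python
# Minimal sketch: naive 50/50 weight average of two Mistral-7B fine-tunes.
# Model IDs are my guess at the parents of the linked merge; the uniform
# 0.5 ratio is illustrative, not the recipe the linked repo actually used.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_a = AutoModelForCausalLM.from_pretrained(
    "teknium/OpenHermes-2.5-Mistral-7B", torch_dtype=torch.bfloat16
)
model_b = AutoModelForCausalLM.from_pretrained(
    "Intel/neural-chat-7b-v3-1", torch_dtype=torch.bfloat16
)

# Both are fine-tunes of the same Mistral-7B base, so their state dicts
# have identical keys and tensor shapes; average every parameter.
merged = model_a.state_dict()
for name, tensor in model_b.state_dict().items():
    merged[name] = (merged[name] + tensor) / 2

model_a.load_state_dict(merged)
model_a.save_pretrained("openhermes-neural-chat-linear-merge")

# Reuse the tokenizer from one parent (both share the Mistral tokenizer).
AutoTokenizer.from_pretrained("teknium/OpenHermes-2.5-Mistral-7B") \
    .save_pretrained("openhermes-neural-chat-linear-merge")
```

Note you need enough RAM to hold both models at once (~28 GB in bf16 for two 7B models); dedicated tools like mergekit can merge shard by shard to avoid that.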