Aaaaaaaaaeeeee@alien.top to LocalLLaMA · 2 years ago
55B Yi model merges (huggingface.co) · 14 comments
Desm0nt@alien.top · 2 years ago
Hm. I just loaded the gguf yi-34b-chat q4_k_m in oobabooga via llama.cpp with default params and 8k context, and it just works like a charm. Better (more lively language) than any 70B from OpenRouter (my local machine can't handle a 70B).
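If you'd rather script the same setup outside oobabooga, here's a rough sketch with llama-cpp-python; the model path and GPU layer count are placeholders, so adjust them for your hardware:

```python
# Rough equivalent of the oobabooga setup: yi-34b-chat Q4_K_M GGUF via llama.cpp, 8k context.
from llama_cpp import Llama

llm = Llama(
    model_path="./yi-34b-chat.Q4_K_M.gguf",  # placeholder path to the quantized model file
    n_ctx=8192,                              # 8k context, as in the setup above
    n_gpu_layers=-1,                         # offload all layers to GPU if it fits; lower this if it doesn't
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Write a short, lively description of a rainy harbor town."}],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])
```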