55B Yi model merges (huggingface.co)
Posted by Aaaaaaaaaeeeee@alien.top to LocalLLaMA · 1 year ago · 14 comments
Desm0nt@alien.top · 1 year ago:
Hm. I just load the GGUF yi-34b-chat q4_k_m in oobabooga via llama.cpp with default params and 8k context, and it just works like a charm. Better (more lively language) than any 70B from OpenRouter (my local machine can't handle a 70B).
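For anyone wanting to try the same setup outside the webui, here's a minimal sketch using llama-cpp-python rather than the commenter's oobabooga frontend; the model path, GPU offload value, and prompt are assumptions, not their exact settings:

```python
# Minimal sketch: load a GGUF Yi-34B-Chat q4_k_m quant with an 8k context via llama-cpp-python.
# The model path, n_gpu_layers, and prompt below are assumptions, not the commenter's settings.
from llama_cpp import Llama

llm = Llama(
    model_path="yi-34b-chat.Q4_K_M.gguf",  # hypothetical local path to the q4_k_m quant
    n_ctx=8192,                            # 8k context window, as mentioned in the comment
    n_gpu_layers=-1,                       # offload all layers if VRAM allows; use 0 for CPU-only
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Describe autumn in two lively sentences."}],
)
print(out["choices"][0]["message"]["content"])
```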