• Desm0nt@alien.top · 1 year ago

    Hm. I just loaded the GGUF yi-34b-chat q4_k_m in oobabooga via llama.cpp with default params and an 8k context, and it just works like a charm. Better (more lively language) than any 70B from OpenRouter (my local machine can't handle a 70B).
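
    For anyone who wants to reproduce that setup outside the UI, here's a minimal sketch using llama-cpp-python (roughly what oobabooga's llama.cpp loader does under the hood). The model path and the GPU offload setting are placeholders, not something from my actual config — adjust them for your own machine.

    ```python
    # Minimal sketch: load a q4_k_m GGUF of Yi-34B-Chat with an 8k context
    # via llama-cpp-python. model_path and n_gpu_layers are assumptions --
    # tune them for your own hardware.
    from llama_cpp import Llama

    llm = Llama(
        model_path="yi-34b-chat.Q4_K_M.gguf",  # hypothetical local path
        n_ctx=8192,       # 8k context, as in the oobabooga run above
        n_gpu_layers=-1,  # offload as many layers as fit; lower this on smaller GPUs
    )

    out = llm.create_chat_completion(
        messages=[{"role": "user", "content": "Write a short story about a lighthouse."}],
        max_tokens=256,
    )
    print(out["choices"][0]["message"]["content"])
    ```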