PacmanIncarnate@alien.top to LocalLLaMA • Optimizing Your Language Model Experience: A Student's Journey with a Cutting-Edge PC featuring Core i7 14th Gen, RTX 4070 Ti, and 32GB DDR5 RAM • 1 year ago
A 24 GB GPU is still limited to fitting a 13B fully in VRAM. His PC is a great one; not the highest end, but perfectly fine to run anything up to a 70B in llama.cpp.
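A rough back-of-the-envelope sketch (not from the comment itself) of the arithmetic behind that claim: weight memory is roughly parameter count times bytes per weight, plus some headroom for the KV cache and activations. The byte-per-weight figures, the 20% overhead factor, and the est_vram_gb helper below are illustrative assumptions, not measured numbers.

```python
# Hypothetical VRAM estimate for dense models; the per-weight byte counts
# and the 20% overhead allowance are assumptions, not benchmarks.
BYTES_PER_WEIGHT = {"fp16": 2.0, "q8_0": 1.0, "q4_k_m": 0.56}  # approximate
OVERHEAD = 1.2  # rough allowance for KV cache and activations

def est_vram_gb(params_billion: float, quant: str) -> float:
    """Estimate VRAM in GB for a model of the given size and quantization."""
    return params_billion * BYTES_PER_WEIGHT[quant] * OVERHEAD

for size in (13, 70):
    for quant in ("fp16", "q8_0", "q4_k_m"):
        need = est_vram_gb(size, quant)
        verdict = "fits in 24 GB" if need <= 24 else "needs CPU offload"
        print(f"{size}B {quant}: ~{need:.1f} GB ({verdict})")
```

Under these assumptions a quantized 13B sits comfortably inside 24 GB, while a 70B does not, which is why llama.cpp's partial CPU offload is the usual route for the larger models.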
PacmanIncarnate@alien.top to LocalLLaMA • Any alternatives to couqi for TTS? • 1 year ago
And fast. Not sure they’ll find something better.