Hello, I’m a student getting into large language models. I recently got a new PC with a Core i7 14th Gen processor, an RTX 4070 Ti, and 32GB of DDR5 RAM. Could you recommend a language model that would run well on this machine?

  • PacmanIncarnate@alien.topB · 1 year ago

    A 24 GB GPU is still limited to fitting a 13B fully in VRAM. His PC is a great one; not the highest end, but perfectly fine to run anything up to a 70B in llama.cpp with CPU offloading.
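    The fit-in-VRAM claim comes down to simple arithmetic: weight memory is roughly parameter count times bits per weight, plus some overhead for the KV cache and runtime. Here is a rough back-of-envelope sketch; the bit widths, the flat 1.5 GB overhead, and the GPU sizes are illustrative assumptions, not measurements (real usage varies with context length and backend):

    ```python
    # Rough VRAM estimate: weights_gb = params * bits / 8, plus a fixed
    # overhead guess for KV cache and runtime buffers (assumption: 1.5 GB).

    def est_vram_gb(params_b: float, bits_per_weight: float, overhead_gb: float = 1.5) -> float:
        """Approximate GB needed to hold the weights plus fixed overhead."""
        weights_gb = params_b * bits_per_weight / 8  # 1B params at 8 bits ~= 1 GB
        return weights_gb + overhead_gb

    for size, bits in [(13, 4.5), (13, 8), (70, 4.5)]:
        need = est_vram_gb(size, bits)
        for vram in (12, 24):
            fit = "fits" if need <= vram else "needs offload"
            print(f"{size}B @ ~{bits}-bit = ~{need:.1f} GB -> {vram} GB GPU: {fit}")
    ```

    By this estimate a 13B at 8-bit (~14.5 GB) fits a 24 GB card but not a 12 GB one, and a 70B even at ~4.5-bit (~41 GB) needs llama.cpp to offload layers to system RAM, which matches the comment above.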

    • opi098514@alien.topB · 1 year ago

      I didn’t say it wasn’t. But getting into LLMs really just shows you how much better your PC could be, and you will never be as cutting edge as you think or want.