Hello, I’m a student delving into the study of large language models. I recently acquired a new PC equipped with a Core i7 14th Gen processor, RTX 4070 Ti graphics, and 32GB DDR5 RAM. Could you kindly suggest a recommended language model for optimal performance on my machine?
first off, why is your title formatted like an article? you mention being a student, so do you mean you want resources for studying the underlying architecture of large language models? in that case you could watch the channel of "Andrej Karpathy", which was pretty enlightening to a layman like me, but it's hard to progress much further than that on the 'science' of llms without a cs degree.
other than that, there really isn't an established 'study' of llms yet, as it's a pretty new field, unless you want to get into hardcore ml research with a cs degree and all. as for models, with your decent but not quite 'cutting edge' pc, you could try a yi 34b finetune for longer context, though it was prone to breaking in my testing. or you could try one of the many smaller 7b models, of which i have been enjoying the family of mistral finetunes the most (openorca, openhermes, etc).
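a quick way to sanity-check what fits on your card: the 4070 ti has 12gb of vram, and you can roughly estimate a model's footprint from its parameter count and quantization level. this is just a back-of-the-envelope sketch i put together (the `model_vram_gb` function and the 1.5gb overhead allowance are my own rough assumptions, not from any library):

```python
# rough vram estimate for running a quantized model locally.
# assumption: ~1.5 GB extra for KV cache / CUDA context, which
# varies a lot with context length in practice.

def model_vram_gb(params_billion: float, bits_per_weight: int,
                  overhead_gb: float = 1.5) -> float:
    """Approximate VRAM (GB) to load the weights plus a rough
    fixed allowance for cache and runtime overhead."""
    weights_gb = params_billion * bits_per_weight / 8  # 1e9 params * bits/8 bytes
    return weights_gb + overhead_gb

# a 7b model at 4-bit fits comfortably in the 4070 ti's 12 GB...
print(f"7b  @ 4-bit: ~{model_vram_gb(7, 4):.1f} GB")   # ~5.0 GB
# ...while a 34b model at 4-bit does not (hence partial cpu offload)
print(f"34b @ 4-bit: ~{model_vram_gb(34, 4):.1f} GB")  # ~18.5 GB
```

this is why the 34b finetunes only run well for you with some layers offloaded to system ram, while 7b models fit entirely on the gpu and run much faster.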
for roleplay and stuff the LLaMA Tiefighter model is pretty cool. if you truly want access to cutting edge hardware capable of running the best open source models, e.g. llama 70b or goliath 120b, you could look into paid cloud gpu services like runpod, which are pretty easy to use; i had a mostly positive experience running llms there. hope this answer helps.