Hi!
I’m quite new to LLMs and want to use one to generate training workouts. My idea would be to feed it scientific studies and a bunch of example workouts.
Is this what “training a model” means? Are there any resources where I can start learning how to train one?
Can I use an already fine-tuned model like Mistral, or do I need to train a base model like Llama 2?
Can I train a quantized model, or do I need to use a full-precision one and quantize it after training?
I have 2x 3090s, a 5950X, and 64GB of RAM, if that matters. If I can load a model for inference, can I also train it? Are the resource requirements the same?
Thanks!
Generally, if what you want is to impart new knowledge, embeddings are the way to go rather than training.
Assuming it's a large amount of data, you'll want a vector DB.
You then use retrieval-augmented generation (RAG): embed your studies and example workouts, retrieve the most relevant chunks at query time, and feed them to the model as context. There's a rough sketch below.
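A minimal sketch of that flow, assuming sentence-transformers for the embeddings (the model name and the toy documents are just placeholders, and for real use you'd chunk the studies and store the vectors in an actual vector DB):

```python
# Minimal RAG sketch: embed documents, retrieve the closest ones for a query,
# and stuff them into the prompt of whatever local model you run.
import numpy as np
from sentence_transformers import SentenceTransformer

# Your "knowledge": chunks from studies and example workouts (toy examples).
docs = [
    "Study: 10-20 weekly hard sets per muscle group is a reasonable hypertrophy range.",
    "Example workout: squat 5x5, Romanian deadlift 3x8, leg press 3x12.",
    "Study: training a muscle twice per week tends to beat once at equal volume.",
]

embedder = SentenceTransformer("all-MiniLM-L6-v2")  # small local embedding model
doc_vecs = embedder.encode(docs, normalize_embeddings=True)

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k chunks most similar to the query (cosine similarity)."""
    q_vec = embedder.encode([query], normalize_embeddings=True)[0]
    scores = doc_vecs @ q_vec  # dot product == cosine on normalized vectors
    top = np.argsort(scores)[::-1][:k]
    return [docs[i] for i in top]

query = "Build me a 3-day hypertrophy program focused on legs."
context = "\n".join(retrieve(query))
prompt = f"Use this context:\n{context}\n\nQuestion: {query}"
print(prompt)  # feed this prompt to Mistral/Llama via your local inference stack
```

A vector DB basically does the `retrieve` step for you at scale, so you don't keep every embedding in memory yourself.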
This was explained better by this guy 16 days ago:
https://www.reddit.com/r/LocalLLaMA/comments/17qse19/comment/k8e7fvx/