creaturefeature16@alien.top to LocalLLaMA • "When training an LLM how do you decide to use a 7b, 30b, 120b, etc model (assuming you can run them all)?" • 1 year ago
I have a similar question to OP's. What if you wanted to train a model specifically on coding? And even more specifically on, say, just a particular library?