KeyAdvanced1032@alien.top to LocalLLaMA • Best Local LLM Backend Server Library? · 2 years ago
I think all frameworks support custom instruct templates, and I know for a fact that llama.cpp does: I use StudioLM, which is based on llama.cpp, and it lets me alter the system / user / assistant templates.
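For context, "altering the system / user / assistant templates" usually amounts to changing how the backend stitches role-tagged segments into one raw prompt string before tokenizing. Below is a minimal Python sketch of that idea, assuming ChatML-style tags; the tag strings, constant names, and `build_prompt` helper are illustrative, not the actual StudioLM or llama.cpp API.

```python
# Sketch: assembling a custom instruct template before sending text to a
# llama.cpp-style backend. The ChatML tags below are an assumption; substitute
# whatever template your particular model was trained on.

SYSTEM_TEMPLATE = "<|im_start|>system\n{content}<|im_end|>\n"
USER_TEMPLATE = "<|im_start|>user\n{content}<|im_end|>\n"
ASSISTANT_PREFIX = "<|im_start|>assistant\n"


def build_prompt(system: str, user: str) -> str:
    """Concatenate the role templates into the raw prompt the backend tokenizes."""
    return (
        SYSTEM_TEMPLATE.format(content=system)
        + USER_TEMPLATE.format(content=user)
        + ASSISTANT_PREFIX  # generation continues from here
    )


if __name__ == "__main__":
    prompt = build_prompt(
        system="You are a concise assistant.",
        user="Explain what an instruct template is in one sentence.",
    )
    print(prompt)
```

Mismatched templates are a common cause of degraded output with local models, which is why being able to edit the system / user / assistant strings in the frontend matters.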
KeyAdvanced1032@alien.top to LocalLLaMA • In my opinion open-source projects should focus on a very narrow thing, instead of focusing on being a "GPT" that can do everything · 2 years ago
Small specialized LLMs are going to become as commonplace as frameworks are now.