perlthoughts@alien.top to LocalLLaMA · 2 years ago
New Model: openchat 3.5 with 16k context (huggingface.co)
rkzed@alien.top · 2 years ago:
I'm confused by their prompt format. Do we really need to use their library to try the model?
perlthoughts@alien.top (OP) · 2 years ago:
Nah, you can use llama.cpp or whatever you like; TheBloke already has multiple GGUF versions up.
involviert@alien.top · 2 years ago:
They were asking about the prompt format. Their library presumably translates the OpenAI-API-style messages into the actual prompt format internally, and that format is not documented at all.
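For reference, the openchat 3.5 model card describes a "GPT4 Correct" chat template with `<|end_of_turn|>` separators. Assuming that template is what the library applies internally, building the prompt by hand (e.g. for llama.cpp) might look like this sketch:

```python
def openchat_prompt(messages):
    """Build an openchat 3.5-style prompt from (role, text) pairs.

    Assumes the "GPT4 Correct" template from the model card:
        GPT4 Correct User: {msg}<|end_of_turn|>GPT4 Correct Assistant: {reply}<|end_of_turn|>...
    ending with an open assistant turn for the model to complete.
    """
    parts = []
    for role, text in messages:
        speaker = "GPT4 Correct User" if role == "user" else "GPT4 Correct Assistant"
        parts.append(f"{speaker}: {text}<|end_of_turn|>")
    # Leave the final assistant turn open so the model generates the reply.
    parts.append("GPT4 Correct Assistant:")
    return "".join(parts)

# Example: a single user turn.
print(openchat_prompt([("user", "Hi")]))
# GPT4 Correct User: Hi<|end_of_turn|>GPT4 Correct Assistant:
```

Double-check against the tokenizer config on the HuggingFace page before relying on this; chat templates change between releases.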
Dear_noobs@alien.top · 2 years ago:
I came across this yesterday: one interface that lets you jump between all the things. Find what you want to try, click Download, then chat with it…