perlthoughts@alien.top to LocalLLaMA · 2 years ago — New Model: openchat 3.5 with 16k context (huggingface.co)
perlthoughts@alien.top (OP) · 2 years ago — Nah, you can use llama.cpp or whatever you like; TheBloke already has multiple GGUF versions up.
involviert@alien.top · 2 years ago — They were talking about the prompt format. Their library is obviously translating that OpenAI API-style message list into the actual prompt format internally, and that format is not documented at all.
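For anyone hitting this with llama.cpp: the translation the comment above describes is just flattening the OpenAI-style message list into the model's turn template. A minimal sketch, assuming the "GPT4 Correct User"/"GPT4 Correct Assistant" template from the OpenChat 3.5 model card (not stated in this thread, so verify it against the tokenizer's chat template before relying on it):

```python
# Sketch: flatten OpenAI API-style messages into an OpenChat 3.5 prompt.
# The role names and <|end_of_turn|> separator below come from the OpenChat
# 3.5 model card, not from this thread -- treat them as an assumption.

def build_openchat_prompt(messages):
    """Translate [{'role': ..., 'content': ...}] into a flat prompt string."""
    role_names = {
        "user": "GPT4 Correct User",
        "assistant": "GPT4 Correct Assistant",
    }
    parts = []
    for msg in messages:
        parts.append(f"{role_names[msg['role']]}: {msg['content']}<|end_of_turn|>")
    # End with the bare assistant header so the model generates the reply.
    parts.append("GPT4 Correct Assistant:")
    return "".join(parts)

prompt = build_openchat_prompt([{"role": "user", "content": "Hello"}])
# -> "GPT4 Correct User: Hello<|end_of_turn|>GPT4 Correct Assistant:"
```

With a GGUF build you would pass this string straight to llama.cpp as the raw prompt instead of relying on the server's chat-completion endpoint to guess the template.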