perlthoughts@alien.top to LocalLLaMA · 1 year ago
New Model: openchat 3.5 with 16k context (huggingface.co)
perlthoughts@alien.top (OP) · 1 year ago:
Nah, you can use llama.cpp or whatever you like; TheBloke already has multiple GGUF versions up.
involviert@alien.top · 1 year ago:
They were talking about the prompt format. Their library is obviously translating that OpenAI-API style into the actual prompt format internally, and that format is not documented at all.
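The translation involviert is describing can be sketched as follows. This is a minimal illustration, assuming the `GPT4 Correct User` / `GPT4 Correct Assistant` template shown on the OpenChat 3.5 Hugging Face model card; the helper function name is made up for the example:

```python
def to_openchat_prompt(messages):
    """Flatten OpenAI-style {"role", "content"} messages into a single
    OpenChat 3.5 prompt string (template assumed from the model card)."""
    role_map = {
        "user": "GPT4 Correct User",
        "assistant": "GPT4 Correct Assistant",
    }
    parts = []
    for msg in messages:
        parts.append(f"{role_map[msg['role']]}: {msg['content']}<|end_of_turn|>")
    # Close with the assistant tag so the model generates the reply.
    parts.append("GPT4 Correct Assistant:")
    return "".join(parts)

prompt = to_openchat_prompt([{"role": "user", "content": "Hello"}])
print(prompt)
# GPT4 Correct User: Hello<|end_of_turn|>GPT4 Correct Assistant:
```

When a wrapper library accepts OpenAI-style message lists, something like this mapping has to happen internally before the text reaches the model, which is why the underlying template matters even if the API hides it.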