perlthoughts@alien.top to LocalLLaMA · English · 1 year ago
New Model: openchat 3.5 with 16k context (huggingface.co)
ReMeDyIII@alien.top · 1 year ago
Plus, isn’t GPT-3.5-Turbo multimodal? There’s no way a 7B can outperform that.