oKatanaa to LocalLLaMA • MonadGPT, an early modern chatbot trained on Mistral-Hermes and 17th century books
How was it trained? Did you just train it on the passages from those books? If so, I am very surprised it retained its conversational capabilities. I would expect it to just go off the rails and generate random 17th century stuff.
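If the passages were wrapped into a chat format before fine-tuning, rather than fed in as raw text, that would explain why it still behaves like an assistant. Below is a minimal sketch of what that wrapping could look like; the thread doesn't describe the actual MonadGPT recipe, so the model id, the synthetic prompt, and the sample passage are all assumptions, not the author's method:

```python
# Hypothetical sketch (not the actual MonadGPT training data pipeline):
# wrap raw 17th-century passages as user/assistant turns so the fine-tuned
# model keeps seeing conversational structure instead of only period text.
from transformers import AutoTokenizer

# Assumed Mistral-Hermes base; swap in whichever checkpoint was actually used.
tokenizer = AutoTokenizer.from_pretrained("teknium/OpenHermes-2.5-Mistral-7B")

passages = [
    "Of the nature of comets, and what they doe portend...",  # placeholder excerpt
]

def to_chat_sample(passage: str) -> str:
    # Pair each passage with a synthetic user prompt so the model still
    # learns the "user asks, assistant answers" pattern.
    messages = [
        {"role": "user", "content": "Tell me about comets."},   # synthetic prompt
        {"role": "assistant", "content": passage},               # period text as the reply
    ]
    # Render with the model's own chat template (ChatML for Hermes models).
    return tokenizer.apply_chat_template(messages, tokenize=False)

train_texts = [to_chat_sample(p) for p in passages]
print(train_texts[0][:200])
```

Training on strings like these (instead of bare book text) is one way continued fine-tuning could shift the style to 17th century English without breaking the chat behavior.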
You’re better off using something like BERT rather than shooting a pigeon with a ballistic missile. It's easier, cheaper, faster and much more reliable.