Nonetendo65 (LocalLLaMA): Anyone have a 1B or 3B model that is mostly coherent?
Reply (1 year ago): I’ve found Orca-Mini quite helpful for simple generation tasks under 200 tokens. At only ~2.0 GB (quantized) it’s quite capable and easy to deploy on consumer hardware. It was fine-tuned on an Orca-style dataset — instruction data with explanation traces generated by larger GPT models, the approach from Microsoft’s Orca paper :)
GPT-4, by contrast, is plagued by outages; I’ve found the API too unreliable for a production setting. Perhaps this will improve with time :)