Using Flan-T5 models - https://huggingface.co/lorahub
Access to powerful, open-source LLMs has also inspired a community devoted to refining the accuracy of these models, as well as reducing the computation required to run them. This vibrant community is active on the Hugging Face Open LLM Leaderboard, which is updated often with the latest top-performing models.
That’s a nice indirect shout-out.
The stages of learning
MS is a lot more than just its OpenAI investment; shorting MS based on OpenAI alone doesn’t seem like the best idea.
And we find the backend is just Mechanical Turk.
He spoke at the Cambridge Union on the 1st of November to receive the Hawking Fellowship; from the talk, the allegations sound like a lot of BS. It’s a shame I can’t short their stock - https://www.youtube.com/watch?v=NjpNG0CJRMM
I use oobabooga; I’m actually testing it with Language Agent Tree Search to see if it can produce better outputs - https://github.com/andyz245/LanguageAgentTreeSearch
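Roughly, the idea looks something like the sketch below. This is not the repo’s actual API, just a heavily simplified greedy variant of the sample-score-expand loop (real LATS does MCTS with reflection), and it assumes oobabooga’s OpenAI-compatible API extension is enabled on the default port:

```python
# Hypothetical sketch: greedy tree search over LLM samples against a local
# oobabooga backend. Endpoint/params are assumptions, not the LATS repo's API.
import requests

API = "http://127.0.0.1:5000/v1/completions"  # oobabooga OpenAI-style endpoint

def generate(prompt: str, n: int = 3) -> list[str]:
    """Sample n candidate continuations from the local model."""
    outs = []
    for _ in range(n):
        r = requests.post(API, json={"prompt": prompt, "max_tokens": 256,
                                     "temperature": 0.8})
        outs.append(r.json()["choices"][0]["text"])
    return outs

def score(task: str, candidate: str) -> float:
    """Ask the model to rate a candidate 0-10; a crude stand-in for a value fn."""
    r = requests.post(API, json={
        "prompt": f"Task: {task}\nAnswer: {candidate}\nRate 0-10:",
        "max_tokens": 4, "temperature": 0.0})
    try:
        return float(r.json()["choices"][0]["text"].strip().split()[0])
    except (ValueError, IndexError):
        return 0.0

def tree_search(task: str, depth: int = 2, width: int = 3) -> str:
    """At each level, sample `width` expansions and keep the best-scoring one."""
    best = ""
    for _ in range(depth):
        candidates = generate(task + best, n=width)
        best += max(candidates, key=lambda c: score(task, best + c))
    return best

print(tree_search("Write a Python function that reverses a linked list."))
```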
Deepseek Coder 33B worked well for me; I asked it to write the game Snake, and it got it right on the first try with the 4-bit GPTQ quant - https://huggingface.co/TheBloke/deepseek-coder-33B-instruct-GPTQ
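For anyone who wants to try it: a minimal sketch of loading that GPTQ quant with transformers (assumes auto-gptq/optimum and accelerate are installed; the prompt is just an illustration, not necessarily the exact one used above):

```python
# Minimal sketch: load TheBloke's 4-bit GPTQ quant and generate.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TheBloke/deepseek-coder-33B-instruct-GPTQ"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",  # shard across available GPUs via accelerate
)

prompt = "Write the game Snake in Python using pygame."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=1024, do_sample=False)
# Strip the prompt tokens before decoding so only the completion prints.
print(tokenizer.decode(out[0][inputs["input_ids"].shape[1]:],
                       skip_special_tokens=True))
```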
Other models are available to run on CPU/GPU - https://huggingface.co/models?search=deepseek%2033b
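For CPU, a GGUF quant via llama-cpp-python is the usual route. A minimal sketch (the filename is a placeholder for whichever quant you download):

```python
# Hypothetical sketch: CPU inference on a GGUF quant with llama-cpp-python.
from llama_cpp import Llama

llm = Llama(
    model_path="deepseek-coder-33b-instruct.Q4_K_M.gguf",  # placeholder path
    n_ctx=4096,
)
out = llm("Write FizzBuzz in Python.", max_tokens=256)
print(out["choices"][0]["text"])
```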
Hacker News thread - https://news.ycombinator.com/item?id=38309611
If you collapse, does that end the universe???
Models are data, and data is speech; any attempt by the USG to regulate it is therefore a violation of the 1st Amendment, just like with cryptography.
https://reason.com/video/2020/10/21/cryptowars-gilmore-zimmermann-cryptography/
https://github.com/khaimt/qa_expert