Very exciting for multi-lingual models. I really hope this one performs as well as the benchmarks suggest.
Yes. This is known as Mixture of Experts (MoE).
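For anyone unfamiliar, here's a rough sketch of the idea (PyTorch; the layer name, expert sizes, and routing details are made up for illustration and don't reflect any particular model's implementation): a small router scores the experts for each token, and only the top-k experts actually run, with their outputs mixed by the gate weights.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleMoE(nn.Module):
    """Minimal Mixture-of-Experts sketch: route each token to its top-k expert MLPs."""

    def __init__(self, dim, num_experts=4, top_k=2):
        super().__init__()
        self.router = nn.Linear(dim, num_experts)  # scores experts per token
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )
        self.top_k = top_k

    def forward(self, x):  # x: (tokens, dim)
        gates = F.softmax(self.router(x), dim=-1)        # (tokens, num_experts)
        weights, idx = gates.topk(self.top_k, dim=-1)    # keep only the top-k experts per token
        weights = weights / weights.sum(dim=-1, keepdim=True)  # renormalize kept gate weights
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e                 # tokens whose slot-th choice is expert e
                if mask.any():
                    out[mask] += weights[mask, slot:slot + 1] * expert(x[mask])
        return out
```

In real MoE models the gating and dispatch are batched rather than looped like this, but the principle is the same: only a fraction of the parameters are active per token.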
We already have several promising ways of doing this:
But are the short responses more correct?
Game changer! Would love to see this incorporated into ExLlama, AutoGPTQ, and llama.cpp.
The startup time makes Replicate nearly unusable for me. Only popular models stay in memory; less-used models shut down, so you have to wait for a cold start before the first inference.
Do you have a link to the STL for the 3D print?
Today I learned there is a large group of "EA"s (Effective Altruists) made up of millionaires and people in high positions of power.
These people believe it is their duty to act as the ‘gatekeepers’ of AI and to prevent regular people from having useful or powerful AI. They want to destroy the open-source AI movement and any company willing to give regular people access to powerful AI.
Would love to use this for handling remote security camera footage.
Tried it with LLaVA with little success. Has anyone successfully applied any of the open vision models to the security problem?
None of the open models perform function calling as well as OpenAI…
Do we get the dataset this time?
How fast is the generation? Can it be used in real time?
I too would like a comparison of these two techniques.
What VS Code extensions are you using with these local models?
Sounds like he LIED to the board and they kicked him off. What did he lie about? Did they crack AGI, or is the board upset that Altman appears to be pro-regulation?
OpenAI is also a non-profit.
Been using lmql since guidance appeared to be abandoned. Having great success with it.
Wow, could this be used to replicate MoE?
PLEASE BE OPEN SOURCE!
Has anyone tested this yet? We have a use case for our European partners in German-speaking countries. Would like to know what other people’s experiences have been.