

That would be amazing. I think something like that could even be included into ooba's official extension repo.
They fine-tuned their model on Llama 2, or what?
Sooo, how’s the model?
In other news: the water is wet.
I'm still trying to figure out the correct settings for under 200k context. Ooba loads compress_emb (or whatever it's called) at 5 million, and I don't know whether to leave it alone or change it if you drop the context size to, say, 64k.
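For what it's worth, if that setting is a linear RoPE position-compression factor (like compress_pos_emb in ooba), the usual rule of thumb is target context divided by the model's native context, floored at 1. A minimal sketch, with made-up function and parameter names, assuming linear interpolation scaling:

```python
# Hypothetical helper: derive a linear RoPE compression factor.
# Assumes the loader divides positions by this factor so a model trained
# at native_ctx treats target_ctx as if it were native length.

def rope_compress_factor(target_ctx: int, native_ctx: int) -> float:
    # No compression needed when the target fits inside native context.
    return max(1.0, target_ctx / native_ctx)

# A model trained natively at 200k needs no compression at 64k:
print(rope_compress_factor(64_000, 200_000))  # 1.0 -> leave it alone
# A 4k-native model stretched to 16k would need 4x:
print(rope_compress_factor(16_000, 4_000))    # 4.0
```

So if the model really is 200k-native, going down to 64k shouldn't require touching it; the 5-million number sounds more like a rope_freq_base (theta) value than a compression factor, but that's a guess.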