Find is really great in specific instances. As soon as I give it a bunch of code with any complexity in my requests, it falls apart and does much worse than ChatGPT.
I think this would play well with some games.
Might need to go back to dual graphics card setups, one for fine-tuned LLMs and one for the actual game (see the sketch below for pinning a model to a second GPU).
But imagine a Civilization 7 with LLM-based AI.
Can’t imagine the batshit stuff that would come out of that, and the replayability would be huge. It would even pair well with the more limited context (we don’t really need Catherine the Great to hold a grudge from 3000 BC in 1800 AD).
I could see secret alliances, clandestine plans to take over the world, strange treaties… the world congress could be a lot more fun/interactive…
Why did Gandhi team up with Sulaiman to launch a first strike nuclear attack on me??? What does he mean the burning flesh of my people is more satisfying than the scent of fresh rose water???
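On the dual-GPU point above: here’s a minimal sketch, assuming a machine with two CUDA devices and using a made-up placeholder model name, of keeping a small fine-tuned NPC model on the second GPU so the game keeps the first one to itself:

    # Sketch: pin a small fine-tuned LLM to a second GPU so the game
    # keeps the primary GPU. "my-org/civ-diplomacy-7b" is a made-up
    # placeholder model name, not a real checkpoint.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    # Prefer the second GPU if one exists, otherwise fall back to the first.
    device = "cuda:1" if torch.cuda.device_count() > 1 else "cuda:0"

    tok = AutoTokenizer.from_pretrained("my-org/civ-diplomacy-7b")
    model = AutoModelForCausalLM.from_pretrained(
        "my-org/civ-diplomacy-7b", torch_dtype=torch.float16
    ).to(device)

    prompt = "You are Gandhi. The player just refused your trade offer. Reply in character:"
    inputs = tok(prompt, return_tensors="pt").to(device)
    out = model.generate(**inputs, max_new_tokens=80, do_sample=True, temperature=0.9)
    # Decode only the newly generated tokens, not the prompt.
    print(tok.decode(out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))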
How many users do you have? If you’ve been keeping your inputs/outputs to GPT-4, then you can probably use that to tune your own model that will perform similarly.
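If you go that route, most of the prep work is just getting your logs into a training format. A minimal sketch, assuming your logs are already a list of {"prompt": ..., "response": ...} dicts (the field names and paths here are placeholders), that writes a chat-style JSONL most fine-tuning stacks can consume with little or no conversion:

    # Sketch: turn logged prompt/response pairs into a chat-style JSONL
    # file for supervised fine-tuning. The "prompt"/"response" field names
    # and the output path are placeholders for whatever your logging uses.
    import json

    def build_sft_dataset(logs, out_path="sft_train.jsonl"):
        with open(out_path, "w", encoding="utf-8") as f:
            for record in logs:
                example = {
                    "messages": [
                        {"role": "user", "content": record["prompt"]},
                        {"role": "assistant", "content": record["response"]},
                    ]
                }
                f.write(json.dumps(example, ensure_ascii=False) + "\n")

    if __name__ == "__main__":
        # Tiny stand-in for your real logs.
        logs = [
            {"prompt": "Summarize this ticket: ...", "response": "The user reports ..."},
        ]
        build_sft_dataset(logs)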
The biggest issue you’re going to have is probably hardware.
LLMs are not cheap to run, and if you start needing multiple of them to replace OpenAI, your bill is going to be pretty significant just to keep the models online.
It’s also going to be tough to maintain all the infra you’ll need without a full-time devops/MLops person.