I’m confused by their prompt format; do we really need to use their library to try the model?
- 0 Posts
- 3 Comments
Joined 2 years ago
Cake day: November 12th, 2023
rkzed@alien.top to LocalLLaMA • Question about the 'economics' of running a LLM locally? • English • 1 • 2 years ago

If I just want to try big models, renting is more reasonable. But if I plan to use one daily and over a longer period of time, investing in a GPU makes a lot of sense. Not only can you use the GPU for gaming or other tasks, you also actually own the hardware, and you could probably sell it later for at least half the original price.
I had great fun with this specific model. Tried up to 32K context length with very minimal repetition problems…