Appropriate-Tax-9585@alien.top (OP) to LocalLLaMA • What kind of specs to run a local LLM and serve up to 20-50 users?
1 year ago
At the moment I'm just trying to grasp the basics, for example what kind of GPUs I will need and how many. This is mostly for comparison against SaaS options; in practice I need to set up a server for testing with just a few users. I'm going to research this myself, but I like this community and would like to hear others' views on the case, as I imagine many here have tried to manage their own servers :)
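For a first pass at the GPU question, a rough back-of-envelope sketch can help: VRAM is roughly model weights plus a KV cache per concurrent user. The numbers below (4-bit 7B model, fp16 KV cache of ~512 KB per token, 20 users at 4k context) are illustrative assumptions, not measured values; actual KV-cache size depends heavily on the model architecture and the serving stack.

```python
def estimate_vram_gb(params_billion: float, bytes_per_param: float,
                     n_users: int, ctx_tokens: int,
                     kv_bytes_per_token: int) -> float:
    """Very rough VRAM estimate: weights + per-user KV cache."""
    weights = params_billion * 1e9 * bytes_per_param     # static model weights
    kv_cache = n_users * ctx_tokens * kv_bytes_per_token # grows with concurrency
    return (weights + kv_cache) / 1e9

# Assumed example: 7B model at 4-bit (~0.5 bytes/param), 20 users,
# 4096-token context, ~512 KB/token KV cache at fp16 (2 * 32 layers * 4096 dim * 2 bytes).
print(round(estimate_vram_gb(7, 0.5, 20, 4096, 524288), 1))  # ≈ 46.4 GB
```

Note that under these assumptions the KV cache (not the weights) dominates, which is why multi-user serving stacks invest so much in KV-cache management and batching.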
Thank you, this is really good to hear!