sarl__cagan • r/LocalLLaMA • Anyone spend a bunch of $$ on a computer for LLM and regret it?
1 year ago
I bought a fancy 4090 rig and returned it, but mostly because I want a more powerful rig.
If you are cool just using the command line, ollama is great and easy to use.
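For example, a minimal sketch of scripting against Ollama's local API (assuming the Ollama server is running on its default port 11434 and you've already pulled a model, e.g. `ollama pull mistral`):

```python
import requests

# Ask the local Ollama server for a one-off completion.
# Assumes the default endpoint http://localhost:11434 and the "mistral" model.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "mistral",
        "prompt": "Explain what a quantized 7B model is in one sentence.",
        "stream": False,  # return the whole answer as a single JSON response
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```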
Otherwise, you could download the LM Studio app on Mac, grab a model through its search feature, and start chatting. Models from TheBloke are good. You will probably need to try a few (GGUF format most likely; GGML is the older version of that format). Mistral 7B or Llama 2 7B is a good starting place IMO.
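If you later want to script against LM Studio instead of chatting in its UI, it can also expose an OpenAI-compatible local server. A minimal sketch, assuming that server is enabled on its default port 1234, the `openai` Python package is installed, and "local-model" is just a placeholder name:

```python
from openai import OpenAI

# LM Studio serves whichever model you've loaded over an OpenAI-compatible
# local API (default http://localhost:1234/v1). The api_key value is ignored
# locally, but the client requires something to be set.
client = OpenAI(base_url="http://localhost:1234/v1", api_key="not-needed")

chat = client.chat.completions.create(
    model="local-model",  # placeholder; LM Studio uses the currently loaded model
    messages=[
        {"role": "user", "content": "Give me one tip for running 7B models on a laptop."}
    ],
    temperature=0.7,
)
print(chat.choices[0].message.content)
```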