unculturedperl@alien.top to LocalLLaMA • What kind of mini PC can handle a local LLM?
1 year ago
How mini do you want? I plugged a llama2 7b into an N100 w/16GB and ran it; speed was not very good.
Real question is what are you trying to accomplish and is this the best route to do so?
Speed costs money, how fast can you afford to go?
Why 72GB? 80 or 96 seems like a more reasonable number. H100s come in an 80GB model if you can afford it ($29k?). Two A6000 Adas would be ~$15k (plus a system to put them in).
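Rough back-of-envelope math on why those VRAM numbers matter (this is my own sketch, not a benchmark; the 20% overhead for KV cache and activations is an assumption and varies with context length):

```python
# Rough VRAM estimate for serving an LLM: weight storage plus overhead.
# est_vram_gb is a hypothetical helper; numbers are ballpark, not measured.

def est_vram_gb(n_params_billion: float, bits_per_weight: float,
                overhead: float = 1.2) -> float:
    """Billions of params * bytes per param = GB of weights,
    padded ~20% for KV cache and activations (assumed, varies)."""
    weight_gb = n_params_billion * bits_per_weight / 8
    return weight_gb * overhead

# A 70B model at fp16 blows past even an 80GB H100;
# 4-bit quantized, it fits on a single 48GB card.
print(round(est_vram_gb(70, 16), 1))  # fp16
print(round(est_vram_gb(70, 4), 1))   # 4-bit quantized
```

That's roughly why people land on 2x 48GB or 80GB+ setups for 70B-class models, while a quantized 7B is the realistic ceiling for a 16GB mini PC.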
The higher-end compute cards seem limited more by funds and production than anything; the x090 consumer cards are where you find more scalpers and their ilk.