tl;dr: can I just drop a used 3090 into my old PC, or would a new 4060 Ti be the safer option?
Hi all!
I really want to get my feet wet running local LLMs, especially:
- inference of 7B models
- some QLoRA fun
I'd also like to have fun running bigger, quantized models and, if feasible, finetune some smallish model like GPT-2 XL (~1.5B params); otherwise I'll just rent some cloud compute. A little gaming (Escape from Tarkov) in my free time wouldn't hurt either.
I've figured out that my best GPU options are:
- 4060 Ti 16GB for around €450 new, hoping for some Black Friday deals
- 3090 24GB used for around €700
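For context on how the two VRAM sizes compare, here's a rough sanity check of what model weights alone cost at different quantization levels. These are back-of-envelope numbers (my assumption) that ignore the KV cache and runtime overhead, which add a few more GB in practice:

```python
# Back-of-envelope VRAM estimate for LLM weights only.
# Ignores KV cache, activations, and framework overhead (assumption).
def weight_vram_gb(params_billion: float, bytes_per_param: float) -> float:
    """Memory needed just for the model weights, in GiB."""
    return params_billion * 1e9 * bytes_per_param / 1024**3

for label, bpp in [("FP16", 2.0), ("8-bit", 1.0), ("4-bit", 0.5)]:
    print(f"7B @ {label}: ~{weight_vram_gb(7, bpp):.1f} GiB")
```

By this estimate a 7B model at FP16 (~13 GiB) barely squeezes into 16GB, while 4-bit quantization leaves plenty of headroom on either card.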
My current (very old) PC specs are the following:
- i5-2500 @ 3.3GHz
- 16GB DDR3
- Asus P8P67 LGA1155 (PCIe 2.0)
- AMD R9 270 Sapphire
- a 600W PSU
So my questions are:
- Can I afford to put my whole budget into the 3090? I have a second PSU at home that would be used solely to power the GPU outside the case
- Or is it better to buy the 4060 Ti and use the remaining budget to upgrade the older parts (if so, which ones?)
Thanks for the help, guys!
3090 for more VRAM = bigger models
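To put a number on "bigger models": a weights-only estimate (my assumption; it ignores KV cache and runtime overhead, so real limits are lower) of the largest 4-bit-quantized model each card could hold looks like this:

```python
# Rough upper bound on model size that fits in VRAM at a given quantization.
# overhead_gb reserves space for KV cache / runtime (assumed value).
def max_params_billion(vram_gb: float, bytes_per_param: float,
                       overhead_gb: float = 2.0) -> float:
    """Largest model (billions of params) whose weights fit after overhead."""
    return (vram_gb - overhead_gb) * 1024**3 / bytes_per_param / 1e9

for card, vram in [("4060 Ti", 16), ("3090", 24)]:
    print(f"{card}: up to ~{max_params_billion(vram, 0.5):.0f}B at 4-bit")
```

Even with these optimistic numbers, the extra 8GB on the 3090 is roughly the difference between a 13B-class and a 30B-class model once realistic overhead is accounted for.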