Recently, I got interested in fine-tuning low-parameter models on my low-end hardware. My specs are: i7-1195G7, 32 GB RAM, and no dedicated GPU. I want to fine-tune a model to mimic my writing style, based on years of text I've written myself. Right now I'm looking at fine-tuning TinyLlama. Is this possible? And if so, roughly how long would the fine-tuning take?
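
For reference, this is roughly the setup I have in mind: a minimal LoRA fine-tuning sketch assuming the Hugging Face transformers + peft + datasets stack. The checkpoint name, data file path, and hyperparameters below are placeholders I haven't tested, not a known-good recipe.

```python
# Minimal LoRA fine-tuning sketch for a small causal LM on CPU.
# Assumes: transformers, peft, datasets installed; paths/hyperparameters are placeholders.
from transformers import (AutoModelForCausalLM, AutoTokenizer, Trainer,
                          TrainingArguments, DataCollatorForLanguageModeling)
from peft import LoraConfig, get_peft_model
from datasets import load_dataset

model_name = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(model_name)

# LoRA keeps the trainable parameter count tiny, which matters a lot on CPU.
lora = LoraConfig(r=8, lora_alpha=16, target_modules=["q_proj", "v_proj"],
                  lora_dropout=0.05, task_type="CAUSAL_LM")
model = get_peft_model(model, lora)

# my_writing.txt: the personal text corpus, one passage per line (placeholder path).
data = load_dataset("text", data_files="my_writing.txt")["train"]
data = data.map(lambda x: tokenizer(x["text"], truncation=True, max_length=512),
                remove_columns=["text"])

args = TrainingArguments(output_dir="tinyllama-style-lora",
                         per_device_train_batch_size=1,
                         gradient_accumulation_steps=8,
                         num_train_epochs=1,
                         learning_rate=2e-4,
                         logging_steps=10)
# With no CUDA device present, the Trainer falls back to CPU automatically.

trainer = Trainer(model=model, args=args, train_dataset=data,
                  data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False))
trainer.train()
```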

  • __SlimeQ__@alien.top · 1 year ago

    I can’t speak for 1B models, but you’re going to have a really hard time training with no GPU. It’s just going to take an insanely long time.

    For $500, though, you can get a 4060 Ti with 16 GB of VRAM, which is good enough to train a 13B LoRA.
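
    If you go that route, fitting a 13B base model on 16 GB usually means loading it in 4-bit (QLoRA-style) and training LoRA adapters on top. A rough sketch, assuming the bitsandbytes + peft stack; the checkpoint name and LoRA settings are placeholders, not a tested recipe:

    ```python
    # Rough sketch: load a 13B model in 4-bit and attach LoRA adapters,
    # so only the adapter weights (a tiny fraction of parameters) are trained.
    import torch
    from transformers import AutoModelForCausalLM, BitsAndBytesConfig
    from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

    bnb = BitsAndBytesConfig(load_in_4bit=True,
                             bnb_4bit_quant_type="nf4",
                             bnb_4bit_compute_dtype=torch.float16)

    model = AutoModelForCausalLM.from_pretrained(
        "meta-llama/Llama-2-13b-hf",   # placeholder 13B checkpoint
        quantization_config=bnb,
        device_map="auto")

    model = prepare_model_for_kbit_training(model)
    model = get_peft_model(model, LoraConfig(r=16, lora_alpha=32,
                                             target_modules=["q_proj", "v_proj"],
                                             task_type="CAUSAL_LM"))
    model.print_trainable_parameters()  # confirms only the adapters are trainable
    ```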