So I’m considering building a serious LLM rig, and the M2 Ultra looks like a strong option for large memory, with much lower power draw and heat than two to eight 3090s or 4090s, albeit at lower speeds.

I’d like to know if anyone is using one and what it’s like. I’ve read that software support is weaker than on NVIDIA hardware, which could be an issue. Also, is it any good for Stable Diffusion?

Another question is about memory and context length. With a smaller model whose weights don’t fill the memory, can the leftover capacity go toward a longer context? A big context window seems like it would be useful for writing books and the like. Something like the sketch below is what I have in mind.
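To make it concrete, here’s roughly the kind of setup I mean; just a sketch, assuming llama-cpp-python (which has a Metal backend on Apple Silicon), with a placeholder model path and context size:

```python
# Sketch only: assumes llama-cpp-python built with its Metal backend,
# which is the usual default on Apple Silicon. Model path is a placeholder.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/model.gguf",  # placeholder path
    n_ctx=32768,        # context window; limited mainly by available memory
    n_gpu_layers=-1,    # offload all layers to the GPU (Metal)
)

out = llm("Write the opening paragraph of a novel.", max_tokens=200)
print(out["choices"][0]["text"])
```

The `n_ctx` knob is the one I’m asking about: on paper, whatever memory the weights don’t use could go to the context.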

Is there anything else to consider? Thanks.

  • bebopkim1372@alien.top · 1 year ago

    Another question is about memory and context length. With a smaller model whose weights don’t fill the memory, can the leftover capacity go toward a longer context? A big context window seems like it would be useful for writing books and the like.

    Of course. A long context also consumes memory: the KV cache grows linearly with the number of tokens in the window, on top of the model weights. More memory is always good for LLMs and other AI stuff. The sketch below shows the rough arithmetic.
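    Back-of-the-envelope, assuming a Llama-2-7B-like layout (32 layers, 32 KV heads, head dim 128, fp16 cache); the exact numbers depend on the model, and GQA models cache far fewer heads:

    ```python
    # Rough KV-cache size estimate. All architecture numbers below are
    # assumptions for a Llama-2-7B-like model; check your model's config.
    def kv_cache_bytes(n_tokens, n_layers=32, n_kv_heads=32,
                       head_dim=128, bytes_per_elem=2):
        # 2x for the K and V tensors cached at every layer
        return 2 * n_layers * n_kv_heads * head_dim * bytes_per_elem * n_tokens

    for ctx in (4_096, 32_768, 131_072):
        print(f"{ctx:>7} tokens -> {kv_cache_bytes(ctx) / 2**30:.1f} GiB")
    ```

    That works out to about 0.5 MiB per token here, so a 4-bit 7B (roughly 4 GB of weights) plus a 128k fp16 cache (about 64 GiB) still fits easily in an M2 Ultra’s 192 GB of unified memory. That is exactly the sense in which big memory buys you context.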