Use case is that I want to create a service based on Mistral 7B that will serve an internal office of 8-10 users.

I’ve been looking at modal.com and RunPod. Are there any other recommendations?

  • dazld@alien.top · 1 year ago

    Did you think about running it on a local M1 Mac mini? Ollama uses the Mac GPU out of the box.
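
    A minimal sketch of what that setup could look like for a small office, assuming Ollama is running on the Mac mini with the mistral model already pulled (ollama pull mistral) and OLLAMA_HOST set so it listens on the office network. The endpoint and fields below follow Ollama's documented /api/generate REST API on its default port 11434; the mac-mini.local hostname is a placeholder.

    ```python
    # Hypothetical client for a shared Ollama instance serving Mistral 7B to office users.
    # Assumes the Mac mini is reachable at mac-mini.local and Ollama listens on port 11434.
    import requests

    OLLAMA_URL = "http://mac-mini.local:11434/api/generate"  # placeholder hostname

    def ask_mistral(prompt: str) -> str:
        """Send a single prompt to the shared Ollama server and return the full response."""
        resp = requests.post(
            OLLAMA_URL,
            json={"model": "mistral", "prompt": prompt, "stream": False},
            timeout=120,
        )
        resp.raise_for_status()
        return resp.json()["response"]

    if __name__ == "__main__":
        print(ask_mistral("Summarize this paragraph in two sentences: ..."))
    ```

    One thing to keep in mind with a single local box: concurrent requests from 8-10 users all contend for the same GPU, so latency under simultaneous load is the main thing worth testing before committing to it over a hosted option.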