dazld@alien.top to LocalLLaMA • What's recommended hosting for open source LLMs?
Did you think about running it off a local M1 Mac mini? Ollama uses the Mac GPU out of the box.
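If you go that route, Ollama also serves a local HTTP API on port 11434, so other machines or apps can treat the Mac mini as a hosting box. A minimal sketch in Python, assuming Ollama is running and a model (here llama2, just an example) has already been pulled:

```python
import requests

# Send a prompt to a locally running Ollama server (default port 11434).
# "stream": False returns the whole completion in one JSON response.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "llama2", "prompt": "Why is the sky blue?", "stream": False},
)
print(resp.json()["response"])
```

To reach it from another machine on the LAN, you'd point the URL at the mini's IP instead of localhost.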