Jugg3rnaut@alien.top to LocalLLaMA · English · 1 year ago
GPU-over-IP for LLM inference? (1 comment)
Jugg3rnaut@alien.top to LocalLLaMA · English · 1 year ago
Chassis only has space for 1 GPU - Llama 2 70b possible on a budget? (8 comments)