MonkeyMaster64@alien.top to LocalLLaMA, re: "ExLlamaV2: The Fastest Library to Run LLMs" · 1 year ago
Is this able to use CPU (similar to llama.cpp)?
MonkeyMaster64@alien.top to LocalLLaMA · 1 year ago
Large-scale LLM deployment with GBNF support (1 comment)