derpgod123@alien.top to LocalLLaMA · English · 1 year ago
Is using WSL good enough for running LLM models locally?
WSL eats a lot of RAM.
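One common way to rein in WSL2's memory appetite is a `.wslconfig` file in your Windows user profile directory (`%UserProfile%\.wslconfig`). A minimal sketch, assuming WSL2 on a 32 GB machine where you want to cap the VM at 16 GB; the exact values are illustrative and should be tuned to your hardware:

```ini
# %UserProfile%\.wslconfig — limits for the WSL2 VM
[wsl2]
memory=16GB      # cap RAM available to WSL2 (default is up to 50% of host RAM)
processors=8     # cap CPU cores visible to the VM
swap=8GB         # swap file size; set to 0 to disable swap entirely
```

After saving the file, run `wsl --shutdown` from a Windows terminal so the VM restarts with the new limits. Note that for GPU-accelerated inference the model weights mostly live in VRAM, so a modest `memory` cap is often fine; for CPU inference the cap must still leave room for the full model plus context.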