DrVonSinistro@alien.topB to LocalLLaMA (English) · 1 year ago
«I don't have the ability to form memories or learn from interactions» — that's what she actually said
DrVonSinistro@alien.topB in LocalLLaMA • Models Megathread #2 - What models are you currently using? (English) · 1 year ago
Because a model can be divine or crap depending on its settings, I think it's important to specify what I use: Deepseek 33B Q8 GGUF with Min-p sampling (I love it very much). Source of my Min-p settings: "Your settings are (probably) hurting your model - Why sampler settings matter" on r/LocalLLaMA (reddit.com)
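For readers unfamiliar with the Min-p sampler mentioned above, here is a minimal sketch of the idea: keep only tokens whose probability is at least `min_p` times the probability of the most likely token, then renormalize. The cutoff therefore adapts to how peaked the distribution is. The value 0.05 below is a commonly cited default, not necessarily the commenter's exact setting.

```python
import math

def min_p_filter(logits, min_p=0.05):
    """Min-p sampling filter (sketch): keep tokens whose probability
    is >= min_p * p(top token), then renormalize the survivors."""
    # Softmax over the raw logits (numerically stabilized).
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]

    # Dynamic cutoff: scales with the top token's probability.
    cutoff = min_p * max(probs)
    kept = {i: p for i, p in enumerate(probs) if p >= cutoff}

    # Renormalize the surviving candidates.
    s = sum(kept.values())
    return {i: p / s for i, p in kept.items()}

# A peaked distribution keeps few candidates; a flat one keeps many.
peaked = min_p_filter([5.0, 1.0, 0.5, -2.0], min_p=0.1)  # only the top token survives
flat = min_p_filter([1.0, 0.9, 0.8, 0.7], min_p=0.1)     # all four tokens survive
```

This is why Min-p tends to behave well across temperatures: when the model is confident, sampling stays tight; when it is uncertain, more candidates remain in play.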
DrVonSinistro@alien.topB to LocalLLaMA (English) · 1 year ago
Chat style for coding?
DrVonSinistro@alien.topB in LocalLLaMA • Buying a p40 for 70b-120b (English) · 1 year ago
I run 2x P40s with a 70B chat model at 8k ctx. I get 7-8 T/s and I'm very happy with that; anything above 5 is awesome for me.
DrVonSinistro@alien.topB to LocalLLaMA (English) · 1 year ago
I need help with my P40 I bought to run inference
DrVonSinistro@alien.topB to LocalLLaMA · 1 year ago
PSA about Mining Rigs