PookaMacPhellimen to LocalLLaMA · 1 year ago
Qwen-72B released (huggingface.co) · 39 comments
candre23 · 1 year ago
"we have expanded the context window length to 32K"
Kinda buried the lede here. This is far and away the biggest feature of this model. Here's hoping it's actually decent as well!
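For anyone who wants to poke at that 32K window locally, here is a minimal sketch using Hugging Face transformers. It assumes the "Qwen/Qwen-72B" repo id from the linked release and that you have enough GPU memory (or offloading) for a 72B model; the prompt and generation settings are illustrative only.

```python
# Minimal sketch: load Qwen-72B and generate against a long prompt.
# Assumes the "Qwen/Qwen-72B" repo id and enough GPU memory / offloading for a 72B model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen-72B"  # assumed repo id from the linked Hugging Face release

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision; the weights are still ~144 GB
    device_map="auto",           # spread layers across available GPUs / CPU
    trust_remote_code=True,      # Qwen ships custom modeling code
)

prompt = "Summarize the following document:\n" + open("long_document.txt").read()
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# The 32K window only helps if the prompt actually fits; check before generating.
print(f"Prompt length: {inputs.input_ids.shape[1]} tokens (limit ~32768)")

output = model.generate(**inputs, max_new_tokens=512)
print(tokenizer.decode(output[0][inputs.input_ids.shape[1]:], skip_special_tokens=True))
```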
jeffwadsworth · 1 year ago
Well, it depends on how well it keeps the context resolution. Did you see that comparison sheet on Claude and GPT-4? Astounding.
domlincog · 1 year ago
https://preview.redd.it/c5k1ugynhj3c1.png?width=1100&format=png&auto=webp&s=4024b3e295ab740f341e132b9d9662104fdc09ef