apepkuss@alien.top to LocalLLaMA • "What's recommended hosting for open source LLMs?" • 1 year ago

WebAssembly-based open source LLM inference (API service and local hosting): https://github.com/second-state/llama-utils