So far, I have experimented with the following projects:
https://github.com/huggingface/chat-ui - Amazing clean UI with very good web search, my go-to currently.
https://github.com/oobabooga/text-generation-webui - Best overall, supports any model format and has many extensions
https://github.com/ParisNeo/lollms-webui/ - Has PDF, stable diffusion and web search integration
https://github.com/h2oai/h2ogpt - Has PDF, Web search, best for files ingestion (supports many file formats)
https://github.com/SillyTavern/SillyTavern - Best for custom characters and roleplay
https://github.com/NimbleBoxAI/ChainFury - Has great UI and web search (experimental)
https://github.com/nomic-ai/gpt4all - Basic UI that replicates ChatGPT
https://github.com/imartinez/privateGPT - Basic UI that replicates ChatGPT, with PDF integration
LM Studio - Clean UI, focuses on GGUF format
-
Really love them, and I'm wondering if there are any other great projects.
Some of them include full web search and PDF integrations, some are more about characters, and oobabooga, for example, is the best at trying every single model format there is, since it supports anything.
What is your favorite project to interact with your large language models?
Share your findings and I'll add them!
exui by turboderp (the exllamav2 creator) is a nice UI for exl2 models. https://github.com/turboderp/exui
I wish we had a UI like this for GGUF (for Apple)
Can it serve on a CPU-only machine?
I use koboldcpp for local LLM deployment. It's clean, it's easy, and it allows for sliding context. It can also act as a drop-in replacement for the OpenAI API.
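Since koboldcpp can stand in for the OpenAI API, any OpenAI-style client code can talk to it. A minimal sketch using only the standard library — the port (koboldcpp's default 5001), endpoint path, and model name here are assumptions about a default local setup, not details from the thread:

```python
import json
import urllib.request

# Assumed default: koboldcpp's OpenAI-compatible endpoint on localhost:5001
API_URL = "http://localhost:5001/v1/chat/completions"

def build_request(prompt, max_tokens=256):
    """Build an OpenAI-style chat completion payload."""
    return {
        "model": "koboldcpp",  # placeholder; the local server serves whatever model it loaded
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

def chat(prompt):
    """POST the payload to the local server and return the reply text."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_request(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]
```

Because the request/response shape matches OpenAI's, swapping between the local server and the hosted API is just a matter of changing the URL.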
Here is a new one I found the other day. Still seems to be WIP but overall I really like what is being done here - https://github.com/lobehub/lobe-chat
Wow, looks very good indeed. How is the web extraction plugin? Can you share some screenshots?
kobold.cpp deserves a spot on the list. https://github.com/LostRuins/koboldcpp
Agreed, will add that!
I mostly use a UI I made myself:
https://github.com/shinomakoi/AI-Messenger
Works with llama.cpp and Exllama V2, supports LLaVA, character cards and moar.
Right now, I’m using your earlier project [1]. It’s proving to be incredibly helpful, thank you!
Since it’s a desktop application, it’s more convenient for me than the WebUIs, because I tend to have a lot of tabs open in my browser, which makes things pretty chaotic. I have set up an AutoHotkey script so I can launch it with an easy-to-remember hotkey.
What is your favorite project to interact with
I still don’t have a favorite, tbh. I’ve tried a few of the UIs you shared, and I found them to be either too complicated or lacking in certain areas I need. Like many others, I ended up building my own.
Share your findings
Recently, I started collecting local UIs and this is what I’ve gathered so far: UI list.
How is chat-ui local? Last time I tried, it still required Mongo.
I had some struggles with it. It works best for me in combination with llama.cpp, and you need to run a docker command to start a MongoDB instance for your chats locally.
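For reference, the Mongo step is a one-liner. A sketch assuming Docker is installed; the container name and port follow the chat-ui README defaults, so adjust as needed:

```shell
# Start a local MongoDB for chat-ui's conversation history
docker run -d -p 27017:27017 --name mongo-chatui mongo:latest

# Then point chat-ui at it in its env config:
# MONGODB_URL=mongodb://localhost:27017
```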
Even the search results can be queried on your device instead of through an API.
https://github.com/serge-chat/serge
A web interface for chatting with Alpaca through llama.cpp. Fully dockerized, with an easy-to-use API.
(…without websearch)
Serge is underrated and little-known, and development is slow because of it.
Nice, thanks for compiling this info.
I released a UI last week: noco-ai/spellbook-docker (github.com). v0.1.0 ships with 50+ chat plugins that handle things like simple math (multiplication, addition, …), image generation, TTS, Bing news search, etc.
I’d check out Sanctum too. Super easy to use
Websearch is dope. Too bad for me, because I am comfortable with pip, not npm. Setting this up would involve pulling some hair out, so I will not even attempt it.
I have had decent results with LangChain and the SERP API for Google search with GPT-4 function calling. However, I would LOVE a Python implementation of ChatUI's search functionality. I hope someone makes a wrapper (if that's even a thing; I am not a programmer by profession).
I'm not great at troubleshooting errors, but the install of chat-ui was pretty straightforward.
If you already have a llama.cpp server, it would be very easy to connect.
I enjoy the search functionality so much that I think it's worth the hassle. If you need any help with it, just comment here.
I have a llama.cpp server up and running. I will def give the install a shot!
If you need any help with the .env.local file, tell me and I'll help out.
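For anyone else wiring this up, a minimal `.env.local` sketch for pointing chat-ui at a local llama.cpp server — the model name and endpoint URL below are assumptions for a default llama.cpp server setup, not details confirmed in the thread:

```
MONGODB_URL=mongodb://localhost:27017
MODELS=`[
  {
    "name": "local-llama",
    "endpoints": [{ "type": "llamacpp", "url": "http://127.0.0.1:8080" }]
  }
]`
```

With that in place, chat-ui routes completions to the llama.cpp server instead of a hosted API.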
No exui?
https://github.com/turboderp/exui
It's blazing fast, VRAM efficient, supports min-p, and has a notebook mode… what else could I ask for?
I was using ooba before, but I have dumped it because it's so much slower.
That looks very clean for sure.
FreeChat OSX app
Do any of these projects support clustering across multiple GPUs/users, or even multiple machines?