Hey y’all, quick update about my open source llama.cpp app, FreeChat. As of this weekend it’s live on the Mac App Store. Big thanks to this community for all the feedback and testing, I would not have gotten here without y’all. Next I’m working on the most common request I get here: a model catalog.
Have friends who aren’t hackers but who you think should try local AI? Send them a link! Hoping to expand local AI usage by making it dead simple.
App Store! https://apps.apple.com/us/app/freechat/id6458534902
And for the hackers: https://github.com/psugihara/FreeChat
Looks good, but I’d need to upgrade my system to install it. Why does it require such a high system version?
I think I needed a newer API for something. The minimum is 13.5 (the last minor release of the last major version). What version are you on?
Stupid question: would it be possible to make this work on iOS?
Congrats on the launch! It’s nice to witness the rise of these new open source tools.
Thank you!
I just want to add, for those who might wonder: this will run at least up to 7B models (e.g. some nice newer Mistral models!) on an 8 GB RAM 2017 i5 iMac (3.4 GHz Intel). I can get about 4.5-6 t/s on the old beast with a 7B model, and about 7-8 t/s running the 3B orca_mini. So there’s some hope even for old machines. Thanks for making an app and running it through the App Store process too!
hell yeah, glad you can use it!
Unfortunately, it does not work with “CausalLM 14B”:
Error Loading (causallm_14b.Q5_1.gguf)
Also, as far as I can see, the prompt format can’t be set manually. Is it selected automatically? ChatML, etc.
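For anyone unfamiliar with what I mean by ChatML: it wraps each turn in special tokens before the text is sent to the model. A minimal sketch of that formatting in Swift (just to illustrate the format, not FreeChat’s actual code):

```swift
// Sketch of ChatML-style prompt formatting (illustrative, not FreeChat's implementation).
struct ChatMessage {
    let role: String    // "system", "user", or "assistant"
    let content: String
}

func chatMLPrompt(_ messages: [ChatMessage]) -> String {
    var prompt = ""
    for message in messages {
        // Each turn is wrapped in <|im_start|> ... <|im_end|> tokens.
        prompt += "<|im_start|>\(message.role)\n\(message.content)<|im_end|>\n"
    }
    // Leave the assistant turn open so the model completes it.
    prompt += "<|im_start|>assistant\n"
    return prompt
}

// Example:
// chatMLPrompt([
//     ChatMessage(role: "system", content: "You are a helpful assistant."),
//     ChatMessage(role: "user", content: "Hello!")
// ])
```

Other models expect different templates (Alpaca, Llama chat, etc.), which is why I’m asking whether the app picks one automatically.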
Are transparent windows no longer part of the macOS private API?
Looks like it’s been available since macOS 12 https://developer.apple.com/documentation/swiftui/shapestyle/ultrathinmaterial?changes=_5