What is the best way to learn LLM / generative AI topics in depth:
- loss functions
- fine-tuning (LoRA, QLoRA, etc.)
- creating good datasets
- quantization (e.g. AWQ)
etc. I know there is the fast.ai course, but I'm more interested in the topics above. There seem to be scattered guides, notebooks, and a promising repo here: https://github.com/peremartra/Large-Language-Model-Notebooks-Course, but nothing comprehensive. The kind of hands-on detail I'm after looks something like the LoRA sketch below.
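To make that concrete, here is a minimal sketch of LoRA fine-tuning setup using Hugging Face transformers + peft; the model name ("gpt2"), target modules, and hyperparameters are placeholder assumptions for illustration, not recommendations:

```python
# Minimal LoRA setup sketch (assumptions: gpt2 as the base model, illustrative hyperparameters).
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

model_name = "gpt2"  # placeholder; any causal LM checkpoint works here
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# LoRA freezes the base weights and learns low-rank update matrices (rank r)
# on selected modules, so only a small fraction of parameters is trained.
config = LoraConfig(
    r=8,                        # rank of the low-rank adapters
    lora_alpha=16,              # scaling applied to the adapter output
    target_modules=["c_attn"],  # GPT-2's fused attention projection
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, config)
model.print_trainable_parameters()  # prints how few parameters are actually trainable
```

From there the adapted model can be trained with the usual Trainer loop; QLoRA is essentially the same idea applied on top of a 4-bit quantized base model.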
I second this
Karpathy's channel is awesome for an intro; then you can just start reading the papers for more in-depth learning.
Microsoft's AI for Beginners (https://microsoft.github.io/AI-For-Beginners/) and "AI for Everyone" by Andrew Ng.
If you don't know where to start, perhaps begin with this video from Karpathy, published yesterday: https://www.youtube.com/watch?v=zjkBMFhNj_g
Absolutely! I wanted to start a thread like this myself. There is plenty of information out there on general AI topics, but I'm looking for concrete tutorials and articles on current open-source LLM practice. I agree that the knowledge is frustratingly scattered.