KoboldCpp: Run GGUF models easily with a KoboldAI UI. One File. Zero Install.
llama.cpp: LLM inference in C/C++
qBittorrent: BitTorrent client
smolvlm-realtime-webcam: Real-time webcam demo with SmolVLM and the llama.cpp server