Hey 👋, welcome to LocalSLM!
This repo is all about running Small Language Models (SLMs) locally: think LLMs (ChatGPT, Claude, Gemini) but way more lightweight and privacy-friendly. Perfect for Chromebooks.
- Local Inference: Run SLMs on your own machine, no cloud needed.
- Plug-n-Play: Add or swap out models easily.
- Extensible: Designed for quick hacks, tweaks, and upgrades.
- Privacy First: All your data stays on your device (also, no server to maintain).
- Zero Setup: Download the HTML and you're done.
- Model Selection: Drop your models into `models/` and point the config at them.
- Custom Prompts: Tweak the prompts in the source or config for your use case.
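As a rough sketch, a config might look something like this (the filename `config.json` and the keys shown here are assumptions for illustration, not the repo's actual schema):

```json
{
  "model": "models/my-slm-q4.gguf",
  "systemPrompt": "You are a helpful local assistant.",
  "temperature": 0.7
}
```

Swap the `model` path for whatever file you dropped into `models/`, and tweak `systemPrompt` to fit your use case.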
Got ideas? Found a bug? Wanna help out?
- Fork it 🍴
- Make your changes ✏️
- Submit a PR 🚀
Q: What even is a "Small Language Model"?
A: It's like a mini version of a GPT-style LLM: way lighter, and it runs entirely on your machine!
Q: Can I use my own model?
A: Yup! Just drop it into `models/` and update the config.
Q: Is this production-ready?
A: Nah, this is for local AI and experiments. Use at your own risk!
Made with 💻 by Henry and the Robots.
MIT: do whatever, just don't sue me 😅 (BCPS :| )
Stay curious, stay local!