Psyllama documentation
Psyllama helps you run large language models locally with a simple CLI and an OpenAI-compatible API.
Quickstart
Install Psyllama and run your first model.
curl -fsSL https://psyllama.com/install.sh | sh
psyllama run kimi-k2.5:cloud
API reference
Use the local HTTP API to build apps and integrations.
curl http://localhost:11434/api/chat \
-d '{"model":"kimi-k2.5:cloud","messages":[{"role":"user","content":"Hello!"}]}'
Libraries
The API works with any HTTP client; official language SDKs may be added later.
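As a sketch of what "any HTTP client" looks like in practice, here is a minimal Python example using only the standard library. The endpoint and request shape are taken from the curl example above; the `stream: false` field is an assumption (many local LLM servers stream newline-delimited JSON chunks unless told otherwise), so adjust it to match Psyllama's actual API.

```python
import json
import urllib.request

# Local API endpoint shown in the curl example above.
PSYLLAMA_URL = "http://localhost:11434/api/chat"

def build_chat_request(model, messages, url=PSYLLAMA_URL):
    """Build a POST request for the chat endpoint.

    `stream: False` is an assumption: it asks for one complete JSON
    response instead of a stream of chunks.
    """
    payload = {"model": model, "messages": messages, "stream": False}
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def chat(model, messages):
    """Send a chat request to the local server and decode the JSON reply."""
    req = build_chat_request(model, messages)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Requires a running Psyllama server on localhost:11434.
    reply = chat("kimi-k2.5:cloud", [{"role": "user", "content": "Hello!"}])
    print(reply)
```

Because the request is plain JSON over HTTP, the same pattern translates directly to any other language's HTTP client.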