# yak-llm

A minimalist local LLM terminal chat interface using Ollama, with native streaming and JSONL memory. Yak at your local model. Fully offline. Edit or delete your chat histories.

```bash
npm install yak-llm
```
---
- 🧠 Runs on any Ollama-compatible model (gemma3, llama3, mistral, etc.)
- 🧵 Offline, real-time streaming replies in the terminal
- 📁 Memory stored in lightweight JSONL files on your machine
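Each session's memory might look like the following JSONL, one message object per line. This is a sketch: the exact field names are assumptions, so check the files under `.yak/chats` for the real schema.

```jsonl
{"role": "user", "content": "What's a good name for a chat app?"}
{"role": "assistant", "content": "How about \"Yak\"?"}
```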
---
- Node.js 18+
- Ollama installed and running
---
```bash
npm install -g yak-llm
```

## Start

```bash
yak start
```

Start chatting:

```
🦧> What's a good name for a chat app?
🤖: How about "Yak"? It's short, and yaks are fluffy like llamas!
```

If you have Ollama downloaded and running, Yak will automatically download a model if none is found. Yak will also create a new default chat session for your first session.

Stop chatting with the `/bye` command or Ctrl+C:

```
🦧> /bye
```

## Usage
```
📋 Main Commands
──────────────────────────────────────────────────
yak start        Start chat session with your model
yak help         Show this help message
yak list         List all chat sessions
yak models       List available models

💬 Chat Management
──────────────────────────────────────────────────
yak new          Create new chat session
yak switch       Switch to chat session
yak delete       Delete chat session
yak --reset      Clear current chat history

🤖 Model Management
──────────────────────────────────────────────────
yak model        Switch to different model
yak models       List downloaded models

💬 In-Chat Commands
──────────────────────────────────────────────────
/bye or /quit    Exit chat session
http or www      Chat will detect given URLs and crawl them
```

## Models
You can see your downloaded Ollama models by running:
```bash
yak models
```

You can change the model with the CLI command:

```bash
yak model
```

or in the `config.json` file manually:
```json
{
  "model": "gemma3:1b"
}
```

If you want more models, download them from Ollama! 🦙
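For example, pulling a model with the Ollama CLI (the model name here is just the one from the config above):

```bash
ollama pull gemma3:1b
```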
## History

Don't like your chat? All messages are logged to `.yak/chats`. You can remove lines from a file, or delete the entire log:

```bash
yak delete
```
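If you'd rather prune specific messages than delete a whole session, you can filter the JSONL log by line. This is a sketch: `chat.jsonl` below stands in for a log file under `.yak/chats`, and the message fields are assumed, not Yak's documented schema.

```bash
# Create a sample log standing in for a file under .yak/chats
# (field names are assumptions for illustration).
cat > chat.jsonl <<'EOF'
{"role":"user","content":"What's a good name for a chat app?"}
{"role":"assistant","content":"How about Yak?"}
{"role":"user","content":"never mind"}
EOF

# Keep every line EXCEPT those containing "never mind",
# then swap the pruned file back into place.
grep -v 'never mind' chat.jsonl > chat.pruned.jsonl
mv chat.pruned.jsonl chat.jsonl
```

Because each message is a single line of JSON, line-oriented tools like `grep` and `sed` work on the history without any special tooling.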