AI-powered code assistant CLI with RAG (Retrieval-Augmented Generation), LLM classification, and MCP tools for frontend development.

Features:
- Semantic Search - Find relevant code using RAG with vector embeddings
- Interactive Chat - Ask questions about your codebase
- Smart Classification - LLM-based question routing (git/crm/tasks/rag)
- MCP Tools - Model Context Protocol for git operations, CRM, and task management
- Project Analysis - Automatic indexing and code understanding
- Frontend Focus - Optimized for JavaScript/TypeScript projects
Requirements:
- Node.js >= 18.0.0
- Ollama - local LLM runtime (installation guide: https://ollama.ai/)
```bash
# Install Ollama
curl https://ollama.ai/install.sh | sh
# Pull required models
ollama pull llama3.2
ollama pull nomic-embed-text
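
# Optional: verify both models are available (assumes the Ollama server is running)
ollama list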
```

Install globally:

```bash
npm install -g @nikita-bekish/code-assistant
```

Or install locally and run via npx:

```bash
npm install @nikita-bekish/code-assistant
npx code-assistant --help
```

Quick Start:

1. Initialize in your project:
```bash
cd your-project
code-assistant init
```

2. Index your codebase:
```bash
code-assistant index
```

3. Start chatting:
```bash
code-assistant chat
```

Commands:

```bash
code-assistant init
```

Creates `.code-assistant-config.json` with default settings.
```bash
code-assistant index
```

What indexing does:
- Indexes all files according to `.code-assistant-config.json`
- Generates embeddings using Ollama
- Stores chunks in `node_modules/.code-assistant/`
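
Conceptually, indexing splits each included file into overlapping chunks and requests an embedding for each chunk from Ollama. The sketch below illustrates that idea only; it is not the package's actual implementation, and it assumes a local Ollama server on the default port exposing the `/api/embeddings` endpoint:

```typescript
// Illustrative sketch only; not the package's actual implementation.
// Assumes a local Ollama server on the default port (11434).

type Chunk = { file: string; text: string; embedding: number[] };

// Split text into overlapping windows, mirroring chunkSize/chunkOverlap from the config.
function chunkText(text: string, chunkSize = 1024, chunkOverlap = 256): string[] {
  const chunks: string[] = [];
  const step = chunkSize - chunkOverlap;
  for (let start = 0; start < text.length; start += step) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break;
  }
  return chunks;
}

// Ask Ollama for an embedding of one chunk.
async function embed(text: string): Promise<number[]> {
  const res = await fetch("http://localhost:11434/api/embeddings", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "nomic-embed-text", prompt: text }),
  });
  const data = (await res.json()) as { embedding: number[] };
  return data.embedding;
}

// Index one file: chunk it and embed every chunk.
async function indexFile(file: string, source: string): Promise<Chunk[]> {
  return Promise.all(
    chunkText(source).map(async (text) => ({ file, text, embedding: await embed(text) }))
  );
}
```

With the default chunkSize of 1024 and chunkOverlap of 256, consecutive chunks share 256 characters, so code that straddles a chunk boundary still appears whole in at least one chunk.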
```bash
code-assistant chat
```

Example questions:
- "How does authentication work?"
- "Show me high priority tasks"
- "What is the current git status?"
- "List all open tickets for user_1"
To rebuild the index after code or config changes:

```bash
code-assistant reindex
```

`.code-assistant-config.json` example:
```json
{
"projectName": "My Project",
"projectDescription": "A modern web application",
"indexing": {
"includeFolders": ["src", "lib"],
"excludeFolders": ["node_modules", "dist", ".git"],
"includeFileTypes": ["js", "ts", "jsx", "tsx", "vue", "svelte"],
"chunkSize": 1024,
"chunkOverlap": 256
},
"llm": {
"model": "llama3.2",
"temperature": 0.7,
"maxResults": 5
},
"embedding": {
"model": "nomic-embed-text",
"provider": "ollama"
}
}
```
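
`llm.maxResults` caps how many indexed chunks are handed to the model as context for a question. Below is a rough sketch of that ranking step under the usual RAG approach (the question is assumed to have been embedded with the same model as the index, and the `Chunk` shape is an assumption for illustration):

```typescript
// Rough sketch of the retrieval step; the Chunk shape is an assumption for illustration.
type Chunk = { file: string; text: string; embedding: number[] };

// Cosine similarity between two embedding vectors of the same length.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Rank stored chunks against the embedded question and keep the top maxResults.
function topChunks(questionEmbedding: number[], chunks: Chunk[], maxResults = 5): Chunk[] {
  return [...chunks]
    .sort(
      (a, b) =>
        cosineSimilarity(questionEmbedding, b.embedding) -
        cosineSimilarity(questionEmbedding, a.embedding)
    )
    .slice(0, maxResults);
}
```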

MCP Tools

Git Tools:
- Get current branch
- git_status - Show repository status

CRM Tools:
- get_user - User information
- list_tickets - User tickets
- create_ticket - New support ticket
- update_ticket - Update ticket status

Tasks Tools:
- list_tasks - Team tasks with filters
- create_task - New task
- update_task - Update task status
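
Each tool above is exposed over the Model Context Protocol, where a client invokes it with a JSON-RPC `tools/call` request naming the tool and its arguments. A hypothetical request for `list_tasks` is sketched below; the `priority` argument is an illustrative assumption, not the tool's documented schema:

```typescript
// Hypothetical MCP tools/call request for the list_tasks tool.
// The "priority" argument is an illustrative assumption; the real schema may differ.
const callToolRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "list_tasks",
    arguments: { priority: "high" },
  },
};

// The server's result carries the tool output, which the assistant
// folds into its chat answer.
```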
Examples:

```bash
# Initialize and index
code-assistant init
code-assistant index

# Ask about code
code-assistant chat
> How does authentication work in this project?

# Ask about tasks
> Show me high priority tasks
```

Using the OpenAI API instead of Ollama:
```bash
export OPENAI_API_KEY=your_key_here
code-assistant chat
```

Troubleshooting
Ollama not found:
Install Ollama from https://ollama.ai/

Missing models:
Pull required models:
```bash
ollama pull llama3.2
ollama pull nomic-embed-text
```

No index found:
Run indexing first:
```bash
code-assistant index
```

Slow responses or high memory usage:
- Reduce chunkSize in config
- Use smaller LLM model
- Consider using the OpenAI API

Development
```bash
# Clone repository
git clone https://github.com/nikita-bekish/my-code-assistant.git
cd my-code-assistant

# Install dependencies
npm install

# Build
npm run build

# Test locally
npm link
code-assistant --help
```

MIT © Nikita Bekish