# jstar-reviewer

Local-First, Context-Aware AI Code Reviewer - works with any language.

`npm install jstar-reviewer`

```bash
# Install globally
npm install -g jstar-reviewer

# In any project directory:
jstar setup     # Create config files
jstar init      # Index the codebase
jstar review    # Review staged changes
```
### One-line setup (alternative)

```bash
curl -fsSL https://raw.githubusercontent.com/JStaRFilms/jstar-code-review/v2.0.0/setup.js | node
```
### Usage

1. Check Config: The tool now auto-creates `.env.example` and `.jstar/` when you run it.
2. Add Keys: Copy `.env.example` → `.env.local` and add your `GEMINI_API_KEY` and `GROQ_API_KEY`.
3. Index: Run `jstar init` (or `pnpm run index:init`) to build the brain.
4. Review: Stage changes (`git add`) and run `jstar review` (or `pnpm run review`).

For a detailed walkthrough, see `ONBOARDING.md`.
---
```
git diff --staged
│
▼
┌──────────────────┐
│ Detective │ ← Static analysis (secrets, console.log, "use client")
│ Engine │
└────────┬─────────┘
│
▼
┌──────────────────┐
│ Local Brain │ ← Gemini embeddings via LlamaIndex
│ (Retrieval) │
└────────┬─────────┘
│
▼
┌──────────────────┐
│ Chunked Review │ ← Splits diff by file, delays between calls
│ Queue │
└────────┬─────────┘
│
▼
┌──────────────────┐
│ Groq LLM │ ← moonshotai/kimi-k2-instruct-0905
│ (The Judge) │
└────────┬─────────┘
│
▼
📝 Review Report
```
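The diagram maps onto a small orchestration loop. Below is a hedged sketch of that flow, not the real `scripts/reviewer.ts`: the regex rules, prompt wording, and the `retrieveContext()` helper are placeholders, and the Groq call goes through the public `groq-sdk` client.

```typescript
// Illustrative sketch of the pipeline above (not the actual scripts/reviewer.ts).
// Assumptions: the groq-sdk package, GROQ_API_KEY in the environment, and a
// hypothetical retrieveContext() standing in for the LlamaIndex retriever.
import { execSync } from "node:child_process";
import Groq from "groq-sdk";

const groq = new Groq({ apiKey: process.env.GROQ_API_KEY });
const MODEL_NAME = "moonshotai/kimi-k2-instruct-0905";
const DELAY_BETWEEN_CHUNKS_MS = 2000;

// Detective engine: cheap static checks that never need an API call.
function runDetective(chunk: string): string[] {
  const findings: string[] = [];
  if (/(api[_-]?key|secret|token)\s*[:=]\s*["'][^"']+["']/i.test(chunk)) {
    findings.push("possible hard-coded secret");
  }
  if (/console\.log\(/.test(chunk)) findings.push("leftover console.log");
  if (/["']use client["']/.test(chunk)) findings.push('new "use client" directive');
  return findings;
}

// Hypothetical stand-in for the local brain: the real tool queries the
// LlamaIndex store persisted in .jstar/storage.
async function retrieveContext(_chunk: string): Promise<string> {
  return "";
}

async function reviewChunk(chunk: string): Promise<string> {
  const findings = runDetective(chunk);
  const context = await retrieveContext(chunk);
  const completion = await groq.chat.completions.create({
    model: MODEL_NAME,
    messages: [
      { role: "system", content: "You are a strict senior code reviewer." },
      {
        role: "user",
        content: `Project context:\n${context}\n\nStatic findings:\n${findings.join("\n")}\n\nDiff to review:\n${chunk}`,
      },
    ],
  });
  return completion.choices[0]?.message?.content ?? "";
}

async function main(): Promise<void> {
  // Chunked review queue: one request per changed file, paced between calls.
  const diff = execSync("git diff --staged", { encoding: "utf8" });
  const chunks = diff.split(/^diff --git /m).filter((c) => c.trim().length > 0);

  const report: string[] = [];
  for (const chunk of chunks) {
    report.push(await reviewChunk(chunk));
    await new Promise((resolve) => setTimeout(resolve, DELAY_BETWEEN_CHUNKS_MS));
  }
  console.log(report.join("\n\n---\n\n"));
}

main().catch(console.error);
```

Splitting the diff per file and sleeping between requests is what keeps each call small and paced; the real reviewer also enforces `MAX_TOKENS_PER_REQUEST` (see Configuration below).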
## 🚀 Quick Start
### 1. Install dependencies

```bash
pnpm install
```
### 2. Add API keys

Create `.env.local`:

```env
GEMINI_API_KEY=your_gemini_key
GROQ_API_KEY=your_groq_key
```
### 3. Build the index

```bash
pnpm run index:init
```
### 4. Review staged changes

```bash
git add <files>
pnpm run review
```
## 📁 Project Structure
```
scripts/
├── indexer.ts # Scans codebase, builds vector index
├── reviewer.ts # Orchestrates review pipeline
├── detective.ts # Static analysis engine
├── gemini-embedding.ts # Google Gemini adapter
└── mock-llm.ts # LlamaIndex compatibility stub
.jstar/
└── storage/ # Persisted embeddings (gitignored)
docs/features/
├── architecture-v2.md # Full architecture docs
├── detective.md # Static analysis rules
├── analyst.md # LLM reviewer (The Judge)
└── ...
```
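The indexing side ("the brain") is easiest to picture as a LlamaIndex build-and-persist step. The sketch below is an approximation, not the project's `indexer.ts`: the file list, the `GeminiEmbedding` import path, and the use of the llamaindex `Settings` API are assumptions.

```typescript
// Minimal indexing sketch (assumes llamaindex's Settings/VectorStoreIndex API and a
// GeminiEmbedding adapter along the lines of scripts/gemini-embedding.ts).
import { readFileSync } from "node:fs";
import { Document, Settings, VectorStoreIndex, storageContextFromDefaults } from "llamaindex";
import { GeminiEmbedding } from "./gemini-embedding"; // hypothetical local adapter

async function buildIndex(files: string[]): Promise<void> {
  // Use Gemini for embeddings instead of the default embedding model.
  Settings.embedModel = new GeminiEmbedding();

  // One Document per source file, keyed by its path.
  const documents = files.map(
    (path) => new Document({ text: readFileSync(path, "utf8"), id_: path }),
  );

  // Persist the vector store under .jstar/storage so reviews can reuse it.
  const storageContext = await storageContextFromDefaults({
    persistDir: ".jstar/storage",
  });
  await VectorStoreIndex.fromDocuments(documents, { storageContext });
}

buildIndex(["scripts/reviewer.ts", "scripts/detective.ts"]).catch(console.error);
```

Because the store is persisted under `.jstar/storage/`, later `jstar review` runs can retrieve related code for context without re-embedding the whole repository.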
## ⚙️ Configuration

Edit `scripts/reviewer.ts`:
```typescript
const MODEL_NAME = "moonshotai/kimi-k2-instruct-0905";
const MAX_TOKENS_PER_REQUEST = 8000;
const DELAY_BETWEEN_CHUNKS_MS = 2000;
```
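How these constants are applied is up to `reviewer.ts`; as a rough illustration only, a per-file chunk could be split further whenever it would exceed the request budget. The characters-per-token estimate below is a naive assumption, not the tool's actual accounting.

```typescript
// Illustrative only: a naive token-budget check, assuming ~4 characters per token.
const MAX_TOKENS_PER_REQUEST = 8000;

function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

// Split an oversized per-file diff into smaller pieces that fit the budget.
function fitToBudget(fileDiff: string): string[] {
  if (estimateTokens(fileDiff) <= MAX_TOKENS_PER_REQUEST) return [fileDiff];
  const pieces: string[] = [];
  const maxChars = MAX_TOKENS_PER_REQUEST * 4;
  for (let i = 0; i < fileDiff.length; i += maxChars) {
    pieces.push(fileDiff.slice(i, i + maxChars));
  }
  return pieces;
}
```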