# Ralph Inferno

AI-driven autonomous development workflow - Ralph builds while you sleep.

```bash
npm install ralph-inferno
```
🔥 R A L P H · I N F E R N O   M O D E 🔥

Build while you sleep. Wake to working code.
> Light the fire. Walk away.
>
> Ralph runs autonomously on a VM while you do literally anything else.
> Come back when it's done. Or don't. Ralph doesn't care.
## How It Works
Ralph installs as Claude Code commands and Codex CLI prompts. When you run `npx ralph-inferno install`, it creates a `.ralph/` folder with scripts, syncs `.claude/commands/` for Claude Code, and syncs `~/.codex/prompts/` for Codex CLI.
```
Local Machine                       VM (Sandbox)
┌──────────────────────────┐        ┌─────────────────┐
│ Claude Code or Codex CLI │        │ Agent CLI       │
│ + Ralph commands/prompts │ GitHub │ + ralph.sh      │
│                          │ ─────► │                 │
│ /ralph: or /prompts:     │        │ Runs specs      │
└──────────────────────────┘        └─────────────────┘
```
The flow:
1. You work locally with Claude Code (/ralph:) or Codex CLI (/prompts:ralph-)
2. Deploy pushes your specs to GitHub and starts Ralph on the VM
3. Ralph runs autonomously on the VM while you sleep
4. Next day: review what was built
## Prerequisites

### Local machine

| Tool | Required | How to install |
|------|----------|----------------|
| Node.js | Yes | `brew install node` |
| Claude Code | Yes (choose one) | `npm install -g @anthropic-ai/claude-code` |
| Codex CLI | Yes (choose one) | `npm install -g @openai/codex` |
| GitHub CLI | Recommended | `brew install gh` then `gh auth login` |
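On macOS with Homebrew, the local-machine tools from the table can be installed in a few commands (pick whichever agent CLI you actually use):

```bash
# Local machine setup (commands from the table above)
brew install node gh
npm install -g @anthropic-ai/claude-code   # or: npm install -g @openai/codex
gh auth login
```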
### VM (sandbox)

| Tool | Required | Notes |
|------|----------|-------|
| SSH access | Yes | You need to be able to SSH into the VM |
| Git | Yes | Usually pre-installed |
| Claude Code | Yes (choose one) | `npm install -g @anthropic-ai/claude-code` |
| Claude auth | Yes (if using Claude) | See Authentication below |
| Codex CLI | Yes (choose one) | `npm install -g @openai/codex` |
| Codex auth | Yes (if using Codex) | Run `codex login` OR set `OPENAI_API_KEY` |
| GitHub CLI | Yes | `brew install gh` then `gh auth login` |
**Important:** Both machines need `gh auth login` for Git operations to work!
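A quick way to confirm both machines are authenticated (a standard GitHub CLI command, not Ralph-specific):

```bash
# Run on both your local machine and the VM
gh auth status
```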
### Optional

- Codex web search - Run Codex with `--search` (or enable it in Codex config) for `/prompts:ralph-discover`
- Claude Chrome Extension - Lets Claude browse websites during `/ralph:discover`
- `dev-browser` skill - If browsing isn't available, use the skill as a fallback
- Cloud CLI (`hcloud`, `gcloud`, `doctl`, `aws`) - For VM management
- ntfy.sh - Push notifications when Ralph finishes
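If you enable ntfy.sh notifications, you can test your topic from anywhere using ntfy's plain HTTP publish endpoint; the topic name below is the example from the config section:

```bash
# Publish a test message to your ntfy topic (topic name is just an example)
curl -d "Ralph finished building" ntfy.sh/my-unique-ralph-topic
```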
## Setup

### 1. Install Ralph

```bash
cd your-project
npx ralph-inferno install
```

This creates:

- `.ralph/` - Scripts and config
- `.claude/commands/` - Claude Code commands
- `~/.codex/prompts/` - Codex prompt files
### 2. Set up the VM

SSH into your VM and install the prerequisites:

```bash
# Install Node.js (if not installed)
curl -fsSL https://deb.nodesource.com/setup_20.x | sudo -E bash -
sudo apt-get install -y nodejs
```
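Then install the agent CLI and authenticate the GitHub CLI on the VM (same commands as the VM prerequisites table; how you install `gh` itself depends on your distro):

```bash
# Install the agent CLI you chose (commands from the prerequisites table)
npm install -g @anthropic-ai/claude-code   # or: npm install -g @openai/codex

# Authenticate GitHub CLI once gh is installed for your distro
gh auth login
```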
### 3. Run Ralph
On your local machine, start your CLI:
**Claude Code:**

```bash
claude
```
Then run:
- /ralph:discover
- /ralph:plan
- /ralph:deploy

**Codex CLI:**
```bash
codex
```
Then run:
- /prompts:ralph-discover
- /prompts:ralph-plan
- /prompts:ralph-deploy

## Authentication
Claude Code supports multiple authentication methods. Set these on your VM (not on your local machine):
### Subscription (Pro/Max)
```bash
claude login
```

### API key
```bash
export ANTHROPIC_API_KEY="sk-ant-..."
```

### AWS Bedrock
```bash
export CLAUDE_CODE_USE_BEDROCK=1
export ANTHROPIC_MODEL="us.anthropic.claude-sonnet-4-20250514-v1:0"
export AWS_REGION="us-east-1"
export AWS_ACCESS_KEY_ID="..."
export AWS_SECRET_ACCESS_KEY="..."
```

### Azure AI Foundry
```bash
export CLAUDE_CODE_USE_FOUNDRY=1
export ANTHROPIC_FOUNDRY_BASE_URL="https://your-resource.services.ai.azure.com/api/v1"
export ANTHROPIC_FOUNDRY_API_KEY="..."
export ANTHROPIC_FOUNDRY_RESOURCE="your-resource-name"
```

> Tip: Add these to `~/.bashrc` on your VM so they persist across sessions.
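For example, to persist the API-key variant across SSH sessions (the key value is a placeholder):

```bash
# Append to ~/.bashrc so the variable survives reconnects
echo 'export ANTHROPIC_API_KEY="sk-ant-..."' >> ~/.bashrc
source ~/.bashrc
```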
## Update

Update core files while preserving your config:
```bash
npx ralph-inferno update
```

Or use the command/prompt:

```
/ralph:update # Claude Code
/prompts:ralph-update # Codex CLI
```

## Workflow
Ralph supports two entry points: Greenfield (new apps) and Brownfield (existing apps).
```
┌──────────────────────────────────────────────────────────────────────────┐
│                              TWO ENTRY POINTS                             │
└──────────────────────────────────────────────────────────────────────────┘

 GREENFIELD (new app)                        BROWNFIELD (existing app)
         │                                              │
         ▼                                              ▼
  /ralph:idea ─────────────────────────────► /ralph:change-request
  (BMAD Brainstorm)                          (Analyze scope: S/M/L)
         │                                              │
         ▼                                              │
  PROJECT-BRIEF.md                                      │
         │                                              │
         ▼                                              │
  /ralph:discover                                       │
  (BMAD Analyst)                                        │
         │                                              │
         ▼                                              ▼
      PRD.md ──────────────────────────────► CHANGE-REQUEST.md
                           │
                           ▼
                      /ralph:plan
                  (auto-detects input)
                           │
                           ▼
           /ralph:deploy → VM → /ralph:review
```

Codex CLI equivalents use `/prompts:ralph-*` (see table below).
`$3
| Claude Code | Codex CLI | Description |
|-------------|-----------|-------------|
| /ralph:idea | /prompts:ralph-idea | Greenfield start - BMAD brainstorm → PROJECT-BRIEF.md |
| /ralph:discover | /prompts:ralph-discover | BMAD analyst mode → PRD.md |
| /ralph:change-request | /prompts:ralph-change-request | Brownfield start - Analyze changes → CR specs |
| /ralph:plan | /prompts:ralph-plan | Creates specs from PRD or Change Request |
| /ralph:preflight | /prompts:ralph-preflight | Verify requirements before deployment |
| /ralph:deploy | /prompts:ralph-deploy | Push to GitHub, choose mode, start Ralph on VM |
| /ralph:review | /prompts:ralph-review | Open SSH tunnels, test the app |
| /ralph:status | /prompts:ralph-status | Check Ralph's progress on VM |
| /ralph:abort | /prompts:ralph-abort | Stop Ralph on VM |
| /ralph:update | /prompts:ralph-update | Update Ralph to latest version |
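`/ralph:review` opens these tunnels for you; if you want to poke at the app by hand instead, a plain SSH port forward does the same job (port and host below are placeholders):

```bash
# Forward the app's port from the VM to your machine (values are placeholders)
ssh -L 3000:localhost:3000 user@your-vm-ip
# then open http://localhost:3000 locally
```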
### Modes

When running deploy (`/ralph:deploy` or `/prompts:ralph-deploy`), you choose a mode:

| Mode | What it does |
|------|--------------|
| Quick | Spec execution + build verify only |
| Standard | + Playwright E2E tests + auto-CR generation |
| Inferno | + Design review + parallel worktrees |
### Web browsing for discovery

Discovery mode works best when the agent can browse the web.

- Codex CLI: use web search (`--search` or enable it in config)
- Claude Code: use the Claude Chrome Extension
- Otherwise: use the dev-browser skill as a fallback
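A minimal example of the Codex option, assuming the `--search` flag mentioned above:

```bash
# Start Codex with web search enabled, then run discovery inside the session
codex --search
# > /prompts:ralph-discover
```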
### Greenfield quick start

```bash
# 1. Install Ralph
npx ralph-inferno install

# 2. Brainstorm & Discover
/ralph:idea "todo app"   # BMAD brainstorm → PROJECT-BRIEF.md
/ralph:discover          # BMAD analyst → PRD.md

# 3. Plan & Deploy
/ralph:plan              # Generate specs
/ralph:deploy            # Send to VM

# 4. Review
/ralph:review            # Test what Ralph built

# Codex CLI equivalents
/prompts:ralph-idea "todo app"
/prompts:ralph-discover
/prompts:ralph-plan
/prompts:ralph-deploy
/prompts:ralph-review
```
### Brownfield quick start

```bash
# 1. Describe changes
/ralph:change-request "add dark mode"   # Analyze scope → CR specs

# 2. Plan & Deploy
/ralph:plan     # Auto-detects Change Request
/ralph:deploy   # Send to VM

# 3. Review
/ralph:review   # Test the changes

# Codex CLI equivalents
/prompts:ralph-change-request "add dark mode"
/prompts:ralph-plan
/prompts:ralph-deploy
/prompts:ralph-review
```

## Language Agnostic
Ralph auto-detects your project type and uses the appropriate build/test commands:

| Project Type | Build Command | Test Command |
|--------------|---------------|--------------|
| Node.js (package.json) | npm run build | npm test |
| Rust (Cargo.toml) | cargo build | cargo test |
| Go (go.mod) | go build ./... | go test ./... |
| Python (pyproject.toml) | python -m build | pytest |
| Makefile | make build | make test |

Custom commands: Override in `.ralph/config.json`:

```json
{
"build_cmd": "yarn build",
"test_cmd": "yarn test:ci"
}
```

Set `agent` to `claude` or `codex` (or `auto` to use whatever is installed on the VM).
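For example, a Rust project that should run Codex on the VM might combine these overrides (values are illustrative; keys come from the config reference below):

```json
{
  "agent": "codex",
  "build_cmd": "cargo build",
  "test_cmd": "cargo test"
}
```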
## Safety

Ralph runs AI-generated code autonomously. For safety:
- ALWAYS run on a disposable VM - never on your local machine
- Review generated code before production
- Never store credentials in code
## Cloud Providers
Ralph supports multiple cloud providers for VM execution:

| Provider | CLI | Notes |
|----------|-----|-------|
| Hetzner | hcloud | Cheapest, great for Europe |
| Google Cloud | gcloud | Good free tier |
| DigitalOcean | doctl | Simple and reliable |
| AWS | aws | Enterprise option |
| SSH | - | Use your own server |
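As a sketch, creating a disposable Hetzner sandbox might look like this (server type, image, and SSH key name are placeholders; check current `hcloud` options before copying):

```bash
# Create a throwaway VM for Ralph (name, type, and image are placeholders)
hcloud server create \
  --name ralph-sandbox \
  --type cx22 \
  --image ubuntu-24.04 \
  --ssh-key my-key \
  --location fsn1
```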
## Config File

Configuration is stored in `.ralph/config.json`:

```json
{
"version": "1.0.9",
"language": "en",
"provider": "hcloud",
"vm_name": "ralph-sandbox",
"region": "fsn1",
"github": {
"username": "your-username"
},
"agent": "claude",
"claude": {
"auth_method": "subscription"
},
"codex": {
"auth_method": "account"
},
"notifications": {
"ntfy_enabled": true,
"ntfy_topic": "my-unique-ralph-topic"
},
"build_cmd": "npm run build",
"test_cmd": "npm test"
}
```

Auth method options:

| Value | Description |
|-------|-------------|
| subscription | Uses claude login (Pro/Max subscription) |
| api_key | Uses ANTHROPIC_API_KEY env var |
| bedrock | Uses AWS Bedrock env vars |
| foundry | Uses Azure AI Foundry env vars |

## Documentation
- Architecture - System overview and memory model
- CLI Flags - All ralph.sh options
- Token Optimization - Cost-saving strategies
## Credits & Inspiration
Ralph Inferno builds on ideas from:
- snarktank/ralph - Ryan Carson's original Ralph concept
- BMAD Method - The discovery workflow (/ralph:idea and /ralph:discover) is heavily inspired by BMAD's Brainstorm and Analyst personas

## License

MIT