# CLI for code completion

A tiny Node.js CLI that sends your code context and a prompt to InceptionLabs' Mercury model for code completion.

Install:

```bash
npm install @ishaanranjan/code-complete
```
> ⚠️ Security: Never commit your API key to a public repo. Prefer .env and environment variables.
---
```bash
# 1) Go to the folder
cd mercury-code-completion
```

## Usage
```bash
# Print completion to stdout
mercury-complete -f src/app.ts

# Provide a guiding prompt
mercury-complete -f src/app.ts -p "Continue the code. Add input validation and JSDoc comments."

# Append completion to the file
mercury-complete -f src/app.ts --append

# Write completion to a new file
mercury-complete -f src/app.ts --out src/app.completed.ts
```

## Options
- `-f, --file`: (required) Source file to provide as context.
- `-p, --prompt`: Guidance for the model (default: "Continue the code from the end. Provide only code.").
- `--append`: Append the generated code to the original file.
- `-o, --out`: Write completion to this file.
- `--max-tokens`: Upper bound on tokens to generate (default: 512).
- `--temperature`: Sampling temperature (default: 0.2).

---
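To make the `--append` and `--out` behavior concrete, here is a minimal sketch of what the output handling plausibly looks like. The helper name `writeCompletion` and its exact behavior are illustrative assumptions, not the package's actual code:

```typescript
import * as fs from "node:fs";

// Hypothetical output-handling helper (illustrative only; the real CLI
// may differ). Chooses between stdout, --append, and --out.
function writeCompletion(
  completion: string,
  sourceFile: string,
  opts: { append?: boolean; out?: string },
): void {
  if (opts.append) {
    // --append: add the generated code to the end of the original file
    fs.appendFileSync(sourceFile, "\n" + completion);
  } else if (opts.out) {
    // --out: write the completion to a separate file
    fs.writeFileSync(opts.out, completion);
  } else {
    // default: print to stdout
    process.stdout.write(completion);
  }
}
```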
## How it works

The CLI sends a `POST` request to `https://api.inceptionlabs.ai/v1/chat/completions` using the Mercury model. Your file's content is sent as context along with your prompt, and the model is instructed to return code only (no backticks, no prose).

---
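A rough TypeScript sketch of that request is below. The model id `"mercury"`, the system-message wording, and the response shape are assumptions based on the OpenAI-compatible endpoint, not confirmed details of this CLI:

```typescript
// Sketch only: the model id and message layout are assumptions.
function buildRequestBody(
  fileContent: string,
  prompt: string,
  maxTokens = 512,
  temperature = 0.2,
) {
  return {
    model: "mercury", // assumed model id
    messages: [
      { role: "system", content: "Return only code. No backticks, no prose." },
      { role: "user", content: `${fileContent}\n\n${prompt}` },
    ],
    max_tokens: maxTokens,
    temperature,
  };
}

async function complete(fileContent: string, prompt: string): Promise<string> {
  const res = await fetch("https://api.inceptionlabs.ai/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.INCEPTION_API_KEY}`,
    },
    body: JSON.stringify(buildRequestBody(fileContent, prompt)),
  });
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  const data = (await res.json()) as {
    choices: { message: { content: string } }[];
  };
  return data.choices[0].message.content;
}
```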
## Example

```bash
mercury-complete -f example/input.ts -p "Add a robust retry helper with exponential backoff."
```

---
## Troubleshooting

- 401 / 403 – Make sure `INCEPTION_API_KEY` is set.
- 429 – You may be rate-limited; retry after a pause.
- 500 – Transient server error; try again.
- Weird output – Tighten your prompt (e.g., "Return only code for TypeScript; no comments or backticks.").
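For the 429/500 cases, wrapping the request in an exponential-backoff retry is a simple fix. This is an illustrative helper, not something the CLI ships:

```typescript
// Illustrative retry helper with exponential backoff (not part of the CLI).
// Retry count and base delay are example values.
async function withRetry<T>(
  fn: () => Promise<T>,
  retries = 3,
  baseDelayMs = 500,
): Promise<T> {
  let lastErr: unknown;
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastErr = err;
      if (attempt === retries) break;
      // Wait 500ms, 1s, 2s, ... between attempts
      await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** attempt));
    }
  }
  throw lastErr;
}
```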
## Additional flags

- `--keep-fences`: Keep triple-backtick code fences if the model returns them (default is to strip them).
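Only the flag itself is documented, so the default stripping step is an assumption; it can be sketched as:

```typescript
// Assumed default behavior: remove one leading fence line (with an
// optional language tag) and one trailing fence line, if present.
function stripFences(output: string): string {
  const trimmed = output.trim();
  const match = trimmed.match(/^`{3}[^\n]*\n([\s\S]*?)\n?`{3}$/);
  return match ? match[1] : trimmed;
}
```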