n8n Community Node for Codex CLI as LangChain Chat Model
Community node package for n8n that exposes the OpenAI Codex CLI as a LangChain Chat Model. It runs `codex exec --experimental-json` using the bundled `@openai/codex-sdk` binary or a custom `codex` binary.
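For orientation, the sketch below shows roughly what such an invocation looks like when run by hand. It is illustrative only: the node assembles the real argument list from your credentials and options, and the `.type` field used in the jq filter is an assumption about the JSONL event shape rather than documented output.

```bash
# Rough manual equivalent of what the node spawns (illustrative only; requires jq).
# Each output line is a JSON event; the filter keeps the item.completed events the
# node parses for the final response (the "type" field name is an assumption).
codex exec --experimental-json "Summarize the open TODOs in this repository" \
  | jq -c 'select(.type? == "item.completed")'
```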
- Codex Chat Model (SDK): LangChain chat model for the AI Agent node
- Codex Auth (Device Login): device login helper that writes CODEX_HOME/auth.json on the n8n host
- Codex CLI integration: spawns codex exec --experimental-json and parses JSONL events
- Auth: device login via CLI or UI-based device flow
- Model selection: static list in the node (Codex + GPT-5.x entries) or default model
- Security controls: sandbox mode, approval policy, optional web search + network access
- Advanced config: --config key=value overrides and --enable/--disable feature flags
- Multimodal: attach local image files or data: URL images from LangChain message content
- Structured output: optional JSON Schema via --output-schema
- Streaming: optional response streaming from JSONL item.updated / item.completed
- Context controls: cap message count or prompt size in stateless mode
- Self-hosted n8n only (uses child_process and filesystem access)
- n8n >= 1.0.0
- Node.js >= 18
- Codex CLI available via:
  - Bundled binary from @openai/codex-sdk, or
  - Custom Path to a codex binary on the n8n host
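To confirm these requirements on the n8n host (the `codex` check only applies if you plan to use a Custom Path binary that is already on the PATH):

```bash
# n8n needs Node.js >= 18
node --version

# Only for the Custom Path route: confirm the codex binary is reachable
codex --version
```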
```bash
npm install @chrishdx/n8n-nodes-codex-cli-lm
```

```bash
git clone https://github.com/chrishdx/n8n-nodes-codex-cli-lm.git
cd n8n-nodes-codex-cli-lm
npm install
npm run build
npm link
```
Then in your n8n installation directory:
```bash
npm link @chrishdx/n8n-nodes-codex-cli-lm
n8n start
```
Create Codex (SDK) API credentials in n8n:
- Codex Binary: Bundled (via `@openai/codex-sdk`) or Custom Path
- Codex Path: path to the `codex` binary if using Custom Path
- Base URL (OPENAI_BASE_URL): optional proxy or enterprise gateway
- Codex Home (CODEX_HOME): optional; where Codex stores sessions/config (default: `~/.codex`)
The credential test pings the OpenAI auth discovery endpoint to verify connectivity.
You must complete Codex login on the same host/container where n8n runs so CODEX_HOME/auth.json is available.
Run on the n8n host:
```bash
codex login --device-auth
```
Then open the provided URL in your browser and enter the device code. Make sure CODEX_HOME in n8n points to the same directory used by the CLI.
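A minimal sketch, assuming the default `~/.codex` location: whichever directory you log in with here is the one to enter in the credential's Codex Home field.

```bash
# Log in against an explicit Codex home, then set the same path in the
# "Codex Home (CODEX_HOME)" credential field in n8n (path is an example).
export CODEX_HOME="$HOME/.codex"
codex login --device-auth
```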
If you use the Bundled binary, codex might not be on your PATH. You can either:
- Use Custom Path and install `codex` globally, or
- Run the bundled binary directly from `node_modules/@openai/codex-sdk/vendor/.../codex/codex` (see the lookup sketch below)
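If you need the bundled binary's full path (the vendor subdirectory is elided above), a hypothetical lookup run from the directory where the package is installed could look like this; the exact layout inside the SDK package may differ between versions:

```bash
# Search the installed SDK package for the vendored codex executable
find "$(npm root)/@openai/codex-sdk" -type f -name codex 2>/dev/null
```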
Use the Codex Auth (Device Login) node:
1. Start Device Login: run the node and copy `verificationUrl` and `userCode`
2. Open the URL in your browser, sign in, and enter the code
3. Complete Device Login: provide the `deviceAuthId` + `userCode` from step 1
4. (Optional) Login Status to check the current session
5. (Optional) Logout to remove `auth.json`
This writes tokens to CODEX_HOME/auth.json on the n8n host.
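To confirm the token file landed where n8n expects it (the fallback path assumes the default Codex home):

```bash
# Should list the auth.json written by device login
ls -l "${CODEX_HOME:-$HOME/.codex}/auth.json"
```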
1. Add an AI Agent node
2. Select Codex Chat Model (SDK) as the chat model
3. Choose your Codex (SDK) API credentials
4. Configure Model and Options as needed
- Working Directory and Additional Directories for filesystem access
- Sandbox Mode and Approval Policy for command execution safety
- Network Access Enabled and Web Search Enabled for external access
- Use Open Source Provider and Local Provider for --oss flows
- Output Schema (JSON) for structured results (see the example schema after this list)
- Stream Response for token streaming
- Max Messages and Max Prompt Characters for context control
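As an example of the structured-results option referenced above, here is a hedged sketch of a JSON Schema you might paste into Output Schema (JSON); the field names are invented for illustration, and the heredoc only writes it to a file so the shape is easy to read:

```bash
# Hypothetical schema for structured results; paste the JSON itself into the
# "Output Schema (JSON)" option (the file is just for illustration).
cat > reply-schema.json <<'EOF'
{
  "type": "object",
  "properties": {
    "answer": { "type": "string" },
    "follow_up_questions": { "type": "array", "items": { "type": "string" } }
  },
  "required": ["answer"],
  "additionalProperties": false
}
EOF
```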
A simple flow that turns Telegram messages into Codex responses:
1. Telegram Trigger (incoming message)
2. AI Agent using Codex Chat Model (SDK)
3. Telegram node (Send Message)
Typical mappings:
- AI Agent input: use Telegram message.text (for example, {{$json.message.text}})
- Telegram response: map the AI Agent output text (inspect the AI Agent output in n8n and map its response field)
- This package runs a local binary and uses filesystem access. It is intended for self-hosted n8n.
- Tool calling is supported via a prompt-based JSON adapter. It is best-effort and depends on the model output.
```
n8n-nodes-codex-cli-lm/
├── credentials/
│   └── CodexCliApi.credentials.ts
├── nodes/
│   ├── CodexAuth/
│   └── LmChatCodexCli/
├── package.json
└── tsconfig.json
```
- `npm run dev` - Start n8n with hot reload for development
- `npm run build` - Build the TypeScript code
- `npm run lint` - Check code for errors
- `npm run lint:fix` - Auto-fix linting issues
- Verify install: npm list @chrishdx/n8n-nodes-codex-cli-lm
- Restart n8n
- Check n8n logs for load errors
- Verify the Codex binary source (Bundled vs Custom Path)
- Check the binary manually: `codex --version`
- Ensure `CODEX_HOME` contains `auth.json` from device login
- Confirm the login happened on the same host/container as n8n
- Use the Codex Auth (Device Login) node to re-run device login
MIT
- GitHub Issues: https://github.com/chrishdx/n8n-nodes-codex-cli-lm/issues
- n8n Community Forum: https://community.n8n.io
- Built on the n8n community nodes starter
- Powered by LangChain
- Uses the official `@openai/codex-sdk`