An n8n community node to interact with (custom) LLM models and knowledge bases on an Open WebUI instance.
This is an n8n community node that lets you interact with Open WebUI models in your n8n workflows.
Open WebUI is an extensible web interface for running and managing local or remote AI models. It supports chat-style interactions, file and knowledge base integrations, and an OpenAI-compatible API.
n8n is a fair-code licensed workflow automation platform.
- Installation
- Operations
- Credentials
- Compatibility
- Usage
- Resources
- Version history
## Installation
Follow the installation guide in the n8n community nodes documentation. In short: search for `n8n-nodes-llm-openwebui` under Settings → Community Nodes in your n8n instance and install it from there.
## Operations
The Open WebUI node currently supports:
- Chat with Model: send a user message to a chosen Open WebUI model and receive its response.
- Attach File (optional): upload a binary file from your workflow and include it in the conversation.
- Add File to Target Knowledge Base (optional): upload a binary file to a knowledge collection.
- Use Conversation Knowledge Bases (optional): add one or more knowledge base collections to the query.
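Under the hood, a chat request corresponds roughly to a call against Open WebUI's OpenAI-compatible chat endpoint. A minimal sketch of the request body (the `files` entries referencing knowledge collections are Open WebUI's extension to the OpenAI schema; the exact field names here are assumptions, not taken from this node's source):

```python
import json

def build_chat_payload(model, user_message, knowledge_ids=None, stream=False):
    """Sketch of the JSON body for a "Chat with Model" request.

    The "files" list with {"type": "collection", ...} entries is how
    Open WebUI attaches knowledge bases to a chat (assumed field names).
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "stream": stream,
    }
    if knowledge_ids:
        payload["files"] = [
            {"type": "collection", "id": kid} for kid in knowledge_ids
        ]
    return payload

body = build_chat_payload("my-model", "Hello!", knowledge_ids=["kb-123"])
print(json.dumps(body, indent=2))
```

The optional operations above map onto this shape: attaching a file or selecting Conversation Knowledge Bases simply adds entries alongside the `messages` array.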
## Credentials
You need to create Open WebUI API credentials in n8n, consisting of the Base URL of your instance (e.g. http://localhost:3000) and an API key. These credentials are used for all requests (chat, file upload, and knowledge base management).
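Every request authenticates with a Bearer token against the Base URL. As a quick sketch of what an authenticated request looks like (the `/api/models` path comes from Open WebUI's API and is an assumption about what this node calls; the key value is a placeholder):

```python
import urllib.request

BASE_URL = "http://localhost:3000"  # your Open WebUI instance
API_KEY = "sk-..."                  # placeholder API key

# Build (but do not send) an authenticated request, e.g. to list models.
req = urllib.request.Request(
    f"{BASE_URL}/api/models",
    headers={"Authorization": f"Bearer {API_KEY}"},
)
print(req.get_full_url())
print(req.get_header("Authorization"))
```

If such a request returns an authentication error, re-check the API key and Base URL in your n8n credentials before debugging the node itself.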
## Compatibility
- Requires n8n v1.20.0 or higher (tested).
- Requires Node.js 22.0.0 or higher to build the package.
- Tested against Open WebUI releases >=0.6, including >=0.6.22, with the OpenAI-compatible API endpoints enabled.
## Usage
1. Add the Open WebUI Chat node to your workflow.
2. Configure your credentials with the Base URL and API key.
3. Select a Model (the node dynamically loads available models the API key has access to from your Open WebUI).
4. Enter your User Message.
5. (Optional) Provide a Binary Property if you want to attach a file.
6. (Optional) Select an Upload Target Knowledge Base to upload the file into.
7. (Optional) Select Conversation Knowledge Bases to include.
8. Set the Output format to Streaming or Final.
9. Run the workflow. The model's reply is available in the node output under `json.content` (short form) or `json.choices[0].message.content` (full response).
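Downstream nodes can read the reply from either output shape. Conceptually the extraction looks like this (the sample items are a sketch of the two shapes, not captured node output):

```python
def extract_reply(item: dict) -> str:
    """Return the assistant text from either the short or the full output shape."""
    if "content" in item:
        # Short form: the text sits directly on the item.
        return item["content"]
    # Full OpenAI-style response: text is nested under choices[0].message.
    return item["choices"][0]["message"]["content"]

short = {"content": "Hi there"}
full = {"choices": [{"message": {"role": "assistant", "content": "Hi there"}}]}
print(extract_reply(short), extract_reply(full))
```

In an n8n expression this corresponds to `{{ $json.content }}` or `{{ $json.choices[0].message.content }}`.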
## Resources
- n8n community nodes documentation
- Open WebUI GitHub repository
- Open WebUI API reference (OpenAI-compatible)
## Version history
1.2.0
- Fixed: Knowledge base endpoint.
1.1.9
- Refactored code.
- Added timeout to node settings.
1.1.8
- Fixed: Increased streaming timeout to 600 seconds, preventing SSE timeout errors on slow-responding model calls.
1.1.7
- Fixed: Node now supports compressed streaming chunks behind an ALB.
1.1.6
- Fixed: Streaming mode could result in empty content.
1.1.5
- Added support for streaming API calls and chunked output.
- Added support for System Prompt (optional).
- Added support for Assistant Message (optional).
- Fixed: Node now supports error output path correctly.
1.1.2
- Fixed: Node now supports model names containing spaces.
1.0.5
- Added support for attaching multiple knowledge bases to a single chat conversation (Conversation Knowledge Bases).
- Added support for uploading an optionally provided binary file to a knowledge base (Upload Target Knowledge Base).
1.0.2
- Fixed: Node output could include invalid JSON when responses were streamed.
1.0.0
- Initial release with model selection, message sending, and optional file upload with Knowledge Base integration (list collections, add files to collections, reference them in chat).