# MCP Server for HuggingFace inference endpoints with custom LoRA and story generation
```bash
npm install huggingface-mcp-server
```

A TypeScript-based MCP (Model Context Protocol) server that integrates with HuggingFace's inference endpoints to provide:
1. Image generation with custom LoRA support
2. Story generation
This server implements the OpenAI API protocol for tools, making it compatible with tool-enabled LLM clients.
- Image Generation Tool: Generate images using the Flux model (Stable Diffusion XL) with optional custom LoRA support
- Story Generation Tool: Generate stories using LLMs from HuggingFace
- MCP Protocol Compatible: Implements the OpenAI-compatible tool protocol
- Multiple Transport Support: Use HTTP or stdio transport for maximum compatibility
- CLI Support: Run as a command-line tool with npx
- Claude & Cursor Integration: Ready-to-use MCP configuration files
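For orientation, the arguments the two tools accept look roughly like the following; this is a sketch inferred from the example requests later in this README, not the server's authoritative schema:

```typescript
// Rough argument shapes for the two tools, inferred from the example
// payloads in the "Example Usage" section below. The real schemas may differ.
interface GenerateImageArgs {
  prompt: string;      // text description of the image to generate
  lora_name?: string;  // optional custom LoRA, e.g. "username/space-cats-lora"
}

interface GenerateStoryArgs {
  prompt: string;      // premise or topic for the story
}
```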
## Publishing to npm

Before others can use this package with npx, you need to publish it to npm:
```bash
# Create an npm account if you don't have one
npm adduser

# Publish the package (the prepublishOnly script builds it first)
npm publish
```
When updating the package with new features:
1. Update the version in `package.json`, `claude-mcp.json`, and `src/cli.ts`
2. Build the project with `npm run build`
3. Publish the new version with `npm publish`

The `prepublishOnly` script will ensure the project is built before publishing.
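As a purely illustrative aid for step 1 (it is not part of this package), a small script could check that the version strings in those three files stay in sync before publishing:

```typescript
// Hypothetical pre-publish check: confirm that the version in package.json
// also appears in claude-mcp.json and src/cli.ts. Not shipped with the package.
import { readFileSync } from "node:fs";

const version: string = JSON.parse(readFileSync("package.json", "utf8")).version;

for (const file of ["claude-mcp.json", "src/cli.ts"]) {
  if (!readFileSync(file, "utf8").includes(version)) {
    console.error(`${file} does not mention version ${version}`);
    process.exitCode = 1;
  }
}
```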
## Installation

Once published, you can install this package globally:
```bash
npm install -g huggingface-mcp-server
```

Then run it using:
```bash
hf-mcp-server --api-key YOUR_HUGGINGFACE_API_KEY
```

## Using npx
Run it directly with npx without installing:
```bash
npx huggingface-mcp-server --api-key YOUR_HUGGINGFACE_API_KEY
```

## Local Development
1. Clone this repository
2. Install dependencies:
```bash
npm install
```
3. Copy the environment file and configure it:

```bash
cp .env.example .env
```
Update the `.env` file with your HuggingFace API key.

4. Build and run the server:
```bash
npm run build
npm start
```
Or run in development mode:
```bash
npm run dev
```

## Claude Desktop, Cursor, and Cline Integration
This project includes MCP configuration files for easy integration with various AI assistants:
### Claude Desktop
Use the `claude-mcp.json` file in the Claude Desktop MCP configuration settings.

### Cursor
Use the `cursor-mcp.json` file in the Cursor MCP settings.

### Cline
Add the contents of `cline-mcp.json` to your Cline configuration:

```json
{
  "huggingface-mcp": {
    "command": "npx",
    "args": [
      "--yes",
      "huggingface-mcp-server@latest",
      "--api-key=YOUR_HUGGINGFACE_API_KEY_HERE",
      "--port=3000"
    ],
    "disabled": false,
    "timeout": 60
  }
}
```

Make sure to replace `YOUR_HUGGINGFACE_API_KEY_HERE` with your actual API key. All configurations will start the server and require your HuggingFace API key.
## CLI Options
```
Options:
  -p, --port       Port to run the HTTP server on (default: "3000")
  -k, --api-key    HuggingFace API key
  -e, --env        Path to .env file
  -t, --transport  Transport type (http or stdio) (default: "http")
  -h, --help       display help for command
```

Example using HTTP transport:
```bash
npx huggingface-mcp-server --port 4000 --api-key YOUR_API_KEY
```

Example using stdio transport:
```bash
npx huggingface-mcp-server --transport stdio --api-key YOUR_API_KEY
```

You can also set the transport via environment variables:
```bash
TRANSPORT=stdio npx huggingface-mcp-server --api-key YOUR_API_KEY
```

## Communication Protocols
### HTTP Transport
When running in HTTP mode, the following endpoints are available:
#### GET /
Returns a health check message indicating the server is running.
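For example, a quick health check from TypeScript (assuming the default port 3000 and Node 18+, which provides a global fetch) might look like:

```typescript
// Minimal health check against a locally running server on the default port.
fetch("http://localhost:3000/")
  .then(async (res) => console.log(res.status, await res.text()))
  .catch(console.error);
```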
#### POST /v1/tools
Returns the list of available tools:
- generate_image: Generate an image with optional custom LoRA
- generate_story: Generate a story based on a prompt
#### POST /v1/chat/completions

Main endpoint that handles the MCP protocol for tool usage.
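A rough TypeScript client sketch for the HTTP transport is shown below. It assumes the server runs locally on the default port 3000 and Node 18+ for the global fetch; the request body follows the chat format shown in the examples later in this README, while the response handling is only illustrative since the response shape is not documented here:

```typescript
// Illustrative HTTP client: list the available tools, then send a chat request
// that should lead the server to call generate_image. Port 3000 is the default.
const BASE_URL = "http://localhost:3000";

async function main(): Promise<void> {
  // Discover the available tools (expects generate_image and generate_story).
  const toolsRes = await fetch(`${BASE_URL}/v1/tools`, { method: "POST" });
  console.log("tools:", JSON.stringify(await toolsRes.json(), null, 2));

  // Ask for an image; the server handles the tool call via the MCP protocol.
  const chatRes = await fetch(`${BASE_URL}/v1/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      messages: [
        { role: "user", content: "I want to generate an image of a cat in space" },
      ],
    }),
  });
  console.log("chat:", JSON.stringify(await chatRes.json(), null, 2));
}

main().catch(console.error);
```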
### Stdio Transport
When running in stdio mode, the server communicates using JSON messages through standard input/output:
#### Request Tools
```json
{"type": "tools"}
```

#### Chat Request
```json
{
  "type": "chat",
  "data": {
    "messages": [
      {
        "role": "user",
        "content": "Generate an image of a cat"
      }
    ]
  }
}
```

#### Exit Request
```json
{"type": "exit"}
```
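A minimal sketch of driving the stdio transport from TypeScript follows. It spawns the server via npx exactly as shown above and writes the documented JSON messages to stdin; the assumption that responses arrive as newline-delimited JSON on stdout is mine and may need adjusting:

```typescript
// Illustrative stdio client: request the tool list, send one chat message,
// then ask the server to exit. Message shapes follow the protocol above.
import { spawn } from "node:child_process";
import { createInterface } from "node:readline";

const server = spawn("npx", [
  "--yes",
  "huggingface-mcp-server@latest",
  "--transport",
  "stdio",
  "--api-key",
  "YOUR_HUGGINGFACE_API_KEY",
]);

server.stderr.pipe(process.stderr);
createInterface({ input: server.stdout }).on("line", (line) => console.log("server:", line));

const send = (msg: unknown) => server.stdin.write(JSON.stringify(msg) + "\n");

send({ type: "tools" });
send({
  type: "chat",
  data: { messages: [{ role: "user", content: "Generate an image of a cat" }] },
});

// Give the server time to respond before shutting it down.
setTimeout(() => send({ type: "exit" }), 30_000);
```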
## Example Usage

Example request to generate an image:
```json
{
  "messages": [
    {
      "role": "user",
      "content": "I want to generate an image of a cat in space"
    },
    {
      "role": "assistant",
      "tool_calls": [
        {
          "id": "call_123",
          "type": "function",
          "function": {
            "name": "generate_image",
            "arguments": "{\"prompt\": \"A cat in space with a space helmet, stars in background\", \"lora_name\": \"username/space-cats-lora\"}"
          }
        }
      ]
    }
  ]
}
```
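Note that function.arguments is a JSON-encoded string rather than an object, so a client has to parse it before use. A small sketch using the payload above (the ToolCall type is a guess based on that payload):

```typescript
// The "arguments" field of a tool call is a JSON string, not an object.
// ToolCall mirrors the example payload above; the server may use a richer type.
interface ToolCall {
  id: string;
  type: "function";
  function: { name: string; arguments: string };
}

function parseToolCall(call: ToolCall): { name: string; args: Record<string, unknown> } {
  return { name: call.function.name, args: JSON.parse(call.function.arguments) };
}

const example: ToolCall = {
  id: "call_123",
  type: "function",
  function: {
    name: "generate_image",
    arguments:
      "{\"prompt\": \"A cat in space with a space helmet, stars in background\", \"lora_name\": \"username/space-cats-lora\"}",
  },
};

console.log(parseToolCall(example));
// -> { name: "generate_image", args: { prompt: "...", lora_name: "username/space-cats-lora" } }
```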
Example request to generate a story:

```json
{
  "messages": [
    {
      "role": "user",
      "content": "Write me a story about a space explorer"
    },
    {
      "role": "assistant",
      "tool_calls": [
        {
          "id": "call_456",
          "type": "function",
          "function": {
            "name": "generate_story",
            "arguments": "{\"prompt\": \"A space explorer discovers an ancient alien civilization\"}"
          }
        }
      ]
    }
  ]
}
```

## License

MIT