# Microsoft Fabric MCP Server

A Model Context Protocol implementation for Microsoft Fabric.
A local-first Model Context Protocol (MCP) server that provides AI agents with comprehensive access to Microsoft Fabric's public APIs, item definitions, and best practices. The Fabric MCP Server packages complete OpenAPI specifications into a single context layer for AI-assisted development—without connecting to live Fabric environments.
Microsoft Fabric MCP Server gives your AI agents the knowledge they need to generate robust, production-ready code for Microsoft Fabric—all without directly accessing your environment.
Key capabilities:
- Complete API Context: Full OpenAPI specifications for all supported Fabric workloads
- Item Definition Knowledge: JSON schemas for every Fabric item type (Lakehouses, pipelines, semantic models, notebooks, etc.)
- Built-in Best Practices: Embedded guidance on pagination, error handling, and recommended patterns
- Local-First Security: Runs entirely on your machine—never connects to your Fabric environment
You'll need `npm` and `npx`; we recommend Node.js 20 LTS or later. To verify your installation, run `node --version`, `npm --version`, and `npx --version`. Then create or update your `mcp.json` file with the following:

```json
{
  "mcpServers": {
    "fabric-mcp-server": {
      "command": "npx",
      "args": [
        "-y",
        "@microsoft/fabric-mcp@latest",
        "server",
        "start",
        "--mode",
        "all"
      ]
    }
  }
}
```

> Note: When manually configuring Visual Studio and Visual Studio Code, use `servers` instead of `mcpServers` as the root object.
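As an illustration of the note above, a VS Code workspace configuration (`.vscode/mcp.json`) using the `servers` root would look like this sketch — same command and arguments as the configuration above; only the root key differs:

```json
{
  "servers": {
    "fabric-mcp-server": {
      "command": "npx",
      "args": ["-y", "@microsoft/fabric-mcp@latest", "server", "start", "--mode", "all"]
    }
  }
}
```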
Client-Specific Configuration
| IDE | File Location | Documentation Link |
|-----|---------------|-------------------|
| Claude Code | `~/.claude.json` or `.mcp.json` (project) | Claude Code MCP Configuration |
| Claude Desktop | `~/.claude/claude_desktop_config.json` (macOS), `%APPDATA%\Claude\claude_desktop_config.json` (Windows) | Claude Desktop MCP Setup |
| Cursor | `~/.cursor/mcp.json` or `.cursor/mcp.json` | Cursor MCP Documentation |
| VS Code | `.vscode/mcp.json` (workspace), `settings.json` (user) | VS Code MCP Documentation |
| Windsurf | `~/.codeium/windsurf/mcp_config.json` | Windsurf Cascade MCP Integration |
1. Open GitHub Copilot in VS Code and switch to Agent mode.
1. Click refresh on the tools list.
   - You should see the Fabric MCP Server in the list of tools.
1. Try a prompt that uses Fabric context, such as "What Fabric workload types are available?"
   - The agent should be able to use the Fabric MCP Server tools to complete your query.
1. Check out the Microsoft Fabric documentation and review the troubleshooting guide for commonly asked questions
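To sanity-check the server outside of an IDE, you can also launch it directly with the same command the configuration uses (the package is downloaded on first run):

```shell
npx -y @microsoft/fabric-mcp@latest server start --mode all
```

MCP servers launched this way typically communicate over stdio, so the process will appear to sit idle once started; your MCP client normally manages its lifecycle for you.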
✨ The Fabric MCP Server supercharges your agents with Microsoft Fabric context. Here are some prompts you can try:
* "What are the available Fabric workload types I can work with?"
* "Show me the OpenAPI operations for 'notebook' and give a sample creation body"
* "Get the platform-level API specifications for Microsoft Fabric"
* "List all supported Fabric item types"
* "Create a Lakehouse resource definition with a schema that enforces a string column and a datetime column"
* "Show me the JSON schema for a Data Pipeline item definition"
* "Generate a Semantic Model configuration with sample measures"
* "What properties are required for creating a KQL Database?"
* "Show me best practices for handling API throttling in Fabric"
* "How should I implement retry logic for Fabric API rate limits?"
* "List recommended retry/backoff behavior for Fabric APIs when rate-limited"
* "Show me best practices for authenticating with Fabric APIs"
* "Get example request/response payloads for creating a Notebook"
* "What are the pagination patterns for Fabric REST APIs?"
* "Generate a data pipeline configuration with sample data sources"
* "Help me scaffold a Fabric workspace with Lakehouse and notebooks"
* "Show me how to handle long-running operations in Fabric APIs"
* "What's the recommended error handling pattern for Fabric API calls?"
The Fabric MCP Server exposes the following tools for AI agents:
| Tool | Tool Name | Description |
|------|-----------|-------------|
| List Public APIs | `publicapis_list` | List all Microsoft Fabric workload types that have public API specifications available |
| Get Public API | `publicapis_get` | Retrieve the complete OpenAPI/Swagger specification for a specific Microsoft Fabric workload |
| Get Platform API | `publicapis_platform_get` | Retrieve the OpenAPI/Swagger specification for Microsoft Fabric platform APIs |
| Get Best Practices | `publicapis_bestpractices_get` | Retrieve embedded best practice documentation and guidance for a specific Microsoft Fabric topic |
| Get Best Practices Examples | `publicapis_bestpractices_examples_get` | Retrieve all example API request/response files for a specific Microsoft Fabric workload |
| Get Item Definition | `publicapis_bestpractices_itemdefinition_get` | Retrieve the JSON schema definitions for specific items within a Microsoft Fabric workload's API |
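Under the hood, MCP clients invoke these tools with standard JSON-RPC `tools/call` requests. As a rough illustration (the exact arguments each tool accepts come from the server's `tools/list` response, not from this example), a call to `publicapis_list` looks like:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "publicapis_list",
    "arguments": {}
  }
}
```

Your MCP client constructs these requests for you; you only ever see the tool names above in its tool list.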
> Always verify available commands via --help. Command names and availability may change between releases.
| Command | Purpose |
|---|---|
| `onelake download file` | Download a OneLake file to disk. |
| `onelake upload file` | Upload a local file into OneLake. |
| `onelake directory create` | Create a directory via the DFS endpoint. |
| `onelake directory delete` | Delete a directory (optionally recursive). |
| `onelake file list` | List files using the hierarchical file-list endpoint. |
| `onelake file delete` | Remove individual files from OneLake storage. |
| `onelake item list` | List workspace items and high-level metadata. |
| `onelake item list-data` | List Fabric items via the DFS endpoint. |
| `onelake item create` | Provision new Fabric items (lakehouse, notebook, etc.). |
All commands accept either GUID identifiers (`--workspace-id`, `--item-id`) or friendly names (`--workspace`, `--item`), with the exception of `onelake item create`, which currently requires GUID identifiers. Friendly-name items must be provided as `<name>.<type>` (for example, `SalesLakehouse.lakehouse`). Use `dotnet run -- onelake --help` (or `fabmcp onelake --help` for published builds) to inspect the complete option set before scripting.
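For example, using friendly names (the workspace and item names below are placeholders, not real resources):

```shell
# List files in a lakehouse, addressing it by friendly name (placeholder names)
fabmcp onelake file list --workspace SalesWorkspace --item SalesLakehouse.lakehouse

# Inspect the full option set (paths, recursion flags, etc.) before scripting
fabmcp onelake file list --help
```

Only the identifier options documented above are shown here; each subcommand has additional options that `--help` enumerates.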
- See the Microsoft Fabric documentation to learn about the Microsoft Fabric platform.
- For MCP server-specific issues, see the Troubleshooting Guide to diagnose and resolve common problems.
- We're building this in the open. Your feedback is much appreciated!
- 👉 Open an issue in the public GitHub repository — we'd love to hear from you!
The Fabric MCP Server is a local-first tool that runs entirely on your machine. It provides API specifications, schemas, and best practices without connecting to live Microsoft Fabric environments.
MCP is a new and rapidly evolving standard. As with any new technology standard, consider performing a security review to ensure that any systems integrating with MCP servers follow the regulations and standards your system is expected to adhere to.
The software may collect information about you and your use of the software and send it to Microsoft. Microsoft may use this information to provide services and improve our products and services. You may turn off the telemetry as described in the repository. There are also some features in the software that may enable you and Microsoft to collect data from users of your applications. If you use these features, you must comply with applicable law, including providing appropriate notices to users of your applications together with a copy of Microsoft's privacy statement. You can learn more about data collection and use in the help documentation and our privacy statement. Your use of the software operates as your consent to these practices.
We welcome contributions to the Fabric MCP Server, whether you're fixing bugs, adding new features, or improving documentation.
Please read our Contributing Guide for guidelines on:
* 🛠️ Setting up your development environment
* ✨ Adding new commands
* 📝 Code style and testing requirements
* 🔄 Making pull requests
---
This project is licensed under the MIT License — see the LICENSE file for details.