Microsoft Fabric Analytics
STDIO MCP server providing Microsoft Fabric analytics, query, and monitoring capabilities for AI assistants.
A comprehensive Model Context Protocol (MCP) server that provides analytics capabilities and tools for interacting with Microsoft Fabric data platform. This server enables AI assistants like Claude to seamlessly access, analyze, and monitor Microsoft Fabric resources through standardized MCP protocols, bringing the power of Microsoft Fabric directly to your AI conversations.
- **☸️ Enterprise Deployment** - Full Kubernetes and Azure deployment support with auto-scaling
- **🔄 Docker Support** - Containerized deployment with health checks and monitoring
- **📊 Monitoring & Observability** - Built-in Prometheus metrics and Grafana dashboards
- **🔀 Synapse to Fabric Migration** - Automated migration of Spark notebooks from Azure Synapse Analytics
- **🎯 52 Total Tools** - Comprehensive coverage of Fabric operations including migration (up from 48 tools)
## 🏗️ **New Workspace Management Features**
### **🆕 Latest Updates - Comprehensive Workspace Operations**
The MCP server now includes **21 new workspace management tools** that enable complete workspace lifecycle management:
### **🌟 Core Workspace Operations**
- **fabric_list_workspaces** - List all accessible workspaces with detailed metadata
- **fabric_create_workspace** - Create new workspaces with custom configuration
- **fabric_delete_workspace** - Delete workspaces with confirmation and cleanup
- **fabric_update_workspace** - Update workspace properties and settings
- **fabric_get_workspace** - Get detailed workspace information and status
### **⚡ Capacity & Resource Management**
- **fabric_list_capacities** - List all available Fabric capacities
- **fabric_assign_workspace_to_capacity** - Attach workspaces to dedicated capacity
- **fabric_unassign_workspace_from_capacity** - Move workspaces to shared capacity
- **fabric_list_capacity_workspaces** - List all workspaces in a capacity
### **👥 Access Control & Security**
- **fabric_get_workspace_role_assignments** - View workspace permissions
- **fabric_add_workspace_role_assignment** - Grant workspace access to users/groups
- **fabric_update_workspace_role_assignment** - Modify user permissions
- **fabric_remove_workspace_role_assignment** - Remove workspace access
### **🔄 Advanced Operations**
- **fabric_get_workspace_git_status** - Check Git integration status
- **fabric_connect_workspace_to_git** - Enable Git integration for workspace
- **fabric_disconnect_workspace_from_git** - Disable Git integration
- **fabric_update_workspace_git_connection** - Modify Git repository settings
### **🛠️ Environment & Pipeline Management**
- **fabric_list_workspace_environments** - List all environments in workspace
- **fabric_create_workspace_environment** - Create new environments
- **fabric_delete_workspace_environment** - Remove environments
- **fabric_list_workspace_data_pipelines** - List data integration pipelines
- **fabric_create_workspace_data_pipeline** - Create new data pipelines
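Under the hood, these workspace tools wrap the Fabric REST API. The sketch below is illustrative rather than the server's actual code (it assumes the public `POST https://api.fabric.microsoft.com/v1/workspaces` endpoint; the helper name is hypothetical) and shows roughly what `fabric_create_workspace` sends:

```typescript
// Illustrative sketch (hypothetical helper, not the server's implementation):
// build the REST request that creates a Fabric workspace.
const FABRIC_API = "https://api.fabric.microsoft.com/v1";

export interface CreateWorkspaceBody {
  displayName: string;
  description?: string;
  capacityId?: string; // optionally assign to a capacity at creation time
}

export function buildCreateWorkspaceRequest(
  bearerToken: string,
  body: CreateWorkspaceBody
) {
  return {
    url: `${FABRIC_API}/workspaces`,
    init: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${bearerToken}`,
        "Content-Type": "application/json",
      },
      // JSON payload carrying the workspace properties
      body: JSON.stringify(body),
    },
  };
}

// Usage: const { url, init } = buildCreateWorkspaceRequest(token, { displayName: "Analytics-Q1-2025" });
// await fetch(url, init);
```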
### **🎯 Real-World Scenarios Enabled**
**🚀 Automated Workspace Provisioning:**
"Create a new workspace called 'Analytics-Q1-2025' and assign it to our premium capacity"
**📊 Multi-Workspace Analytics:**
"List all workspaces in our tenant and show their capacity assignments"
**🔒 Access Management:**
"Add user [email protected] as Admin to the Analytics workspace"
**🏗️ Environment Setup:**
"Create a development environment in the Analytics workspace with Python and R libraries"
**🔄 Git Integration:**
"Connect the Analytics workspace to our GitHub repository for version control"
### **🤖 GitHub Copilot Integration**
**Perfect for GitHub Copilot** - The enhanced workspace management works seamlessly with **GitHub Copilot's built-in terminal**, making it ideal for:
- **🔧 Azure CLI Authentication** - Uses your existing `az login` session
- **💻 Terminal-Based Operations** - Natural workflow within your coding environment
- **⚡ Rapid Prototyping** - Quickly create test workspaces and environments
- **🏗️ Infrastructure as Code** - Manage Fabric resources alongside your codebase
- **🔄 CI/CD Integration** - Automate workspace provisioning in deployment pipelines
**GitHub Copilot Example Commands:**
```bash
# Using Azure CLI auth, create a new workspace for our ML project
# List all workspaces and their Git integration status
# Set up a complete analytics environment with lakehouse and notebooks
```
### 🔄 Synapse to Fabric Migration

The MCP server now includes 4 specialized migration tools that automate the migration of Spark notebooks and pipelines from Azure Synapse Analytics to Microsoft Fabric:

- **Automatic Code Transformation** - Converts Synapse-specific code to Fabric equivalents (e.g. `mssparkutils` → `notebookutils`)
- **Comprehensive Asset Discovery** - Inventories all migratable assets
- **Safe Testing with Dry Run** - Previews all changes before applying
- **End-to-End Automation** - Complete migration pipeline
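The code transformation step can be pictured as a small rewrite pass over notebook source. This is an illustrative sketch only, covering just the single `mssparkutils` → `notebookutils` rename named above; the real migration tools handle many more patterns:

```typescript
// Illustrative rewrite table: Synapse identifier -> Fabric equivalent.
// Only the rename documented above is included; this is not the real engine.
const SYNAPSE_TO_FABRIC: ReadonlyArray<[RegExp, string]> = [
  [/\bmssparkutils\b/g, "notebookutils"],
];

export function migrateCode(source: string): string {
  // Apply each rewrite rule in order. A dry run would diff this result
  // against the original instead of writing it back.
  return SYNAPSE_TO_FABRIC.reduce(
    (code, [pattern, replacement]) => code.replace(pattern, replacement),
    source
  );
}

// Example: migrateCode("mssparkutils.fs.ls('/lake')") → "notebookutils.fs.ls('/lake')"
```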
**📋 Explore Before Migrating:**
"List all my Synapse workspaces and show me what notebooks are in workspace 'analytics-synapse'"

**🔄 Preview Transformations:**
"Discover assets from my Synapse workspace 'analytics-synapse' and show me how the code would be transformed (dry run)"

**🚀 Complete Migration:**
"Migrate all notebooks from Synapse workspace 'analytics-synapse' to Fabric workspace 'abcd-1234' and create a lakehouse called 'MigratedData'"

**📊 Detailed Migration Guide**: See MIGRATION.md for comprehensive migration documentation.
The MCP server now includes comprehensive end-to-end testing that creates real workspaces, assigns them to capacities, and executes actual jobs to validate the complete workflow:
```bash
# One-command end-to-end test
npm run test:e2e
```
What it tests: workspace creation, capacity assignment, notebook and lakehouse creation, item validation, and job execution.
Recommended for AI Assistant Usage:
```json
{
  "mcpServers": {
    "fabric-analytics": {
      "command": "node",
      "args": ["C:\\path\\to\\your\\build\\index.js"],
      "cwd": "C:\\path\\to\\your\\project",
      "env": {
        "FABRIC_AUTH_METHOD": "bearer_token",
        "FABRIC_TOKEN": "your_bearer_token_here",
        "FABRIC_WORKSPACE_ID": "your_workspace_id",
        "ENABLE_HEALTH_SERVER": "false"
      }
    }
  }
}
```
💡 Get Bearer Token: Visit Power BI Embed Setup to generate tokens
⚠️ Important: Tokens expire after ~1 hour and need to be refreshed
If you experience 60-second timeouts during startup, this is due to interactive authentication flows blocking Claude Desktop's sandboxed environment. Solution:
**Use Bearer Token Method (Recommended):**
- Set `FABRIC_AUTH_METHOD: "bearer_token"` in your config
- Set `FABRIC_TOKEN` with a valid bearer token

**Alternative - Per-Tool Authentication:**
- Pass `bearerToken: "your_token_here"` on each tool call
- Or pass `bearerToken: "simulation"` to test without credentials

**Troubleshooting:**
🎯 Quick Fix: The server automatically prioritizes the `FABRIC_TOKEN` environment variable over interactive authentication flows, preventing Claude Desktop timeouts.
```bash
# Clone and run locally
git clone https://github.com/santhoshravindran7/Fabric-Analytics-MCP.git
cd Fabric-Analytics-MCP
npm install && npm run build && npm start
```

```bash
# Using Docker Compose
docker-compose up -d

# Or standalone Docker
docker build -t fabric-analytics-mcp .
docker run -p 3000:3000 -e FABRIC_CLIENT_ID=xxx fabric-analytics-mcp
```

```bash
# One-command enterprise deployment
export ACR_NAME="your-registry" FABRIC_CLIENT_ID="xxx" FABRIC_CLIENT_SECRET="yyy" FABRIC_TENANT_ID="zzz"
./scripts/setup-azure-resources.sh && ./scripts/build-and-push.sh && ./scripts/deploy-to-aks.sh
```

```bash
# Serverless deployment on Azure
az mcp server create --name "fabric-analytics-mcp" --repository "santhoshravindran7/Fabric-Analytics-MCP"
```
📚 Detailed Guides:
Tool: list-fabric-items
Description: List items in a Microsoft Fabric workspace (Lakehouses, Notebooks, etc.)
Parameters:
- `bearerToken`: Microsoft Fabric bearer token
- `workspaceId`: Microsoft Fabric workspace ID
- `itemType`: Filter by item type (optional)

Tool: create-fabric-item
Description: Create new items in Microsoft Fabric workspace
Parameters:
- `bearerToken`: Microsoft Fabric bearer token
- `workspaceId`: Microsoft Fabric workspace ID
- `itemType`: Type of item (Lakehouse, Notebook, Dataset, Report, Dashboard)
- `displayName`: Display name for the new item
- `description`: Optional description

Tool: get-fabric-item
Description: Get detailed information about a specific Microsoft Fabric item
Parameters:
- `bearerToken`: Microsoft Fabric bearer token
- `workspaceId`: Microsoft Fabric workspace ID
- `itemId`: ID of the item to retrieve

Tool: update-fabric-item
Description: Update existing items in Microsoft Fabric workspace
Parameters:
- `bearerToken`: Microsoft Fabric bearer token
- `workspaceId`: Microsoft Fabric workspace ID
- `itemId`: ID of the item to update
- `displayName`: New display name (optional)
- `description`: New description (optional)

Tool: delete-fabric-item
Description: Delete items from Microsoft Fabric workspace
Parameters:
- `bearerToken`: Microsoft Fabric bearer token
- `workspaceId`: Microsoft Fabric workspace ID
- `itemId`: ID of the item to delete

Tool: query-fabric-dataset
Parameters:
- `bearerToken`: Microsoft Fabric bearer token (optional - uses simulation if not provided)
- `workspaceId`: Microsoft Fabric workspace ID
- `datasetName`: Name of the dataset to query
- `query`: SQL or KQL query to execute

Tool: execute-fabric-notebook
Parameters:
- `bearerToken`: Microsoft Fabric bearer token
- `workspaceId`: Microsoft Fabric workspace ID
- `notebookId`: ID of the notebook to execute
- `parameters`: Optional parameters to pass to the notebook

Tool: get-fabric-metrics
Parameters:
- `workspaceId`: Microsoft Fabric workspace ID
- `itemId`: Item ID (dataset, report, etc.)
- `timeRange`: Time range for metrics (1h, 24h, 7d, 30d)
- `metrics`: List of metrics to analyze

Tool: analyze-fabric-model
Parameters:
- `workspaceId`: Microsoft Fabric workspace ID
- `itemId`: Item ID to analyze

Tool: generate-fabric-report
Parameters:
- `workspaceId`: Microsoft Fabric workspace ID
- `reportType`: Type of report (performance, usage, health, summary)

Tool: create-livy-session
Description: Create a new Livy session for interactive Spark/SQL execution
Parameters:
- `bearerToken`: Microsoft Fabric bearer token
- `workspaceId`: Microsoft Fabric workspace ID
- `lakehouseId`: Microsoft Fabric lakehouse ID
- `sessionConfig`: Optional session configuration

Tool: get-livy-session
Description: Get details of a Livy session
Parameters:
- `bearerToken`: Microsoft Fabric bearer token
- `workspaceId`: Microsoft Fabric workspace ID
- `lakehouseId`: Microsoft Fabric lakehouse ID
- `sessionId`: Livy session ID

Tool: list-livy-sessions
Description: List all Livy sessions in a lakehouse
Parameters:
- `bearerToken`: Microsoft Fabric bearer token
- `workspaceId`: Microsoft Fabric workspace ID
- `lakehouseId`: Microsoft Fabric lakehouse ID

Tool: delete-livy-session
Description: Delete a Livy session
Parameters:
- `bearerToken`: Microsoft Fabric bearer token
- `workspaceId`: Microsoft Fabric workspace ID
- `lakehouseId`: Microsoft Fabric lakehouse ID
- `sessionId`: Livy session ID

Tool: execute-livy-statement
Description: Execute SQL or Spark statements in a Livy session
Parameters:
- `bearerToken`: Microsoft Fabric bearer token
- `workspaceId`: Microsoft Fabric workspace ID
- `lakehouseId`: Microsoft Fabric lakehouse ID
- `sessionId`: Livy session ID
- `code`: SQL or Spark code to execute
- `kind`: Statement type (sql, spark, etc.)

Tool: get-livy-statement
Description: Get status and results of a Livy statement
Parameters:
- `bearerToken`: Microsoft Fabric bearer token
- `workspaceId`: Microsoft Fabric workspace ID
- `lakehouseId`: Microsoft Fabric lakehouse ID
- `sessionId`: Livy session ID
- `statementId`: Statement ID

Tool: create-livy-batch
Description: Create a new Livy batch job for long-running operations
Parameters:
- `bearerToken`: Microsoft Fabric bearer token
- `workspaceId`: Microsoft Fabric workspace ID
- `lakehouseId`: Microsoft Fabric lakehouse ID
- `batchConfig`: Batch job configuration

Tool: get-livy-batch
Description: Get details of a Livy batch job
Parameters:
- `bearerToken`: Microsoft Fabric bearer token
- `workspaceId`: Microsoft Fabric workspace ID
- `lakehouseId`: Microsoft Fabric lakehouse ID
- `batchId`: Batch job ID

Tool: list-livy-batches
Description: List all Livy batch jobs in a lakehouse
Parameters:
- `bearerToken`: Microsoft Fabric bearer token
- `workspaceId`: Microsoft Fabric workspace ID
- `lakehouseId`: Microsoft Fabric lakehouse ID

Tool: delete-livy-batch
Description: Delete a Livy batch job
Parameters:
- `bearerToken`: Microsoft Fabric bearer token
- `workspaceId`: Microsoft Fabric workspace ID
- `lakehouseId`: Microsoft Fabric lakehouse ID
- `batchId`: Batch job ID

Tool: get-workspace-spark-applications
Parameters:
- `bearerToken`: Microsoft Fabric bearer token
- `workspaceId`: Microsoft Fabric workspace ID
- `continuationToken`: Optional token for pagination

Tool: get-notebook-spark-applications
Description: Get all Spark applications for a specific notebook
Parameters:
- `bearerToken`: Microsoft Fabric bearer token
- `workspaceId`: Microsoft Fabric workspace ID
- `notebookId`: Notebook ID
- `continuationToken`: Optional token for pagination

Tool: get-lakehouse-spark-applications
Description: Get all Spark applications for a specific lakehouse
Parameters:
- `bearerToken`: Microsoft Fabric bearer token
- `workspaceId`: Microsoft Fabric workspace ID
- `lakehouseId`: Lakehouse ID
- `continuationToken`: Optional token for pagination

Tool: get-spark-job-definition-applications
Description: Get all Spark applications for a specific Spark Job Definition
Parameters:
- `bearerToken`: Microsoft Fabric bearer token
- `workspaceId`: Microsoft Fabric workspace ID
- `sparkJobDefinitionId`: Spark Job Definition ID
- `continuationToken`: Optional token for pagination

Tool: get-spark-application-details
Description: Get detailed information about a specific Spark application
Parameters:
- `bearerToken`: Microsoft Fabric bearer token
- `workspaceId`: Microsoft Fabric workspace ID
- `livyId`: Livy session ID

Tool: cancel-spark-application
Description: Cancel a running Spark application
Parameters:
- `bearerToken`: Microsoft Fabric bearer token
- `workspaceId`: Microsoft Fabric workspace ID
- `livyId`: Livy session ID

Tool: get-spark-monitoring-dashboard
Parameters:
- `bearerToken`: Microsoft Fabric bearer token
- `workspaceId`: Microsoft Fabric workspace ID

The MCP server provides comprehensive notebook management capabilities with predefined templates and custom notebook support.
Tool: create-fabric-notebook
Parameters:
- `bearerToken`: Microsoft Fabric bearer token
- `workspaceId`: Microsoft Fabric workspace ID
- `displayName`: Display name for the new notebook
- `template`: Template type (blank, sales_analysis, nyc_taxi_analysis, data_exploration, machine_learning, custom)
- `customNotebook`: Custom notebook definition (required if template is 'custom')
- `environmentId`: Optional environment ID to attach
- `lakehouseId`: Optional default lakehouse ID
- `lakehouseName`: Optional default lakehouse name

Available Templates: `blank`, `sales_analysis`, `nyc_taxi_analysis`, `data_exploration`, `machine_learning`, `custom`
Tool: get-fabric-notebook-definition
Parameters:
- `bearerToken`: Microsoft Fabric bearer token
- `workspaceId`: Microsoft Fabric workspace ID
- `notebookId`: ID of the notebook to retrieve
- `format`: Format to return (ipynb or fabricGitSource)

Tool: update-fabric-notebook-definition
Parameters:
- `bearerToken`: Microsoft Fabric bearer token
- `workspaceId`: Microsoft Fabric workspace ID
- `notebookId`: ID of the notebook to update
- `notebookDefinition`: Updated notebook definition object

Tool: run-fabric-notebook
Parameters:
- `bearerToken`: Microsoft Fabric bearer token
- `workspaceId`: Microsoft Fabric workspace ID
- `notebookId`: ID of the notebook to run
- `parameters`: Optional notebook parameters (key-value pairs with types)
- `configuration`: Optional execution configuration (environment, lakehouse, pools, etc.)
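For orientation, `run-fabric-notebook` maps onto Fabric's on-demand job API. The sketch below is an assumption-laden illustration: the `jobType=RunNotebook` query parameter and `executionData` payload follow the public Fabric job-scheduler API as I understand it, and the helper itself is hypothetical:

```typescript
// Hypothetical sketch of the request run-fabric-notebook could issue.
// Assumed endpoint:
//   POST /v1/workspaces/{workspaceId}/items/{notebookId}/jobs/instances?jobType=RunNotebook
export function buildRunNotebookRequest(opts: {
  bearerToken: string;
  workspaceId: string;
  notebookId: string;
  parameters?: Record<string, { value: unknown; type: string }>;
}) {
  const url =
    `https://api.fabric.microsoft.com/v1/workspaces/${opts.workspaceId}` +
    `/items/${opts.notebookId}/jobs/instances?jobType=RunNotebook`;
  return {
    url,
    init: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${opts.bearerToken}`,
        "Content-Type": "application/json",
      },
      // executionData carries the typed key-value parameters described above
      body: JSON.stringify({ executionData: { parameters: opts.parameters ?? {} } }),
    },
  };
}
```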
Choose your preferred installation method:
```bash
# Install via pip (easiest method)
pip install fabric-analytics-mcp

# Verify installation
fabric-analytics --version

# Start the server
fabric-analytics-mcp start
```
```bash
# Install globally via npm
npm install -g mcp-for-microsoft-fabric-analytics

# Verify installation
fabric-analytics --version

# Start the server
fabric-analytics

# Or using npx (no installation required)
npx mcp-for-microsoft-fabric-analytics
```
For automated setup with environment configuration:
Unix/Linux/macOS:
```bash
# Download and run universal installer
curl -fsSL https://raw.githubusercontent.com/santhoshravindran7/Fabric-Analytics-MCP/main/scripts/install-universal.sh | bash

# Or with options for full setup
curl -fsSL https://raw.githubusercontent.com/santhoshravindran7/Fabric-Analytics-MCP/main/scripts/install-universal.sh | bash -s -- --method pip --config --env --test
```
Windows (PowerShell):
```powershell
# Download and run Windows installer
iex ((New-Object System.Net.WebClient).DownloadString('https://raw.githubusercontent.com/santhoshravindran7/Fabric-Analytics-MCP/main/scripts/install-windows.ps1'))

# Or with options for full setup
& ([scriptblock]::Create((iwr https://raw.githubusercontent.com/santhoshravindran7/Fabric-Analytics-MCP/main/scripts/install-windows.ps1).Content)) -Method pip -Config -Environment -Test
```
```bash
# Clone repository
git clone https://github.com/santhoshravindran7/Fabric-Analytics-MCP.git
cd Fabric-Analytics-MCP

# Build and run with Docker
docker build -t fabric-analytics-mcp .
docker run -d --name fabric-mcp -p 3000:3000 --env-file .env fabric-analytics-mcp
```
📖 See Docker Installation Guide for detailed Docker and Kubernetes deployment options.
```bash
# Clone and build from source
git clone https://github.com/santhoshravindran7/Fabric-Analytics-MCP.git
cd Fabric-Analytics-MCP
npm install
npm run build
# ✅ All configuration files included!
```
Set up your environment variables:
```bash
export FABRIC_AUTH_METHOD=bearer_token  # or service_principal, interactive
export FABRIC_CLIENT_ID=your-client-id
export FABRIC_CLIENT_SECRET=your-client-secret
export FABRIC_TENANT_ID=your-tenant-id
export FABRIC_DEFAULT_WORKSPACE_ID=your-workspace-id
```
Add to your Claude Desktop config:
Windows: %APPDATA%\Claude\claude_desktop_config.json
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
```json
{
  "mcpServers": {
    "fabric-analytics": {
      "command": "fabric-analytics-mcp",
      "args": ["start"],
      "env": {
        "FABRIC_AUTH_METHOD": "bearer_token"
      }
    }
  }
}
```
```json
{
  "mcpServers": {
    "fabric-analytics": {
      "command": "fabric-analytics",
      "env": {
        "FABRIC_AUTH_METHOD": "bearer_token"
      }
    }
  }
}
```
```json
{
  "mcpServers": {
    "fabric-analytics": {
      "command": "node",
      "args": ["/ABSOLUTE/PATH/TO/PROJECT/build/index.js"]
    }
  }
}
```
Restart Claude Desktop and try these queries:
```bash
npm start    # Production mode
npm run dev  # Development mode with auto-reload
```
For comprehensive testing of Spark functionality, install Python dependencies:
```bash
pip install -r livy_requirements.txt
```
Available Test Scripts:
- `livy_api_test.ipynb` - Interactive notebook for step-by-step testing
- `comprehensive_livy_test.py` - Full-featured test with error handling
- `spark_monitoring_test.py` - Spark application monitoring tests
- `mcp_spark_monitoring_demo.py` - MCP server integration demo

Add this configuration to your Claude Desktop config file:
Windows: %APPDATA%\Claude\claude_desktop_config.json
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
```json
{
  "mcpServers": {
    "fabric-analytics": {
      "command": "node",
      "args": ["/ABSOLUTE/PATH/TO/PROJECT/build/index.js"]
    }
  }
}
```
🎉 You're ready! Restart Claude Desktop and start asking questions about your Microsoft Fabric data!
Once connected to Claude Desktop, you can ask natural language questions like:
Manage Microsoft Fabric capacity assignments directly from your AI assistant. These tools let you inspect available capacities, attach/detach workspaces, and audit capacity usage.
- `fabric_list_capacities` – Enumerate all capacities you can access (ID, SKU, region, state)
- `fabric_assign_workspace_to_capacity` – Attach a workspace to a dedicated capacity
- `fabric_unassign_workspace_from_capacity` – Return a workspace to shared capacity
- `fabric_list_capacity_workspaces` – List all workspaces currently hosted on a given capacity

Each tool accepts per-call authentication (a `bearerToken` field) or relies on global auth.

| Tool | Required Parameters | Optional |
|---|---|---|
| fabric_list_capacities | (none) | bearerToken |
| fabric_assign_workspace_to_capacity | capacityId, workspaceId | bearerToken |
| fabric_unassign_workspace_from_capacity | workspaceId | bearerToken |
| fabric_list_capacity_workspaces | capacityId | bearerToken |
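For context, `fabric_assign_workspace_to_capacity` corresponds to a single REST call. The helper below is an illustrative sketch (not the server's code) assuming the public `assignToCapacity` endpoint of the Fabric REST API:

```typescript
// Illustrative sketch (hypothetical helper): build the request that assigns
// a workspace to a capacity. Assumed endpoint:
//   POST /v1/workspaces/{workspaceId}/assignToCapacity
export function buildAssignToCapacityRequest(
  bearerToken: string,
  workspaceId: string,
  capacityId: string
) {
  return {
    url: `https://api.fabric.microsoft.com/v1/workspaces/${workspaceId}/assignToCapacity`,
    init: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${bearerToken}`,
        "Content-Type": "application/json",
      },
      // The capacity to move the workspace onto
      body: JSON.stringify({ capacityId }),
    },
  };
}
```

Unassigning is the mirror operation against an `unassignFromCapacity`-style endpoint with no capacity in the body.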
### Error: Unexpected token 'P'

If Claude Desktop or another MCP client reports an error like:

```
Error: Unexpected token 'P', "Please set"... is not valid JSON
SyntaxError: Unexpected token 'P', "Please set"... is not valid JSON
```

This is the exact issue reported in a GitHub issue where the user saw `Unexpected token 'P', "Please set"...` errors.
This almost always means something wrote plain text to STDOUT (which must contain ONLY JSON-RPC frames). Common causes:
- `console.log` debug statements in server code

Mitigations:

- The server redirects `console.log` / `console.info` to STDERR automatically
- Run with the `DEBUG_MCP_RUN=1` flag to capture diagnostics
- Avoid raw `console.log` statements; prefer `console.error` (goes to STDERR)
- `ALLOW_UNSAFE_STDOUT=true` disables the STDOUT protection (use with care)
- Rebuild (`npm run build`) after changes to ensure compiled output matches source

If the capacity tools don't show up when the client lists tools:
- Rebuild with `npm run build`
- Confirm the client is launching `build/index.js` and not an older snapshot
- Delete the `build/` folder and rebuild to clear stale artifacts
- Verify your `az login` session is still valid

Set the following (sent to STDERR, safe for MCP framing):
```bash
DEBUG_MCP_RUN=1
```
Optionally add structured auth tracing:
```bash
DEBUG_AUTH=1
```
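The automatic `console.log` → STDERR redirection mentioned above can be sketched in a few lines. This is an illustration of the idea, not the server's exact implementation:

```typescript
// Illustrative STDOUT guard for STDIO MCP servers: route console.log/info to
// STDERR so STDOUT carries only JSON-RPC frames.
export function protectStdout(): void {
  const toStderr =
    (label: string) =>
    (...args: unknown[]): void => {
      // Anything written here is safe for MCP framing.
      process.stderr.write(`[${label}] ${args.map(String).join(" ")}\n`);
    };
  console.log = toStderr("log");
  console.info = toStderr("info");
}
```

Call it once at startup, before any tool handlers run, so stray debug output can never corrupt the protocol stream.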
Need a new troubleshooting topic? Open an issue or PR so others benefit from the resolution.
This MCP server supports multiple authentication methods powered by Microsoft Authentication Library (MSAL):
🤖 For Claude Desktop: Use Bearer Token Authentication (Method #1) for the best experience and compatibility.
🔧 Claude Desktop Fix: Recent updates prevent authentication timeouts by prioritizing bearer tokens and adding timeout protection for interactive authentication flows.
Perfect for AI assistants and interactive usage:
For Claude Desktop:
- Set your token in `claude_desktop_config.json`

For Testing:
```bash
# All test scripts will prompt for authentication method
python enhanced_auth_test.py
```
Use Azure AD application credentials:
Environment Variables Setup:
```bash
export FABRIC_AUTH_METHOD="service_principal"
export FABRIC_CLIENT_ID="your-app-client-id"
export FABRIC_CLIENT_SECRET="your-app-client-secret"
export FABRIC_TENANT_ID="your-tenant-id"
export FABRIC_DEFAULT_WORKSPACE_ID="your-workspace-id"
```
Claude Desktop Configuration:
```json
{
  "mcpServers": {
    "fabric-analytics": {
      "command": "node",
      "args": ["/path/to/build/index.js"],
      "env": {
        "FABRIC_AUTH_METHOD": "service_principal",
        "FABRIC_CLIENT_ID": "your-client-id",
        "FABRIC_CLIENT_SECRET": "your-client-secret",
        "FABRIC_TENANT_ID": "your-tenant-id"
      }
    }
  }
}
```
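For reference, service-principal auth acquires a token via the standard Microsoft identity platform client-credentials grant (the server uses MSAL internally; this hypothetical helper just shows the equivalent raw token request):

```typescript
// Illustrative sketch of the client-credentials token request behind
// service-principal auth. The Fabric scope is the .default audience.
export function buildTokenRequest(opts: {
  tenantId: string;
  clientId: string;
  clientSecret: string;
}) {
  const form = new URLSearchParams({
    grant_type: "client_credentials",
    client_id: opts.clientId,
    client_secret: opts.clientSecret,
    scope: "https://api.fabric.microsoft.com/.default",
  });
  return {
    url: `https://login.microsoftonline.com/${opts.tenantId}/oauth2/v2.0/token`,
    init: {
      method: "POST",
      headers: { "Content-Type": "application/x-www-form-urlencoded" },
      body: form.toString(),
    },
  };
}
```

The `access_token` in the JSON response is then sent as the `Bearer` token on every Fabric API call until it expires.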
Sign in with browser on another device (great for headless environments):
```bash
export FABRIC_AUTH_METHOD="device_code"
export FABRIC_CLIENT_ID="your-client-id"
export FABRIC_TENANT_ID="your-tenant-id"
```
Automatic browser-based authentication:
```bash
export FABRIC_AUTH_METHOD="interactive"
export FABRIC_CLIENT_ID="your-client-id"
export FABRIC_TENANT_ID="your-tenant-id"
```
Use your existing Azure CLI login for seamless local testing:
```bash
export FABRIC_AUTH_METHOD="azure_cli"
```
Prerequisites:
- Install Azure CLI: `winget install Microsoft.AzureCLI` (Windows) or download the installer
- Sign in: `az login`
- Select a subscription if needed: `az account set --subscription "your-subscription-name"`

Benefits:
Quick Test:
```powershell
# Verify Azure CLI setup
npm run test:azure-cli

# Start MCP server with Azure CLI auth
$env:FABRIC_AUTH_METHOD="azure_cli"; npm start
```
💡 Pro Tip: Azure CLI authentication is perfect for developers who want to quickly test the MCP server without complex Azure AD app setup. Just `az login` and you're ready to go!
📚 Detailed Guides:
Check your authentication status:
"Check my Fabric authentication status"
"What authentication method am I using?"
"Test my Microsoft Fabric authentication setup"
Note: The MCP server seamlessly handles token validation and provides clear error messages for authentication issues.
Deploy the MCP server as a scalable service on Azure Kubernetes Service for enterprise production use.
```bash
# Build the Docker image
npm run docker:build

# Tag and push to Azure Container Registry
npm run docker:push
```
```bash
# Create Azure resources and deploy
./scripts/deploy-to-aks.sh
```
Once deployed, your MCP server will be available at:
```
https://your-aks-cluster.region.cloudapp.azure.com/mcp
```
The AKS deployment includes:
All Kubernetes manifests are located in the /k8s directory:
- `namespace.yaml` - Dedicated namespace
- `deployment.yaml` - Application deployment with scaling
- `service.yaml` - Load balancer service
- `ingress.yaml` - External access and SSL
- `configmap.yaml` - Configuration management
- `secret.yaml` - Secure credential storage
- `hpa.yaml` - Horizontal Pod Autoscaler

Configure the deployment by setting these environment variables:
```bash
export AZURE_SUBSCRIPTION_ID="your-subscription-id"
export AZURE_RESOURCE_GROUP="fabric-mcp-rg"
export AKS_CLUSTER_NAME="fabric-mcp-cluster"
export ACR_NAME="fabricmcpregistry"
export DOMAIN_NAME="your-domain.com"
```
The AKS deployment includes enterprise-grade security:
The deployment scripts support:
📚 Detailed Guide: See AKS_DEPLOYMENT.md for complete setup instructions.
Microsoft Azure now offers a preview service for hosting MCP servers natively. This eliminates the need for custom infrastructure management.
```bash
# Login to Azure
az login

# Enable MCP preview features
az extension add --name mcp-preview

# Deploy the MCP server
az mcp server create \
  --name "fabric-analytics-mcp" \
  --resource-group "your-rg" \
  --source-type "github" \
  --repository "santhoshravindran7/Fabric-Analytics-MCP" \
  --branch "main" \
  --auth-method "service-principal"
```
```bash
# Set up service principal authentication
az mcp server config set \
  --name "fabric-analytics-mcp" \
  --setting "FABRIC_CLIENT_ID=your-client-id" \
  --setting "FABRIC_CLIENT_SECRET=your-secret" \
  --setting "FABRIC_TENANT_ID=your-tenant-id"
```
```bash
# Get the server endpoint
az mcp server show --name "fabric-analytics-mcp" --query "endpoint"
```
Azure MCP Server offers:
📚 Learn More: Azure MCP Server Documentation
Note: Azure MCP Server is currently in preview. Check Azure Preview Terms for service availability and limitations.
This MCP server is built with:
The server uses the following configuration files:
- `tsconfig.json` - TypeScript compiler configuration
- `package.json` - Node.js package configuration
- `.vscode/mcp.json` - MCP server configuration for VS Code

```
├── src/
│   ├── index.ts           # Main MCP server implementation
│   └── fabric-client.ts   # Microsoft Fabric API client
├── build/                 # Compiled JavaScript output
├── tests/                 # Test scripts and notebooks
├── .vscode/               # VS Code configuration
├── package.json
├── tsconfig.json
└── README.md
```
To add new tools to the server, define the tool's schema and handler and register them with `server.tool()`.

This server includes:
✅ Production Ready:
🧪 Demonstration Features:
The MCP server includes comprehensive end-to-end testing that creates real workspaces, items, and jobs to validate complete functionality using Azure CLI authentication.
```bash
# 1. Set up end-to-end testing environment
npm run setup:e2e

# 2. Run the comprehensive end-to-end test
npm run test:e2e
```
The end-to-end test creates a complete workflow in your Microsoft Fabric tenant:
- Uses your existing `az login` session

The setup script creates a `.env.e2e` configuration file:
```bash
# Example configuration
FABRIC_CAPACITY_ID=your-capacity-id-here  # Optional: for capacity testing
E2E_TEST_TIMEOUT=300000                   # 5 minutes per operation
E2E_CLEANUP_ON_FAILURE=true               # Clean up on test failure
E2E_RETRY_COUNT=3                         # Retry failed operations
```
Azure CLI installed and logged in:
```bash
az login
```
Microsoft Fabric Access with permissions to:
Fabric Capacity (optional but recommended):
- Set `FABRIC_CAPACITY_ID` in `.env.e2e` for capacity testing

```bash
# Complete setup and run
npm run setup:e2e && npm run test:e2e

# Or run individual steps
npm run setup:e2e         # Set up environment
npm run test:e2e          # Run end-to-end test

# Direct execution
node setup-e2e.cjs        # Setup script
node test-end-to-end.cjs  # Test script
```
The test provides comprehensive output including:
```
🚀 Starting End-to-End Test for Microsoft Fabric Analytics MCP Server

✅ MCP Server Startup (1234ms)
✅ Azure CLI Authentication
✅ Workspace Creation
✅ Capacity Attachment
✅ Notebook Creation
✅ Lakehouse Creation
✅ Item Validation
✅ Job Execution

📊 TEST SUMMARY
================
✅ MCP Server Startup (2341ms)
✅ Azure CLI Authentication
✅ Workspace Creation
✅ Capacity Attachment
✅ Notebook Creation
✅ Lakehouse Creation
✅ Item Validation
✅ Job Execution

Total: 8 | Passed: 8 | Failed: 0
```
```bash
# Install Python dependencies for API testing
pip install -r livy_requirements.txt
```
- `livy_api_test.ipynb` - Interactive Jupyter notebook for step-by-step testing
- `comprehensive_livy_test.py` - Full-featured test with error handling
- `simple_livy_test.py` - Simple test following example patterns
- `livy_batch_test.py` - Batch job testing capabilities
- `spark_monitoring_test.py` - Spark application monitoring tests

Interactive Testing:
```bash
jupyter notebook livy_api_test.ipynb
```
Command Line Testing:
```bash
python simple_livy_test.py
python spark_monitoring_test.py
```
Comprehensive Testing:
```bash
python comprehensive_livy_test.py
```
We welcome contributions! Here's how to get started:
1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request

This project is licensed under the MIT License - see the LICENSE file for details.
For issues and questions:
This project began as my weekend hack exploring AI integration with Microsoft Fabric, sparked by a casual conversation with Chris and Bogdan about making AI tooling more accessible. What started as a personal experiment over a weekend is now available for everyone to build upon.