A Model Context Protocol (MCP) server implementation for Microsoft Fabric Real-Time Intelligence (RTI). This server enables AI agents to interact with Fabric RTI services by providing tools through the MCP interface, allowing for seamless data querying and analysis capabilities.
Note
This project is in Public Preview and implementation may significantly change prior to General Availability.
The Fabric RTI MCP Server acts as a bridge between AI agents and Microsoft Fabric RTI services:
Eventhouse (Kusto): Execute KQL queries against Microsoft Fabric RTI Eventhouse and Azure Data Explorer (ADX).
Eventstreams: Manage Microsoft Fabric Eventstreams for real-time data processing:
Eventhouse Analytics:

- `kusto_known_services` - List all available Kusto services configured in the MCP
- `kusto_query` - Execute KQL queries on the specified database
- `kusto_command` - Execute Kusto management commands (destructive operations)
- `kusto_list_databases` - List all databases in the Kusto cluster
- `kusto_list_tables` - List all tables in a specified database
- `kusto_get_entities_schema` - Get schema information for all entities (tables, materialized views, functions) in a database
- `kusto_get_table_schema` - Get detailed schema information for a specific table
- `kusto_get_function_schema` - Get schema information for a specific function, including parameters and output schema
- `kusto_sample_table_data` - Retrieve random sample records from a specified table
- `kusto_sample_function_data` - Retrieve random sample records from the result of a function call
- `kusto_ingest_inline_into_table` - Ingest inline CSV data into a specified table
- `kusto_get_shots` - Retrieve semantically similar query examples from a shots table using AI embeddings

Eventstream Management:

- `list_eventstreams` - List all Eventstreams in your Fabric workspace
- `get_eventstream` - Get detailed information about a specific Eventstream
- `get_eventstream_definition` - Retrieve the complete JSON definition of an Eventstream
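As a concrete illustration of the inline ingestion tool above: Kusto accepts CSV rows directly through the `.ingest inline into table` management command. The sketch below shows how such a command could be assembled from in-memory rows; the helper name and exact formatting are assumptions for illustration, not the server's actual implementation.

```python
import csv
import io

def build_ingest_inline_command(table: str, rows: list[list[object]]) -> str:
    """Build a Kusto `.ingest inline` command from in-memory rows.

    Hypothetical helper mirroring what a tool like
    kusto_ingest_inline_into_table might send; the real server may differ.
    """
    buf = io.StringIO()
    writer = csv.writer(buf, lineterminator="\n")
    writer.writerows(rows)
    csv_payload = buf.getvalue().rstrip("\n")
    # `.ingest inline into table <T> <|` is followed by raw CSV lines.
    return f".ingest inline into table {table} <|\n{csv_payload}"

cmd = build_ingest_inline_command("Events", [["2024-01-01", "login", 42]])
print(cmd)
```

Note that inline ingestion is intended for small, ad-hoc payloads; bulk loads should go through a proper ingestion pipeline.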
Install `uv` (on Windows):

```
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
```

or see the uv documentation for other installation options.
The Fabric RTI MCP Server is available on PyPI, so you can install it using pip. This is the easiest way to install the server.
1. Open the command palette (Ctrl+Shift+P) and run the command `MCP: Add Server`
2. Select `Pip` as the package source
3. When prompted, enter the package name `microsoft-fabric-rti-mcp`
4. Follow the prompts to install the package and add it to your settings.json or mcp.json file

The process should end with the following settings in your settings.json or mcp.json file:
```json
{
  "mcp": {
    "servers": {
      "fabric-rti-mcp": {
        "command": "uvx",
        "args": ["microsoft-fabric-rti-mcp"],
        "env": {
          "KUSTO_SERVICE_URI": "https://help.kusto.windows.net/",
          "KUSTO_SERVICE_DEFAULT_DB": "Samples",
          "AZ_OPENAI_EMBEDDING_ENDPOINT": "https://your-openai-resource.openai.azure.com/openai/deployments/text-embedding-ada-002/embeddings?api-version=2024-10-21;impersonate"
        }
      }
    }
  }
}
```
Note: All environment variables are optional. `KUSTO_SERVICE_URI` and `KUSTO_SERVICE_DEFAULT_DB` provide default cluster and database settings. `AZ_OPENAI_EMBEDDING_ENDPOINT` is only needed for the semantic search functionality in the `kusto_get_shots` tool.

Manual Install (install from source)
```
pip install .
# or
uv tool install .
```
Then add the following to your settings.json or mcp.json file:

```json
{
  "mcp": {
    "servers": {
      "fabric-rti-mcp": {
        "command": "uv",
        "args": [
          "--directory",
          "C:/path/to/fabric-rti-mcp/",
          "run",
          "-m",
          "fabric_rti_mcp.server"
        ],
        "env": {
          "KUSTO_SERVICE_URI": "https://help.kusto.windows.net/",
          "KUSTO_SERVICE_DEFAULT_DB": "Samples",
          "AZ_OPENAI_EMBEDDING_ENDPOINT": "https://your-openai-resource.openai.azure.com/openai/deployments/text-embedding-ada-002/embeddings?api-version=2024-10-21;impersonate"
        }
      }
    }
  }
}
```

Debugging the MCP Server locally
Assuming you have Python installed and the repo cloned:

1. Follow the Manual Install instructions above.
2. Once VS Code picks up the server and starts it, navigate to its output: run `MCP: List Servers` from the command palette, choose `fabric-rti-mcp`, and select `Show Output`.
3. Find the process ID (PID) of the server in the output.
4. Use the `Python: Attach` configuration in your launch.json file, and paste the PID of the server in the prompt.

The MCP server can be configured using the following environment variables:
Required Environment Variables: None. The server works with default settings for demo purposes.
Optional Environment Variables:

| Variable | Service | Description | Default | Example |
| --- | --- | --- | --- | --- |
| `KUSTO_SERVICE_URI` | Kusto | Default Kusto cluster URI | None | `https://mycluster.westus.kusto.windows.net` |
| `KUSTO_SERVICE_DEFAULT_DB` | Kusto | Default database name for Kusto queries | `NetDefaultDB` | `MyDatabase` |
| `AZ_OPENAI_EMBEDDING_ENDPOINT` | Kusto | Azure OpenAI embedding endpoint for semantic search in `kusto_get_shots` | None | `https://your-resource.openai.azure.com/openai/deployments/text-embedding-ada-002/embeddings?api-version=2024-10-21;impersonate` |
| `KUSTO_KNOWN_SERVICES` | Kusto | JSON array of preconfigured Kusto services | None | `[{"service_uri":"https://cluster1.kusto.windows.net","default_database":"DB1","description":"Prod"}]` |
| `KUSTO_EAGER_CONNECT` | Kusto | Whether to eagerly connect to the default service on startup (not recommended) | `false` | `true` or `false` |
| `KUSTO_ALLOW_UNKNOWN_SERVICES` | Kusto | Security setting that allows connections to services not listed in `KUSTO_KNOWN_SERVICES` | `true` | `true` or `false` |
| `FABRIC_API_BASE` | Global | Base URL for the Microsoft Fabric API | `https://api.fabric.microsoft.com/v1` | `https://api.fabric.microsoft.com/v1` |
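Since `KUSTO_KNOWN_SERVICES` is a JSON array, it is easy to sanity-check the value before pointing an agent at it. The snippet below is a hypothetical helper (not the server's own code) that parses and validates a value shaped like the example above:

```python
import json
import os

# Example value in the same shape as documented for KUSTO_KNOWN_SERVICES.
os.environ["KUSTO_KNOWN_SERVICES"] = (
    '[{"service_uri":"https://cluster1.kusto.windows.net",'
    '"default_database":"DB1","description":"Prod"}]'
)

def load_known_services() -> list[dict]:
    """Parse KUSTO_KNOWN_SERVICES from the environment.

    Hypothetical validation helper; the server may validate differently.
    """
    raw = os.environ.get("KUSTO_KNOWN_SERVICES")
    if not raw:
        return []  # variable is optional, so an empty list is a valid result
    services = json.loads(raw)
    for svc in services:
        if "service_uri" not in svc:
            raise ValueError(f"service entry missing 'service_uri': {svc}")
    return services

services = load_known_services()
print(services[0]["service_uri"])  # https://cluster1.kusto.windows.net
```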
Embedding Endpoint Configuration

The `AZ_OPENAI_EMBEDDING_ENDPOINT` is used by the semantic search functionality (e.g., the `kusto_get_shots` tool) to find similar query examples.

Format Requirements:

```
https://{your-openai-resource}.openai.azure.com/openai/deployments/{deployment-name}/embeddings?api-version={api-version};impersonate
```

Components:

- `{your-openai-resource}`: Your Azure OpenAI resource name
- `{deployment-name}`: Your text embedding deployment name (e.g., `text-embedding-ada-002`)
- `{api-version}`: API version (e.g., `2024-10-21`, `2023-05-15`)
- `;impersonate`: Authentication method (you might use managed identity instead)

Authentication Requirements: The `kusto_get_shots` tool retrieves the shots that are most similar to your prompt from the shots table. This function requires the `AZ_OPENAI_EMBEDDING_ENDPOINT` environment variable to be configured as described above.
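A malformed endpoint is a common source of silent semantic-search failures, so it can be worth checking the value against the documented shape. This is a hypothetical, stdlib-only checker, not part of the server:

```python
from urllib.parse import urlparse

def parse_embedding_endpoint(endpoint: str) -> dict:
    """Split an embedding endpoint of the documented shape into components.

    Hypothetical sanity-check helper; the server may validate differently.
    """
    parsed = urlparse(endpoint)
    segments = [s for s in parsed.path.split("/") if s]
    # Expected path: openai/deployments/{deployment-name}/embeddings
    if len(segments) < 4 or segments[0] != "openai" or segments[1] != "deployments":
        raise ValueError(f"unexpected path: {parsed.path}")
    # The query string carries both the api-version and the auth suffix,
    # separated by a semicolon: "api-version=...;impersonate".
    query, _, auth = parsed.query.partition(";")
    return {
        "resource_host": parsed.netloc,
        "deployment": segments[2],
        "api_version": query.removeprefix("api-version="),
        "auth": auth,
    }

info = parse_embedding_endpoint(
    "https://your-openai-resource.openai.azure.com/openai/deployments/"
    "text-embedding-ada-002/embeddings?api-version=2024-10-21;impersonate"
)
print(info["deployment"], info["api_version"], info["auth"])
```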
The MCP Server integrates with your host operating system's authentication mechanisms. It uses Azure Identity via `DefaultAzureCredential`, which tries these authentication methods in order:

1. Environment variables (`EnvironmentCredential`) - Perfect for CI/CD pipelines
2. Visual Studio (`VisualStudioCredential`) - Uses your Visual Studio credentials
3. Azure CLI (`AzureCliCredential`) - Uses your existing Azure CLI login
4. Azure PowerShell (`AzurePowerShellCredential`) - Uses your Az PowerShell login
5. Azure Developer CLI (`AzureDeveloperCliCredential`) - Uses your azd login
6. Interactive browser (`InteractiveBrowserCredential`) - Falls back to browser-based login if needed

If you're already logged in through any of these methods, the Fabric RTI MCP Server will automatically use those credentials.
Your credentials are always handled through the official Azure Identity SDK; the server never stores or manages tokens directly.
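The ordered-fallback behavior of `DefaultAzureCredential` can be illustrated with a toy chain. This is purely a sketch of the pattern; the provider functions below are stand-ins, not the Azure Identity SDK:

```python
class CredentialUnavailable(Exception):
    """Raised by a provider that cannot supply a token in this environment."""

def env_credential():
    # Stand-in for EnvironmentCredential: fails when no AZURE_* vars are set.
    raise CredentialUnavailable("no AZURE_* environment variables")

def cli_credential():
    # Stand-in for AzureCliCredential: succeeds in this toy scenario.
    return "token-from-azure-cli"

def browser_credential():
    # Stand-in for InteractiveBrowserCredential: the last resort.
    return "token-from-browser"

def get_token(chain):
    """Try each credential in order, like DefaultAzureCredential's chain."""
    for provider in chain:
        try:
            return provider()
        except CredentialUnavailable:
            continue  # this provider can't help here; try the next one
    raise RuntimeError("no credential in the chain could authenticate")

token = get_token([env_credential, cli_credential, browser_credential])
print(token)  # the first provider that succeeds wins
```

The practical upshot: the method earliest in the chain that works on your machine is the one the server uses, which is why an existing `az login` session is usually picked up without any extra configuration.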
MCP is a new and rapidly evolving standard. As with any new technology standard, consider performing a security review to ensure that any systems integrating with MCP servers follow the regulations and standards your system is expected to adhere to. This applies not only to the Fabric RTI MCP Server, but to any MCP client or agent you choose to implement, down to the model provider.
This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.
When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.
This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.
The software may collect information about you and your use of the software and send it to Microsoft. Microsoft may use this information to provide services and improve our products and services. You may turn off the telemetry as described in the repository. There are also some features in the software that may enable you and Microsoft to collect data from users of your applications. If you use these features, you must comply with applicable law, including providing appropriate notices to users of your applications together with a copy of Microsoft's privacy statement. Our privacy statement is located at https://go.microsoft.com/fwlink/?LinkID=824704. You can learn more about data collection and use in the help documentation and our privacy statement. Your use of the software operates as your consent to these practices.
This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow Microsoft's Trademark & Brand Guidelines. Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos are subject to those third-party's policies.