Dagster MCP Server

An MCP server, served over HTTP/SSE, that integrates with Dagster so AI agents can interact with your data pipelines.
The Model Context Protocol (MCP) is an open protocol that enables seamless integration between LLM applications and external data sources and tools. This repository provides an MCP server for interacting with Dagster, the data orchestration platform.
A Model Context Protocol server that enables AI agents to interact with Dagster instances, explore data pipelines, monitor runs, and manage assets. It serves as a bridge between LLMs and your data engineering workflows.
Read our launch post to learn more.
The server implements several tools for Dagster interaction:
- `list_repositories`: Lists all available Dagster repositories
- `list_jobs`: Lists all jobs in a specific repository
- `list_assets`: Lists all assets in a specific repository
- `recent_runs`: Gets recent Dagster runs (default limit: 10)
- `get_run_info`: Gets detailed information about a specific run
- `launch_run`: Launches a Dagster job run
- `materialize_asset`: Materializes a specific Dagster asset
- `terminate_run`: Terminates an in-progress Dagster run
- `get_asset_info`: Gets detailed information about a specific asset

The server connects to Dagster using these defaults:
- GraphQL endpoint: `http://localhost:3000/graphql`

To try the example agent, start the example Dagster instance:

```shell
uv run dagster dev -f ./examples/open-ai-agent/pipeline.py
```

Then start the MCP server over SSE:

```shell
uv run examples/open-ai-agent/run_sse_mcp.py
```

Finally, run the agent:

```shell
uv run ./examples/open-ai-agent/agent.py
```
Once the agent is running, you can ask it questions about your pipelines in natural language. The agent will use the MCP server to interact with your Dagster instance and provide answers based on your data pipelines.
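Under the hood, the agent invokes the tools listed above over the Model Context Protocol. As a hedged sketch of that wire format: the `tools/call` method and the `params` shape below come from the MCP specification, while the helper function itself is hypothetical and not part of this repository.

```python
import itertools

# Monotonic JSON-RPC request ids for the session.
_request_ids = itertools.count(1)

def make_tool_call(tool_name: str, arguments: dict) -> dict:
    """Build a JSON-RPC 2.0 request for an MCP tools/call invocation."""
    return {
        "jsonrpc": "2.0",
        "id": next(_request_ids),
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# For example, asking for the five most recent runs (the "limit" argument
# is assumed from the tool's documented default of 10):
request = make_tool_call("recent_runs", {"limit": 5})
```

In practice an MCP client library handles this framing for you; the sketch only shows what travels over the SSE transport.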