mcp-airflow-api 0.0.0__tar.gz
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- mcp_airflow_api-0.0.0/LICENSE +21 -0
- mcp_airflow_api-0.0.0/MANIFEST.in +1 -0
- mcp_airflow_api-0.0.0/PKG-INFO +169 -0
- mcp_airflow_api-0.0.0/README.md +156 -0
- mcp_airflow_api-0.0.0/pyproject.toml +25 -0
- mcp_airflow_api-0.0.0/setup.cfg +4 -0
- mcp_airflow_api-0.0.0/src/mcp_airflow_api/__init__.py +1 -0
- mcp_airflow_api-0.0.0/src/mcp_airflow_api/airflow_api.py +270 -0
- mcp_airflow_api-0.0.0/src/mcp_airflow_api/functions.py +63 -0
- mcp_airflow_api-0.0.0/src/mcp_airflow_api/prompt_template.md +56 -0
- mcp_airflow_api-0.0.0/src/mcp_airflow_api.egg-info/PKG-INFO +169 -0
- mcp_airflow_api-0.0.0/src/mcp_airflow_api.egg-info/SOURCES.txt +14 -0
- mcp_airflow_api-0.0.0/src/mcp_airflow_api.egg-info/dependency_links.txt +1 -0
- mcp_airflow_api-0.0.0/src/mcp_airflow_api.egg-info/entry_points.txt +2 -0
- mcp_airflow_api-0.0.0/src/mcp_airflow_api.egg-info/requires.txt +4 -0
- mcp_airflow_api-0.0.0/src/mcp_airflow_api.egg-info/top_level.txt +1 -0
@@ -0,0 +1,21 @@ LICENSE

The MIT License (MIT)

Copyright (c) 2015 Anthony Lapenna

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
@@ -0,0 +1 @@ MANIFEST.in

include src/mcp_airflow_api/prompt_template.md
@@ -0,0 +1,169 @@ PKG-INFO

Metadata-Version: 2.4
Name: mcp-airflow-api
Version: 0.0.0
Summary: Model Context Protocol (MCP) server for Apache Airflow API integration. Provides comprehensive tools for managing Airflow clusters including service operations, configuration management, status monitoring, and request tracking.
Requires-Python: >=3.11
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: fastapi>=0.116.1
Requires-Dist: requests>=2.32.4
Requires-Dist: uvicorn>=0.35.0
Requires-Dist: mcp>=1.12.3
Dynamic: license-file

Model Context Protocol (MCP) server for Apache Airflow API integration.
This project provides natural language MCP tools for essential Airflow cluster operations.

[](https://github.com/call518/MCP-Airflow-API/actions/workflows/pypi-publish.yml)

[](https://smithery.ai/server/@call518/mcp-airflow-api)

---

# MCP-Airflow-API

**Tested and supported Airflow version: 2.10.2 (API Version: v1)**

## Features

- List all DAGs in the Airflow cluster
- Monitor running/failed DAG runs
- Trigger DAG runs on demand
- Minimal, LLM-friendly output for all tools
- Easy integration with MCP Inspector, OpenWebUI, Smithery, etc.

---

## Available MCP Tools

### DAG Management

- `list_dags`
  Returns all DAGs registered in the Airflow cluster.
  Output: `dag_id`, `dag_display_name`, `is_active`, `is_paused`, `owners`, `tags`

- `running_dags`
  Returns all currently running DAG runs.
  Output: `dag_id`, `run_id`, `state`, `execution_date`, `start_date`, `end_date`

- `failed_dags`
  Returns all recently failed DAG runs.
  Output: `dag_id`, `run_id`, `state`, `execution_date`, `start_date`, `end_date`

- `trigger_dag(dag_id)`
  Immediately triggers the specified DAG.
  Output: `dag_id`, `run_id`, `state`, `execution_date`, `start_date`, `end_date`

- `pause_dag(dag_id)`
  Pauses the specified DAG (prevents scheduling new runs).
  Output: `dag_id`, `is_paused`

- `unpause_dag(dag_id)`
  Unpauses the specified DAG (allows scheduling new runs).
  Output: `dag_id`, `is_paused`

---

## Prompt Template

The package exposes a tool `get_prompt_template` that returns either the entire template, a specific section, or just the headings. Three MCP prompts (`prompt_template_full`, `prompt_template_headings`, `prompt_template_section`) are also registered for discovery.

### MCP Prompts

For easier discoverability in MCP clients (so `prompts/list` is not empty), the server registers three prompts:

- `prompt_template_full` – returns the full canonical template
- `prompt_template_headings` – returns only the section headings
- `prompt_template_section` – takes a `section` argument (number or keyword) and returns that section

You can still use the `get_prompt_template` tool for programmatic access or when you prefer tool invocation over prompt retrieval.

A single canonical English prompt template guides safe and efficient tool selection.

Files:

- Packaged: `src/mcp_airflow_api/prompt_template.md` (distributed with the PyPI package)
- An optional workspace-root copy, `PROMPT_TEMPLATE.md`, may exist for editing; the packaged copy is the one loaded at runtime.

Retrieve dynamically via the MCP tool:

- `get_prompt_template()` – full template
- `get_prompt_template("tool map")` – only the tool mapping section
- `get_prompt_template("3")` – section 3 (tool map)
- `get_prompt_template(mode="headings")` – list all section headings

Policy: only English is stored; the LLM uses the English instructions for internal reasoning regardless of the user's query language, and generates user-facing responses in the user's language when needed.

---

## Main Tool Files

- MCP tool definitions: `src/mcp_airflow_api/airflow_api.py`
- Utility functions: `src/mcp_airflow_api/functions.py`

---

## How To Use

1. In your MCP Tools environment, configure `mcp-config.json` as follows:

```json
{
  "mcpServers": {
    "airflow-api": {
      "command": "uvx",
      "args": ["--python", "3.11", "mcp-airflow-api"],
      "env": {
        "AIRFLOW_API_URL": "http://localhost:38080/api/v1",
        "AIRFLOW_API_USERNAME": "airflow",
        "AIRFLOW_API_PASSWORD": "airflow",
        "AIRFLOW_LOG_LEVEL": "INFO"
      }
    }
  }
}
```

2. Register the MCP server in MCP Inspector, OpenWebUI, Smithery, etc. and use the tools.

---

## QuickStart (Demo): Running MCP-Airflow-API with Docker

1. Prepare an Airflow cluster
   - See the [official Airflow Docker install guide](https://airflow.apache.org/docs/apache-airflow/stable/start/docker.html)

2. Prepare the MCP Tools environment
   - Install Docker and Docker Compose
   - Clone this project and run `docker-compose up -d` in the root directory

3. Register the MCP server in MCP Inspector/Smithery
   - Example address: `http://localhost:8000/airflow-api`

---

## Logging & Observability

- Structured logs for all tool invocations and HTTP requests
- Control the log level via an environment variable (`AIRFLOW_LOG_LEVEL`) or CLI flag (`--log-level`)
- Supported levels: DEBUG, INFO, WARNING, ERROR, CRITICAL

---

## License

This project is licensed under the MIT License.

---

## Roadmap

This project starts with a minimal set of essential Airflow management tools. Many more useful features and tools for Airflow cluster operations will be added soon, including advanced monitoring, DAG/task analytics, scheduling controls, and more. Contributions and suggestions are welcome!

---

## Additional Links

- [Code](https://github.com/call518/MCP-Airflow-API)
- [Issues](https://github.com/call518/MCP-Airflow-API/issues)
- [Smithery Deployment](https://smithery.ai/server/@call518/mcp-airflow-api)
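The server authenticates to Airflow with HTTP Basic Auth built from `AIRFLOW_API_USERNAME` and `AIRFLOW_API_PASSWORD`. As a sanity check, the `Authorization` header implied by the demo credentials in the example config (`airflow`/`airflow`) can be computed by hand; this is an illustrative sketch, not part of the package:

```python
import base64

# Basic Auth header for the demo credentials used in the example config
# above (username "airflow", password "airflow"); illustrative only.
username, password = "airflow", "airflow"
token = base64.b64encode(f"{username}:{password}".encode()).decode()
auth_header = f"Basic {token}"
print(auth_header)  # Basic YWlyZmxvdzphaXJmbG93
```

The same header can be observed with `curl -v -u airflow:airflow` against the configured `AIRFLOW_API_URL` when verifying connectivity before wiring up an MCP client.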
@@ -0,0 +1,156 @@ README.md

(Body identical to the README portion of PKG-INFO above.)
@@ -0,0 +1,25 @@ pyproject.toml

[project]
name = "mcp-airflow-api"
version = "0.0.0"
description = "Model Context Protocol (MCP) server for Apache Airflow API integration. Provides comprehensive tools for managing Airflow clusters including service operations, configuration management, status monitoring, and request tracking."
readme = "README.md"
requires-python = ">=3.11"
dependencies = [
    "fastapi>=0.116.1",
    "requests>=2.32.4",
    "uvicorn>=0.35.0",
    "mcp>=1.12.3",
]

[project.scripts]
mcp-airflow-api = "mcp_airflow_api.airflow_api:main"

[tool.setuptools.packages.find]
where = ["src"]
include = ["mcp_airflow_api*"]

[tool.setuptools]
include-package-data = true

[tool.setuptools.package-data]
mcp_airflow_api = ["prompt_template.md"]
@@ -0,0 +1 @@ src/mcp_airflow_api/__init__.py

"""MCP Airflow API package."""
@@ -0,0 +1,270 @@ src/mcp_airflow_api/airflow_api.py

"""
MCP tool definitions for Airflow REST API operations.
"""
import argparse
import logging
import os
from typing import Any, Dict, List, Optional

from mcp.server.fastmcp import FastMCP

from .functions import airflow_request, read_prompt_template, parse_prompt_sections

# Setup logging
logger = logging.getLogger(__name__)
logging.basicConfig(
    level=logging.INFO,
    format='%(asctime)s - %(name)s - %(levelname)s - %(message)s'
)

# MCP server instance for registering tools
mcp = FastMCP("mcp-airflow-api")

PROMPT_TEMPLATE_PATH = os.path.join(os.path.dirname(__file__), "prompt_template.md")


@mcp.tool()
def get_prompt_template(section: Optional[str] = None, mode: Optional[str] = None) -> str:
    """
    Returns the MCP prompt template (full, headings, or a specific section).

    Args:
        section: Section number or keyword (optional)
        mode: 'full', 'headings', or None (optional)
    """
    template = read_prompt_template(PROMPT_TEMPLATE_PATH)

    if mode == "headings":
        headings, _ = parse_prompt_sections(template)
        lines = ["Section Headings:"]
        for idx, title in enumerate(headings, 1):
            lines.append(f"{idx}. {title}")
        return "\n".join(lines)

    if section:
        headings, sections = parse_prompt_sections(template)
        # Try by number
        try:
            idx = int(section) - 1
            if 0 <= idx < len(sections):
                return sections[idx]
        except ValueError:
            pass
        # Try by keyword
        section_lower = section.strip().lower()
        for i, heading in enumerate(headings):
            if section_lower in heading.lower():
                return sections[i]
        return f"Section '{section}' not found."

    return template


@mcp.tool()
def list_dags() -> Dict[str, Any]:
    """
    [Tool Role]: Lists all DAGs registered in the Airflow cluster.

    Returns:
        List of DAGs with minimal info: dag_id, dag_display_name, is_active, is_paused, owners, tags
    """
    resp = airflow_request("GET", "/dags")
    resp.raise_for_status()
    dags = resp.json().get("dags", [])
    minimal = []
    for dag in dags:
        minimal.append({
            "dag_id": dag.get("dag_id"),
            "dag_display_name": dag.get("dag_display_name"),
            "is_active": dag.get("is_active"),
            "is_paused": dag.get("is_paused"),
            "owners": dag.get("owners"),
            "tags": [t.get("name") for t in dag.get("tags", [])]
        })
    return {"dags": minimal}


@mcp.tool()
def running_dags() -> Dict[str, Any]:
    """
    [Tool Role]: Lists all currently running DAG runs in the Airflow cluster.

    Returns:
        List of running DAG runs with minimal info: dag_id, run_id, state, execution_date, start_date, end_date
    """
    dags_resp = airflow_request("GET", "/dags")
    dags_resp.raise_for_status()
    dags = dags_resp.json().get("dags", [])
    running = []
    for dag in dags:
        dag_id = dag.get("dag_id")
        if not dag_id:
            continue
        runs_resp = airflow_request("GET", f"/dags/{dag_id}/dagRuns")
        runs_resp.raise_for_status()
        runs = runs_resp.json().get("dag_runs", [])
        for run in runs:
            if run.get("state") == "running":
                running.append({
                    "dag_id": dag_id,
                    # Airflow REST API v1 returns the run identifier as "dag_run_id"
                    "run_id": run.get("dag_run_id"),
                    "state": run.get("state"),
                    "execution_date": run.get("execution_date"),
                    "start_date": run.get("start_date"),
                    "end_date": run.get("end_date")
                })
    return {"dag_runs": running}


@mcp.tool()
def failed_dags() -> Dict[str, Any]:
    """
    [Tool Role]: Lists all recently failed DAG runs in the Airflow cluster.

    Returns:
        List of failed DAG runs with minimal info: dag_id, run_id, state, execution_date, start_date, end_date
    """
    dags_resp = airflow_request("GET", "/dags")
    dags_resp.raise_for_status()
    dags = dags_resp.json().get("dags", [])
    failed = []
    for dag in dags:
        dag_id = dag.get("dag_id")
        if not dag_id:
            continue
        runs_resp = airflow_request("GET", f"/dags/{dag_id}/dagRuns")
        runs_resp.raise_for_status()
        runs = runs_resp.json().get("dag_runs", [])
        for run in runs:
            if run.get("state") == "failed":
                failed.append({
                    "dag_id": dag_id,
                    "run_id": run.get("dag_run_id"),
                    "state": run.get("state"),
                    "execution_date": run.get("execution_date"),
                    "start_date": run.get("start_date"),
                    "end_date": run.get("end_date")
                })
    return {"dag_runs": failed}


@mcp.tool()
def trigger_dag(dag_id: str) -> Dict[str, Any]:
    """
    [Tool Role]: Triggers a new DAG run for a specified Airflow DAG.

    Args:
        dag_id: The DAG ID to trigger

    Returns:
        Minimal info about the triggered DAG run: dag_id, run_id, state, execution_date, start_date, end_date
    """
    if not dag_id:
        raise ValueError("dag_id must not be empty")
    resp = airflow_request("POST", f"/dags/{dag_id}/dagRuns", json={"conf": {}})
    resp.raise_for_status()
    run = resp.json()
    return {
        "dag_id": dag_id,
        # Airflow REST API v1 returns the run identifier as "dag_run_id"
        "run_id": run.get("dag_run_id"),
        "state": run.get("state"),
        "execution_date": run.get("execution_date"),
        "start_date": run.get("start_date"),
        "end_date": run.get("end_date")
    }


@mcp.tool()
def pause_dag(dag_id: str) -> Dict[str, Any]:
    """
    [Tool Role]: Pauses the specified Airflow DAG (prevents scheduling new runs).

    Args:
        dag_id: The DAG ID to pause

    Returns:
        Minimal info about the paused DAG: dag_id, is_paused
    """
    if not dag_id:
        raise ValueError("dag_id must not be empty")
    resp = airflow_request("PATCH", f"/dags/{dag_id}", json={"is_paused": True})
    resp.raise_for_status()
    dag = resp.json()
    return {"dag_id": dag.get("dag_id", dag_id), "is_paused": dag.get("is_paused", True)}


@mcp.tool()
def unpause_dag(dag_id: str) -> Dict[str, Any]:
    """
    [Tool Role]: Unpauses the specified Airflow DAG (allows scheduling new runs).

    Args:
        dag_id: The DAG ID to unpause

    Returns:
        Minimal info about the unpaused DAG: dag_id, is_paused
    """
    if not dag_id:
        raise ValueError("dag_id must not be empty")
    resp = airflow_request("PATCH", f"/dags/{dag_id}", json={"is_paused": False})
    resp.raise_for_status()
    dag = resp.json()
    return {"dag_id": dag.get("dag_id", dag_id), "is_paused": dag.get("is_paused", False)}


#========================================================================================
# MCP Prompts (for prompts/list exposure)
#========================================================================================

@mcp.prompt("prompt_template_full")
def prompt_template_full_prompt() -> str:
    """Return the full canonical prompt template."""
    return read_prompt_template(PROMPT_TEMPLATE_PATH)


@mcp.prompt("prompt_template_headings")
def prompt_template_headings_prompt() -> str:
    """Return a compact list of section headings."""
    template = read_prompt_template(PROMPT_TEMPLATE_PATH)
    headings, _ = parse_prompt_sections(template)
    lines = ["Section Headings:"]
    for idx, title in enumerate(headings, 1):
        lines.append(f"{idx}. {title}")
    return "\n".join(lines)


@mcp.prompt("prompt_template_section")
def prompt_template_section_prompt(section: Optional[str] = None) -> str:
    """Return a specific prompt template section by number or keyword."""
    if not section:
        headings_result = prompt_template_headings_prompt()
        return "\n".join([
            "[HELP] Missing 'section' argument.",
            "Specify a section number or keyword.",
            "Examples: 1 | overview | tool map | usage",
            headings_result.strip()
        ])
    return get_prompt_template(section=section)

#========================================================================================

def main(argv: Optional[List[str]] = None):
    """Entrypoint for the MCP Airflow API server.

    Supports optional CLI arguments (e.g. --log-level DEBUG) while remaining
    backward-compatible with stdio launcher expectations.
    """
    parser = argparse.ArgumentParser(prog="mcp-airflow-api", description="MCP Airflow API Server")
    parser.add_argument(
        "--log-level", "-l",
        dest="log_level",
        help="Logging level override (DEBUG, INFO, WARNING, ERROR, CRITICAL). Overrides the AIRFLOW_LOG_LEVEL env var if provided.",
        choices=["DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"],
    )
    # Allow future extension without breaking unknown-args usage
    args = parser.parse_args(argv)

    if args.log_level:
        # Override the root and module logger levels
        logging.getLogger().setLevel(args.log_level)
        logger.setLevel(args.log_level)
        logging.getLogger("urllib3").setLevel("WARNING")  # reduce HTTP noise at DEBUG
        logger.info("Log level set via CLI to %s", args.log_level)
    else:
        logger.debug("Log level from environment: %s", logging.getLogger().level)

    mcp.run(transport='stdio')


if __name__ == "__main__":
    main()
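`get_prompt_template` resolves its `section` argument first as a 1-based section number and then as a case-insensitive keyword match against the headings. That fallback order can be sketched in isolation; the headings and section bodies here are stand-ins, not the real template:

```python
# Sketch of the section-resolution order used by get_prompt_template:
# try the argument as a 1-based section number first, then as a
# case-insensitive substring match against the headings. Illustrative only.
headings = ["1. Overview", "2. Available MCP Tools", "3. Tool Map"]
sections = ["<overview>", "<tools>", "<tool map>"]

def resolve(section: str) -> str:
    try:
        idx = int(section) - 1
        if 0 <= idx < len(sections):
            return sections[idx]
    except ValueError:
        pass
    s = section.strip().lower()
    for i, heading in enumerate(headings):
        if s in heading.lower():
            return sections[i]
    return f"Section '{section}' not found."

print(resolve("3"))         # <tool map>
print(resolve("tool map"))  # <tool map>
```

Number lookup winning over keyword lookup is why `get_prompt_template("3")` and `get_prompt_template("tool map")` return the same section in the README examples.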
@@ -0,0 +1,63 @@ src/mcp_airflow_api/functions.py

"""
Utility functions for the Airflow MCP server.
"""
import os

import requests


def airflow_request(method: str, path: str, **kwargs) -> requests.Response:
    """
    Make a Basic Auth request to the Airflow REST API.
    'path' should be relative to AIRFLOW_API_URL (e.g., '/dags', '/pools').
    """
    base_url = os.getenv("AIRFLOW_API_URL", "").rstrip("/")
    if not base_url:
        raise RuntimeError("AIRFLOW_API_URL environment variable is not set")

    # Ensure the path starts with /
    if not path.startswith("/"):
        path = "/" + path

    # Construct the full URL
    full_url = base_url + path

    # Get authentication credentials
    username = os.getenv("AIRFLOW_API_USERNAME")
    password = os.getenv("AIRFLOW_API_PASSWORD")
    if not username or not password:
        raise RuntimeError("AIRFLOW_API_USERNAME or AIRFLOW_API_PASSWORD environment variable is not set")

    auth = (username, password)
    headers = kwargs.pop("headers", {})

    return requests.request(method, full_url, headers=headers, auth=auth, **kwargs)


def read_prompt_template(path: str) -> str:
    """
    Reads the MCP prompt template file and returns its content as a string.
    """
    with open(path, "r", encoding="utf-8") as f:
        return f.read()


def parse_prompt_sections(template: str):
    """
    Parses the prompt template into section headings and sections.
    Returns (headings, sections), with headings[i] corresponding to
    sections[i]; any preamble before the first '## ' heading is skipped
    so the two lists stay aligned.
    """
    headings = []
    sections = []
    current = []
    for line in template.splitlines():
        if line.startswith("## "):
            if headings:
                # Close the previous section; content before the first
                # heading (e.g. the document title) is discarded.
                sections.append("\n".join(current))
            current = []
            headings.append(line[3:].strip())
        current.append(line)
    if headings:
        sections.append("\n".join(current))
    return headings, sections
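The heading-based splitting that `parse_prompt_sections` performs can equivalently be sketched with a multiline regex. This is illustrative, not the packaged implementation:

```python
import re

# Split a markdown template at each line-initial "## " heading; each
# resulting section includes its own heading line. Illustrative only.
template = """## 1. Overview
Intro.

## 2. Tools
- list_dags
"""
sections = [s for s in re.split(r"(?m)^(?=## )", template) if s.strip()]
headings = [s.splitlines()[0][3:].strip() for s in sections]
print(headings)  # ['1. Overview', '2. Tools']
```

The zero-width lookahead keeps the `## ` prefix attached to each section, matching the behavior of the line-by-line parser above.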
@@ -0,0 +1,56 @@
+# MCP Airflow API Prompt Template
+
+## 1. Overview
+
+This MCP server provides natural language tools for managing Apache Airflow clusters via the REST API. All prompts and tool outputs are designed for minimal, LLM-friendly English responses.
+
+## 2. Available MCP Tools
+
+- `list_dags`: List all DAGs in the Airflow cluster.
+- `running_dags`: List all currently running DAG runs.
+- `failed_dags`: List all recently failed DAG runs.
+- `trigger_dag(dag_id)`: Trigger a DAG run by ID.
+- `pause_dag(dag_id)`: Pause a DAG (prevent scheduling).
+- `unpause_dag(dag_id)`: Unpause a DAG (allow scheduling).
+
+## 3. Tool Map
+
+| Tool Name    | Role/Description      | Input Args   | Output Fields |
+|--------------|-----------------------|--------------|---------------|
+| list_dags    | List all DAGs         | None         | dag_id, dag_display_name, is_active, is_paused, owners, tags |
+| running_dags | List running DAG runs | None         | dag_id, run_id, state, execution_date, start_date, end_date |
+| failed_dags  | List failed DAG runs  | None         | dag_id, run_id, state, execution_date, start_date, end_date |
+| trigger_dag  | Trigger a DAG run     | dag_id (str) | dag_id, run_id, state, execution_date, start_date, end_date |
+| pause_dag    | Pause a DAG           | dag_id (str) | dag_id, is_paused |
+| unpause_dag  | Unpause a DAG         | dag_id (str) | dag_id, is_paused |
+
+## 4. Usage Guidelines
+
+- Always use minimal, structured output.
+- All tool invocations must use English for internal reasoning.
+- For user-facing responses, translate to the user's language if needed.
+
+## 5. Example Queries
+
+- "List all DAGs."
+- "Show running DAGs."
+- "Trigger DAG 'example_dag'."
+- "Pause DAG 'etl_job'."
+- "Unpause DAG 'etl_job'."
+
+## 6. Formatting Rules
+
+- Output only the requested fields.
+- No extra explanation unless explicitly requested.
+- Use JSON objects for tool outputs.
+
+## 7. Logging & Environment
+
+- Control the log level via the AIRFLOW_LOG_LEVEL environment variable or the --log-level CLI flag.
+- Supported levels: DEBUG, INFO, WARNING, ERROR, CRITICAL.
+
+## 8. References
+
+- Main MCP tool file: `src/mcp_airflow_api/airflow_api.py`
+- Utility functions: `src/mcp_airflow_api/functions.py`
+- See README.md for full usage and configuration.
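The template's formatting rule "Use JSON objects for tool outputs" can be illustrated with a minimal `pause_dag` response. The field set comes straight from the Tool Map above; the `dag_id` value is a placeholder, not a real DAG.

```python
import json

# Minimal sketch of a pause_dag tool output per the formatting rules:
# only the documented fields (dag_id, is_paused), emitted as JSON.
# The dag_id value here is a placeholder, not a real DAG.
pause_result = {"dag_id": "etl_job", "is_paused": True}
payload = json.dumps(pause_result)
print(payload)  # {"dag_id": "etl_job", "is_paused": true}
```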
@@ -0,0 +1,169 @@
+Metadata-Version: 2.4
+Name: mcp-airflow-api
+Version: 0.0.0
+Summary: Model Context Protocol (MCP) server for Apache Airflow API integration. Provides comprehensive tools for managing Airflow clusters including service operations, configuration management, status monitoring, and request tracking.
+Requires-Python: >=3.11
+Description-Content-Type: text/markdown
+License-File: LICENSE
+Requires-Dist: fastapi>=0.116.1
+Requires-Dist: requests>=2.32.4
+Requires-Dist: uvicorn>=0.35.0
+Requires-Dist: mcp>=1.12.3
+Dynamic: license-file
+
+Model Context Protocol (MCP) server for Apache Airflow API integration.
+This project provides natural language MCP tools for essential Airflow cluster operations.
+
+[](https://github.com/call518/MCP-Airflow-API/actions/workflows/pypi-publish.yml)
+
+[](https://smithery.ai/server/@call518/mcp-airflow-api)
+
+---
+
+
+# MCP-Airflow-API
+
+**Tested and supported Airflow version: 2.10.2 (API Version: v1)**
+
+## Features
+
+- List all DAGs in the Airflow cluster
+- Monitor running/failed DAG runs
+- Trigger DAG runs on demand
+- Minimal, LLM-friendly output for all tools
+- Easy integration with MCP Inspector, OpenWebUI, Smithery, etc.
+
+---
+
+## Available MCP Tools
+
+### DAG Management
+
+- `list_dags`
+  Returns all DAGs registered in the Airflow cluster.
+  Output: `dag_id`, `dag_display_name`, `is_active`, `is_paused`, `owners`, `tags`
+
+- `running_dags`
+  Returns all currently running DAG runs.
+  Output: `dag_id`, `run_id`, `state`, `execution_date`, `start_date`, `end_date`
+
+- `failed_dags`
+  Returns all recently failed DAG runs.
+  Output: `dag_id`, `run_id`, `state`, `execution_date`, `start_date`, `end_date`
+
+- `trigger_dag(dag_id)`
+  Immediately triggers the specified DAG.
+  Output: `dag_id`, `run_id`, `state`, `execution_date`, `start_date`, `end_date`
+
+- `pause_dag(dag_id)`
+  Pauses the specified DAG (prevents scheduling new runs).
+  Output: `dag_id`, `is_paused`
+
+- `unpause_dag(dag_id)`
+  Unpauses the specified DAG (allows scheduling new runs).
+  Output: `dag_id`, `is_paused`
+
+---
+
+## Prompt Template
+
+The package exposes a tool `get_prompt_template` that returns either the entire template, a specific section, or just the headings. Three MCP prompts (`prompt_template_full`, `prompt_template_headings`, `prompt_template_section`) are also registered for discovery.
+
+### MCP Prompts
+
+For easier discoverability in MCP clients (so `prompts/list` is not empty), the server registers three prompts:
+
+- `prompt_template_full` – returns the full canonical template
+- `prompt_template_headings` – returns only the section headings
+- `prompt_template_section` – takes a `section` argument (number or keyword) and returns that section
+
+You can still use the `get_prompt_template` tool for programmatic access or when you prefer tool invocation over prompt retrieval.
+
+A single canonical English prompt template guides safe and efficient tool selection.
+
+Files:
+- Packaged: `src/mcp_airflow_api/prompt_template.md` (distributed with PyPI)
+- An optional workspace-root copy `PROMPT_TEMPLATE.md` may exist for editing; the packaged copy is the one loaded at runtime.
+
+Retrieve dynamically via the MCP tool:
+- `get_prompt_template()` – full template
+- `get_prompt_template("tool map")` – only the tool-mapping section
+- `get_prompt_template("3")` – section 3 (tool map)
+- `get_prompt_template(mode="headings")` – list all section headings
+
+Policy: Only English is stored; the LLM uses the English instructions for internal reasoning regardless of the user's query language, and generates user-facing responses in the user's language when needed.
+
+---
+
+## Main Tool Files
+
+- MCP tool definitions: `src/mcp_airflow_api/airflow_api.py`
+- Utility functions: `src/mcp_airflow_api/functions.py`
+
+---
+
+## How To Use
+
+1. In your MCP Tools environment, configure `mcp-config.json` as follows:
+
+```json
+{
+  "mcpServers": {
+    "airflow-api": {
+      "command": "uvx",
+      "args": ["--python", "3.11", "mcp-airflow-api"],
+      "env": {
+        "AIRFLOW_API_URL": "http://localhost:38080/api/v1",
+        "AIRFLOW_API_USERNAME": "airflow",
+        "AIRFLOW_API_PASSWORD": "airflow",
+        "AIRFLOW_LOG_LEVEL": "INFO"
+      }
+    }
+  }
+}
+```
+
+2. Register the MCP server in MCP Inspector, OpenWebUI, Smithery, etc., and use the tools.
+
+---
+
+## QuickStart (Demo): Running MCP-Airflow-API with Docker
+
+1. Prepare an Airflow cluster
+   - See the [Official Airflow Docker Install Guide](https://airflow.apache.org/docs/apache-airflow/stable/start/docker.html)
+
+2. Prepare the MCP Tools environment
+   - Install Docker and Docker Compose
+   - Clone this project and run `docker-compose up -d` in the root directory
+
+3. Register the MCP server in MCP Inspector/Smithery
+   - Example address: `http://localhost:8000/airflow-api`
+
+---
+
+## Logging & Observability
+
+- Structured logs for all tool invocations and HTTP requests
+- Control the log level via the environment variable (`AIRFLOW_LOG_LEVEL`) or CLI flag (`--log-level`)
+- Supported levels: DEBUG, INFO, WARNING, ERROR, CRITICAL
+
+---
+
+## License
+
+This project is licensed under the MIT License.
+
+---
+
+## Roadmap
+
+This project starts with a minimal set of essential Airflow management tools. More features for Airflow cluster operations will be added soon, including advanced monitoring, DAG/task analytics, scheduling controls, and more. Contributions and suggestions are welcome!
+
+---
+
+## Additional Links
+
+- [Code](https://github.com/call518/MCP-Airflow-API)
+- [Issues](https://github.com/call518/MCP-Airflow-API/issues)
+- [Smithery Deployment](https://smithery.ai/server/@call518/mcp-airflow-api)
+
@@ -0,0 +1,14 @@
+LICENSE
+MANIFEST.in
+README.md
+pyproject.toml
+src/mcp_airflow_api/__init__.py
+src/mcp_airflow_api/airflow_api.py
+src/mcp_airflow_api/functions.py
+src/mcp_airflow_api/prompt_template.md
+src/mcp_airflow_api.egg-info/PKG-INFO
+src/mcp_airflow_api.egg-info/SOURCES.txt
+src/mcp_airflow_api.egg-info/dependency_links.txt
+src/mcp_airflow_api.egg-info/entry_points.txt
+src/mcp_airflow_api.egg-info/requires.txt
+src/mcp_airflow_api.egg-info/top_level.txt
@@ -0,0 +1 @@
+
@@ -0,0 +1 @@
+mcp_airflow_api