postgres_ai 0.1.0__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -0,0 +1,147 @@
+ Metadata-Version: 2.4
+ Name: postgres_ai
+ Version: 0.1.0
+ Summary: Add your description here
+ Requires-Python: >=3.12
+ Description-Content-Type: text/markdown
+ Requires-Dist: asyncpg>=0.31.0
+ Requires-Dist: databases>=0.9.0
+ Requires-Dist: fastapi>=0.131.0
+ Requires-Dist: fastmcp>=3.0.2
+ Requires-Dist: pandas>=3.0.1
+ Requires-Dist: polars>=1.38.1
+ Requires-Dist: typer>=0.24.1
+ Requires-Dist: uvicorn>=0.41.0
+
+ # pg_ai <img src="https://upload.wikimedia.org/wikipedia/commons/a/ad/Logo_PostgreSQL.png" alt="PostgreSQL" width="100">
+ [![GitHub license](https://img.shields.io/github/license/saiprasaad2002/pg_ai)](https://github.com/saiprasaad2002/pg_ai/blob/main/LICENSE)
+ [![GitHub stars](https://img.shields.io/github/stars/saiprasaad2002/pg_ai)](https://github.com/saiprasaad2002/pg_ai/stargazers)
+ [![GitHub issues](https://img.shields.io/github/issues/saiprasaad2002/pg_ai)](https://github.com/saiprasaad2002/pg_ai/issues)
+
+ ## Overview
+
+ pg_ai is an open-source MCP (Model Context Protocol) server tailored for PostgreSQL databases. It bridges LLMs with your Postgres data by letting users define custom "skills" — business logic encapsulated in Markdown files — that guide database interactions. By implementing the SKILL Graph technique, pg_ai enables dynamic, on-demand context expansion, preventing LLM overload while scaling complexity through interconnected skills.
+
+ Inspired by Anthropic's Claude Skills and the SKILL Graph concept (as discussed in [this X thread](https://x.com/arscontexta/status/2023957499183829467?s=20)), pg_ai turns your database into a contextual powerhouse for AI agents.
+
+ ## Key Features
+
+ - **Postgres Integration**: Asynchronous connections for efficient querying.
+ - **Skill Management**: Load custom skills from Markdown files (SKILL.md) in a progressive, token-efficient manner.
+ - **Dynamic Context Growth**: Use the SKILL Graph to link and load skills on demand, building a network of reusable logic.
+ - **MCP Compliance**: Exposes tools for LLMs to read business logic, load specific skills, and execute SQL queries.
+ - **Logging and Configurability**: Environment-based setup with dedicated logging.
+
+ ## Architecture
+
+ pg_ai is built on FastMCP (a FastAPI-based MCP implementation) and uses asyncpg for PostgreSQL interactions. The core components are:
+
+ 1. **Server Setup (app.py)**: Initializes the MCP server with a lifespan hook for database connection management.
+ 2. **PgMCP Class (src/mcp_server/server.py)**: Wraps FastMCP to configure the server with tools and a lifespan.
+ 3. **Postgres Connector (src/connectors/pg_connector.py)**: Handles async connect/disconnect to Postgres via the `databases` library.
+ 4. **Environment Loader (src/loaders/env_loader.py)**: Loads settings from `.env` using Pydantic for validation.
+ 5. **Logger (src/logger/mcp_logger.py)**: Configures file-based logging for server events.
+ 6. **Tools (src/mcp_tools/tools.py)**: Defines MCP tools:
+    - `read_business_logic()`: Loads default business logic from `pg_skills/business-logic/SKILL.md`.
+    - `load_skill(skill_name)`: Dynamically loads a specific skill's instructions from `pg_skills/<skill_name>/SKILL.md`.
+    - `execute_query(sql_query)`: Executes SQL queries on the connected Postgres DB and returns results as a Polars DataFrame.
+
+ The SKILL Graph is realized through interconnected skills: each SKILL.md can reference other skills, letting the LLM traverse the graph by calling `load_skill` as needed. This grows context incrementally, in line with MCP's goal of efficient external-system integration.
+
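The on-demand traversal can be sketched in a few lines. This is a minimal stand-in, not pg_ai's implementation: `expand_context` is a hypothetical helper that scans each skill body for `load_skill('<name>')` mentions and follows them, using a throwaway `pg_skills/` layout built in a temp directory:

```python
import re
import tempfile
from pathlib import Path

def load_skill(root: Path, skill_name: str) -> str:
    # Mirrors the MCP tool: read pg_skills/<skill_name>/SKILL.md.
    return (root / skill_name / "SKILL.md").read_text(encoding="utf-8")

def expand_context(root: Path, start: str) -> dict:
    # Hypothetical graph walk: follow load_skill('<name>') mentions on demand.
    loaded, queue = {}, [start]
    while queue:
        name = queue.pop(0)
        if name in loaded:
            continue
        body = load_skill(root, name)
        loaded[name] = body
        queue.extend(re.findall(r"load_skill\('([\w-]+)'\)", body))
    return loaded

# Demo with a throwaway two-skill graph
root = Path(tempfile.mkdtemp()) / "pg_skills"
skills = {
    "business-logic": "Base rules. For summaries call load_skill('reporting').",
    "reporting": "Reporting guidelines. No further skills needed.",
}
for name, body in skills.items():
    (root / name).mkdir(parents=True)
    (root / name / "SKILL.md").write_text(body, encoding="utf-8")

context = expand_context(root, "business-logic")
print(sorted(context))  # ['business-logic', 'reporting']
```

In the real server the LLM drives this loop itself, calling the `load_skill` tool whenever a loaded skill points it at another one.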
+ ## Installation
+
+ ### Prerequisites
+ - Python >= 3.12
+ - PostgreSQL database
+ - Git
+
+ ### Steps
+ 1. Clone the repository:
+    ```
+    git clone https://github.com/saiprasaad2002/pg_ai.git
+    cd pg_ai
+    ```
+ 2. Create a virtual environment (Python 3.12 recommended):
+    ```
+    uv venv --python 3.12
+    ```
+ 3. Install dependencies:
+    ```
+    uv sync
+    ```
+ 4. Copy and configure the environment file:
+    ```
+    cp .env.example .env
+    ```
+    Edit `.env` with your Postgres credentials and MCP server settings:
+    - `DB_USER`, `DB_PASS`, `DB_HOST`, `DB_PORT`, `DB_NAME`: Postgres connection details.
+    - `MCP_SERVER_HOST`, `MCP_SERVER_PORT`, `MCP_SERVER_TRANSPORT`: Server config (e.g., `streamable-http` for production).
+
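For reference, a minimal `.env` sketch using the variable names listed above (every value here is a placeholder; substitute your own):

```
DB_USER=postgres
DB_PASS=changeme
DB_HOST=localhost
DB_PORT=5432
DB_NAME=mydb
MCP_SERVER_HOST=0.0.0.0
MCP_SERVER_PORT=8000
MCP_SERVER_TRANSPORT=streamable-http
```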
+ 5. Run the server:
+    ```
+    uv run app.py
+    ```
+
+ The server will start, connect to your Postgres DB, and expose MCP endpoints.
+
+ ## Usage
+
+ ### Adding Skills
+ Skills are stored in the `pg_skills/` directory. Each skill is a subfolder containing a `SKILL.md` file.
+
+ - **Structure Example**:
+   ```
+   pg_skills/
+   ├── business-logic/
+   │   └── SKILL.md   # Default business logic (e.g., table schemas, query guidelines)
+   └── custom-skill/
+       └── SKILL.md   # Custom skill instructions (YAML frontmatter + Markdown body)
+   ```
+
+ - **SKILL.md Format** (inspired by Claude Skills):
+   - **YAML Frontmatter**: Minimal metadata (name, description) for progressive disclosure.
+   - **Body**: Detailed instructions, examples, or business logic for the LLM.
+   - Example:
+     ```
+     ---
+     name: inventory-check
+     description: Checks inventory levels in the products table. Use when querying stock.
+     ---
+
+     # Inventory Check Skill
+
+     To check inventory:
+     1. Query the `products` table: SELECT * FROM products WHERE id = {id};
+     2. Analyze stock levels.
+     ...
+     ```
+
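The progressive-disclosure split, frontmatter first and body only when needed, can be sketched with a minimal parser. `split_skill` is a hypothetical helper (not part of pg_ai) that handles only flat `key: value` frontmatter:

```python
def split_skill(markdown: str) -> tuple:
    # Split a SKILL.md into (frontmatter dict, body); naive "key: value" YAML only.
    _, frontmatter, body = markdown.split("---", 2)
    meta = {}
    for line in frontmatter.strip().splitlines():
        key, _, value = line.partition(":")
        meta[key.strip()] = value.strip()
    return meta, body.strip()

skill = """---
name: inventory-check
description: Checks inventory levels in the products table.
---

# Inventory Check Skill
"""
meta, body = split_skill(skill)
print(meta["name"])  # inventory-check
```

A server could hand an LLM just `meta` for every skill and defer `body` to a later `load_skill` call, which is the token-saving idea described above.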
+ Skills can reference other skills (e.g., "Load the 'reporting' skill for summaries"), forming a graph for dynamic loading.
+
+ ### Interacting with the Server
+ - Connect your LLM (e.g., Claude, ChatGPT) via an MCP client.
+ - The LLM can call tools to load skills and execute queries, building context as needed.
+ - Logs are saved in `mcp_logs/pg_ai_log.log`.
+
+ ## Contributing
+
+ Contributions are welcome! Please follow these steps:
+ 1. Fork the repository.
+ 2. Create a feature branch: `git checkout -b feature/new-feature`.
+ 3. Commit changes: `git commit -m 'Add new feature'`.
+ 4. Push to the branch: `git push origin feature/new-feature`.
+ 5. Open a Pull Request.
+
+ ## License
+
+ This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
+
+ ## References
+ - [Model Context Protocol (MCP)](https://modelcontextprotocol.io/docs/getting-started/intro)
+ - [Claude Skills Blog](https://www.anthropic.com/news/skills)
+ - [SKILL Graph Discussion](https://x.com/arscontexta/status/2023957499183829467?s=20)
+
+ For questions, open an issue or contact the maintainer.
+
@@ -0,0 +1,21 @@
+ [project]
+ name = "postgres_ai"
+ version = "0.1.0"
+ description = "Add your description here"
+ readme = "README.md"
+ requires-python = ">=3.12"
+ dependencies = [
+     "asyncpg>=0.31.0",
+     "databases>=0.9.0",
+     "fastapi>=0.131.0",
+     "fastmcp>=3.0.2",
+     "pandas>=3.0.1",
+     "polars>=1.38.1",
+     "typer>=0.24.1",
+     "uvicorn>=0.41.0",
+ ]
+
+ [project.scripts]
+ postgres_ai = "postgres_ai.cli:app"
+
+ [tool.hatch.build.targets.wheel]
+ packages = ["src/postgres_ai"]
@@ -0,0 +1,4 @@
+ [egg_info]
+ tag_build =
+ tag_date = 0
+
File without changes
@@ -0,0 +1,89 @@
+ import asyncio
+ import typer
+ from typing import Optional
+ from postgres_ai.loaders.env_loader import get_mcp_settings
+ from postgres_ai.mcp_server.server import PgMCP
+ from postgres_ai.connectors.pg_connector import PGConnector
+ from postgres_ai.logger.mcp_logger import get_mcp_logger
+ from fastmcp.server.lifespan import lifespan
+
+
+ app = typer.Typer(
+     name="postgres_ai",
+     help="Postgres AI MCP Server — run AI tools directly on your database",
+     add_completion=True,
+     no_args_is_help=True,
+ )
+
+
+ @app.command()
+ def main(
+     env_file: Optional[str] = typer.Option(".env", "--env-file", "-e", help="Path to .env file"),
+     host: str = typer.Option("0.0.0.0", "--host", help="MCP server host"),
+     port: int = typer.Option(8000, "--port", "-p", help="MCP server port"),
+     transport: str = typer.Option("streamable-http", "--transport"),
+     db_host: Optional[str] = typer.Option(None, "--db-host", envvar="DB_HOST"),
+     db_port: Optional[int] = typer.Option(None, "--db-port", envvar="DB_PORT"),
+     db_user: Optional[str] = typer.Option(None, "--db-user", envvar="DB_USER"),
+     db_name: Optional[str] = typer.Option(None, "--db-name", envvar="DB_NAME"),
+     prompt_password: bool = typer.Option(
+         False, "--prompt-password", help="Interactively ask for DB password (overrides .env)"
+     ),
+ ):
+     """Start the postgres_ai MCP Server"""
+     print("🚀 Welcome to postgres_ai MCP Server!")
+
+     try:
+         settings = get_mcp_settings(env_file)
+     except Exception as e:
+         typer.secho(f"❌ Config error: {e}", fg=typer.colors.RED)
+         typer.secho("Tip: Copy .env.example → .env and fill the values", fg=typer.colors.YELLOW)
+         raise typer.Exit(1)
+
+     # Apply CLI overrides on top of the .env settings
+     db_host = db_host or settings.db_host
+     db_port = db_port or settings.db_port
+     db_user = db_user or settings.db_user
+     db_name = db_name or settings.db_name
+
+     # Password handling: prompt when requested or when no password is configured
+     if prompt_password or not getattr(settings, "db_pass", None):
+         db_pass = typer.prompt("Database Password", hide_input=True, confirmation_prompt=True)
+     else:
+         db_pass = settings.db_pass
+
+     logger = get_mcp_logger()
+
+     conn = PGConnector(
+         db_user=db_user,
+         db_host=db_host,
+         db_port=db_port,
+         db_pass=db_pass,
+         db_name=db_name,
+     )
+
+     @lifespan
+     async def mcp_lifespan(server):
+         try:
+             database = await conn.connect()
+             logger.info(f"✅ Connected to Postgres @ {db_host}:{db_port}")
+             yield {"pg_connection": database, "logger": logger}
+         finally:
+             await conn.disconnect()
+             logger.info("Database connection closed")
+
+     server_config = PgMCP(
+         lifespan=mcp_lifespan,
+         server_name="postgres_ai",
+         instructions="MCP Server to handle Postgres database operations with AI tools",
+     )
+     mcp_server = server_config.get_mcp_server()
+
+     typer.secho(f"Connected with Database @ {db_host}:{db_port}", fg=typer.colors.GREEN)
+     typer.secho(f"MCP Server starting @ {host}:{port}", fg=typer.colors.GREEN)
+     asyncio.run(
+         mcp_server.run_async(transport=transport, host=host, port=port)
+     )
+
+
+ if __name__ == "__main__":
+     app()
@@ -0,0 +1,23 @@
+ from databases import Database
+
+
+ class PGConnector:
+     def __init__(self, db_user: str, db_pass: str, db_host: str, db_port: int | str, db_name: str):
+         self.db_user = db_user
+         self.db_pass = db_pass
+         self.db_host = db_host
+         self.db_port = str(db_port)
+         self.db_name = db_name
+         self.DB_URL = f"postgresql+asyncpg://{self.db_user}:{self.db_pass}@{self.db_host}:{self.db_port}/{self.db_name}"
+
+     async def connect(self):
+         try:
+             self.database = Database(self.DB_URL)
+             await self.database.connect()
+             return self.database
+         except Exception as e:
+             raise RuntimeError(f"Failed to connect to Postgres: {e}") from e
+
+     async def disconnect(self):
+         # Only disconnect if connect() ever succeeded in setting self.database.
+         if getattr(self, "database", None) is not None:
+             await self.database.disconnect()
1
+ from pydantic_settings import BaseSettings, SettingsConfigDict
2
+ from functools import lru_cache
3
+ from typing import Optional
4
+
5
+ class MCPSettings(BaseSettings):
6
+ model_config = SettingsConfigDict(
7
+ env_file=".env",
8
+ env_file_encoding="utf-8",
9
+ case_sensitive=False
10
+ )
11
+ db_host: str
12
+ db_port: int
13
+ db_user: str
14
+ db_pass: str
15
+ db_name: str
16
+ mcp_server_host: str
17
+ mcp_server_port: int
18
+ mcp_server_transport: str
19
+
20
+ @lru_cache
21
+ def get_mcp_settings(env_file: Optional[str] = None):
22
+ if env_file and env_file != ".env":
23
+ return MCPSettings(_env_file=env_file)
24
+ return MCPSettings()
@@ -0,0 +1,20 @@
+ import logging
+ from pathlib import Path
+ from functools import lru_cache
+
+ LOG_DIR = Path("mcp_logs")
+ LOG_DIR.mkdir(parents=True, exist_ok=True)
+ LOG_FILE = LOG_DIR / "pg_ai_log.log"
+
+ logging.basicConfig(
+     filename=LOG_FILE,
+     format="{asctime} - {name} - {levelname} - {message}",
+     datefmt="%Y-%m-%d %H:%M:%S",
+     level=logging.INFO,
+     style="{",
+ )
+
+
+ @lru_cache
+ def get_mcp_logger():
+     return logging.getLogger("mcp_logger")
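The `{`-style format string used above can be exercised in isolation. This sketch swaps the file handler for an in-memory stream (and a `mcp_demo` logger name) so the output is easy to inspect:

```python
import io
import logging

# Same format string and datefmt as the module above, but writing to a StringIO.
stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.setFormatter(logging.Formatter(
    "{asctime} - {name} - {levelname} - {message}",
    datefmt="%Y-%m-%d %H:%M:%S",
    style="{",
))
logger = logging.getLogger("mcp_demo")
logger.addHandler(handler)
logger.setLevel(logging.INFO)
logger.info("Connected to Postgres")
print(stream.getvalue().strip())
```

Each record renders as `<timestamp> - mcp_demo - INFO - Connected to Postgres`, matching the lines written to `mcp_logs/pg_ai_log.log` by the real module.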
@@ -0,0 +1,30 @@
+ from fastmcp import FastMCP
+ from postgres_ai.mcp_tools.tools import get_mcp_tools
+ from fastmcp.server.lifespan import Lifespan
+
+
+ class PgMCP:
+     def __init__(
+         self,
+         lifespan: Lifespan,
+         server_name: str | None = None,
+         instructions: str | None = None,
+         version: str | None = None,
+         website_url: str | None = None,
+     ):
+         self.lifespan = lifespan
+         self.server_name = server_name
+         self.instructions = instructions
+         self.version = version
+         self.website_url = website_url
+         tools = get_mcp_tools()
+         self.mcp_tools = tools.mcp_tools
+
+     def get_mcp_server(self):
+         return FastMCP(
+             lifespan=self.lifespan,
+             name=self.server_name,
+             instructions=self.instructions,
+             version=self.version,
+             website_url=self.website_url,
+             tools=self.mcp_tools,
+         )
@@ -0,0 +1,44 @@
+ from pathlib import Path
+ from postgres_ai.types.pg_ai_types import Content, DatabaseResult, MCPTools
+ from fastmcp import Context
+ from sqlalchemy import text
+ import polars as pl
+
+
+ async def read_business_logic():
+     """Load the default business logic from pg_skills/business-logic/SKILL.md."""
+     with open(Path("pg_skills") / "business-logic" / "SKILL.md", encoding="utf-8") as bl:
+         return Content(content=bl.read())
+
+
+ async def load_skill(skill_name: str):
+     """
+     Load the respective skill to get the references for table schema, query instructions, etc.
+     """
+     with open(Path("pg_skills") / skill_name / "SKILL.md", encoding="utf-8") as skill:
+         return Content(content=skill.read())
+
+
+ async def execute_query(sql_query: str, ctx: Context):
+     """
+     Execute the prepared SQL query against the connected Postgres database.
+     """
+     database = ctx.lifespan_context.get("pg_connection")
+     if database is None:
+         return Content(content="No database connection acquired")
+     result = await database.fetch_all(text(sql_query))
+     # Records are mapping-like; convert to plain dicts so Polars can infer columns.
+     dataframe = pl.DataFrame([dict(row) for row in result])
+     return DatabaseResult(sql_query=sql_query, sql_result=dataframe)
+
+
+ def get_mcp_tools():
+     return MCPTools(
+         mcp_tools=[read_business_logic, load_skill, execute_query]
+     )
@@ -0,0 +1,13 @@
+ from pydantic import BaseModel, Field
+ from typing import List, Callable, Any
+
+
+ class Content(BaseModel):
+     content: str = Field(description="The returned skill file content")
+
+
+ class MCPTools(BaseModel):
+     mcp_tools: List[Callable] = Field(description="List of MCP tools used to host the MCP server")
+
+
+ class DatabaseResult(BaseModel):
+     sql_query: str = Field(description="The prepared SQL query")
+     sql_result: Any = Field(description="The result of SQL execution wrapped as a DataFrame")
@@ -0,0 +1,16 @@
+ README.md
+ pyproject.toml
+ src/postgres_ai/__init__.py
+ src/postgres_ai/cli.py
+ src/postgres_ai.egg-info/PKG-INFO
+ src/postgres_ai.egg-info/SOURCES.txt
+ src/postgres_ai.egg-info/dependency_links.txt
+ src/postgres_ai.egg-info/entry_points.txt
+ src/postgres_ai.egg-info/requires.txt
+ src/postgres_ai.egg-info/top_level.txt
+ src/postgres_ai/connectors/pg_connector.py
+ src/postgres_ai/loaders/env_loader.py
+ src/postgres_ai/logger/mcp_logger.py
+ src/postgres_ai/mcp_server/server.py
+ src/postgres_ai/mcp_tools/tools.py
+ src/postgres_ai/types/pg_ai_types.py
@@ -0,0 +1,2 @@
+ [console_scripts]
+ postgres_ai = postgres_ai.cli:app
@@ -0,0 +1,8 @@
+ asyncpg>=0.31.0
+ databases>=0.9.0
+ fastapi>=0.131.0
+ fastmcp>=3.0.2
+ pandas>=3.0.1
+ polars>=1.38.1
+ typer>=0.24.1
+ uvicorn>=0.41.0
@@ -0,0 +1 @@
+ postgres_ai