codegraphcontext 0.1.0__py3-none-any.whl
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- codegraphcontext/__init__.py +2 -0
- codegraphcontext/__main__.py +5 -0
- codegraphcontext/prompts.py +75 -0
- codegraphcontext/server.py +630 -0
- codegraphcontext-0.1.0.dist-info/METADATA +144 -0
- codegraphcontext-0.1.0.dist-info/RECORD +10 -0
- codegraphcontext-0.1.0.dist-info/WHEEL +5 -0
- codegraphcontext-0.1.0.dist-info/entry_points.txt +2 -0
- codegraphcontext-0.1.0.dist-info/licenses/LICENSE +21 -0
- codegraphcontext-0.1.0.dist-info/top_level.txt +1 -0
@@ -0,0 +1,75 @@
# src/codegraphcontext/prompts.py

LLM_SYSTEM_PROMPT = """# AI Pair Programmer Instructions

## 1. Your Role and Goal

You are an expert AI pair programmer. Your primary goal is to help a developer understand, write, and refactor code within their **local project**. Your defining feature is your connection to a local Model Context Protocol (MCP) server, which gives you real-time, accurate information about the codebase.

## 2. Your Core Principles

### Principle I: Ground Your Answers in Fact
**Your CORE DIRECTIVE is to use the provided tools to gather facts from the MCP server *before* answering questions or generating code.** Do not guess. Your value comes from providing contextually-aware, accurate assistance.

### Principle II: Be an Agent, Not Just a Planner
**Your goal is to complete the user's task in the fewest steps possible.**
* If the user's request maps directly to a single tool, **execute that tool immediately.**
* Do not create a multi-step plan for a one-step task. The Standard Operating Procedures (SOPs) below are for complex queries that require reasoning and combining information from multiple tools.

**Example of what NOT to do:**

> **User:** "Start watching the `my-project` folder."
> **Incorrect Plan:**
> 1. Check if `watchdog` is installed.
> 2. Use the `watch_directory` tool on `my-project`.
> 3. Update a todo list.

**Example of the CORRECT, direct action:**

> **User:** "Start watching the `my-project` folder."
> **Correct Action:** Immediately call the `watch_directory` tool.
> ```json
> {
>   "tool_name": "watch_directory",
>   "arguments": { "path": "my-project" }
> }
> ```

## 3. Tool Manifest & Usage

| Tool Name | Purpose & When to Use |
| :--------------------------- | :------------------------------------------------------------------------------------------------------------------------------------ |
| **`find_code`** | **Your primary search tool.** Use this first for almost any query about locating code. |
| **`analyze_code_relationships`** | **Your deep analysis tool.** Use this after locating a specific item. Use query types like `find_callers` or `find_callees`. |
| **`add_code_to_graph`** | **Your indexing tool.** Use this when the user wants to add a new project folder or file to the context. |
| **`add_package_to_graph`** | **Your dependency indexing tool.** Use this to add a `pip` package to the context. |
| **`list_jobs`** & **`check_job_status`** | **Your job monitoring tools.** |
| **`watch_directory`** | **Your live-update tool.** Use this if the user wants to automatically keep the context updated as they work. |
| **`execute_cypher_query`** | **Expert Fallback Tool.** Use this *only* when other tools cannot answer a very specific or complex question about the code graph. Requires knowledge of Cypher. |

## 4. Standard Operating Procedures (SOPs) for Complex Tasks

**Note:** Follow these methodical workflows for **complex requests** that require multiple steps of reasoning or combining information from several tools. For direct commands, refer to Principle II and act immediately.

### SOP-1: Answering "Where is...?" or "How does...?" Questions
1. **Locate:** Use `find_code` to find the relevant code.
2. **Analyze:** Use `analyze_code_relationships` to understand its usage.
3. **Synthesize:** Combine the information into a clear explanation.

### SOP-2: Generating New Code
1. **Find Context:** Use `find_code` to find similar, existing code to match the style.
2. **Find Reusable Code:** Use `find_code` to locate specific helper functions the user wants you to use.
3. **Generate:** Write the code using the correct imports and signatures.

### SOP-3: Refactoring or Analyzing Impact
1. **Identify & Locate:** Use `find_code` to get the canonical path of the item to be changed.
2. **Assess Impact:** Use `analyze_code_relationships` with the `find_callers` query type to find all affected locations.
3. **Report Findings:** Present a clear list of all affected files.

### SOP-4: Using the Cypher Fallback
1. **Attempt Standard Tools:** First, always try to use `find_code` and `analyze_code_relationships`.
2. **Identify Failure:** If the standard tools cannot answer a complex, multi-step relationship query (e.g., "Find all functions that are called by a method in a class that inherits from 'BaseHandler'"), then and only then, resort to the fallback.
3. **Formulate & Execute:** Construct a Cypher query to find the answer and execute it using `execute_cypher_query`.
4. **Present Results:** Explain the results to the user based on the query output.
"""
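The manifest above is what a client receives from `tools/list`; actual invocations arrive as JSON-RPC `tools/call` requests (see the server loop in `server.py` below). As a minimal sketch, a client-side helper might build such an envelope like this — the `find_code` payload is illustrative, following the `inputSchema` in the manifest:

```python
import json

def make_tools_call(request_id, tool_name, arguments):
    """Build a JSON-RPC 2.0 'tools/call' request envelope, MCP-style."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# One request per line is what the stdio transport expects.
request_line = make_tools_call(1, "find_code", {"query": "parse_config"})
```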

@@ -0,0 +1,630 @@
# src/codegraphcontext/server.py
import asyncio
import json
import logging
import importlib
import stdlibs
import sys
import traceback
import os
from datetime import datetime
from pathlib import Path
from neo4j.exceptions import CypherSyntaxError
from dataclasses import asdict
from codegraphcontext.core.database import DatabaseManager

from typing import Any, Dict, Coroutine, Optional

from .prompts import LLM_SYSTEM_PROMPT
from .core.database import DatabaseManager
from .core.jobs import JobManager, JobStatus
from .core.watcher import CodeWatcher
from .tools.graph_builder import GraphBuilder
from .tools.code_finder import CodeFinder
from .tools.import_extractor import ImportExtractor


logger = logging.getLogger(__name__)

def debug_log(message):
    """Write debug message to a file"""
    debug_file = os.path.expanduser("~/mcp_debug.log")
    timestamp = datetime.now().strftime("%Y-%m-%d %H:%M:%S")
    with open(debug_file, "a") as f:
        f.write(f"[{timestamp}] {message}\n")
        f.flush()
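The `debug_log` helper above appends timestamped lines to a fixed file under the user's home directory. A minimal sketch of the same pattern, parameterizing the path (the temp-file location here is an assumption for demonstration only, not what the package uses):

```python
import os
import tempfile
from datetime import datetime

def append_log(path, message):
    """Append one '[YYYY-mm-dd HH:MM:SS] message' line, mirroring debug_log's format."""
    stamp = datetime.now().strftime("%Y-%m-%d %H:%M:%S")
    with open(path, "a") as f:
        f.write(f"[{stamp}] {message}\n")

log_path = os.path.join(tempfile.gettempdir(), "mcp_debug_example.log")
append_log(log_path, "server started")
```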

class MCPServer:
    """Main MCP Server class that orchestrates all components."""

    def __init__(self, loop=None):
        try:
            self.db_manager = DatabaseManager()
            self.db_manager.get_driver()  # Initialize connection early
        except ValueError as e:
            raise ValueError(f"Database configuration error: {e}")

        self.job_manager = JobManager()

        # Get the current event loop to pass to thread-sensitive components
        if loop is None:
            try:
                loop = asyncio.get_running_loop()
            except RuntimeError:
                loop = asyncio.new_event_loop()
                asyncio.set_event_loop(loop)
        self.loop = loop

        # Initialize tool handlers
        self.graph_builder = GraphBuilder(self.db_manager, self.job_manager, loop)
        self.code_finder = CodeFinder(self.db_manager)
        self.import_extractor = ImportExtractor()

        self.code_watcher = CodeWatcher(self.graph_builder)

        self._init_tools()
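The event-loop handling in `__init__` above follows a common fallback pattern: reuse the running loop if one exists, otherwise create and install a fresh one. Isolated as a sketch:

```python
import asyncio

def get_or_create_loop(loop=None):
    """Return the given or running event loop, else create and install a new one,
    mirroring the fallback in MCPServer.__init__."""
    if loop is not None:
        return loop
    try:
        return asyncio.get_running_loop()
    except RuntimeError:  # no loop is running in this thread
        loop = asyncio.new_event_loop()
        asyncio.set_event_loop(loop)
        return loop

loop = get_or_create_loop()
```

Holding a loop reference this way is what later lets synchronous tool handlers schedule coroutines via `asyncio.run_coroutine_threadsafe`.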

    def _init_tools(self):
        """Defines the complete tool manifest for the LLM."""
        self.tools = {
            "add_code_to_graph": {
                "name": "add_code_to_graph",
                "description": "Add code from a local folder to the graph. Returns a job ID for background processing.",
                "inputSchema": {
                    "type": "object",
                    "properties": {
                        "path": {"type": "string", "description": "Path to the directory or file to add."},
                        "is_dependency": {"type": "boolean", "description": "Whether this code is a dependency.", "default": False}
                    },
                    "required": ["path"]
                }
            },
            "check_job_status": {
                "name": "check_job_status",
                "description": "Check the status and progress of a background job.",
                "inputSchema": {
                    "type": "object",
                    "properties": { "job_id": {"type": "string", "description": "Job ID from a previous tool call"} },
                    "required": ["job_id"]
                }
            },
            "list_jobs": {
                "name": "list_jobs",
                "description": "List all background jobs and their current status.",
                "inputSchema": {"type": "object", "properties": {}}
            },
            "find_code": {
                "name": "find_code",
                "description": "Find relevant code snippets related to a keyword (e.g., function name, class name, or content).",
                "inputSchema": {
                    "type": "object",
                    "properties": { "query": {"type": "string", "description": "Keyword or phrase to search for"} },
                    "required": ["query"]
                }
            },
            "analyze_code_relationships": {
                "name": "analyze_code_relationships",
                "description": "Analyze code relationships like 'who calls this function' or 'class hierarchy'.",
                "inputSchema": {
                    "type": "object",
                    "properties": {
                        "query_type": {"type": "string", "description": "Type of relationship query to run."},
                        "target": {"type": "string", "description": "The function, class, or module to analyze."},
                        "context": {"type": "string", "description": "Optional: specific file path for precise results."}
                    },
                    "required": ["query_type", "target"]
                }
            },
            "watch_directory": {
                "name": "watch_directory",
                "description": "Start watching a directory for code changes and automatically update the graph.",
                "inputSchema": {
                    "type": "object",
                    "properties": { "path": {"type": "string", "description": "Path to directory to watch"} },
                    "required": ["path"]
                }
            },
            "execute_cypher_query": {
                "name": "execute_cypher_query",
                "description": "Fallback tool to run a direct, read-only Cypher query against the code graph.",
                "inputSchema": {
                    "type": "object",
                    "properties": { "cypher_query": {"type": "string", "description": "The read-only Cypher query to execute."} },
                    "required": ["cypher_query"]
                }
            },
            "add_package_to_graph": {
                "name": "add_package_to_graph",
                "description": "Add a Python package to the Neo4j graph by discovering its location. Returns immediately with a job ID.",
                "inputSchema": {
                    "type": "object",
                    "properties": {
                        "package_name": {"type": "string", "description": "Name of the Python package to add (e.g., 'requests')"},
                        "is_dependency": {"type": "boolean", "description": "Mark as a dependency", "default": True}
                    },
                    "required": ["package_name"]
                }
            },
            "list_imports": {
                "name": "list_imports",
                "description": "Extract all package imports from code files in a directory or file.",
                "inputSchema": {
                    "type": "object",
                    "properties": {
                        "path": {"type": "string", "description": "Path to file or directory to analyze"},
                        "language": {"type": "string", "description": "Programming language (python, javascript, etc.)", "default": "python"},
                        "recursive": {"type": "boolean", "description": "Whether to analyze subdirectories recursively", "default": True}
                    },
                    "required": ["path"]
                }
            },
            "find_dead_code": {
                "name": "find_dead_code",
                "description": "Find potentially unused functions (dead code) across the entire indexed codebase.",
                "inputSchema": {
                    "type": "object",
                    "properties": {},
                    "additionalProperties": False
                }
            }
            # Additional tools can be added here following the same pattern
        }
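Each manifest entry above pairs a name with a JSON-Schema-style `inputSchema`. The server itself does not validate arguments against these schemas before dispatch; a minimal sketch of the shallow check a caller could apply (a full validator such as the third-party `jsonschema` package would also check types and nesting):

```python
def missing_required(schema, arguments):
    """Return the 'required' property names absent from `arguments` (shallow check only)."""
    return [key for key in schema.get("required", []) if key not in arguments]

# Schema shaped like the 'add_code_to_graph' entry in the manifest above.
schema = {
    "type": "object",
    "properties": {"path": {"type": "string"}},
    "required": ["path"],
}
```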


    def get_database_status(self) -> dict:
        """Get current database connection status"""
        return {"connected": self.db_manager.is_connected()}

    def get_local_package_path(self, package_name: str) -> Optional[str]:
        """Get the local installation path of a Python package"""
        try:
            debug_log(f"Getting local path for package: {package_name}")

            module = importlib.import_module(package_name)

            if hasattr(module, '__file__') and module.__file__:
                module_file = module.__file__
                debug_log(f"Module file: {module_file}")

                # For packages (__init__.py) and single-module files alike,
                # the containing directory is the package path.
                package_path = os.path.dirname(module_file)

                debug_log(f"Package path: {package_path}")
                return package_path

            elif hasattr(module, '__path__'):
                if isinstance(module.__path__, list) and module.__path__:
                    package_path = module.__path__[0]
                    debug_log(f"Package path from __path__: {package_path}")
                    return package_path
                else:
                    package_path = str(module.__path__)
                    debug_log(f"Package path from __path__ (str): {package_path}")
                    return package_path

            debug_log(f"Could not determine path for {package_name}")
            return None

        except ImportError as e:
            debug_log(f"Could not import {package_name}: {e}")
            return None
        except Exception as e:
            debug_log(f"Error getting local path for {package_name}: {e}")
            return None
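The discovery logic above boils down to: import the package, then take the directory of its `__file__`, falling back to `__path__` for namespace packages. A self-contained sketch of the same idea, exercised on the stdlib `json` package:

```python
import importlib
import os

def package_dir(name):
    """Resolve an importable module or package to its on-disk directory,
    following the same __file__/__path__ logic as get_local_package_path."""
    module = importlib.import_module(name)
    if getattr(module, "__file__", None):
        # Works for both packages (__init__.py) and single-module files.
        return os.path.dirname(module.__file__)
    path = getattr(module, "__path__", None)
    if path:  # namespace packages expose __path__ but no __file__
        return list(path)[0]
    return None

json_dir = package_dir("json")  # stdlib 'json' is a package with an __init__.py
```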

    def execute_cypher_query_tool(self, **args) -> Dict[str, Any]:
        """
        Tool to execute a read-only Cypher query.
        This is a powerful tool and includes safety checks to prevent database modification.
        """
        cypher_query = args.get("cypher_query")
        if not cypher_query:
            return {"error": "Cypher query cannot be empty."}

        # Safety check: prevent write operations. The comparison is against the
        # upper-cased query, so the keywords must be upper-case as well.
        forbidden_keywords = ['CREATE', 'MERGE', 'DELETE', 'SET', 'REMOVE', 'DROP', 'CALL APOC']
        query_upper = cypher_query.upper()
        if any(keyword in query_upper for keyword in forbidden_keywords):
            return {
                "error": "This tool only supports read-only queries. Prohibited keywords like CREATE, MERGE, DELETE, SET, etc., are not allowed."
            }

        try:
            debug_log(f"Executing Cypher query: {cypher_query}")
            with self.db_manager.get_driver().session() as session:
                result = session.run(cypher_query)
                # Convert results to a list of dictionaries for JSON serialization
                records = [record.data() for record in result]

            return {
                "success": True,
                "query": cypher_query,
                "record_count": len(records),
                "results": records
            }

        except CypherSyntaxError as e:
            debug_log(f"Cypher syntax error: {str(e)}")
            return {
                "error": "Cypher syntax error.",
                "details": str(e),
                "query": cypher_query
            }
        except Exception as e:
            debug_log(f"Error executing Cypher query: {str(e)}")
            return {
                "error": "An unexpected error occurred while executing the query.",
                "details": str(e)
            }
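Note that the substring-based keyword check above is conservative: it also rejects harmless queries whose identifiers merely contain a keyword (e.g. `reset_cache` contains `SET`). A sketch contrasting it with a word-boundary variant, assuming the same keyword list:

```python
import re

FORBIDDEN = ['CREATE', 'MERGE', 'DELETE', 'SET', 'REMOVE', 'DROP']

def is_read_only_substring(query):
    """The substring check: fast, but flags identifiers that merely contain a keyword."""
    q = query.upper()
    return not any(k in q for k in FORBIDDEN)

def is_read_only_words(query):
    """Word-boundary variant: only whole keywords trigger the rejection."""
    q = query.upper()
    return not any(re.search(rf"\b{k}\b", q) for k in FORBIDDEN)

query = "MATCH (n:Function) WHERE n.name = 'reset_cache' RETURN n"
```

Note the underscore counts as a word character, so `\bSET\b` does not fire inside `RESET_CACHE`. Neither check is a full parser; Neo4j's read-only access modes are the robust option.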

    def find_dead_code_tool(self) -> Dict[str, Any]:
        """Tool to find potentially dead code across the entire project."""
        try:
            debug_log("Finding dead code.")
            # The target argument from the old tool is not needed.
            results = self.code_finder.find_dead_code()

            return {
                "success": True,
                "query_type": "dead_code",
                "results": results
            }
        except Exception as e:
            debug_log(f"Error finding dead code: {str(e)}")
            return {"error": f"Failed to find dead code: {str(e)}"}

    def watch_directory_tool(self, **args) -> Dict[str, Any]:
        """Tool to start watching a directory."""
        path = args.get("path")
        if not path or not Path(path).is_dir():
            return {"error": f"Invalid path provided: {path}. Must be a directory."}

        try:
            initial_scan_result = self.add_code_to_graph_tool(path=path, is_dependency=False)
            if "error" in initial_scan_result:
                return initial_scan_result

            self.code_watcher.watch_directory(path)

            return {
                "success": True,
                "message": f"Initial scan started (Job ID: {initial_scan_result.get('job_id')}). Now watching for live changes in {path}.",
                "instructions": "Changes to .py files in this directory will now be automatically updated in the graph."
            }
        except Exception as e:
            logger.error(f"Failed to start watching directory {path}: {e}")
            return {"error": f"Failed to start watching directory: {str(e)}"}

    def list_imports_tool(self, **args) -> Dict[str, Any]:
        """Tool to list all imports from code files"""
        path = args.get("path")
        language = args.get("language", "python")
        recursive = args.get("recursive", True)
        all_imports = set()
        file_extensions = {
            'python': ['.py'], 'javascript': ['.js', '.jsx', '.mjs'],
            'typescript': ['.ts', '.tsx'], 'java': ['.java'],
        }

        extensions = file_extensions.get(language, ['.py'])
        extract_func = {
            'python': self.import_extractor.extract_python_imports,
            'javascript': self.import_extractor.extract_javascript_imports,
            'typescript': self.import_extractor.extract_javascript_imports,
            'java': self.import_extractor.extract_java_imports,
        }.get(language, self.import_extractor.extract_python_imports)

        try:
            path_obj = Path(path)

            if path_obj.is_file():
                if any(str(path_obj).endswith(ext) for ext in extensions):
                    all_imports.update(extract_func(str(path_obj)))
            elif path_obj.is_dir():
                pattern = "**/*" if recursive else "*"
                for ext in extensions:
                    for file_path in path_obj.glob(f"{pattern}{ext}"):
                        if file_path.is_file():
                            all_imports.update(extract_func(str(file_path)))
            else:
                return {"error": f"Path {path} does not exist"}

            if language == 'python':
                # Filter out stdlib modules for the current Python version
                stdlib_modules = set(stdlibs.module_names)
                all_imports = all_imports - stdlib_modules

            return {
                "imports": sorted(list(all_imports)), "language": language,
                "path": path, "count": len(all_imports)
            }

        except Exception as e:
            return {"error": f"Failed to analyze imports: {str(e)}"}

    def add_code_to_graph_tool(self, **args) -> Dict[str, Any]:
        """Tool to add code to Neo4j graph with background processing"""
        path = args.get("path")
        is_dependency = args.get("is_dependency", False)

        try:
            path_obj = Path(path).resolve()

            if not path_obj.exists():
                return {"error": f"Path {path} does not exist"}

            total_files, estimated_time = self.graph_builder.estimate_processing_time(path_obj)

            job_id = self.job_manager.create_job(str(path_obj), is_dependency)

            self.job_manager.update_job(job_id, total_files=total_files, estimated_duration=estimated_time)

            coro = self.graph_builder.build_graph_from_path_async(
                path_obj, is_dependency, job_id
            )
            asyncio.run_coroutine_threadsafe(coro, self.loop)

            debug_log(f"Started background job {job_id} for path: {str(path_obj)}, is_dependency: {is_dependency}")

            return {
                "success": True, "job_id": job_id,
                "message": f"Background processing started for {str(path_obj)}",
                "estimated_files": total_files,
                "estimated_duration_seconds": round(estimated_time, 2),
                "estimated_duration_human": f"{int(estimated_time // 60)}m {int(estimated_time % 60)}s" if estimated_time >= 60 else f"{int(estimated_time)}s",
                "instructions": f"Use 'check_job_status' with job_id '{job_id}' to monitor progress"
            }

        except Exception as e:
            debug_log(f"Error creating background job: {str(e)}")
            return {"error": f"Failed to start background processing: {str(e)}"}
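The inline `estimated_duration_human` expression above (and its twins in the job-status handlers below) all repeat the same minutes-and-seconds formatting. Extracted as a sketch:

```python
def human_duration(seconds):
    """Format a duration the way the job responses do: '3m 20s' above a minute, else '45s'."""
    if seconds >= 60:
        return f"{int(seconds // 60)}m {int(seconds % 60)}s"
    return f"{int(seconds)}s"
```

Factoring this out would remove four copies of the conditional in `server.py`.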

    def add_package_to_graph_tool(self, **args) -> Dict[str, Any]:
        """Tool to add a Python package to Neo4j graph by auto-discovering its location"""
        package_name = args.get("package_name")
        is_dependency = args.get("is_dependency", True)

        try:
            package_path = self.get_local_package_path(package_name)

            if not package_path:
                return {"error": f"Could not find package '{package_name}'. Make sure it's installed."}

            if not os.path.exists(package_path):
                return {"error": f"Package path '{package_path}' does not exist"}

            path_obj = Path(package_path)

            total_files, estimated_time = self.graph_builder.estimate_processing_time(path_obj)

            job_id = self.job_manager.create_job(package_path, is_dependency)

            self.job_manager.update_job(job_id, total_files=total_files, estimated_duration=estimated_time)

            coro = self.graph_builder.build_graph_from_path_async(
                path_obj, is_dependency, job_id
            )
            asyncio.run_coroutine_threadsafe(coro, self.loop)

            debug_log(f"Started background job {job_id} for package: {package_name} at {package_path}, is_dependency: {is_dependency}")

            return {
                "success": True, "job_id": job_id, "package_name": package_name,
                "discovered_path": package_path,
                "message": f"Background processing started for package '{package_name}'",
                "estimated_files": total_files,
                "estimated_duration_seconds": round(estimated_time, 2),
                "estimated_duration_human": f"{int(estimated_time // 60)}m {int(estimated_time % 60)}s" if estimated_time >= 60 else f"{int(estimated_time)}s",
                "instructions": f"Use 'check_job_status' with job_id '{job_id}' to monitor progress"
            }

        except Exception as e:
            debug_log(f"Error creating background job for package {package_name}: {str(e)}")
            return {"error": f"Failed to start background processing for package '{package_name}': {str(e)}"}

    def check_job_status_tool(self, **args) -> Dict[str, Any]:
        """Tool to check job status"""
        job_id = args.get("job_id")

        try:
            job = self.job_manager.get_job(job_id)

            if not job:
                return {"error": f"Job {job_id} not found"}

            job_dict = asdict(job)

            if job.status == JobStatus.RUNNING:
                if job.estimated_time_remaining:
                    remaining = job.estimated_time_remaining
                    job_dict["estimated_time_remaining_human"] = (
                        f"{int(remaining // 60)}m {int(remaining % 60)}s"
                        if remaining >= 60 else f"{int(remaining)}s"
                    )

                if job.start_time:
                    elapsed = (datetime.now() - job.start_time).total_seconds()
                    job_dict["elapsed_time_human"] = (
                        f"{int(elapsed // 60)}m {int(elapsed % 60)}s"
                        if elapsed >= 60 else f"{int(elapsed)}s"
                    )

            elif job.status == JobStatus.COMPLETED and job.start_time and job.end_time:
                duration = (job.end_time - job.start_time).total_seconds()
                job_dict["actual_duration_human"] = (
                    f"{int(duration // 60)}m {int(duration % 60)}s"
                    if duration >= 60 else f"{int(duration)}s"
                )

            job_dict["start_time"] = job.start_time.strftime("%Y-%m-%d %H:%M:%S")
            if job.end_time:
                job_dict["end_time"] = job.end_time.strftime("%Y-%m-%d %H:%M:%S")

            job_dict["status"] = job.status.value

            return {"success": True, "job": job_dict}

        except Exception as e:
            debug_log(f"Error checking job status: {str(e)}")
            return {"error": f"Failed to check job status: {str(e)}"}

    def list_jobs_tool(self) -> Dict[str, Any]:
        """Tool to list all jobs"""
        try:
            jobs = self.job_manager.list_jobs()

            jobs_data = []
            for job in jobs:
                job_dict = asdict(job)
                job_dict["status"] = job.status.value
                job_dict["start_time"] = job.start_time.strftime("%Y-%m-%d %H:%M:%S")
                if job.end_time:
                    job_dict["end_time"] = job.end_time.strftime("%Y-%m-%d %H:%M:%S")
                jobs_data.append(job_dict)

            jobs_data.sort(key=lambda x: x["start_time"], reverse=True)

            return {"success": True, "jobs": jobs_data, "total_jobs": len(jobs_data)}

        except Exception as e:
            debug_log(f"Error listing jobs: {str(e)}")
            return {"error": f"Failed to list jobs: {str(e)}"}

    def analyze_code_relationships_tool(self, **args) -> Dict[str, Any]:
        """Tool to analyze code relationships"""
        query_type = args.get("query_type")
        target = args.get("target")
        context = args.get("context")

        if not query_type or not target:
            return {
                "error": "Both 'query_type' and 'target' are required",
                "supported_query_types": [
                    "who_calls", "what_calls", "who_imports", "who_modifies",
                    "class_hierarchy", "overrides", "dead_code"
                ]
            }

        try:
            debug_log(f"Analyzing relationships: {query_type} for {target}")
            results = self.code_finder.analyze_code_relationships(query_type, target, context)

            return {
                "success": True, "query_type": query_type, "target": target,
                "context": context, "results": results
            }

        except Exception as e:
            debug_log(f"Error analyzing relationships: {str(e)}")
            return {"error": f"Failed to analyze relationships: {str(e)}"}

    def find_code_tool(self, **args) -> Dict[str, Any]:
        """Tool to find relevant code snippets"""
        query = args.get("query")

        try:
            debug_log(f"Finding code for query: {query}")
            results = self.code_finder.find_related_code(query)

            return {"success": True, "query": query, "results": results}

        except Exception as e:
            debug_log(f"Error finding code: {str(e)}")
            return {"error": f"Failed to find code: {str(e)}"}


    async def handle_tool_call(self, tool_name: str, args: Dict[str, Any]) -> Dict[str, Any]:
        """Routes a tool call to the appropriate handler."""
        tool_map: Dict[str, Coroutine] = {
            "list_imports": self.list_imports_tool,
            "add_package_to_graph": self.add_package_to_graph_tool,
            "find_dead_code": self.find_dead_code_tool,
            "find_code": self.find_code_tool,
            "analyze_code_relationships": self.analyze_code_relationships_tool,
            "watch_directory": self.watch_directory_tool,
            "execute_cypher_query": self.execute_cypher_query_tool,
            "add_code_to_graph": self.add_code_to_graph_tool,
            "check_job_status": self.check_job_status_tool,
            "list_jobs": self.list_jobs_tool
        }
        handler = tool_map.get(tool_name)
        if handler:
            return await asyncio.to_thread(handler, **args)
        else:
            return {"error": f"Unknown tool: {tool_name}"}
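The routing above is a plain dispatch table: look up a synchronous handler by name and run it off the event loop with `asyncio.to_thread` so slow tools do not block the JSON-RPC loop. A self-contained sketch of the same pattern (`find_code_stub` is a hypothetical stand-in for a real handler):

```python
import asyncio

def find_code_stub(**args):
    # Hypothetical synchronous tool handler used only for this demonstration.
    return {"success": True, "query": args.get("query")}

async def dispatch(tool_name, args, tool_map):
    """Route a call the way handle_tool_call does: look up the sync handler
    and execute it in a worker thread via asyncio.to_thread."""
    handler = tool_map.get(tool_name)
    if handler is None:
        return {"error": f"Unknown tool: {tool_name}"}
    return await asyncio.to_thread(handler, **args)

result = asyncio.run(dispatch("find_code", {"query": "auth"}, {"find_code": find_code_stub}))
```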

    async def run(self):
        """Runs the main server loop, listening for JSON-RPC requests."""
        logger.info("MCP Server is running. Waiting for requests...")
        self.code_watcher.start()

        loop = asyncio.get_event_loop()
        while True:
            try:
                line = await loop.run_in_executor(None, sys.stdin.readline)
                if not line:
                    logger.info("Client disconnected (EOF received). Shutting down.")
                    break

                request = json.loads(line.strip())
                method = request.get('method')
                params = request.get('params', {})
                request_id = request.get('id')

                response = {}
                if method == 'initialize':
                    response = {
                        "jsonrpc": "2.0", "id": request_id,
                        "result": {
                            "protocolVersion": "2024-11-05",
                            "serverInfo": {
                                "name": "CodeGraphContext", "version": "0.1.0",
                                "systemPrompt": LLM_SYSTEM_PROMPT
                            },
                            "capabilities": {"tools": {"listTools": True}},
                        }
                    }
                elif method == 'tools/list':
                    # Return the tool manifest defined in _init_tools
                    response = {
                        "jsonrpc": "2.0", "id": request_id,
                        "result": {"tools": list(self.tools.values())}
                    }
                elif method == 'tools/call':
                    tool_name = params.get('name')
                    args = params.get('arguments', {})
                    result = await self.handle_tool_call(tool_name, args)
                    response = {
                        "jsonrpc": "2.0", "id": request_id,
                        "result": {"content": [{"type": "text", "text": json.dumps(result, indent=2)}]}
                    }
                else:
                    response = {
                        "jsonrpc": "2.0", "id": request_id,
                        "error": {"code": -32601, "message": f"Method not found: {method}"}
                    }

                print(json.dumps(response), flush=True)

            except Exception as e:
                logger.error(f"Error processing request: {e}\n{traceback.format_exc()}")
                error_response = {
                    "jsonrpc": "2.0", "id": request.get('id') if 'request' in locals() else None,
                    "error": {"code": -32603, "message": f"Internal error: {str(e)}", "data": traceback.format_exc()}
                }
                print(json.dumps(error_response), flush=True)
|
|
626
|
+
|
|
627
|
+
def shutdown(self):
|
|
628
|
+
logger.info("Shutting down server...")
|
|
629
|
+
self.code_watcher.stop()
|
|
630
|
+
self.db_manager.close_driver()
|
|
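The request loop in server.py speaks newline-delimited JSON-RPC 2.0 over stdio: one JSON object per line in, one per line out. As a rough sketch of how a client would frame requests for it (the `frame` helper and the example `find_code` query are hypothetical illustrations, not part of the package):

```python
import json

def frame(method, params=None, request_id=1):
    """Hypothetical helper: build one newline-terminated JSON-RPC 2.0 request,
    matching what the server's loop reads via sys.stdin.readline()."""
    msg = {"jsonrpc": "2.0", "id": request_id, "method": method}
    if params is not None:
        msg["params"] = params
    return json.dumps(msg) + "\n"

# The three methods the loop dispatches on, framed as the server would read them:
framed = [
    frame("initialize"),
    frame("tools/list", request_id=2),
    frame("tools/call",
          {"name": "find_code", "arguments": {"query": "database connection"}},
          request_id=3),
]

for line in framed:
    decoded = json.loads(line.strip())  # mirrors the server's json.loads(line.strip())
    assert decoded["jsonrpc"] == "2.0"
```

Any other method name falls through to the `else` branch and gets the JSON-RPC `-32601` "Method not found" error response.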
@@ -0,0 +1,144 @@
+Metadata-Version: 2.4
+Name: codegraphcontext
+Version: 0.1.0
+Summary: An MCP server that indexes local code into a graph database to provide context to AI assistants.
+Author-email: Shashank Shekhar Singh <shashankshekharsingh1205@gmail.com>
+License: MIT License
+
+Copyright (c) 2025
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
+
+Project-URL: Homepage, https://github.com/Shashankss1205/CodeGraphContext
+Project-URL: Bug Tracker, https://github.com/Shashankss1205/CodeGraphContext/issues
+Classifier: Programming Language :: Python :: 3
+Classifier: License :: OSI Approved :: MIT License
+Classifier: Operating System :: OS Independent
+Classifier: Development Status :: 3 - Alpha
+Classifier: Intended Audience :: Developers
+Classifier: Topic :: Software Development :: Libraries :: Application Frameworks
+Requires-Python: >=3.8
+Description-Content-Type: text/markdown
+License-File: LICENSE
+Requires-Dist: neo4j>=5.15.0
+Requires-Dist: watchdog>=3.0.0
+Requires-Dist: requests>=2.31.0
+Requires-Dist: stdlibs>=2023.11.18
+Requires-Dist: typer[all]>=0.9.0
+Requires-Dist: rich>=13.7.0
+Requires-Dist: inquirerpy>=0.3.4
+Requires-Dist: python-dotenv>=1.0.0
+Provides-Extra: dev
+Requires-Dist: pytest>=7.4.0; extra == "dev"
+Requires-Dist: black>=23.11.0; extra == "dev"
+Dynamic: license-file
+
+# CodeGraphContext
+
+An MCP server that indexes local code into a graph database to provide context to AI assistants.
+
+## Features
+
+- **Code Indexing:** Analyzes Python code and builds a knowledge graph of its components.
+- **Relationship Analysis:** Query for callers, callees, class hierarchies, and more.
+- **Live Updates:** Watches local files for changes and automatically updates the graph.
+- **Interactive Setup:** A user-friendly command-line wizard for easy setup.
+
+## Getting Started
+
+1. **Install:** `pip install codegraphcontext`
+2. **Setup:** `cgc setup`
+3. **Start:** `cgc start`
+4. **Index Code:** `cgc tool add-code-to-graph '{"path": "/path/to/your/project"}'`
+
+## MCP Client Configuration
+
+Add the following to your MCP client's configuration:
+
+```json
+{
+  "mcpServers": {
+    "CodeGraphContext": {
+      "command": "cgc",
+      "args": [
+        "start"
+      ],
+      "env": {
+        "NEO4J_URI": "************",
+        "NEO4J_USER": "************",
+        "NEO4J_PASSWORD": "**************"
+      },
+      "tools": {
+        "alwaysAllow": [
+          "list_imports",
+          "add_code_to_graph",
+          "add_package_to_graph",
+          "check_job_status",
+          "list_jobs",
+          "find_code",
+          "analyze_code_relationships",
+          "watch_directory",
+          "find_dead_code",
+          "execute_cypher_query"
+        ],
+        "disabled": false
+      },
+      "disabled": false,
+      "alwaysAllow": []
+    }
+  }
+}
+```
+
+## Natural Language Interaction Examples
+
+Once the server is running, you can interact with it through your AI assistant using plain English. Here are some examples of what you can say:
+
+### Indexing and Watching Files
+
+- **To index a new project:**
+  - "Please index the code in the `/path/to/my-project` directory."
+  OR
+  - "Add the project at `~/dev/my-other-project` to the code graph."
+
+- **To start watching a directory for live changes:**
+  - "Watch the `/path/to/my-active-project` directory for changes."
+  OR
+  - "Keep the code graph updated for the project I'm working on at `~/dev/main-app`."
+
+### Querying and Understanding Code
+
+- **Finding where code is defined:**
+  - "Where is the `process_payment` function?"
+  - "Find the `User` class for me."
+  - "Show me any code related to 'database connection'."
+
+- **Analyzing relationships and impact:**
+  - "What other functions call the `get_user_by_id` function?"
+  - "If I change the `calculate_tax` function, what other parts of the code will be affected?"
+  - "Show me the inheritance hierarchy for the `BaseController` class."
+  - "What methods does the `Order` class have?"
+
+- **Exploring dependencies:**
+  - "Which files import the `requests` library?"
+  - "Find all implementations of the `render` method."
+
+- **Code Quality and Maintenance:**
+  - "Is there any dead or unused code in this project?"
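The MCP client configuration above injects the Neo4j connection details through three environment variables whose values are masked here. As a hedged sketch of how such settings would typically be consumed server-side (the key names come from the config block; the `bolt://localhost:7687` and `neo4j` fallbacks are conventional Neo4j defaults assumed for illustration, not taken from the package):

```python
import os

def neo4j_settings(env=None):
    """Collect the connection settings injected via the MCP client config.
    Key names match the config's "env" block; the defaults are assumptions."""
    env = os.environ if env is None else env
    return {
        "uri": env.get("NEO4J_URI", "bolt://localhost:7687"),
        "user": env.get("NEO4J_USER", "neo4j"),
        "password": env.get("NEO4J_PASSWORD", ""),
    }

# Example with illustrative placeholder values:
settings = neo4j_settings({"NEO4J_URI": "bolt://db:7687",
                           "NEO4J_USER": "neo4j",
                           "NEO4J_PASSWORD": "secret"})
```

Keeping the credentials in the client's `env` block (or a `.env` file, given the `python-dotenv` dependency) avoids hard-coding them in the indexed project itself.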
@@ -0,0 +1,10 @@
+codegraphcontext/__init__.py,sha256=mFY5raGZpG90gG6E8NH0R85mDl3Ikak2HQHOiwCXl-I,77
+codegraphcontext/__main__.py,sha256=21QjL_lfd6cy-clJYPBg6xKb_IuNDcvvWdaAJEvZ-HQ,99
+codegraphcontext/prompts.py,sha256=gu0T_N7QwTIil1WhRiO1eI4KUCeKfDiztH9MvcgsV24,5037
+codegraphcontext/server.py,sha256=pZ8QumE95M24tO-kcQyKmEMVXgE3Zg-tLmhiucGKDdc,28364
+codegraphcontext-0.1.0.dist-info/licenses/LICENSE,sha256=Btzdu2kIoMbdSp6OyCLupB1aRgpTCJ_szMimgEnpkkE,1056
+codegraphcontext-0.1.0.dist-info/METADATA,sha256=Ad7HCv7wm3XALyDXU3UALqDlvHkcPWWNCjoWc_2_Bx4,5352
+codegraphcontext-0.1.0.dist-info/WHEEL,sha256=_zCd3N1l69ArxyTb8rzEoP9TpbYXkqRFSNOD5OuxnTs,91
+codegraphcontext-0.1.0.dist-info/entry_points.txt,sha256=74LVRivtqMnlXdAamQTLkkxUp12zVAmDRFbAV4kArss,54
+codegraphcontext-0.1.0.dist-info/top_level.txt,sha256=CBgc6LAPZIO5FS0nSYYkylDifHsZTIqw3Gf5UwDxeGI,17
+codegraphcontext-0.1.0.dist-info/RECORD,,
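Each RECORD row above has the form `path,hash,size`, where the hash is the urlsafe-base64 SHA-256 digest of the file with `=` padding stripped, per the wheel RECORD convention (PEP 376/PEP 427). A minimal sketch of how such an entry is derived (the sample bytes are illustrative):

```python
import base64
import hashlib

def record_hash(data: bytes) -> str:
    """Wheel RECORD hash: urlsafe base64 of the SHA-256 digest, padding stripped."""
    digest = hashlib.sha256(data).digest()
    return "sha256=" + base64.urlsafe_b64encode(digest).rstrip(b"=").decode()

# top_level.txt in this wheel holds "codegraphcontext" plus a newline,
# consistent with the 17 in the size column of its RECORD row.
entry = record_hash(b"codegraphcontext\n")
```

The RECORD file itself carries no hash or size for its own row, which is why it ends in the two trailing commas above.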
@@ -0,0 +1,21 @@
+MIT License
+
+Copyright (c) 2025
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
@@ -0,0 +1 @@
+codegraphcontext