iflow-mcp_kandrwmrtn-cplusplus_mcp 0.1.0__py3-none-any.whl

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
iflow_mcp_kandrwmrtn_cplusplus_mcp-0.1.0.dist-info/METADATA ADDED
@@ -0,0 +1,222 @@
1
+ Metadata-Version: 2.4
2
+ Name: iflow-mcp_kandrwmrtn-cplusplus_mcp
3
+ Version: 0.1.0
4
+ Summary: An MCP server for analyzing C++ codebases using libclang
5
+ License-File: LICENSE
6
+ Requires-Python: >=3.9
7
+ Requires-Dist: libclang>=16.0.0
8
+ Requires-Dist: mcp>=1.0.0
9
+ Description-Content-Type: text/markdown
10
+
11
+ # C++ MCP Server
12
+
13
+ An MCP (Model Context Protocol) server for analyzing C++ codebases using libclang.
14
+
15
+ ## Why Use This?
16
+
17
+ Instead of having Claude grep through your C++ codebase trying to understand the structure, this server provides semantic understanding of your code. Claude can instantly find classes, functions, and their relationships without getting lost in thousands of files. It understands C++ syntax, inheritance hierarchies, and call graphs - giving Claude the ability to navigate your codebase like an IDE would.
18
+
19
+ ## Features
20
+
21
+ Context-efficient C++ code analysis (a sample direct tool call is sketched after the list):
22
+ - **search_classes** - Find classes by name pattern
23
+ - **search_functions** - Find functions by name pattern
24
+ - **get_class_info** - Get detailed class information (methods, members, inheritance)
25
+ - **get_function_signature** - Get function signatures and parameters
26
+ - **find_in_file** - Search symbols within specific files
27
+ - **get_class_hierarchy** - Get complete inheritance hierarchy for a class
28
+ - **get_derived_classes** - Find all classes that inherit from a base class
29
+ - **find_callers** - Find all functions that call a specific function
30
+ - **find_callees** - Find all functions called by a specific function
31
+ - **get_call_path** - Find call paths from one function to another
32
+
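+ The list above names the tools; to make the interface concrete, here is a minimal sketch of calling one of them directly with the `mcp` Python SDK over stdio. The `pattern` argument name is an assumption for illustration only; consult the schema returned by `list_tools()` for the real parameter names.
+
+ ```python
+ # Illustrative MCP client sketch (not part of this package): spawn the server
+ # over stdio and invoke one of the tools listed above.
+ import asyncio
+ from mcp import ClientSession, StdioServerParameters
+ from mcp.client.stdio import stdio_client
+
+ async def main():
+     # Assumes this runs from the repository root so mcp_server is importable,
+     # mirroring the PYTHONPATH / cwd settings shown in the configs below.
+     server = StdioServerParameters(command="python", args=["-m", "mcp_server.cpp_mcp_server"])
+     async with stdio_client(server) as (read, write):
+         async with ClientSession(read, write) as session:
+             await session.initialize()
+             tools = await session.list_tools()
+             print([tool.name for tool in tools.tools])
+             # "pattern" is an assumed argument name; check the tool schema above
+             result = await session.call_tool("search_classes", {"pattern": "Actor"})
+             print(result.content)
+
+ asyncio.run(main())
+ ```
+
+ In normal use Claude (or another MCP client) issues these calls for you; the sketch only shows the shape of the tool surface.
+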
33
+ ## Prerequisites
34
+
35
+ - Python 3.9 or higher
36
+ - pip (Python package manager)
37
+ - Git (for cloning the repository)
38
+ - LLVM's libclang (the setup scripts will attempt to download a portable build)
39
+
40
+ ## Setup
41
+
42
+ 1. Clone the repository:
43
+ ```bash
44
+ git clone <repository-url>
45
+ cd CPlusPlus-MCP-Server
46
+ ```
47
+
48
+ 2. Run the setup script for your platform (this creates a virtual environment, installs dependencies, and fetches libclang if possible):
49
+ - **Windows**
50
+ ```bash
51
+ server_setup.bat
52
+ ```
53
+ - **Linux/macOS**
54
+ ```bash
55
+ ./server_setup.sh
56
+ ```
57
+
58
+ 3. Test the installation (recommended):
59
+ ```bash
60
+ # Activate the virtual environment first (Windows shown; on Linux/macOS: source mcp_env/bin/activate)
61
+ mcp_env\Scripts\activate
62
+
63
+ # Run the installation test
64
+ python scripts\test_installation.py
65
+ ```
66
+
67
+ This will verify that all components are properly installed and working. The test script lives at `scripts/test_installation.py`.
68
+
69
+ ## Configuring Claude Code
70
+
71
+ To use this MCP server with Claude Code, you need to add it to your Claude configuration file.
72
+
73
+ 1. Find and open your Claude configuration file. Common locations include:
74
+ ```
75
+ C:\Users\<YourUsername>\.claude.json
76
+ C:\Users\<YourUsername>\AppData\Roaming\Claude\.claude.json
77
+ %APPDATA%\Claude\.claude.json
78
+ ```
79
+
80
+ The exact location may vary depending on your Claude installation.
81
+
82
+ 2. Add the C++ MCP server to the `mcpServers` section:
83
+ ```json
84
+ {
85
+ "mcpServers": {
86
+ "cpp-analyzer": {
87
+ "command": "python",
88
+ "args": [
89
+ "-m",
90
+ "mcp_server.cpp_mcp_server"
91
+ ],
92
+ "cwd": "YOUR_INSTALLATION_PATH_HERE",
93
+ "env": {
94
+ "PYTHONPATH": "YOUR_INSTALLATION_PATH_HERE"
95
+ }
96
+ }
97
+ }
98
+ }
99
+ ```
100
+
101
+ **IMPORTANT:** Replace `YOUR_INSTALLATION_PATH_HERE` with the actual path where you cloned this repository.
102
+
103
+ 3. Restart Claude Desktop for the changes to take effect.
104
+
105
+ ## Configuring Codex CLI
106
+
107
+ To use this MCP server inside the OpenAI Codex CLI:
108
+
109
+ 1. Make sure the virtual environment is created (see setup above).
110
+ 2. Create a `.mcp.json` file in the project you open with Codex. The CLI reads this file to discover MCP servers.
111
+ 3. Add an entry that points to the Python module inside the virtual environment. Replace `YOUR_REPO_PATH` with the absolute path to this repository.
112
+
113
+ ```json
114
+ {
115
+ "mcpServers": {
116
+ "cpp-analyzer": {
117
+ "type": "stdio",
118
+ "command": "YOUR_REPO_PATH/mcp_env/bin/python",
119
+ "args": [
120
+ "-m",
121
+ "mcp_server.cpp_mcp_server"
122
+ ],
123
+ "env": {
124
+ "PYTHONPATH": "YOUR_REPO_PATH"
125
+ }
126
+ }
127
+ }
128
+ }
129
+ ```
130
+
131
+ On Windows, change `command` to `YOUR_REPO_PATH\\mcp_env\\Scripts\\python.exe`.
132
+
133
+ 4. Restart the Codex CLI (or run `codex reload`) so it picks up the new server definition.
134
+ 5. Inside Codex, use the MCP palette or prompt instructions (for example, "use the cpp-analyzer tool to set the project directory to ...") to start indexing your C++ project.
135
+
136
+ If you keep the `.mcp.json` file inside this repository you can also add a `"cwd": "YOUR_REPO_PATH"` entry so Codex launches the server from the correct directory.
137
+
138
+ ## Usage with Claude
139
+
140
+ Once configured, you can use the C++ analyzer in your conversations with Claude:
141
+
142
+ 1. First, ask Claude to set your project directory using the MCP tool:
143
+ ```
144
+ "Use the cpp-analyzer tool to set the project directory to C:\path\to\your\cpp\project"
145
+ ```
146
+
147
+ **Note:** The initial indexing might take a long time for very large projects (several minutes for codebases with thousands of files). The server will cache the results for faster subsequent queries.
148
+
149
+ 2. Then you can ask questions like:
150
+ - "Find all classes containing 'Actor'"
151
+ - "Show me the Component class details"
152
+ - "What's the signature of BeginPlay function?"
153
+ - "Search for physics-related functions"
154
+ - "Show me the inheritance hierarchy for GameObject"
155
+ - "Find all functions that call Update()"
156
+ - "What functions does Render() call?"
157
+
158
+ ## Architecture
159
+
160
+ - Uses libclang for accurate C++ parsing (a minimal parsing sketch follows this list)
161
+ - Caches parsed AST for improved performance
162
+ - Supports incremental analysis and project-wide search
163
+ - Provides detailed symbol information including:
164
+ - Function signatures with parameter types and names
165
+ - Class members, methods, and inheritance
166
+ - Call graph analysis for understanding code flow
167
+ - File locations for easy navigation
168
+
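+ For a concrete picture of the underlying approach, the snippet below is a minimal, self-contained libclang traversal. It is a sketch only, not the server's implementation, and it assumes `clang.cindex` can locate a libclang build on your system.
+
+ ```python
+ # Illustrative libclang sketch: parse one translation unit and print the
+ # classes and functions it declares. The server layers indexing, caching and
+ # call-graph analysis on top of traversals like this.
+ from clang import cindex
+
+ index = cindex.Index.create()
+ tu = index.parse("example.cpp", args=["-std=c++17"])  # any C++ source file
+
+ def walk(cursor):
+     if cursor.kind == cindex.CursorKind.CLASS_DECL:
+         print(f"class    {cursor.spelling} @ line {cursor.location.line}")
+     elif cursor.kind in (cindex.CursorKind.FUNCTION_DECL, cindex.CursorKind.CXX_METHOD):
+         print(f"function {cursor.displayname} @ line {cursor.location.line}")
+     for child in cursor.get_children():
+         walk(child)
+
+ walk(tu.cursor)
+ ```
+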
169
+ ## Configuration Options
170
+
171
+ The server behavior can be configured via `cpp-analyzer-config.json`:
172
+
173
+ ```json
174
+ {
175
+ "exclude_directories": [".git", ".svn", "node_modules", "build", "Build"],
176
+ "exclude_patterns": ["*.generated.h", "*.generated.cpp", "*_test.cpp"],
177
+ "dependency_directories": ["vcpkg_installed", "third_party", "external"],
178
+ "include_dependencies": true,
179
+ "max_file_size_mb": 10
180
+ }
181
+ ```
182
+
183
+ - **exclude_directories**: Directories to skip during project scanning
184
+ - **exclude_patterns**: File patterns to exclude from analysis
185
+ - **dependency_directories**: Directories containing third-party dependencies
186
+ - **include_dependencies**: Whether to analyze files in dependency directories
187
+ - **max_file_size_mb**: Maximum file size to analyze (larger files are skipped); a filtering sketch follows this list
188
+
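+ To illustrate how these options might be applied during a project scan, here is a rough sketch; the actual filtering lives in the server (see `mcp_server/file_scanner.py`) and may differ in detail.
+
+ ```python
+ # Sketch of config-driven file filtering, using the option names from
+ # cpp-analyzer-config.json above. Illustrative only.
+ import fnmatch
+ import json
+ from pathlib import Path
+
+ config = json.loads(Path("cpp-analyzer-config.json").read_text())
+
+ def should_index(path: Path) -> bool:
+     parts = set(path.parts)
+     if parts & set(config["exclude_directories"]):
+         return False
+     if not config["include_dependencies"] and parts & set(config["dependency_directories"]):
+         return False
+     if any(fnmatch.fnmatch(path.name, pat) for pat in config["exclude_patterns"]):
+         return False
+     return path.stat().st_size <= config["max_file_size_mb"] * 1024 * 1024
+ ```
+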
189
+ ## Troubleshooting
190
+
191
+ ### Common Issues
192
+
193
+ 1. **"libclang not found" error**
194
+ - Run `server_setup.bat` (Windows) or `./server_setup.sh` (Linux/macOS) to let the project download libclang automatically
195
+ - If automatic download fails, manually download libclang (a quick load-check snippet appears at the end of this section):
196
+ 1. Go to: https://github.com/llvm/llvm-project/releases
197
+ 2. Download the appropriate file for your system:
198
+ - **Windows**: `clang+llvm-*-x86_64-pc-windows-msvc.tar.xz`
199
+ - **macOS**: `clang+llvm-*-x86_64-apple-darwin.tar.xz`
200
+ - **Linux**: `clang+llvm-*-x86_64-linux-gnu-ubuntu-*.tar.xz`
201
+ 3. Extract and copy the libclang library to the appropriate location:
202
+ - **Windows**: Copy `bin\libclang.dll` to `lib\windows\libclang.dll`
203
+ - **macOS**: Copy `lib/libclang.dylib` to `lib/macos/libclang.dylib`
204
+ - **Linux**: Copy `lib/libclang.so.*` to `lib/linux/libclang.so`
205
+
206
+ 2. **Server fails to start**
207
+ - Check that Python 3.9+ is installed: `python --version`
208
+ - Verify all dependencies are installed: `pip install -r requirements.txt`
209
+ - Run the installation test to identify issues:
210
+ ```bash
211
+ mcp_env\Scripts\activate
212
+ python scripts\test_installation.py
213
+ ```
214
+
215
+ 3. **Claude doesn't recognize the server**
216
+ - Ensure the paths in `.claude.json` are absolute paths
217
+ - Restart Claude Desktop after modifying the configuration
218
+
219
+ 4. **Claude uses grep/glob instead of the C++ analyzer**
220
+ - Be explicit in prompts: Say "use the cpp-analyzer to..." when asking about C++ code
221
+ - Add instructions to your project's `CLAUDE.md` file telling Claude to prefer the cpp-analyzer for C++ symbol searches
222
+ - The cpp-analyzer is much faster than grep for finding classes, functions, and understanding code structure
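+
+ If you installed libclang manually (see issue 1 above), the short check below verifies that Python can load the copied library. The path is illustrative; point it at the `lib/<platform>` file you copied.
+
+ ```python
+ # Quick sanity check for a manually installed libclang. Adjust the path for
+ # your platform: lib/windows/libclang.dll, lib/macos/libclang.dylib or
+ # lib/linux/libclang.so.
+ from clang.cindex import Config, Index
+
+ Config.set_library_file("lib/linux/libclang.so")
+ Index.create()
+ print("libclang loaded successfully")
+ ```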
iflow_mcp_kandrwmrtn_cplusplus_mcp-0.1.0.dist-info/RECORD ADDED
@@ -0,0 +1,14 @@
1
+ mcp_server/__init__.py,sha256=rb4OYhZrwqDG7frb5y0-Ath_IfmEZZH-G1W52xqurWg,20
2
+ mcp_server/cache_manager.py,sha256=vh27N5oikrBvDcvqU0G6hrdiT-dEa8H50irJcHNtEH8,8046
3
+ mcp_server/call_graph.py,sha256=k7lu3K5DVrc0cNU4L4K-Cbt4TawJLQXMr2D6sUgPX9E,4564
4
+ mcp_server/cpp_analyzer.py,sha256=3uIKJhqpoToc4jWXmNCyZnjpDEcoZcbz9emDOZJ3DIM,44837
5
+ mcp_server/cpp_analyzer_config.py,sha256=qQXqQ4O-b26X0YWZMGYqxsnmSBDbd3gbNC49R4XiFVE,4054
6
+ mcp_server/cpp_mcp_server.py,sha256=E0Gxq4mqZDsYqHS1qcU44zG5I6kw3cQPYeh8PZ9U-ps,70383
7
+ mcp_server/file_scanner.py,sha256=N2i1s1eV31OO4bRRFQezLtTOcghM-iL7koCKNPEzVo8,3457
8
+ mcp_server/search_engine.py,sha256=Kk4upwhOvpp9ejqix7OR9EZRZX75mpEVBtDRZpV5Bn0,5337
9
+ mcp_server/symbol_info.py,sha256=jcSVfs3RY9USThwrWZM4LN0DPUKfQx4clcBpZ_XE2NM,1479
10
+ iflow_mcp_kandrwmrtn_cplusplus_mcp-0.1.0.dist-info/METADATA,sha256=KOEY3oRSMsRlp_QmqyWc6VR4accd6mKegFvzsDBGvdM,8424
11
+ iflow_mcp_kandrwmrtn_cplusplus_mcp-0.1.0.dist-info/WHEEL,sha256=WLgqFyCfm_KASv4WHyYy0P3pM_m7J5L9k2skdKLirC8,87
12
+ iflow_mcp_kandrwmrtn_cplusplus_mcp-0.1.0.dist-info/entry_points.txt,sha256=hBHf2A-CyZD5JOYDPDzNlixe3tSO80uO8JpbtsacsK0,70
13
+ iflow_mcp_kandrwmrtn_cplusplus_mcp-0.1.0.dist-info/licenses/LICENSE,sha256=btWkLC_c3_pzYSBolypn-jdLF1MF5knaY-koLLP__mc,1068
14
+ iflow_mcp_kandrwmrtn_cplusplus_mcp-0.1.0.dist-info/RECORD,,
iflow_mcp_kandrwmrtn_cplusplus_mcp-0.1.0.dist-info/WHEEL ADDED
@@ -0,0 +1,4 @@
1
+ Wheel-Version: 1.0
2
+ Generator: hatchling 1.28.0
3
+ Root-Is-Purelib: true
4
+ Tag: py3-none-any
iflow_mcp_kandrwmrtn_cplusplus_mcp-0.1.0.dist-info/entry_points.txt ADDED
@@ -0,0 +1,2 @@
1
+ [console_scripts]
2
+ cpp-mcp-server = mcp_server.cpp_mcp_server:run_main
iflow_mcp_kandrwmrtn_cplusplus_mcp-0.1.0.dist-info/licenses/LICENSE ADDED
@@ -0,0 +1,21 @@
1
+ MIT License
2
+
3
+ Copyright (c) 2025 Chickenrikke
4
+
5
+ Permission is hereby granted, free of charge, to any person obtaining a copy
6
+ of this software and associated documentation files (the "Software"), to deal
7
+ in the Software without restriction, including without limitation the rights
8
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9
+ copies of the Software, and to permit persons to whom the Software is
10
+ furnished to do so, subject to the following conditions:
11
+
12
+ The above copyright notice and this permission notice shall be included in all
13
+ copies or substantial portions of the Software.
14
+
15
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21
+ SOFTWARE.
mcp_server/__init__.py ADDED
@@ -0,0 +1 @@
1
+ # MCP Server Package
mcp_server/cache_manager.py ADDED
@@ -0,0 +1,212 @@
1
+ """Cache management for C++ analyzer."""
2
+
3
+ import json
4
+ import hashlib
5
+ import time
6
+ import os
7
+ import sys
8
+ from pathlib import Path
9
+ from typing import Dict, List, Optional, Any
10
+ from collections import defaultdict
11
+ from .symbol_info import SymbolInfo
12
+
13
+
14
+ class CacheManager:
15
+ """Manages caching for the C++ analyzer."""
16
+
17
+ def __init__(self, project_root: Path):
18
+ self.project_root = project_root
19
+ self.cache_dir = self._get_cache_dir()
20
+ self.cache_dir.mkdir(parents=True, exist_ok=True)
21
+
22
+ def _get_cache_dir(self) -> Path:
23
+ """Get the cache directory for this project"""
24
+ # Use the MCP server directory for cache, not the project being analyzed
25
+ mcp_server_root = Path(__file__).parent.parent # Go up from mcp_server/cache_manager.py to root
26
+ cache_base = mcp_server_root / ".mcp_cache"
27
+
28
+ # Use a hash of the project path to create a unique cache directory
29
+ project_hash = hashlib.md5(str(self.project_root).encode()).hexdigest()[:8]
30
+ cache_dir = cache_base / f"{self.project_root.name}_{project_hash}"
31
+ return cache_dir
32
+
33
+ def get_file_hash(self, file_path: str) -> str:
34
+ """Calculate hash of a file"""
35
+ try:
36
+ with open(file_path, 'rb') as f:
37
+ return hashlib.md5(f.read()).hexdigest()
38
+ except Exception:
39
+ return ""
40
+
41
+ def save_cache(self, class_index: Dict[str, List[SymbolInfo]],
42
+ function_index: Dict[str, List[SymbolInfo]],
43
+ file_hashes: Dict[str, str],
44
+ indexed_file_count: int,
45
+ include_dependencies: bool = False) -> bool:
46
+ """Save indexes to cache file"""
47
+ try:
48
+ cache_file = self.cache_dir / "cache_info.json"
49
+
50
+ # Convert to serializable format
51
+ cache_data = {
52
+ "version": "2.0", # Cache version
53
+ "include_dependencies": include_dependencies,
54
+ "class_index": {},
55
+ "function_index": {},
56
+ "file_hashes": file_hashes,
57
+ "indexed_file_count": indexed_file_count,
58
+ "timestamp": time.time()
59
+ }
60
+
61
+ # Convert class index
62
+ for name, infos in class_index.items():
63
+ cache_data["class_index"][name] = [info.to_dict() for info in infos]
64
+
65
+ # Convert function index
66
+ for name, infos in function_index.items():
67
+ cache_data["function_index"][name] = [info.to_dict() for info in infos]
68
+
69
+ # Save to file
70
+ cache_file.parent.mkdir(parents=True, exist_ok=True)
71
+ with open(cache_file, 'w') as f:
72
+ json.dump(cache_data, f, indent=2)
73
+
74
+ return True
75
+ except Exception as e:
76
+ print(f"Error saving cache: {e}", file=sys.stderr)
77
+ return False
78
+
79
+ def load_cache(self, include_dependencies: bool = False) -> Optional[Dict[str, Any]]:
80
+ """Load cache if it exists and is valid"""
81
+ cache_file = self.cache_dir / "cache_info.json"
82
+
83
+ if not cache_file.exists():
84
+ return None
85
+
86
+ try:
87
+ with open(cache_file, 'r') as f:
88
+ cache_data = json.load(f)
89
+
90
+ # Check cache version
91
+ if cache_data.get("version") != "2.0":
92
+ print("Cache version mismatch, rebuilding...", file=sys.stderr)
93
+ return None
94
+
95
+ # Check if dependencies setting matches
96
+ cached_include_deps = cache_data.get("include_dependencies", False)
97
+ if cached_include_deps != include_dependencies:
98
+ print(f"Cache dependencies setting mismatch (cached={cached_include_deps}, current={include_dependencies})",
99
+ file=sys.stderr)
100
+ return None
101
+
102
+ return cache_data
103
+
104
+ except Exception as e:
105
+ print(f"Error loading cache: {e}", file=sys.stderr)
106
+ return None
107
+
108
+ def get_file_cache_path(self, file_path: str) -> Path:
109
+ """Get the cache file path for a given source file"""
110
+ files_dir = self.cache_dir / "files"
111
+ cache_filename = hashlib.md5(file_path.encode()).hexdigest() + ".json"
112
+ return files_dir / cache_filename
113
+
114
+ def save_file_cache(self, file_path: str, symbols: List[SymbolInfo],
115
+ file_hash: str) -> bool:
116
+ """Save parsed symbols for a single file"""
117
+ try:
118
+ # Create files subdirectory
119
+ files_dir = self.cache_dir / "files"
120
+ files_dir.mkdir(exist_ok=True)
121
+
122
+ # Use hash of file path as cache filename
123
+ cache_file = self.get_file_cache_path(file_path)
124
+
125
+ # Prepare cache data
126
+ cache_data = {
127
+ "file_path": file_path,
128
+ "file_hash": file_hash,
129
+ "timestamp": time.time(),
130
+ "symbols": [s.to_dict() for s in symbols]
131
+ }
132
+
133
+ # Save to file
134
+ with open(cache_file, 'w') as f:
135
+ json.dump(cache_data, f, indent=2)
136
+
137
+ return True
138
+ except Exception:
139
+ # Silently fail for individual file caches
140
+ return False
141
+
142
+ def load_file_cache(self, file_path: str, current_hash: str) -> Optional[List[SymbolInfo]]:
143
+ """Load cached symbols for a file if hash matches"""
144
+ try:
145
+ cache_file = self.get_file_cache_path(file_path)
146
+
147
+ if not cache_file.exists():
148
+ return None
149
+
150
+ with open(cache_file, 'r') as f:
151
+ cache_data = json.load(f)
152
+
153
+ # Check if file hash matches
154
+ if cache_data.get("file_hash") != current_hash:
155
+ return None
156
+
157
+ # Reconstruct SymbolInfo objects
158
+ symbols = []
159
+ for s in cache_data.get("symbols", []):
160
+ symbols.append(SymbolInfo(**s))
161
+
162
+ return symbols
163
+ except Exception:
164
+ return None
165
+
166
+ def remove_file_cache(self, file_path: str) -> bool:
167
+ """Remove cached data for a deleted file"""
168
+ try:
169
+ cache_file = self.get_file_cache_path(file_path)
170
+ if cache_file.exists():
171
+ cache_file.unlink()
172
+ return True
173
+ return False
174
+ except Exception:
175
+ return False
176
+
177
+ def save_progress(self, total_files: int, indexed_files: int,
178
+ failed_files: int, cache_hits: int,
179
+ last_index_time: float, class_count: int,
180
+ function_count: int, status: str = "in_progress"):
181
+ """Save indexing progress"""
182
+ try:
183
+ progress_file = self.cache_dir / "indexing_progress.json"
184
+ progress_data = {
185
+ "project_root": str(self.project_root),
186
+ "total_files": total_files,
187
+ "indexed_files": indexed_files,
188
+ "failed_files": failed_files,
189
+ "cache_hits": cache_hits,
190
+ "last_index_time": last_index_time,
191
+ "timestamp": time.time(),
192
+ "class_count": class_count,
193
+ "function_count": function_count,
194
+ "status": status
195
+ }
196
+
197
+ with open(progress_file, 'w') as f:
198
+ json.dump(progress_data, f, indent=2)
199
+ except Exception:
200
+ pass # Silently fail for progress tracking
201
+
202
+ def load_progress(self) -> Optional[Dict[str, Any]]:
203
+ """Load indexing progress if available"""
204
+ try:
205
+ progress_file = self.cache_dir / "indexing_progress.json"
206
+ if not progress_file.exists():
207
+ return None
208
+
209
+ with open(progress_file, 'r') as f:
210
+ return json.load(f)
211
+ except Exception:
212
+ return None
mcp_server/call_graph.py ADDED
@@ -0,0 +1,108 @@
1
+ """Call graph analysis for C++ code."""
2
+
3
+ import re
4
+ from typing import Dict, List, Set, Optional, Any
5
+ from collections import defaultdict
6
+ from .symbol_info import SymbolInfo
7
+
8
+
9
+ class CallGraphAnalyzer:
10
+ """Manages call graph analysis for C++ code."""
11
+
12
+ def __init__(self):
13
+ self.call_graph: Dict[str, Set[str]] = defaultdict(set) # Function USR -> Set of called USRs
14
+ self.reverse_call_graph: Dict[str, Set[str]] = defaultdict(set) # Function USR -> Set of caller USRs
15
+
16
+ def add_call(self, caller_usr: str, callee_usr: str):
17
+ """Add a function call relationship"""
18
+ if caller_usr and callee_usr:
19
+ self.call_graph[caller_usr].add(callee_usr)
20
+ self.reverse_call_graph[callee_usr].add(caller_usr)
21
+
22
+ def clear(self):
23
+ """Clear all call graph data"""
24
+ self.call_graph.clear()
25
+ self.reverse_call_graph.clear()
26
+
27
+ def remove_symbol(self, usr: str):
28
+ """Remove a symbol from the call graph completely"""
29
+ if usr in self.call_graph:
30
+ # Remove all calls made by this function
31
+ called_functions = self.call_graph[usr].copy()
32
+ for called_usr in called_functions:
33
+ self.reverse_call_graph[called_usr].discard(usr)
34
+ # Clean up empty sets
35
+ if not self.reverse_call_graph[called_usr]:
36
+ del self.reverse_call_graph[called_usr]
37
+ del self.call_graph[usr]
38
+
39
+ if usr in self.reverse_call_graph:
40
+ # Remove all calls to this function
41
+ calling_functions = self.reverse_call_graph[usr].copy()
42
+ for caller_usr in calling_functions:
43
+ self.call_graph[caller_usr].discard(usr)
44
+ # Clean up empty sets
45
+ if not self.call_graph[caller_usr]:
46
+ del self.call_graph[caller_usr]
47
+ del self.reverse_call_graph[usr]
48
+
49
+ def rebuild_from_symbols(self, symbols: List[SymbolInfo]):
50
+ """Rebuild call graph from symbol list"""
51
+ self.clear()
52
+ for symbol in symbols:
53
+ if symbol.usr and symbol.calls:
54
+ for called_usr in symbol.calls:
55
+ self.add_call(symbol.usr, called_usr)
56
+
57
+ def find_callers(self, function_usr: str) -> Set[str]:
58
+ """Find all functions that call the specified function"""
59
+ return self.reverse_call_graph.get(function_usr, set())
60
+
61
+ def find_callees(self, function_usr: str) -> Set[str]:
62
+ """Find all functions called by the specified function"""
63
+ return self.call_graph.get(function_usr, set())
64
+
65
+ def get_call_paths(self, from_usr: str, to_usr: str, max_depth: int = 10) -> List[List[str]]:
66
+ """Find all call paths from one function to another"""
67
+ if from_usr == to_usr:
68
+ return [[from_usr]]
69
+
70
+ if max_depth <= 0:
71
+ return []
72
+
73
+ paths = []
74
+
75
+ # Get direct callees
76
+ callees = self.find_callees(from_usr)
77
+
78
+ for callee in callees:
79
+ if callee == to_usr:
80
+ # Direct call found
81
+ paths.append([from_usr, to_usr])
82
+ else:
83
+ # Recursively search for paths
84
+ sub_paths = self.get_call_paths(callee, to_usr, max_depth - 1)
85
+ for sub_path in sub_paths:
86
+ paths.append([from_usr] + sub_path)
87
+
88
+ return paths
89
+
90
+ def get_call_statistics(self) -> Dict[str, Any]:
91
+ """Get statistics about the call graph"""
92
+ return {
93
+ "total_functions_with_calls": len(self.call_graph),
94
+ "total_functions_being_called": len(self.reverse_call_graph),
95
+ "total_unique_calls": sum(len(calls) for calls in self.call_graph.values()),
96
+ "most_called_functions": self._get_most_called_functions(10),
97
+ "functions_with_most_calls": self._get_functions_with_most_calls(10)
98
+ }
99
+
100
+ def _get_most_called_functions(self, limit: int) -> List[tuple]:
101
+ """Get the most frequently called functions"""
102
+ call_counts = [(usr, len(callers)) for usr, callers in self.reverse_call_graph.items()]
103
+ return sorted(call_counts, key=lambda x: x[1], reverse=True)[:limit]
104
+
105
+ def _get_functions_with_most_calls(self, limit: int) -> List[tuple]:
106
+ """Get functions that make the most calls"""
107
+ call_counts = [(usr, len(callees)) for usr, callees in self.call_graph.items()]
108
+ return sorted(call_counts, key=lambda x: x[1], reverse=True)[:limit]