codewrench 0.1.0__py3-none-any.whl

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -0,0 +1,187 @@
+ Metadata-Version: 2.4
+ Name: codewrench
+ Version: 0.1.0
+ Summary: A multi-language code performance analyser with static analysis and AI-powered fix generation.
+ Author-email: Vishad Jain <vishadjain2304@gmail.com>
+ License: MIT
+ Project-URL: Homepage, https://github.com/vishaddjain/wrench
+ Project-URL: Issues, https://github.com/vishaddjain/wrench/issues
+ Keywords: performance,static-analysis,code-analysis,AI,developer-tools
+ Classifier: Programming Language :: Python :: 3
+ Classifier: License :: OSI Approved :: MIT License
+ Classifier: Operating System :: OS Independent
+ Classifier: Topic :: Software Development :: Quality Assurance
+ Classifier: Topic :: Software Development :: Testing
+ Requires-Python: >=3.10
+ Description-Content-Type: text/markdown
+ Requires-Dist: groq>=1.1.1
+ Requires-Dist: python-dotenv>=1.2.2
+ Requires-Dist: requests>=2.32.5
+ Requires-Dist: tree-sitter>=0.25.2
+ Requires-Dist: tree-sitter-python>=0.25.0
+ Requires-Dist: tree-sitter-javascript>=0.25.0
+ Requires-Dist: tree-sitter-typescript>=0.23.2
+ Requires-Dist: tree-sitter-go>=0.25.0
+ Requires-Dist: tree-sitter-c>=0.24.1
+ Requires-Dist: tree-sitter-cpp>=0.23.4
+
+ # 🔧 wrench
+
+ > Point it at your code. Get back what's slow and how to fix it.
+
+ Wrench is a multi-language performance analyser that combines static analysis with AI-powered explanations. It finds real performance issues in your code — nested loops, inefficient patterns, bad practices — then explains exactly why they're a problem and shows you the fix.
+
+ No cloud, no setup hell, no enterprise pricing. Just run it on a file.
+
+ ---
+
+ ## What it catches
+
+ **High priority**
+ - Nested loops (O(n²) and worse)
+ - Expensive function calls inside loops
+ - Repeated attribute access that should be cached
+ - String concatenation with `+` in loops
+
+ **Medium priority**
+ - List appends inside nested loops
+ - Unnecessary `list(range(n))` creation
+ - Bare `except:` and overly broad `except Exception`
+ - Global variable access inside loops
+ - Mutable default arguments
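The mutable-default and string-concatenation items above can be sketched together with their usual fixes (a minimal standalone illustration, not actual wrench output):

```python
# Anti-pattern: mutable default argument — the default list is created once
# at definition time and shared across every call.
def add_item_bad(item, items=[]):
    items.append(item)
    return items

# Fix: default to None and create a fresh list per call.
def add_item(item, items=None):
    if items is None:
        items = []
    items.append(item)
    return items

# Anti-pattern: += on strings in a loop recopies the whole string each time.
# Fix: collect the parts and join once.
def build_string(words):
    return "".join(words)

add_item_bad("a")
print(add_item_bad("b"))  # ['a', 'b'] — state leaked from the first call
print(add_item("b"))      # ['b']
print(build_string(["a", "b", "c"]))  # abc
```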
+
+ ---
+
+ ## Supported languages
+
+ | Language   | Extension     |
+ |------------|---------------|
+ | Python     | `.py`         |
+ | JavaScript | `.js`         |
+ | TypeScript | `.ts`         |
+ | Go         | `.go`         |
+ | C          | `.c`          |
+ | C++        | `.cpp`, `.cc` |
+
+ ---
+
+ ## Installation
+
+ ```bash
+ git clone https://github.com/vishaddjain/wrench.git
+ cd wrench
+ python -m venv venv
+ source venv/bin/activate   # Mac/Linux
+ venv\Scripts\activate      # Windows
+ pip install -r requirements.txt
+ ```
+
+ Create a `.env` file in the project root:
+
+ ```
+ GROQ_API_KEY=your_key_here
+ ```
+
+ Get a free Groq API key at [console.groq.com](https://console.groq.com).
+
+ ---
+
+ ## Usage
+
+ ```bash
+ python main.py yourfile.py
+ python main.py app.js
+ python main.py main.go
+ python main.py server.cpp
+ ```
+
+ That's it. Wrench detects the language from the file extension automatically.
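Extension-based detection can be sketched as below (wrench's actual logic lives in `parser_engine.py`, which isn't shown here; this mapping is inferred from the supported-languages table, not copied from the source):

```python
import os

# Inferred extension → language mapping (illustrative, not wrench's code).
EXTENSION_MAP = {
    ".py": "python",
    ".js": "javascript",
    ".ts": "typescript",
    ".go": "go",
    ".c": "c",
    ".cpp": "cpp",
    ".cc": "cpp",
}

def detect_language(path):
    _, ext = os.path.splitext(path)
    return EXTENSION_MAP.get(ext)  # None → unsupported file

print(detect_language("server.cpp"))  # cpp
print(detect_language("notes.txt"))   # None
```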
+
+ ### Example output
+
+ ```
+ Nested loop at line 19 — potential O(n²)
+ String concatenation at line 22 — use ''.join() instead
+ Function call 'expensive_function' inside loop at line 25 — consider moving it out
+ Bare except at line 40 — catches everything including system exceptions, be specific
+
+ --- AI Analysis ---
+
+ 1. Nested loop at line 19
+    Problem: Two nested loops over the same data give you O(n²) time complexity.
+    For 1000 items that's 1,000,000 iterations instead of 1,000.
+
+    Fix:
+    # before
+    for i in items:
+        for j in items:
+            process(i, j)
+
+    # after — use itertools or restructure with a dict lookup
+    lookup = {item: process(item) for item in items}
+ ```
+
+ ---
+
+ ## How it works
+
+ ```
+ your file
+       ↓
+ Tree-sitter parses it into a syntax tree
+       ↓
+ IR translator converts to a language-agnostic representation
+       ↓
+ Detectors run static analysis on the IR
+       ↓
+ Findings sent to Groq (Llama 3.3 70B)
+       ↓
+ Plain English explanation + fix
+ ```
+
+ The static analysis layer is deterministic — it either finds a nested loop or it doesn't. No hallucination. The AI layer explains what the detectors already confirmed exists.
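The deterministic layer can be sketched with a stripped-down IR node and nested-loop detector mirroring the visitor shipped in `detectors/base.py` (names simplified for illustration):

```python
class IRNode:
    def __init__(self, node_type, lineno, children=None):
        self.node_type = node_type
        self.lineno = lineno
        self.children = children or []

class NestedLoopDetector:
    def __init__(self):
        self.depth = 0       # current loop nesting depth
        self.warnings = []

    def visit(self, node):
        if node.node_type == "loop":
            self.depth += 1
            if self.depth >= 2:  # a loop inside a loop → deterministic finding
                self.warnings.append(f"Nested loop at line {node.lineno}")
            for child in node.children:
                self.visit(child)
            self.depth -= 1
        else:
            for child in node.children:
                self.visit(child)

# Two loops, one inside the other → exactly one finding, every run.
tree = IRNode("module", 1, [IRNode("loop", 2, [IRNode("loop", 3)])])
d = NestedLoopDetector()
d.visit(tree)
print(d.warnings)  # ['Nested loop at line 3']
```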
+
+ ---
+
+ ## Roadmap
+
+ - [x] Static analysis (Python, JS, TS, Go, C, C++)
+ - [x] AI-powered explanations and fixes
+ - [x] Multi-language IR architecture
+ - [ ] Runtime profiling (Layer 3)
+ - [ ] More detectors
+ - [ ] `pip install wrench` support
+ - [ ] Web UI
+
+ ---
+
+ ## Project structure
+
+ ```
+ wrench/
+ ├── detectors/
+ │   ├── base.py              ← depth tracking, core visitor
+ │   ├── high.py              ← high priority detectors
+ │   └── medium.py            ← medium priority detectors
+ ├── languages/
+ │   ├── python_rules.py      ← Tree-sitter node mappings per language
+ │   ├── javascript_rules.py
+ │   └── ...
+ ├── ir.py                    ← language-agnostic IR node
+ ├── ir_translator.py         ← Tree-sitter → IR translation
+ ├── parser_engine.py         ← language detection + parser setup
+ ├── ai_engine.py             ← Groq integration
+ └── main.py                  ← entry point
+ ```
+
+ ---
+
+ ## Contributing
+
+ Pull requests welcome. If you want to add a new language, add a rules file in `languages/` mapping Tree-sitter node types to the generic IR types. That's it — the detectors work on all languages automatically.
+
+ Open an issue first for anything major.
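A new-language rules file can be as small as the mappings below. This is a hypothetical Ruby example; the node-type strings are illustrative guesses and would need checking against the real tree-sitter-ruby grammar before use:

```python
# languages/ruby_rules.py — hypothetical sketch; verify node types against
# the actual tree-sitter-ruby grammar before contributing.
LOOP_TYPES = ["while", "until", "for"]
FUNCTION_CALL = ["call"]
FUNCTION_DEF = ["method"]
ATTRIBUTE_ACCESS = ["call"]
STRING_CONCAT = ["operator_assignment"]
EXCEPTION_HANDLER = ["rescue"]
GLOBAL_STATEMENT = []
IMPORT = ["call"]
LIST_CONCAT = ["binary"]
IDENTIFIER = ["identifier"]
```

The detectors never see these names directly — the IR translator uses them to emit generic `loop`, `function_call`, etc. nodes, which is why no detector changes are needed.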
+
+ ---
+
+ Built by [@vishaddjain](https://github.com/vishaddjain)
@@ -0,0 +1,27 @@
+ wrench/ai_engine.py,sha256=Eo-FncE2A_-amHR7TBmsUalGgutUwM9vJjHIWYzRnls,2444
+ wrench/code.py,sha256=joTNtX15U2INRUokcjtWbK-hNwJROIA062R2Qv5efj4,1465
+ wrench/errors.py,sha256=ch4biH_7_U1eS6WUPZW_g6haXYsaIeNm_UyM0oTwGEU,891
+ wrench/ir.py,sha256=dWQcHI-kfGAU9NF6b1iWu0jFLQmLXZ_KGcOyIu9NSxY,326
+ wrench/ir_translator.py,sha256=p_yab9ORidH0Fe0AFWxqwt81Wu_tgNewlAH8E-Jxbq0,3712
+ wrench/main.py,sha256=BvTok3HfnV-vONaqBYtp54EKBWhrG0889rrEAnuPcDs,6490
+ wrench/parser_engine.py,sha256=J0z5XVwZMV58fAVoo9jyLh1-Uvg66bTZ86P5ZeszUrk,922
+ wrench/reports.py,sha256=g3kRoIgFbgwl0fXKk-TXRxuVNJFNMQrX24lozmSwehw,2663
+ wrench/wrenchignore.py,sha256=je-rxZfvC8rTaApmEZpv9O_QoxrogDXs1OcesZWrAD0,733
+ wrench/detectors/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+ wrench/detectors/base.py,sha256=1c-__XrdtFBsgy_kp5-dGOpcC17L3c-3KmCjprCcs-0,1060
+ wrench/detectors/high.py,sha256=rU8COk1fElcNZIA4EVAFhUFfMZLHI7ogGRyFt5-YSkc,3510
+ wrench/detectors/medium.py,sha256=hnU7GXRwgi3ttvDAOENFA7hnaa3NyHYDIGgq7AKtsbU,3270
+ wrench/languages/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+ wrench/languages/c_rules.py,sha256=ytLweRyI-NRRC4sHvLN_FP69Fq-QGLDQ7anacsBHH58,373
+ wrench/languages/cpp_rules.py,sha256=B3S98Uod3ts9xHn-slwt-OLEsjCdl_DruXbDsDEUi90,437
+ wrench/languages/go_rules.py,sha256=9F223iJbsmHDsUgOsIyczE398QqJum081rB_bNuR9_8,402
+ wrench/languages/javascript_rules.py,sha256=fm7H5A5qxMXEB0hcQdI7cQas6XIVQL2apWlBemjHJNc,503
+ wrench/languages/python_rules.py,sha256=Pwi1wkz3_fk0DYKMgdM1I9FkTKJ1rXjeAijGpXiAZCM,411
+ wrench/languages/typescript_rules.py,sha256=BBnno6Hx-sDdMxM_m87gcJfeezDAHThXqKPC1yGW7iI,524
+ wrench/profilers/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+ wrench/profilers/profiler.py,sha256=VGbI2ZnBtY_P9-MnHgdeFNRNZBp1cNEQnk7lefRMd68,1275
+ codewrench-0.1.0.dist-info/METADATA,sha256=Fbihlf7MPwrh0mdYirwcqWwEdP6mgTqsnLMLCTrPCvs,5185
+ codewrench-0.1.0.dist-info/WHEEL,sha256=aeYiig01lYGDzBgS8HxWXOg3uV61G9ijOsup-k9o1sk,91
+ codewrench-0.1.0.dist-info/entry_points.txt,sha256=TCQ4dR34hDgfKgEcNE8JDcioDrObD9a33lsNnDVsC4M,41
+ codewrench-0.1.0.dist-info/top_level.txt,sha256=0PDz80PbdzNQqVjw3WmSLTb5oyb6Fvq3uhj4xlJlN-Y,7
+ codewrench-0.1.0.dist-info/RECORD,,
@@ -0,0 +1,5 @@
+ Wheel-Version: 1.0
+ Generator: setuptools (82.0.1)
+ Root-Is-Purelib: true
+ Tag: py3-none-any
+
@@ -0,0 +1,2 @@
+ [console_scripts]
+ codewrench = main:main
@@ -0,0 +1 @@
+ wrench
wrench/ai_engine.py ADDED
@@ -0,0 +1,94 @@
+ from groq import Groq
+ from dotenv import load_dotenv
+ import os
+
+ load_dotenv()
+ client = Groq(api_key=os.getenv("GROQ_API_KEY"))
+
+ # For a single file
+ def analyse(code, warnings):
+     if not warnings:
+         return "No issues found — nothing to analyse."
+
+     warnings_text = "\n".join(f"- {w}" for w in warnings)
+
+     prompt = f"""
+ You are a performance analytics expert.
+
+ Here is the code:
+ {code}
+
+ These performance issues were detected:
+ {warnings_text}
+
+ For each issue:
+ 1. Explain in plain English why it's a problem
+ 2. Show a fixed version of that specific code
+ 3. Explain why the fix is better
+
+ Be concise and practical.
+ """
+
+     response = client.chat.completions.create(
+         model="llama-3.3-70b-versatile",
+         messages=[{"role": "user", "content": prompt}]
+     )
+     return response.choices[0].message.content
+
+ # For a folder
+ def analyse_folder(all_results):
+     if not all_results:
+         return "No issues found — nothing to analyse."
+
+     combined = ""
+     for filepath, warnings in all_results.items():
+         warnings_text = "\n".join(f"  - {w}" for w in warnings)
+         combined += f"\nFile: {filepath}\n{warnings_text}\n"
+
+     prompt = f"""
+ You are a performance analytics expert.
+
+ These performance issues were detected across a codebase:
+
+ {combined}
+
+ For each file, go through each issue and:
+ 1. Explain in plain English why it's a problem
+ 2. Suggest a fix with a short code example
+ 3. Explain why the fix is better
+
+ Be concise and practical. Group your response by file.
+ """
+
+     response = client.chat.completions.create(
+         model="llama-3.3-70b-versatile",
+         messages=[{"role": "user", "content": prompt}]
+     )
+     return response.choices[0].message.content
+
+ # For testing — used to measure the before/after performance difference
+ def get_fixed_code(code, warnings):
+     if not warnings:
+         return code
+
+     warnings_text = "\n".join(f"- {w}" for w in warnings)
+
+     prompt = f"""
+ You are a performance analytics expert.
+
+ Here is the original code:
+ {code}
+
+ These performance issues were detected:
+ {warnings_text}
+
+ Return ONLY the fixed version of the complete code.
+ No explanations, no comments, no markdown, no code blocks.
+ Just the raw fixed code that can be executed directly.
+ """
+
+     response = client.chat.completions.create(
+         model="llama-3.3-70b-versatile",
+         messages=[{"role": "user", "content": prompt}]
+     )
+     return response.choices[0].message.content
wrench/code.py ADDED
@@ -0,0 +1,67 @@
+ import re
+ import requests
+
+ # nested loops
+ def process_data(items):
+     results = []
+     for i in items:
+         for j in items:
+             results.append(i + j)  # list append in nested loop
+     return results
+
+ # string concat in loop
+ def build_string(words):
+     result = ""
+     for word in words:
+         result += word  # string concat
+     return result
+
+ # re.compile in loop
+ def match_emails(emails):
+     for email in emails:
+         pattern = re.compile(r'\w+@\w+\.\w+')  # re.compile in loop
+         if pattern.match(email):
+             print(f"Valid: {email}")  # print in loop
+
+ # expensive call in loop
+ def fetch_data(urls):
+     for url in urls:
+         response = requests.get(url)  # I/O in loop
+         print(response.status_code)
+
+ # mutable default argument
+ def add_item(item, items=[]):
+     items.append(item)
+     return items
+
+ # bare except
+ def risky():
+     try:
+         x = 1 / 0
+     except:
+         pass
+
+ # global in loop
+ counter = 0
+ def increment(n):
+     global counter
+     for i in range(n):
+         counter += 1  # global in loop
+
+ # unnecessary object creation in loop
+ def make_dicts(n):
+     for i in range(n):
+         d = dict()  # unnecessary object creation
+         d['key'] = i
+
+ # import inside function
+ def parse_json(data):
+     import json  # import inside function
+     return json.loads(data)
+
+ # list concat with +
+ def combine(lists):
+     result = []
+     for l in lists:
+         result = result + l  # list concat with +
+     return result
File without changes
@@ -0,0 +1,35 @@
+ class BaseDetectors:
+     def __init__(self):
+         self.depth = 0
+         self.warnings = []
+         self.attr_counts = {}
+         self.global_vars = set()
+         self.function_depth = 0
+
+     def visit(self, node):
+         method = f"visit_{node.node_type}"
+         visitor = getattr(self, method, self.generic_visit)
+         visitor(node)
+
+     def generic_visit(self, node):
+         for child in node.children:
+             self.visit(child)
+
+     def visit_loop(self, node):
+         self.depth += 1
+         self.generic_visit(node)
+         self.depth -= 1
+
+     def visit_function_def(self, node):
+         self.function_depth += 1
+         original_depth = self.depth
+         self.depth = 0
+         self.generic_visit(node)
+         self.depth = original_depth
+         self.function_depth -= 1
+
+     def visit_global_statement(self, node):
+         for child in node.children:
+             if hasattr(child, 'node_type') and child.node_type == "identifier":
+                 self.global_vars.add(child.metadata.get("name", ""))
+         self.generic_visit(node)
@@ -0,0 +1,78 @@
+ from .base import BaseDetectors
+
+ class HighDetectors(BaseDetectors):
+
+     CHEAP_CALLS = {"print", "len", "range", "str", "int", "float", "bool", "type"}
+     EXPENSIVE_CALLS = {"open", "requests", "get", "post", "read", "write", "connect", "execute"}
+     UNNECESSARY_OBJECT = {"dict", "list", "tuple", "set", "object"}
+
+     def visit_loop(self, node):
+         self.depth += 1
+         if self.depth >= 2:
+             self.warnings.append(
+                 f"Nested loop at line {node.lineno} - potential O(n²)."
+             )
+         self.generic_visit(node)
+         self.depth -= 1
+
+     def visit_function_call(self, node):
+         if self.depth >= 1:
+             name = node.metadata.get("name", None)
+             if name and name == "re.compile":
+                 self.warnings.append(
+                     f"re.compile() inside loop at line {node.lineno} — move it outside the loop, compile once and reuse."
+                 )
+             elif name and name in ["print", "logging.info", "logging.warning", "logging.error"]:
+                 self.warnings.append(
+                     f"print()/logging call inside loop at line {node.lineno} — I/O on every iteration, move outside or use buffered logging."
+                 )
+             elif name and name == "len":
+                 self.warnings.append(
+                     f"len() called inside loop at line {node.lineno} — cache the result before the loop to avoid repeated calls."
+                 )
+             elif name and name in self.EXPENSIVE_CALLS:
+                 self.warnings.append(
+                     f"I/O call '{name}' inside loop at line {node.lineno} — consider moving it outside the loop."
+                 )
+             elif name and name in self.UNNECESSARY_OBJECT:
+                 self.warnings.append(
+                     f"Creating a new object/literal inside loop at line {node.lineno} — causes GC/allocation pressure. Consider moving it outside or reusing."
+                 )
+             elif name and name not in self.CHEAP_CALLS:
+                 self.warnings.append(
+                     f"Function call '{name}' inside loop at line {node.lineno} — consider moving it out."
+                 )
+
+         self.generic_visit(node)
+
+     def visit_attribute_access(self, node):
+         if self.depth >= 1:
+             name = node.metadata.get("name", None)
+             if name:
+                 if name not in self.attr_counts:
+                     self.attr_counts[name] = []
+                 self.attr_counts[name].append(node.lineno)
+         self.generic_visit(node)
+
+     def visit_string_concat(self, node):
+         if self.depth >= 2:
+             self.warnings.append(
+                 f"String concatenation in nested loop at line {node.lineno} — quadratic complexity, use join() outside the loop."
+             )
+         elif self.depth >= 1:
+             self.warnings.append(
+                 f"String concatenation at line {node.lineno} — use ''.join() instead."
+             )
+         self.generic_visit(node)
+
+     def check_attr_counts(self):
+         for key, lines in self.attr_counts.items():
+             if len(lines) >= 2:
+                 self.warnings.append(
+                     f"Attribute '{key}' accessed {len(lines)} times in loop at lines {lines} — cache it."
+                 )
+
+     def visit_await(self, node):
+         if self.depth >= 1:
+             self.warnings.append(
+                 f"await inside loop at line {node.lineno} — sequential async calls, use asyncio.gather() or Promise.all() to run concurrently."
+             )
+         self.generic_visit(node)
@@ -0,0 +1,90 @@
+ from .base import BaseDetectors
+
+ class MediumDetectors(BaseDetectors):
+
+     SORT_CALLS = {"sort", "sorted", "sort_by", "order_by"}
+
+     def __init__(self):
+         super().__init__()
+
+     def visit_global_statement(self, node):
+         names = node.metadata.get("names", [])
+         for name in names:
+             self.global_vars.add(name)
+         self.generic_visit(node)
+
+     def visit_function_call(self, node):
+         name = node.metadata.get("name", None)
+
+         if self.depth >= 1:
+             if name and name in self.SORT_CALLS:
+                 self.warnings.append(
+                     f"Unnecessary sorting '{name}' inside loop at line {node.lineno}."
+                 )
+
+         if self.depth >= 2:
+             if name == "append":
+                 self.warnings.append(
+                     f"List append inside nested loop at line {node.lineno} — consider restructuring."
+                 )
+
+             if name == "list":
+                 children_names = [
+                     c.metadata.get("name", "")
+                     for c in node.children
+                 ]
+                 if "range" in children_names:
+                     self.warnings.append(
+                         f"Unnecessary list creation at line {node.lineno} — just use range(n) directly."
+                     )
+
+         self.generic_visit(node)
+
+     def visit_exception_handler(self, node):
+         exception_type = node.metadata.get("exception_type", None)
+         if exception_type is None:
+             self.warnings.append(
+                 f"Bare except at line {node.lineno} — catches everything, be specific."
+             )
+         elif exception_type == "Exception":
+             self.warnings.append(
+                 f"Overly broad 'except Exception' at line {node.lineno} — catch specific exceptions."
+             )
+         if self.depth >= 1:
+             self.warnings.append(
+                 f"try/except inside loop at line {node.lineno} — exception handling overhead on every iteration, move outside if possible."
+             )
+         self.generic_visit(node)
+
+     def visit_function_def(self, node):
+         defaults = node.metadata.get("mutable_defaults", [])
+         for lineno in defaults:
+             self.warnings.append(
+                 f"Mutable default argument at line {lineno} — use None instead."
+             )
+         super().visit_function_def(node)
+
+     def visit_identifier(self, node):
+         if self.depth >= 1:
+             name = node.metadata.get("name", None)
+             if name and name in self.global_vars:
+                 self.warnings.append(
+                     f"Global variable '{name}' accessed inside loop at line {node.lineno} — consider caching it locally."
+                 )
+         self.generic_visit(node)
+
+     def visit_import(self, node):
+         if self.function_depth >= 1:
+             self.warnings.append(
+                 f"Import at function level instead of module top at line {node.lineno}."
+             )
+         self.generic_visit(node)
+
+     def visit_list_concat(self, node):
+         if self.depth >= 1:
+             self.warnings.append(
+                 f"List concatenation with '+' inside loop at line {node.lineno} — use .extend() or += instead, avoids creating a new list each iteration."
+             )
+         self.generic_visit(node)
wrench/errors.py ADDED
@@ -0,0 +1,16 @@
+ MESSAGES = {
+     "unsupported_language": "Skipping '{path}' — language not supported (Python, JS, TS, Go, C, C++ only).",
+     "file_not_found": "Error: '{path}' does not exist.",
+     "permission_error": "Skipping '{path}' — permission denied.",
+     "binary_file": "Skipping '{path}' — binary or non-text file.",
+     "syntax_error": "Skipping '{path}' — could not parse file, possible syntax error.",
+     "empty_file": "Skipping '{path}' — file is empty.",
+     "api_error": "AI analysis failed for '{path}' — check your GROQ_API_KEY or network connection.",
+     "profiling_error": "Profiling failed for '{path}' — skipping benchmark.",
+ }
+
+ def handle_error(error_type, path, fatal=False):
+     message = MESSAGES.get(error_type, "Unknown error for '{path}'.")
+     print(message.format(path=path))
+     if fatal:
+         raise SystemExit(1)
wrench/ir.py ADDED
@@ -0,0 +1,11 @@
+ class IRNode:
+     def __init__(self, node_type, lineno, children=None, metadata=None):
+         self.node_type = node_type
+         self.lineno = lineno
+         self.children = children or []
+         self.metadata = metadata or {}
+
+     def __repr__(self):
+         return f"IRNode({self.node_type}, line={self.lineno})"
+
@@ -0,0 +1,104 @@
+ from ir import IRNode
+
+ GENERIC_TYPES = {
+     "LOOP": "loop",
+     "FUNCTION_CALL": "function_call",
+     "FUNCTION_DEF": "function_def",
+     "ATTRIBUTE_ACCESS": "attribute_access",
+     "STRING_CONCAT": "string_concat",
+     "EXCEPTION_HANDLER": "exception_handler",
+     "GLOBAL_STATEMENT": "global_statement",
+     "IMPORT": "import",
+     "AWAIT": "await",
+     "IDENTIFIER": "identifier"
+ }
+
+ class IRTranslator:
+     def __init__(self, rules):
+         self.rules = rules
+
+     def translate(self, node):
+         node_type = self.get_generic_type(node.type)
+         if node_type is None:
+             list_concat_types = getattr(self.rules, "LIST_CONCAT", [])
+             if node.type in list_concat_types:
+                 op = node.child_by_field_name("operator")
+                 if op and op.text.decode("utf8") == "+":
+                     node_type = "list_concat"
+
+         lineno = node.start_point[0] + 1
+         children = [self.translate(child) for child in node.children]
+         metadata = self.extract_metadata(node, node_type)
+         return IRNode(node_type or f"_raw_{node.type}", lineno, children, metadata)
+
+     def extract_metadata(self, node, node_type):
+         metadata = {}
+
+         if node_type == "function_call":
+             metadata["name"] = self.get_call_name(node)
+
+         elif node_type == "attribute_access":
+             metadata["name"] = self.get_attribute_name(node)
+
+         elif node_type == "identifier":
+             metadata["name"] = node.text.decode("utf8") if node.text else None
+
+         elif node_type == "exception_handler":
+             metadata["exception_type"] = self.get_exception_type(node)
+
+         elif node_type == "function_def":
+             metadata["mutable_defaults"] = self.get_mutable_defaults(node)
+
+         elif node_type == "global_statement":
+             metadata["names"] = [
+                 child.text.decode("utf8")
+                 for child in node.children
+                 if child.type == "identifier"
+             ]
+
+         return metadata
+
+     def get_call_name(self, node):
+         for child in node.children:
+             if child.type == "identifier":
+                 return child.text.decode("utf8")
+             elif child.type == "attribute":
+                 parts = []
+                 for subchild in child.children:
+                     if subchild.type == "identifier":
+                         parts.append(subchild.text.decode("utf8"))
+                 if parts:
+                     return ".".join(parts)
+         return None
+
+     def get_attribute_name(self, node):
+         parts = []
+         for child in node.children:
+             if child.type == "identifier":
+                 parts.append(child.text.decode("utf8"))
+         return ".".join(parts) if parts else None
+
+     def get_exception_type(self, node):
+         for child in node.children:
+             if child.type == "identifier":
+                 return child.text.decode("utf8")
+         return None
+
+     def get_mutable_defaults(self, node):
+         mutable_lines = []
+         def check_mutable_recursive(n):
+             if n.type in ("list", "dictionary", "set"):
+                 mutable_lines.append(n.start_point[0] + 1)
+             for child in n.children:
+                 check_mutable_recursive(child)
+         for child in node.children:
+             check_mutable_recursive(child)
+         return mutable_lines
+
+     def get_generic_type(self, tree_sitter_type):
+         for generic, mapped in GENERIC_TYPES.items():
+             rule_list = getattr(self.rules, generic + "_TYPES",
+                                 getattr(self.rules, generic, []))
+             if tree_sitter_type in rule_list:
+                 return mapped
+         return None
File without changes
@@ -0,0 +1,10 @@
+ LOOP_TYPES = ["for_statement", "while_statement", "do_statement"]
+ FUNCTION_CALL = ["call_expression"]
+ FUNCTION_DEF = ["function_definition"]
+ ATTRIBUTE_ACCESS = ["field_expression"]
+ STRING_CONCAT = ["assignment_expression"]
+ EXCEPTION_HANDLER = []
+ GLOBAL_STATEMENT = ["declaration"]
+ IMPORT = ["preproc_include"]
+ LIST_CONCAT = ["binary_expression"]
+ IDENTIFIER = ["identifier"]
@@ -0,0 +1,10 @@
+ LOOP_TYPES = ["for_statement", "while_statement", "do_statement", "range_based_for_statement"]
+ FUNCTION_CALL = ["call_expression"]
+ FUNCTION_DEF = ["function_definition"]
+ ATTRIBUTE_ACCESS = ["field_expression"]
+ STRING_CONCAT = ["assignment_expression"]
+ EXCEPTION_HANDLER = ["catch_clause"]
+ GLOBAL_STATEMENT = ["declaration"]
+ IMPORT = ["preproc_include", "using_declaration"]
+ LIST_CONCAT = ["binary_expression"]
+ IDENTIFIER = ["identifier"]
@@ -0,0 +1,10 @@
+ LOOP_TYPES = ["for_statement"]
+ FUNCTION_CALL = ["call_expression"]
+ FUNCTION_DEF = ["function_declaration", "method_declaration"]
+ ATTRIBUTE_ACCESS = ["selector_expression"]
+ STRING_CONCAT = ["assignment_statement"]
+ EXCEPTION_HANDLER = ["defer_statement"]
+ GLOBAL_STATEMENT = ["var_declaration"]
+ IMPORT = ["import_declaration", "import_spec"]
+ LIST_CONCAT = ["binary_expression"]
+ IDENTIFIER = ["identifier"]
@@ -0,0 +1,11 @@
+ LOOP_TYPES = ["for_statement", "for_in_statement", "for_of_statement", "while_statement"]
+ FUNCTION_CALL = ["call_expression"]
+ FUNCTION_DEF = ["function_declaration", "arrow_function", "function_expression"]
+ ATTRIBUTE_ACCESS = ["member_expression"]
+ STRING_CONCAT = ["augmented_assignment_expression"]
+ EXCEPTION_HANDLER = ["catch_clause"]
+ GLOBAL_STATEMENT = []
+ IMPORT = ["import_statement", "import_declaration"]
+ LIST_CONCAT = ["binary_expression"]
+ AWAIT = ["await_expression"]
+ IDENTIFIER = ["identifier"]
@@ -0,0 +1,11 @@
+ LOOP_TYPES = ["for_statement", "while_statement"]
+ FUNCTION_CALL = ["call"]
+ FUNCTION_DEF = ["function_definition"]
+ ATTRIBUTE_ACCESS = ["attribute"]
+ STRING_CONCAT = ["augmented_assignment"]
+ EXCEPTION_HANDLER = ["except_clause"]
+ GLOBAL_STATEMENT = ["global_statement"]
+ IMPORT = ["import_statement", "import_from_statement"]
+ LIST_CONCAT = ["binary_operator"]
+ AWAIT = ["await_expression"]
+ IDENTIFIER = ["identifier"]
@@ -0,0 +1,11 @@
+ LOOP_TYPES = ["for_statement", "for_in_statement", "for_of_statement", "while_statement"]
+ FUNCTION_CALL = ["call_expression"]
+ FUNCTION_DEF = ["function_declaration", "arrow_function", "function_expression", "method_definition"]
+ ATTRIBUTE_ACCESS = ["member_expression"]
+ STRING_CONCAT = ["augmented_assignment_expression"]
+ EXCEPTION_HANDLER = ["catch_clause"]
+ GLOBAL_STATEMENT = []
+ IMPORT = ["import_statement", "import_declaration"]
+ LIST_CONCAT = ["binary_expression"]
+ AWAIT = ["await_expression"]
+ IDENTIFIER = ["identifier"]
wrench/main.py ADDED
@@ -0,0 +1,216 @@
1
+ import sys
2
+ import os
3
+ import threading
4
+ from detectors.high import HighDetectors
5
+ from detectors.medium import MediumDetectors
6
+ from ai_engine import analyse, get_fixed_code, analyse_folder as analyse_folder_ai
7
+ from parser_engine import get_parser, detect_language
8
+ from ir_translator import IRTranslator
9
+ from profilers.profiler import profile_file, parse_stats, write_temp_file, delete_temp_file
10
+ from errors import handle_error
11
+ from wrenchignore import load_wrenchignore, is_ignored
12
+ from reports import print_summary, print_profiling, ask_and_analyse, ask_and_apply_fixes, save_report
13
+
14
+ IGNORE_DIRS = {"venv", "node_modules", ".git", "__pycache__", "dist", "build", ".vscode"}
15
+
16
+ def get_rules(language):
17
+ if language == "python":
18
+ import languages.python_rules as rules
19
+ elif language == "javascript":
20
+ import languages.javascript_rules as rules
21
+ elif language == "typescript":
22
+ import languages.typescript_rules as rules
23
+ elif language == "go":
24
+ import languages.go_rules as rules
25
+ elif language == "c":
26
+ import languages.c_rules as rules
27
+ elif language == "cpp":
28
+ import languages.cpp_rules as rules
29
+ else:
30
+ return None
31
+ return rules
32
+
33
+ def run_analysis(filepath):
34
+ # wrenchignore check
35
+ patterns = load_wrenchignore(os.path.dirname(filepath))
36
+ if is_ignored(filepath, patterns):
37
+ return [], None, None
38
+
39
+ # language check
40
+ language = detect_language(filepath)
41
+ if language is None:
42
+ handle_error("unsupported_language", filepath)
43
+ return [], None, None
44
+
45
+ # file reading
46
+ try:
47
+ with open(filepath, "r", encoding="utf8") as f:
48
+ code = f.read()
49
+ except FileNotFoundError:
50
+ handle_error("file_not_found", filepath, fatal=True)
51
+ except PermissionError:
52
+ handle_error("permission_error", filepath)
53
+ return [], None, None
54
+ except UnicodeDecodeError:
55
+ handle_error("binary_file", filepath)
56
+ return [], None, None
57
+
58
+ # empty file check
59
+ if not code.strip():
60
+ handle_error("empty_file", filepath)
61
+ return [], None, None
62
+
63
+ # parsing
64
+ try:
65
+ rules = get_rules(language)
66
+ parser = get_parser(language)
67
+ tree = parser.parse(bytes(code, "utf8"))
68
+ translator = IRTranslator(rules)
69
+ ir_tree = translator.translate(tree.root_node)
70
+ except Exception:
71
+ handle_error("syntax_error", filepath)
72
+ return [], None, None
73
+
74
+ warnings = []
75
+ for DetectorClass in [HighDetectors, MediumDetectors]:
76
+ detector = DetectorClass()
77
+ detector.visit(ir_tree)
78
+ if hasattr(detector, 'check_attr_counts'):
79
+ detector.check_attr_counts()
80
+ warnings.extend(detector.warnings)
81
+
82
+ return warnings, language, code
83
+
84
+ def get_files(folder):
85
+ patterns = load_wrenchignore(folder)
86
+ files = []
87
+ for root, dirs, filenames in os.walk(folder):
88
+ dirs[:] = [d for d in dirs if d not in IGNORE_DIRS]
89
+ for f in filenames:
90
+ filepath = os.path.join(root, f)
91
+ if detect_language(f) is not None and not is_ignored(filepath, patterns):
92
+ files.append(filepath)
93
+ return files
94
+
95
+ def analyse_single_file(filename):
96
+ warnings, language, code = run_analysis(filename)
97
+
98
+ if language is None:
99
+ return
100
+
101
+ if not warnings:
102
+ print("No issues found!")
103
+ return
104
+
105
+ # print summary
106
+ print_summary(1, {language}, {filename: warnings})
107
+
108
+ # print warnings
109
+ print("\n--- Warnings ---\n")
110
+ for w in warnings:
111
+ print(f" {w}")
112
+
113
+ results = {}
114
+
115
+ def run_profiling():
116
+ try:
117
+ if language != "python":
118
+ results["profiling"] = None
119
+ return
120
+
121
+ before_raw = profile_file(filename)
122
+ before_stats = parse_stats(before_raw)
123
+
124
+ fixed_code = get_fixed_code(code, warnings)
125
+ temp_file = write_temp_file(fixed_code, filename)
126
+ after_raw = profile_file(temp_file)
127
+ after_stats = parse_stats(after_raw)
128
+ delete_temp_file(temp_file)
129
+
130
+ results["before"] = before_stats
131
+ results["after"] = after_stats
132
+
133
+ except Exception:
134
+ handle_error("profiling_error", filename)
135
+ results["profiling"] = None
136
+
137
+ t = threading.Thread(target=run_profiling)
138
+ t.start()
139
+ t.join()
140
+
141
+ # profiling output
142
+ if language != "python":
143
+ print("\n--- Profiling not supported for this language yet ---")
144
+ elif results.get("before") and results.get("after"):
145
+ print_profiling(results["before"], results["after"])
146
+
147
+ # AI analysis — ask user
148
+ ask_and_analyse(code, warnings)
149
+
150
+ # apply fixes — ask user
151
+ ask_and_apply_fixes(code, warnings, filename)
152
+
153
+ # save report — ask user
154
+ save_report(1, {language}, {filename: warnings})
155
+
156
+ def analyse_folder(folder):
157
+ files = get_files(folder)
158
+
159
+ if not files:
160
+ print("No supported files found in folder.")
161
+ return
162
+
163
+ all_results = {}
164
+ languages = set()
165
+ for file in files:
166
+ warnings, language, code = run_analysis(file)
167
+ if language:
168
+ languages.add(language)
169
+ if warnings:
170
+ all_results[file] = warnings
171
+
172
+ if not all_results:
173
+ print("No issues found across all files!")
174
+ return
175
+
176
+ # print summary
177
+ print_summary(len(files), languages, all_results)
178
+
179
+ # print warnings per file
180
+ print("\n--- Warnings ---\n")
181
+ for file, warnings in all_results.items():
182
+ print(f"--- {file} ---")
183
+ for w in warnings:
184
+ print(f" {w}")
185
+ print()
186
+
187
+ # AI analysis — one call for whole folder, ask user
188
+ analysis = None
189
+ choice = input("\nWant AI analysis? (y/n): ").strip().lower()
190
+ if choice == 'y':
191
+ try:
192
+ analysis = analyse_folder_ai(all_results)
193
+ print("\n--- AI Analysis ---\n")
194
+ print(analysis)
195
+ except Exception:
196
+ handle_error("api_error", folder)
197
+
198
+ # save report — ask user
199
+ save_report(len(files), languages, all_results, analysis=analysis)
200
+
201
+ def main():
202
+ if len(sys.argv) < 2:
203
+ print("Usage: codewrench <filename or folder>")
204
+ sys.exit(1)
205
+
206
+ target = sys.argv[1]
207
+
208
+ if os.path.isdir(target):
209
+ analyse_folder(target)
210
+ elif os.path.isfile(target):
211
+ analyse_single_file(target)
212
+ else:
213
+ handle_error("file_not_found", target, fatal=True)
214
+
215
+ if __name__ == "__main__":
216
+ main()
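`get_files` above prunes ignored directories by slice-assigning `dirs` during `os.walk`, which stops the walk from ever descending into them. A standalone sketch of that technique (`IGNORE_DIRS` here is a hypothetical subset of wrench's actual list):

```python
import os
import tempfile

IGNORE_DIRS = {"node_modules", "__pycache__"}  # assumed subset of wrench's defaults

def collect_files(folder):
    found = []
    for root, dirs, filenames in os.walk(folder):
        # slice-assignment mutates the list os.walk iterates, so pruned
        # directories are never visited at all
        dirs[:] = [d for d in dirs if d not in IGNORE_DIRS]
        for f in filenames:
            found.append(os.path.join(root, f))
    return found

with tempfile.TemporaryDirectory() as tmp:
    os.makedirs(os.path.join(tmp, "src"))
    os.makedirs(os.path.join(tmp, "node_modules", "pkg"))
    open(os.path.join(tmp, "src", "app.py"), "w").close()
    open(os.path.join(tmp, "node_modules", "pkg", "index.js"), "w").close()
    files = collect_files(tmp)
    print([os.path.relpath(f, tmp) for f in files])
```

Rebinding `dirs` with `dirs = [...]` instead of `dirs[:] = [...]` would silently disable the pruning, since `os.walk` keeps iterating its own list.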
@@ -0,0 +1,34 @@
1
+ from tree_sitter import Language, Parser
2
+ import tree_sitter_python as tspython
3
+ import tree_sitter_javascript as tsjavascript
4
+ import tree_sitter_typescript as tstypescript
5
+ import tree_sitter_go as tsgo
6
+ import tree_sitter_c as tsc
7
+ import tree_sitter_cpp as tscpp
8
+
9
+ LANGUAGES = {
10
+ "python": Language(tspython.language()),
11
+ "javascript": Language(tsjavascript.language()),
12
+ "typescript": Language(tstypescript.language_typescript()),
13
+ "go": Language(tsgo.language()),
14
+ "c": Language(tsc.language()),
15
+ "cpp": Language(tscpp.language()),
16
+ }
17
+
18
+ def get_parser(language):
19
+ parser = Parser(LANGUAGES[language])
20
+ return parser
21
+
22
+ def detect_language(filename):
23
+ ext = filename.split(".")[-1]
24
+ mapping = {
25
+ "py": "python",
26
+ "js": "javascript",
27
+ "ts": "typescript",
28
+ "go": "go",
29
+ "c": "c",
30
+ "cpp": "cpp",
31
+ "cc": "cpp",
32
+ }
33
+ return mapping.get(ext, None)
34
+
File without changes
@@ -0,0 +1,44 @@
1
+ import os
2
+ import sys
3
+ import subprocess
4
+
5
+ def profile_file(filename):
6
+ result = subprocess.run(
7
+ [sys.executable, "-m", "cProfile", "-s", "cumulative", os.path.abspath(filename)],
8
+ capture_output=True,
9
+ text=True,
10
+ cwd=os.path.dirname(os.path.abspath(filename))
11
+ )
12
+ return result.stdout
13
+
14
+ def parse_stats(raw_output):
15
+ lines = raw_output.strip().split("\n")
16
+ stats = []
17
+ for line in lines[4:]:
18
+ parts = line.split()
19
+ if len(parts) >= 6:
20
+ try:
21
+ float(parts[1])
22
+ float(parts[3])
23
+ function = " ".join(parts[5:])
24
+ if not function.startswith("{"):
25
+ stats.append({
26
+ "ncalls": parts[0],
27
+ "tottime": parts[1],
28
+ "cumtime": parts[3],
29
+ "function": function
30
+ })
31
+ except ValueError:
32
+ continue
33
+ return stats
34
+
35
+ def write_temp_file(code, original_filename):
36
+ temp_file = original_filename.removesuffix(".py") + "_wrench_temp.py"
37
+ with open(temp_file, "w") as f:
38
+ f.write(code)
39
+ return temp_file
40
+
41
+ def delete_temp_file(temp_filename):
42
+ if os.path.exists(temp_filename):
43
+ os.remove(temp_filename)
44
+
wrench/reports.py ADDED
@@ -0,0 +1,72 @@
1
+ from .ai_engine import analyse, get_fixed_code
2
+
3
+ def print_summary(files_scanned, languages, all_results):
4
+ total_issues = sum(len(w) for w in all_results.values())
5
+ print("=" * 40)
6
+ print("CODEWRENCH REPORT".center(40))
7
+ print("=" * 40)
8
+ print(f"Files Scanned : {files_scanned}")
9
+ print(f"Languages : {', '.join(languages)}")
10
+ print(f"Issues Found : {total_issues} across {len(all_results)} files")
11
+ print("=" * 40)
12
+
13
+ def print_profiling(before_stats, after_stats):
14
+ print("Top 5 slowest functions BEFORE fix:")
15
+ for stat in before_stats[:5]:
16
+ func = stat['function'].split(":")[-1]
17
+ print(f" {func:<30} cumtime: {stat['cumtime']}s")
18
+
19
+ print("\nTop 5 slowest functions AFTER fix:")
20
+ for stat in after_stats[:5]:
21
+ func = stat['function'].split(":")[-1]
22
+ print(f" {func:<30} cumtime: {stat['cumtime']}s")
23
+
24
+ def ask_and_analyse(code, warnings):
25
+ choice = input("\nWant AI analysis? (y/n): ").strip().lower()
26
+ if choice == 'y':
27
+ print("\n--- AI Analysis ---\n")
28
+ result = analyse(code, warnings)
29
+ print(result)
30
+
31
+ def ask_and_apply_fixes(code, warnings, filepath):
32
+ choice = input("\nWant to apply fixes to files? (y/n): ").strip().lower()
33
+ if choice == 'y':
34
+ fixed_code = get_fixed_code(code, warnings)
35
+ with open(filepath + ".bak", "w", encoding="utf8") as f:
36
+ f.write(code)
37
+ with open(filepath, "w", encoding="utf8") as f:
38
+ f.write(fixed_code)
39
+ print(f"Original saved as {filepath}.bak")
40
+ print(f"Fixes applied to {filepath}")
41
+
42
+ def save_report(files_scanned, languages, all_results, analysis=None):
43
+ choice = input("\nSave report? (y/n): ").strip().lower()
44
+ if choice != 'y':
45
+ return
46
+
47
+ total_issues = sum(len(w) for w in all_results.values())
48
+
49
+ with open("codewrench_report.md", "w", encoding="utf8") as f:
50
+ # header
51
+ f.write("# Codewrench Report\n\n")
52
+ f.write(f"**Files Scanned:** {files_scanned}\n\n")
53
+ f.write(f"**Languages:** {', '.join(languages)}\n\n")
54
+ f.write(f"**Issues Found:** {total_issues} across {len(all_results)} files\n\n")
55
+ f.write("---\n\n")
56
+
57
+ # warnings per file
58
+ f.write("## Warnings\n\n")
59
+ for filepath, warnings in all_results.items():
60
+ f.write(f"### {filepath}\n\n")
61
+ for w in warnings:
62
+ f.write(f"- {w}\n")
63
+ f.write("\n")
64
+
65
+ # AI analysis
66
+ if analysis:
67
+ f.write("---\n\n")
68
+ f.write("## AI Analysis\n\n")
69
+ f.write(analysis)
70
+ f.write("\n")
71
+
72
+ print("Report saved to codewrench_report.md")
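`print_summary` boils down to a fixed-width banner plus an issue count summed across files. A testable sketch that returns the lines instead of printing them (the `sorted()` call is an addition here for deterministic output; the module itself joins the set unsorted):

```python
def summary_lines(files_scanned, languages, all_results):
    # mirrors print_summary's banner; total counts warnings across all files
    total_issues = sum(len(w) for w in all_results.values())
    return [
        "=" * 40,
        "CODEWRENCH REPORT".center(40),
        "=" * 40,
        f"Files Scanned : {files_scanned}",
        f"Languages     : {', '.join(sorted(languages))}",
        f"Issues Found  : {total_issues} across {len(all_results)} files",
        "=" * 40,
    ]

for line in summary_lines(3, {"python", "go"}, {"a.py": ["w1", "w2"], "b.go": ["w3"]}):
    print(line)
```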
wrench/wrenchignore.py ADDED
@@ -0,0 +1,25 @@
1
+ import os
2
+ import fnmatch
3
+
4
+ def load_wrenchignore(root):
5
+ patterns = []
6
+ try:
7
+ with open(os.path.join(root, ".wrenchignore"), 'r', encoding="utf8") as f:
8
+ for line in f:
9
+ line = line.strip()
10
+ if not line or line.startswith('#'):
11
+ continue
12
+ patterns.append(line)
13
+ except FileNotFoundError:
14
+ pass
15
+ return patterns
16
+
17
+ def is_ignored(filepath, patterns):
18
+ filename = os.path.basename(filepath)
19
+ for pattern in patterns:
20
+ if pattern.endswith('/'):
21
+ if pattern.rstrip('/') in filepath.split(os.sep):
22
+ return True
23
+ elif fnmatch.fnmatch(filename, pattern):
24
+ return True
25
+ return False
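`is_ignored` treats trailing-slash patterns as directory-component matches anywhere in the path and everything else as an `fnmatch` glob against the basename. Mirroring that logic standalone:

```python
import fnmatch
import os

def is_ignored(filepath, patterns):
    # "build/"-style patterns match any path component; all other patterns
    # are shell-style globs tested against the filename only
    filename = os.path.basename(filepath)
    for pattern in patterns:
        if pattern.endswith("/"):
            if pattern.rstrip("/") in filepath.split(os.sep):
                return True
        elif fnmatch.fnmatch(filename, pattern):
            return True
    return False

patterns = ["*.min.js", "build/"]
print(is_ignored(os.path.join("build", "main.c"), patterns))  # directory rule
print(is_ignored("app.min.js", patterns))                     # glob rule
print(is_ignored("app.js", patterns))                         # no rule matches
```

One consequence of matching globs against the basename only: a pattern like `src/*.py` never matches, because the slash-free basename is compared against a pattern containing a slash.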