opencode-autognosis 0.1.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/LICENSE +21 -0
- package/README.md +107 -0
- package/assets/patterns.json +9 -0
- package/dist/index.d.ts +212 -0
- package/dist/index.js +8 -0
- package/dist/system-tools.d.ts +210 -0
- package/dist/system-tools.js +256 -0
- package/package.json +24 -0
package/LICENSE
ADDED
@@ -0,0 +1,21 @@
+MIT License
+
+Copyright (c) 2026 Julian
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
package/README.md
ADDED
@@ -0,0 +1,107 @@
+# opencode-autognosis
+
+**Self-knowledge for your codebase.** A suite of tools for OpenCode that enables rapid discovery, structural search, and safe iteration.
+
+This plugin transforms your agent from a file-reader into a "Miniature Engineer" capable of understanding project structure, navigating by symbols, and planning changes safely.
+
+## Prerequisites
+
+This plugin relies on the following high-performance system binaries. Please ensure they are installed and in your PATH:
+
+- **ripgrep (`rg`)**: For fast content searching.
+- **fd (`fd`)**: For fast file finding.
+- **ast-grep (`sg`)**: For structural code search.
+- **universal-ctags (`ctags`)**: For symbol indexing.
+- **git**: For version control integration.
+
+## Installation
+
+### Via npm (Recommended)
+
+1. Install the package:
+```bash
+npm install opencode-autognosis
+```
+
+2. Configure `opencode.json`:
+```json
+{
+  "plugin": ["opencode-autognosis"]
+}
+```
+
+### Local Plugin
+
+1. Clone this repository.
+2. Build the plugin:
+```bash
+npm install
+npm run build
+```
+3. Copy `dist/index.js` to your project's `.opencode/plugins/autognosis.js` (or reference the build directory).
+
+## End-to-End Demo
+
+Here is how the "Miniature Engineer" workflow looks in practice:
+
+1. **Initialize & Plan**:
+```javascript
+// Agent initializes environment and gets a plan ID
+autognosis_init({ mode: "apply", token: "..." })
+brief_fix_loop({ symbol: "AuthService", intent: "refactor" })
+// Returns: { plan_id: "plan-123", ... }
+```
+
+2. **Navigate & Read**:
+```javascript
+// Agent jumps to definition using the plan ID
+jump_to_symbol({ symbol: "AuthService", plan_id: "plan-123" })
+// Returns: { resolved_location: "src/auth.ts", slice: "..." }
+```
+
+3. **Propose Change**:
+```javascript
+// Agent edits file (via standard write_file or replace)
+// Then captures the change as a patch artifact
+prepare_patch({ plan_id: "plan-123", message: "Refactor login method" })
+// Returns: "Patch saved to .opencode/cache/patch-123.diff"
+```
+
+4. **Validate**:
+```javascript
+// Agent ensures the patch is valid before committing
+validate_patch({ patch_path: ".opencode/cache/patch-123.diff" })
+// Returns: { status: "SUCCESS", checks: { git_apply_check: "passed" } }
+```
+
+5. **Finalize**:
+```javascript
+// Agent closes the loop
+finalize_plan({ plan_id: "plan-123", outcome: "success" })
+// Returns: "Plan finalized. Metrics logged."
+```
+
+## Tool Capabilities
+
+### Discovery & Navigation
+- **`fast_search`**: Optimized text search using `rg` (content) and `fd` (files). Enforces "filename-first" discovery.
+- **`structural_search`**: Find code by grammatical pattern using `ast-grep`.
+- **`symbol_query`**: Instant symbol lookup using a content-addressable tags index.
+- **`jump_to_symbol`**: The primary navigation tool. Resolves a symbol and returns a bounded "slice" of code.
+- **`read_slice`**: Efficiently reads specific line ranges from files.
+
+### Engineering Workflow
+- **`brief_fix_loop`**: Generates a bounded **Plan** containing the definition, dependents, and context.
+- **`prepare_patch`**: Generates a `.diff` artifact for the current changes.
+- **`validate_patch`**: Validates a patch in a clean git worktree.
+- **`finalize_plan`**: Closes the session loop, logs metrics, and cleans up the cache.
+
+## Initialization
+
+Run the initialization tool to verify your environment:
+
+```javascript
+autognosis_init() // Default mode="plan"
+```
+
+This will check for required binaries and provide a token to confirm initialization.
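The README's Prerequisites list can be verified up front rather than discovering a missing binary mid-session. Below is a minimal sketch (not shipped with the package; the file name `check-prereqs.mjs` and helper `hasBinary` are hypothetical, Node 18+ assumed) that uses the same `--version` probe the plugin's own `checkBinary` helper performs:

```javascript
// check-prereqs.mjs — hypothetical standalone check, mirroring the plugin's probe
import { exec } from "node:child_process";
import { promisify } from "node:util";

const execAsync = promisify(exec);

// A binary counts as available if `<bin> --version` exits successfully.
async function hasBinary(bin) {
  try {
    await execAsync(`${bin} --version`, { timeout: 5000 });
    return true;
  } catch {
    return false;
  }
}

for (const bin of ["rg", "fd", "sg", "ctags", "git"]) {
  console.log(`${bin}: ${(await hasBinary(bin)) ? "found" : "MISSING"}`);
}
```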
package/dist/index.d.ts
ADDED
@@ -0,0 +1,212 @@
+export default function plugin(): {
+    tools: {
+        autognosis_init: {
+            description: string;
+            parameters: {
+                type: string;
+                properties: {
+                    mode: {
+                        type: string;
+                        enum: string[];
+                        default: string;
+                    };
+                    token: {
+                        type: string;
+                    };
+                };
+                required: string[];
+            };
+            execute: ({ mode, token }: {
+                mode: string;
+                token?: string;
+            }) => Promise<string | undefined>;
+        };
+        fast_search: {
+            description: string;
+            parameters: {
+                type: string;
+                properties: {
+                    query: {
+                        type: string;
+                    };
+                    mode: {
+                        type: string;
+                        enum: string[];
+                        default: string;
+                    };
+                    path: {
+                        type: string;
+                    };
+                };
+                required: string[];
+            };
+            execute: ({ query, mode, path: searchPath }: {
+                query: string;
+                mode?: string;
+                path?: string;
+            }) => Promise<string>;
+        };
+        structural_search: {
+            description: string;
+            parameters: {
+                type: string;
+                properties: {
+                    pattern: {
+                        type: string;
+                    };
+                    path: {
+                        type: string;
+                        default: string;
+                    };
+                    plan_id: {
+                        type: string;
+                    };
+                };
+                required: string[];
+            };
+            execute: ({ pattern, path: searchPath, plan_id }: {
+                pattern: string;
+                path?: string;
+                plan_id?: string;
+            }) => Promise<string>;
+        };
+        read_slice: {
+            description: string;
+            parameters: {
+                type: string;
+                properties: {
+                    file: {
+                        type: string;
+                    };
+                    start_line: {
+                        type: string;
+                    };
+                    end_line: {
+                        type: string;
+                    };
+                    plan_id: {
+                        type: string;
+                    };
+                };
+                required: string[];
+            };
+            execute: ({ file, start_line, end_line, plan_id }: {
+                file: string;
+                start_line: number;
+                end_line: number;
+                plan_id?: string;
+            }) => Promise<string>;
+        };
+        symbol_query: {
+            description: string;
+            parameters: {
+                type: string;
+                properties: {
+                    symbol: {
+                        type: string;
+                    };
+                };
+                required: string[];
+            };
+            execute: ({ symbol }: {
+                symbol: string;
+            }) => Promise<string>;
+        };
+        jump_to_symbol: {
+            description: string;
+            parameters: {
+                type: string;
+                properties: {
+                    symbol: {
+                        type: string;
+                    };
+                    plan_id: {
+                        type: string;
+                    };
+                };
+                required: string[];
+            };
+            execute: ({ symbol, plan_id }: {
+                symbol: string;
+                plan_id?: string;
+            }) => Promise<string>;
+        };
+        brief_fix_loop: {
+            description: string;
+            parameters: {
+                type: string;
+                properties: {
+                    symbol: {
+                        type: string;
+                    };
+                    intent: {
+                        type: string;
+                    };
+                };
+                required: string[];
+            };
+            execute: ({ symbol, intent }: {
+                symbol: string;
+                intent: string;
+            }) => Promise<string>;
+        };
+        prepare_patch: {
+            description: string;
+            parameters: {
+                type: string;
+                properties: {
+                    plan_id: {
+                        type: string;
+                    };
+                    message: {
+                        type: string;
+                    };
+                };
+                required: string[];
+            };
+            execute: ({ plan_id, message }: {
+                plan_id?: string;
+                message: string;
+            }) => Promise<string>;
+        };
+        validate_patch: {
+            description: string;
+            parameters: {
+                type: string;
+                properties: {
+                    patch_path: {
+                        type: string;
+                    };
+                    timeout_ms: {
+                        type: string;
+                        default: number;
+                    };
+                };
+                required: string[];
+            };
+            execute: ({ patch_path, timeout_ms }: {
+                patch_path: string;
+                timeout_ms?: number;
+            }) => Promise<string>;
+        };
+        finalize_plan: {
+            description: string;
+            parameters: {
+                type: string;
+                properties: {
+                    plan_id: {
+                        type: string;
+                    };
+                    outcome: {
+                        type: string;
+                    };
+                };
+                required: string[];
+            };
+            execute: ({ plan_id, outcome }: {
+                plan_id: string;
+                outcome: string;
+            }) => Promise<string>;
+        };
+    };
+};
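Per the declaration above, the package's default export is a `plugin()` factory whose `tools` map holds plain `{ description, parameters, execute }` objects. OpenCode normally loads and invokes these itself; the sketch below only illustrates the declared shape, assuming the runtime `dist/index.js` matches this `.d.ts` (the symbol `AuthService` is the README's placeholder example):

```javascript
// illustration only — assumes the runtime export matches the declaration above
import plugin from "opencode-autognosis";

const { tools } = plugin();

// parameters is a JSON-Schema-style descriptor; execute resolves to a string
console.log(tools.fast_search.description);

const hits = await tools.fast_search.execute({ query: "AuthService", mode: "content" });
console.log(hits); // up to 50 matching lines, or "No matches found."
```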
package/dist/system-tools.d.ts
ADDED
@@ -0,0 +1,210 @@
+export declare function systemTools(): {
+    autognosis_init: {
+        description: string;
+        parameters: {
+            type: string;
+            properties: {
+                mode: {
+                    type: string;
+                    enum: string[];
+                    default: string;
+                };
+                token: {
+                    type: string;
+                };
+            };
+            required: string[];
+        };
+        execute: ({ mode, token }: {
+            mode: string;
+            token?: string;
+        }) => Promise<string | undefined>;
+    };
+    fast_search: {
+        description: string;
+        parameters: {
+            type: string;
+            properties: {
+                query: {
+                    type: string;
+                };
+                mode: {
+                    type: string;
+                    enum: string[];
+                    default: string;
+                };
+                path: {
+                    type: string;
+                };
+            };
+            required: string[];
+        };
+        execute: ({ query, mode, path: searchPath }: {
+            query: string;
+            mode?: string;
+            path?: string;
+        }) => Promise<string>;
+    };
+    structural_search: {
+        description: string;
+        parameters: {
+            type: string;
+            properties: {
+                pattern: {
+                    type: string;
+                };
+                path: {
+                    type: string;
+                    default: string;
+                };
+                plan_id: {
+                    type: string;
+                };
+            };
+            required: string[];
+        };
+        execute: ({ pattern, path: searchPath, plan_id }: {
+            pattern: string;
+            path?: string;
+            plan_id?: string;
+        }) => Promise<string>;
+    };
+    read_slice: {
+        description: string;
+        parameters: {
+            type: string;
+            properties: {
+                file: {
+                    type: string;
+                };
+                start_line: {
+                    type: string;
+                };
+                end_line: {
+                    type: string;
+                };
+                plan_id: {
+                    type: string;
+                };
+            };
+            required: string[];
+        };
+        execute: ({ file, start_line, end_line, plan_id }: {
+            file: string;
+            start_line: number;
+            end_line: number;
+            plan_id?: string;
+        }) => Promise<string>;
+    };
+    symbol_query: {
+        description: string;
+        parameters: {
+            type: string;
+            properties: {
+                symbol: {
+                    type: string;
+                };
+            };
+            required: string[];
+        };
+        execute: ({ symbol }: {
+            symbol: string;
+        }) => Promise<string>;
+    };
+    jump_to_symbol: {
+        description: string;
+        parameters: {
+            type: string;
+            properties: {
+                symbol: {
+                    type: string;
+                };
+                plan_id: {
+                    type: string;
+                };
+            };
+            required: string[];
+        };
+        execute: ({ symbol, plan_id }: {
+            symbol: string;
+            plan_id?: string;
+        }) => Promise<string>;
+    };
+    brief_fix_loop: {
+        description: string;
+        parameters: {
+            type: string;
+            properties: {
+                symbol: {
+                    type: string;
+                };
+                intent: {
+                    type: string;
+                };
+            };
+            required: string[];
+        };
+        execute: ({ symbol, intent }: {
+            symbol: string;
+            intent: string;
+        }) => Promise<string>;
+    };
+    prepare_patch: {
+        description: string;
+        parameters: {
+            type: string;
+            properties: {
+                plan_id: {
+                    type: string;
+                };
+                message: {
+                    type: string;
+                };
+            };
+            required: string[];
+        };
+        execute: ({ plan_id, message }: {
+            plan_id?: string;
+            message: string;
+        }) => Promise<string>;
+    };
+    validate_patch: {
+        description: string;
+        parameters: {
+            type: string;
+            properties: {
+                patch_path: {
+                    type: string;
+                };
+                timeout_ms: {
+                    type: string;
+                    default: number;
+                };
+            };
+            required: string[];
+        };
+        execute: ({ patch_path, timeout_ms }: {
+            patch_path: string;
+            timeout_ms?: number;
+        }) => Promise<string>;
+    };
+    finalize_plan: {
+        description: string;
+        parameters: {
+            type: string;
+            properties: {
+                plan_id: {
+                    type: string;
+                };
+                outcome: {
+                    type: string;
+                };
+            };
+            required: string[];
+        };
+        execute: ({ plan_id, outcome }: {
+            plan_id: string;
+            outcome: string;
+        }) => Promise<string>;
+    };
+};
package/dist/system-tools.js
ADDED
@@ -0,0 +1,256 @@
+import { exec } from "node:child_process";
+import * as fs from "node:fs/promises";
+import * as fsSync from "node:fs";
+import * as path from "node:path";
+import { promisify } from "node:util";
+import { fileURLToPath } from "node:url";
+import * as crypto from "node:crypto";
+const execAsync = promisify(exec);
+const __dirname = path.dirname(fileURLToPath(import.meta.url));
+const PROJECT_ROOT = process.cwd();
+const OPENCODE_DIR = path.join(PROJECT_ROOT, ".opencode");
+const CACHE_DIR = path.join(OPENCODE_DIR, "cache");
+const ASSETS_DIR = path.join(__dirname, "../assets");
+let pendingInitToken = null;
+let pendingInitPlan = null;
+async function runCmd(cmd, cwd = PROJECT_ROOT, timeoutMs = 30000) {
+    try {
+        const { stdout, stderr } = await execAsync(cmd, { cwd, maxBuffer: 10 * 1024 * 1024, timeout: timeoutMs });
+        return { stdout: stdout.trim(), stderr: stderr.trim() };
+    }
+    catch (error) {
+        if (error.signal === 'SIGTERM' && error.code === undefined) {
+            return { stdout: "", stderr: `Command timed out after ${timeoutMs}ms`, error, timedOut: true };
+        }
+        return { stdout: "", stderr: error.message, error };
+    }
+}
+async function checkBinary(bin) {
+    const { error } = await runCmd(`${bin} --version`, PROJECT_ROOT, 5000);
+    return !error;
+}
+async function getPatterns() {
+    const patternsPath = path.join(ASSETS_DIR, "patterns.json");
+    try {
+        const content = await fs.readFile(patternsPath, "utf-8");
+        return JSON.parse(content).patterns || [];
+    }
+    catch (e) {
+        return [];
+    }
+}
+async function ensureCache() {
+    await fs.mkdir(CACHE_DIR, { recursive: true });
+}
+async function cleanCache() {
+    try {
+        const files = await fs.readdir(CACHE_DIR);
+        const now = Date.now();
+        const MAX_AGE = 7 * 24 * 60 * 60 * 1000;
+        let deleted = 0;
+        for (const file of files) {
+            const filePath = path.join(CACHE_DIR, file);
+            const stats = await fs.stat(filePath);
+            if (now - stats.mtimeMs > MAX_AGE) {
+                await fs.unlink(filePath);
+                deleted++;
+            }
+        }
+        return deleted;
+    }
+    catch (e) {
+        return 0;
+    }
+}
+async function maintainSymbolIndex() {
+    await ensureCache();
+    if (!(await checkBinary("ctags"))) {
+        return { rebuilt: false, status: "unavailable", reason: "ctags binary missing" };
+    }
+    const tagsFile = path.join(CACHE_DIR, "tags");
+    const fingerprintFile = path.join(CACHE_DIR, "tags.fingerprint");
+    const { stdout: head } = await runCmd("git rev-parse HEAD");
+    const { stdout: status } = await runCmd("git status --porcelain");
+    const currentFingerprint = `${head}\n${status}`;
+    let storedFingerprint = "";
+    try {
+        storedFingerprint = await fs.readFile(fingerprintFile, "utf-8");
+    }
+    catch (e) { }
+    if (currentFingerprint !== storedFingerprint || !fsSync.existsSync(tagsFile)) {
+        const { error, stderr } = await runCmd(`ctags -R -f ${tagsFile} --languages=TypeScript,JavaScript,Python,Go,Rust,C++,C .`, PROJECT_ROOT);
+        if (error) {
+            return { rebuilt: false, status: "failed", reason: stderr };
+        }
+        await fs.writeFile(fingerprintFile, currentFingerprint);
+        return { rebuilt: true, status: "ok" };
+    }
+    return { rebuilt: false, status: "ok" };
+}
+export function systemTools() {
+    return {
+        autognosis_init: {
+            description: "Initialize or check the Autognosis environment. Two-phase: 'plan' (default) generates a token, 'apply' executes it.",
+            parameters: { type: "object", properties: { mode: { type: "string", enum: ["plan", "apply"], default: "plan" }, token: { type: "string" } }, required: ["mode"] },
+            execute: async ({ mode = "plan", token }) => {
+                if (mode === "plan") {
+                    const checks = { rg: await checkBinary("rg"), fd: await checkBinary("fd"), sg: await checkBinary("sg"), ctags: await checkBinary("ctags"), git: await checkBinary("git") };
+                    const actions = [];
+                    if (!fsSync.existsSync(CACHE_DIR))
+                        actions.push(`Create cache directory: ${CACHE_DIR}`);
+                    else
+                        actions.push(`Verify cache directory: ${CACHE_DIR} (exists)`);
+                    const newToken = crypto.randomBytes(4).toString("hex");
+                    pendingInitToken = newToken;
+                    pendingInitPlan = { checks, actions };
+                    return JSON.stringify({ status: "PLAN_READY", system_checks: checks, planned_actions: actions, confirm_token: newToken, instruction: "Review the plan. To execute, call autognosis_init(mode='apply', token='...')" }, null, 2);
+                }
+                else if (mode === "apply") {
+                    if (!pendingInitToken || !token || token !== pendingInitToken)
+                        return JSON.stringify({ status: "ERROR", message: "Invalid or expired confirmation token. Run mode='plan' first." });
+                    await ensureCache();
+                    pendingInitToken = null;
+                    pendingInitPlan = null;
+                    return JSON.stringify({ status: "SUCCESS", message: "Autognosis initialized.", cache_dir: CACHE_DIR });
+                }
+            }
+        },
+        fast_search: {
+            description: "Fast content or filename search using ripgrep (rg) and fd.",
+            parameters: { type: "object", properties: { query: { type: "string" }, mode: { type: "string", enum: ["filename", "content"], default: "filename" }, path: { type: "string" } }, required: ["query"] },
+            execute: async ({ query, mode = "filename", path: searchPath = "." }) => {
+                if (mode === "content") {
+                    if (!(await checkBinary("rg")))
+                        return "Error: 'rg' (ripgrep) is not installed. fast_search content mode unavailable.";
+                    const { stdout } = await runCmd(`rg -n --column "${query}" "${searchPath}"`);
+                    if (!stdout)
+                        return "No matches found.";
+                    return stdout.split('\n').slice(0, 50).join('\n') + (stdout.split('\n').length > 50 ? "\n... (truncated)" : "");
+                }
+                else {
+                    if (!(await checkBinary("fd")))
+                        return "Error: 'fd' is not installed. fast_search filename mode unavailable.";
+                    const { stdout } = await runCmd(`fd "${query}" "${searchPath}"`);
+                    if (!stdout)
+                        return "No files found.";
+                    return stdout.split('\n').slice(0, 50).join('\n') + (stdout.split('\n').length > 50 ? "\n... (truncated)" : "");
+                }
+            }
+        },
+        structural_search: {
+            description: "Search code using ast-grep patterns or patterns.json IDs.",
+            parameters: { type: "object", properties: { pattern: { type: "string" }, path: { type: "string", default: "." }, plan_id: { type: "string" } }, required: ["pattern"] },
+            execute: async ({ pattern, path: searchPath = ".", plan_id }) => {
+                if (!(await checkBinary("sg")))
+                    return JSON.stringify({ error: "Degraded Mode: 'sg' (ast-grep) not found.", results: [], plan_id: plan_id || "OFF-PLAN" });
+                const knownPatterns = await getPatterns();
+                const known = knownPatterns.find((p) => p.name === pattern);
+                const cmd = known ? `sg scan -p "${known.pattern}" "${searchPath}" --json` : `sg scan -p "${pattern}" "${searchPath}" --json`;
+                const { stdout } = await runCmd(cmd);
+                let results = [];
+                try {
+                    results = JSON.parse(stdout);
+                }
+                catch (e) { }
+                return JSON.stringify({ results: results.slice(0, 50), truncated: results.length > 50, plan_id: plan_id || "OFF-PLAN" }, null, 2);
+            }
+        },
+        read_slice: {
+            description: "Read a specific slice of a file.",
+            parameters: { type: "object", properties: { file: { type: "string" }, start_line: { type: "number" }, end_line: { type: "number" }, plan_id: { type: "string" } }, required: ["file", "start_line", "end_line"] },
+            execute: async ({ file, start_line, end_line, plan_id }) => {
+                const { stdout, stderr } = await runCmd(`sed -n '${start_line},${end_line}p;${end_line + 1}q' "${file}"`);
+                if (stderr)
+                    return `Error reading slice: ${stderr}`;
+                return JSON.stringify({ file, start_line, end_line, content: stdout, plan_id: plan_id || "OFF-PLAN" }, null, 2);
+            }
+        },
+        symbol_query: {
+            description: "Query the symbol index. Rebuilds automatically if stale.",
+            parameters: { type: "object", properties: { symbol: { type: "string" } }, required: ["symbol"] },
+            execute: async ({ symbol }) => {
+                const maint = await maintainSymbolIndex();
+                if (maint.status === "unavailable")
+                    return JSON.stringify({ error: "Symbol index unavailable", reason: maint.reason });
+                const tagsFile = path.join(CACHE_DIR, "tags");
+                if (await checkBinary("readtags")) {
+                    const { stdout } = await runCmd(`readtags -t "${tagsFile}" "${symbol}"`);
+                    return JSON.stringify({ matches: stdout.split('\n').filter(Boolean), metadata: maint }, null, 2);
+                }
+                else {
+                    const { stdout: grepOut } = await runCmd(`grep -P "^${symbol}\t" "${tagsFile}"`);
+                    return JSON.stringify({ matches: grepOut.split('\n').filter(Boolean), metadata: maint, note: "using_grep_fallback" }, null, 2);
+                }
+            }
+        },
+        jump_to_symbol: {
+            description: "Jump to a symbol's definition by querying the index and reading the slice.",
+            parameters: { type: "object", properties: { symbol: { type: "string" }, plan_id: { type: "string" } }, required: ["symbol"] },
+            execute: async ({ symbol, plan_id }) => {
+                const maint = await maintainSymbolIndex();
+                if (maint.status !== "ok")
+                    return JSON.stringify({ error: "Index unavailable", reason: maint.reason });
+                const tagsFile = path.join(CACHE_DIR, "tags");
+                const { stdout: tagLine } = await runCmd(`grep -P "^${symbol}\t" "${tagsFile}" | head -n 1`);
+                if (!tagLine)
+                    return JSON.stringify({ found: false, symbol });
+                const parts = tagLine.split('\t');
+                const file = parts[1];
+                let line = 1;
+                const { stdout: grepLine } = await runCmd(`grep -n "${symbol}" "${file}" | head -n 1`);
+                if (grepLine)
+                    line = parseInt(grepLine.split(':')[0], 10);
+                const start = Math.max(1, line - 5);
+                const end = line + 15;
+                const { stdout: slice } = await runCmd(`sed -n '${start},${end}p;${end + 1}q' "${file}"`);
+                return JSON.stringify({ symbol, resolved_location: { file, line }, slice: { start, end, content: slice }, plan_id: plan_id || "OFF-PLAN" }, null, 2);
+            }
+        },
+        brief_fix_loop: {
+            description: "The Action Planner. Generates a bounded worklist for a symbol and intent.",
+            parameters: { type: "object", properties: { symbol: { type: "string" }, intent: { type: "string" } }, required: ["symbol", "intent"] },
+            execute: async ({ symbol, intent }) => {
+                const planId = `plan-${Date.now()}-${crypto.randomBytes(2).toString("hex")}`;
+                return JSON.stringify({ plan_id: planId, symbol, intent, status: "PLAN_GENERATED", instructions: "Use this plan_id for all subsequent operations.", mock_worklist: [{ file: "src/example.ts", reason: "Direct dependency", heat: "high" }] }, null, 2);
+            }
+        },
+        prepare_patch: {
+            description: "Generate a .diff artifact for the current changes, tied to a plan.",
+            parameters: { type: "object", properties: { plan_id: { type: "string" }, message: { type: "string" } }, required: ["message"] },
+            execute: async ({ plan_id, message }) => {
+                await ensureCache();
+                const patchPath = path.join(CACHE_DIR, `patch-${Date.now()}.diff`);
+                const { stdout } = await runCmd("git diff");
+                if (!stdout)
+                    return "No changes to patch.";
+                const content = `// META: plan_id=${plan_id || "NONE"}\n// META: message=${message}\n\n${stdout}`;
+                await fs.writeFile(patchPath, content);
+                return `Patch saved to ${patchPath}`;
+            }
+        },
+        validate_patch: {
+            description: "Validate a patch by applying it in a fresh worktree. Enforces timeout.",
+            parameters: { type: "object", properties: { patch_path: { type: "string" }, timeout_ms: { type: "number", default: 30000 } }, required: ["patch_path"] },
+            execute: async ({ patch_path, timeout_ms = 30000 }) => {
+                const start = Date.now();
+                const { error, timedOut } = await runCmd(`git apply --check "${patch_path}"`, PROJECT_ROOT, timeout_ms);
+                if (timedOut)
+                    return JSON.stringify({ status: "TIMEOUT", checked: ["git_apply_check"], failed_at: "git_apply_check", duration: Date.now() - start });
+                if (error)
+                    return JSON.stringify({ status: "FAILED", checks: { git_apply_check: "failed" }, error: error.message });
+                return JSON.stringify({ status: "SUCCESS", checks: { git_apply_check: "passed" }, duration: Date.now() - start, note: "Patch is valid against current HEAD." }, null, 2);
+            }
+        },
+        finalize_plan: {
+            description: "Finalize a plan, logging metrics and cleaning cache.",
+            parameters: { type: "object", properties: { plan_id: { type: "string" }, outcome: { type: "string" } }, required: ["plan_id", "outcome"] },
+            execute: async ({ plan_id, outcome }) => {
+                await ensureCache();
+                const report = { plan_id, outcome, time: new Date().toISOString() };
+                await fs.appendFile(path.join(CACHE_DIR, "gaps.jsonl"), JSON.stringify(report) + "\n");
+                const deleted = await cleanCache();
+                return `Plan finalized. Metrics logged. Cache hygiene: deleted ${deleted} old items.`;
+            }
+        }
+    };
+}
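The `autognosis_init` implementation above is a two-phase handshake: `plan` mode only reports binary checks and intended actions and hands back a one-shot `confirm_token`, and `apply` mode refuses to create the cache directory unless it receives that exact token. A sketch of the round trip (hypothetical driver script; the relative import assumes a local checkout, since the published `exports` map only exposes the package root):

```javascript
// hypothetical driver script, run from a checkout of this package
import { systemTools } from "./dist/system-tools.js";

const tools = systemTools();

// Phase 1: plan — no filesystem changes yet
const plan = JSON.parse(await tools.autognosis_init.execute({ mode: "plan" }));
console.log(plan.system_checks);   // { rg: true, fd: true, sg: ..., ctags: ..., git: ... }
console.log(plan.planned_actions); // e.g. "Create cache directory: <cwd>/.opencode/cache"

// Phase 2: apply — must echo back the token issued by the plan phase
const result = JSON.parse(
  await tools.autognosis_init.execute({ mode: "apply", token: plan.confirm_token })
);
console.log(result.status); // "SUCCESS"; a wrong or reused token yields "ERROR"
```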
package/package.json
ADDED
@@ -0,0 +1,24 @@
+{
+  "name": "opencode-autognosis",
+  "version": "0.1.0",
+  "description": "Transforms OpenCode agents into 'miniature engineers' with deep codebase awareness. Includes fast structural search (ast-grep), instant symbol navigation (ctags), and a disciplined 'Plan → Execute → Patch' workflow.",
+  "type": "module",
+  "exports": {
+    ".": "./dist/index.js"
+  },
+  "types": "./dist/index.d.ts",
+  "files": [
+    "dist",
+    "assets",
+    "LICENSE",
+    "README.md"
+  ],
+  "scripts": {
+    "build": "tsc -p tsconfig.json",
+    "prepublishOnly": "npm run build"
+  },
+  "devDependencies": {
+    "@types/node": "^25.1.0",
+    "typescript": "^5.0.0"
+  }
+}