@dastbal/nestjs-ai-agent 1.0.0 → 1.0.2
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +6 -112
- package/dist/bin/cli.js +1 -1
- package/dist/core/agent/factory.d.ts +9 -0
- package/dist/core/rag/indexer.d.ts +8 -1
- package/dist/core/rag/indexer.js +14 -7
- package/dist/core/rag/retriever.d.ts +15 -4
- package/dist/core/rag/retriever.js +108 -82
- package/dist/core/tools/tools.d.ts +22 -0
- package/dist/core/tools/tools.js +80 -27
- package/package.json +3 -3
package/README.md
CHANGED

@@ -1,116 +1,10 @@
-
-
-
-
-
- # @dastbal/nestjs-ai-agent π€
-
- Autonomous AI Agent acting as a **Principal Software Engineer** for NestJS projects.
-
- ## Features
- - **The Surgeon Rule**: Reads and analyzes files before writing code.
- - **RAG Powered**: Uses your codebase as context for precise implementations.
- - **SQLite Persistence**: Remembers conversation history between sessions.
- - **Self-Correcting**: Automatically runs integrity checks and fixes compilation errors.
-
- ## Installation
- ```bash
+ @dastbal/nestjs-ai-agent π§ββοΈ
+ Autonomous Principal Software Engineer for NestJS
+ Transform your NestJS development with an agent that doesn't just "chat", but operates directly on your codebase with Senior-level precision.
+ π Quick Start
+ Bash
+ # Install the agent
  npm install @dastbal/nestjs-ai-agent

-
-
-
-
- This agent uses Google Vertex AI. To allow the agent to operate, you must provide your Google Cloud credentials.
-
- Create your credentials file: Place your Google Service Account JSON file in the root of your project and name it exactly: credentials_vertex.json
-
- Setup your environment: Create a .env file in your project root and point to that file:
-
- Code snippet
- GOOGLE_APPLICATION_CREDENTIALS="./credentials_vertex.json"
- # Optional: specify your project and location
+ # Run your first command
+ npx gen "Create a new Payments service with DDD patterns"
+ β¨ Key Features
+ | Feature | Description |
+ | --- | --- |
+ | π RAG Search | Semantic search across your entire codebase before proposing changes. |
+ | π©Ί The Surgeon Rule | Never overwrites a file without reading and analyzing it first. |
+ | β Self-Healing | Runs integrity checks and auto-fixes compilation errors (up to 3 retries). |
+ | πΎ Safe Writes | Automatic backups before any file modification. |
+ | π§ SQLite Memory | Remembers conversation threads and preferences across restarts. |
+ π Configuration
+ This agent leverages Google Vertex AI. To start, you need to provide your Google Cloud credentials.
+ Credentials File: Place your Google Service Account JSON in the root folder.
+ Naming: Name it exactly credentials_vertex.json.
+ Environment: Add the following to your .env file:
+ Code snippet
+ GOOGLE_APPLICATION_CREDENTIALS="./credentials_vertex.json"
+ # Optional config
  GCP_PROJECT_ID="your-project-id"
  GCP_LOCATION="us-central1"
-
-
- ## Introduction
-
- I am a specialized AI, a Principal Software Engineer, crafted to assist you in your NestJS projects. I operate directly on the project's local file system, enabling me to read, understand, and modify your codebase with precision and efficiency. My goal is to help you write clean, well-documented, and testable code, adhering to the best practices of NestJS and Domain-Driven Design (DDD).
-
- ## Core Functionality
-
- Here's a breakdown of how I work:
-
- 1. **Understanding Your Needs:** I start by carefully interpreting your requests. I clarify any ambiguities and break down complex tasks into manageable steps.
-
- 2. **Code Exploration (π `ask_codebase`):**
-    * Before making any changes, I use the `ask_codebase` tool. This is my primary research instrument.
-    * It allows me to perform semantic searches and analyze the project's dependency graph.
-    * I use it to find relevant code snippets, understand how different parts of the system interact, and locate the exact file paths I need.
-
- 3. **Code Comprehension (π `safe_read_file`):**
-    * When I need to modify a file, I *always* read its contents first using `safe_read_file`.
-    * This ensures I understand the existing code, its structure, and any existing documentation (TSDocs).
-    * This is crucial for avoiding regressions and maintaining code quality.
-
- 4. **Code Generation & Modification (βοΈ):**
-    * Based on your instructions and my understanding of the code, I generate or modify code.
-    * I adhere to strict TypeScript typing, DDD principles, and NestJS best practices.
-    * I also create corresponding tests to ensure code reliability.
-
- 5. **Code Writing (πΎ `safe_write_file`):**
-    * I write the generated or modified code to the file system using `safe_write_file`.
-    * This tool automatically creates a backup of the original file, providing a safety net.
-
- 6. **Integrity Validation (β `run_integrity_check`):**
-    * Immediately after writing or modifying code, I run `run_integrity_check`.
-    * This is a critical step. It uses the TypeScript compiler to ensure that the code is type-safe and compiles without errors.
-    * This helps catch any mistakes I might have made.
-
- 7. **Self-Correction (π οΈ):**
-    * If the integrity check fails, I analyze the error messages and attempt to fix the code myself.
-    * I will retry up to three times. If I cannot fix the error, I will ask for human assistance.
-
- 8. **Iteration and Refinement (π):**
-    * I repeat these steps as needed, refining the code and ensuring it meets your requirements.
-
- 9. **Documentation (π):**
-    * I always strive to maintain and improve code documentation (TSDocs).
-
- 10. **Learning and Adaptation (π§ ):**
-     * I learn from your feedback. If you correct a style preference, I store it in `/memories/style-guide.txt` to improve future responses.
-
- ## Tools and Technologies
-
- I am built upon a foundation of powerful tools and technologies:
-
- * **Language Model:** I leverage a large language model (LLM) from Google, enabling me to understand and generate human-like text.
- * **Retrieval-Augmented Generation (RAG):** I utilize a live RAG system. This means I can dynamically access and process information from your codebase to provide accurate and relevant responses. This allows me to understand the context of your project and generate code that fits seamlessly.
- * **NestJS Expertise:** I have been specifically trained on NestJS best practices, DDD, and related concepts.
- * **TypeScript:** I am fluent in TypeScript and adhere to strict typing.
- * **Libraries:** I utilize a suite of libraries for code analysis, generation, and validation.
- * **File System Access:** I have secure access to the project's local file system, allowing me to directly modify your code.
-
- ## Safety and Quality
-
- * **Strict Typing:** I enforce strict TypeScript typing to prevent common errors.
- * **Comprehensive Testing:** I create tests alongside the code to ensure its reliability.
- * **Error Handling:** I use standard NestJS HTTP exceptions and avoid swallowing errors silently.
- * **Code Reviews:** I am designed to be used in conjunction with human code reviews to ensure the highest quality.
-
- ## How to Interact with Me
-
- Simply provide me with clear instructions, and I will do my best to assist you. For example:
-
- * "Add a new DTO for the User entity."
- * "Create a service to handle authentication."
- * "Write a unit test for the UserService."
-
- I will guide you through the process, providing feedback and ensuring the code meets your requirements.
-
- ## Disclaimer
-
- I am an AI assistant and should be used responsibly. Always review the code I generate before deploying it to production. I am constantly learning and improving, but I am not perfect. Please report any issues or suggestions.
-
- ## Let's Build Something Amazing! β¨
+ [!CAUTION] Security First: Always add credentials_vertex.json and .env to your .gitignore.
+ π οΈ Internal Workflow
+ The agent follows a strict Principal Engineer protocol:
+ Research: Uses ask_codebase to find patterns and dependencies.
+ Comprehension: Reads existing logic with safe_read_file to avoid regressions.
+ Implementation: Writes code following DDD, Strict Typing (No any), and TSDocs.
+ Validation: Runs run_integrity_check (TypeScript compiler) immediately.
+ Human-in-the-loop: Pauses for approval before critical disk operations.
+ π‘ Usage Examples
+ Try these commands to see the agent in action:
+ Scaffolding: "Create a UserEntity with email and password fields using TypeORM"
+ Logic: "Add a validation pipe to the login DTO"
+ Testing: "Write a unit test for the AuthService including mocks for the repository"
+ Refactoring: "Standardize all HTTP exceptions in the users controller"
+ π§ Learning & Adaptation
+ The agent learns from your feedback. If you provide a style correction, it stores it in .agent/memories/style-guide.txt to ensure future code aligns with your personal or team preferences.
+ π License
+ Released under the MIT License. Build something amazing! β¨
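Note on the configuration shown above: the agent authenticates through Google Application Default Credentials, so the service-account file referenced by GOOGLE_APPLICATION_CREDENTIALS is picked up automatically by the Vertex AI client. A minimal sketch of how those .env values are typically consumed (this assumes `@langchain/google-vertexai` and an illustrative model name; the package's own provider wiring lives in `dist/core/llm/provider.js` and may differ):

```typescript
// Sketch only: how the .env values above are typically consumed.
// Assumes @langchain/google-vertexai; model name and option shape are illustrative.
import "dotenv/config"; // loads GOOGLE_APPLICATION_CREDENTIALS, GCP_PROJECT_ID, GCP_LOCATION
import { ChatVertexAI } from "@langchain/google-vertexai";

// Credentials come from the service-account file referenced by
// GOOGLE_APPLICATION_CREDENTIALS (Application Default Credentials).
const llm = new ChatVertexAI({
  model: "gemini-1.5-pro", // illustrative model name
  location: process.env.GCP_LOCATION ?? "us-central1",
});

const reply = await llm.invoke("Summarize what a NestJS provider is in one sentence.");
console.log(reply.content);
```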
package/dist/bin/cli.js
CHANGED

@@ -64,7 +64,7 @@ program
      const threadId = "cli-user-session";
      const agent = await factory_1.AgentFactory.create(threadId);
      log.ai(`Procesando: "${instruction}"`);
-     const response = await agent.invoke({ messages: [{ role: "user", content: instruction }] }, { configurable: { thread_id: threadId } });
+     const response = await agent.invoke({ messages: [{ role: "user", content: instruction }] }, { configurable: { thread_id: threadId }, recursionLimit: 50 });
      const lastMessage = response.messages[response.messages.length - 1];
      if (lastMessage && lastMessage.content) {
          console.log("\n" + chalk_1.default.green("--- RESPUESTA DEL AGENTE ---"));
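Note on the cli.js change above: `recursionLimit: 50` is the standard LangGraph config knob that caps how many supersteps a single `invoke` may execute; once the cap is hit the run aborts with a `GraphRecursionError` instead of cycling between model and tools forever. A self-contained sketch of that behavior with a deliberately looping graph (illustrative only, not the package's agent):

```typescript
// Illustrative LangGraph loop showing what `recursionLimit` does; not the package's agent.
import { Annotation, StateGraph, START, GraphRecursionError } from "@langchain/langgraph";

const State = Annotation.Root({
  steps: Annotation<number>({ reducer: (a, b) => a + b, default: () => 0 }),
});

const graph = new StateGraph(State)
  .addNode("plan", () => ({ steps: 1 })) // stand-in for an LLM "reasoning" step
  .addNode("act", () => ({ steps: 1 }))  // stand-in for a tool call
  .addEdge(START, "plan")
  .addEdge("plan", "act")
  .addEdge("act", "plan")                // loops forever without a limit
  .compile();

try {
  await graph.invoke({ steps: 0 }, { recursionLimit: 50 }); // the knob the CLI now sets
} catch (err) {
  if (err instanceof GraphRecursionError) {
    console.log("Aborted after 50 supersteps instead of looping indefinitely.");
  }
}
```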
package/dist/core/agent/factory.d.ts
CHANGED

@@ -1,24 +1,33 @@
  export declare class AgentFactory {
      static create(threadId?: string): Promise<import("langchain").ReactAgent<import("langchain").AgentTypeConfig<import("langchain").ResponseFormatUndefined, undefined, import("langchain").AnyAnnotationRoot, readonly import("langchain").AgentMiddleware<any, any, any, readonly (import("@langchain/core/tools").ClientTool | import("@langchain/core/tools").ServerTool)[]>[], readonly [import("langchain").DynamicStructuredTool<import("zod").ZodObject<{
          query: import("zod").ZodString;
+         projectRoot: import("zod").ZodOptional<import("zod").ZodString>;
      }, import("zod/v4/core").$strip>, {
          query: string;
+         projectRoot?: string | undefined;
      }, {
          query: string;
+         projectRoot?: string | undefined;
      }, string, "ask_codebase">, import("langchain").DynamicStructuredTool<import("zod").ZodObject<{}, import("zod/v4/core").$strip>, Record<string, never>, Record<string, never>, string, "run_integrity_check">, import("langchain").DynamicStructuredTool<import("zod").ZodObject<{
          filePath: import("zod").ZodString;
          content: import("zod").ZodString;
+         projectRoot: import("zod").ZodOptional<import("zod").ZodString>;
      }, import("zod/v4/core").$strip>, {
          filePath: string;
          content: string;
+         projectRoot?: string | undefined;
      }, {
          filePath: string;
          content: string;
+         projectRoot?: string | undefined;
      }, string, "safe_write_file">, import("langchain").DynamicStructuredTool<import("zod").ZodObject<{
          filePath: import("zod").ZodString;
+         projectRoot: import("zod").ZodOptional<import("zod").ZodString>;
      }, import("zod/v4/core").$strip>, {
          filePath: string;
+         projectRoot?: string | undefined;
      }, {
          filePath: string;
+         projectRoot?: string | undefined;
      }, string, "safe_read_file">, import("langchain").DynamicStructuredTool<import("zod").ZodObject<{}, import("zod/v4/core").$strip>, Record<string, never>, Record<string, never>, string, "refresh_project_index">]>>>;
  }
package/dist/core/rag/indexer.d.ts
CHANGED

@@ -12,26 +12,33 @@ export declare class IndexerService {
      /**
       * Main Entry Point: Scans the project and updates the brain.
       * Scans files, checks hashes, generates embeddings, and saves the knowledge graph.
-      *
+      * @param sourceDir - Relative path to source code (usually 'src'). Defaults to 'src'.
       */
      indexProject(sourceDir?: string): Promise<void>;
      /**
       * Processes a single file: Reads content, Calculates Hash, Parses AST,
       * Updates Registry, and Accumulates Chunks.
+      * @param filePath - The relative path of the file to process.
+      * @param chunkAccumulator - An array to accumulate processed code chunks.
+      * @param edgeAccumulator - An array to accumulate dependency graph edges.
       */
      private processSingleFile;
      /**
       * Generates embeddings using Vertex AI and saves them to SQLite in transactions.
+      * @param allChunks - An array of all processed chunks to embed and save.
       */
      private embedAndSaveBatches;
      /**
       * Persists dependency relationships into the graph table.
       * Uses 'INSERT OR IGNORE' to prevent duplicates without errors.
+      * @param edges - An array of GraphEdge objects to save.
       */
      private saveGraph;
      /**
       * Recursively gets all .ts files in a directory.
       * Returns RELATIVE paths (e.g., 'src/users/users.service.ts') to ensure consistency in DB.
+      * @param dir - The directory to search in.
+      * @param fileList - An accumulator for the list of files found.
       */
      private getAllFiles;
  }
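Note on the indexer declaration above: `indexProject(sourceDir?)` is the only public entry point; hashing, AST chunking, embedding, and graph persistence stay behind private methods. A usage sketch (the deep import path is an assumption based on the dist layout shown in this diff):

```typescript
// Usage sketch based on the declaration above; the deep import path is assumed
// from the dist layout shown in this diff and may differ in practice.
import { IndexerService } from "@dastbal/nestjs-ai-agent/dist/core/rag/indexer";

const indexer = new IndexerService();

// Scans ./src relative to process.cwd(), re-embeds files whose hash changed,
// and persists chunks plus the dependency graph to the local SQLite store.
await indexer.indexProject("src");
```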
package/dist/core/rag/indexer.js
CHANGED

@@ -56,7 +56,7 @@ class IndexerService {
      /**
       * Main Entry Point: Scans the project and updates the brain.
       * Scans files, checks hashes, generates embeddings, and saves the knowledge graph.
-      *
+      * @param sourceDir - Relative path to source code (usually 'src'). Defaults to 'src'.
       */
      async indexProject(sourceDir = 'src') {
          const rootDir = process.cwd();

@@ -100,6 +100,9 @@ class IndexerService {
      /**
       * Processes a single file: Reads content, Calculates Hash, Parses AST,
       * Updates Registry, and Accumulates Chunks.
+      * @param filePath - The relative path of the file to process.
+      * @param chunkAccumulator - An array to accumulate processed code chunks.
+      * @param edgeAccumulator - An array to accumulate dependency graph edges.
       */
      processSingleFile(filePath, chunkAccumulator, edgeAccumulator) {
          try {

@@ -131,6 +134,7 @@ class IndexerService {
      }
      /**
       * Generates embeddings using Vertex AI and saves them to SQLite in transactions.
+      * @param allChunks - An array of all processed chunks to embed and save.
       */
      async embedAndSaveBatches(allChunks) {
          console.log(`π§ Generating Embeddings for ${allChunks.length} chunks...`);

@@ -150,9 +154,9 @@ class IndexerService {
          const embeddingsModel = provider_1.LLMProvider.getEmbeddingsModel();
          const vectors = await embeddingsModel.embedDocuments(textsToEmbed);
          // 3. Save to DB (Transaction for performance)
-         const insertChunk = this.db.prepare(`
-             INSERT OR REPLACE INTO code_chunks (id, file_path, chunk_type, content, vector_json, metadata)
-             VALUES (?, ?, ?, ?, ?, ?)
+         const insertChunk = this.db.prepare(`
+             INSERT OR REPLACE INTO code_chunks (id, file_path, chunk_type, content, vector_json, metadata)
+             VALUES (?, ?, ?, ?, ?, ?)
          `);
          // Explicitly typed transaction callback to fix TS7006
          const insertMany = this.db.transaction((chunks, vectors) => {

@@ -174,13 +178,14 @@ class IndexerService {
      /**
       * Persists dependency relationships into the graph table.
       * Uses 'INSERT OR IGNORE' to prevent duplicates without errors.
+      * @param edges - An array of GraphEdge objects to save.
       */
      saveGraph(edges) {
          if (!edges || edges.length === 0)
              return;
-         const insertEdge = this.db.prepare(`
-             INSERT OR IGNORE INTO dependency_graph (source, target, relation)
-             VALUES (?, ?, ?)
+         const insertEdge = this.db.prepare(`
+             INSERT OR IGNORE INTO dependency_graph (source, target, relation)
+             VALUES (?, ?, ?)
          `);
          // Explicitly typed transaction callback to fix TS7006
          const runMany = this.db.transaction((edges) => {

@@ -191,6 +196,8 @@ class IndexerService {
      /**
       * Recursively gets all .ts files in a directory.
       * Returns RELATIVE paths (e.g., 'src/users/users.service.ts') to ensure consistency in DB.
+      * @param dir - The directory to search in.
+      * @param fileList - An accumulator for the list of files found.
       */
      getAllFiles(dir, fileList = []) {
          const files = fs.readdirSync(dir);
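Note on the indexer implementation above: the save path pairs a prepared `INSERT OR REPLACE` statement with a synchronous `transaction()` callback so an entire batch of chunks commits atomically. The `prepare`/`transaction` API matches better-sqlite3, which the following sketch assumes (schema and data are illustrative, not the package's exact tables):

```typescript
// Sketch of the prepare + transaction batching pattern; assumes better-sqlite3.
// Schema and data are illustrative.
import Database from "better-sqlite3";

interface Chunk {
  id: string;
  filePath: string;
  content: string;
  vector: number[];
}

const db = new Database(":memory:");
db.exec(`CREATE TABLE code_chunks (
  id TEXT PRIMARY KEY, file_path TEXT, content TEXT, vector_json TEXT)`);

const insertChunk = db.prepare(
  "INSERT OR REPLACE INTO code_chunks (id, file_path, content, vector_json) VALUES (?, ?, ?, ?)"
);

// transaction() wraps the callback so every insert commits (or rolls back) as one unit,
// which is what keeps bulk indexing fast and consistent.
const insertMany = db.transaction((chunks: Chunk[]) => {
  for (const c of chunks) {
    insertChunk.run(c.id, c.filePath, c.content, JSON.stringify(c.vector));
  }
});

insertMany([
  { id: "users.service#findAll", filePath: "src/users/users.service.ts", content: "findAll() { ... }", vector: [0.1, 0.2, 0.3] },
]);
```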
package/dist/core/rag/retriever.d.ts
CHANGED

@@ -3,17 +3,25 @@ interface SearchResult {
      chunk: ProcessedChunk;
      score: number;
  }
+ /**
+  * @description Service responsible for retrieving relevant code context from the codebase using vector search and dependency graph analysis.
+  */
  export declare class RetrieverService {
      private db;
      /**
       * Searches the codebase using Vector Embeddings (Cosine Similarity).
-      * @param query - The natural language query.
-      * @param limit -
+      * @param query - The natural language query to search for.
+      * @param limit - The maximum number of chunks to retrieve. Defaults to 5.
+      * @param projectRoot - Optional. The project root directory. This parameter provides context for the search and ensures consistency with how paths were indexed. Defaults to process.cwd().
+      * @returns {Promise<SearchResult[]>} A promise that resolves to an array of search results, each containing a code chunk and its relevance score.
       */
-     query(query: string, limit?: number): Promise<SearchResult[]>;
+     query(query: string, limit?: number, projectRoot?: string): Promise<SearchResult[]>;
      /**
       * Retrieves the 'Graph Dependencies' for a specific file from the DB.
       * This allows the Agent to know what other files are related (DTOs, Interfaces).
+      * @param sourcePath - The path to the source file (expected to be relative to projectRoot).
+      * @param projectRoot - Optional. The project root directory. This parameter provides context for interpreting paths. Defaults to process.cwd().
+      * @returns {string[]} An array of paths representing the dependencies of the source file.
       */
      private getDependencies;
      /**

@@ -22,7 +30,10 @@ export declare class RetrieverService {
       * 1. The matched code snippets (Vector Search).
       * 2. The file's dependencies (Graph Search).
       * 3. Explicit File Paths to encourage using 'read_file'.
+      * @param query - The search query.
+      * @param projectRoot - Optional. The project root directory. This parameter provides context for the search and ensures consistency with how paths were indexed. Defaults to process.cwd().
+      * @returns {Promise<string>} A promise that resolves to the formatted context report string.
       */
-     getContextForLLM(query: string): Promise<string>;
+     getContextForLLM(query: string, projectRoot?: string): Promise<string>;
  }
  export {};
package/dist/core/rag/retriever.js
CHANGED

@@ -32,68 +32,89 @@ var __importStar = (this && this.__importStar) || (function () {
      return result;
  };
  })();
+ var __importDefault = (this && this.__importDefault) || function (mod) {
+     return (mod && mod.__esModule) ? mod : { "default": mod };
+ };
  Object.defineProperty(exports, "__esModule", { value: true });
  exports.RetrieverService = void 0;
  const db_1 = require("../state/db");
  const provider_1 = require("../llm/provider");
  const math_1 = require("./math");
  const path = __importStar(require("path"));
+ const chalk_1 = __importDefault(require("chalk")); // Import chalk for colored logs
+ const log = {
+     tool: (msg) => console.log(chalk_1.default.yellow('π οΈ [TOOL]: ') + msg),
+     error: (msg) => console.log(chalk_1.default.red('β [ERR]: ') + msg),
+ };
+ /**
+  * @description Service responsible for retrieving relevant code context from the codebase using vector search and dependency graph analysis.
+  */
  class RetrieverService {
      constructor() {
          this.db = db_1.AgentDB.getInstance();
      }
      /**
       * Searches the codebase using Vector Embeddings (Cosine Similarity).
-      * @param query - The natural language query.
-      * @param limit -
+      * @param query - The natural language query to search for.
+      * @param limit - The maximum number of chunks to retrieve. Defaults to 5.
+      * @param projectRoot - Optional. The project root directory. This parameter provides context for the search and ensures consistency with how paths were indexed. Defaults to process.cwd().
+      * @returns {Promise<SearchResult[]>} A promise that resolves to an array of search results, each containing a code chunk and its relevance score.
       */
-     async query(query, limit = 5) {
-         …
-         const
-         …
+     async query(query, limit = 5, projectRoot = process.cwd()) {
+         log.tool(`RetrieverService.query: Searching with query: "${query}" (Project Root Context: "${projectRoot}")`);
+         try {
+             const embeddingModel = provider_1.LLMProvider.getEmbeddingsModel();
+             const queryVector = await embeddingModel.embedQuery(query);
+             const stmt = this.db.prepare('SELECT * FROM code_chunks');
+             const rows = stmt.all();
+             const scoredChunks = rows.map((row) => {
+                 const vector = JSON.parse(row.vector_json);
+                 const score = (0, math_1.cosineSimilarity)(queryVector, vector);
+                 const metadata = JSON.parse(row.metadata);
+                 return {
+                     score,
+                     chunk: {
+                         id: row.id,
+                         type: row.chunk_type,
+                         content: row.content,
+                         metadata: metadata,
+                         // filePath is expected to be a relative path stored in the DB
+                         filePath: row.file_path || metadata.filePath,
+                     },
+                 };
+             });
+             log.tool(`RetrieverService.query: Found ${scoredChunks.length} raw chunks.`);
+             return scoredChunks.sort((a, b) => b.score - a.score).slice(0, limit);
+         }
+         catch (error) {
+             log.error(`RetrieverService.query: Error during search: ${error.message}`);
+             throw new Error(`Failed to perform codebase search: ${error.message}`);
+         }
      }
      /**
       * Retrieves the 'Graph Dependencies' for a specific file from the DB.
       * This allows the Agent to know what other files are related (DTOs, Interfaces).
+      * @param sourcePath - The path to the source file (expected to be relative to projectRoot).
+      * @param projectRoot - Optional. The project root directory. This parameter provides context for interpreting paths. Defaults to process.cwd().
+      * @returns {string[]} An array of paths representing the dependencies of the source file.
       */
-     getDependencies(sourcePath) {
-         //
-         //
+     getDependencies(sourcePath, projectRoot = process.cwd()) {
+         // Paths in the DB are relative to the project root where indexing occurred.
+         // Normalization ensures consistency regardless of OS path separators.
          const normalizedPath = sourcePath.split(path.sep).join('/');
          try {
-             …
-             FROM dependency_graph
-             WHERE source = ? OR source = ?
+             const stmt = this.db.prepare(`
+                 SELECT target
+                 FROM dependency_graph
+                 WHERE source = ? OR source = ?
              `);
-             //
+             // Query using both normalized and original path for robustness, though normalized should match DB.
              const results = stmt.all(normalizedPath, sourcePath);
-             …
+             log.tool(`RetrieverService.getDependencies: Found ${results.length} dependencies for ${sourcePath}`);
              return results.map((row) => row.target);
          }
          catch (error) {
-             …
+             log.error(`RetrieverService.getDependencies: Error fetching dependencies for ${sourcePath}: ${error.message}`);
              return [];
          }
      }

@@ -103,54 +124,59 @@ class RetrieverService {
       * 1. The matched code snippets (Vector Search).
       * 2. The file's dependencies (Graph Search).
       * 3. Explicit File Paths to encourage using 'read_file'.
+      * @param query - The search query.
+      * @param projectRoot - Optional. The project root directory. This parameter provides context for the search and ensures consistency with how paths were indexed. Defaults to process.cwd().
+      * @returns {Promise<string>} A promise that resolves to the formatted context report string.
       */
-     async getContextForLLM(query) {
-         …
-         const
-         …
-         }
-         …
-         filesMap.get(path)?.chunks.push(res.chunk);
-         }
-         // Build the formatted string
-         let output = `π **RAG ANALYSIS REPORT**\n`;
-         output += `Query: "${query}"\n`;
-         output += `Found ${filesMap.size} relevant files.\n\n`;
-         filesMap.forEach((fileCtx) => {
-             const relevancePct = (fileCtx.relevance * 100).toFixed(1);
-             output += `=================================================================\n`;
-             output += `π **FILE:** ${fileCtx.filePath}\n`;
-             output += `π **RELEVANCE:** ${relevancePct}%\n`;
-             if (fileCtx.imports.length > 0) {
-                 output += `π **DEPENDENCIES (Imports):**\n`;
-                 // Show top 5 imports to give context on DTOs/Entities used
-                 fileCtx.imports
-                     .slice(0, 5)
-                     .forEach((imp) => (output += ` - ${imp}\n`));
-                 if (fileCtx.imports.length > 5)
-                     output += ` - (...and ${fileCtx.imports.length - 5} more)\n`;
+     async getContextForLLM(query, projectRoot = process.cwd()) {
+         log.tool(`RetrieverService.getContextForLLM: Generating context for query: "${query}" (Project Root Context: "${projectRoot}")`);
+         try {
+             const results = await this.query(query, 4, projectRoot);
+             const filesMap = new Map();
+             for (const res of results) {
+                 const filePath = res.chunk.filePath || 'unknown';
+                 if (!filesMap.has(filePath)) {
+                     filesMap.set(filePath, {
+                         filePath: filePath,
+                         relevance: res.score,
+                         chunks: [],
+                         // Pass projectRoot to getDependencies for consistency, though it primarily uses DB paths.
+                         imports: this.getDependencies(filePath, projectRoot),
+                     });
+                 }
+                 filesMap.get(filePath)?.chunks.push(res.chunk);
              }
-             output
-             …
+             let output = `π **RAG ANALYSIS REPORT**\n`;
+             output += `Query: "${query}"\n`;
+             output += `Found ${filesMap.size} relevant files.\n\n`;
+             filesMap.forEach((fileCtx) => {
+                 const relevancePct = (fileCtx.relevance * 100).toFixed(1);
+                 output += `=================================================================\n`;
+                 output += `π **FILE:** ${fileCtx.filePath}\n`;
+                 output += `π **RELEVANCE:** ${relevancePct}%\n`;
+                 if (fileCtx.imports.length > 0) {
+                     output += `π **DEPENDENCIES (Imports):**\n`;
+                     fileCtx.imports
+                         .slice(0, 5)
+                         .forEach((imp) => (output += ` - ${imp}\n`));
+                     if (fileCtx.imports.length > 5)
+                         output += ` - (...and ${fileCtx.imports.length - 5} more)\n`;
+                 }
+                 output += `\nπ **CODE SNIPPETS:**\n`;
+                 fileCtx.chunks.forEach((chunk) => {
+                     output += ` --- [${chunk.metadata.methodName || 'Class Structure'}] ---\n`;
+                     output += `${chunk.content.trim()}\n\n`;
+                 });
+                 output += `π‘ **AGENT HINT:** To edit this file or see full imports, run: read_file("${fileCtx.filePath}")\n`;
+                 output += `=================================================================\n\n`;
             });
-         …
-         output
-         }
-         …
+             log.tool(`RetrieverService.getContextForLLM: Successfully generated context report.`);
+             return output;
+         }
+         catch (error) {
+             log.error(`RetrieverService.getContextForLLM: Error generating context: ${error.message}`);
+             throw new Error(`Failed to generate context for LLM: ${error.message}`);
+         }
      }
  }
  exports.RetrieverService = RetrieverService;
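Note on the retriever implementation above: the rewritten `query()` is a brute-force vector search — embed the query, load every `code_chunks` row, parse the stored `vector_json`, score it with cosine similarity, then sort and keep the top `limit`. A compact sketch of that scoring step (helper names are illustrative, not the package's `math` module):

```typescript
// Sketch of the retrieval strategy above: brute-force cosine similarity over
// vectors stored as JSON. Names are illustrative, not the package's API.
interface StoredChunk {
  filePath: string;
  content: string;
  vectorJson: string; // JSON-encoded number[]
}

function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB) || 1); // guard against zero vectors
}

function topK(queryVector: number[], rows: StoredChunk[], k = 5) {
  return rows
    .map((row) => ({ chunk: row, score: cosineSimilarity(queryVector, JSON.parse(row.vectorJson)) }))
    .sort((a, b) => b.score - a.score) // highest similarity first
    .slice(0, k);                      // keep only the k best matches
}
```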
package/dist/core/tools/tools.d.ts
CHANGED

@@ -1,20 +1,39 @@
  import { z } from 'zod';
+ /**
+  * @description WRITES code to the REAL local disk. Creates a backup automatically.
+  * @param filePath - Relative path to the file (e.g., 'src/app.service.ts'). This path is relative to the project root.
+  * @param content - Full file content.
+  * @param projectRoot - Optional. The project root directory. Defaults to process.cwd().
+  * @returns {Promise<string>} - A confirmation message or an error message.
+  */
  export declare const safeWriteFileTool: import("@langchain/core/tools").DynamicStructuredTool<z.ZodObject<{
      filePath: z.ZodString;
      content: z.ZodString;
+     projectRoot: z.ZodOptional<z.ZodString>;
  }, z.core.$strip>, {
      filePath: string;
      content: string;
+     projectRoot?: string | undefined;
  }, {
      filePath: string;
      content: string;
+     projectRoot?: string | undefined;
  }, string, "safe_write_file">;
+ /**
+  * @description READS code from the REAL local disk.
+  * @param filePath - Relative path to the file (e.g., 'src/app.service.ts'). This path is relative to the project root.
+  * @param projectRoot - Optional. The project root directory. Defaults to process.cwd().
+  * @returns {Promise<string>} - The file content or an error message.
+  */
  export declare const safeReadFileTool: import("@langchain/core/tools").DynamicStructuredTool<z.ZodObject<{
      filePath: z.ZodString;
+     projectRoot: z.ZodOptional<z.ZodString>;
  }, z.core.$strip>, {
      filePath: string;
+     projectRoot?: string | undefined;
  }, {
      filePath: string;
+     projectRoot?: string | undefined;
  }, string, "safe_read_file">;
  /**
   * π TOOL: Ask Codebase (Semantic & Graph Search)

@@ -22,10 +41,13 @@ export declare const safeReadFileTool: import("@langchain/core/tools").DynamicSt
   */
  export declare const askCodebaseTool: import("@langchain/core/tools").DynamicStructuredTool<z.ZodObject<{
      query: z.ZodString;
+     projectRoot: z.ZodOptional<z.ZodString>;
  }, z.core.$strip>, {
      query: string;
+     projectRoot?: string | undefined;
  }, {
      query: string;
+     projectRoot?: string | undefined;
  }, string, "ask_codebase">;
  /**
   * β TOOL: Integrity Check (Compiler)
package/dist/core/tools/tools.js
CHANGED

@@ -56,12 +56,11 @@ const log = {
  /**
   * πΎ Genera un backup antes de modificar un archivo real.
   */
- const createBackup = (filePath) => {
-     const
-     const backupDir = path.join(rootDir, '.agent', 'backups');
+ const createBackup = (filePath, projectRoot = process.cwd()) => {
+     const backupDir = path.join(projectRoot, '.agent', 'backups');
      if (!fs.existsSync(backupDir))
          fs.mkdirSync(backupDir, { recursive: true });
-     const realPath = path.resolve(
+     const realPath = path.resolve(projectRoot, filePath);
      if (fs.existsSync(realPath)) {
          const timestamp = new Date().toISOString().replace(/[:.]/g, '-');
          const filename = path.basename(realPath);

@@ -69,63 +68,111 @@ const createBackup = (filePath) => {
          fs.copyFileSync(realPath, backupPath);
      }
  };
- …
+ /**
+  * @description WRITES code to the REAL local disk. Creates a backup automatically.
+  * @param filePath - Relative path to the file (e.g., 'src/app.service.ts'). This path is relative to the project root.
+  * @param content - Full file content.
+  * @param projectRoot - Optional. The project root directory. Defaults to process.cwd().
+  * @returns {Promise<string>} - A confirmation message or an error message.
+  */
+ exports.safeWriteFileTool = (0, tools_1.tool)(async ({ filePath, content, projectRoot = process.cwd() }) => {
      try {
-         …
-         const targetPath = path.resolve(
-         …
-         …
+         // Resolve the target path relative to the project root
+         const targetPath = path.resolve(projectRoot, filePath);
+         log.tool(`safe_write_file: Attempting to write to: ${targetPath}`);
+         // Canonicalize paths for reliable comparison
+         const canonicalProjectPath = fs.realpathSync(projectRoot);
+         const canonicalTargetPath = fs.realpathSync(targetPath);
+         // Security Check: Ensure the target path is within the project root
+         if (!canonicalTargetPath.startsWith(canonicalProjectPath)) {
+             log.error(`safe_write_file: Access denied. Target path "${canonicalTargetPath}" is outside the project root "${canonicalProjectPath}".`);
+             return 'β Error: Access denied. Attempted to write outside the project root.';
+         }
+         // Ensure the directory exists
          const dir = path.dirname(targetPath);
-         if (!fs.existsSync(dir))
+         if (!fs.existsSync(dir)) {
+             log.tool(`safe_write_file: Creating directory: ${dir}`);
              fs.mkdirSync(dir, { recursive: true });
-         …
+         }
+         // Create backup
+         createBackup(filePath, projectRoot);
+         // Write the file
          fs.writeFileSync(targetPath, content, 'utf-8');
-         log.
+         log.tool(`File successfully written to: ${targetPath}`);
+         // Indexing (asynchronous)
+         log.sys(`Indexing change in: ${filePath}`);
          const indexer = new indexer_1.IndexerService();
-         …
+         // Corrected: Pass only the sourceDir (defaulting to 'src') if needed, not projectRoot here.
+         indexer.indexProject('src').catch((err) => log.error(` Indexing failed: ${err.message}`));
          return `β File saved to REAL DISK: ${filePath}`;
      }
      catch (error) {
+         log.error(`safe_write_file: Error writing file "${filePath}": ${error.message}`);
          return `β Error: ${error.message}`;
      }
  }, {
      name: 'safe_write_file',
-     description: 'WRITES code to the REAL local disk. Creates a backup automatically.',
+     description: 'WRITES code to the REAL local disk. Creates a backup automatically. Ensures the file is within the project root.',
      schema: zod_1.z.object({
-         filePath: zod_1.z.string().describe('Relative path (e.g., src/app.service.ts)'),
-         content: zod_1.z.string().describe('Full file content'),
+         filePath: zod_1.z.string().describe('Relative path to the file (e.g., src/app.service.ts).'),
+         content: zod_1.z.string().describe('Full file content.'),
+         projectRoot: zod_1.z.string().optional().describe('Optional. The project root directory. Defaults to process.cwd().'),
      }),
  });
  // --- TOOL 2: READ FILE (Manual) ---
- …
+ /**
+  * @description READS code from the REAL local disk.
+  * @param filePath - Relative path to the file (e.g., 'src/app.service.ts'). This path is relative to the project root.
+  * @param projectRoot - Optional. The project root directory. Defaults to process.cwd().
+  * @returns {Promise<string>} - The file content or an error message.
+  */
+ exports.safeReadFileTool = (0, tools_1.tool)(async ({ filePath, projectRoot = process.cwd() }) => {
      try {
-         const
-         …
-         …
+         const targetPath = path.resolve(projectRoot, filePath);
+         log.tool(`safe_read_file: Attempting to read from: ${targetPath}`);
+         // Canonicalize paths for reliable comparison
+         const canonicalProjectPath = fs.realpathSync(projectRoot);
+         const canonicalTargetPath = fs.realpathSync(targetPath);
+         // Security Check: Ensure the target path is within the project root
+         if (!canonicalTargetPath.startsWith(canonicalProjectPath)) {
+             log.error(`safe_read_file: Access denied. Target path "${canonicalTargetPath}" is outside the project root "${canonicalProjectPath}".`);
+             return 'β Error: Access denied. Attempted to read outside the project root.';
+         }
+         if (!fs.existsSync(targetPath)) {
+             log.error(`safe_read_file: File not found: ${targetPath}`);
              return 'β File not found.';
-         …
+         }
+         const content = fs.readFileSync(targetPath, 'utf-8');
+         log.tool(`Successfully read file: ${targetPath}`);
+         return content;
      }
      catch (e) {
+         log.error(`safe_read_file: Error reading file "${filePath}": ${e.message}`);
          return `Error: ${e.message}`;
      }
  }, {
      name: 'safe_read_file',
-     description: 'READS code from the REAL local disk.',
-     schema: zod_1.z.object({
+     description: 'READS code from the REAL local disk. Ensures the file is within the project root.',
+     schema: zod_1.z.object({
+         filePath: zod_1.z.string().describe('Relative path to the file (e.g., src/app.service.ts).'),
+         projectRoot: zod_1.z.string().optional().describe('Optional. The project root directory. Defaults to process.cwd().')
+     }),
  });
  /**
   * π TOOL: Ask Codebase (Semantic & Graph Search)
   * This is the Agent's "Eyes". It retrieves code + context.
   */
- exports.askCodebaseTool = (0, tools_1.tool)(async ({ query }) => {
+ exports.askCodebaseTool = (0, tools_1.tool)(async ({ query, projectRoot = process.cwd() }) => {
      try {
-         …
+         log.tool(`ask_codebase: Querying codebase for: "${query}" with projectRoot: "${projectRoot}"`);
          const retriever = new retriever_1.RetrieverService();
-         const context = await retriever.getContextForLLM(query);
+         const context = await retriever.getContextForLLM(query, projectRoot);
+         log.tool(`ask_codebase: Found context for query: "${query}"`);
          return context;
      }
      catch (error) {
-         …
+         log.error(`ask_codebase: Error querying codebase: ${error.message}`);
+         return `β Error querying codebase: ${error.message}`;
      }
  }, {
      name: 'ask_codebase',

@@ -138,6 +185,7 @@ exports.askCodebaseTool = (0, tools_1.tool)(async ({ query }) => {
          query: zod_1.z
              .string()
              .describe("A natural language query describing the logic, DTO, or functionality you are looking for. (e.g., 'How is the RefundEntity defined?', 'Show me the auth guard')"),
+         projectRoot: zod_1.z.string().optional().describe('Optional. The project root directory. Defaults to process.cwd().'),
      }),
  });
  /**

@@ -147,12 +195,15 @@ exports.askCodebaseTool = (0, tools_1.tool)(async ({ query }) => {
  exports.integrityCheckTool = (0, tools_1.tool)(async () => {
      try {
          const rootDir = process.cwd();
+         log.tool('integrityCheckTool: Running integrity check');
          // 'tsc --noEmit' checks types without generating JS files. Fast and safe.
          const { stdout } = await execAsync('npx tsc --noEmit', { cwd: rootDir });
+         log.tool('integrityCheckTool: Integrity check passed');
          console.log('INTEGRITY CHECK PASSED.', rootDir, stdout);
          return `β INTEGRITY CHECK PASSED. The codebase is strictly typed and compiles correctly.\n${stdout}`;
      }
      catch (error) {
+         log.error('integrityCheckTool: Integrity check failed');
          // Return the exact compiler error so the agent can fix it
          return `β INTEGRITY CHECK FAILED. You must fix these TypeScript errors before finishing:\n${error.stdout || error.message}`;
      }

@@ -172,9 +223,11 @@ exports.integrityCheckTool = (0, tools_1.tool)(async () => {
  exports.refreshIndexTool = (0, tools_1.tool)(async () => {
      log.sys('π Starting full project re-indexing...');
      try {
+         log.tool('refreshIndexTool: Starting full project re-indexing...'); // ADDED LOG
          // Start the expensive operation
          const indexer = new indexer_1.IndexerService();
          indexer.indexProject().catch((err) => log.error(` ${err.message}`));
+         log.tool('refreshIndexTool: Re-indexing completed successfully.'); // ADDED LOG
          log.sys('β Re-indexing completed successfully.');
          return 'β Index successfully updated. I now have access to the latest code version.';
      }
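Note on the tools above: both file tools now resolve `filePath` against `projectRoot`, canonicalize with `fs.realpathSync`, and refuse any path that escapes the project root before touching the disk. The sketch below expresses the same containment rule with `path.relative` (one common variant; the package itself compares canonical path prefixes with `startsWith` and also canonicalizes the target, which must therefore already exist):

```typescript
// Sketch of the containment check the tools perform before reading or writing.
// Uses path.relative as one common way to express the rule; helper name is illustrative.
import * as fs from "node:fs";
import * as path from "node:path";

function isInsideProjectRoot(filePath: string, projectRoot: string = process.cwd()): boolean {
  const root = fs.realpathSync(projectRoot);   // canonical project root (resolves symlinks)
  const target = path.resolve(root, filePath); // absolute candidate path; may not exist yet
  const rel = path.relative(root, target);
  // Empty string means the root itself; a ".." prefix or an absolute result means escape.
  return rel === "" || (!rel.startsWith("..") && !path.isAbsolute(rel));
}

// e.g. isInsideProjectRoot("src/app.service.ts") -> true
//      isInsideProjectRoot("../../etc/passwd")   -> false
```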
package/package.json
CHANGED

@@ -1,6 +1,6 @@
  {
      "name": "@dastbal/nestjs-ai-agent",
-     "version": "1.0.0",
+     "version": "1.0.2",
      "description": "Autonomous AI Agent for NestJS - Principal Software Engineer Level with RAG and SQLite persistence",
      "author": "David Balladares",
      "license": "MIT",

@@ -38,8 +38,8 @@
          "zod": "^4.3.5"
      },
      "peerDependencies": {
-         …
-         …
+         "@nestjs/common": "^10.0.0 || ^11.0.0",
+         "reflect-metadata": "^0.1.13 || ^0.2.0"
      },
      "devDependencies": {
          "@nestjs/common": "^10.4.22",