ronds-mcp-tracker 0.1.8 → 0.1.10

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,141 +1,84 @@
1
- # Node.js MCP Code Change Tracker
2
-
3
- A Node.js MCP server that records code changes and publishes them to Kafka.
4
-
5
- ## Features
6
- - Install via a single `npx` command
7
- - Kafka producer powered by `kafkajs`
8
- - Automatically records code changes to Kafka
9
- - Failed queue persistence with replay support
10
-
11
- ## Installation
12
- Example MCP config (default worker_id precedence: `./.ai_config/config.json` -> `MCP_TRACKER_WORKER_ID` -> OS username):
13
-
14
- ```json
15
- {
16
- "mcpServers": {
17
- "code-change-tracker": {
18
- "command": "npx",
19
- "args": ["@yourorg/mcp-tracker"]
20
- }
21
- }
22
- }
23
- ```
24
-
25
- Custom worker id:
26
-
27
- ```json
28
- {
29
- "mcpServers": {
30
- "code-change-tracker": {
31
- "command": "npx",
32
- "args": ["@yourorg/mcp-tracker"],
33
- "env": {
34
- "MCP_TRACKER_WORKER_ID": "my-worker"
35
- }
36
- }
37
- }
38
- }
39
- ```
40
-
41
- ## Configuration
42
- - `worker_id` resolution order:
43
- 1. `worker_id` in `./.ai_config/config.json`
44
- 2. The `MCP_TRACKER_WORKER_ID` environment variable
45
- 3. The OS username (`os.userInfo().username`)
46
- `MCP_TRACKER_WORKER_ID`: takes effect when the local `.ai_config/config.json` does not provide a valid `worker_id`.
47
- - Kafka config is currently hardcoded (see `src/config.ts`) and can be extended later.
48
-
49
- ## Tools
50
- ### `before_edit`
51
- Use case: clients that can call MCP before an edit. The server saves a pre-edit snapshot and returns a `snapshot_id` for the subsequent `after_edit` call.
52
-
53
- Input:
54
- - `file_path`: file path
55
-
56
- Output:
57
- - `status`: `snapshotted`
58
- - `snapshot_id`: snapshot id
59
- - `file_path`: normalized file path
60
-
61
- ### `after_edit`
62
- Use case: computes a full diff after the edit, against the snapshot saved by `before_edit`. This mode distinguishes additions from deletions; `file.operation` is `edit`.
63
-
64
- Input:
65
- - `file_path`: file path
66
- - `snapshot_id`: snapshot id returned by `before_edit`
67
-
68
- Output:
69
- - `status`: `recorded` or `queued_for_retry`
70
- - `event_id`: event id
71
- - `lines_changed`: total changed lines (`added + removed`)
72
- - `added_content`: added lines content (may be truncated)
73
-
74
- ### `direct_add`
75
- Use case: the client cannot call MCP before an edit and can only report added content afterwards. This mode does not depend on a snapshot and records the added content directly; `file.operation` is `add`.
76
-
77
- Input:
78
- - `file_path`: file path
79
- - `added_content`: newly added content supplied by the client
80
-
81
- Output:
82
- - `status`: `recorded` or `queued_for_retry`
83
- - `event_id`: event id
84
- - `lines_changed`: non-empty line count in `added_content`
85
- - `added_content`: original added content
86
-
87
- ### `cleanup_failed`
88
- Input:
89
- - `max_age_hours` (optional): default 168
90
-
91
- Output:
92
- - `status`
93
- - `deleted_count`
94
-
95
- ### `replay_failed`
96
- Input:
97
- - `max_count` (optional): default 10
98
-
99
- Output:
100
- - `status`
101
- - `replayed`
102
- - `failed`
103
-
104
- ## Failed Queue
105
- Failed events are stored in `~/.mcp-tracker/failed` as JSON files. Old files are cleaned by age and replayed in FIFO order.
106
-
107
- ## Usage Flow
108
- ### Snapshot mode (`before_edit` + `after_edit`)
109
- For clients like Claude Code that can call MCP both before and after an edit:
110
-
111
- 1. Call `before_edit` to save a server-side snapshot
112
- 2. Complete the file edit
113
- 3. Call `after_edit` with the `snapshot_id` returned in step 1
114
-
115
- Notes:
116
- - If the target file does not yet exist when `before_edit` is called, the server saves an empty snapshot.
117
- - The subsequent `after_edit` then treats the file as added from empty content and uploads the added content as usual.
118
- - In this case the event type is still `file.operation = edit`, to keep snapshot mode consistent.
119
-
120
- ### Direct add mode (`direct_add`)
121
- For clients like Cursor that can usually only call MCP after an edit completes:
122
-
123
- 1. The client collects the newly added content
124
- 2. Call `direct_add(file_path, added_content)`
125
- 3. The server builds a `ChangeEvent` directly and publishes it to Kafka
126
-
127
- Recommendations:
128
- - Prefer snapshot mode when `before_edit` / `after_edit` is available.
129
- - Fall back to `direct_add` when the client can only call after the edit and cannot obtain a pre-edit snapshot.
130
-
138
- ## Development
139
- - Build: `npm run build`
140
- - Test tools list: `echo '{"jsonrpc":"2.0","method":"tools/list","id":1}' | node dist/server.js`
141
- - Test direct add locally: call `tools/call` with `direct_add`, `file_path`, and multi-line `added_content`
1
+ # Node.js MCP Code Change Tracker
2
+
3
+ A Node.js MCP server that records code changes and publishes them to Kafka.
4
+
5
+ ## Features
6
+ - Install via a single `npx` command
7
+ - Kafka producer powered by `kafkajs`
8
+ - Automatically records code changes to Kafka
9
+ - Failed queue persistence with replay support
10
+
11
+ ## Installation
12
+ Example MCP config (default worker_id precedence: `./.ai_config/config.json` -> `MCP_TRACKER_WORKER_ID` -> OS username):
13
+
14
+ ```json
15
+ {
16
+ "mcpServers": {
17
+ "code-change-tracker": {
18
+ "command": "npx",
19
+ "args": ["@yourorg/mcp-tracker"]
20
+ }
21
+ }
22
+ }
23
+ ```
24
+
25
+ Custom worker id:
26
+
27
+ ```json
28
+ {
29
+ "mcpServers": {
30
+ "code-change-tracker": {
31
+ "command": "npx",
32
+ "args": ["@yourorg/mcp-tracker"],
33
+ "env": {
34
+ "MCP_TRACKER_WORKER_ID": "my-worker"
35
+ }
36
+ }
37
+ }
38
+ }
39
+ ```
40
+
41
+ ## Configuration
42
+ - `worker_id` resolution order:
43
+ 1. `worker_id` in `./.ai_config/config.json`
44
+ 2. The `MCP_TRACKER_WORKER_ID` environment variable
45
+ 3. The OS username (`os.userInfo().username`)
46
+ `MCP_TRACKER_WORKER_ID`: takes effect when the local `.ai_config/config.json` does not provide a valid `worker_id`.
47
+ - Kafka config is currently hardcoded (see `src/config.ts`) and can be extended later.
48
+
49
+ ## Tools
50
+ ### `after_edit`
51
+ Input:
52
+ - `file_path`: file path
53
+ - `old_string`: previous content
54
+ - `new_string`: new content
55
+
56
+ Output:
57
+ - `status`: `recorded`
58
+ - `event_id`: event id
59
+ - `lines_changed`: total lines changed
60
+ - `added_content`: added lines content (may be truncated)
61
+
62
+ ### `cleanup_failed`
63
+ Input:
64
+ - `max_age_hours` (optional): default 168
65
+
66
+ Output:
67
+ - `status`
68
+ - `deleted_count`
69
+
70
+ ### `replay_failed`
71
+ Input:
72
+ - `max_count` (optional): default 10
73
+
74
+ Output:
75
+ - `status`
76
+ - `replayed`
77
+ - `failed`
78
+
79
+ ## Failed Queue
80
+ Failed events are stored in `~/.mcp-tracker/failed` as JSON files. Old files are cleaned by age and replayed in FIFO order.
81
+
82
+ ## Development
83
+ - Build: `npm run build`
84
+ - Test tools list: `echo '{"jsonrpc":"2.0","method":"tools/list","id":1}' | node dist/server.js`
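The reworked `after_edit` schema takes the old and new content inline instead of a snapshot id. A minimal sketch of a `tools/call` request in the same style as the `tools/list` test above; the file path and string contents are illustrative placeholders:

```javascript
// Illustrative tools/call request for the simplified after_edit tool.
// The file path and string contents are placeholders.
const request = {
  jsonrpc: '2.0',
  id: 2,
  method: 'tools/call',
  params: {
    name: 'after_edit',
    arguments: {
      file_path: 'src/example.js',
      old_string: 'const a = 1;\n',
      new_string: 'const a = 1;\nconst b = 2;\n'
    }
  }
};
// Pipe the printed JSON line to `node dist/server.js` to exercise the tool.
console.log(JSON.stringify(request));
```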
package/dist/config.js CHANGED
@@ -9,6 +9,7 @@ export const FAILED_QUEUE_CONFIG = {
9
9
  };
10
10
  function resolveWorkerId() {
11
11
  const localConfigPath = path.resolve(process.cwd(), '.ai_config', 'config.json');
12
+ console.log('[mcp-tracker] localConfigPath:', localConfigPath);
12
13
  try {
13
14
  if (fs.existsSync(localConfigPath)) {
14
15
  const raw = fs.readFileSync(localConfigPath, 'utf-8');
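The precedence that `resolveWorkerId` implements (local config file, then environment variable, then OS username) can be sketched as follows. The `configPath` parameter is an illustrative addition so the lookup location can be overridden; it is not part of the package's signature:

```javascript
import fs from 'node:fs';
import os from 'node:os';
import path from 'node:path';

// Resolve worker_id with the precedence described in the README:
// 1. worker_id from ./.ai_config/config.json
// 2. the MCP_TRACKER_WORKER_ID environment variable
// 3. the OS username
function resolveWorkerId(configPath = path.resolve(process.cwd(), '.ai_config', 'config.json')) {
  try {
    if (fs.existsSync(configPath)) {
      const parsed = JSON.parse(fs.readFileSync(configPath, 'utf-8'));
      if (typeof parsed.worker_id === 'string' && parsed.worker_id.trim() !== '') {
        return parsed.worker_id.trim();
      }
    }
  } catch {
    // An unreadable or invalid config file falls through to the next source.
  }
  const envId = process.env.MCP_TRACKER_WORKER_ID;
  if (typeof envId === 'string' && envId.trim() !== '') {
    return envId.trim();
  }
  return os.userInfo().username;
}
```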
package/dist/server.js CHANGED
@@ -1,7 +1,6 @@
1
1
  import fs from 'node:fs';
2
2
  import path from 'node:path';
3
3
  import os from 'node:os';
4
- import { execSync } from 'node:child_process';
5
4
  import { randomUUID } from 'node:crypto';
6
5
  import { diffLines } from 'diff';
7
6
  import { Server } from '@modelcontextprotocol/sdk/server/index.js';
@@ -41,27 +40,12 @@ function getRepoNameFromGitConfig(cwd) {
41
40
  return null;
42
41
  }
43
42
  }
44
- function getGitRepoRoot(cwd) {
45
- try {
46
- const output = execSync('git rev-parse --show-toplevel', {
47
- cwd,
48
- encoding: 'utf8',
49
- stdio: ['ignore', 'pipe', 'ignore']
50
- });
51
- const repoRoot = output.trim();
52
- return repoRoot || null;
53
- }
54
- catch {
55
- return null;
56
- }
57
- }
58
43
  async function getGitInfo(cwd) {
59
- const repoRoot = getGitRepoRoot(cwd) ?? cwd;
60
- const repoNameFromGitConfig = getRepoNameFromGitConfig(repoRoot);
44
+ const repoNameFromGitConfig = getRepoNameFromGitConfig(cwd);
61
45
  if (repoNameFromGitConfig) {
62
- return { repo_name: repoNameFromGitConfig, repo_root: repoRoot };
46
+ return { repo_name: repoNameFromGitConfig };
63
47
  }
64
- const configPath = path.join(repoRoot, '.gitlab', 'config.json');
48
+ const configPath = path.join(cwd, '.gitlab', 'config.json');
65
49
  if (!fs.existsSync(configPath)) {
66
50
  return null;
67
51
  }
@@ -72,25 +56,12 @@ async function getGitInfo(cwd) {
72
56
  if (!repoName) {
73
57
  return null;
74
58
  }
75
- return { repo_name: repoName, repo_root: repoRoot };
59
+ return { repo_name: repoName };
76
60
  }
77
61
  catch {
78
62
  return null;
79
63
  }
80
64
  }
81
- function toRepoRelativePath(filePath, repoRoot) {
82
- if (!repoRoot) {
83
- return filePath;
84
- }
85
- const relativePath = path.relative(repoRoot, filePath);
86
- if (!relativePath || relativePath.startsWith('..') || path.isAbsolute(relativePath)) {
87
- return filePath;
88
- }
89
- return relativePath.replace(/\\/g, '/');
90
- }
91
- function getKafkaFilePath(filePath, gitInfo) {
92
- return toRepoRelativePath(filePath, gitInfo?.repo_root);
93
- }
94
65
  function getAddedLines(oldStr, newStr) {
95
66
  const parts = diffLines(oldStr, newStr);
96
67
  let lineCount = 0;
@@ -126,75 +97,9 @@ function countDiff(oldStr, newStr) {
126
97
  }
127
98
  return { added, removed };
128
99
  }
129
- function countNonEmptyLines(content) {
130
- return content.split('\n').filter((line) => line !== '').length;
131
- }
132
- async function buildChangeEvent(params) {
133
- const gitInfo = await getGitInfo(process.cwd());
134
- return {
135
- event_id: randomUUID(),
136
- timestamp: new Date().toISOString(),
137
- worker_id: params.config.worker_id,
138
- git: gitInfo ? { repo_name: gitInfo.repo_name } : null,
139
- file: { path: getKafkaFilePath(params.filePath, gitInfo), operation: params.operation },
140
- changes: {
141
- added_content: params.addedContent,
142
- added: params.added,
143
- removed: params.removed
144
- }
145
- };
146
- }
147
100
  function getFailedDir() {
148
101
  return path.join(os.homedir(), '.mcp-tracker', 'failed');
149
102
  }
150
- function getSnapshotsDir() {
151
- return path.join(os.homedir(), '.mcp-tracker', 'snapshots');
152
- }
153
- function ensureSnapshotsDirExists() {
154
- const dir = getSnapshotsDir();
155
- if (!fs.existsSync(dir)) {
156
- fs.mkdirSync(dir, { recursive: true });
157
- }
158
- return dir;
159
- }
160
- function getSnapshotPath(snapshotId) {
161
- return path.join(getSnapshotsDir(), `${snapshotId}.json`);
162
- }
163
- function cleanupSnapshotsOnStartup() {
164
- const dir = ensureSnapshotsDirExists();
165
- const entries = fs.readdirSync(dir);
166
- let deleted = 0;
167
- for (const name of entries) {
168
- const filePath = path.join(dir, name);
169
- const stat = fs.statSync(filePath);
170
- if (!stat.isFile()) {
171
- continue;
172
- }
173
- fs.unlinkSync(filePath);
174
- deleted += 1;
175
- }
176
- return { status: 'cleaned', deleted_count: deleted };
177
- }
178
- function saveSnapshot(snapshot) {
179
- const dir = ensureSnapshotsDirExists();
180
- const filePath = path.join(dir, `${snapshot.snapshot_id}.json`);
181
- fs.writeFileSync(filePath, JSON.stringify(snapshot, null, 2), 'utf8');
182
- return filePath;
183
- }
184
- function loadSnapshot(snapshotId) {
185
- const filePath = getSnapshotPath(snapshotId);
186
- if (!fs.existsSync(filePath)) {
187
- return null;
188
- }
189
- const raw = fs.readFileSync(filePath, 'utf8');
190
- return JSON.parse(raw);
191
- }
192
- function deleteSnapshot(snapshotId) {
193
- const filePath = getSnapshotPath(snapshotId);
194
- if (fs.existsSync(filePath)) {
195
- fs.unlinkSync(filePath);
196
- }
197
- }
198
103
  function cleanupFailed(maxAgeHours = 168) {
199
104
  const dir = getFailedDir();
200
105
  if (!fs.existsSync(dir)) {
@@ -262,80 +167,23 @@ async function replayFailed(maxCount, config) {
262
167
  }
263
168
  return { status: 'completed', replayed, failed };
264
169
  }
265
- function validateFilePath(filePath) {
266
- const normalizedPath = filePath.trim();
267
- if (!normalizedPath) {
268
- throw new Error('file_path is required');
269
- }
270
- return normalizedPath;
271
- }
272
- function readFileContent(filePath) {
273
- return fs.readFileSync(filePath, 'utf8');
274
- }
275
- function getFailedEventPath(eventId) {
276
- return path.join(getFailedDir(), `${eventId}.json`);
277
- }
278
- function handleBeforeEdit(input) {
279
- const filePath = validateFilePath(input.file_path);
280
- const content = fs.existsSync(filePath) ? readFileContent(filePath) : '';
281
- const snapshotId = randomUUID();
282
- saveSnapshot({
283
- snapshot_id: snapshotId,
284
- file_path: filePath,
285
- created_at: new Date().toISOString(),
286
- content
287
- });
288
- return JSON.stringify({
289
- status: 'snapshotted',
290
- snapshot_id: snapshotId,
291
- file_path: filePath
292
- });
293
- }
294
170
  async function handleAfterEdit(input, config) {
295
- const filePath = validateFilePath(input.file_path);
296
- const snapshotId = input.snapshot_id?.trim();
297
- if (!snapshotId) {
298
- throw new Error('snapshot_id is required');
299
- }
300
- const snapshot = loadSnapshot(snapshotId);
301
- if (!snapshot) {
302
- return JSON.stringify({
303
- status: 'snapshot_not_found',
304
- snapshot_id: snapshotId,
305
- file_path: filePath
306
- });
307
- }
308
- if (snapshot.file_path !== filePath) {
309
- return JSON.stringify({
310
- status: 'snapshot_file_path_mismatch',
311
- snapshot_id: snapshotId,
312
- file_path: filePath,
313
- snapshot_file_path: snapshot.file_path
314
- });
315
- }
316
- const oldStr = snapshot.content;
317
- const newStr = readFileContent(filePath);
318
- const addedLines = getAddedLines(oldStr, newStr);
319
- const diffStats = countDiff(oldStr, newStr);
320
- const event = await buildChangeEvent({
321
- filePath,
322
- operation: 'edit',
323
- addedContent: addedLines,
324
- added: diffStats.added,
325
- removed: diffStats.removed,
326
- config
327
- });
171
+ const addedLines = getAddedLines(input.old_string, input.new_string);
172
+ const diffStats = countDiff(input.old_string, input.new_string);
173
+ const gitInfo = await getGitInfo(process.cwd());
174
+ const event = {
175
+ event_id: randomUUID(),
176
+ timestamp: new Date().toISOString(),
177
+ worker_id: config.worker_id,
178
+ git: gitInfo,
179
+ file: { path: input.file_path, operation: 'edit' },
180
+ changes: {
181
+ added_content: addedLines,
182
+ added: diffStats.added,
183
+ removed: diffStats.removed
184
+ }
185
+ };
328
186
  await publishToKafka(config, event);
329
- if (fs.existsSync(getFailedEventPath(event.event_id))) {
330
- return JSON.stringify({
331
- status: 'queued_for_retry',
332
- event_id: event.event_id,
333
- snapshot_id: snapshotId,
334
- lines_changed: diffStats.added + diffStats.removed,
335
- added_content: addedLines
336
- });
337
- }
338
- deleteSnapshot(snapshotId);
339
187
  return JSON.stringify({
340
188
  status: 'recorded',
341
189
  event_id: event.event_id,
@@ -343,30 +191,6 @@ async function handleAfterEdit(input, config) {
343
191
  added_content: addedLines
344
192
  });
345
193
  }
346
- async function handleDirectAdd(input, config) {
347
- const filePath = validateFilePath(input.file_path);
348
- const addedContent = input.added_content;
349
- if (typeof addedContent !== 'string' || !addedContent.trim()) {
350
- throw new Error('added_content is required');
351
- }
352
- const added = countNonEmptyLines(addedContent);
353
- const event = await buildChangeEvent({
354
- filePath,
355
- operation: 'add',
356
- addedContent,
357
- added,
358
- removed: 0,
359
- config
360
- });
361
- await publishToKafka(config, event);
362
- const status = fs.existsSync(getFailedEventPath(event.event_id)) ? 'queued_for_retry' : 'recorded';
363
- return JSON.stringify({
364
- status,
365
- event_id: event.event_id,
366
- lines_changed: added,
367
- added_content: addedContent
368
- });
369
- }
370
194
  function handleCleanupFailed(input) {
371
195
  const maxAgeHours = input.max_age_hours ?? 168;
372
196
  const result = cleanupFailed(maxAgeHours);
@@ -377,39 +201,17 @@ async function handleReplayFailed(input, config) {
377
201
  const result = await replayFailed(maxCount, config);
378
202
  return JSON.stringify({ status: result.status, replayed: result.replayed, failed: result.failed });
379
203
  }
380
- const BEFORE_EDIT_TOOL_SCHEMA = {
381
- name: 'before_edit',
382
- description: 'Save a server-side snapshot before file edits.',
383
- inputSchema: {
384
- type: 'object',
385
- properties: {
386
- file_path: { type: 'string' }
387
- },
388
- required: ['file_path']
389
- }
390
- };
391
204
  const AFTER_EDIT_TOOL_SCHEMA = {
392
205
  name: 'after_edit',
393
- description: 'Record code changes after file edits using a saved snapshot.',
394
- inputSchema: {
395
- type: 'object',
396
- properties: {
397
- file_path: { type: 'string' },
398
- snapshot_id: { type: 'string' }
399
- },
400
- required: ['file_path', 'snapshot_id']
401
- }
402
- };
403
- const DIRECT_ADD_TOOL_SCHEMA = {
404
- name: 'direct_add',
405
- description: 'Record added content directly when no pre-edit snapshot is available.',
206
+ description: 'Record code changes after file edits.',
406
207
  inputSchema: {
407
208
  type: 'object',
408
209
  properties: {
409
210
  file_path: { type: 'string' },
410
- added_content: { type: 'string' }
211
+ old_string: { type: 'string' },
212
+ new_string: { type: 'string' }
411
213
  },
412
- required: ['file_path', 'added_content']
214
+ required: ['file_path', 'old_string', 'new_string']
413
215
  }
414
216
  };
415
217
  const CLEANUP_FAILED_TOOL_SCHEMA = {
@@ -434,7 +236,6 @@ const REPLAY_FAILED_TOOL_SCHEMA = {
434
236
  };
435
237
  async function main() {
436
238
  const config = loadConfig();
437
- cleanupSnapshotsOnStartup();
438
239
  await initProducer(config);
439
240
  const server = new Server({
440
241
  name: 'code-change-tracker',
@@ -445,23 +246,15 @@ async function main() {
445
246
  }
446
247
  });
447
248
  server.setRequestHandler(ListToolsRequestSchema, async () => ({
448
- tools: [BEFORE_EDIT_TOOL_SCHEMA, AFTER_EDIT_TOOL_SCHEMA, DIRECT_ADD_TOOL_SCHEMA, CLEANUP_FAILED_TOOL_SCHEMA, REPLAY_FAILED_TOOL_SCHEMA]
249
+ tools: [AFTER_EDIT_TOOL_SCHEMA, CLEANUP_FAILED_TOOL_SCHEMA, REPLAY_FAILED_TOOL_SCHEMA]
449
250
  }));
450
251
  server.setRequestHandler(CallToolRequestSchema, async (request) => {
451
252
  const name = request.params.name;
452
253
  const input = request.params.arguments ?? {};
453
- if (name === 'before_edit') {
454
- const result = handleBeforeEdit(input);
455
- return { content: [{ type: 'text', text: result }] };
456
- }
457
254
  if (name === 'after_edit') {
458
255
  const result = await handleAfterEdit(input, config);
459
256
  return { content: [{ type: 'text', text: result }] };
460
257
  }
461
- if (name === 'direct_add') {
462
- const result = await handleDirectAdd(input, config);
463
- return { content: [{ type: 'text', text: result }] };
464
- }
465
258
  if (name === 'cleanup_failed') {
466
259
  const result = handleCleanupFailed(input);
467
260
  return { content: [{ type: 'text', text: result }] };
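For reference, the bookkeeping that `countDiff` and `getAddedLines` perform over the output of `diffLines` (from the `diff` package) amounts to the following sketch. It operates on an already-computed parts array of `{ value, count, added, removed }` objects, the shape `diffLines` returns, so the library itself is not needed here:

```javascript
// Sketch of the diff bookkeeping in dist/server.js. diffLines(oldStr, newStr)
// from the "diff" package returns an array of parts; each part has:
//   value:  the text of the hunk
//   count:  number of lines in the hunk
//   added / removed: flags marking inserted or deleted hunks
function countDiff(parts) {
  let added = 0;
  let removed = 0;
  for (const part of parts) {
    if (part.added) added += part.count;
    else if (part.removed) removed += part.count;
  }
  return { added, removed };
}

// added_content is recovered by concatenating only the inserted hunks.
function getAddedLines(parts) {
  return parts.filter((part) => part.added).map((part) => part.value).join('');
}
```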
package/package.json CHANGED
@@ -1,26 +1,26 @@
1
- {
2
- "name": "ronds-mcp-tracker",
3
- "version": "0.1.8",
4
- "type": "module",
5
- "bin": {
6
- "ronds-mcp-tracker": "./bin/cli.js"
7
- },
8
- "scripts": {
9
- "build": "tsc",
10
- "prepublishOnly": "npm run build",
11
- "dev": "tsc && node dist/server.js",
12
- "test": "mcp-inspector node dist/server.js"
13
- },
14
- "dependencies": {
15
- "@modelcontextprotocol/sdk": "^1.0.4",
16
- "ajv": "^8.17.1",
17
- "ajv-formats": "^3.0.1",
18
- "diff": "^7.0.0",
19
- "kafkajs": "^2.2.4"
20
- },
21
- "devDependencies": {
22
- "@types/diff": "^7.0.0",
23
- "@types/node": "^20.0.0",
24
- "typescript": "^5.0.0"
25
- }
26
- }
1
+ {
2
+ "name": "ronds-mcp-tracker",
3
+ "version": "0.1.10",
4
+ "type": "module",
5
+ "bin": {
6
+ "ronds-mcp-tracker": "./bin/cli.js"
7
+ },
8
+ "scripts": {
9
+ "build": "tsc",
10
+ "prepublishOnly": "npm run build",
11
+ "dev": "tsc && node dist/server.js",
12
+ "test": "mcp-inspector node dist/server.js"
13
+ },
14
+ "dependencies": {
15
+ "@modelcontextprotocol/sdk": "^1.0.4",
16
+ "ajv": "^8.17.1",
17
+ "ajv-formats": "^3.0.1",
18
+ "diff": "^7.0.0",
19
+ "kafkajs": "^2.2.4"
20
+ },
21
+ "devDependencies": {
22
+ "@types/diff": "^7.0.0",
23
+ "@types/node": "^20.0.0",
24
+ "typescript": "^5.0.0"
25
+ }
26
+ }