mcp-multitool 0.1.5 → 0.1.7

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -8,15 +8,12 @@ A [Model Context Protocol (MCP)](https://modelcontextprotocol.io) server with **
  
  ## Tools
  
- | Tool | Description |
- | --------------------- | ----------------------------------------------- |
- | `deleteFile` | Delete files or directories (single or batch) |
- | `moveFile` | Move/rename files or directories |
- | `compressLogs` | Compress log arrays with 60-90% token reduction |
- | `compressText` | Compress multi-line log text |
- | `analyzePatterns` | Quick pattern extraction from logs |
- | `estimateCompression` | Estimate compression ratio before processing |
- | `wait` | Pause execution for rate limits or timing |
+ | Tool | Description |
+ | ------------ | -------------------------------------------------- |
+ | `deleteFile` | Delete files or directories (single or batch) |
+ | `moveFile` | Move/rename files or directories |
+ | `readLog` | Read and compress logs with 60-90% token reduction |
+ | `wait` | Pause execution for rate limits or timing |
  
  ## Why
  
@@ -62,131 +59,91 @@ npx mcp-multitool
  
  ## Tool Reference
  
- ### `analyzePatterns`
+ ### `deleteFile`
  
- Quick pattern analysis without full compression. Returns top N patterns found in logs, useful for rapid log triage.
+ Delete one or more files or directories.
  
- | Parameter | Type | Required | Default | Description |
- | ------------- | ---------- | -------- | ------- | --------------------------- |
- | `lines` | `string[]` | ✅ | — | Log lines to analyze. |
- | `maxPatterns` | `integer` | — | `20` | Maximum patterns to return. |
+ | Parameter | Type | Required | Default | Description |
+ | ----------- | -------------------- | -------- | ------- | ----------------------------------------------------- |
+ | `paths` | `string \| string[]` | ✅ | — | File or directory path(s) to delete. |
+ | `recursive` | `boolean` | — | `false` | If true, delete directories and contents recursively. |
  
- **Response:** JSON object with `patterns` array and `total` line count.
+ **Response:** `"Deleted N path(s)."`
  
  **Examples:**
  
  ```
- analyzePatterns lines=["INFO Starting server", "INFO Starting server", "ERROR Failed"]
- analyzePatterns lines=[...logs] maxPatterns=10
+ deleteFile paths="temp.txt"
+ deleteFile paths=["a.txt", "b.txt"]
+ deleteFile paths="build/" recursive=true
  ```
  
  ---
  
- ### `compressLogs`
+ ### `moveFile`
  
- Compress an array of log lines using semantic pattern extraction. Achieves 60-90% token reduction while preserving diagnostic context.
+ Move one or more files or directories to a destination directory.
  
- | Parameter | Type | Required | Default | Description |
- | -------------- | ---------- | -------- | --------- | ----------------------------------------------------------------------- |
- | `lines` | `string[]` | ✅ | — | Log lines to compress. |
- | `format` | `string` | — | `summary` | Output format: `summary` (compact), `detailed` (full metadata), `json`. |
- | `depth` | `integer` | — | `4` | Parse tree depth for pattern matching (2-8). |
- | `simThreshold` | `number` | — | `0.4` | Similarity threshold for template matching (0.0-1.0). |
- | `maxTemplates` | `integer` | — | `50` | Maximum templates to include in output. |
+ | Parameter | Type | Required | Default | Description |
+ | ----------- | -------------------- | -------- | ------- | ---------------------------------- |
+ | `from` | `string \| string[]` | ✅ | — | Source path(s) to move. |
+ | `to` | `string` | ✅ | — | Destination directory. |
+ | `overwrite` | `boolean` | — | `false` | If true, overwrite existing files. |
  
- **Response:** Compressed log summary showing unique templates and occurrence counts.
+ **Response:** `"Moved N path(s)."`
  
  **Examples:**
  
  ```
- compressLogs lines=["2026-04-07 INFO Starting on port 3000", "2026-04-07 INFO Starting on port 3001"]
- compressLogs lines=[...logs] format="json" maxTemplates=20
+ moveFile from="old.txt" to="archive/"
+ moveFile from=["a.txt", "b.txt"] to="backup/"
+ moveFile from="config.json" to="dest/" overwrite=true
  ```
  
  ---
  
- ### `compressText`
+ ### `readLog`
  
- Compress a multi-line log text string. Automatically splits on newlines and processes as individual log lines.
+ Compress a log file using semantic pattern extraction (60-90% token reduction). Creates stateful drains for incremental reads. Use `flushLog` to release.
  
- | Parameter | Type | Required | Default | Description |
- | -------------- | --------- | -------- | --------- | ----------------------------------------------------------------------- |
- | `text` | `string` | ✅ | — | Multi-line log text to compress. |
- | `format` | `string` | — | `summary` | Output format: `summary` (compact), `detailed` (full metadata), `json`. |
- | `depth` | `integer` | — | `4` | Parse tree depth for pattern matching (2-8). |
- | `simThreshold` | `number` | — | `0.4` | Similarity threshold for template matching (0.0-1.0). |
- | `maxTemplates` | `integer` | — | `50` | Maximum templates to include in output. |
-
- **Response:** Compressed log summary showing unique templates and occurrence counts.
-
- **Examples:**
+ **Stateful drains:** On the first call for a file, `readLog` creates a stateful drain. Subsequent calls append only new lines to the existing drain, preserving template IDs. This enables incremental log analysis as files grow. While any drain is active, a dynamic `flushLog` tool appears so drains can be released.
  
- ```
- compressText text="INFO Starting server\nINFO Starting server\nERROR Failed"
- compressText text="<paste logs here>" format="detailed"
- ```
-
- ---
-
- ### `deleteFile`
-
- Delete one or more files or directories.
+ | Parameter | Type | Required | Description |
+ | -------------- | --------- | -------- | ------------------------------------------------ |
+ | `path` | `string` | ✅ | Path to the log file. |
+ | `format` | `string` | ✅ | Output format: `summary`, `detailed`, or `json`. |
+ | `depth` | `integer` | ✅ | Parse tree depth (2-8). |
+ | `simThreshold` | `number` | ✅ | Similarity threshold (0-1). |
+ | `tail` | `integer` | — | Last N lines (first read only). |
+ | `head` | `integer` | — | First N lines (first read only). |
+ | `grep` | `string` | — | Regex filter for lines. |
  
- | Parameter | Type | Required | Default | Description |
- | ----------- | -------------------- | -------- | ------- | ----------------------------------------------------- |
- | `paths` | `string \| string[]` | ✅ | — | File or directory path(s) to delete. |
- | `recursive` | `boolean` | — | `false` | If true, delete directories and contents recursively. |
-
- **Response:** `"Deleted N path(s)."`
-
- **Examples:**
-
- ```
- deleteFile paths="temp.txt"
- deleteFile paths=["a.txt", "b.txt"]
- deleteFile paths="build/" recursive=true
- ```
-
- ---
-
- ### `estimateCompression`
-
- Estimate compression ratio without full processing. Samples a subset of logs to predict compression effectiveness.
-
- | Parameter | Type | Required | Default | Description |
- | ------------ | ---------- | -------- | ------- | ----------------------------------------- |
- | `lines` | `string[]` | ✅ | — | Log lines to sample. |
- | `sampleSize` | `integer` | — | `1000` | Number of lines to sample for estimation. |
-
- **Response:** JSON object with estimated compression ratio, token reduction, and recommendation.
+ **Response:** Compressed log summary showing unique templates and occurrence counts.
  
  **Examples:**
  
  ```
- estimateCompression lines=[...largeLogs]
- estimateCompression lines=[...logs] sampleSize=500
+ readLog path="/var/log/app.log" format="summary" depth=4 simThreshold=0.4
+ readLog path="./logs/server.log" format="detailed" depth=4 simThreshold=0.4 tail=1000
+ readLog path="app.log" format="json" depth=6 simThreshold=0.3 grep="ERROR|WARN"
  ```
  
  ---
  
- ### `moveFile`
+ ### `flushLog` (dynamic)
  
- Move one or more files or directories to a destination directory.
+ Release a log drain to free memory. The next `readLog` call creates a fresh drain. **This tool only appears when at least one drain is active.** When the last drain is flushed, the tool is automatically removed.
  
- | Parameter | Type | Required | Default | Description |
- | ----------- | -------------------- | -------- | ------- | ---------------------------------- |
- | `from` | `string \| string[]` | ✅ | — | Source path(s) to move. |
- | `to` | `string` | ✅ | — | Destination directory. |
- | `overwrite` | `boolean` | — | `false` | If true, overwrite existing files. |
-
- **Response:** `"Moved N path(s)."`
+ | Parameter | Type | Required | Description |
+ | --------- | -------- | -------- | ------------------------------ |
+ | `path` | `string` | ✅ | Path to the log file to flush. |
  
- **Response:** `"Moved N path(s)."`
+ **Response:** `"Flushed <file>. Released N templates from M lines."`
  
- **Examples:**
+ **Example:**
  
  ```
- moveFile from="old.txt" to="archive/"
- moveFile from=["a.txt", "b.txt"] to="backup/"
- moveFile from="config.json" to="dest/" overwrite=true
+ flushLog path="/var/log/app.log"
  ```
  
  ---
@@ -195,33 +152,31 @@ moveFile from="config.json" to="dest/" overwrite=true
  
  Wait for a specified duration before continuing.
  
- | Parameter | Type | Required | Description |
- | ------------ | --------- | -------- | ------------------------------------------------------------------------------------------------- |
- | `durationMs` | `integer` | ✅ | How long to wait in milliseconds. Must be ≥ 1 and ≤ the configured max (default: 300000 / 5 min). |
- | `reason` | `string` | ✅ | Why the wait is needed. Max 64 characters. |
+ | Parameter | Type | Required | Description |
+ | ----------------- | --------- | -------- | ----------------------------------------------------------------------------------------- |
+ | `durationSeconds` | `integer` | ✅ | How long to wait in seconds. Must be ≥ 1 and ≤ the configured max (default: 300 / 5 min). |
+ | `reason` | `string` | ✅ | Why the wait is needed. Max 64 characters. |
  
- **Response:** `"Nms have passed."`
+ **Response:** `"Ns have passed."`
  
  **Examples:**
  
  ```
- wait durationMs=2000 reason="settling after write"
- wait durationMs=5000 reason="rate limit cooldown"
- wait durationMs=500 reason="animation to complete"
+ wait durationSeconds=2 reason="settling after write"
+ wait durationSeconds=5 reason="rate limit cooldown"
+ wait durationSeconds=1 reason="animation to complete"
  ```
  
  ## Environment Variables
  
- | Variable | Default | Description |
- | --------------------- | -------- | --------------------------------------------------------------------------------------------------------- |
- | `waitMaxDurationMs` | `300000` | Override the maximum allowed `durationMs`. Must be a positive number. Server refuses to start if invalid. |
- | `analyzePatterns` | _(on)_ | Set to `"false"` to disable the `analyzePatterns` tool at startup. |
- | `compressLogs` | _(on)_ | Set to `"false"` to disable the `compressLogs` tool at startup. |
- | `compressText` | _(on)_ | Set to `"false"` to disable the `compressText` tool at startup. |
- | `deleteFile` | _(on)_ | Set to `"false"` to disable the `deleteFile` tool at startup. |
- | `estimateCompression` | _(on)_ | Set to `"false"` to disable the `estimateCompression` tool at startup. |
- | `moveFile` | _(on)_ | Set to `"false"` to disable the `moveFile` tool at startup. |
- | `wait` | _(on)_ | Set to `"false"` to disable the `wait` tool at startup. |
+ | Variable | Default | Description |
+ | ------------------------ | ------- | -------------------------------------------------------------------------------------------------------------- |
+ | `waitMaxDurationSeconds` | `300` | Override the maximum allowed `durationSeconds`. Must be a positive number. Server refuses to start if invalid. |
+ | `readLogTimeoutMs` | `5000` | Override the timeout for `readLog` processing in milliseconds. Server refuses to start if invalid. |
+ | `deleteFile` | _(on)_ | Set to `"false"` to disable the `deleteFile` tool at startup. |
+ | `moveFile` | _(on)_ | Set to `"false"` to disable the `moveFile` tool at startup. |
+ | `readLog` | _(on)_ | Set to `"false"` to disable the `readLog` tool at startup. |
+ | `wait` | _(on)_ | Set to `"false"` to disable the `wait` tool at startup. |
  
  ### Disabling Individual Tools
  
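The stateful-drain behavior documented for `readLog` and `flushLog` above can be sketched as follows. This is a hypothetical illustration of the per-file bookkeeping only: `countTemplates` is a stand-in for logpare's real pattern extraction (it just buckets lines by their first token), and none of these helper names come from the package.

```javascript
// Sketch of readLog's incremental bookkeeping (hypothetical, not the
// published implementation): each file path maps to a state holding the
// number of lines already ingested; later reads feed only the lines
// past that offset into the template store.
const drains = new Map();

// Stand-in for logpare's pattern extraction: group lines by first token.
function countTemplates(lines, templates) {
  for (const line of lines) {
    const key = line.split(" ")[0];
    templates.set(key, (templates.get(key) ?? 0) + 1);
  }
}

function readLog(path, lines) {
  let state = drains.get(path);
  if (!state) {
    state = { templates: new Map(), lastLine: 0 };
    drains.set(path, state);
  }
  const newLines = lines.slice(state.lastLine); // only lines appended since last read
  countTemplates(newLines, state.templates);
  state.lastLine = lines.length;
  return { added: newLines.length, templates: state.templates.size };
}

function flushLog(path) {
  return drains.delete(path); // next readLog starts a fresh drain
}
```

The first call for a path ingests the whole file; a second call with a grown file ingests only the appended lines, which is why template IDs stay stable across reads.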
package/dist/index.js CHANGED
@@ -2,29 +2,20 @@
  import { createRequire } from "node:module";
  import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
  import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
- import { register as registerAnalyzePatterns } from "./tools/analyzePatterns.js";
- import { register as registerCompressLogs } from "./tools/compressLogs.js";
- import { register as registerCompressText } from "./tools/compressText.js";
  import { register as registerDeleteFile } from "./tools/deleteFile.js";
- import { register as registerEstimateCompression } from "./tools/estimateCompression.js";
  import { register as registerMoveFile } from "./tools/moveFile.js";
+ import { register as registerReadLog } from "./tools/readLog.js";
  import { register as registerWait } from "./tools/wait.js";
  const require = createRequire(import.meta.url);
  const { version } = require("../package.json");
  const isEnabled = (name) => process.env[name] !== "false";
  const server = new McpServer({ name: "mcp-multitool", version });
- if (isEnabled("analyzePatterns"))
-     registerAnalyzePatterns(server);
- if (isEnabled("compressLogs"))
-     registerCompressLogs(server);
- if (isEnabled("compressText"))
-     registerCompressText(server);
  if (isEnabled("deleteFile"))
      registerDeleteFile(server);
- if (isEnabled("estimateCompression"))
-     registerEstimateCompression(server);
  if (isEnabled("moveFile"))
      registerMoveFile(server);
+ if (isEnabled("readLog"))
+     registerReadLog(server);
  if (isEnabled("wait"))
      registerWait(server);
  await server.connect(new StdioServerTransport());
@@ -13,6 +13,10 @@ export function register(server) {
      server.registerTool("deleteFile", {
          description: "Delete one or more files or directories.",
          inputSchema: schema,
+         annotations: {
+             destructiveHint: true,
+             openWorldHint: false,
+         },
      }, async (input) => {
          try {
              const paths = Array.isArray(input.paths) ? input.paths : [input.paths];
@@ -24,6 +24,10 @@ export function register(server) {
      server.registerTool("moveFile", {
          description: "Move one or more files or directories to a destination directory.",
          inputSchema: schema,
+         annotations: {
+             destructiveHint: true,
+             openWorldHint: false,
+         },
      }, async (input) => {
          try {
              const sources = Array.isArray(input.from) ? input.from : [input.from];
@@ -0,0 +1,2 @@
+ import type { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
+ export declare function register(server: McpServer): void;
@@ -0,0 +1,152 @@
+ import { readFile } from "node:fs/promises";
+ import { resolve, basename } from "node:path";
+ import { z } from "zod";
+ import { createDrain } from "logpare";
+ const timeoutMs = (() => {
+     const env = process.env.readLogTimeoutMs;
+     if (!env)
+         return 5000;
+     const n = Number(env);
+     if (!Number.isFinite(n) || n <= 0) {
+         process.stderr.write(`Invalid readLogTimeoutMs: "${env}".\n`);
+         process.exit(1);
+     }
+     return n;
+ })();
+ const drains = new Map();
+ let flushTool = null;
+ const schema = z
+     .object({
+         path: z.string().min(1).describe("Path to the log file."),
+         format: z.enum(["summary", "detailed", "json"]).describe("Output format."),
+         depth: z.number().int().min(2).max(8).describe("Parse tree depth (2-8)."),
+         simThreshold: z
+             .number()
+             .min(0)
+             .max(1)
+             .describe("Similarity threshold (0-1)."),
+         tail: z
+             .number()
+             .int()
+             .min(1)
+             .optional()
+             .describe("Last N lines (first read only)."),
+         head: z
+             .number()
+             .int()
+             .min(1)
+             .optional()
+             .describe("First N lines (first read only)."),
+         grep: z.string().optional().describe("Regex filter for lines."),
+     })
+     .refine((d) => !(d.head && d.tail), {
+         message: "Cannot use both head and tail.",
+     });
+ const flushSchema = z.object({
+     path: z.string().min(1).describe("Path to the log file to flush."),
+ });
+ const ok = (text) => ({ content: [{ type: "text", text }] });
+ const err = (text) => ({
+     isError: true,
+     content: [{ type: "text", text }],
+ });
+ export function register(server) {
+     server.registerTool("readLog", {
+         description: "Compress a log file using semantic pattern extraction (60-90% reduction). Creates stateful drains for incremental reads. Use flushLog to release.",
+         inputSchema: schema,
+         annotations: { readOnlyHint: true, openWorldHint: false },
+     }, async (input) => {
+         try {
+             return ok(await Promise.race([processLog(server, input), timeout()]));
+         }
+         catch (e) {
+             return err(String(e));
+         }
+     });
+ }
+ async function processLog(server, input) {
+     const path = resolve(process.cwd(), input.path);
+     let state = drains.get(path);
+     if (state &&
+         (state.depth !== input.depth || state.simThreshold !== input.simThreshold)) {
+         return `Error: Drain exists with depth=${state.depth}, simThreshold=${state.simThreshold}. Flush first.`;
+     }
+     const content = await readFile(path, "utf-8");
+     let lines = content.split(/\r?\n/).filter(Boolean);
+     if (!state) {
+         const wasEmpty = drains.size === 0;
+         if (input.head)
+             lines = lines.slice(0, input.head);
+         else if (input.tail)
+             lines = lines.slice(-input.tail);
+         if (input.grep)
+             lines = lines.filter((l) => new RegExp(input.grep).test(l));
+         if (!lines.length)
+             return "No log lines to process.";
+         const drain = createDrain({
+             depth: input.depth,
+             simThreshold: input.simThreshold,
+         });
+         drain.addLogLines(lines);
+         state = {
+             drain,
+             lastLine: lines.length,
+             depth: input.depth,
+             simThreshold: input.simThreshold,
+         };
+         drains.set(path, state);
+         if (wasEmpty)
+             registerFlush(server);
+         return `${state.drain.getResult(input.format).formatted}\n\n[New drain. ${state.lastLine} lines. Use flushLog when done.]`;
+     }
+     const newLines = lines.slice(state.lastLine);
+     if (!newLines.length) {
+         return `${state.drain.getResult(input.format).formatted}\n\n[No new lines. Total: ${state.lastLine}]`;
+     }
+     const filtered = input.grep ?
+         newLines.filter((l) => new RegExp(input.grep).test(l))
+         : newLines;
+     if (filtered.length)
+         state.drain.addLogLines(filtered);
+     state.lastLine = lines.length;
+     return `${state.drain.getResult(input.format).formatted}\n\n[+${newLines.length} lines. Total: ${state.lastLine}]`;
+ }
+ function registerFlush(server) {
+     if (flushTool)
+         return;
+     try {
+         flushTool = server.registerTool("flushLog", {
+             description: "Release a log drain to free memory. Next readLog creates fresh drain.",
+             inputSchema: flushSchema,
+             annotations: { destructiveHint: true, idempotentHint: true },
+         }, async (input) => {
+             try {
+                 const path = resolve(process.cwd(), input.path);
+                 const state = drains.get(path);
+                 if (!state) {
+                     if (!drains.size)
+                         return ok("No active drains.");
+                     return ok(`No drain for "${basename(path)}". Active: ${[...drains.keys()].map((p) => basename(p)).join(", ")}`);
+                 }
+                 const { totalClusters, lastLine } = {
+                     totalClusters: state.drain.totalClusters,
+                     lastLine: state.lastLine,
+                 };
+                 drains.delete(path);
+                 if (!drains.size && flushTool) {
+                     flushTool.remove();
+                     flushTool = null;
+                 }
+                 return ok(`Flushed ${basename(path)}. Released ${totalClusters} templates from ${lastLine} lines.`);
+             }
+             catch (e) {
+                 return err(String(e));
+             }
+         });
+         server.sendToolListChanged();
+     }
+     catch { }
+ }
+ function timeout() {
+     return new Promise((_, rej) => setTimeout(() => rej(new Error(`Timeout: ${timeoutMs}ms`)), timeoutMs));
+ }
@@ -1,24 +1,24 @@
  import { z } from "zod";
- const DEFAULT_MAX_MS = 300_000;
- const maxMs = parseMaxDuration();
+ const DEFAULT_MAX_SECONDS = 300;
+ const maxSeconds = parseMaxDuration();
  function parseMaxDuration() {
-     const env = process.env.waitMaxDurationMs;
+     const env = process.env.waitMaxDurationSeconds;
      if (!env)
-         return DEFAULT_MAX_MS;
+         return DEFAULT_MAX_SECONDS;
      const parsed = Number(env);
      if (!Number.isFinite(parsed) || parsed <= 0) {
-         process.stderr.write(`Invalid waitMaxDurationMs: "${env}". Must be a positive number.\n`);
+         process.stderr.write(`Invalid waitMaxDurationSeconds: "${env}". Must be a positive number.\n`);
          process.exit(1);
      }
      return parsed;
  }
  const schema = z.object({
-     durationMs: z
+     durationSeconds: z
          .number()
          .int()
          .min(1)
-         .max(maxMs)
-         .describe(`Milliseconds to wait. Min: 1, max: ${maxMs}.`),
+         .max(maxSeconds)
+         .describe(`Seconds to wait. Min: 1, max: ${maxSeconds}.`),
      reason: z
          .string()
          .min(1)
  .min(1)
@@ -29,12 +29,17 @@ export function register(server) {
      server.registerTool("wait", {
          description: "Wait for a specified duration before continuing.",
          inputSchema: schema,
+         annotations: {
+             readOnlyHint: true,
+             idempotentHint: true,
+         },
      }, async (input) => {
          try {
-             await new Promise((r) => setTimeout(r, input.durationMs));
+             const ms = Math.round(input.durationSeconds * 1000);
+             await new Promise((r) => setTimeout(r, ms));
              return {
                  content: [
-                     { type: "text", text: `${input.durationMs}ms have passed.` },
+                     { type: "text", text: `${input.durationSeconds}s have passed.` },
                  ],
              };
          }
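The seconds-based `wait` handler above validates the input against a configurable maximum and converts to milliseconds before sleeping. A minimal sketch of that conversion, assuming the default 300-second cap (the `waitMs` helper is hypothetical, mirroring the diff's `Math.round(input.durationSeconds * 1000)`):

```javascript
// Sketch of the new seconds-based wait: reject values outside
// [1, MAX_SECONDS] (as the zod schema does), then convert to the
// millisecond delay handed to setTimeout.
const MAX_SECONDS = 300; // default waitMaxDurationSeconds

function waitMs(durationSeconds) {
  if (!Number.isInteger(durationSeconds) ||
      durationSeconds < 1 ||
      durationSeconds > MAX_SECONDS) {
    throw new Error(`durationSeconds must be an integer in [1, ${MAX_SECONDS}]`);
  }
  return durationSeconds * 1000; // e.g. 2 -> 2000 ms
}
```

Callers that previously passed `durationMs=2000` against 0.1.5 would pass `durationSeconds=2` against 0.1.7.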
package/package.json CHANGED
@@ -1,7 +1,7 @@
  {
    "name": "mcp-multitool",
-   "version": "0.1.5",
-   "description": "MCP server with file operations (delete, move), log compression (60-90% token reduction), and timing utilities.",
+   "version": "0.1.7",
+   "description": "MCP server with file operations (delete, move) and timing utilities.",
    "license": "MIT",
    "type": "module",
    "main": "dist/index.js",
@@ -14,12 +14,10 @@
    "keywords": [
      "mcp",
      "model-context-protocol",
-     "log-compression",
      "file-operations",
      "delete-file",
      "move-file",
      "wait",
-     "logpare",
      "tool"
    ],
    "scripts": {
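The `registerFlush` logic in this diff demonstrates a dynamic-tool pattern: `flushLog` is registered when the first drain appears and removed when the last one is flushed, with the client notified of the change. A sketch of that lifecycle against a stubbed server (the stub is hypothetical and only mimics the two `McpServer` methods the pattern relies on; notification on removal is an assumption of this sketch):

```javascript
// Stub standing in for McpServer: tracks registered tools and how many
// tool-list-changed notifications were sent.
function makeStubServer() {
  const tools = new Map();
  let listChanges = 0;
  return {
    registerTool(name, _meta, _handler) {
      tools.set(name, true);
      return { remove: () => tools.delete(name) }; // handle used to unregister
    },
    sendToolListChanged() { listChanges += 1; },
    has: (name) => tools.has(name),
    changes: () => listChanges,
  };
}

// Dynamic-tool lifecycle: expose flushLog while any drain is active.
function makeDrainTracker(server) {
  const drains = new Set();
  let flushTool = null;
  return {
    addDrain(path) {
      drains.add(path);
      if (!flushTool) { // first drain: register flushLog dynamically
        flushTool = server.registerTool("flushLog", {}, () => {});
        server.sendToolListChanged();
      }
    },
    flush(path) {
      drains.delete(path);
      if (!drains.size && flushTool) { // last drain gone: remove the tool
        flushTool.remove();
        flushTool = null;
        server.sendToolListChanged();
      }
    },
  };
}
```

With two drains active, flushing one leaves `flushLog` registered; flushing the second removes it, matching the README's "the tool is automatically removed" behavior.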