disk-clean-mcp 0.1.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/LICENSE ADDED
@@ -0,0 +1,21 @@
1
+ MIT License
2
+
3
+ Copyright (c) 2026 Superandyfre
4
+
5
+ Permission is hereby granted, free of charge, to any person obtaining a copy
6
+ of this software and associated documentation files (the "Software"), to deal
7
+ in the Software without restriction, including without limitation the rights
8
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9
+ copies of the Software, and to permit persons to whom the Software is
10
+ furnished to do so, subject to the following conditions:
11
+
12
+ The above copyright notice and this permission notice shall be included in all
13
+ copies or substantial portions of the Software.
14
+
15
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21
+ SOFTWARE.
package/README.md ADDED
@@ -0,0 +1,429 @@
1
+ # Disk Clean MCP
2
+
3
+ [![npm version](https://img.shields.io/npm/v/disk-clean-mcp?color=2e7d32)](https://www.npmjs.com/package/disk-clean-mcp)
4
+ [![build](https://github.com/Superandyfre/disk-clean-mcp/actions/workflows/build.yml/badge.svg)](https://github.com/Superandyfre/disk-clean-mcp/actions/workflows/build.yml)
5
+ [![license](https://img.shields.io/badge/license-MIT-0a0a0a.svg)](https://github.com/Superandyfre/disk-clean-mcp/blob/main/LICENSE)
6
+ [![node](https://img.shields.io/badge/node-%3E%3D18-43853d.svg)](https://nodejs.org/en/)
7
+
8
+ **Available Languages:** [English](#disk-clean-mcp) | [Français](#fr) | [简体中文](#zh-cn) | [繁體中文](#zh-tw) | [한국어](#ko) | [日本語](#ja)
9
+
10
+ ---
11
+
12
+ ## English
13
+
14
+ A Model Context Protocol (MCP) server that analyzes local disk usage in read-only mode and suggests cleanup targets by size, type, recency, and duplicates.
15
+
16
+ ### Features
17
+ - **Directory Scanning**: Scan directories to summarize total size, file count, and directory count with configurable depth limits and glob patterns.
18
+ - **Extension Analysis**: Breakdown disk usage by file extension to identify the largest file types.
19
+ - **Directory Ranking**: List the heaviest subdirectories by aggregate file size.
20
+ - **Large File Detection**: Find the largest files with optional filters for size, age, and glob patterns.
21
+ - **Stale File Suggestions**: Identify large and old files that may be candidates for cleanup.
22
+ - **Duplicate Detection**: Find groups of files with identical sizes and content hashes (read-only).
23
+ - **Configurable Limits**: Use `ignoreGlobs`, `includeGlobs`, `maxFiles`, and `maxDepth` to refine scope and control workload (an example argument object is sketched after this list).
24
+
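
These glob and limit options map directly onto each tool's input schema in `dist/index.js`. A minimal sketch of a `scan_summary` argument object, with an illustrative path and patterns:

```json
{
  "rootPath": "/path/to/workdir",
  "maxDepth": 6,
  "ignoreGlobs": ["**/coverage/**", "**/*.log"],
  "maxFiles": 20000
}
```

One caveat, visible in `shouldSkip` in `dist/scan.js`: `includeGlobs` is applied to directories as well as files during the walk, so include patterns that only match file names will also prune the directories above them.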
25
+ ### Requirements
26
+ - Node.js >= 18
27
+ - Read-only operation (no delete/move functions)
28
+ - Common folders ignored by default: `node_modules`, `.git`, `dist`, `build`, `.cache`
29
+
30
+ ### Installation & Usage
31
+ #### Published Package (Recommended)
32
+ ```bash
33
+ npm install -g disk-clean-mcp
34
+ # or use directly without installation
35
+ npx disk-clean-mcp
36
+ ```
37
+
38
+ #### Configuration for Claude Desktop
39
+ ```json
40
+ {
41
+ "mcpServers": {
42
+ "disk-clean": {
43
+ "command": "disk-clean-mcp",
44
+ "args": [],
45
+ "cwd": "/path/to/workdir"
46
+ }
47
+ }
48
+ }
49
+ ```
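
If the package is not installed globally, the same server entry can launch it through `npx` instead. A possible variant (not taken from the project's docs; `-y` simply skips npx's install prompt):

```json
{
  "mcpServers": {
    "disk-clean": {
      "command": "npx",
      "args": ["-y", "disk-clean-mcp"],
      "cwd": "/path/to/workdir"
    }
  }
}
```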
50
+
51
+ #### Local Development
52
+ ```bash
53
+ git clone https://github.com/Superandyfre/disk-clean-mcp.git
54
+ cd disk-clean-mcp
55
+ npm install
56
+ npm run build
57
+ npm start
58
+ # or dev mode with auto-compilation
59
+ npm run dev
60
+ ```
61
+
62
+ ### Tools
63
+ - **scan_summary** – Get total size, file count, directory count with optional depth and ignore globs
64
+ - **by_type** – Top file extensions ranked by total size
65
+ - **top_dirs** – Heaviest subdirectories by aggregate size
66
+ - **top_files** – Largest files with filters (min size, age, glob include/exclude); an example call is sketched after this list
67
+ - **stale_candidates** – Large and old files (cleanup suggestions)
68
+ - **duplicate_candidates** – Groups of files with identical size and content hash (read-only)
69
+
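
As a concrete illustration of the filters, a `top_files` call could receive arguments like the following; the field names come from the schema in `dist/index.js`, while the values are only examples:

```json
{
  "rootPath": "/path/to/workdir",
  "limit": 20,
  "minSizeMB": 100,
  "olderThanDays": 180,
  "excludeGlobs": ["**/*.iso"],
  "maxDepth": 8
}
```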
70
+ ### MCP Hub / Claude Submission
+ - Status: pending listing; add the MCP Hub link after approval.
+ - Provide when submitting: the repo URL, the npm install commands (`npm install -g disk-clean-mcp` / `npx disk-clean-mcp`), the command name (`disk-clean-mcp`), Node >= 18, the MIT license, and the tool list above.
+ - Config snippet (Claude Desktop): see the "Configuration for Claude Desktop" JSON above.
75
+
76
+ ### Release Checklist
77
+ - [ ] Run `npm run build` to verify compilation
78
+ - [ ] Confirm Node.js >= 18
79
+ - [ ] Update version: `npm version patch|minor|major`
80
+ - [ ] Publish to npm: `npm publish` (requires npm login)
81
+ - [ ] Update README and CHANGELOG if applicable
82
+ - [ ] Update repository URLs in package.json if needed
83
+
84
+ ### License
85
+ MIT – Copyright (c) 2026 Superandyfre
86
+
87
+ ---
88
+
89
+ <a id="fr"></a>
90
+
91
+ ## Fran?ais
92
+
93
+ Un serveur Model Context Protocol (MCP) qui analyse l'utilisation du disque local en mode lecture seule et sugg��re des cibles de nettoyage par taille, type, r��cence et doublons.
94
+
95
+ ### Fonctionnalit��s
96
+ - **Analyse de r��pertoires**: Scannez les r��pertoires pour r��sumer la taille totale, le nombre de fichiers et r��pertoires avec limites de profondeur configurables et motifs glob.
97
+ - **Analyse par extension**: D��composez l'utilisation du disque par extension de fichier pour identifier les types de fichiers les plus volumineux.
98
+ - **Classement des r��pertoires**: Listez les sous-r��pertoires les plus volumineux par taille de fichier agr��g��e.
99
+ - **D��tection de fichiers volumineux**: Trouvez les fichiers les plus volumineux avec filtres optionnels par taille, ?ge et motifs glob.
100
+ - **Suggestions de fichiers obsol��tes**: Identifiez les fichiers volumineux et anciens pour nettoyage potentiel.
101
+ - **D��tection des doublons**: Trouvez les groupes de fichiers avec tailles et hachages de contenu identiques (lecture seule).
102
+ - **Limites configurables**: Utilisez `ignoreGlobs`, `includeGlobs`, `maxFiles` et `maxDepth` pour affiner la port��e.
103
+
104
+ ### Configuration requise
105
+ - Node.js >= 18
106
+ - Op��ration en lecture seule (pas de fonctions de suppression/d��placement)
107
+ - Dossiers ignor��s par d��faut: `node_modules`, `.git`, `dist`, `build`, `.cache`
108
+
109
+ ### Installation et utilisation
110
+ #### Paquet publi�� (recommand��)
111
+ ```bash
112
+ npm install -g disk-clean-mcp
113
+ # ou utiliser directement
114
+ npx disk-clean-mcp
115
+ ```
116
+
117
+ #### Configuration pour Claude Desktop
118
+ ```json
119
+ {
120
+ "mcpServers": {
121
+ "disk-clean": {
122
+ "command": "disk-clean-mcp",
123
+ "args": [],
124
+ "cwd": "/path/to/workdir"
125
+ }
126
+ }
127
+ }
128
+ ```
129
+
130
+ #### D��veloppement local
131
+ ```bash
132
+ git clone https://github.com/Superandyfre/disk-clean-mcp.git
133
+ cd disk-clean-mcp
134
+ npm install
135
+ npm run build
136
+ npm start
137
+ npm run dev # mode dev avec recompilation automatique
138
+ ```
139
+
140
+ ### Outils disponibles
141
+ - **scan_summary** �C Obtenez la taille totale, le nombre de fichiers et r��pertoires
142
+ - **by_type** �C Extensions de fichier class��es par taille totale
143
+ - **top_dirs** �C Sous-r��pertoires les plus volumineux
144
+ - **top_files** �C Fichiers les plus volumineux avec filtres
145
+ - **stale_candidates** �C Fichiers volumineux et anciens
146
+ - **duplicate_candidates** �C Groupes de fichiers doublons
147
+
148
+ ### Licence
149
+ MIT �C Copyright (c) 2026 Superandyfre
150
+
151
+ ---
152
+
153
+ <a id="zh-cn"></a>
154
+
155
+ ## ��������
156
+
157
+ һ�� Model Context Protocol (MCP) �����������ڷ������ش���ʹ�������ֻ��ģʽ����������С�����͡��¾ɳ̶Ⱥ��ظ��������Ŀ�ꡣ
158
+
159
+ ### ��������
160
+ - **Ŀ¼ɨ��**: ɨ��Ŀ¼�Ի����ܴ�С���ļ�����Ŀ¼����֧�ֿ����õ�������ƺ�ȫ��ƥ��ģʽ��
161
+ - **��չ������**: ���ļ���չ���ֽ����ʹ�������ʶ�������ļ����͡�
162
+ - **Ŀ¼����**: ���ۺ��ļ���С�г�������Ŀ¼��
163
+ - **���ļ����**: ���������ļ���֧�ְ���С�������ȫ��ģʽ���ˡ�
164
+ - **�¾��ļ�����**: ʶ�������Ҫ�����Ĵ������Ͼɵ��ļ���
165
+ - **�ظ�����**: ���Ҿ�����ͬ��С�����ݹ�ϣ���ļ��飨ֻ������
166
+ - **����������**: ʹ�� `ignoreGlobs`��`includeGlobs`��`maxFiles` �� `maxDepth` ��������Χ�Ϳ��ƹ������ء�
167
+
168
+ ### ϵͳҪ��
169
+ - Node.js >= 18
170
+ - ֻ����������ɾ��/�ƶ����ܣ�
171
+ - Ĭ�Ϻ��Ե��ļ��У�`node_modules`��`.git`��`dist`��`build`��`.cache`
172
+
173
+ ### ��װ��ʹ��
174
+ #### �ѷ����İ����Ƽ���
175
+ ```bash
176
+ npm install -g disk-clean-mcp
177
+ # ��ֱ��ʹ��
178
+ npx disk-clean-mcp
179
+ ```
180
+
181
+ #### Claude Desktop ����ʾ��
182
+ ```json
183
+ {
184
+ "mcpServers": {
185
+ "disk-clean": {
186
+ "command": "disk-clean-mcp",
187
+ "args": [],
188
+ "cwd": "/path/to/workdir"
189
+ }
190
+ }
191
+ }
192
+ ```
193
+
194
+ #### ���ؿ���
195
+ ```bash
196
+ git clone https://github.com/Superandyfre/disk-clean-mcp.git
197
+ cd disk-clean-mcp
198
+ npm install
199
+ npm run build
200
+ npm start
201
+ npm run dev # ����ģʽ���Զ�����
202
+ ```
203
+
204
+ ### �����б�
205
+ - **scan_summary** �C ��ȡ�ܴ�С���ļ�����Ŀ¼����֧����Ⱥͺ���ģʽ
206
+ - **by_type** �C ���ܴ�С���е��ļ���չ��
207
+ - **top_dirs** �C ������Ŀ¼�����ۺϴ�С��
208
+ - **top_files** �C �����ļ���֧�ֹ��ˣ���С�����䡢ģʽ��
209
+ - **stale_candidates** �C �������Ͼɵ��ļ����������飩
210
+ - **duplicate_candidates** �C �ظ��ļ��飨��ͬ��С�͹�ϣ��
211
+
212
+ ### ��������嵥
213
+ - [ ] ���� `npm run build` ��֤����
214
+ - [ ] ȷ�� Node.js >= 18
215
+ - [ ] ���°汾��`npm version patch|minor|major`
216
+ - [ ] ������ npm��`npm publish`����Ҫ npm ��¼��
217
+ - [ ] ���� README �� CHANGELOG�������ã�
218
+
219
+ ### ����֤
220
+ MIT �C Copyright (c) 2026 Superandyfre
221
+
222
+ ---
223
+
224
+ <a id="zh-tw"></a>
225
+
226
+ ## ���w����
227
+
228
+ һ�� Model Context Protocol (MCP) �ŷ�������춷������C�ŵ�ʹ����r��Ψ�xģʽ�����K����С����͡����f�̶Ⱥ����}헽��h����Ŀ�ˡ�
229
+
230
+ ### ��������
231
+ - **Ŀ䛒���**: ����Ŀ��ԏ�������С���n������Ŀ䛔���֧Ԯ���O����������ƺ�ͨ���䌦ģʽ��
232
+ - **���n������**: ���n�����n���ֽ�ŵ�ʹ����r���R�e���ęn����͡�
233
+ - **Ŀ�����**: ���ۺϙn����С�г�������Ŀ䛡�
234
+ - **��n���ɜy**: �������ęn����֧Ԯ����С�����g��ͨ��ģʽ�Y�x��
235
+ - **��f�n�����h**: �R�e������Ҫ�����Ĵ��������f�ęn����
236
+ - **���}헂ɜy**: ���Ҿ�����ͬ��С�̓����s���ęn���M��Ψ�x����
237
+ - **���O������**: ʹ�� `ignoreGlobs`��`includeGlobs`��`maxFiles` �� `maxDepth` ���{�������Ϳ��ƹ���ؓ�d��
238
+
239
+ ### ϵ�y����
240
+ - Node.js >= 18
241
+ - Ψ�x���I���o�h��/�Ƅӹ��ܣ�
242
+ - �A�O���Ե��Y�ϊA��`node_modules`��`.git`��`dist`��`build`��`.cache`
243
+
244
+ ### ���b�cʹ��
245
+ #### �Ѱl�ѵ��׼������]��
246
+ ```bash
247
+ npm install -g disk-clean-mcp
248
+ # ��ֱ��ʹ��
249
+ npx disk-clean-mcp
250
+ ```
251
+
252
+ #### Claude Desktop �O������
253
+ ```json
254
+ {
255
+ "mcpServers": {
256
+ "disk-clean": {
257
+ "command": "disk-clean-mcp",
258
+ "args": [],
259
+ "cwd": "/path/to/workdir"
260
+ }
261
+ }
262
+ }
263
+ ```
264
+
265
+ #### ���C�_�l
266
+ ```bash
267
+ git clone https://github.com/Superandyfre/disk-clean-mcp.git
268
+ cd disk-clean-mcp
269
+ npm install
270
+ npm run build
271
+ npm start
272
+ npm run dev # �_�lģʽ���ԄӾ��g
273
+ ```
274
+
275
+ ### �������
276
+ - **scan_summary** �C ȡ�ÿ���С���n������Ŀ䛔���֧Ԯ��Ⱥͺ���ģʽ
277
+ - **by_type** �C ������С���еęn�����n��
278
+ - **top_dirs** �C ������Ŀ䛣����ۺϴ�С��
279
+ - **top_files** �C ���ęn����֧Ԯ�Y�x����С�����g��ģʽ��
280
+ - **stale_candidates** �C ���������f�ęn�����������h��
281
+ - **duplicate_candidates** �C ���}�n���M����ͬ��С���s����
282
+
283
+ ### �ڙ�
284
+ MIT �C Copyright (c) 2026 Superandyfre
285
+
286
+ ---
287
+
288
+ <a id="ko"></a>
289
+
290
+ ## ???
291
+
292
+ ?? ??? ???? ?? ?? ??? ???? ??, ??, ??? ? ?? ???? ?? ??? ???? Model Context Protocol (MCP) ?????.
293
+
294
+ ### ??
295
+ - **???? ??**: ????? ???? ? ??, ?? ? ? ???? ?? ????, ?? ??? ?? ?? ? ??? ??? ?????.
296
+ - **??? ??**: ?? ????? ??? ???? ???? ?? ? ?? ??? ?????.
297
+ - **???? ??**: ?? ?? ???? ?? ? ?? ????? ?????.
298
+ - **? ?? ??**: ??, ?? ? ??? ???? ??? ??? ???? ?? ? ??? ????.
299
+ - **??? ?? ??**: ?? ??? ? ? ?? ?? ??? ??? ?????.
300
+ - **?? ??**: ??? ?? ? ??? ??? ?? ?? ??? ????(?? ??).
301
+ - **?? ??? ??**: `ignoreGlobs`, `includeGlobs`, `maxFiles` ? `maxDepth`? ???? ??? ?????.
302
+
303
+ ### ?? ??
304
+ - Node.js >= 18
305
+ - ?? ?? ??(??/?? ?? ??)
306
+ - ????? ???? ??: `node_modules`, `.git`, `dist`, `build`, `.cache`
307
+
308
+ ### ?? ? ??
309
+ #### ??? ???(??)
310
+ ```bash
311
+ npm install -g disk-clean-mcp
312
+ # ?? ?? ??
313
+ npx disk-clean-mcp
314
+ ```
315
+
316
+ #### Claude Desktop ?? ?
317
+ ```json
318
+ {
319
+ "mcpServers": {
320
+ "disk-clean": {
321
+ "command": "disk-clean-mcp",
322
+ "args": [],
323
+ "cwd": "/path/to/workdir"
324
+ }
325
+ }
326
+ }
327
+ ```
328
+
329
+ #### ?? ??
330
+ ```bash
331
+ git clone https://github.com/Superandyfre/disk-clean-mcp.git
332
+ cd disk-clean-mcp
333
+ npm install
334
+ npm run build
335
+ npm start
336
+ npm run dev # ?? ???? ?? ?? ??
337
+ ```
338
+
339
+ ### ??
340
+ - **scan_summary** �C ? ??, ?? ?, ???? ? ??
341
+ - **by_type** �C ? ???? ??? ?? ???
342
+ - **top_dirs** �C ?? ???? ?? ? ?? ????
343
+ - **top_files** �C ??? ?? ?? ? ??(??, ??, ??)
344
+ - **stale_candidates** �C ?? ??? ??(?? ??)
345
+ - **duplicate_candidates** �C ?? ?? ??(??? ?? ? ??)
346
+
347
+ ### ????
348
+ MIT �C Copyright (c) 2026 Superandyfre
349
+
350
+ ---
351
+
352
+ <a id="ja"></a>
353
+
354
+ ## �ձ��Z
355
+
356
+ ���`����ǥ�������ʹ��״�r���i��ȡ�ꌟ�å�`�ɤǷ��������������������ס������ԡ���������}�ˤ�äƥ���`�󥢥å׌�����᰸����Model Context Protocol��MCP�����`�Щ`�Ǥ���
357
+
358
+ ### �C��
359
+ - **�ǥ��쥯�ȥꥹ�����**: �ǥ��쥯�ȥ�򥹥���󤷤ơ���Ӌ���������ե����������ǥ��쥯�ȥ�����Ҫ�s���ޤ������ɿ��ܤ�������ޤȥ����֥ѥ��`��򥵥ݩ`�Ȥ��ޤ���
360
+ - **��������**: �ե����뒈���ӄe�˥ǥ�����ʹ������ֽ⤷�����Υե����륿���פ��R�e���ޤ���
361
+ - **�ǥ��쥯�ȥ��󥭥�**: ��Ӌ�ե����륵�����e�����Υ��֥ǥ��쥯�ȥ��һ�E��ʾ���ޤ���
362
+ - **�󤭤ʥե�����ʳ�**: �����������h�������֥ѥ��`��ǥե��륿��󥰿��ܤ����Υե������������ޤ���
363
+ - **�Ť��ե�������᰸**: ����`�󥢥åפΌ���Ȥʤ�����Ԥ�����󤭤��ƹŤ��ե�������R�e���ޤ���
364
+ - **���}�ʳ�**: ͬ���������ȥ���ƥ�ĥϥå����֤ĥե����륰��`�פ�������ޤ����i��ȡ�ꌟ�ã���
365
+ - **���ɿ��ܤ�����**: `ignoreGlobs`��`includeGlobs`��`maxFiles`��`maxDepth`��ʹ�ä��ƥ����`�פ��{�����ޤ���
366
+
367
+ ### Ҫ��
368
+ - Node.js >= 18
369
+ - �i��ȡ�ꌟ�ò���������/�ƄәC�ܤʤ���
370
+ - �ǥե���Ȥǟoҕ�����ե����: `node_modules`��`.git`��`dist`��`build`��`.cache`
371
+
372
+ ### ���󥹥ȩ`���ʹ��
373
+ #### ���_�ѥå��`�����ƊX��
374
+ ```bash
375
+ npm install -g disk-clean-mcp
376
+ # �ޤ���ֱ��ʹ��
377
+ npx disk-clean-mcp
378
+ ```
379
+
380
+ #### Claude Desktop�O����
381
+ ```json
382
+ {
383
+ "mcpServers": {
384
+ "disk-clean": {
385
+ "command": "disk-clean-mcp",
386
+ "args": [],
387
+ "cwd": "/path/to/workdir"
388
+ }
389
+ }
390
+ }
391
+ ```
392
+
393
+ #### ���`�����_�k
394
+ ```bash
395
+ git clone https://github.com/Superandyfre/disk-clean-mcp.git
396
+ cd disk-clean-mcp
397
+ npm install
398
+ npm run build
399
+ npm start
400
+ npm run dev # �Ԅӥ���ѥ��븶�����_�k��`��
401
+ ```
402
+
403
+ ### �ĩ`��
404
+ - **scan_summary** �C ��Ӌ���������ե����������ǥ��쥯�ȥ�����ȡ��
405
+ - **by_type** �C ��Ӌ�������e�˥��`�Ȥ��줿�ե����뒈����
406
+ - **top_dirs** �C ��Ӌ�������e�����Υ��֥ǥ��쥯�ȥ�
407
+ - **top_files** �C �ե��륿���������ե����루�����������h���ѥ��`��
408
+ - **stale_candidates** �C �󤭤��ƹŤ��ե����루����`�󥢥å��᰸��
409
+ - **duplicate_candidates** �C ���}�ե����륰��`�ף�ͬ���������ȥϥå��壩
410
+
411
+ ### �饤����
412
+ MIT �C Copyright (c) 2026 Superandyfre
413
+ - ���� README/CHANGELOG ���иĶ�
414
+ - �� package.json �е� repository / bugs / homepage �滻Ϊʵ�ʲֿ��ַ
415
+
416
+ ## Tools
417
+ - `scan_summary` – total size/files/dirs, optional depth and ignore globs.
418
+ - `by_type` – top extensions by total size.
419
+ - `top_dirs` – heaviest subdirectories.
420
+ - `top_files` – largest files with filters (min size, age, glob include/exclude).
421
+ - `stale_candidates` – large and old files.
422
+ - `duplicate_candidates` – groups of files with identical sizes and hashes (read-only).
423
+
424
+ ## Notes
425
+ - Read-only: no delete/move operations.
426
+ - Defaults skip common noisy folders: `node_modules`, `.git`, `dist`, `build`, `.cache`.
427
+ - Use `ignoreGlobs` / `includeGlobs` to refine scope.
428
+ - Use `maxFiles`/`maxDepth` to control workload; once the `maxFiles` cap is reached, results are flagged `truncated` (a sample response is sketched below).
429
+ - License: MIT
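
For orientation, `scan_summary` answers with a single text content block whose JSON payload has the shape below (field names from `dist/index.js`; the numbers are illustrative):

```json
{
  "root": "/path/to/workdir",
  "totalBytes": 734003200,
  "totalHuman": "700.0 MB",
  "files": 12873,
  "dirs": 941,
  "recordedFiles": 12873,
  "truncated": false,
  "note": "Read-only scan; no deletions performed"
}
```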
package/dist/index.js ADDED
@@ -0,0 +1,156 @@
1
+ #!/usr/bin/env node
2
+ import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
3
+ import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
4
+ import { z } from "zod";
5
+ import path from "path";
6
+ import { walk, topExtensions, topDirectories, topFiles, staleFiles, duplicateGroups, bytesToHuman } from "./scan.js";
7
+ const scanSummarySchema = z.object({
8
+ rootPath: z.string(),
9
+ maxDepth: z.number().int().min(0).max(50).optional().describe("Maximum depth to descend (default 10)"),
10
+ followSymlinks: z.boolean().optional().describe("Whether to follow symlinks (default false)"),
11
+ ignoreGlobs: z.array(z.string()).optional().describe("Glob patterns to ignore"),
12
+ includeGlobs: z.array(z.string()).optional().describe("Glob patterns to include; if set, only matches are scanned"),
13
+ maxFiles: z.number().int().min(1).max(200000).optional().describe("Limit files recorded for reports (default 50000)")
14
+ });
15
+ const byTypeSchema = z.object({
16
+ rootPath: z.string(),
17
+ limit: z.number().int().min(1).max(100).optional(),
18
+ maxDepth: z.number().int().min(0).max(50).optional(),
19
+ ignoreGlobs: z.array(z.string()).optional(),
20
+ includeGlobs: z.array(z.string()).optional(),
21
+ maxFiles: z.number().int().min(1).max(200000).optional()
22
+ });
23
+ const topDirsSchema = z.object({
24
+ rootPath: z.string(),
25
+ limit: z.number().int().min(1).max(100).optional(),
26
+ maxDepth: z.number().int().min(0).max(50).optional(),
27
+ ignoreGlobs: z.array(z.string()).optional(),
28
+ includeGlobs: z.array(z.string()).optional(),
29
+ maxFiles: z.number().int().min(1).max(200000).optional()
30
+ });
31
+ const topFilesSchema = z.object({
32
+ rootPath: z.string(),
33
+ limit: z.number().int().min(1).max(200).optional(),
34
+ minSizeMB: z.number().nonnegative().optional(),
35
+ olderThanDays: z.number().positive().optional(),
36
+ includeGlobs: z.array(z.string()).optional(),
37
+ excludeGlobs: z.array(z.string()).optional(),
38
+ maxDepth: z.number().int().min(0).max(50).optional(),
39
+ ignoreGlobs: z.array(z.string()).optional(),
40
+ maxFiles: z.number().int().min(1).max(200000).optional()
41
+ });
42
+ const staleSchema = z.object({
43
+ rootPath: z.string(),
44
+ limit: z.number().int().min(1).max(200).optional(),
45
+ minSizeMB: z.number().nonnegative().optional(),
46
+ olderThanDays: z.number().positive().optional(),
47
+ maxDepth: z.number().int().min(0).max(50).optional(),
48
+ ignoreGlobs: z.array(z.string()).optional(),
49
+ includeGlobs: z.array(z.string()).optional(),
50
+ maxFiles: z.number().int().min(1).max(200000).optional()
51
+ });
52
+ const dupSchema = z.object({
53
+ rootPath: z.string(),
54
+ limit: z.number().int().min(1).max(50).optional(),
55
+ maxDepth: z.number().int().min(0).max(50).optional(),
56
+ ignoreGlobs: z.array(z.string()).optional(),
57
+ includeGlobs: z.array(z.string()).optional(),
58
+ maxFiles: z.number().int().min(1).max(50000).optional()
59
+ });
60
+ const mcp = new McpServer({ name: "disk-clean-mcp", version: "0.1.0" });
61
+ async function runWalk(args) {
62
+ const res = await walk({
63
+ rootPath: args.rootPath,
64
+ maxDepth: args.maxDepth,
65
+ followSymlinks: args.followSymlinks,
66
+ ignoreGlobs: args.ignoreGlobs,
67
+ includeGlobs: args.includeGlobs,
68
+ maxFiles: args.maxFiles
69
+ });
70
+ return res;
71
+ }
72
+ mcp.registerTool("scan_summary", {
73
+ description: "Scan a directory (read-only) to summarize total size, files, dirs. Supports ignore globs and depth limits.",
74
+ inputSchema: scanSummarySchema
75
+ }, async (input) => {
76
+ const result = await runWalk(input);
77
+ return {
78
+ content: [
79
+ {
80
+ type: "text",
81
+ text: JSON.stringify({
82
+ root: path.resolve(input.rootPath),
83
+ totalBytes: result.totalSize,
84
+ totalHuman: bytesToHuman(result.totalSize),
85
+ files: result.fileCount,
86
+ dirs: result.dirCount,
87
+ recordedFiles: result.files.length,
88
+ truncated: result.truncated,
89
+ note: "Read-only scan; no deletions performed"
90
+ }, null, 2)
91
+ }
92
+ ]
93
+ };
94
+ });
95
+ mcp.registerTool("by_type", {
96
+ description: "Top file extensions by total size.",
97
+ inputSchema: byTypeSchema
98
+ }, async (input) => {
99
+ const result = await runWalk(input);
100
+ const rows = topExtensions(result.files, input.limit ?? 10);
101
+ return { content: [{ type: "text", text: JSON.stringify({ truncated: result.truncated, rows }, null, 2) }] };
102
+ });
103
+ mcp.registerTool("top_dirs", {
104
+ description: "Heaviest directories under root (by aggregate file size).",
105
+ inputSchema: topDirsSchema
106
+ }, async (input) => {
107
+ const result = await runWalk(input);
108
+ const rows = topDirectories(result.files, input.rootPath, input.limit ?? 10);
109
+ return { content: [{ type: "text", text: JSON.stringify({ truncated: result.truncated, rows }, null, 2) }] };
110
+ });
111
+ mcp.registerTool("top_files", {
112
+ description: "Largest files with optional filters (size, age, glob include/exclude).",
113
+ inputSchema: topFilesSchema
114
+ }, async (input) => {
115
+ const result = await runWalk(input);
116
+ const olderThanMs = input.olderThanDays ? Date.now() - input.olderThanDays * 24 * 60 * 60 * 1000 : undefined;
117
+ const minBytes = input.minSizeMB ? input.minSizeMB * 1024 * 1024 : undefined;
118
+ const rows = topFiles(result.files, {
119
+ limit: input.limit ?? 20,
120
+ minBytes,
121
+ olderThanMs,
122
+ includeGlobs: input.includeGlobs,
123
+ excludeGlobs: input.excludeGlobs
124
+ });
125
+ return { content: [{ type: "text", text: JSON.stringify({ truncated: result.truncated, rows }, null, 2) }] };
126
+ });
127
+ mcp.registerTool("stale_candidates", {
128
+ description: "Large and old files (read-only suggestions).",
129
+ inputSchema: staleSchema
130
+ }, async (input) => {
131
+ const result = await runWalk(input);
132
+ const olderThanMs = input.olderThanDays ? Date.now() - input.olderThanDays * 24 * 60 * 60 * 1000 : undefined;
133
+ const minBytes = input.minSizeMB ? input.minSizeMB * 1024 * 1024 : undefined;
134
+ const rows = staleFiles(result.files, { limit: input.limit ?? 50, minBytes, olderThanMs });
135
+ return { content: [{ type: "text", text: JSON.stringify({ truncated: result.truncated, rows }, null, 2) }] };
136
+ });
137
+ mcp.registerTool("duplicate_candidates", {
138
+ description: "Groups of files with identical size and content hash (read-only).",
139
+ inputSchema: dupSchema
140
+ }, async (input) => {
141
+ const result = await runWalk(input);
142
+ const rows = await duplicateGroups(result.files, input.limit ?? 10);
143
+ return { content: [{ type: "text", text: JSON.stringify({ truncated: result.truncated, rows }, null, 2) }] };
144
+ });
145
+ async function main() {
146
+ const transport = new StdioServerTransport();
147
+ transport.onerror = (err) => {
148
+ console.error("[disk-clean-mcp] transport error", err);
149
+ };
150
+ await mcp.connect(transport);
151
+ console.error("[disk-clean-mcp] ready");
152
+ }
153
+ main().catch((err) => {
154
+ console.error(err);
155
+ process.exit(1);
156
+ });
package/dist/scan.js ADDED
@@ -0,0 +1,191 @@
1
+ import { createHash } from "crypto";
2
+ import { createReadStream } from "fs";
3
+ import { promises as fs } from "fs";
4
+ import path from "path";
5
+ import micromatch from "micromatch";
6
+ const DEFAULT_IGNORES = ["**/node_modules/**", "**/.git/**", "**/dist/**", "**/build/**", "**/.cache/**"];
7
+ const toPosix = (p) => p.replace(/\\/g, "/");
8
+ function shouldSkip(relPath, includeGlobs, ignoreGlobs) {
9
+ const posixPath = toPosix(relPath) || ".";
10
+ const ignoreList = [...DEFAULT_IGNORES, ...(ignoreGlobs ?? [])];
11
+ if (micromatch.some(posixPath, ignoreList, { dot: true }))
12
+ return true;
13
+ if (includeGlobs && includeGlobs.length > 0) {
14
+ return !micromatch.some(posixPath, includeGlobs, { dot: true });
15
+ }
16
+ return false;
17
+ }
18
+ async function statEntry(fullPath, followSymlinks) {
19
+ try {
20
+ return followSymlinks ? await fs.stat(fullPath) : await fs.lstat(fullPath);
21
+ }
22
+ catch (err) {
23
+ return null;
24
+ }
25
+ }
26
+ export async function walk(options) {
27
+ const maxDepth = options.maxDepth ?? 10;
28
+ const maxFiles = options.maxFiles ?? 50000;
29
+ const files = [];
30
+ let totalSize = 0;
31
+ let fileCount = 0;
32
+ let dirCount = 0;
33
+ let truncated = false;
34
+ const root = path.resolve(options.rootPath);
35
+ const queue = [{ dir: root, depth: 0 }];
36
+ while (queue.length > 0) {
37
+ const { dir, depth } = queue.shift();
38
+ let entries;
39
+ try {
40
+ entries = await fs.readdir(dir, { withFileTypes: true });
41
+ dirCount += 1;
42
+ }
43
+ catch (err) {
44
+ continue;
45
+ }
46
+ for (const entry of entries) {
47
+ const fullPath = path.join(dir, entry.name);
48
+ const relPath = path.relative(root, fullPath) || entry.name;
49
+ if (shouldSkip(relPath, options.includeGlobs, options.ignoreGlobs)) {
50
+ continue;
51
+ }
52
+ const stats = await statEntry(fullPath, options.followSymlinks ?? false);
53
+ if (!stats)
54
+ continue;
55
+ if (stats.isDirectory()) {
56
+ if (depth < maxDepth) {
57
+ queue.push({ dir: fullPath, depth: depth + 1 });
58
+ }
59
+ continue;
60
+ }
61
+ if (stats.isFile()) {
62
+ fileCount += 1;
63
+ totalSize += stats.size;
64
+ if (files.length < maxFiles) {
65
+ files.push({ fullPath, relPath, size: stats.size, mtimeMs: stats.mtimeMs });
66
+ }
67
+ else {
68
+ truncated = true;
69
+ }
70
+ }
71
+ }
72
+ }
73
+ return { files, totalSize, fileCount, dirCount, truncated };
74
+ }
75
+ export function topExtensions(files, limit = 10) {
76
+ const byExt = new Map();
77
+ for (const file of files) {
78
+ const ext = path.extname(file.fullPath).toLowerCase() || "<no-ext>";
79
+ const entry = byExt.get(ext) ?? { size: 0, count: 0 };
80
+ entry.size += file.size;
81
+ entry.count += 1;
82
+ byExt.set(ext, entry);
83
+ }
84
+ return [...byExt.entries()]
85
+ .sort((a, b) => b[1].size - a[1].size)
86
+ .slice(0, limit)
87
+ .map(([ext, v]) => ({ ext, totalBytes: v.size, count: v.count }));
88
+ }
89
+ export function topFiles(files, options = {}) {
90
+ const limit = options.limit ?? 20;
91
+ const filtered = files.filter((file) => {
92
+ if (options.minBytes !== undefined && file.size < options.minBytes)
93
+ return false;
94
+ if (options.olderThanMs !== undefined && file.mtimeMs > options.olderThanMs)
95
+ return false;
96
+ const posixPath = toPosix(file.relPath);
97
+ if (options.excludeGlobs && micromatch.some(posixPath, options.excludeGlobs, { dot: true }))
98
+ return false;
99
+ if (options.includeGlobs && options.includeGlobs.length > 0 && !micromatch.some(posixPath, options.includeGlobs, { dot: true }))
100
+ return false;
101
+ return true;
102
+ });
103
+ return filtered
104
+ .sort((a, b) => b.size - a.size)
105
+ .slice(0, limit)
106
+ .map((f) => ({ relPath: f.relPath, bytes: f.size, mtimeMs: f.mtimeMs }));
107
+ }
108
+ export function topDirectories(files, rootPath, limit = 10) {
109
+ const sizeByDir = new Map();
110
+ for (const file of files) {
111
+ const dir = path.dirname(file.relPath) || ".";
112
+ const parts = dir === "." ? [] : dir.split(path.sep);
113
+ let current = "";
114
+ for (const part of parts) {
115
+ current = current ? path.join(current, part) : part;
116
+ sizeByDir.set(current, (sizeByDir.get(current) ?? 0) + file.size);
117
+ }
118
+ }
119
+ return [...sizeByDir.entries()]
120
+ .sort((a, b) => b[1] - a[1])
121
+ .slice(0, limit)
122
+ .map(([relDir, bytes]) => ({ relDir, bytes }));
123
+ }
124
+ export function staleFiles(files, options) {
125
+ const limit = options.limit ?? 50;
126
+ const filtered = files.filter((f) => {
127
+ if (options.minBytes !== undefined && f.size < options.minBytes)
128
+ return false;
129
+ if (options.olderThanMs !== undefined && f.mtimeMs > options.olderThanMs)
130
+ return false;
131
+ return true;
132
+ });
133
+ return filtered
134
+ .sort((a, b) => b.size - a.size)
135
+ .slice(0, limit)
136
+ .map((f) => ({ relPath: f.relPath, bytes: f.size, mtimeMs: f.mtimeMs }));
137
+ }
138
+ async function hashFile(fullPath) {
139
+ return new Promise((resolve) => {
140
+ const hash = createHash("sha256");
141
+ const stream = createReadStream(fullPath);
142
+ stream.on("data", (chunk) => hash.update(chunk));
143
+ stream.on("end", () => resolve(hash.digest("hex")));
144
+ stream.on("error", () => resolve(null));
145
+ });
146
+ }
147
+ export async function duplicateGroups(files, limit = 10) {
148
+ const bySize = new Map();
149
+ for (const file of files) {
150
+ const list = bySize.get(file.size) ?? [];
151
+ list.push(file);
152
+ bySize.set(file.size, list);
153
+ }
154
+ const candidates = [...bySize.values()].filter((g) => g.length > 1);
155
+ const groups = [];
156
+ for (const group of candidates) {
157
+ const byHash = new Map();
158
+ for (const file of group) {
159
+ const hash = await hashFile(file.fullPath);
160
+ if (!hash)
161
+ continue;
162
+ const arr = byHash.get(hash) ?? [];
163
+ arr.push(file);
164
+ byHash.set(hash, arr);
165
+ }
166
+ for (const [hash, dupFiles] of byHash.entries()) {
167
+ if (dupFiles.length < 2)
168
+ continue;
169
+ groups.push({
170
+ hash,
171
+ size: dupFiles[0].size,
172
+ files: dupFiles.map((f) => ({ relPath: f.relPath, mtimeMs: f.mtimeMs }))
173
+ });
174
+ if (groups.length >= limit)
175
+ return groups;
176
+ }
177
+ if (groups.length >= limit)
178
+ break;
179
+ }
180
+ return groups;
181
+ }
182
+ export const bytesToHuman = (bytes) => {
183
+ const units = ["B", "KB", "MB", "GB", "TB"];
184
+ let val = bytes;
185
+ let idx = 0;
186
+ while (val >= 1024 && idx < units.length - 1) {
187
+ val /= 1024;
188
+ idx += 1;
189
+ }
190
+ return `${val.toFixed(val >= 10 ? 1 : 2)} ${units[idx]}`;
191
+ };
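
Because `dist/scan.js` exports its helpers and `package.json` declares no `exports` map, they can also be imported directly, outside the MCP server. A minimal sketch (run as an ES module; the root path and limits are illustrative):

```js
// Scan the current directory and print a quick usage breakdown.
import { walk, topExtensions, bytesToHuman } from "disk-clean-mcp/dist/scan.js";

const result = await walk({ rootPath: ".", maxDepth: 5, maxFiles: 20000 });
console.log(`${bytesToHuman(result.totalSize)} across ${result.fileCount} files (truncated: ${result.truncated})`);

// Largest five extensions by aggregate size.
for (const row of topExtensions(result.files, 5)) {
  console.log(row.ext, bytesToHuman(row.totalBytes), `${row.count} files`);
}
```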
package/package.json ADDED
@@ -0,0 +1,50 @@
1
+ {
2
+ "name": "disk-clean-mcp",
3
+ "version": "0.1.2",
4
+ "description": "A Model Context Protocol server that analyzes local disk usage and suggests cleanup targets.",
5
+ "private": false,
6
+ "type": "module",
7
+ "license": "MIT",
8
+ "main": "dist/index.js",
9
+ "bin": {
10
+ "disk-clean-mcp": "dist/index.js"
11
+ },
12
+ "files": [
13
+ "dist"
14
+ ],
15
+ "engines": {
16
+ "node": ">=18"
17
+ },
18
+ "keywords": [
19
+ "mcp",
20
+ "disk",
21
+ "cleanup",
22
+ "filesystem",
23
+ "scan",
24
+ "cli"
25
+ ],
26
+ "repository": {
27
+ "type": "git",
28
+ "url": "git+https://github.com/Superandyfre/disk-clean-mcp.git"
29
+ },
30
+ "bugs": {
31
+ "url": "https://github.com/Superandyfre/disk-clean-mcp/issues"
32
+ },
33
+ "homepage": "https://github.com/Superandyfre/disk-clean-mcp#readme",
34
+ "scripts": {
35
+ "build": "tsc -p tsconfig.json",
36
+ "start": "node dist/index.js",
37
+ "dev": "ts-node src/index.ts"
38
+ },
39
+ "dependencies": {
40
+ "@modelcontextprotocol/sdk": "^1.25.1",
41
+ "micromatch": "^4.0.7",
42
+ "zod": "^3.24.1"
43
+ },
44
+ "devDependencies": {
45
+ "@types/micromatch": "^4.0.8",
46
+ "@types/node": "^20.11.30",
47
+ "ts-node": "^10.9.2",
48
+ "typescript": "^5.3.3"
49
+ }
50
+ }