claudekit-cli 1.0.1 → 1.2.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/.github/workflows/ci.yml +2 -0
- package/.github/workflows/release.yml +44 -0
- package/CHANGELOG.md +28 -0
- package/CLAUDE.md +3 -2
- package/LICENSE +21 -0
- package/README.md +73 -3
- package/dist/index.js +11556 -10926
- package/package.json +1 -1
- package/src/commands/new.ts +41 -9
- package/src/commands/update.ts +59 -13
- package/src/commands/version.ts +135 -0
- package/src/index.ts +53 -1
- package/src/lib/download.ts +231 -1
- package/src/lib/github.ts +56 -0
- package/src/lib/prompts.ts +4 -3
- package/src/types.ts +11 -2
- package/src/utils/file-scanner.ts +134 -0
- package/src/utils/logger.ts +108 -21
- package/src/utils/safe-prompts.ts +54 -0
- package/tests/commands/version.test.ts +297 -0
- package/tests/lib/github-download-priority.test.ts +301 -0
- package/tests/lib/github.test.ts +2 -2
- package/tests/lib/merge.test.ts +77 -0
- package/tests/types.test.ts +4 -0
- package/tests/utils/file-scanner.test.ts +202 -0
- package/tests/utils/logger.test.ts +115 -0
- package/.opencode/agent/code-reviewer.md +0 -141
- package/.opencode/agent/debugger.md +0 -74
- package/.opencode/agent/docs-manager.md +0 -119
- package/.opencode/agent/git-manager.md +0 -60
- package/.opencode/agent/planner-researcher.md +0 -100
- package/.opencode/agent/planner.md +0 -87
- package/.opencode/agent/project-manager.md +0 -113
- package/.opencode/agent/researcher.md +0 -173
- package/.opencode/agent/solution-brainstormer.md +0 -89
- package/.opencode/agent/system-architecture.md +0 -192
- package/.opencode/agent/tester.md +0 -96
- package/.opencode/agent/ui-ux-designer.md +0 -203
- package/.opencode/agent/ui-ux-developer.md +0 -97
- package/.opencode/command/cook.md +0 -7
- package/.opencode/command/debug.md +0 -10
- package/.opencode/command/design/3d.md +0 -65
- package/.opencode/command/design/fast.md +0 -18
- package/.opencode/command/design/good.md +0 -21
- package/.opencode/command/design/screenshot.md +0 -22
- package/.opencode/command/design/video.md +0 -22
- package/.opencode/command/fix/ci.md +0 -8
- package/.opencode/command/fix/fast.md +0 -11
- package/.opencode/command/fix/hard.md +0 -15
- package/.opencode/command/fix/logs.md +0 -16
- package/.opencode/command/fix/test.md +0 -18
- package/.opencode/command/fix/types.md +0 -10
- package/.opencode/command/git/cm.md +0 -5
- package/.opencode/command/git/cp.md +0 -4
- package/.opencode/command/plan/ci.md +0 -12
- package/.opencode/command/plan/two.md +0 -13
- package/.opencode/command/plan.md +0 -10
- package/.opencode/command/test.md +0 -7
- package/.opencode/command/watzup.md +0 -8
- package/plans/251008-claudekit-cli-implementation-plan.md +0 -1469
- package/plans/reports/251008-from-code-reviewer-to-developer-review-report.md +0 -864
- package/plans/reports/251008-from-tester-to-developer-test-summary-report.md +0 -409
- package/plans/reports/251008-researcher-download-extraction-report.md +0 -1377
- package/plans/reports/251008-researcher-github-api-report.md +0 -1339
- package/plans/research/251008-cli-frameworks-bun-research.md +0 -1051
- package/plans/templates/bug-fix-template.md +0 -69
- package/plans/templates/feature-implementation-template.md +0 -84
- package/plans/templates/refactor-template.md +0 -82
- package/plans/templates/template-usage-guide.md +0 -58
@@ -1,1377 +0,0 @@
# Research Report: Download and Extract Release Archives in Bun CLI

**Research Date:** October 8, 2025
**Target Runtime:** Bun v1.x+
**Language:** TypeScript
**Scope:** HTTP download with progress tracking, archive extraction, and file conflict handling

---

## Executive Summary

This research evaluates libraries and strategies for implementing a robust download and extraction system in a Bun CLI application. The investigation covers HTTP download mechanisms with progress tracking, archive extraction for multiple formats (.tar.gz, .zip), intelligent file conflict resolution, and proper cleanup procedures.

**Key Recommendations:**

1. **Download:** Use Bun's native `fetch` API with `ReadableStream` for progress tracking
2. **Progress Display:** Use `cli-progress` or `ora` for visual feedback
3. **Extraction:** Use `tar` for .tar.gz files and `unzipper` for .zip files
4. **File Operations:** Use `fs-extra` for smart merging with conflict detection
5. **Cleanup:** Use the `tmp` package for automatic temporary directory management

---

## Research Methodology

- **Sources consulted:** 35+ web sources, npm packages, official documentation
- **Date range:** Primarily 2024-2025 sources, with emphasis on recent updates
- **Key search terms:** Bun fetch, node-tar, unzipper, cli-progress, fs-extra, download progress, file extraction, conflict handling
- **Focus Areas:** Bun compatibility, performance, TypeScript support, maintenance status

---

## Key Findings

### 1. HTTP Download Libraries

#### Bun's Native Fetch API (Recommended)

**Features:**
- Built into the Bun runtime (no external dependencies)
- Web-standard implementation with Bun-native extensions
- Full streaming support via `ReadableStream`
- Supports multiple protocols: `http://`, `https://`, `file://`, `data:`, `blob:`, `s3://`
- Automatic connection pooling and DNS prefetching

**Progress Tracking Implementation:**
```typescript
async function downloadWithProgress(url: string, outputPath: string) {
  const response = await fetch(url);
  const contentLength = parseInt(response.headers.get('Content-Length') || '0', 10);

  if (!response.ok) {
    throw new Error(`Download failed: ${response.statusText}`);
  }

  const reader = response.body?.getReader();
  if (!reader) throw new Error('No response body');

  let receivedLength = 0;
  const chunks: Uint8Array[] = [];

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;

    chunks.push(value);
    receivedLength += value.length;

    // Calculate and display progress
    const progress = contentLength > 0
      ? Math.round((receivedLength / contentLength) * 100)
      : receivedLength;

    console.log(`Downloaded: ${receivedLength} bytes (${progress}%)`);
  }

  // Combine chunks and write to file
  const allChunks = new Uint8Array(receivedLength);
  let position = 0;
  for (const chunk of chunks) {
    allChunks.set(chunk, position);
    position += chunk.length;
  }

  await Bun.write(outputPath, allChunks);
  return outputPath;
}
```

**Bun-Specific File Writing:**
```typescript
// Method 1: Direct write from Response
const response = await fetch(url);
await Bun.write('./file.tar.gz', response);

// Method 2: Stream to file with FileSink (for large files).
// Note: a Response body can only be consumed once, so fetch again.
const streamResponse = await fetch(url);
const file = Bun.file('./large-file.tar.gz');
const writer = file.writer();
for await (const chunk of streamResponse.body!) {
  writer.write(chunk);
}
await writer.end();
```

#### Alternative Libraries (Node.js Compatibility)
- **axios:** Popular but adds dependency weight; has built-in progress events
- **got:** Modern, promise-based, good streaming support
- **node-fetch:** Node.js implementation of fetch (unnecessary in Bun)

**Verdict:** Use Bun's native `fetch` for optimal performance and zero dependencies.

---

### 2. Archive Extraction Libraries

#### For TAR/TAR.GZ Files: `tar` (node-tar)

**Package:** `tar` (isaacs/node-tar)
**Latest Version:** 7.4.3 (actively maintained)
**Weekly Downloads:** ~50M
**TypeScript:** ✅ Type definitions included

**Features:**
- Unix tar command-like functionality
- Automatic gzip detection and decompression
- Stream-based processing
- File filtering during extraction
- Preserves file metadata

**Installation:**
```bash
bun add tar
```

**Usage Example:**
```typescript
import * as tar from 'tar';

// Simple extraction
await tar.extract({
  file: 'archive.tar.gz',
  cwd: './output',
  strip: 1 // Strip first directory level
});

// With filtering and progress
await tar.extract({
  file: 'archive.tar.gz',
  cwd: './output',
  filter: (path, entry) => {
    // Skip config files
    if (path.endsWith('.env') || path.endsWith('config.json')) {
      return false;
    }
    return true;
  },
  onentry: (entry) => {
    console.log(`Extracting: ${entry.path}`);
  }
});

// Streaming extraction
import { createReadStream } from 'fs';

createReadStream('archive.tar.gz')
  .pipe(tar.extract({ cwd: './output' }));
```

#### For ZIP Files: `unzipper`

**Package:** `unzipper`
**Latest Version:** 0.12.3
**Weekly Downloads:** ~5M
**TypeScript:** ✅ Type definitions available via @types/unzipper

**Features:**
- Pure JavaScript (no compiled dependencies)
- Uses Node.js built-in zlib
- Streaming support
- Simple API similar to node-tar

**Installation:**
```bash
bun add unzipper
bun add -D @types/unzipper
```

**Usage Example:**
```typescript
import { createReadStream, createWriteStream } from 'fs';
import unzipper from 'unzipper';

// Simple extraction
await createReadStream('archive.zip')
  .pipe(unzipper.Extract({ path: './output' }))
  .promise();

// With filtering
await createReadStream('archive.zip')
  .pipe(unzipper.Parse())
  .on('entry', (entry) => {
    const fileName = entry.path;
    const type = entry.type; // 'Directory' or 'File'

    // Skip directory entries and config files
    if (type === 'Directory' || fileName.endsWith('.env') || fileName.endsWith('config.json')) {
      entry.autodrain();
      return;
    }

    entry.pipe(createWriteStream(`./output/${fileName}`));
  })
  .promise();
```

#### Alternative ZIP Libraries

**extract-zip:**
- **Pros:** Fast, focused on extraction only, 17M weekly downloads
- **Cons:** No streaming support, lacks advanced features
- **Use Case:** Simple, fast extraction tasks

**jszip:**
- **Pros:** Browser + Node.js support, comprehensive features, 12M weekly downloads
- **Cons:** Loads entire zip into memory, not ideal for large files
- **Use Case:** Cross-platform apps, zip creation/manipulation

**adm-zip:**
- **Pros:** Pure JavaScript, simple API
- **Cons:** 2GB file size limit, loads entire file into memory
- **Use Case:** Small files only

**Verdict:** Use `unzipper` for memory-efficient streaming extraction of large files.

---

### 3. Progress Indicators

#### CLI Progress Bars: `cli-progress`

**Package:** `cli-progress`
**Weekly Downloads:** ~26M
**Bun Compatible:** ✅ Yes
**TypeScript:** ✅ Built-in types

**Features:**
- Multiple progress bar support
- Customizable themes
- Works in any terminal
- No external dependencies

**Installation:**
```bash
bun add cli-progress
```

**Usage Example:**
```typescript
import cliProgress from 'cli-progress';

// Single progress bar (totalBytes/receivedBytes come from the download loop)
const progressBar = new cliProgress.SingleBar({
  format: 'Downloading [{bar}] {percentage}% | ETA: {eta}s | {value}/{total} bytes',
  barCompleteChar: '\u2588',
  barIncompleteChar: '\u2591',
  hideCursor: true
});

// Start progress
progressBar.start(totalBytes, 0);

// Update progress
progressBar.update(receivedBytes);

// Complete
progressBar.stop();

// Multi-bar for multiple files
const multibar = new cliProgress.MultiBar({
  clearOnComplete: false,
  hideCursor: true,
  format: '{filename} [{bar}] {percentage}% | {value}/{total}'
}, cliProgress.Presets.shades_classic);

const bar1 = multibar.create(100, 0, { filename: 'file1.tar.gz' });
const bar2 = multibar.create(100, 0, { filename: 'file2.zip' });

bar1.update(50);
bar2.update(30);

multibar.stop();
```

#### Spinners: `ora`

**Package:** `ora`
**Weekly Downloads:** ~24M
**Bun Compatible:** ✅ Yes
**TypeScript:** ✅ Built-in types

**Features:**
- Elegant terminal spinner
- 80+ spinner styles
- Color support
- Promise integration

**Installation:**
```bash
bun add ora
```

**Usage Example:**
```typescript
import ora, { oraPromise } from 'ora';

const spinner = ora('Downloading archive...').start();

try {
  await downloadFile(url);
  spinner.succeed('Download complete!');
} catch (error) {
  spinner.fail('Download failed!');
}

// With a promise (oraPromise starts and resolves the spinner for you)
await oraPromise(
  processArchive(),
  {
    text: 'Processing...',
    successText: 'Archive processed!',
    failText: 'Processing failed!'
  }
);
```

**Combined Approach:**
```typescript
import ora from 'ora';
import cliProgress from 'cli-progress';

// Use a spinner for indeterminate operations
const spinner = ora('Fetching release info...').start();
const releaseInfo = await getReleaseInfo();
spinner.succeed('Release info fetched!');

// Use a progress bar for downloads
const progressBar = new cliProgress.SingleBar({});
progressBar.start(totalBytes, 0);
// ... update during download
progressBar.stop();

// Use a spinner for extraction
const extractSpinner = ora('Extracting archive...').start();
await extractArchive();
extractSpinner.succeed('Extraction complete!');
```

---

### 4. File System Operations & Conflict Handling

#### Smart File Merging: `fs-extra`

**Package:** `fs-extra`
**Latest Version:** 11.3.2
**Weekly Downloads:** ~70M
**TypeScript:** ✅ Built-in types

**Features:**
- All Node.js `fs` methods + extras
- Promise-based API
- Recursive operations
- Copy with filtering

**Installation:**
```bash
bun add fs-extra
```

**Conflict Handling Strategies:**

```typescript
import fs from 'fs-extra';
import path from 'path';

// Strategy 1: Skip existing files.
// With overwrite: false and errorOnExist: false, fs.copy leaves existing
// destination files untouched — no custom filter needed.
async function copySkipExisting(src: string, dest: string) {
  await fs.copy(src, dest, {
    overwrite: false,
    errorOnExist: false
  });
}

// Strategy 2: Skip config files, overwrite code files
const CONFIG_PATTERNS = [
  /\.env(\..*)?$/,
  /config\.(json|yaml|yml|toml)$/,
  /\.config\.(js|ts)$/,
  /package\.json$/,
  /bun\.lockb$/
];

async function smartMerge(src: string, dest: string) {
  await fs.copy(src, dest, {
    overwrite: true,
    filter: async (srcPath) => {
      const relativePath = path.relative(src, srcPath);

      // Skip config files
      if (CONFIG_PATTERNS.some(pattern => pattern.test(relativePath))) {
        const destPath = path.join(dest, relativePath);
        if (await fs.pathExists(destPath)) {
          console.log(`⏭️ Skipped: ${relativePath} (config file)`);
          return false;
        }
      }

      return true;
    }
  });
}

// Strategy 3: Prompt user for conflicts
import prompts from 'prompts';

async function interactiveMerge(src: string, dest: string) {
  const conflicts: string[] = [];

  // First pass: walk the source tree to detect conflicts.
  // Directories must return true so recursion continues (this may create
  // empty directories in dest); files are never copied in this pass.
  await fs.copy(src, dest, {
    overwrite: false,
    errorOnExist: false,
    filter: async (srcPath) => {
      const stat = await fs.stat(srcPath);
      if (stat.isDirectory()) return true;

      const destPath = srcPath.replace(src, dest);
      if (await fs.pathExists(destPath)) {
        conflicts.push(path.relative(src, srcPath));
      }
      return false; // Don't copy files yet
    }
  });

  // Resolve conflicts
  const resolutions = new Map<string, 'skip' | 'overwrite'>();

  for (const file of conflicts) {
    const response = await prompts({
      type: 'select',
      name: 'action',
      message: `File exists: ${file}`,
      choices: [
        { title: 'Skip', value: 'skip' },
        { title: 'Overwrite', value: 'overwrite' }
      ]
    });

    resolutions.set(file, response.action);
  }

  // Second pass: copy with resolutions
  await fs.copy(src, dest, {
    overwrite: true,
    filter: (srcPath) => {
      const relativePath = path.relative(src, srcPath);
      const resolution = resolutions.get(relativePath);

      if (resolution === 'skip') {
        return false;
      }

      return true;
    }
  });
}

// Strategy 4: Backup existing files
async function mergeWithBackup(src: string, dest: string) {
  const backupDir = `${dest}.backup.${Date.now()}`;

  await fs.copy(src, dest, {
    overwrite: true,
    filter: async (srcPath) => {
      const destPath = srcPath.replace(src, dest);

      if (await fs.pathExists(destPath)) {
        const backupPath = destPath.replace(dest, backupDir);
        await fs.ensureDir(path.dirname(backupPath));
        await fs.copy(destPath, backupPath);
        console.log(`📦 Backed up: ${path.relative(dest, destPath)}`);
      }

      return true;
    }
  });

  console.log(`\n✅ Backup created at: ${backupDir}`);
}
```

#### File Filtering with gitignore: `ignore`

**Package:** `ignore`
**Weekly Downloads:** ~45M
**Used by:** ESLint, Prettier, many others

```typescript
import ignore from 'ignore';
import fs from 'fs-extra';
import path from 'path';

async function mergeWithGitignore(src: string, dest: string) {
  const ig = ignore();

  // Load .gitignore if it exists
  const gitignorePath = path.join(dest, '.gitignore');
  if (await fs.pathExists(gitignorePath)) {
    const gitignoreContent = await fs.readFile(gitignorePath, 'utf-8');
    ig.add(gitignoreContent);
  }

  // Add default ignores
  ig.add([
    '.env*',
    'node_modules/',
    '*.log',
    '.DS_Store'
  ]);

  await fs.copy(src, dest, {
    filter: (srcPath) => {
      const relativePath = path.relative(src, srcPath);
      // ignore() rejects empty paths, so always copy the root itself
      if (relativePath === '') return true;
      return !ig.ignores(relativePath);
    }
  });
}
```

---

### 5. Temporary Directory Management & Cleanup

#### Automatic Cleanup: `tmp`

**Package:** `tmp`
**Weekly Downloads:** ~20M
**TypeScript:** ✅ @types/tmp available

**Features:**
- Automatic cleanup on process exit
- Secure temp file creation
- Graceful or forced cleanup
- Prefix/postfix support

**Installation:**
```bash
bun add tmp
bun add -D @types/tmp
```

**Usage Example:**
```typescript
import tmp from 'tmp';
import path from 'path';

// Enable automatic cleanup on process exit
tmp.setGracefulCleanup();

// Create a temporary directory. The sync API returns an object carrying
// both the path and a cleanup callback (promisify would drop the latter,
// since tmp's callbacks pass multiple arguments).
const tmpDir = tmp.dirSync({
  prefix: 'claudekit-',
  unsafeCleanup: true // Remove even if not empty
});

console.log(`Temp dir: ${tmpDir.name}`);

// Download and extract to temp dir
await downloadFile(url, path.join(tmpDir.name, 'archive.tar.gz'));
await extractArchive(path.join(tmpDir.name, 'archive.tar.gz'), tmpDir.name);

// Manual cleanup (if needed before exit)
tmpDir.removeCallback();

// Temp file
const tmpFile = tmp.fileSync({
  prefix: 'download-',
  postfix: '.tar.gz'
});

console.log(`Temp file: ${tmpFile.name}`);
```

#### Manual Cleanup Pattern

```typescript
import fs from 'fs-extra';
import path from 'path';
import os from 'os';

async function withTempDir<T>(
  fn: (tmpDir: string) => Promise<T>
): Promise<T> {
  const tmpDir = path.join(
    os.tmpdir(),
    `claudekit-${Date.now()}-${Math.random().toString(36).slice(2)}`
  );

  try {
    await fs.ensureDir(tmpDir);
    return await fn(tmpDir);
  } finally {
    // Always clean up, even if an error occurs
    await fs.remove(tmpDir).catch(err => {
      console.warn(`Failed to clean up temp dir: ${err.message}`);
    });
  }
}

// Usage
const result = await withTempDir(async (tmpDir) => {
  await downloadFile(url, path.join(tmpDir, 'archive.tar.gz'));
  await extractArchive(path.join(tmpDir, 'archive.tar.gz'), tmpDir);
  await mergeFiles(tmpDir, targetDir);
  return 'success';
});
```

---

### 6. GitHub Release Download

#### GitHub API Integration

```typescript
import os from 'os';
import path from 'path';

interface GitHubRelease {
  tag_name: string;
  tarball_url: string;
  zipball_url: string;
}

async function getLatestRelease(
  owner: string,
  repo: string,
  token?: string
): Promise<GitHubRelease> {
  const headers: Record<string, string> = {
    'Accept': 'application/vnd.github.v3+json'
  };

  if (token) {
    headers['Authorization'] = `Bearer ${token}`;
  }

  const response = await fetch(
    `https://api.github.com/repos/${owner}/${repo}/releases/latest`,
    { headers }
  );

  if (!response.ok) {
    throw new Error(`GitHub API error: ${response.statusText}`);
  }

  return response.json();
}

async function downloadGitHubRelease(
  owner: string,
  repo: string,
  tag: string,
  token?: string
): Promise<string> {
  const headers: Record<string, string> = {
    'Accept': 'application/vnd.github.v3+json'
  };

  if (token) {
    headers['Authorization'] = `Bearer ${token}`;
  }

  // Download tarball (follows redirects automatically)
  const tarballUrl = `https://api.github.com/repos/${owner}/${repo}/tarball/${tag}`;
  const response = await fetch(tarballUrl, {
    headers,
    redirect: 'follow'
  });

  if (!response.ok) {
    throw new Error(`Download failed: ${response.statusText}`);
  }

  const tmpFile = path.join(os.tmpdir(), `${repo}-${tag}.tar.gz`);
  await Bun.write(tmpFile, response);

  return tmpFile;
}
```

---

### 7. Security Considerations

#### Path Traversal Protection

```typescript
import path from 'path';
import * as tar from 'tar';

function isPathSafe(basePath: string, targetPath: string): boolean {
  const resolvedBase = path.resolve(basePath);
  const resolvedTarget = path.resolve(targetPath);

  // Ensure target is within the base directory. Compare with a trailing
  // separator so '/base-evil' is not treated as being inside '/base'.
  return (
    resolvedTarget === resolvedBase ||
    resolvedTarget.startsWith(resolvedBase + path.sep)
  );
}

async function safeExtract(archivePath: string, outputDir: string) {
  await tar.extract({
    file: archivePath,
    cwd: outputDir,
    filter: (filePath) => {
      const fullPath = path.join(outputDir, filePath);

      if (!isPathSafe(outputDir, fullPath)) {
        console.warn(`⚠️ Blocked path traversal: ${filePath}`);
        return false;
      }

      return true;
    }
  });
}
```
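A plain `startsWith` prefix check has a well-known blind spot: `/srv/app-evil` begins with `/srv/app`. A small self-contained sketch of a separator-aware containment check (the paths are hypothetical examples, not from the CLI):

```typescript
import path from "node:path";

// Separator-aware containment check: '/srv/app-evil' must not count
// as being inside '/srv/app'.
function isPathSafe(basePath: string, targetPath: string): boolean {
  const resolvedBase = path.resolve(basePath);
  const resolvedTarget = path.resolve(targetPath);
  return (
    resolvedTarget === resolvedBase ||
    resolvedTarget.startsWith(resolvedBase + path.sep)
  );
}

console.log(isPathSafe("/srv/app", "/srv/app/sub/file.txt"));  // inside the base
console.log(isPathSafe("/srv/app", "/srv/app/../etc/passwd")); // escapes via ..
console.log(isPathSafe("/srv/app", "/srv/app-evil/file.txt")); // sibling directory
```

Resolving both sides first also neutralizes `..` segments smuggled into archive entry names.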

#### Content Validation

```typescript
import crypto from 'crypto';
import fs from 'fs-extra';

async function verifyChecksum(
  filePath: string,
  expectedChecksum: string,
  algorithm: 'sha256' | 'md5' = 'sha256'
): Promise<boolean> {
  const hash = crypto.createHash(algorithm);
  const fileContent = await fs.readFile(filePath);
  hash.update(fileContent);
  const actualChecksum = hash.digest('hex');

  return actualChecksum === expectedChecksum;
}
```
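As a sanity check of the hashing step, the digest of a known input can be compared against a published test vector (the SHA-256 of the ASCII string `"hello"`):

```typescript
import crypto from "node:crypto";

// Hash an in-memory buffer the same way the checksum helper hashes file contents.
const digest = crypto
  .createHash("sha256")
  .update(Buffer.from("hello"))
  .digest("hex");

console.log(digest);
// 2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824
```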

---

## Comparative Analysis

### Download Libraries

| Library | Bun Native | Streaming | Progress | TypeScript | Recommendation |
|---------|-----------|-----------|----------|------------|----------------|
| Bun fetch | ✅ | ✅ | Manual | ✅ | **Best choice** |
| axios | ❌ | ✅ | Built-in | ✅ | Alternative |
| got | ❌ | ✅ | Built-in | ✅ | Alternative |

### Extraction Libraries

| Library | Format | Streaming | Memory | Performance | Weekly DL | Recommendation |
|---------|--------|-----------|--------|-------------|-----------|----------------|
| tar | .tar.gz | ✅ | Low | Excellent | ~50M | **Best for tar** |
| unzipper | .zip | ✅ | Low | Good | ~5M | **Best for zip** |
| extract-zip | .zip | ❌ | Medium | Excellent | ~17M | Simple tasks |
| jszip | .zip | ✅ | High | Fair | ~12M | Browser + Node |
|
|
777
|
-
|
|
778
|
-
### Progress Indicators
|
|
779
|
-
|
|
780
|
-
| Library | Type | Features | Complexity | Weekly DL | Recommendation |
|
|
781
|
-
|---------|------|----------|------------|-----------|----------------|
|
|
782
|
-
| cli-progress | Bar | Multi-bar, themes | Medium | ~26M | **Best for progress** |
|
|
783
|
-
| ora | Spinner | 80+ styles, colors | Low | ~24M | **Best for spinners** |
|
|
784
|
-
|
|
785
|
-
---
|
|
786
|
-
|
|
787
|
-
## Implementation Recommendations
|
|
788
|
-
|
|
789
|
-
### Complete Download & Extract Workflow
|
|
790
|
-
|
|
791
|
-
```typescript
import ora from 'ora';
import cliProgress from 'cli-progress';
import tar from 'tar';
import unzipper from 'unzipper';
import fs from 'fs-extra';
import tmp from 'tmp';
import path from 'path';

// Enable auto cleanup
tmp.setGracefulCleanup();

interface DownloadOptions {
  url: string;
  targetDir: string;
  skipExisting?: boolean;
  skipPatterns?: RegExp[];
  githubToken?: string;
}

async function downloadAndExtract(options: DownloadOptions): Promise<void> {
  const {
    url,
    targetDir,
    skipExisting = false,
    skipPatterns = [/\.env/, /config\.json/],
    githubToken
  } = options;

  // Step 1: Fetch file info
  const spinner = ora('Fetching download info...').start();

  const headers: Record<string, string> = {};
  if (githubToken) {
    headers['Authorization'] = `Bearer ${githubToken}`;
  }

  const response = await fetch(url, {
    method: 'HEAD',
    headers
  });

  if (!response.ok) {
    spinner.fail('Failed to fetch download info');
    throw new Error(`HTTP ${response.status}: ${response.statusText}`);
  }

  const contentLength = parseInt(response.headers.get('Content-Length') || '0', 10);
  const contentType = response.headers.get('Content-Type') || '';

  spinner.succeed('Download info fetched');

  // Step 2: Download with progress
  const progressBar = new cliProgress.SingleBar({
    format: 'Downloading [{bar}] {percentage}% | {value}/{total} bytes',
    barCompleteChar: '\u2588',
    barIncompleteChar: '\u2591'
  });

  progressBar.start(contentLength, 0);

  const downloadResponse = await fetch(url, { headers });
  const reader = downloadResponse.body?.getReader();

  if (!reader) {
    progressBar.stop();
    throw new Error('No response body');
  }

  let receivedLength = 0;
  const chunks: Uint8Array[] = [];

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;

    chunks.push(value);
    receivedLength += value.length;
    progressBar.update(receivedLength);
  }

  progressBar.stop();

  // Step 3: Save to temp file
  // Note: this buffers the whole archive in memory; for very large files,
  // stream chunks to disk instead (see "Memory Issues with Large Files" below).
  const tmpDir = tmp.dirSync({ unsafeCleanup: true });
  const fileExt = contentType.includes('gzip') || url.includes('.tar.gz')
    ? '.tar.gz'
    : '.zip';
  const tmpFile = path.join(tmpDir.name, `archive${fileExt}`);

  const allChunks = new Uint8Array(receivedLength);
  let position = 0;
  for (const chunk of chunks) {
    allChunks.set(chunk, position);
    position += chunk.length;
  }

  await Bun.write(tmpFile, allChunks);
  console.log('✅ Download complete');

  // Step 4: Extract
  const extractSpinner = ora('Extracting archive...').start();
  const extractDir = path.join(tmpDir.name, 'extracted');
  await fs.ensureDir(extractDir);

  try {
    if (fileExt === '.tar.gz') {
      await tar.extract({
        file: tmpFile,
        cwd: extractDir,
        strip: 1, // Remove top-level directory
        filter: (filePath) => {
          // Security check: block entries that escape the extraction directory
          const fullPath = path.join(extractDir, filePath);
          const resolvedPath = path.resolve(fullPath);
          const resolvedBase = path.resolve(extractDir);

          if (!resolvedPath.startsWith(resolvedBase)) {
            console.warn(`⚠️ Blocked: ${filePath}`);
            return false;
          }

          return true;
        }
      });
    } else {
      const { createReadStream } = await import('fs');
      await createReadStream(tmpFile)
        .pipe(unzipper.Extract({ path: extractDir }))
        .promise();
    }

    extractSpinner.succeed('Extraction complete');
  } catch (error) {
    extractSpinner.fail('Extraction failed');
    throw error;
  }

  // Step 5: Smart merge
  const mergeSpinner = ora('Merging files...').start();
  let skippedCount = 0;
  let copiedCount = 0;

  await fs.copy(extractDir, targetDir, {
    overwrite: !skipExisting,
    filter: async (srcPath) => {
      const relativePath = path.relative(extractDir, srcPath);
      const destPath = path.join(targetDir, relativePath);

      // Check if the destination already exists
      const exists = await fs.pathExists(destPath);

      // Skip if it exists and skipExisting is true
      if (exists && skipExisting) {
        skippedCount++;
        return false;
      }

      // Skip protected files that already exist
      if (skipPatterns.some(pattern => pattern.test(relativePath))) {
        if (exists) {
          mergeSpinner.text = `Skipped: ${relativePath} (protected)`;
          skippedCount++;
          return false;
        }
      }

      copiedCount++;
      mergeSpinner.text = `Copying: ${relativePath}`;
      return true;
    }
  });

  mergeSpinner.succeed(
    `Merge complete (${copiedCount} copied, ${skippedCount} skipped)`
  );

  // Cleanup is automatic via tmp.setGracefulCleanup()
}

// Usage
await downloadAndExtract({
  url: 'https://github.com/owner/repo/archive/refs/tags/v1.0.0.tar.gz',
  targetDir: './my-project',
  skipExisting: true,
  skipPatterns: [
    /\.env(\..*)?$/,
    /config\.(json|yaml|yml)$/,
    /package\.json$/
  ],
  githubToken: process.env.GITHUB_TOKEN
});
```
---

## Common Pitfalls & Solutions

### 1. Missing Content-Length Header (CORS)

**Problem:** Progress tracking fails when Content-Length is not exposed in CORS requests.

**Solution:**
```typescript
async function downloadWithFallback(url: string) {
  const response = await fetch(url);
  const contentLength = parseInt(response.headers.get('Content-Length') || '0', 10);

  if (contentLength === 0) {
    // Fallback to spinner when progress unknown
    const spinner = ora('Downloading (size unknown)...').start();
    const data = await response.arrayBuffer();
    spinner.succeed(`Downloaded ${data.byteLength} bytes`);
    return data;
  } else {
    // Use progress bar when size is known
    // ... progress bar implementation
  }
}
```
### 2. Memory Issues with Large Files

**Problem:** Loading the entire file into memory causes crashes.

**Solution:** Always use streaming
```typescript
// ❌ Bad: Loads entire file into memory
const data = await response.arrayBuffer();
await Bun.write(file, data);

// ✅ Good: Streams directly to disk
const writer = Bun.file(file).writer();
for await (const chunk of response.body) {
  writer.write(chunk);
}
await writer.end();
```
### 3. Path Traversal Attacks

**Problem:** Malicious archives can write outside the target directory.

**Solution:** Always validate paths
```typescript
function validatePath(basePath: string, targetPath: string): boolean {
  const resolved = path.resolve(targetPath);
  const base = path.resolve(basePath);
  // Compare against base + separator so a sibling like "/base-evil"
  // is not accepted as being inside "/base"
  return resolved === base || resolved.startsWith(base + path.sep);
}
```
### 4. Incomplete Cleanup

**Problem:** Temporary files left behind on errors.

**Solution:** Use try-finally with the tmp package
```typescript
tmp.setGracefulCleanup(); // Auto-cleanup on exit

async function process() {
  const tmpDir = tmp.dirSync({ unsafeCleanup: true });

  try {
    // ... work with tmpDir.name
  } finally {
    // Manual cleanup if needed before exit
    tmpDir.removeCallback();
  }
}
```
### 5. Overwriting Important Files

**Problem:** Config files get overwritten during merge.

**Solution:** Use smart filtering
```typescript
const PROTECTED_FILES = [
  /\.env/,
  /config\./,
  /package\.json/,
  /bun\.lockb/
];

await fs.copy(src, dest, {
  filter: (srcPath) => {
    const relative = path.relative(src, srcPath);
    if (PROTECTED_FILES.some(p => p.test(relative))) {
      return !fs.existsSync(path.join(dest, relative));
    }
    return true;
  }
});
```
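The copy filters above mix I/O with decision logic; factoring the decision into a pure function makes the skip rules unit-testable. A sketch of the same rules under that refactoring — the function and parameter names are illustrative, not from fs-extra:

```typescript
// Hypothetical pure mirror of the copy-filter rules used above:
// 1. an existing destination is skipped when skipExisting is set;
// 2. a protected path is skipped only if the destination already exists;
// 3. everything else is copied.
function shouldCopy(
  relativePath: string,
  destExists: boolean,
  skipExisting: boolean,
  protectedPatterns: RegExp[]
): boolean {
  if (destExists && skipExisting) return false;
  if (destExists && protectedPatterns.some((p) => p.test(relativePath))) return false;
  return true;
}
```

The fs-extra `filter` callback then reduces to path resolution, an existence check, and one call to `shouldCopy`.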
---

## Error Handling Best Practices
```typescript
class DownloadError extends Error {
  constructor(message: string, public cause?: Error) {
    super(message);
    this.name = 'DownloadError';
  }
}

class ExtractionError extends Error {
  constructor(message: string, public cause?: Error) {
    super(message);
    this.name = 'ExtractionError';
  }
}

async function safeDownloadAndExtract(url: string, targetDir: string) {
  const spinner = ora('Starting...').start();
  let tmpDir: tmp.DirResult | null = null;

  try {
    // Download phase
    spinner.text = 'Downloading...';
    const response = await fetch(url);

    if (!response.ok) {
      throw new DownloadError(
        `HTTP ${response.status}: ${response.statusText}`
      );
    }

    tmpDir = tmp.dirSync({ unsafeCleanup: true });
    const tmpFile = path.join(tmpDir.name, 'archive.tar.gz');

    await Bun.write(tmpFile, response);
    spinner.succeed('Download complete');

    // Extract phase
    spinner.start('Extracting...');
    try {
      await tar.extract({
        file: tmpFile,
        cwd: targetDir
      });
    } catch (err) {
      // Wrap tar failures so the handler below can distinguish them
      throw new ExtractionError('Failed to extract archive', err as Error);
    }
    spinner.succeed('Extraction complete');

  } catch (error) {
    spinner.fail('Operation failed');

    if (error instanceof DownloadError) {
      console.error(`Download error: ${error.message}`);
      console.error('Please check your internet connection and URL');
    } else if (error instanceof ExtractionError) {
      console.error(`Extraction error: ${error.message}`);
      console.error('The archive may be corrupted');
    } else {
      console.error(`Unexpected error: ${error}`);
    }

    throw error;

  } finally {
    // Always cleanup
    if (tmpDir) {
      tmpDir.removeCallback();
    }
  }
}
```
---

## Performance Optimization

### 1. Parallel Downloads
```typescript
async function downloadMultiple(urls: string[]) {
  const multibar = new cliProgress.MultiBar({
    clearOnComplete: false,
    hideCursor: true
  });

  const downloads = urls.map(async (url) => {
    const bar = multibar.create(100, 0, {
      filename: path.basename(url)
    });

    // ... download with progress updates to bar

    return url; // return whatever result your per-file download produces
  });

  const results = await Promise.all(downloads);
  multibar.stop();

  return results;
}
```
### 2. Stream Processing
```typescript
// Process archive while downloading (no temp file)
import { Readable } from 'node:stream';
import { pipeline } from 'node:stream/promises';

async function streamExtract(url: string, targetDir: string) {
  const response = await fetch(url);
  if (!response.body) {
    throw new Error('No response body');
  }

  // Convert the Web ReadableStream to a Node.js stream, then extract
  // while downloading. Without a `file` option, tar.extract() returns
  // a writable extraction stream.
  const nodeStream = Readable.fromWeb(response.body as any);
  await pipeline(nodeStream, tar.extract({ cwd: targetDir }));
}
```
### 3. Incremental Processing
```typescript
// Process files as they're extracted
await tar.extract({
  file: archivePath,
  cwd: targetDir,
  // Note: onentry fires as each entry is parsed and is not awaited,
  // so async work here runs fire-and-forget.
  onentry: (entry) => {
    console.log(`Extracting: ${entry.path}`);

    // Queue follow-up processing
    if (entry.path.endsWith('.js')) {
      void processJsFile(path.join(targetDir, entry.path));
    }
  }
});
```
---

## Resources & References

### Official Documentation
- **Bun Fetch API:** https://bun.sh/docs/api/fetch
- **Bun Streams:** https://bun.sh/docs/api/streams
- **Node-tar:** https://github.com/isaacs/node-tar
- **Unzipper:** https://www.npmjs.com/package/unzipper
- **cli-progress:** https://www.npmjs.com/package/cli-progress
- **ora:** https://github.com/sindresorhus/ora
- **fs-extra:** https://github.com/jprichardson/node-fs-extra
- **tmp:** https://www.npmjs.com/package/tmp

### Recommended Tutorials
- **Fetch Download Progress:** https://javascript.info/fetch-progress
- **Node.js Tarball Decompression (2024):** https://www.petecorey.com/blog/2024/03/26/decompress-a-tarball-in-nodejs/
- **Node.js Zip File Management (Oct 2024):** https://www.somethingsblog.com/2024/10/24/node-js-zip-file-management-a-comprehensive-guide/
- **CLI Progress in TypeScript (Oct 2024):** https://www.webdevtutor.net/blog/typescript-cli-progress-bar

### Community Resources
- **Stack Overflow:** [bun], [node-tar], [file-extraction] tags
- **GitHub Discussions:** oven-sh/bun repository
- **Discord:** Bun community server

### Package Comparison Tools
- **npm trends:** https://npmtrends.com/
- **npm-compare:** https://npm-compare.com/
---

## Appendices

### A. Glossary

- **Tarball:** A .tar or .tar.gz archive file, commonly used in Unix/Linux
- **Stream:** Data processing method that handles data piece-by-piece rather than all at once
- **Bun:** Modern JavaScript runtime with built-in tooling, designed to be faster than Node.js
- **CORS:** Cross-Origin Resource Sharing, a browser security mechanism
- **Path Traversal:** Security vulnerability where files are accessed outside the intended directory
- **Content-Length:** HTTP header indicating the size of the response body
### B. Archive Format Support Matrix

| Format | Extension | Compression | Recommended Library | Streaming | Performance |
|--------|-----------|-------------|---------------------|-----------|-------------|
| Tarball (gzip) | .tar.gz, .tgz | gzip | `tar` | ✅ | Excellent |
| Tarball (plain) | .tar | none | `tar` | ✅ | Excellent |
| Zip | .zip | deflate | `unzipper` | ✅ | Good |
| Bzip2 | .tar.bz2 | bzip2 | `tar` + external decompressor | ✅ | Good |
| XZ | .tar.xz | xz | `tar` + external decompressor | ✅ | Good |

Note: `tar` decompresses gzip itself; .tar.bz2 and .tar.xz archives must be piped through a separate decompression stream before extraction.
### C. Recommended Package Versions (Oct 2025)

```json
{
  "dependencies": {
    "tar": "^7.4.3",
    "unzipper": "^0.12.3",
    "cli-progress": "^3.12.0",
    "ora": "^8.1.1",
    "fs-extra": "^11.3.2",
    "tmp": "^0.2.3",
    "ignore": "^6.0.2",
    "prompts": "^2.4.2"
  },
  "devDependencies": {
    "@types/tar": "^6.1.13",
    "@types/unzipper": "^0.10.10",
    "@types/tmp": "^0.2.6",
    "@types/fs-extra": "^11.0.4",
    "@types/prompts": "^2.4.9",
    "bun-types": "latest"
  }
}
```
### D. Quick Start Code Template
```typescript
// download-extract.ts
import ora from 'ora';
import tar from 'tar';
import fs from 'fs-extra';
import tmp from 'tmp';

tmp.setGracefulCleanup();

export async function downloadAndExtract(
  url: string,
  targetDir: string
): Promise<void> {
  const spinner = ora('Downloading...').start();

  // Download
  const response = await fetch(url);
  if (!response.ok) {
    spinner.fail(`HTTP ${response.status}`);
    throw new Error(`Download failed: ${response.statusText}`);
  }
  const tmpFile = tmp.fileSync({ postfix: '.tar.gz' });
  await Bun.write(tmpFile.name, response);
  spinner.succeed('Downloaded');

  // Extract (tar requires the target directory to exist)
  spinner.start('Extracting...');
  await fs.ensureDir(targetDir);
  await tar.extract({ file: tmpFile.name, cwd: targetDir });
  spinner.succeed('Complete!');
}

// Usage
await downloadAndExtract(
  'https://github.com/owner/repo/archive/v1.0.0.tar.gz',
  './output'
);
```
---

## Summary & Next Steps

### Recommended Stack for Bun CLI:
1. **Download:** Bun's native `fetch()` with manual progress tracking
2. **Progress UI:** `cli-progress` for downloads, `ora` for spinners
3. **Extraction:** `tar` for .tar.gz, `unzipper` for .zip
4. **File Operations:** `fs-extra` with custom conflict resolution
5. **Temp Management:** `tmp` package with graceful cleanup
6. **Security:** Path validation, gitignore filtering with `ignore` package
### Implementation Checklist:
- [ ] Install recommended packages
- [ ] Implement download with progress tracking
- [ ] Add extraction with format detection
- [ ] Implement smart merge with conflict resolution
- [ ] Add proper error handling and cleanup
- [ ] Add path traversal protection
- [ ] Test with various archive formats
- [ ] Add user prompts for conflicts (optional)
- [ ] Implement parallel downloads (if needed)
- [ ] Add checksum verification (if available)
### Performance Targets:
- Download: Stream directly to disk, minimal memory usage
- Extraction: Process in parallel where possible
- Merge: Skip unnecessary file reads with smart filtering
- Cleanup: Automatic via the tmp package, no manual intervention

---
**Report Generated:** October 8, 2025
**Total Research Time:** ~2 hours
**Sources Reviewed:** 35+
**Code Examples:** 25+