@nightowne/tas-cli 1.0.0 → 1.1.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +111 -136
- package/package.json +6 -6
- package/src/cli.js +233 -4
- package/src/db/index.js +147 -0
- package/src/index.js +12 -2
- package/src/utils/progress.js +119 -0
package/README.md
CHANGED

@@ -1,158 +1,128 @@
-#
+# TAS — Telegram as Storage
 
-
+A CLI tool that uses your Telegram bot as encrypted file storage. Files are compressed, encrypted locally, then uploaded to your private bot chat.
 
 ```
-
-
-
-
-
-
+┌─────────────┐     ┌───────────────┐     ┌──────────────┐
+│     CLI     │────▶│  Compress &   │────▶│   Telegram   │
+│    FUSE     │     │    Encrypt    │     │   Bot API    │
+└─────────────┘     └───────────────┘     └──────────────┘
+       │                    │                    │
+       ▼                    ▼                    ▼
+┌─────────────┐     ┌───────────────┐     ┌──────────────┐
+│ SQLite Index│     │  49MB Chunks  │     │ Private Chat │
+└─────────────┘     └───────────────┘     └──────────────┘
 ```
 
-
+## Why TAS?
 
-
+| Feature | TAS | Session-based tools (e.g. teldrive) |
+|---------|:---:|:-----------------------------------:|
+| Account ban risk | **None** (Bot API) | High (session hijack detection) |
+| Encryption | AES-256-GCM | Usually none |
+| Dependencies | SQLite only | Rclone, external DB |
+| Setup complexity | 2 minutes | Docker + multiple services |
 
-
+**Key differences:**
+- Uses **Bot API**, not session-based auth → Telegram can't ban your account
+- **Encryption by default** — files encrypted before leaving your machine
+- **Local-first** — SQLite index, no cloud dependencies
+- **FUSE mount** — use Telegram like a folder
 
-
+## Security Model
 
-
+| Component | Implementation |
+|-----------|----------------|
+| Cipher | AES-256-GCM |
+| Key derivation | PBKDF2-SHA512, 100,000 iterations |
+| Salt | 32 bytes, random per file |
+| IV | 12 bytes, random per file |
+| Auth tag | 16 bytes (integrity) |
 
-
-npm install -g @nightowne/tas-cli
-tas init
-tas mount ~/cloud
-# Now use ~/cloud like any folder. Files go to Telegram.
-```
-
----
+Your password never leaves your machine. Telegram stores encrypted blobs.
 
-##
+## Limitations
 
-
-
-
-
-
-| **Mounts as folder** | ✅ | ❌ | ❌ |
-| **Your data, your control** | ✅ | ❌ | ❌ |
+- **Not a backup** — Telegram can delete content without notice
+- **No versioning** — overwriting a file deletes the old version
+- **49MB chunks** — files split due to Bot API limits
+- **FUSE required** — mount feature needs `libfuse` on Linux/macOS
+- **Single user** — designed for personal use, not multi-tenant
 
-
+## Quick Start
 
-## 🚀 Quick Start
-
-### 1. Get a Telegram Bot (30 seconds)
-- Message [@BotFather](https://t.me/BotFather) on Telegram
-- Send `/newbot`, pick a name
-- Copy the token
-
-### 2. Install & Setup
 ```bash
-
+# Install
+npm install -g @nightowne/tas-cli
+
+# Setup (creates bot connection + encryption password)
 tas init
-# Paste token, set password, message your bot
-```
 
-
-```bash
-# Upload files
+# Upload a file
 tas push secret.pdf
 
-#
-tas
-cp anything.zip ~/cloud/    # uploads to Telegram
-open ~/cloud/secret.pdf     # downloads from Telegram
-```
-
----
-
-## The Folder Thing
-
-This is the part that makes TAS different. Run `tas mount ~/cloud` and you get a folder that:
-
-- **Looks normal** in your file manager
-- **Drag & drop** = upload to Telegram
-- **Open files** = download from Telegram
-- **Delete files** = removes from Telegram
-
-It's like Dropbox, except free and you own your data.
-
-```bash
-$ ls ~/cloud
-secret.pdf photos.zip notes.txt
-
-$ cp newfile.doc ~/cloud/
-# Compresses → Encrypts → Uploads to Telegram
-```
-
----
+# Download a file
+tas pull secret.pdf
 
-
-
-```bash
-tas tag add report.pdf work finance
-tas tag list work            # shows all "work" files
-tas tag remove report.pdf finance
-```
-
----
-
-## 🔄 Auto-Sync Folders
-
-Dropbox-style sync. Any changes in the folder → auto-upload to Telegram.
-
-```bash
-tas sync add ~/Documents/work
-tas sync start
-# Now any file changes auto-sync to Telegram
-```
-
-Two-way sync:
-```bash
-tas sync pull                # Download everything from Telegram → local
+# Mount as folder (requires libfuse)
+tas mount ~/cloud
 ```
 
-
-
-
-
-
-
-
-
-
-
-
-
-
-
+### Prerequisites
+- Node.js ≥18
+- Telegram account + bot token from [@BotFather](https://t.me/BotFather)
+- `libfuse` for mount feature:
+```bash
+# Debian/Ubuntu
+sudo apt install fuse libfuse-dev
+
+# Fedora
+sudo dnf install fuse fuse-devel
+
+# macOS
+brew install macfuse
+```
+
+## CLI Reference
 
 ```bash
-
-tas
-tas
-tas
-tas
-tas
-tas
-
-
-tas
-tas
+# Core
+tas init                      # Setup wizard
+tas push <file>               # Upload file
+tas pull <file|hash>          # Download file
+tas list [-l]                 # List files (long format)
+tas delete <file|hash>        # Remove file
+tas status                    # Show stats
+
+# Search & Resume (v1.1.0)
+tas search <query>            # Search by filename
+tas search -t <query>         # Search by tag
+tas resume                    # Resume interrupted uploads
+
+# FUSE Mount
+tas mount <path>              # Mount as folder
+tas unmount <path>            # Unmount
+
+# Tags
+tas tag add <file> <tags...>     # Add tags
+tas tag remove <file> <tags...>  # Remove tags
+tas tag list [tag]               # List tags or files by tag
+
+# Sync (Dropbox-style)
+tas sync add <folder>         # Register folder for sync
+tas sync start                # Start watching
+tas sync pull                 # Download all to sync folders
+tas sync status               # Show sync status
+
+# Verification
+tas verify                    # Check file integrity
 ```
 
-
-
-## ⚙️ Auto-Start on Boot
+## Auto-Start (systemd)
 
-
+See [systemd/README.md](systemd/README.md) for running sync as a service.
 
-
-
-## 🧪 Development
+## Development
 
 ```bash
 git clone https://github.com/ixchio/tas

@@ -161,14 +131,19 @@ npm install
 npm test                     # 28 tests
 ```
 
-
-
-
-
-
-
-
+### Project Structure
+```
+src/
+├── cli.js       # Command definitions
+├── index.js     # Upload/download pipeline
+├── crypto/      # AES-256-GCM encryption
+├── db/          # SQLite file index
+├── fuse/        # FUSE filesystem mount
+├── sync/        # Folder sync engine
+├── telegram/    # Bot API client
+└── utils/       # Compression, chunking
+```
 
-
+## License
 
-
+MIT
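The new Security Model section pins the encryption down to concrete parameters: AES-256-GCM, PBKDF2-SHA512 with 100,000 iterations, a 32-byte salt and 12-byte IV per file, and a 16-byte auth tag. The sketch below shows how those parameters fit together using Node's built-in `crypto` module. It is illustrative only, not the package's actual `src/crypto/` implementation, and the salt + IV + tag + ciphertext blob layout is an assumption made for the example.

```js
// Illustrative sketch of the README's stated parameters; not TAS's src/crypto/ module.
import { randomBytes, pbkdf2Sync, createCipheriv, createDecipheriv } from 'node:crypto';

const ITERATIONS = 100_000;   // PBKDF2-SHA512 iterations (per README)
const SALT_LEN = 32;          // bytes, random per file
const IV_LEN = 12;            // bytes, random per file
const TAG_LEN = 16;           // GCM auth tag bytes

export function encryptBlob(plaintext, password) {
  const salt = randomBytes(SALT_LEN);
  const iv = randomBytes(IV_LEN);
  const key = pbkdf2Sync(password, salt, ITERATIONS, 32, 'sha512'); // 32-byte key => AES-256
  const cipher = createCipheriv('aes-256-gcm', key, iv);
  const ciphertext = Buffer.concat([cipher.update(plaintext), cipher.final()]);
  const authTag = cipher.getAuthTag();
  // Assumed layout: salt + iv + tag + ciphertext, so decryption is self-contained.
  return Buffer.concat([salt, iv, authTag, ciphertext]);
}

export function decryptBlob(blob, password) {
  const salt = blob.subarray(0, SALT_LEN);
  const iv = blob.subarray(SALT_LEN, SALT_LEN + IV_LEN);
  const authTag = blob.subarray(SALT_LEN + IV_LEN, SALT_LEN + IV_LEN + TAG_LEN);
  const ciphertext = blob.subarray(SALT_LEN + IV_LEN + TAG_LEN);
  const key = pbkdf2Sync(password, salt, ITERATIONS, 32, 'sha512');
  const decipher = createDecipheriv('aes-256-gcm', key, iv);
  decipher.setAuthTag(authTag);                 // GCM rejects tampered blobs here
  return Buffer.concat([decipher.update(ciphertext), decipher.final()]);
}
```

With per-file random salt and IV, identical files produce different ciphertexts, which matches the "Telegram stores encrypted blobs" claim above.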
package/package.json
CHANGED

@@ -1,6 +1,6 @@
 {
   "name": "@nightowne/tas-cli",
-  "version": "1.0.0",
+  "version": "1.1.0",
   "description": "📦 Telegram as Storage - Free encrypted cloud storage via Telegram. Mount Telegram as a folder!",
   "type": "module",
   "main": "src/index.js",

@@ -43,13 +43,13 @@
     "LICENSE"
   ],
   "dependencies": {
-    "
+    "better-sqlite3": "^12.6.2",
+    "chalk": "^5.3.0",
     "commander": "^12.1.0",
-    "better-sqlite3": "^11.6.0",
     "fuse-native": "^2.2.6",
-    "
-    "
-    "
+    "inquirer": "^12.2.0",
+    "node-telegram-bot-api": "^0.66.0",
+    "ora": "^8.1.1"
   },
   "engines": {
     "node": ">=18.0.0"
package/src/cli.js
CHANGED

@@ -179,16 +179,33 @@ program
 
     spinner.start('Processing file...');
 
+    // Import progress bar
+    const { ProgressBar } = await import('./utils/progress.js');
+    let progressBar = null;
+
     // Process and upload
     const result = await processFile(file, {
       password,
       dataDir: DATA_DIR,
       customName: options.name,
       config,
-      onProgress: (msg) => {
+      onProgress: (msg) => {
+        if (!progressBar) spinner.text = msg;
+      },
+      onByteProgress: ({ uploaded, total }) => {
+        if (!progressBar) {
+          spinner.stop();
+          progressBar = new ProgressBar({ label: 'Uploading', total });
+        }
+        progressBar.update(uploaded);
+      }
     });
 
-
+    if (progressBar) {
+      progressBar.complete(`Uploaded: ${result.filename}`);
+    } else {
+      spinner.succeed(`Uploaded: ${chalk.green(result.filename)}`);
+    }
     console.log(chalk.dim(`  Hash: ${result.hash}`));
     console.log(chalk.dim(`  Size: ${formatBytes(result.originalSize)} → ${formatBytes(result.storedSize)}`));
     console.log(chalk.dim(`  Chunks: ${result.chunks}`));

@@ -247,16 +264,33 @@ program
 
     spinner.start('Downloading...');
 
+    // Import progress bar
+    const { ProgressBar } = await import('./utils/progress.js');
+    let progressBar = null;
+
     const outputPath = options.output || fileRecord.filename;
     await retrieveFile(fileRecord, {
       password,
       dataDir: DATA_DIR,
       outputPath,
       config,
-      onProgress: (msg) => {
+      onProgress: (msg) => {
+        if (!progressBar) spinner.text = msg;
+      },
+      onByteProgress: ({ downloaded, total }) => {
+        if (!progressBar && total > 0) {
+          spinner.stop();
+          progressBar = new ProgressBar({ label: 'Downloading', total });
+        }
+        if (progressBar) progressBar.update(downloaded);
+      }
     });
 
-
+    if (progressBar) {
+      progressBar.complete(`Downloaded: ${outputPath}`);
+    } else {
+      spinner.succeed(`Downloaded: ${chalk.green(outputPath)}`);
+    }
 
   } catch (err) {
     spinner.fail(`Download failed: ${err.message}`);

@@ -1005,6 +1039,201 @@ function formatBytes(bytes)
   return parseFloat((bytes / Math.pow(k, i)).toFixed(2)) + ' ' + sizes[i];
 }
 
+// ============== SEARCH COMMAND ==============
+program
+  .command('search <query>')
+  .description('Search files by name or tag')
+  .option('-t, --tag', 'Search by tag instead of filename')
+  .action(async (query, options) => {
+    try {
+      const db = new FileIndex(path.join(DATA_DIR, 'index.db'));
+      db.init();
+
+      const results = options.tag
+        ? db.searchByTag(query)
+        : db.search(query);
+
+      if (results.length === 0) {
+        console.log(chalk.yellow(`\nNo files found matching "${query}"\n`));
+        db.close();
+        return;
+      }
+
+      console.log(chalk.cyan(`\nSearch Results for "${query}" (${results.length})\n`));
+
+      for (const file of results) {
+        const tags = file.tags ? chalk.dim(` [${file.tags}]`) : '';
+        console.log(`  ${chalk.blue('•')} ${file.filename} ${chalk.dim(`(${formatBytes(file.original_size)})`)}${tags}`);
+      }
+
+      console.log();
+      db.close();
+    } catch (err) {
+      console.error(chalk.red('Search failed:'), err.message);
+      process.exit(1);
+    }
+  });
+
+// ============== RESUME COMMAND ==============
+program
+  .command('resume')
+  .description('Resume interrupted uploads')
+  .action(async () => {
+    try {
+      const db = new FileIndex(path.join(DATA_DIR, 'index.db'));
+      db.init();
+
+      const pending = db.getPendingUploads();
+
+      if (pending.length === 0) {
+        console.log(chalk.yellow('\nNo interrupted uploads found.\n'));
+        db.close();
+        return;
+      }
+
+      console.log(chalk.cyan(`\nPending Uploads (${pending.length})\n`));
+
+      for (const upload of pending) {
+        const progress = Math.round((upload.uploaded_chunks / upload.total_chunks) * 100);
+        console.log(`  ${chalk.blue('•')} ${upload.filename}`);
+        console.log(chalk.dim(`    Progress: ${upload.uploaded_chunks}/${upload.total_chunks} chunks (${progress}%)`));
+        console.log(chalk.dim(`    Started: ${new Date(upload.created_at).toLocaleString()}`));
+      }
+
+      console.log();
+
+      // Ask if user wants to resume
+      const { action } = await inquirer.prompt([
+        {
+          type: 'list',
+          name: 'action',
+          message: 'What would you like to do?',
+          choices: [
+            { name: 'Resume all pending uploads', value: 'resume' },
+            { name: 'Clear all pending uploads', value: 'clear' },
+            { name: 'Cancel', value: 'cancel' }
+          ]
+        }
+      ]);
+
+      if (action === 'cancel') {
+        db.close();
+        return;
+      }
+
+      if (action === 'clear') {
+        for (const upload of pending) {
+          // Clean up temp files
+          const chunks = db.getPendingChunks(upload.id);
+          for (const chunk of chunks) {
+            try { fs.unlinkSync(chunk.chunk_path); } catch (e) { }
+          }
+          if (upload.temp_dir) {
+            try { fs.rmdirSync(upload.temp_dir); } catch (e) { }
+          }
+          db.deletePendingUpload(upload.id);
+        }
+        console.log(chalk.green('✓ Cleared all pending uploads'));
+        db.close();
+        return;
+      }
+
+      // Resume uploads
+      const configPath = path.join(DATA_DIR, 'config.json');
+      if (!fs.existsSync(configPath)) {
+        console.log(chalk.red('✗ TAS not initialized.'));
+        db.close();
+        return;
+      }
+      const config = JSON.parse(fs.readFileSync(configPath, 'utf-8'));
+
+      // Get password
+      const { password } = await inquirer.prompt([
+        {
+          type: 'password',
+          name: 'password',
+          message: 'Enter your encryption password:',
+          mask: '*'
+        }
+      ]);
+
+      const encryptor = new Encryptor(password);
+      if (encryptor.getPasswordHash() !== config.passwordHash) {
+        console.log(chalk.red('✗ Incorrect password'));
+        db.close();
+        return;
+      }
+
+      // Connect to Telegram
+      const { TelegramClient } = await import('./telegram/client.js');
+      const client = new TelegramClient(DATA_DIR);
+      await client.initialize(config.botToken);
+      client.setChatId(config.chatId);
+
+      for (const upload of pending) {
+        console.log(chalk.cyan(`\n📤 Resuming: ${upload.filename}`));
+
+        const chunks = db.getPendingChunks(upload.id);
+        const pendingChunks = chunks.filter(c => !c.uploaded);
+
+        for (const chunk of pendingChunks) {
+          if (!fs.existsSync(chunk.chunk_path)) {
+            console.log(chalk.red(`  ✗ Chunk file missing: ${chunk.chunk_path}`));
+            continue;
+          }
+
+          console.log(chalk.dim(`  → Uploading chunk ${chunk.chunk_index + 1}/${upload.total_chunks}...`));
+
+          const caption = upload.total_chunks > 1
+            ? `📦 ${upload.filename} (${chunk.chunk_index + 1}/${upload.total_chunks})`
+            : `📦 ${upload.filename}`;
+
+          const result = await client.sendFile(chunk.chunk_path, caption);
+          db.markChunkUploaded(upload.id, chunk.chunk_index, result.messageId.toString(), result.fileId);
+
+          // Clean up temp file
+          fs.unlinkSync(chunk.chunk_path);
+        }
+
+        // All chunks uploaded - finalize
+        const allChunks = db.getPendingChunks(upload.id);
+        if (allChunks.every(c => c.uploaded)) {
+          // Add to main files table
+          const fileId = db.addFile({
+            filename: upload.filename,
+            hash: upload.hash,
+            originalSize: upload.original_size,
+            storedSize: upload.original_size, // Approximate
+            chunks: upload.total_chunks,
+            compressed: true
+          });
+
+          // Add chunk records
+          for (const chunk of allChunks) {
+            db.addChunk(fileId, chunk.chunk_index, chunk.message_id, 0);
+            db.db.prepare('UPDATE chunks SET file_telegram_id = ? WHERE file_id = ? AND chunk_index = ?')
+              .run(chunk.file_telegram_id, fileId, chunk.chunk_index);
+          }
+
+          // Clean up pending record
+          db.deletePendingUpload(upload.id);
+          if (upload.temp_dir) {
+            try { fs.rmdirSync(upload.temp_dir); } catch (e) { }
+          }
+
+          console.log(chalk.green(`  ✓ Completed: ${upload.filename}`));
+        }
+      }
+
+      console.log(chalk.green('\n✨ All uploads resumed!\n'));
+      db.close();
+
+    } catch (err) {
+      console.error(chalk.red('Resume failed:'), err.message);
+      process.exit(1);
+    }
+  });
+
 program.parse();
 
 
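The push and pull hunks above add an `onByteProgress` callback next to the existing `onProgress` one: the ora spinner runs until byte totals are known, then hands off to the new `ProgressBar`. Below is a minimal sketch of that handoff in isolation, assuming it runs from the repository root with `"type": "module"`; the five-chunk loop simulates `processFile`/`sendFile` and is not part of the package.

```js
// Sketch of the spinner-to-progress-bar handoff used by `tas push` / `tas pull`.
import ora from 'ora';
import { ProgressBar } from './src/utils/progress.js';

const CHUNK = 49 * 1024 * 1024;         // 49MB Bot API chunk size
const totalBytes = 5 * CHUNK;           // pretend file: five chunks

const spinner = ora('Processing file...').start();
let progressBar = null;

const onByteProgress = ({ uploaded, total }) => {
  if (!progressBar) {
    spinner.stop();                     // spinner only until the first byte report
    progressBar = new ProgressBar({ label: 'Uploading', total });
  }
  progressBar.update(uploaded);
};

let uploaded = 0;
for (let chunk = 1; chunk <= 5; chunk++) {
  await new Promise((resolve) => setTimeout(resolve, 300)); // stand-in for sendFile()
  uploaded += CHUNK;
  onByteProgress({ uploaded, total: totalBytes });
}
progressBar.complete('Uploaded: example.bin');
```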
package/src/db/index.js
CHANGED

@@ -91,6 +91,34 @@ export class FileIndex {
         UNIQUE(folder_id, relative_path)
       );
     `);
+
+    // Create pending_uploads table for resume functionality
+    this.db.exec(`
+      CREATE TABLE IF NOT EXISTS pending_uploads (
+        id INTEGER PRIMARY KEY AUTOINCREMENT,
+        filename TEXT NOT NULL,
+        file_path TEXT NOT NULL,
+        hash TEXT NOT NULL,
+        original_size INTEGER NOT NULL,
+        total_chunks INTEGER NOT NULL,
+        uploaded_chunks INTEGER NOT NULL DEFAULT 0,
+        temp_dir TEXT,
+        created_at TEXT NOT NULL DEFAULT (datetime('now')),
+        UNIQUE(hash)
+      );
+
+      CREATE TABLE IF NOT EXISTS pending_chunks (
+        id INTEGER PRIMARY KEY AUTOINCREMENT,
+        pending_id INTEGER NOT NULL,
+        chunk_index INTEGER NOT NULL,
+        chunk_path TEXT NOT NULL,
+        uploaded INTEGER NOT NULL DEFAULT 0,
+        message_id TEXT,
+        file_telegram_id TEXT,
+        FOREIGN KEY (pending_id) REFERENCES pending_uploads(id) ON DELETE CASCADE,
+        UNIQUE(pending_id, chunk_index)
+      );
+    `);
   }
 
   /**

@@ -345,6 +373,125 @@ export class FileIndex {
     stmt.run(folderId, relativePath);
   }
 
+  // ============== SEARCH METHODS ==============
+
+  /**
+   * Search files by filename (fuzzy match)
+   */
+  search(query) {
+    const stmt = this.db.prepare(`
+      SELECT f.*, GROUP_CONCAT(t.tag) as tags
+      FROM files f
+      LEFT JOIN tags t ON f.id = t.file_id
+      WHERE f.filename LIKE ?
+      GROUP BY f.id
+      ORDER BY f.created_at DESC
+    `);
+    return stmt.all(`%${query}%`);
+  }
+
+  /**
+   * Search files by tag
+   */
+  searchByTag(query) {
+    const stmt = this.db.prepare(`
+      SELECT f.*, GROUP_CONCAT(t.tag) as tags
+      FROM files f
+      INNER JOIN tags t ON f.id = t.file_id
+      WHERE t.tag LIKE ?
+      GROUP BY f.id
+      ORDER BY f.created_at DESC
+    `);
+    return stmt.all(`%${query}%`);
+  }
+
+  // ============== RESUME UPLOAD METHODS ==============
+
+  /**
+   * Add a pending upload
+   */
+  addPendingUpload(data) {
+    const stmt = this.db.prepare(`
+      INSERT OR REPLACE INTO pending_uploads
+      (filename, file_path, hash, original_size, total_chunks, uploaded_chunks, temp_dir)
+      VALUES (?, ?, ?, ?, ?, ?, ?)
+    `);
+    const result = stmt.run(
+      data.filename,
+      data.filePath,
+      data.hash,
+      data.originalSize,
+      data.totalChunks,
+      data.uploadedChunks || 0,
+      data.tempDir
+    );
+    return result.lastInsertRowid;
+  }
+
+  /**
+   * Add a pending chunk
+   */
+  addPendingChunk(pendingId, chunkIndex, chunkPath) {
+    const stmt = this.db.prepare(`
+      INSERT OR REPLACE INTO pending_chunks (pending_id, chunk_index, chunk_path, uploaded)
+      VALUES (?, ?, ?, 0)
+    `);
+    stmt.run(pendingId, chunkIndex, chunkPath);
+  }
+
+  /**
+   * Mark chunk as uploaded
+   */
+  markChunkUploaded(pendingId, chunkIndex, messageId, fileTelegramId) {
+    const stmt = this.db.prepare(`
+      UPDATE pending_chunks
+      SET uploaded = 1, message_id = ?, file_telegram_id = ?
+      WHERE pending_id = ? AND chunk_index = ?
+    `);
+    stmt.run(messageId, fileTelegramId, pendingId, chunkIndex);
+
+    // Update uploaded count
+    this.db.prepare(`
+      UPDATE pending_uploads SET uploaded_chunks = uploaded_chunks + 1 WHERE id = ?
+    `).run(pendingId);
+  }
+
+  /**
+   * Get all pending uploads
+   */
+  getPendingUploads() {
+    const stmt = this.db.prepare(`
+      SELECT * FROM pending_uploads ORDER BY created_at DESC
+    `);
+    return stmt.all();
+  }
+
+  /**
+   * Get pending chunks for an upload
+   */
+  getPendingChunks(pendingId) {
+    const stmt = this.db.prepare(`
+      SELECT * FROM pending_chunks WHERE pending_id = ? ORDER BY chunk_index
+    `);
+    return stmt.all(pendingId);
+  }
+
+  /**
+   * Delete a pending upload (and its chunks via CASCADE)
+   */
+  deletePendingUpload(pendingId) {
+    const stmt = this.db.prepare('DELETE FROM pending_uploads WHERE id = ?');
+    stmt.run(pendingId);
+  }
+
+  /**
+   * Get pending upload by hash
+   */
+  getPendingByHash(hash) {
+    const stmt = this.db.prepare('SELECT * FROM pending_uploads WHERE hash = ?');
+    return stmt.get(hash);
+  }
+
   /**
    * Close database connection
    */
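The new `pending_uploads` and `pending_chunks` tables plus the methods above are the bookkeeping behind `tas resume`. Below is a hypothetical walkthrough of that lifecycle calling `FileIndex` directly; the directory, filenames, hash, sizes, and Telegram IDs are made up for illustration, and the import path assumes the repository root.

```js
// Hypothetical resume-bookkeeping walkthrough; method calls mirror src/db/index.js.
import fs from 'node:fs';
import path from 'node:path';
import { FileIndex } from './src/db/index.js';

const dataDir = '/tmp/tas-demo';                 // made-up data directory
fs.mkdirSync(dataDir, { recursive: true });

const db = new FileIndex(path.join(dataDir, 'index.db'));
db.init();                                       // also creates pending_* tables in 1.1.0

// 1. Before uploading, record the upload and its chunk files.
const pendingId = db.addPendingUpload({
  filename: 'video.mkv',
  filePath: '/home/user/video.mkv',
  hash: 'deadbeef',
  originalSize: 3 * 49 * 1024 * 1024,
  totalChunks: 3,
  tempDir: path.join(dataDir, 'chunks'),
});
for (let i = 0; i < 3; i++) {
  db.addPendingChunk(pendingId, i, path.join(dataDir, 'chunks', `video.mkv.${i}`));
}

// 2. As each chunk lands on Telegram, store its message and file IDs.
db.markChunkUploaded(pendingId, 0, 'message-id-0', 'telegram-file-id-0');

// 3. `tas resume` later lists what is left and finishes or clears it.
console.log(db.getPendingUploads().length);                                      // 1
console.log(db.getPendingChunks(pendingId).filter((c) => !c.uploaded).length);   // 2 remaining

// 4. Once every chunk is uploaded, the pending record is removed (CASCADE drops its chunks).
db.deletePendingUpload(pendingId);
db.close();
```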
package/src/index.js
CHANGED

@@ -19,7 +19,7 @@ const TELEGRAM_CHUNK_SIZE = 49 * 1024 * 1024;
  * Process and upload a file to Telegram
  */
 export async function processFile(filePath, options) {
-  const { password, dataDir, customName, config, onProgress } = options;
+  const { password, dataDir, customName, config, onProgress, onByteProgress } = options;
 
   onProgress?.('Reading file...');
 

@@ -109,6 +109,9 @@ export async function processFile(filePath, options) {
     compressed
   });
 
+  let uploadedBytes = 0;
+  const totalBytes = chunkFiles.reduce((acc, c) => acc + c.size, 0);
+
   for (const chunk of chunkFiles) {
     onProgress?.(`Uploading chunk ${chunk.index + 1}/${chunkFiles.length}...`);
 

@@ -118,6 +121,9 @@ export async function processFile(filePath, options) {
 
     const result = await client.sendFile(chunk.path, caption);
 
+    uploadedBytes += chunk.size;
+    onByteProgress?.({ uploaded: uploadedBytes, total: totalBytes, chunk: chunk.index + 1, totalChunks: chunkFiles.length });
+
     // Store file_id instead of message_id for downloads
     db.addChunk(fileId, chunk.index, result.messageId.toString(), chunk.size);
 

@@ -152,7 +158,7 @@ export async function processFile(filePath, options) {
  * Retrieve a file from Telegram
  */
 export async function retrieveFile(fileRecord, options) {
-  const { password, dataDir, outputPath, config, onProgress } = options;
+  const { password, dataDir, outputPath, config, onProgress, onByteProgress } = options;
 
   onProgress?.('Connecting to Telegram...');
 

@@ -174,11 +180,15 @@ export async function retrieveFile(fileRecord, options) {
 
   // Download all chunks
   const downloadedChunks = [];
+  let downloadedBytes = 0;
+  const totalBytes = fileRecord.stored_size || chunks.reduce((acc, c) => acc + (c.size || 0), 0);
 
   for (const chunk of chunks) {
     onProgress?.(`Downloading chunk ${chunk.chunk_index + 1}/${chunks.length}...`);
 
     const data = await client.downloadFile(chunk.file_telegram_id);
+    downloadedBytes += data.length;
+    onByteProgress?.({ downloaded: downloadedBytes, total: totalBytes, chunk: chunk.chunk_index + 1, totalChunks: chunks.length });
 
     // Parse header
     const header = parseHeader(data);
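`processFile` now totals bytes across the 49MB chunks (`TELEGRAM_CHUNK_SIZE = 49 * 1024 * 1024`) and reports `uploaded`/`total` to `onByteProgress` after each `sendFile`. The small sketch below shows that chunk accounting; `planChunks` is an illustrative helper written for this example, not a function exported by the package.

```js
// Chunk accounting as in src/index.js: split at the 49MB Bot API limit,
// then report progress in bytes across all chunks. Sizes are examples.
const TELEGRAM_CHUNK_SIZE = 49 * 1024 * 1024;

function planChunks(fileSize) {
  const count = Math.ceil(fileSize / TELEGRAM_CHUNK_SIZE);
  return Array.from({ length: count }, (_, index) => ({
    index,
    size: Math.min(TELEGRAM_CHUNK_SIZE, fileSize - index * TELEGRAM_CHUNK_SIZE),
  }));
}

const chunks = planChunks(120 * 1024 * 1024);                  // a 120MB file
const totalBytes = chunks.reduce((acc, c) => acc + c.size, 0); // same reduce as processFile
console.log(chunks.length);  // 3 chunks: 49MB + 49MB + 22MB
console.log(totalBytes);     // 125829120 bytes (120MB)
```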
package/src/utils/progress.js
ADDED

@@ -0,0 +1,119 @@
+/**
+ * Progress bar utility with speed calculation
+ * Shows actual MB/s instead of boring spinners
+ */
+
+import chalk from 'chalk';
+
+export class ProgressBar {
+  constructor(options = {}) {
+    this.total = options.total || 100;
+    this.width = options.width || 30;
+    this.label = options.label || 'Progress';
+    this.current = 0;
+    this.startTime = Date.now();
+    this.lastUpdate = 0;
+    this.lastBytes = 0;
+    this.speed = 0;
+  }
+
+  /**
+   * Update progress
+   * @param {number} current - Current bytes processed
+   */
+  update(current) {
+    this.current = current;
+
+    const now = Date.now();
+    const elapsed = now - this.lastUpdate;
+
+    // Calculate speed every 200ms
+    if (elapsed >= 200) {
+      const bytesDelta = current - this.lastBytes;
+      this.speed = (bytesDelta / elapsed) * 1000; // bytes per second
+      this.lastUpdate = now;
+      this.lastBytes = current;
+    }
+
+    this.render();
+  }
+
+  /**
+   * Render the progress bar
+   */
+  render() {
+    const percent = Math.min(100, Math.round((this.current / this.total) * 100));
+    const filled = Math.round((percent / 100) * this.width);
+    const empty = this.width - filled;
+
+    const bar = chalk.cyan('█'.repeat(filled)) + chalk.dim('░'.repeat(empty));
+    const speedStr = this.formatSpeed(this.speed);
+    const sizeStr = `${this.formatBytes(this.current)}/${this.formatBytes(this.total)}`;
+
+    // Calculate ETA
+    const eta = this.speed > 0
+      ? Math.round((this.total - this.current) / this.speed)
+      : 0;
+    const etaStr = eta > 0 ? this.formatTime(eta) : '--:--';
+
+    // Clear line and write
+    process.stdout.write(`\r${this.label} ${bar} ${percent}% | ${sizeStr} | ${speedStr} | ETA: ${etaStr}  `);
+  }
+
+  /**
+   * Complete the progress bar
+   */
+  complete(message) {
+    const totalTime = (Date.now() - this.startTime) / 1000;
+    const avgSpeed = this.total / totalTime;
+
+    process.stdout.write('\r' + ' '.repeat(100) + '\r'); // Clear line
+    console.log(chalk.green(`✓ ${message || this.label}`) +
+      chalk.dim(` (${this.formatBytes(this.total)} in ${totalTime.toFixed(1)}s, avg ${this.formatSpeed(avgSpeed)})`));
+  }
+
+  /**
+   * Format bytes to human readable
+   */
+  formatBytes(bytes) {
+    if (bytes === 0) return '0 B';
+    const k = 1024;
+    const sizes = ['B', 'KB', 'MB', 'GB'];
+    const i = Math.floor(Math.log(bytes) / Math.log(k));
+    return (bytes / Math.pow(k, i)).toFixed(1) + ' ' + sizes[i];
+  }
+
+  /**
+   * Format speed to human readable
+   */
+  formatSpeed(bytesPerSec) {
+    if (bytesPerSec === 0) return '-- MB/s';
+    const mbps = bytesPerSec / (1024 * 1024);
+    if (mbps >= 1) {
+      return mbps.toFixed(1) + ' MB/s';
+    }
+    const kbps = bytesPerSec / 1024;
+    return kbps.toFixed(0) + ' KB/s';
+  }
+
+  /**
+   * Format seconds to mm:ss
+   */
+  formatTime(seconds) {
+    const mins = Math.floor(seconds / 60);
+    const secs = seconds % 60;
+    return `${mins}:${secs.toString().padStart(2, '0')}`;
+  }
+}
+
+/**
+ * Create a simple progress callback for ora-style usage
+ */
+export function createProgressCallback(label, total) {
+  const bar = new ProgressBar({ label, total });
+  return {
+    update: (current) => bar.update(current),
+    complete: (msg) => bar.complete(msg),
+    bar
+  };
+}
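`src/utils/progress.js` is new in 1.1.0 and is what the CLI hands off to once byte totals are known. A usage sketch follows, with a simulated transfer standing in for a real upload; the file name and timing are made up, and the import path assumes the repository root.

```js
// Usage sketch for the new ProgressBar; the interval loop fakes a 10MB transfer.
import { ProgressBar } from './src/utils/progress.js';

const total = 10 * 1024 * 1024;                              // pretend we move 10MB
const bar = new ProgressBar({ label: 'Uploading', total, width: 30 });

let sent = 0;
const timer = setInterval(() => {
  sent = Math.min(total, sent + 512 * 1024);                 // +512KB per tick
  bar.update(sent);                                          // redraws bar, speed, and ETA in place
  if (sent === total) {
    clearInterval(timer);
    bar.complete('Uploaded: demo.bin');                      // prints summary with average speed
  }
}, 100);
```

`createProgressCallback(label, total)` wraps the same class for callers that only want `update`/`complete` functions instead of the full object.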