@hasindu---7/ff-link-extract 1.0.1 → 2.0.1

package/README.md CHANGED
@@ -1,58 +1,252 @@
  # ff-link-extract
 
- CLI to extract direct `https://fuckingfast.co/dl/...` links from fuckingfast page URLs.
+ Extracts direct download links from fuckingfast page links and generates clean output files.
 
- ## Install
+ ## Commands Overview
+
+ This package ships with three CLI commands:
+
+ - `ff-link-extract`: Main extractor. Reads input links (plain links, HTML, or supported paste URLs) and generates direct download outputs.
+ - `ff-link-extract-autorun`: Opens links from a text file in your browser at timed intervals using config settings.
+ - `ff-link-extract-init`: Creates starter files (`urls.txt` and `autorun.config.json`) in your current folder.
+
+ Quick examples:
+
+ ```bash
+ # 1) Create starter files in the current folder
+ ff-link-extract-init
+
+ # 2) Extract direct links
+ ff-link-extract -i urls.txt -o all-direct-links
+
+ # 3) Open extracted links at intervals
+ ff-link-extract-autorun -c autorun.config.json
+ ```
+
+ The extractor reads your source list, finds page URLs, fetches each page, and pulls out the real download link in this format:
+
+ https://fuckingfast.co/dl/...
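Condensed for illustration (this block is not part of the published diff), the matching step described above amounts to scanning arbitrary text for fuckingfast URLs and deduplicating while preserving order, mirroring `extractFuckingfastUrls` further down in this diff:

```javascript
// Find fuckingfast URLs anywhere in the text (plain lines, HTML, mixed),
// then dedupe while preserving first-seen order.
function extractFuckingfastUrls(inputText) {
  const matches = inputText.match(/https?:\/\/fuckingfast\.co\/[^\s"'<>]+/gi) || [];
  return [...new Set(matches.map((u) => u.trim()))];
}

const sample = 'see <a href="https://fuckingfast.co/abc#file">x</a>\nhttps://fuckingfast.co/abc#file';
console.log(extractFuckingfastUrls(sample)); // duplicates collapse to one URL
```

Because the character class excludes quotes and angle brackets, the same regex works on raw HTML and plain text alike.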
+
+ ## Who This Is For
+
+ - Anyone who has a list of fuckingfast page links.
+ - Users who want one-click output links.
+ - Users who want both machine-readable and human-readable results.
+
+ ## Fastest Way To Use (Recommended)
+
+ Run it without installing anything:
+
+ ```bash
+ npx -y @hasindu---7/ff-link-extract -i urls.txt -o all-direct-links
+ ```
+
+ Why this is recommended:
+
+ - Always runs the latest published version.
+ - No global install issues.
+ - Works on Windows, macOS, and Linux.
+
+ ## Global Install (Optional)
 
  ```bash
  npm install -g @hasindu---7/ff-link-extract
  ```
 
- ## Usage
+ Then run:
 
  ```bash
  ff-link-extract -i urls.txt -o all-direct-links
  ```
 
- ## Input formats supported
+ Create starter files in your current folder:
+
+ ```bash
+ ff-link-extract-init
+ ```
+
+ This creates:
+
+ - `urls.txt`
+ - `autorun.config.json`
+
+ So you do not need to dig templates out of `node_modules`.
+
+ ## Input File Formats
+
+ The input file can be in any of these formats:
 
  - One URL per line.
- - Raw HTML (for example copied `<a href="...">` blocks).
- - Mixed text that contains fuckingfast URLs.
+ - Raw HTML blocks (including anchor tags).
+ - Mixed text that contains fuckingfast links.
+ - A PrivateBin paste URL (for example from paste.fitgirl-repacks.site, with the decryption key in the URL hash).
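The key-in-hash convention can be sketched as follows, condensed from the extractor source later in this diff (`pasteKeyFromUrl` is an illustrative helper name, not part of the package):

```javascript
// A PrivateBin paste URL keeps the client-side decryption key in the
// URL fragment; the server never sees it.
function pasteKeyFromUrl(pageUrl) {
  const u = new URL(pageUrl);
  return (u.hash || '').replace(/^#/, '');
}

// The encrypted payload is fetched from the same URL (minus the hash)
// with the X-Requested-With header PrivateBin uses for its JSON API.
async function fetchPrivateBinPayload(pageUrl) {
  const u = new URL(pageUrl);
  const res = await fetch(`${u.origin}${u.pathname}${u.search}`, {
    headers: { 'X-Requested-With': 'JSONHttpRequest' },
  });
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  return res.json(); // PrivateBin responds with { ct, adata, ... }
}

console.log(pasteKeyFromUrl('https://paste.fitgirl-repacks.site/?add4fc7178e6fa8f#SomeKeyValue'));
```

If the hash is missing, no decryption is possible, which is why the extractor returns an empty result for key-less paste URLs.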
+
+ Example file names:
+
+ - `urls.txt`
+ - `links.txt`
+ - `pasted-html.txt`
+
+ ## Output Files
+
+ With this command:
+
+ ```bash
+ ff-link-extract -i urls.txt -o all-direct-links
+ ```
+
+ You get:
+
+ - `links/YYYY-MM-DD_HH-mm-ss/all-direct-links.txt`: Direct links only, one per line.
+ - `links/YYYY-MM-DD_HH-mm-ss/all-direct-links.html`: Clickable links page for browser use.
+ - `links/YYYY-MM-DD_HH-mm-ss/all-direct-links.json`: Full success and error report.
 
- ## Output files
+ Each run creates a new timestamped folder under `links/`, so previous runs stay saved and are easy to compare.
 
- With `-o all-direct-links`, it creates:
+ ## Autorun Mode (Open Links At Intervals)
 
- - `all-direct-links.txt` - direct links only (one per line)
- - `all-direct-links.html` - clickable links page
- - `all-direct-links.json` - detailed report with success/failure per source URL
+ This package also includes a companion command:
 
- ## Options
+ - `ff-link-extract-autorun`
 
- - `-i, --input <file>`: input file path (default: `urls.txt`)
- - `-o, --output-prefix <name>`: output file prefix (default: `direct-links`)
- - `-h, --help`: show help
- - `-v, --version`: show version
+ It opens links from a text file in your default browser, one by one, at timed intervals.
 
- ## Local development
+ If the input file is not found in the current folder, autorun automatically falls back to the latest folder inside `links/`.
+
+ Default config file shipped with the package:
+
+ - `autorun.config.json`
+
+ Default config values:
+
+ - `inputFile`: all-direct-links.txt
+ - `intervalMs`: 20000
+ - `startDelayMs`: 3000
+ - `maxLinks`: 999999
+ - `dryRun`: false
+
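For reference, the defaults above written out as a complete config file (the same values as the `autorun.config.json` template the package ships):

```json
{
  "inputFile": "all-direct-links.txt",
  "intervalMs": 20000,
  "startDelayMs": 3000,
  "maxLinks": 999999,
  "dryRun": false
}
```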
+ Example:
+
+ ```bash
+ ff-link-extract-autorun -c autorun.config.json
+ ```
+
+ If you do not have a config file yet:
+
+ ```bash
+ ff-link-extract-init
+ ff-link-extract-autorun -c autorun.config.json
+ ```
 
- Run directly without installing globally:
+ Quick test mode without opening the browser:
 
  ```bash
- node ff_extract_links.js -i urls.txt -o all-direct-links
+ ff-link-extract-autorun -c autorun.config.json --dry-run
  ```
 
- Or test as package command locally:
+ Override config from the command line:
+
+ ```bash
+ ff-link-extract-autorun -i all-direct-links.txt --interval-ms 15000 --start-delay-ms 5000 --max-links 20
+ ```
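The resolution order for these settings can be sketched as follows (mirroring `asNonNegativeNumber` in the autorun script later in this diff; `effectiveNumber` is an illustrative name): a CLI flag wins over the config file, which wins over the built-in default, and invalid or negative values fall back.

```javascript
// CLI flag > config file > built-in default; reject NaN and negatives.
function effectiveNumber(cliValue, configValue, fallback) {
  const n = Number(cliValue != null ? cliValue : configValue);
  return Number.isFinite(n) && n >= 0 ? n : fallback;
}

const config = { intervalMs: 20000 };
console.log(effectiveNumber(15000, config.intervalMs, 20000)); // CLI flag set
console.log(effectiveNumber(null, config.intervalMs, 20000)); // falls back to config
console.log(effectiveNumber(null, undefined, 20000)); // falls back to default
```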
+
+ ## Command Options
+
+ - `-i, --input <file>`: Input file path. Default: `urls.txt`.
+ - `-o, --output-prefix <name>`: Output prefix. Default: `direct-links`.
+ - `-h, --help`: Show help.
+ - `-v, --version`: Show the installed version.
+
+ Autorun command options:
+
+ - `-c, --config <file>`: Config file path. Default: `autorun.config.json`.
+ - `-i, --input <file>`: Links file path override.
+ - `--interval-ms <number>`: Delay between opening each link.
+ - `--start-delay-ms <number>`: Delay before the first link opens.
+ - `--max-links <number>`: Open only the first N links.
+ - `--dry-run`: Print the schedule only; no browser windows open.
+
+ Init command options:
+
+ - `-f, --force`: Overwrite existing starter files.
+ - `-h, --help`: Show help.
+
+ ## Copy-Paste Examples
+
+ Use a custom input file and a custom output name:
+
+ ```bash
+ ff-link-extract -i my-links.txt -o game-links
+ ```
+
+ Use a paste source URL directly:
+
+ ```bash
+ echo "https://paste.fitgirl-repacks.site/?add4fc7178e6fa8f#831kvVZtdGSNdTncYhZ8e2vnks7wuCyYYXHYFeZaWhT3" > paste_source.txt
+ ff-link-extract -i paste_source.txt -o game-links
+ ```
+
+ Check the installed version:
+
+ ```bash
+ ff-link-extract --version
+ ```
+
+ Show help:
+
+ ```bash
+ ff-link-extract --help
+ ```
+
+ ## Updating
+
+ If using npx:
+
+ - No manual update needed; you get the newest published version by default.
+
+ If globally installed:
+
+ ```bash
+ npm i -g @hasindu---7/ff-link-extract@latest
+ ```
+
+ ## Troubleshooting
+
+ Problem: command not found after a global install on macOS or Linux.
+
+ Make sure npm's global bin directory is on your PATH, or use npx instead:
+
+ ```bash
+ npx -y @hasindu---7/ff-link-extract -i urls.txt -o all-direct-links
+ ```
+
+ Problem: extraction fails for some links.
+
+ - Check `all-direct-links.json` for per-link error details.
+ - Retry the command after a few minutes.
+ - Confirm the input actually contains valid fuckingfast page URLs.
+
+ Problem: no links found in the input.
+
+ - Make sure the file includes URLs starting with `https://fuckingfast.co/`.
+ - Remove unrelated text if needed and retry.
+
+ ## Local Development
+
+ Run the script directly:
+
+ ```bash
+ node bin/ff_extract_links.js -i urls.txt -o all-direct-links
+ ```
+
+ Test the package command locally:
 
  ```bash
  npm link
  ff-link-extract -i urls.txt -o all-direct-links
  ```
 
- ## Publish to npm
-
- 1. `npm login`
- 2. `npm publish --access public`
+ ## Package
 
- If the package name is already taken, change `name` in `package.json` first.
+ - Package: `@hasindu---7/ff-link-extract`
+ - Bundled files are organized under `bin/` and `templates/` for easier maintenance.
@@ -2,12 +2,19 @@
 
  const fs = require('fs');
  const path = require('path');
+ const { decryptPrivateBin } = require('privatebin-decrypt');
 
  function parseArgs(argv) {
-   const args = { input: 'urls.txt', outputPrefix: 'direct-links', help: false, version: false };
+   const args = {
+     input: 'urls.txt',
+     outputPrefix: 'direct-links',
+     help: false,
+     version: false,
+   };
 
    for (let i = 2; i < argv.length; i += 1) {
      const arg = argv[i];
+
      if (arg === '-h' || arg === '--help') {
        args.help = true;
        continue;
@@ -33,7 +40,7 @@ function parseArgs(argv) {
 
  function readPackageVersion() {
    try {
-     const packagePath = path.resolve(__dirname, 'package.json');
+     const packagePath = path.resolve(__dirname, '..', 'package.json');
      if (!fs.existsSync(packagePath)) {
        return 'dev';
      }
@@ -56,6 +63,8 @@ function printHelp() {
    console.log('  -o, --output-prefix <name>  Output prefix for .txt/.html/.json files. Default: direct-links');
    console.log('  -h, --help                  Show this help');
    console.log('  -v, --version               Show version');
+   console.log('');
+   console.log('Also supports paste page URLs (e.g. paste.fitgirl-repacks.site) and expands links automatically.');
  }
 
  function sanitizeLine(line) {
@@ -69,12 +78,32 @@ function normalizeExtractedUrl(raw) {
      .trim();
  }
 
- function extractSourceUrls(inputText) {
+ function dedupePreserveOrder(values) {
+   return [...new Set(values)];
+ }
+
+ function extractFuckingfastUrls(inputText) {
    const allMatches = inputText.match(/https?:\/\/fuckingfast\.co\/[^\s"'<>]+/gi) || [];
    const cleaned = allMatches.map(normalizeExtractedUrl).filter(Boolean);
+   return dedupePreserveOrder(cleaned);
+ }
 
-   // Preserve order while removing duplicates.
-   return [...new Set(cleaned)];
+ function extractHttpUrls(inputText) {
+   const allMatches = inputText.match(/https?:\/\/[^\s"'<>]+/gi) || [];
+   const cleaned = allMatches.map(normalizeExtractedUrl).filter(Boolean);
+   return dedupePreserveOrder(cleaned);
+ }
+
+ function isAggregatorPageUrl(rawUrl) {
+   try {
+     const u = new URL(rawUrl);
+     return (
+       u.hostname === 'paste.fitgirl-repacks.site'
+       || u.hostname.endsWith('.paste.fitgirl-repacks.site')
+     );
+   } catch {
+     return false;
+   }
  }
 
  function escapeHtml(str) {
@@ -99,14 +128,12 @@ function extractFilenameFromUrl(rawUrl) {
  }
 
  function extractDirectDownload(html) {
-   // Primary pattern used on the sample pages.
    const openPattern = /window\.open\(\s*["'](https:\/\/fuckingfast\.co\/dl\/[A-Za-z0-9_\-]+)["']\s*\)/i;
    const m1 = html.match(openPattern);
    if (m1) {
      return m1[1];
    }
 
-   // Fallback in case the download URL appears elsewhere in the source.
    const fallback = html.match(/https:\/\/fuckingfast\.co\/dl\/[A-Za-z0-9_\-]+/i);
    if (fallback) {
      return fallback[0];
@@ -115,6 +142,19 @@ function extractDirectDownload(html) {
    return null;
  }
 
+ function pad2(n) {
+   return String(n).padStart(2, '0');
+ }
+
+ function buildRunDirectory(cwd) {
+   const now = new Date();
+   const stamp = `${now.getFullYear()}-${pad2(now.getMonth() + 1)}-${pad2(now.getDate())}_${pad2(now.getHours())}-${pad2(now.getMinutes())}-${pad2(now.getSeconds())}`;
+   const linksRoot = path.resolve(cwd, 'links');
+   const runDir = path.resolve(linksRoot, stamp);
+   fs.mkdirSync(runDir, { recursive: true });
+   return runDir;
+ }
+
  async function fetchHtml(url) {
    const res = await fetch(url, {
      headers: {
@@ -131,6 +171,84 @@ async function fetchHtml(url) {
    return await res.text();
  }
 
+ async function fetchPrivateBinPayload(pageUrl) {
+   const u = new URL(pageUrl);
+   const apiUrl = `${u.origin}${u.pathname}${u.search}`;
+   const res = await fetch(apiUrl, {
+     headers: {
+       'X-Requested-With': 'JSONHttpRequest',
+       accept: 'application/json, text/javascript, */*; q=0.01',
+     },
+     redirect: 'follow',
+   });
+
+   if (!res.ok) {
+     throw new Error(`HTTP ${res.status}`);
+   }
+
+   return await res.json();
+ }
+
+ async function extractLinksFromPrivateBinPage(pageUrl) {
+   const u = new URL(pageUrl);
+   const key = (u.hash || '').replace(/^#/, '');
+
+   if (!key) {
+     return [];
+   }
+
+   const payload = await fetchPrivateBinPayload(pageUrl);
+   if (!payload || typeof payload.ct !== 'string' || !Array.isArray(payload.adata)) {
+     return [];
+   }
+
+   const decrypted = await decryptPrivateBin({
+     key,
+     data: payload.adata,
+     cipherMessage: payload.ct,
+   });
+
+   return extractFuckingfastUrls(decrypted || '');
+ }
+
+ async function resolveSourceUrls(inputText) {
+   const directUrls = extractFuckingfastUrls(inputText);
+   const allHttpUrls = extractHttpUrls(inputText);
+   const aggregatorUrls = allHttpUrls.filter(isAggregatorPageUrl);
+
+   if (aggregatorUrls.length === 0) {
+     return directUrls;
+   }
+
+   const collected = [...directUrls];
+   const seen = new Set(directUrls);
+
+   console.log(`Found ${aggregatorUrls.length} paste source URL(s); expanding links...`);
+
+   for (const pageUrl of aggregatorUrls) {
+     try {
+       const html = await fetchHtml(pageUrl);
+       let found = extractFuckingfastUrls(html);
+
+       // PrivateBin pages often require client-side decryption; fall back to the JSON API decrypt.
+       if (found.length === 0) {
+         found = await extractLinksFromPrivateBinPage(pageUrl);
+       }
+
+       for (const url of found) {
+         if (!seen.has(url)) {
+           seen.add(url);
+           collected.push(url);
+         }
+       }
+     } catch (err) {
+       console.warn(`Could not read paste source: ${pageUrl} (${err.message || String(err)})`);
+     }
+   }
+
+   return collected;
+ }
+
  async function main() {
    const { input, outputPrefix, help, version } = parseArgs(process.argv);
 
@@ -149,21 +267,20 @@ async function main() {
 
    if (!fs.existsSync(inputPath)) {
      console.error(`Input file not found: ${inputPath}`);
-     console.error('Create a text file with one fuckingfast page URL per line.');
+     console.error('Create a text file with one fuckingfast page URL per line, raw HTML, or a paste source URL.');
      process.exit(1);
    }
 
    const rawInput = fs.readFileSync(inputPath, 'utf8');
-   let urls = extractSourceUrls(rawInput);
+   let urls = await resolveSourceUrls(rawInput);
 
-   // Fallback for old format: one URL per line (with optional comments).
    if (urls.length === 0) {
      const rawLines = rawInput.split(/\r?\n/);
      urls = rawLines.map(sanitizeLine).filter((line) => line && !line.startsWith('#'));
    }
 
    if (urls.length === 0) {
-     console.error('No URLs found in input file.');
+     console.error('No source URLs found in input file.');
      process.exit(1);
    }
 
@@ -205,9 +322,10 @@ async function main() {
      }
    }
 
-   const txtOut = path.resolve(cwd, `${outputPrefix}.txt`);
-   const htmlOut = path.resolve(cwd, `${outputPrefix}.html`);
-   const jsonOut = path.resolve(cwd, `${outputPrefix}.json`);
+   const runDir = buildRunDirectory(cwd);
+   const txtOut = path.resolve(runDir, `${outputPrefix}.txt`);
+   const htmlOut = path.resolve(runDir, `${outputPrefix}.html`);
+   const jsonOut = path.resolve(runDir, `${outputPrefix}.json`);
 
    fs.writeFileSync(txtOut, txtLines.join('\n') + (txtLines.length ? '\n' : ''), 'utf8');
 
@@ -290,6 +408,7 @@ async function main() {
 
    console.log('');
    console.log(`Done. Success: ${okCount}, Failed: ${failCount}`);
+   console.log(`Run folder: ${runDir}`);
    console.log(`Text links: ${txtOut}`);
    console.log(`Clickable HTML: ${htmlOut}`);
    console.log(`Full report: ${jsonOut}`);
@@ -0,0 +1,226 @@
+ #!/usr/bin/env node
+
+ const fs = require('fs');
+ const path = require('path');
+ const { spawn } = require('child_process');
+
+ function readJsonFile(filePath) {
+   try {
+     if (!fs.existsSync(filePath)) {
+       return null;
+     }
+     return JSON.parse(fs.readFileSync(filePath, 'utf8'));
+   } catch (err) {
+     throw new Error(`Invalid JSON in ${filePath}: ${err.message || String(err)}`);
+   }
+ }
+
+ function parseArgs(argv) {
+   const args = {
+     config: 'autorun.config.json',
+     input: null,
+     intervalMs: null,
+     startDelayMs: null,
+     maxLinks: null,
+     dryRun: false,
+     help: false,
+   };
+
+   for (let i = 2; i < argv.length; i += 1) {
+     const arg = argv[i];
+
+     if (arg === '-h' || arg === '--help') {
+       args.help = true;
+       continue;
+     }
+     if (arg === '--dry-run') {
+       args.dryRun = true;
+       continue;
+     }
+     if ((arg === '-c' || arg === '--config') && argv[i + 1]) {
+       args.config = argv[i + 1];
+       i += 1;
+       continue;
+     }
+     if ((arg === '-i' || arg === '--input') && argv[i + 1]) {
+       args.input = argv[i + 1];
+       i += 1;
+       continue;
+     }
+     if (arg === '--interval-ms' && argv[i + 1]) {
+       args.intervalMs = Number(argv[i + 1]);
+       i += 1;
+       continue;
+     }
+     if (arg === '--start-delay-ms' && argv[i + 1]) {
+       args.startDelayMs = Number(argv[i + 1]);
+       i += 1;
+       continue;
+     }
+     if (arg === '--max-links' && argv[i + 1]) {
+       args.maxLinks = Number(argv[i + 1]);
+       i += 1;
+       continue;
+     }
+   }
+
+   return args;
+ }
+
+ function printHelp() {
+   console.log('ff-link-extract-autorun');
+   console.log('Open links from a text file at configurable intervals.');
+   console.log('');
+   console.log('Usage:');
+   console.log('  ff-link-extract-autorun -c autorun.config.json');
+   console.log('  ff-link-extract-autorun -i all-direct-links.txt --interval-ms 20000');
+   console.log('');
+   console.log('Options:');
+   console.log('  -c, --config <file>     Config file path (default: autorun.config.json)');
+   console.log('  -i, --input <file>      Links file path override');
+   console.log('  --interval-ms <number>  Interval between openings');
+   console.log('  --start-delay-ms <num>  Delay before first open');
+   console.log('  --max-links <number>    Open only first N links');
+   console.log('  --dry-run               Print schedule, do not open browser');
+   console.log('  -h, --help              Show this help');
+ }
+
+ function normalizeUrl(raw) {
+   return String(raw || '').trim();
+ }
+
+ function loadLinks(filePath) {
+   if (!fs.existsSync(filePath)) {
+     throw new Error(`Links file not found: ${filePath}`);
+   }
+
+   const lines = fs.readFileSync(filePath, 'utf8').split(/\r?\n/);
+   const links = lines
+     .map(normalizeUrl)
+     .filter((line) => line && !line.startsWith('#') && /^https?:\/\//i.test(line));
+
+   return [...new Set(links)];
+ }
+
+ function findLatestTimestampDir(baseDir) {
+   if (!fs.existsSync(baseDir)) {
+     return null;
+   }
+
+   const dirs = fs.readdirSync(baseDir, { withFileTypes: true })
+     .filter((d) => d.isDirectory())
+     .map((d) => d.name)
+     .sort();
+
+   if (dirs.length === 0) {
+     return null;
+   }
+
+   return path.resolve(baseDir, dirs[dirs.length - 1]);
+ }
+
+ function resolveLinksFilePath(cwd, inputFile) {
+   const explicitPath = path.resolve(cwd, inputFile);
+   if (fs.existsSync(explicitPath)) {
+     return explicitPath;
+   }
+
+   const latestRunDir = findLatestTimestampDir(path.resolve(cwd, 'links'));
+   if (!latestRunDir) {
+     return explicitPath;
+   }
+
+   const latestPath = path.resolve(latestRunDir, path.basename(inputFile));
+   if (fs.existsSync(latestPath)) {
+     return latestPath;
+   }
+
+   return explicitPath;
+ }
+
+ function openInBrowser(url) {
+   if (process.platform === 'win32') {
+     // start requires a window title argument before URL.
+     spawn('cmd', ['/c', 'start', '', url], { detached: true, stdio: 'ignore' }).unref();
+     return;
+   }
+
+   if (process.platform === 'darwin') {
+     spawn('open', [url], { detached: true, stdio: 'ignore' }).unref();
+     return;
+   }
+
+   spawn('xdg-open', [url], { detached: true, stdio: 'ignore' }).unref();
+ }
+
+ function asNonNegativeNumber(value, fallback) {
+   const n = Number(value);
+   if (!Number.isFinite(n) || n < 0) {
+     return fallback;
+   }
+   return n;
+ }
+
+ async function main() {
+   const args = parseArgs(process.argv);
+
+   if (args.help) {
+     printHelp();
+     return;
+   }
+
+   const cwd = process.cwd();
+   const configPath = path.resolve(cwd, args.config);
+   const config = readJsonFile(configPath) || {};
+
+   const inputFile = args.input || config.inputFile || 'all-direct-links.txt';
+   const linksPath = resolveLinksFilePath(cwd, inputFile);
+
+   const intervalMs = asNonNegativeNumber(
+     args.intervalMs != null ? args.intervalMs : config.intervalMs,
+     20000
+   );
+   const startDelayMs = asNonNegativeNumber(
+     args.startDelayMs != null ? args.startDelayMs : config.startDelayMs,
+     3000
+   );
+   const maxLinks = asNonNegativeNumber(
+     args.maxLinks != null ? args.maxLinks : config.maxLinks,
+     Number.MAX_SAFE_INTEGER
+   );
+   const dryRun = Boolean(args.dryRun || config.dryRun);
+
+   const links = loadLinks(linksPath).slice(0, maxLinks);
+   if (links.length === 0) {
+     console.error('No valid URLs found in links file.');
+     process.exit(1);
+   }
+
+   console.log(`Loaded ${links.length} link(s) from ${linksPath}`);
+   console.log(`Start delay: ${startDelayMs}ms | Interval: ${intervalMs}ms | Dry run: ${dryRun}`);
+
+   await new Promise((resolve) => setTimeout(resolve, startDelayMs));
+
+   for (let i = 0; i < links.length; i += 1) {
+     const url = links[i];
+     const index = i + 1;
+
+     if (dryRun) {
+       console.log(`[${index}/${links.length}] Would open: ${url}`);
+     } else {
+       console.log(`[${index}/${links.length}] Opening: ${url}`);
+       openInBrowser(url);
+     }
+
+     if (i < links.length - 1) {
+       await new Promise((resolve) => setTimeout(resolve, intervalMs));
+     }
+   }
+
+   console.log('Autorun finished.');
+ }
+
+ main().catch((err) => {
+   console.error(err.message || String(err));
+   process.exit(1);
+ });
@@ -0,0 +1,82 @@
+ #!/usr/bin/env node
+
+ const fs = require('fs');
+ const path = require('path');
+
+ function parseArgs(argv) {
+   return {
+     force: argv.includes('--force') || argv.includes('-f'),
+     help: argv.includes('--help') || argv.includes('-h'),
+   };
+ }
+
+ function printHelp() {
+   console.log('ff-link-extract-init');
+   console.log('Copy starter files into the current folder.');
+   console.log('');
+   console.log('Usage:');
+   console.log('  ff-link-extract-init');
+   console.log('  ff-link-extract-init --force');
+   console.log('');
+   console.log('Options:');
+   console.log('  -f, --force  Overwrite existing files');
+   console.log('  -h, --help   Show this help');
+ }
+
+ function copyTemplate(templatePath, outPath, force) {
+   if (!fs.existsSync(templatePath)) {
+     throw new Error(`Template not found: ${templatePath}`);
+   }
+
+   if (fs.existsSync(outPath) && !force) {
+     return { status: 'skip', path: outPath };
+   }
+
+   fs.copyFileSync(templatePath, outPath);
+   return { status: fs.existsSync(outPath) ? 'write' : 'skip', path: outPath };
+ }
+
+ function main() {
+   const args = parseArgs(process.argv);
+
+   if (args.help) {
+     printHelp();
+     return;
+   }
+
+   const packageRoot = path.resolve(__dirname, '..');
+   const templatesDir = path.resolve(packageRoot, 'templates');
+   const cwd = process.cwd();
+
+   const targets = [
+     { from: path.resolve(templatesDir, 'urls.txt'), to: path.resolve(cwd, 'urls.txt') },
+     { from: path.resolve(templatesDir, 'autorun.config.json'), to: path.resolve(cwd, 'autorun.config.json') },
+   ];
+
+   let wrote = 0;
+   let skipped = 0;
+
+   for (const item of targets) {
+     const existedBefore = fs.existsSync(item.to);
+     copyTemplate(item.from, item.to, args.force);
+
+     if (existedBefore && !args.force) {
+       skipped += 1;
+       console.log(`Skipped existing: ${item.to}`);
+     } else {
+       wrote += 1;
+       console.log(`Created: ${item.to}`);
+     }
+   }
+
+   console.log('');
+   console.log(`Done. Created: ${wrote}, Skipped: ${skipped}`);
+   console.log('Next: add your source links to urls.txt, then run ff-link-extract.');
+ }
+
+ try {
+   main();
+ } catch (err) {
+   console.error(err.message || String(err));
+   process.exit(1);
+ }
package/package.json CHANGED
@@ -1,15 +1,21 @@
  {
    "name": "@hasindu---7/ff-link-extract",
-   "version": "1.0.1",
+   "version": "2.0.1",
    "description": "Extract direct fuckingfast download links from page URLs",
    "bin": {
-     "ff-link-extract": "ff_extract_links.js"
+     "ff-link-extract": "bin/ff_extract_links.js",
+     "ff-link-extract-autorun": "bin/ff_extract_links_autorun.js",
+     "ff-link-extract-init": "bin/ff_extract_links_init.js"
    },
    "type": "commonjs",
    "files": [
-     "ff_extract_links.js",
+     "bin/ff_extract_links.js",
+     "bin/ff_extract_links_autorun.js",
+     "bin/ff_extract_links_init.js",
+     "templates/autorun.config.json",
+     "templates/urls.txt",
      "README.md",
-     "urls.txt"
+     "package.json"
    ],
    "keywords": [
      "fuckingfast",
@@ -20,5 +26,8 @@
    "license": "MIT",
    "engines": {
      "node": ">=18"
+   },
+   "dependencies": {
+     "privatebin-decrypt": "^1.0.3"
    }
  }
@@ -0,0 +1,7 @@
+ {
+   "inputFile": "all-direct-links.txt",
+   "intervalMs": 20000,
+   "startDelayMs": 3000,
+   "maxLinks": 999999,
+   "dryRun": false
+ }