firecrawl-cli 0.0.6 → 1.0.1-beta.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -4,24 +4,444 @@ Command-line interface for Firecrawl. Scrape, crawl, and extract data from any w
 
 ## Installation
 
- Install Firecrawl CLI globally using npm:
-
 ```bash
 npm install -g firecrawl-cli
 ```
 
 ## Quick Start
 
- Set your API key:
+ Just run a command - the CLI will prompt you to authenticate if needed:
+
+ ```bash
+ firecrawl https://example.com
+ ```
+
+ ## Authentication
+
+ On first run, you'll be prompted to authenticate:
+
+ ```
+ šŸ”„ firecrawl cli
+ Turn websites into LLM-ready data
+
+ Welcome! To get started, authenticate with your Firecrawl account.
+
+ 1. Login with browser (recommended)
+ 2. Enter API key manually
+
+ Tip: You can also set FIRECRAWL_API_KEY environment variable
+
+ Enter choice [1/2]:
+ ```
+
+ ### Authentication Methods
+
+ ```bash
+ # Interactive (prompts automatically when needed)
+ firecrawl
+
+ # Browser login
+ firecrawl login
+
+ # Direct API key
+ firecrawl login --api-key fc-your-api-key
+
+ # Environment variable
+ export FIRECRAWL_API_KEY=fc-your-api-key
+
+ # Per-command API key
+ firecrawl scrape https://example.com --api-key fc-your-api-key
+ ```
+
+ ---
+
+ ## Commands
+
+ ### `scrape` - Scrape a single URL
+
+ Extract content from any webpage in various formats.
+
+ ```bash
+ # Basic usage (outputs markdown)
+ firecrawl https://example.com
+ firecrawl scrape https://example.com
+
+ # Get raw HTML
+ firecrawl https://example.com --html
+ firecrawl https://example.com -H
+
+ # Multiple formats (outputs JSON)
+ firecrawl https://example.com --format markdown,links,images
+
+ # Save to file
+ firecrawl https://example.com -o output.md
+ firecrawl https://example.com --format json -o data.json --pretty
+ ```
+
+ #### Scrape Options
+
+ | Option | Description |
+ | ------------------------ | ------------------------------------------------------- |
+ | `-f, --format <formats>` | Output format(s), comma-separated |
+ | `-H, --html` | Shortcut for `--format html` |
+ | `--only-main-content` | Extract only main content (removes navs, footers, etc.) |
+ | `--wait-for <ms>` | Wait time before scraping (for JS-rendered content) |
+ | `--screenshot` | Take a screenshot |
+ | `--include-tags <tags>` | Only include specific HTML tags |
+ | `--exclude-tags <tags>` | Exclude specific HTML tags |
+ | `-o, --output <path>` | Save output to file |
+ | `--pretty` | Pretty print JSON output |
+ | `--timing` | Show request timing info |
+
+ #### Available Formats
+
+ | Format | Description |
+ | ------------ | -------------------------- |
+ | `markdown` | Clean markdown (default) |
+ | `html` | Cleaned HTML |
+ | `rawHtml` | Original HTML |
+ | `links` | All links on the page |
+ | `screenshot` | Screenshot as base64 |
+ | `json` | Structured JSON extraction |
+
+ #### Examples
+
+ ```bash
+ # Extract only main content as markdown
+ firecrawl https://blog.example.com --only-main-content
+
+ # Wait for JS to render, then scrape
+ firecrawl https://spa-app.com --wait-for 3000
+
+ # Get all links from a page
+ firecrawl https://example.com --format links
+
+ # Screenshot + markdown
+ firecrawl https://example.com --format markdown --screenshot
+
+ # Extract specific elements only
+ firecrawl https://example.com --include-tags article,main
+
+ # Exclude navigation and ads
+ firecrawl https://example.com --exclude-tags nav,aside,.ad
+ ```
+
+ ---
+
+ ### `crawl` - Crawl an entire website
+
+ Crawl multiple pages from a website.
+
+ ```bash
+ # Start a crawl (returns job ID)
+ firecrawl crawl https://example.com
+
+ # Wait for crawl to complete
+ firecrawl crawl https://example.com --wait
+
+ # With progress indicator
+ firecrawl crawl https://example.com --wait --progress
+
+ # Check crawl status
+ firecrawl crawl <job-id>
+
+ # Limit pages
+ firecrawl crawl https://example.com --limit 100 --max-depth 3
+ ```
+
+ #### Crawl Options
+
+ | Option | Description |
+ | --------------------------- | ---------------------------------------- |
+ | `--wait` | Wait for crawl to complete |
+ | `--progress` | Show progress while waiting |
+ | `--limit <n>` | Maximum pages to crawl |
+ | `--max-depth <n>` | Maximum crawl depth |
+ | `--include-paths <paths>` | Only crawl matching paths |
+ | `--exclude-paths <paths>` | Skip matching paths |
+ | `--sitemap <mode>` | `include`, `skip`, or `only` |
+ | `--allow-subdomains` | Include subdomains |
+ | `--allow-external-links` | Follow external links |
+ | `--crawl-entire-domain` | Crawl entire domain |
+ | `--ignore-query-parameters` | Treat URLs that differ only in query params as the same |
+ | `--delay <ms>` | Delay between requests |
+ | `--max-concurrency <n>` | Max concurrent requests |
+ | `--timeout <seconds>` | Timeout when waiting |
+ | `--poll-interval <seconds>` | Status check interval |
+
+ #### Examples
+
+ ```bash
+ # Crawl blog section only
+ firecrawl crawl https://example.com --include-paths /blog,/posts
+
+ # Exclude admin pages
+ firecrawl crawl https://example.com --exclude-paths /admin,/login
+
+ # Crawl with rate limiting
+ firecrawl crawl https://example.com --delay 1000 --max-concurrency 2
+
+ # Deep crawl with high limit
+ firecrawl crawl https://example.com --limit 1000 --max-depth 10 --wait --progress
+
+ # Save results
+ firecrawl crawl https://example.com --wait -o crawl-results.json --pretty
+ ```
+
+ ---
+
+ ### `map` - Discover all URLs on a website
+
+ Quickly discover all URLs on a website without scraping content.
+
+ ```bash
+ # List all URLs (one per line)
+ firecrawl map https://example.com
+
+ # Output as JSON
+ firecrawl map https://example.com --json
+
+ # Search for specific URLs
+ firecrawl map https://example.com --search "blog"
+
+ # Limit results
+ firecrawl map https://example.com --limit 500
+ ```
+
+ #### Map Options
+
+ | Option | Description |
+ | --------------------------- | --------------------------------- |
+ | `--limit <n>` | Maximum URLs to discover |
+ | `--search <query>` | Filter URLs by search query |
+ | `--sitemap <mode>` | `include`, `skip`, or `only` |
+ | `--include-subdomains` | Include subdomains |
+ | `--ignore-query-parameters` | Dedupe URLs with different params |
+ | `--timeout <seconds>` | Request timeout |
+ | `--json` | Output as JSON |
+ | `-o, --output <path>` | Save to file |
+
+ #### Examples
+
+ ```bash
+ # Find all product pages
+ firecrawl map https://shop.example.com --search "product"
+
+ # Get sitemap URLs only
+ firecrawl map https://example.com --sitemap only
+
+ # Save URL list to file
+ firecrawl map https://example.com -o urls.txt
+
+ # Include subdomains
+ firecrawl map https://example.com --include-subdomains --limit 1000
+ ```
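Because `map` prints one URL per line by default, its output composes directly with standard Unix text tools. A minimal sketch of that workflow, using made-up sample data in place of a real `firecrawl map ... -o urls.txt` run:

```shell
# Simulate a saved map run with sample URLs (illustration only)
printf '%s\n' \
  'https://example.com/' \
  'https://example.com/blog/post-1' \
  'https://example.com/blog/post-2' \
  'https://example.com/about' > urls.txt

# Keep only blog URLs, then count them
grep '/blog/' urls.txt
grep -c '/blog/' urls.txt
```

The same pattern works with `sort`, `uniq`, `xargs`, or any other line-oriented filter.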
+
+ ---
+
+ ### `search` - Search the web
+
+ Search the web and optionally scrape content from search results.
+
+ ```bash
+ # Basic search
+ firecrawl search "firecrawl web scraping"
+
+ # Limit results
+ firecrawl search "AI news" --limit 10
+
+ # Search news sources
+ firecrawl search "tech startups" --sources news
+
+ # Search images
+ firecrawl search "landscape photography" --sources images
+
+ # Multiple sources
+ firecrawl search "machine learning" --sources web,news,images
+
+ # Filter by category (GitHub, research papers, PDFs)
+ firecrawl search "web scraping python" --categories github
+ firecrawl search "transformer architecture" --categories research
+ firecrawl search "machine learning" --categories github,research
+
+ # Time-based search
+ firecrawl search "AI announcements" --tbs qdr:d   # Past day
+ firecrawl search "tech news" --tbs qdr:w          # Past week
+
+ # Location-based search
+ firecrawl search "restaurants" --location "San Francisco,California,United States"
+ firecrawl search "local news" --country DE
+
+ # Search and scrape results
+ firecrawl search "firecrawl tutorials" --scrape
+ firecrawl search "API documentation" --scrape --scrape-formats markdown,links
+
+ # Output as pretty JSON
+ firecrawl search "web scraping" -p
+ ```
+
+ #### Search Options
+
+ | Option | Description |
+ | ---------------------------- | ------------------------------------------------------------------------------------------- |
+ | `--limit <n>` | Maximum results (default: 5, max: 100) |
+ | `--sources <sources>` | Comma-separated: `web`, `images`, `news` (default: web) |
+ | `--categories <categories>` | Comma-separated: `github`, `research`, `pdf` |
+ | `--tbs <value>` | Time filter: `qdr:h` (hour), `qdr:d` (day), `qdr:w` (week), `qdr:m` (month), `qdr:y` (year) |
+ | `--location <location>` | Geo-targeting (e.g., "Germany", "San Francisco,California,United States") |
+ | `--country <code>` | ISO country code (default: US) |
+ | `--timeout <ms>` | Timeout in milliseconds (default: 60000) |
+ | `--ignore-invalid-urls` | Exclude URLs that are invalid for other Firecrawl endpoints |
+ | `--scrape` | Enable scraping of search results |
+ | `--scrape-formats <formats>` | Scrape formats when `--scrape` is enabled (default: markdown) |
+ | `--only-main-content` | Include only main content when scraping (default: true) |
+ | `-p, --pretty` | Output as pretty JSON (default is human-readable text) |
+ | `-o, --output <path>` | Save to file |
+ | `--json` | Output as compact JSON (use `-p` for pretty JSON) |
+
+ #### Examples
+
+ ```bash
+ # Research a topic with recent results
+ firecrawl search "React Server Components" --tbs qdr:m --limit 10
+
+ # Find GitHub repositories
+ firecrawl search "web scraping library" --categories github --limit 20
+
+ # Search and get full content
+ firecrawl search "firecrawl documentation" --scrape --scrape-formats markdown -p -o results.json
+
+ # Find research papers
+ firecrawl search "large language models" --categories research -p
+
+ # Search with location targeting
+ firecrawl search "best coffee shops" --location "Berlin,Germany" --country DE
+
+ # Get news from the past week
+ firecrawl search "AI startups funding" --sources news --tbs qdr:w --limit 15
+ ```
+
+ ---
+
+ ### `credit-usage` - Check your credits
+
+ ```bash
+ # Show credit usage
+ firecrawl credit-usage
+
+ # Output as JSON
+ firecrawl credit-usage --json --pretty
+ ```
+
+ ---
+
+ ### `config` - View configuration
 
 ```bash
 firecrawl config
 ```
 
- Scrape a URL:
+ Shows authentication status and stored credentials location.
+
+ ---
+
+ ### `login` / `logout`
+
+ ```bash
+ # Login
+ firecrawl login
+ firecrawl login --method browser
+ firecrawl login --method manual
+ firecrawl login --api-key fc-xxx
+
+ # Logout
+ firecrawl logout
+ ```
+
+ ---
+
+ ## Global Options
+
+ These options work with any command:
+
+ | Option | Description |
+ | --------------------- | -------------------- |
+ | `-k, --api-key <key>` | Use specific API key |
+ | `-V, --version` | Show version |
+ | `-h, --help` | Show help |
+
+ ---
+
+ ## Output Handling
+
+ ### Stdout vs File
+
+ ```bash
+ # Output to stdout (default)
+ firecrawl https://example.com
+
+ # Pipe to another command
+ firecrawl https://example.com | head -50
+
+ # Save to file
+ firecrawl https://example.com -o output.md
+
+ # JSON output
+ firecrawl https://example.com --format links --pretty
+ ```
+
+ ### Format Behavior
+
+ - **Single format**: Outputs raw content (markdown text, HTML, etc.)
+ - **Multiple formats**: Outputs JSON with all requested data
 
 ```bash
- firecrawl https://firecrawl.dev
+ # Raw markdown output
+ firecrawl https://example.com --format markdown
+
+ # JSON output with multiple formats
+ firecrawl https://example.com --format markdown,links,images
 ```
 
- For detailed usage instructions, examples, and all available commands, visit the [CLI documentation](https://docs.firecrawl.dev/cli).
+ ---
+
+ ## Tips & Tricks
+
+ ### Scrape multiple URLs
+
+ ```bash
+ # Using a loop
+ for url in https://example.com/page1 https://example.com/page2; do
+   firecrawl "$url" -o "$(echo "$url" | sed 's/[^a-zA-Z0-9]/_/g').md"
+ done
+
+ # From a file
+ cat urls.txt | xargs -I {} firecrawl {} -o {}.md
+ ```
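The `sed` expression in the loop above builds a safe filename by replacing every character that is not a letter or digit with an underscore. Isolated with a sample URL, so the transform can be seen on its own:

```shell
# URL-to-filename sanitization used in the loop above
url='https://example.com/page1'
fname="$(echo "$url" | sed 's/[^a-zA-Z0-9]/_/g').md"
echo "$fname"   # https___example_com_page1.md
```

Note that distinct URLs can collide after sanitization (e.g. `/a/b` and `/a_b`), so for large batches a counter or hash suffix is safer.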
+
+ ### Combine with other tools
+
+ ```bash
+ # Extract links and process with jq
+ firecrawl https://example.com --format links | jq '.links[].url'
+
+ # Convert to PDF (with pandoc)
+ firecrawl https://example.com | pandoc -o document.pdf
+
+ # Search within scraped content
+ firecrawl https://example.com | grep -i "keyword"
+ ```
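If `jq` is not available, a rough equivalent of the link extraction can be done with `grep -o`. The JSON below is sample data standing in for what the `links` format is assumed to return; the real shape may differ, so treat this as a sketch of the pipeline rather than the CLI's actual output:

```shell
# Hypothetical sample of `--format links` JSON output (shape assumed for illustration)
json='{"links":[{"url":"https://a.example/"},{"url":"https://b.example/docs"}]}'

# Pull out every https URL, one per line
printf '%s\n' "$json" | grep -o 'https://[^"]*'
```

`grep -o` prints each match on its own line, which is enough for quick inspection; for anything structural, prefer `jq`.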
+
+ ### CI/CD Usage
+
+ ```bash
+ # Set API key via environment (GitHub Actions secret syntax shown)
+ export FIRECRAWL_API_KEY=${{ secrets.FIRECRAWL_API_KEY }}
+ firecrawl crawl https://docs.example.com --wait -o docs.json
+ ```
+
+ ---
+
+ ## Documentation
+
+ For more details, visit the [Firecrawl Documentation](https://docs.firecrawl.dev).
@@ -1,10 +1,19 @@
 /**
  * Config command implementation
- * Manages stored credentials and configuration
+ * Handles configuration and authentication
  */
+ export interface ConfigureOptions {
+     apiKey?: string;
+     apiUrl?: string;
+     webUrl?: string;
+     method?: 'browser' | 'manual';
+ }
 /**
- * Interactive configuration setup
- * Asks for API URL and API key
+ * Configure/login - triggers login flow when not authenticated
  */
- export declare function configure(): Promise<void>;
+ export declare function configure(options?: ConfigureOptions): Promise<void>;
+ /**
+  * View current configuration (read-only)
+  */
+ export declare function viewConfig(): Promise<void>;
 //# sourceMappingURL=config.d.ts.map

@@ -1 +1 @@
- {"version":3,"file":"config.d.ts","sourceRoot":"","sources":["../../src/commands/config.ts"],"names":[],"mappings":"AAAA;;;GAGG;AA6BH;;;GAGG;AACH,wBAAsB,SAAS,IAAI,OAAO,CAAC,IAAI,CAAC,CA8C/C"}
+ {"version":3,"file":"config.d.ts","sourceRoot":"","sources":["../../src/commands/config.ts"],"names":[],"mappings":"AAAA;;;GAGG;AAMH,MAAM,WAAW,gBAAgB;IAC/B,MAAM,CAAC,EAAE,MAAM,CAAC;IAChB,MAAM,CAAC,EAAE,MAAM,CAAC;IAChB,MAAM,CAAC,EAAE,MAAM,CAAC;IAChB,MAAM,CAAC,EAAE,SAAS,GAAG,QAAQ,CAAC;CAC/B;AAED;;GAEG;AACH,wBAAsB,SAAS,CAAC,OAAO,GAAE,gBAAqB,GAAG,OAAO,CAAC,IAAI,CAAC,CAmB7E;AAED;;GAEG;AACH,wBAAsB,UAAU,IAAI,OAAO,CAAC,IAAI,CAAC,CA0BhD"}
@@ -1,7 +1,7 @@
 "use strict";
 /**
  * Config command implementation
- * Manages stored credentials and configuration
+ * Handles configuration and authentication
  */
 var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
     if (k2 === undefined) k2 = k;
@@ -38,67 +38,56 @@ var __importStar = (this && this.__importStar) || (function () {
 })();
 Object.defineProperty(exports, "__esModule", { value: true });
 exports.configure = configure;
- const readline = __importStar(require("readline"));
+ exports.viewConfig = viewConfig;
 const credentials_1 = require("../utils/credentials");
 const config_1 = require("../utils/config");
- const DEFAULT_API_URL = 'https://api.firecrawl.dev';
+ const auth_1 = require("../utils/auth");
 /**
- * Prompt for input (for secure API key entry)
+ * Configure/login - triggers login flow when not authenticated
  */
- function promptInput(question, defaultValue) {
-     const rl = readline.createInterface({
-         input: process.stdin,
-         output: process.stdout,
-     });
-     const promptText = defaultValue
-         ? `${question} [${defaultValue}]: `
-         : `${question} `;
-     return new Promise((resolve) => {
-         rl.question(promptText, (answer) => {
-             rl.close();
-             resolve(answer.trim() || defaultValue || '');
+ async function configure(options = {}) {
+     // If not authenticated, trigger the login flow
+     if (!(0, auth_1.isAuthenticated)() || options.apiKey || options.method) {
+         // Import handleLoginCommand to avoid circular dependency
+         const { handleLoginCommand } = await Promise.resolve().then(() => __importStar(require('./login')));
+         await handleLoginCommand({
+             apiKey: options.apiKey,
+             apiUrl: options.apiUrl,
+             webUrl: options.webUrl,
+             method: options.method,
         });
-     });
+         return;
+     }
+     // Already authenticated - show config and offer to re-authenticate
+     await viewConfig();
+     console.log('To re-authenticate, run: firecrawl logout && firecrawl config\n');
 }
 /**
- * Interactive configuration setup
- * Asks for API URL and API key
+ * View current configuration (read-only)
  */
- async function configure() {
-     console.log('Firecrawl Configuration Setup\n');
-     // Prompt for API URL with default
-     let url = await promptInput('Enter API URL', DEFAULT_API_URL);
-     // Ensure URL doesn't end with trailing slash
-     url = url.replace(/\/$/, '');
-     // Prompt for API key
-     const key = await promptInput('Enter your Firecrawl API key: ');
-     if (!key || key.trim().length === 0) {
-         console.error('Error: API key cannot be empty');
-         process.exit(1);
-     }
-     if (!url || url.trim().length === 0) {
-         console.error('Error: API URL cannot be empty');
-         process.exit(1);
-     }
-     // Normalize URL (remove trailing slash)
-     const normalizedUrl = url.trim().replace(/\/$/, '');
-     try {
-         (0, credentials_1.saveCredentials)({
-             apiKey: key.trim(),
-             apiUrl: normalizedUrl,
-         });
-         console.log('\nāœ“ Configuration saved successfully');
-         console.log(`  API URL: ${normalizedUrl}`);
-         console.log(`  Stored in: ${(0, credentials_1.getConfigDirectoryPath)()}`);
-         // Update global config
-         (0, config_1.updateConfig)({
-             apiKey: key.trim(),
-             apiUrl: normalizedUrl,
-         });
+ async function viewConfig() {
+     const credentials = (0, credentials_1.loadCredentials)();
+     const config = (0, config_1.getConfig)();
+     console.log('\nā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”');
+     console.log('│         Firecrawl Configuration         │');
+     console.log('ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜\n');
+     if ((0, auth_1.isAuthenticated)()) {
+         const maskedKey = credentials?.apiKey
+             ? `${credentials.apiKey.substring(0, 6)}...${credentials.apiKey.slice(-4)}`
+             : 'Not set';
+         console.log('Status: āœ“ Authenticated\n');
+         console.log(`API Key: ${maskedKey}`);
+         console.log(`API URL: ${config.apiUrl || 'https://api.firecrawl.dev'}`);
+         console.log(`Config:  ${(0, credentials_1.getConfigDirectoryPath)()}`);
+         console.log('\nCommands:');
+         console.log('  firecrawl logout    Clear credentials');
+         console.log('  firecrawl config    Re-authenticate');
     }
- catch (error) {
-     console.error('Error saving configuration:', error instanceof Error ? error.message : 'Unknown error');
-     process.exit(1);
+     else {
+         console.log('Status: Not authenticated\n');
+         console.log('Run any command to start authentication, or use:');
+         console.log('  firecrawl config    Authenticate with browser or API key');
     }
+     console.log('');
 }
 //# sourceMappingURL=config.js.map
@@ -1 +1 @@
- {"version":3,"file":"config.js","sourceRoot":"","sources":["../../src/commands/config.ts"],"names":[],"mappings":";AAAA;;;GAGG;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAiCH,8BA8CC;AA7ED,mDAAqC;AACrC,sDAA+E;AAC/E,4CAA+C;AAE/C,MAAM,eAAe,GAAG,2BAA2B,CAAC;AAEpD;;GAEG;AACH,SAAS,WAAW,CAAC,QAAgB,EAAE,YAAqB;IAC1D,MAAM,EAAE,GAAG,QAAQ,CAAC,eAAe,CAAC;QAClC,KAAK,EAAE,OAAO,CAAC,KAAK;QACpB,MAAM,EAAE,OAAO,CAAC,MAAM;KACvB,CAAC,CAAC;IAEH,MAAM,UAAU,GAAG,YAAY;QAC7B,CAAC,CAAC,GAAG,QAAQ,KAAK,YAAY,KAAK;QACnC,CAAC,CAAC,GAAG,QAAQ,GAAG,CAAC;IAEnB,OAAO,IAAI,OAAO,CAAC,CAAC,OAAO,EAAE,EAAE;QAC7B,EAAE,CAAC,QAAQ,CAAC,UAAU,EAAE,CAAC,MAAM,EAAE,EAAE;YACjC,EAAE,CAAC,KAAK,EAAE,CAAC;YACX,OAAO,CAAC,MAAM,CAAC,IAAI,EAAE,IAAI,YAAY,IAAI,EAAE,CAAC,CAAC;QAC/C,CAAC,CAAC,CAAC;IACL,CAAC,CAAC,CAAC;AACL,CAAC;AAED;;;GAGG;AACI,KAAK,UAAU,SAAS;IAC7B,OAAO,CAAC,GAAG,CAAC,iCAAiC,CAAC,CAAC;IAE/C,kCAAkC;IAClC,IAAI,GAAG,GAAG,MAAM,WAAW,CAAC,eAAe,EAAE,eAAe,CAAC,CAAC;IAE9D,6CAA6C;IAC7C,GAAG,GAAG,GAAG,CAAC,OAAO,CAAC,KAAK,EAAE,EAAE,CAAC,CAAC;IAE7B,qBAAqB;IACrB,MAAM,GAAG,GAAG,MAAM,WAAW,CAAC,gCAAgC,CAAC,CAAC;IAEhE,IAAI,CAAC,GAAG,IAAI,GAAG,CAAC,IAAI,EAAE,CAAC,MAAM,KAAK,CAAC,EAAE,CAAC;QACpC,OAAO,CAAC,KAAK,CAAC,gCAAgC,CAAC,CAAC;QAChD,OAAO,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC;IAClB,CAAC;IAED,IAAI,CAAC,GAAG,IAAI,GAAG,CAAC,IAAI,EAAE,CAAC,MAAM,KAAK,CAAC,EAAE,CAAC;QACpC,OAAO,CAAC,KAAK,CAAC,gCAAgC,CAAC,CAAC;QAChD,OAAO,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC;IAClB,CAAC;IAED,wCAAwC;IACxC,MAAM,aAAa,GAAG,GAAG,CAAC,IAAI,EAAE,CAAC,OAAO,CAAC,KAAK,EAAE,EAAE,CAAC,CAAC;IAEpD,IAAI,CAAC;QACH,IAAA,6BAAe,EAAC;YACd,MAAM,EAAE,GAAG,CAAC,IAAI,EAAE;YAClB,MAAM,EAAE,aAAa;SACtB,CAAC,CAAC;QACH,OAAO,CAAC,GAAG,CAAC,sCAAsC,CAAC,CAAC;QACpD,OAAO,CAAC,GAAG,CAAC,cAAc,aAAa,EAAE,CAAC,CAAC;QAC3C,OAAO,CAAC,GAAG,CAAC,gBAAgB,IAAA,oCAAsB,GAAE,EAAE,CAAC,CAAC;QAExD,uBAAuB;QACvB,IAAA,qBAAY,EAAC;YACX,MAAM,EAAE,GAAG,CAAC,IAAI,EAAE;YAClB,MAAM,EAAE,aAAa;SACtB,CAAC,CAAC;IACL,CAAC;IAAC,OAAO,KAAK,EAAE,CAAC;QACf,OAAO,CAAC,KAAK,CACX,6BAA6B,EAC7B,KAAK,YAAY,KAAK,CAAC,CAAC,CAAC,KAAK,CAAC,OAAO,CAAC,CAAC,CAAC,eAAe,CACzD,CAAC;QACF,OAAO,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC;IAClB,CAAC;AACH,CAAC"}
+ {"version":3,"file":"config.js","sourceRoot":"","sources":["../../src/commands/config.ts"],"names":[],"mappings":";AAAA;;;GAGG;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAgBH,8BAmBC;AAKD,gCA0BC;AAhED,sDAA+E;AAC/E,4CAA4C;AAC5C,wCAAqE;AASrE;;GAEG;AACI,KAAK,UAAU,SAAS,CAAC,UAA4B,EAAE;IAC5D,+CAA+C;IAC/C,IAAI,CAAC,IAAA,sBAAe,GAAE,IAAI,OAAO,CAAC,MAAM,IAAI,OAAO,CAAC,MAAM,EAAE,CAAC;QAC3D,yDAAyD;QACzD,MAAM,EAAE,kBAAkB,EAAE,GAAG,wDAAa,SAAS,GAAC,CAAC;QACvD,MAAM,kBAAkB,CAAC;YACvB,MAAM,EAAE,OAAO,CAAC,MAAM;YACtB,MAAM,EAAE,OAAO,CAAC,MAAM;YACtB,MAAM,EAAE,OAAO,CAAC,MAAM;YACtB,MAAM,EAAE,OAAO,CAAC,MAAM;SACvB,CAAC,CAAC;QACH,OAAO;IACT,CAAC;IAED,mEAAmE;IACnE,MAAM,UAAU,EAAE,CAAC;IACnB,OAAO,CAAC,GAAG,CACT,iEAAiE,CAClE,CAAC;AACJ,CAAC;AAED;;GAEG;AACI,KAAK,UAAU,UAAU;IAC9B,MAAM,WAAW,GAAG,IAAA,6BAAe,GAAE,CAAC;IACtC,MAAM,MAAM,GAAG,IAAA,kBAAS,GAAE,CAAC;IAE3B,OAAO,CAAC,GAAG,CAAC,+CAA+C,CAAC,CAAC;IAC7D,OAAO,CAAC,GAAG,CAAC,6CAA6C,CAAC,CAAC;IAC3D,OAAO,CAAC,GAAG,CAAC,+CAA+C,CAAC,CAAC;IAE7D,IAAI,IAAA,sBAAe,GAAE,EAAE,CAAC;QACtB,MAAM,SAAS,GAAG,WAAW,EAAE,MAAM;YACnC,CAAC,CAAC,GAAG,WAAW,CAAC,MAAM,CAAC,SAAS,CAAC,CAAC,EAAE,CAAC,CAAC,MAAM,WAAW,CAAC,MAAM,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC,EAAE;YAC3E,CAAC,CAAC,SAAS,CAAC;QAEd,OAAO,CAAC,GAAG,CAAC,2BAA2B,CAAC,CAAC;QACzC,OAAO,CAAC,GAAG,CAAC,aAAa,SAAS,EAAE,CAAC,CAAC;QACtC,OAAO,CAAC,GAAG,CAAC,aAAa,MAAM,CAAC,MAAM,IAAI,2BAA2B,EAAE,CAAC,CAAC;QACzE,OAAO,CAAC,GAAG,CAAC,aAAa,IAAA,oCAAsB,GAAE,EAAE,CAAC,CAAC;QACrD,OAAO,CAAC,GAAG,CAAC,aAAa,CAAC,CAAC;QAC3B,OAAO,CAAC,GAAG,CAAC,4CAA4C,CAAC,CAAC;QAC1D,OAAO,CAAC,GAAG,CAAC,0CAA0C,CAAC,CAAC;IAC1D,CAAC;SAAM,CAAC;QACN,OAAO,CAAC,GAAG,CAAC,6BAA6B,CAAC,CAAC;QAC3C,OAAO,CAAC,GAAG,CAAC,kDAAkD,CAAC,CAAC;QAChE,OAAO,CAAC,GAAG,CAAC,4DAA4D,CAAC,CAAC;IAC5E,CAAC;IACD,OAAO,CAAC,GAAG,CAAC,EAAE,CAAC,CAAC;AAClB,CAAC"}
@@ -0,0 +1,15 @@
+ /**
+  * Login command implementation
+  * Handles both manual API key entry and browser-based authentication
+  */
+ export interface LoginOptions {
+     apiKey?: string;
+     apiUrl?: string;
+     webUrl?: string;
+     method?: 'browser' | 'manual';
+ }
+ /**
+  * Main login command handler
+  */
+ export declare function handleLoginCommand(options?: LoginOptions): Promise<void>;
+ //# sourceMappingURL=login.d.ts.map
@@ -0,0 +1 @@
+ {"version":3,"file":"login.d.ts","sourceRoot":"","sources":["../../src/commands/login.ts"],"names":[],"mappings":"AAAA;;;GAGG;AAcH,MAAM,WAAW,YAAY;IAC3B,MAAM,CAAC,EAAE,MAAM,CAAC;IAChB,MAAM,CAAC,EAAE,MAAM,CAAC;IAChB,MAAM,CAAC,EAAE,MAAM,CAAC;IAChB,MAAM,CAAC,EAAE,SAAS,GAAG,QAAQ,CAAC;CAC/B;AAED;;GAEG;AACH,wBAAsB,kBAAkB,CACtC,OAAO,GAAE,YAAiB,GACzB,OAAO,CAAC,IAAI,CAAC,CA6Ef"}