firecrawl 1.7.1 → 1.7.2

package/README.md CHANGED
@@ -147,7 +147,7 @@ watch.addEventListener("done", state => {
 
 ### Batch scraping multiple URLs
 
-To batch scrape multiple URLs with error handling, use the `batchScrapeUrls` method. It takes the starting URLs and optional parameters as arguments. The `params` argument allows you to specify additional options for the crawl job, such as the output formats.
+To batch scrape multiple URLs with error handling, use the `batchScrapeUrls` method. It takes the starting URLs and optional parameters as arguments. The `params` argument allows you to specify additional options for the batch scrape job, such as the output formats.
 
 ```js
 const batchScrapeResponse = await app.batchScrapeUrls(['https://firecrawl.dev', 'https://mendable.ai'], {
@@ -158,10 +158,10 @@ const batchScrapeResponse = await app.batchScrapeUrls(['https://firecrawl.dev',
 
 #### Asynchronous batch scrape
 
-To initiate an asynchronous batch scrape, utilize the `asyncBulkScrapeUrls` method. This method requires the starting URLs and optional parameters as inputs. The params argument enables you to define various settings for the scrape, such as the output formats. Upon successful initiation, this method returns an ID, which is essential for subsequently checking the status of the batch scrape.
+To initiate an asynchronous batch scrape, utilize the `asyncBatchScrapeUrls` method. This method requires the starting URLs and optional parameters as inputs. The params argument enables you to define various settings for the scrape, such as the output formats. Upon successful initiation, this method returns an ID, which is essential for subsequently checking the status of the batch scrape.
 
 ```js
-const asyncBulkScrapeResult = await app.asyncBulkScrapeUrls(['https://firecrawl.dev', 'https://mendable.ai'], { formats: ['markdown', 'html'] });
+const asyncBatchScrapeResult = await app.asyncBatchScrapeUrls(['https://firecrawl.dev', 'https://mendable.ai'], { formats: ['markdown', 'html'] });
 ```
 
 #### Batch scrape with WebSockets
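The renamed async method returns an ID that is later used to poll the job. A minimal sketch of that flow, assuming the SDK's default `FirecrawlApp` export and a `checkBatchScrapeStatus(id)` helper (neither appears in this diff):

```js
import FirecrawlApp from 'firecrawl';

const app = new FirecrawlApp({ apiKey: 'fc-YOUR_API_KEY' });

// Start the batch scrape without blocking until it completes.
const asyncBatchScrapeResult = await app.asyncBatchScrapeUrls(
  ['https://firecrawl.dev', 'https://mendable.ai'],
  { formats: ['markdown', 'html'] }
);

// The returned object carries the job ID; `checkBatchScrapeStatus` is an
// assumed helper for polling that ID and is not part of this diff.
const status = await app.checkBatchScrapeStatus(asyncBatchScrapeResult.id);
console.log(status.status);
```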
package/dist/index.d.cts CHANGED
@@ -77,6 +77,10 @@ interface CrawlScrapeOptions {
     onlyMainContent?: boolean;
     waitFor?: number;
     timeout?: number;
+    location?: {
+        country?: string;
+        languages?: string[];
+    };
 }
 type Action = {
     type: "wait";
package/dist/index.d.ts CHANGED
@@ -77,6 +77,10 @@ interface CrawlScrapeOptions {
     onlyMainContent?: boolean;
     waitFor?: number;
     timeout?: number;
+    location?: {
+        country?: string;
+        languages?: string[];
+    };
 }
 type Action = {
     type: "wait";
package/package.json CHANGED
@@ -1,6 +1,6 @@
 {
   "name": "firecrawl",
-  "version": "1.7.1",
+  "version": "1.7.2",
   "description": "JavaScript SDK for Firecrawl API",
   "main": "dist/index.js",
   "types": "dist/index.d.ts",
package/src/index.ts CHANGED
@@ -82,6 +82,10 @@ export interface CrawlScrapeOptions {
   onlyMainContent?: boolean;
   waitFor?: number;
   timeout?: number;
+  location?: {
+    country?: string;
+    languages?: string[];
+  };
 }
 
 export type Action = {
   type: "wait";
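The substantive change in 1.7.2 is the new optional `location` field on `CrawlScrapeOptions`. A minimal sketch of passing it through a scrape call, assuming `scrapeUrl` accepts these scrape options (the country code and language tags are illustrative):

```js
import FirecrawlApp from 'firecrawl';

const app = new FirecrawlApp({ apiKey: 'fc-YOUR_API_KEY' });

// Request the page as seen from a German location with German preferred,
// using the `location` shape added to CrawlScrapeOptions in this release.
const scrapeResult = await app.scrapeUrl('https://firecrawl.dev', {
  formats: ['markdown', 'html'],
  location: {
    country: 'DE',           // optional country hint
    languages: ['de', 'en'], // optional preferred languages
  },
});
```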