@the-convocation/twitter-scraper 0.15.0 → 0.15.1

package/README.md CHANGED
@@ -171,14 +171,13 @@ const scraper = new Scraper({
 ### Rate limiting
 The Twitter API heavily rate-limits clients, requiring that the scraper has its own
 rate-limit handling to behave predictably when rate-limiting occurs. By default, the
-scraper uses a rate-limiting strategy that rates for the current rate-limiting period
+scraper uses a rate-limiting strategy that waits for the current rate-limiting period
 to expire before resuming requests.
 
 **This has been known to take a very long time, in some cases (up to 13 minutes).**
 
 You may want to change how rate-limiting events are handled, potentially by pooling
-scrapers logged-in to different accounts (approach currently out of scope for this
-README). The rate-limit handling strategy can be configured by passing a custom
+scrapers logged-in to different accounts (refer to [#116](https://github.com/the-convocation/twitter-scraper/pull/116) for how to do this yourself). The rate-limit handling strategy can be configured by passing a custom
 implementation to the `rateLimitStrategy` option in the scraper constructor:
 
 ```ts
@@ -60,8 +60,8 @@ class WaitingRateLimitStrategy {
   }
 }
 class ErrorRateLimitStrategy {
-  onRateLimit({ response: res }) {
-    throw ApiError.fromResponse(res);
+  async onRateLimit({ response: res }) {
+    throw await ApiError.fromResponse(res);
   }
 }
 
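For context on what a custom strategy passed via `rateLimitStrategy` might look like, here is a minimal sketch modeled on the `onRateLimit({ response })` shape visible in the diff above. The `RateLimitEvent` interface and the `CappedWaitRateLimitStrategy` name are local illustrations, not types exported by the package; consult the library's own type definitions for the real interface.

```typescript
// Local stand-in for the event passed to onRateLimit; the real package type
// may carry more fields (e.g. the original fetch parameters).
interface RateLimitEvent {
  response: Response;
}

// A hypothetical strategy: wait for the rate-limit window to expire,
// but never longer than `maxWaitMs`.
class CappedWaitRateLimitStrategy {
  constructor(private readonly maxWaitMs: number) {}

  async onRateLimit({ response }: RateLimitEvent): Promise<void> {
    // Twitter reports the window reset time as epoch seconds in this header.
    const reset = Number(response.headers.get("x-rate-limit-reset") ?? NaN);
    const untilReset = Number.isFinite(reset)
      ? Math.max(0, reset * 1000 - Date.now())
      : this.maxWaitMs; // header missing or malformed: fall back to the cap
    await new Promise<void>((resolve) =>
      setTimeout(resolve, Math.min(untilReset, this.maxWaitMs)),
    );
  }
}
```

Assuming the constructor option named in the README, such a strategy would be wired up as `new Scraper({ rateLimitStrategy: new CappedWaitRateLimitStrategy(60_000) })`.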