@the-convocation/twitter-scraper 0.13.0 → 0.14.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (104)
  1. package/README.md +71 -38
  2. package/dist/default/cjs/index.js +2126 -0
  3. package/dist/default/cjs/index.js.map +1 -0
  4. package/dist/default/esm/index.mjs +2104 -0
  5. package/dist/default/esm/index.mjs.map +1 -0
  6. package/dist/node/cjs/index.cjs +2156 -0
  7. package/dist/node/cjs/index.cjs.map +1 -0
  8. package/dist/node/esm/index.mjs +2134 -0
  9. package/dist/node/esm/index.mjs.map +1 -0
  10. package/dist/{scraper.d.ts → types/index.d.ts} +284 -8
  11. package/examples/cors-proxy/package.json +18 -0
  12. package/examples/node-integration/package.json +12 -0
  13. package/examples/react-integration/README.md +30 -0
  14. package/examples/react-integration/index.html +13 -0
  15. package/examples/react-integration/package.json +29 -0
  16. package/examples/react-integration/public/vite.svg +1 -0
  17. package/examples/react-integration/tsconfig.node.json +11 -0
  18. package/examples/react-integration/vite.config.ts +7 -0
  19. package/package.json +20 -3
  20. package/rollup.config.mjs +61 -0
  21. package/test-setup.js +2 -0
  22. package/dist/_module.d.ts +0 -6
  23. package/dist/_module.d.ts.map +0 -1
  24. package/dist/_module.js +0 -8
  25. package/dist/_module.js.map +0 -1
  26. package/dist/api-data.d.ts +0 -47
  27. package/dist/api-data.d.ts.map +0 -1
  28. package/dist/api-data.js +0 -84
  29. package/dist/api-data.js.map +0 -1
  30. package/dist/api.d.ts +0 -32
  31. package/dist/api.d.ts.map +0 -1
  32. package/dist/api.js +0 -138
  33. package/dist/api.js.map +0 -1
  34. package/dist/auth-user.d.ts +0 -23
  35. package/dist/auth-user.d.ts.map +0 -1
  36. package/dist/auth-user.js +0 -290
  37. package/dist/auth-user.js.map +0 -1
  38. package/dist/auth.d.ts +0 -82
  39. package/dist/auth.d.ts.map +0 -1
  40. package/dist/auth.js +0 -122
  41. package/dist/auth.js.map +0 -1
  42. package/dist/errors.d.ts +0 -28
  43. package/dist/errors.d.ts.map +0 -1
  44. package/dist/errors.js +0 -26
  45. package/dist/errors.js.map +0 -1
  46. package/dist/profile.d.ts +0 -80
  47. package/dist/profile.d.ts.map +0 -1
  48. package/dist/profile.js +0 -127
  49. package/dist/profile.js.map +0 -1
  50. package/dist/relationships.d.ts +0 -8
  51. package/dist/relationships.d.ts.map +0 -1
  52. package/dist/relationships.js +0 -93
  53. package/dist/relationships.js.map +0 -1
  54. package/dist/requests.d.ts +0 -9
  55. package/dist/requests.d.ts.map +0 -1
  56. package/dist/requests.js +0 -26
  57. package/dist/requests.js.map +0 -1
  58. package/dist/scraper.d.ts.map +0 -1
  59. package/dist/scraper.js +0 -357
  60. package/dist/scraper.js.map +0 -1
  61. package/dist/search.d.ts +0 -19
  62. package/dist/search.d.ts.map +0 -1
  63. package/dist/search.js +0 -99
  64. package/dist/search.js.map +0 -1
  65. package/dist/timeline-async.d.ts +0 -15
  66. package/dist/timeline-async.d.ts.map +0 -1
  67. package/dist/timeline-async.js +0 -53
  68. package/dist/timeline-async.js.map +0 -1
  69. package/dist/timeline-list.d.ts +0 -19
  70. package/dist/timeline-list.d.ts.map +0 -1
  71. package/dist/timeline-list.js +0 -46
  72. package/dist/timeline-list.js.map +0 -1
  73. package/dist/timeline-relationship.d.ts +0 -39
  74. package/dist/timeline-relationship.d.ts.map +0 -1
  75. package/dist/timeline-relationship.js +0 -46
  76. package/dist/timeline-relationship.js.map +0 -1
  77. package/dist/timeline-search.d.ts +0 -20
  78. package/dist/timeline-search.d.ts.map +0 -1
  79. package/dist/timeline-search.js +0 -93
  80. package/dist/timeline-search.js.map +0 -1
  81. package/dist/timeline-tweet-util.d.ts +0 -9
  82. package/dist/timeline-tweet-util.d.ts.map +0 -1
  83. package/dist/timeline-tweet-util.js +0 -108
  84. package/dist/timeline-tweet-util.js.map +0 -1
  85. package/dist/timeline-v1.d.ts +0 -233
  86. package/dist/timeline-v1.d.ts.map +0 -1
  87. package/dist/timeline-v1.js +0 -197
  88. package/dist/timeline-v1.js.map +0 -1
  89. package/dist/timeline-v2.d.ts +0 -94
  90. package/dist/timeline-v2.d.ts.map +0 -1
  91. package/dist/timeline-v2.js +0 -253
  92. package/dist/timeline-v2.js.map +0 -1
  93. package/dist/trends.d.ts +0 -3
  94. package/dist/trends.d.ts.map +0 -1
  95. package/dist/trends.js +0 -39
  96. package/dist/trends.js.map +0 -1
  97. package/dist/tweets.d.ts +0 -117
  98. package/dist/tweets.d.ts.map +0 -1
  99. package/dist/tweets.js +0 -202
  100. package/dist/tweets.js.map +0 -1
  101. package/dist/type-util.d.ts +0 -6
  102. package/dist/type-util.d.ts.map +0 -1
  103. package/dist/type-util.js +0 -14
  104. package/dist/type-util.js.map +0 -1
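The removed flat `dist/*.js` files and the new `dist/default`, `dist/node`, and `dist/types` trees indicate the package now ships separate CommonJS and ESM bundles (built with Rollup, per the new `rollup.config.mjs`) plus a single bundled declaration file. A minimal sketch of how consumers would typically load the package under such a layout; the exact `exports` conditions live in the updated `package.json`, which is not expanded in this diff:

```ts
// ESM consumers (bundlers, or Node with "type": "module")
import { Scraper } from "@the-convocation/twitter-scraper";

// CommonJS consumers would instead use:
// const { Scraper } = require("@the-convocation/twitter-scraper");

const scraper = new Scraper();
```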
package/README.md CHANGED
@@ -1,26 +1,35 @@
  # twitter-scraper
+
  [![Documentation badge](https://img.shields.io/badge/docs-here-informational)](https://the-convocation.github.io/twitter-scraper/)

- A port of [n0madic/twitter-scraper](https://github.com/n0madic/twitter-scraper) to Node.js.
+ A port of [n0madic/twitter-scraper](https://github.com/n0madic/twitter-scraper)
+ to Node.js.

- > Twitter's API is annoying to work with, and has lots of limitations — luckily their frontend (JavaScript) has it's own API, which I reverse-engineered. No API rate limits. No tokens needed. No restrictions. Extremely fast.
+ > Twitter's API is annoying to work with, and has lots of limitations — luckily
+ > their frontend (JavaScript) has it's own API, which I reverse-engineered. No
+ > API rate limits. No tokens needed. No restrictions. Extremely fast.
  >
  > You can use this library to get the text of any user's Tweets trivially.

  Known limitations:

- * Search operations require logging in with a real user account via `scraper.login()`.
- * Twitter's frontend API does in fact have rate limits ([#11](https://github.com/the-convocation/twitter-scraper/issues/11))
+ - Search operations require logging in with a real user account via
+ `scraper.login()`.
+ - Twitter's frontend API does in fact have rate limits
+ ([#11](https://github.com/the-convocation/twitter-scraper/issues/11))

  ## Installation
+
  This package requires Node.js v16.0.0 or greater.

  NPM:
+
  ```sh
  npm install @the-convocation/twitter-scraper
  ```

  Yarn:
+
  ```sh
  yarn add @the-convocation/twitter-scraper
  ```
@@ -28,16 +37,23 @@ yarn add @the-convocation/twitter-scraper
  TypeScript types have been bundled with the distribution.

  ## Usage
- Most use cases are exactly the same as in [n0madic/twitter-scraper](https://github.com/n0madic/twitter-scraper).
- Channel iterators have been translated into [AsyncGenerator](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/AsyncGenerator)
- instances, and can be consumed with the corresponding `for await (const x of y) { ... }` syntax.
+
+ Most use cases are exactly the same as in
+ [n0madic/twitter-scraper](https://github.com/n0madic/twitter-scraper). Channel
+ iterators have been translated into
+ [AsyncGenerator](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/AsyncGenerator)
+ instances, and can be consumed with the corresponding
+ `for await (const x of y) { ... }` syntax.

  ### Browser usage
- This package directly invokes the Twitter API, which does not have permissive CORS headers. With the default
- settings, requests will fail unless you disable CORS checks, which is not advised. Instead, applications must
- provide a CORS proxy and configure it in the `Scraper` options.

- Proxies (and other request mutations) can be configured with the request interceptor transform:
+ This package directly invokes the Twitter API, which does not have permissive
+ CORS headers. With the default settings, requests will fail unless you disable
+ CORS checks, which is not advised. Instead, applications must provide a CORS
+ proxy and configure it in the `Scraper` options.
+
+ Proxies (and other request mutations) can be configured with the request
+ interceptor transform:

  ```ts
  const scraper = new Scraper({
@@ -46,13 +62,11 @@ const scraper = new Scraper({
  // The arguments here are the same as the parameters to fetch(), and
  // are kept as-is for flexibility of both the library and applications.
  if (input instanceof URL) {
- const proxy =
- "https://corsproxy.io/?" +
+ const proxy = "https://corsproxy.io/?" +
  encodeURIComponent(input.toString());
  return [proxy, init];
  } else if (typeof input === "string") {
- const proxy =
- "https://corsproxy.io/?" + encodeURIComponent(input);
+ const proxy = "https://corsproxy.io/?" + encodeURIComponent(input);
  return [proxy, init];
  } else {
  // Omitting handling for example
@@ -63,12 +77,15 @@ const scraper = new Scraper({
  });
  ```

- [corsproxy.io](https://corsproxy.io) is a public CORS proxy that works correctly with this package.
+ [corsproxy.io](https://corsproxy.io) is a public CORS proxy that works correctly
+ with this package.

- The public CORS proxy [corsproxy.org](https://corsproxy.org) *does not work* at the time of writing (at least
- not using their recommended integration on the front page).
+ The public CORS proxy [corsproxy.org](https://corsproxy.org) _does not work_ at
+ the time of writing (at least not using their recommended integration on the
+ front page).

  #### Next.js 13.x example:
+
  ```tsx
  "use client";

@@ -82,13 +99,12 @@ export default function Home() {
  transform: {
  request(input: RequestInfo | URL, init?: RequestInit) {
  if (input instanceof URL) {
- const proxy =
- "https://corsproxy.io/?" +
+ const proxy = "https://corsproxy.io/?" +
  encodeURIComponent(input.toString());
  return [proxy, init];
  } else if (typeof input === "string") {
- const proxy =
- "https://corsproxy.io/?" + encodeURIComponent(input);
+ const proxy = "https://corsproxy.io/?" +
+ encodeURIComponent(input);
  return [proxy, init];
  } else {
  throw new Error("Unexpected request input type");
@@ -120,19 +136,23 @@ export default function Home() {
  ```

  ### Edge runtimes
- This package currently uses [`cross-fetch`](https://www.npmjs.com/package/cross-fetch) as a portable `fetch`.
- Edge runtimes such as CloudFlare Workers sometimes have `fetch` functions that behave differently from the web
- standard, so you may need to override the `fetch` function the scraper uses. If so, a custom `fetch` can be
+
+ This package currently uses
+ [`cross-fetch`](https://www.npmjs.com/package/cross-fetch) as a portable
+ `fetch`. Edge runtimes such as CloudFlare Workers sometimes have `fetch`
+ functions that behave differently from the web standard, so you may need to
+ override the `fetch` function the scraper uses. If so, a custom `fetch` can be
  provided in the options:

  ```ts
  const scraper = new Scraper({
- fetch: fetch
+ fetch: fetch,
  });
  ```

- Note that this does not change the arguments passed to the function, or the expected return type. If the custom
- `fetch` function produces runtime errors related to incorrect types, be sure to wrap it in a shim (not currently
+ Note that this does not change the arguments passed to the function, or the
+ expected return type. If the custom `fetch` function produces runtime errors
+ related to incorrect types, be sure to wrap it in a shim (not currently
  supported directly by interceptors):

  ```ts
@@ -151,26 +171,37 @@ const scraper = new Scraper({
  ## Contributing

  ### Setup
- This project currently targets Node 16.x and uses Yarn for package management. [Corepack](https://nodejs.org/dist/latest-v16.x/docs/api/corepack.html)
- is configured for this project, so you don't need to install a particular package manager version manually.

- Just run `corepack enable` to turn on the shims, then run `yarn` to install the dependencies.
+ This project currently requires Node 18.x for development and uses Yarn for
+ package management.
+ [Corepack](https://nodejs.org/dist/latest-v18.x/docs/api/corepack.html) is
+ configured for this project, so you don't need to install a particular package
+ manager version manually.
+
+ > The project supports Node 16.x at runtime, but requires Node 18.x to run its
+ > build tools.
+
+ Just run `corepack enable` to turn on the shims, then run `yarn` to install the
+ dependencies.

  #### Basic scripts
- * `yarn build`: Builds the project into the `dist` folder
- * `yarn test`: Runs the package tests (see [Testing](#testing) first)
+
+ - `yarn build`: Builds the project into the `dist` folder
+ - `yarn test`: Runs the package tests (see [Testing](#testing) first)

  Run `yarn help` for general `yarn` usage information.

  ### Testing
- This package includes unit tests for all major functionality. Given the speed at which Twitter's private API
- changes, failing tests are to be expected.
+
+ This package includes unit tests for all major functionality. Given the speed at
+ which Twitter's private API changes, failing tests are to be expected.

  ```sh
  yarn test
  ```

- Before running tests, you should configure environment variables for authentication.
+ Before running tests, you should configure environment variables for
+ authentication.

  ```
  TWITTER_USERNAME= # Account username
@@ -181,5 +212,7 @@ PROXY_URL= # HTTP(s) proxy for requests (optional)
  ```

  ### Commit message format
- We use [Conventional Commits](https://www.conventionalcommits.org), and enforce this with precommit checks.
- Please refer to the Git history for real examples of the commit message format.
+
+ We use [Conventional Commits](https://www.conventionalcommits.org), and enforce
+ this with precommit checks. Please refer to the Git history for real examples of
+ the commit message format.
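As a usage note for the AsyncGenerator change described in the README above, the iterators are consumed with `for await`. A minimal sketch, assuming a `getTweets(user, count)` iterator carried over from the upstream n0madic API; the exact method names and signatures are not shown in this diff:

```ts
import { Scraper } from "@the-convocation/twitter-scraper";

const scraper = new Scraper();

// getTweets(user, count) is assumed here for illustration; check the package
// documentation for the exact iterator names and signatures.
for await (const tweet of scraper.getTweets("nasa", 10)) {
  console.log(tweet.text);
}
```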
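The Edge runtimes section recommends wrapping a non-standard `fetch` in a shim, but the README's own shim example falls outside the hunks shown above. A minimal sketch of one possible wrapper, assuming the runtime's `fetch` accepts roughly web-standard arguments; the casts are illustrative placeholders, not the library's prescribed approach:

```ts
const scraper = new Scraper({
  fetch: (input: RequestInfo | URL, init?: RequestInit): Promise<Response> => {
    // Coerce the arguments into whatever shape the runtime's fetch expects,
    // and coerce the result back into a web-standard Response if necessary.
    return fetch(input as any, init as any) as Promise<Response>;
  },
});
```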