@scrapi.ai/mcp-server 2.0.0
- package/LICENSE +21 -0
- package/README.md +715 -0
- package/dist/http.d.ts +3 -0
- package/dist/http.d.ts.map +1 -0
- package/dist/http.js +89 -0
- package/dist/http.js.map +1 -0
- package/dist/index.d.ts +3 -0
- package/dist/index.d.ts.map +1 -0
- package/dist/index.js +14 -0
- package/dist/index.js.map +1 -0
- package/dist/server.d.ts +3 -0
- package/dist/server.d.ts.map +1 -0
- package/dist/server.js +19 -0
- package/dist/server.js.map +1 -0
- package/dist/tools/get-billing.d.ts +3 -0
- package/dist/tools/get-billing.d.ts.map +1 -0
- package/dist/tools/get-billing.js +179 -0
- package/dist/tools/get-billing.js.map +1 -0
- package/dist/tools/get-usage.d.ts +3 -0
- package/dist/tools/get-usage.d.ts.map +1 -0
- package/dist/tools/get-usage.js +96 -0
- package/dist/tools/get-usage.js.map +1 -0
- package/dist/tools/scrape-url.d.ts +3 -0
- package/dist/tools/scrape-url.d.ts.map +1 -0
- package/dist/tools/scrape-url.js +77 -0
- package/dist/tools/scrape-url.js.map +1 -0
- package/dist/tools/scrape-urls.d.ts +3 -0
- package/dist/tools/scrape-urls.d.ts.map +1 -0
- package/dist/tools/scrape-urls.js +96 -0
- package/dist/tools/scrape-urls.js.map +1 -0
- package/dist/tools/scraper-server-status.d.ts +3 -0
- package/dist/tools/scraper-server-status.d.ts.map +1 -0
- package/dist/tools/scraper-server-status.js +87 -0
- package/dist/tools/scraper-server-status.js.map +1 -0
- package/dist/utils/api.d.ts +150 -0
- package/dist/utils/api.d.ts.map +1 -0
- package/dist/utils/api.js +142 -0
- package/dist/utils/api.js.map +1 -0
- package/package.json +63 -0
package/README.md
ADDED
@@ -0,0 +1,715 @@

# Scrapi MCP Server

[Korean](README-KO.md)

> MCP server that converts URLs to clean Markdown/Text for LLM agents

[](https://opensource.org/licenses/MIT)

**Fast & Reliable**: built on 7+ years of web scraping expertise, 1,900+ production crawlers, and battle-tested anti-bot handling.

## What is this?

An [MCP (Model Context Protocol)](https://modelcontextprotocol.io/) server that lets AI agents fetch and read web pages. Simply give it a URL, and it returns clean, LLM-ready content, fast.

**Before:** AI can't read web pages directly
**After:** "Summarize this article" just works

---

## Features

- **URL → Markdown**: Preserves headings, lists, links
- **URL → Text**: Plain text extraction
- **Metadata**: Title, author, date, images
- **Clean Output**: No ads, no navigation, no scripts
- **JavaScript Rendering**: Works with SPAs
- **Built-in Billing**: Credit tracking, subscription management, usage analytics (MCP keys)
- **Auto-Retry**: 429 rate limit responses are automatically retried, honoring `Retry-After` (see the sketch after this list)
- **Dual Transport**: Stdio (npx) + Streamable HTTP for flexible deployment
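
The auto-retry behavior above can be pictured with a small sketch. This is not the package's actual implementation, just a minimal illustration of retrying a 429 response after the interval given in the `Retry-After` header:

```typescript
// Minimal sketch (not the package's actual code): retry a request when the
// API answers 429, waiting the number of seconds given in Retry-After.
async function fetchWithRetry(
  url: string,
  init: RequestInit = {},
  maxRetries = 3,
): Promise<Response> {
  for (let attempt = 0; ; attempt++) {
    const res = await fetch(url, init);
    if (res.status !== 429 || attempt >= maxRetries) return res;
    const retryAfterSeconds = Number(res.headers.get("Retry-After") ?? "1");
    await new Promise((resolve) => setTimeout(resolve, retryAfterSeconds * 1000));
  }
}
```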

---

## Transport Modes

Scrapi MCP Server supports two transport modes:

| Mode | Best For | Node.js Required |
|------|----------|-----------------|
| **Stdio** | Claude Desktop, Cursor, Cline, Claude Code | Yes (auto via npx) |
| **Streamable HTTP** | All clients, Node.js-free environments | No |

---

## Prerequisites

- [Scrapi MCP](https://scrapi.ai) account (separate from the main Scrapi account)
- Claude Desktop, Cline, or Cursor installed
- Node.js 20+

---

## Installation

### Option A: npx (Recommended)

No installation needed. Just configure your MCP client to use `npx`.

```json
{
  "mcpServers": {
    "scrapi": {
      "command": "npx",
      "args": ["-y", "@scrapi.ai/mcp-server"],
      "env": {
        "SCRAPI_API_KEY": "your-api-key"
      }
    }
  }
}
```

> See [Step 2](#step-2-configure-mcp-server) for where to put this configuration.

### Option B: Install from Source

```bash
# Clone the repository
git clone https://github.com/bamchi/scrapi-mcp-server.git
cd scrapi-mcp-server

# Install dependencies and build
npm install && npm run build
```

---

## Step 1: Get Your API Key

1. Go to [https://scrapi.ai](https://scrapi.ai)
2. Sign up or log in
3. Visit the [MCP Dashboard](https://scrapi.ai/dashboard); your Free plan (500 credits/month) and API key are created automatically
4. Copy your `hsmcp_` API key

---

## Step 2: Configure MCP Server

### Claude Desktop

**Option A: Via Settings (Recommended)**

1. Open Claude Desktop
2. Click Settings (gear icon, bottom left)
3. Select the Developer tab
4. Click the "Edit Config" button
5. Add the `mcpServers` configuration (see below)
6. Save and restart Claude Desktop (Cmd+Q, then reopen)

**Option B: Edit the config file directly**

- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`

**Configuration (npx):**

```json
{
  "mcpServers": {
    "scrapi": {
      "command": "npx",
      "args": ["-y", "@scrapi.ai/mcp-server"],
      "env": {
        "SCRAPI_API_KEY": "your-api-key"
      }
    }
  }
}
```

**Configuration (from source):**

```json
{
  "mcpServers": {
    "scrapi": {
      "command": "node",
      "args": ["/absolute/path/to/scrapi-mcp-server/dist/index.js"],
      "env": {
        "SCRAPI_API_KEY": "your-api-key"
      }
    }
  }
}
```

> Note: Replace `/absolute/path/to/` with the actual path where you cloned the repository.

### Cline

Config file location:

- macOS: `~/Library/Application Support/Code/User/globalStorage/saoudrizwan.claude-dev/settings/cline_mcp_settings.json`
- Windows: `%APPDATA%\Code\User\globalStorage\saoudrizwan.claude-dev\settings\cline_mcp_settings.json`

**Configuration (npx):**

```json
{
  "mcpServers": {
    "scrapi": {
      "command": "npx",
      "args": ["-y", "@scrapi.ai/mcp-server"],
      "env": {
        "SCRAPI_API_KEY": "your-api-key"
      }
    }
  }
}
```

**Configuration (from source):**

```json
{
  "mcpServers": {
    "scrapi": {
      "command": "node",
      "args": ["/absolute/path/to/scrapi-mcp-server/dist/index.js"],
      "env": {
        "SCRAPI_API_KEY": "your-api-key"
      }
    }
  }
}
```
### Cursor

Create or edit `.cursor/mcp.json` in your project root:

**Configuration (npx):**

```json
{
  "mcpServers": {
    "scrapi": {
      "command": "npx",
      "args": ["-y", "@scrapi.ai/mcp-server"],
      "env": {
        "SCRAPI_API_KEY": "your-api-key"
      }
    }
  }
}
```

**Configuration (from source):**

```json
{
  "mcpServers": {
    "scrapi": {
      "command": "node",
      "args": ["/absolute/path/to/scrapi-mcp-server/dist/index.js"],
      "env": {
        "SCRAPI_API_KEY": "your-api-key"
      }
    }
  }
}
```
### Claude Code

Edit `~/.claude.json` or a project-level `.mcp.json`:

**Configuration (npx):**

```json
{
  "mcpServers": {
    "scrapi": {
      "command": "npx",
      "args": ["-y", "@scrapi.ai/mcp-server"],
      "env": {
        "SCRAPI_API_KEY": "your-api-key"
      }
    }
  }
}
```
### Streamable HTTP

Connect via Streamable HTTP; no Node.js installation is needed on the client side.

**Cursor** (`.cursor/mcp.json`):

```json
{
  "mcpServers": {
    "scrapi": {
      "url": "https://scrapi.ai/api",
      "headers": {
        "X-API-Key": "your-api-key"
      }
    }
  }
}
```

**Claude Code** (CLI):

```bash
claude mcp add --transport http scrapi https://scrapi.ai/api \
  --header "X-API-Key: your-api-key"
```

**Cline** (`cline_mcp_settings.json`):

```json
{
  "mcpServers": {
    "scrapi": {
      "type": "streamableHttp",
      "url": "https://scrapi.ai/api",
      "headers": {
        "X-API-Key": "your-api-key"
      }
    }
  }
}
```

**Claude Desktop** (`claude_desktop_config.json`):

```json
{
  "mcpServers": {
    "scrapi": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://scrapi.ai/api",
        "--header",
        "X-API-Key: your-api-key"
      ]
    }
  }
}
```

> Note: Claude Desktop requires the [mcp-remote](https://www.npmjs.com/package/mcp-remote) proxy for HTTP connections.
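
For programmatic (non-editor) clients, connecting over Streamable HTTP could look roughly like the sketch below. The import paths and option names are assumptions about the MCP TypeScript SDK, not APIs exported by this package:

```typescript
// Sketch only: connect to the hosted endpoint with the MCP TypeScript SDK.
// Import paths and option names are assumptions about the SDK, not part of
// @scrapi.ai/mcp-server itself; check your SDK version.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

const transport = new StreamableHTTPClientTransport(new URL("https://scrapi.ai/api"), {
  requestInit: { headers: { "X-API-Key": "your-api-key" } },
});

const client = new Client({ name: "scrapi-http-example", version: "1.0.0" });
await client.connect(transport);

// Call the scrape_url tool described under "Available Tools".
const result = await client.callTool({
  name: "scrape_url",
  arguments: { url: "https://example.com/article", format: "markdown" },
});
console.log(result);
```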

<details>
<summary>Self-host the HTTP server (advanced)</summary>

Run your own instance instead of using the hosted endpoint:

```bash
SCRAPI_API_KEY=your-api-key npx -y -p @scrapi.ai/mcp-server scrapi-http
# or from source:
SCRAPI_API_KEY=your-api-key node dist/http.js
```

The server starts at `http://localhost:3000/api`. Configure it with the `PORT` and `HOST` environment variables. Replace the URL in the client configurations above with your self-hosted URL.

**Health check:** `GET http://localhost:3000/health`
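
A quick way to check that a self-hosted instance is up, assuming the default port and host:

```typescript
// Minimal liveness probe against a self-hosted instance (default PORT/HOST assumed).
const res = await fetch("http://localhost:3000/health");
console.log(res.status, await res.text());
```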

</details>

---

## Step 3: Restart Your AI Client

- **Claude Desktop**: Fully quit (Cmd+Q on macOS, Alt+F4 on Windows) and reopen
- **Claude Code**: Restart the session
- **Cline**: Restart VS Code
- **Cursor**: Restart the editor

You should see the MCP server connection indicator.

---
## Available Tools

### `scrape_url`

Scrapes a webpage and returns AI-readable content.

**Parameters:**

| Name | Type | Required | Description |
|------|------|----------|-------------|
| `url` | string | Yes | URL to scrape |
| `format` | string | | `markdown` (default) or `text` |

**Example:**

```json
{
  "url": "https://example.com/article",
  "format": "markdown"
}
```

**Markdown Output:**

```markdown
# Article Title

> Author: John Doe | Published: 2024-01-15

## Introduction

This is the main content of the article, converted to clean markdown...

## Key Points

- Point 1: Important detail
- Point 2: Another insight
- [Related Link](https://example.com/related)
```

**Text Output:**

```text
Article Title

Author: John Doe | Published: 2024-01-15

Introduction

This is the main content of the article, converted to plain text...

Key Points

- Point 1: Important detail
- Point 2: Another insight
```
### `scrape_urls`

Scrapes multiple webpages in parallel and returns AI-readable content.

**Parameters:**

| Name | Type | Required | Description |
|------|------|----------|-------------|
| `urls` | string[] | Yes | URLs to scrape (max 10) |
| `format` | string | | `markdown` (default) or `text` |

**Example:**

```json
{
  "urls": ["https://example.com/page1", "https://example.com/page2"],
  "format": "text"
}
```

**Output:**

```json
[
  {
    "url": "https://example.com/page1",
    "content": "Page 1 Title\n\nThis is the content of page 1..."
  },
  {
    "url": "https://example.com/page2",
    "content": "Page 2 Title\n\nThis is the content of page 2..."
  }
]
```
### `scraper_server_status`

Check the status of all ScraperServer instances. Shows server health, circuit breaker state, failure counts, and timing info.

**Parameters:** None

**Example:**

```json
{}
```

**Output:**

```markdown
## ScraperServer Status

Total: 3 | Available: 2

| Name | OS | Status | Failures | Last Success | Last Failure |
|------|----|--------|----------|--------------|--------------|
| pluto | linux | OK | 0 | 01/30 14:23:05 | - |
| mars | mac | FAIL | 2 | 01/29 10:00:00 | 01/30 13:55:12 |
| venus | linux | OPEN | 3 | 01/28 09:00:00 | 01/30 12:00:00 |

### Issues
- **mars**: Connection refused - connect(2)
- **venus**: Circuit breaker open until 01/30 12:30:00
- **venus**: Net::ReadTimeout
```

**Status values:**

| Status | Description |
|--------|-------------|
| `OK` | Server is healthy |
| `FAIL` | Server is unhealthy |
| `OPEN` | Circuit breaker open (isolated for 30 min) |
| `N/A` | Not yet checked |
### `get_usage`

Check your API usage and remaining credits.

**Parameters:** None

**Example:**

```json
{}
```

**Output:**

```markdown
## MCP Credits

| Item | Value |
|------|-------|
| Plan | starter |
| Subscription Credits | 1,500 |
| Purchased Credits | 200 |
| Total Remaining | 1,700 |
| Period End | 2026-03-01 |
```
### `get_billing`

Retrieve detailed billing information including subscription, plans, daily usage, and spending limits.

**Parameters:**

| Name | Type | Required | Description |
|------|------|----------|-------------|
| `action` | string | Yes | `subscription`, `plans`, `daily_usage`, or `spending_limits` |
| `start_date` | string | | Start date for `daily_usage` (YYYY-MM-DD, default: 30 days ago) |
| `end_date` | string | | End date for `daily_usage` (YYYY-MM-DD, default: today) |

**Example: Current subscription**

```json
{ "action": "subscription" }
```

```markdown
## MCP Subscription

| Item | Value |
|------|-------|
| Plan | starter (Starter) |
| Status | active |
| Monthly Credits | 2,000 |
| Price | $19.00/mo |
| Rate Limit | 30 RPM |
| Burst Limit | 5 concurrent |
| Period End | 2026-03-01 |
```

**Example: Available plans**

```json
{ "action": "plans" }
```

```markdown
## Available MCP Plans

| Plan | Credits/mo | Price | RPM | Burst |
|------|-----------|-------|-----|-------|
| Free (free) | 500 | Free | 10 | 2 |
| Starter (starter) | 2,000 | $19.00/mo | 30 | 5 |
| Pro (pro) | 10,000 | $49.00/mo | 60 | 10 |
| Business (business) | 50,000 | $149.00/mo | 120 | 20 |
```

**Example: Daily usage history**

```json
{ "action": "daily_usage", "start_date": "2026-02-01", "end_date": "2026-02-07" }
```

```markdown
## Daily Usage (2026-02-01 ~ 2026-02-07)

| Date | Requests | Credits | Top Tool |
|------|----------|---------|----------|
| 2026-02-07 | 45 | 45 | scrape#scrape (45) |
| 2026-02-06 | 120 | 120 | scrape#scrape (100) |

**Total**: 165 requests, 165 credits
```

**Example: Spending limits**

```json
{ "action": "spending_limits" }
```

```markdown
## Spending Limits

| Item | Value |
|------|-------|
| Daily Limit | 500 credits |
| Today's Usage | 120 credits |
| Usage % | 24.0% |
```

---
## Usage Examples

### Example 1: Summarize a News Article

```
User: Summarize this article: https://news.example.com/article/12345

Claude: [calls scrape_url]

Here's a summary of the article:

## Key Points
- Point 1: ...
- Point 2: ...
- Point 3: ...
```

### Example 2: Fetch Page Content

```
User: Get the content from https://example.com/data

Claude: [calls scrape_url]

# Page Title
> Source: https://example.com/data

The page content is returned in clean Markdown format...
```

### Example 3: Research Competitor Pricing

```
User: What's the pricing on https://competitor.com/product/abc

Claude: [calls scrape_url]

Here's the pricing information:
- **Product**: ABC Premium
- **Regular Price**: $99.00
- **Sale Price**: $79.00 (20% off)
```

### Example 4: Read API Documentation

```
User: Read https://docs.example.com/api/v2 and write integration code

Claude: [calls scrape_url]

I've analyzed the API documentation. Here's the integration code:

// api-client.ts
export class ExampleApiClient {
  private baseUrl = 'https://api.example.com/v2';

  async getData(): Promise<Response> {
    // ...
  }
}
```

---
## How It Works

```
┌─────────────────┐
│      User       │
│ "Summarize this │
│   URL for me"   │
└────────┬────────┘
         │
         ▼
┌─────────────────┐
│  Claude Desktop │
│    / Cursor     │
└────────┬────────┘
         │
         ▼
┌─────────────────┐     ┌─────────────────┐
│   MCP Server    │────►│   Scrapi API    │
│  (scrape_url)   │     │ (format param)  │
└────────┬────────┘     └────────┬────────┘
         │                       │
         └───────────┬───────────┘
                     │ Markdown/Text Response
                     ▼
            ┌─────────────────┐
            │   AI Response   │
            │ (Summary, etc.) │
            └─────────────────┘
```
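
The same flow can be driven outside an editor. Below is a minimal sketch (not part of this package) that uses the MCP TypeScript SDK to launch the server over stdio via `npx` and call `scrape_url`; the SDK import paths and client API are assumptions and may differ slightly by SDK version:

```typescript
// Sketch only: drive the stdio transport with the MCP TypeScript SDK.
// Import paths and the client API are assumptions about the SDK, not
// exports of @scrapi.ai/mcp-server.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "@scrapi.ai/mcp-server"],
  // Pass the key the same way the editor configs above do.
  env: { ...(process.env as Record<string, string>), SCRAPI_API_KEY: "your-api-key" },
});

const client = new Client({ name: "scrapi-stdio-example", version: "1.0.0" });
await client.connect(transport);

console.log(await client.listTools()); // scrape_url, scrape_urls, get_usage, ...

const result = await client.callTool({
  name: "scrape_url",
  arguments: { url: "https://example.com/article", format: "markdown" },
});
console.log(result);

await client.close();
```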

---

## Why Scrapi?

Built by the team behind [Scrapi](https://scrapi.ai), with 7+ years of web scraping experience:

- 1,900+ production crawlers
- JavaScript rendering support
- Anti-bot handling
- 99.9% uptime

---
## Troubleshooting

### "API key is required"

Make sure the `SCRAPI_API_KEY` environment variable is set correctly in your configuration file.

### "Invalid API key"

Verify that your API key is correct and active in your Scrapi dashboard.

### MCP Server not connecting

1. Ensure Node.js 20+ is installed
2. Try running `node /absolute/path/to/scrapi-mcp-server/dist/index.js` manually to check for errors
3. Fully quit Claude Desktop (Cmd+Q on macOS, Alt+F4 on Windows) and restart
4. Check Settings > Developer to verify the server is listed

### Developer tab not visible

Update Claude Desktop to the latest version: Claude menu → "Check for Updates..."

---

## Support

- Email: help@scrapi.ai
- Issues: [GitHub Issues](https://github.com/bamchi/scrapi-mcp-server/issues)

---

## License

MIT © [Scrapi](https://scrapi.ai)
package/dist/http.d.ts
ADDED
@@ -0,0 +1 @@

{"version":3,"file":"http.d.ts","sourceRoot":"","sources":["../src/http.ts"],"names":[],"mappings":""}