claude-webapi-1.0.0.tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
MIT License

Copyright (c) 2026 Wojciech

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

Metadata-Version: 2.4
Name: claude-webapi
Version: 1.0.0
Summary: Async Python wrapper for the Claude.ai web app
License: MIT
Project-URL: Homepage, https://github.com/yourusername/claude_webapi
Project-URL: Issues, https://github.com/yourusername/claude_webapi/issues
Keywords: claude,anthropic,ai,wrapper,async
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Requires-Python: >=3.10
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: aiohttp>=3.9
Provides-Extra: browser
Requires-Dist: browser-cookie3>=0.19; extra == "browser"
Provides-Extra: dev
Requires-Dist: pytest>=7; extra == "dev"
Requires-Dist: pytest-asyncio>=0.23; extra == "dev"
Requires-Dist: black; extra == "dev"
Requires-Dist: ruff; extra == "dev"
Dynamic: license-file

# claude_webapi

A reverse-engineered **asynchronous Python wrapper** for the [Claude.ai](https://claude.ai) web app.

> **Disclaimer:** This is an unofficial library that interacts with Claude.ai's internal web API.
> It is not affiliated with or endorsed by Anthropic. Use responsibly.

---

## Features

- **Multi-turn Conversations** — Stateful `ChatSession` objects keep history automatically across turns.
- **Streaming Mode** — Yield partial outputs in real time as Claude writes them.
- **File Attachments** — Attach local images, PDFs, and documents to any message.
- **File Downloads** — Download files generated by Claude's REPL/sandbox.
- **Model Selection** — Switch between Claude 4, 3.7, 3.5, and any future model by name.
- **System Prompts** — Apply custom system prompts at the conversation or single-call level.
- **Session Resumption** — Serialize and reload conversation metadata to continue across Python processes.
- **Auto-Close** — Optional inactivity timer for always-on services.
- **Async-First** — Built on `aiohttp` for non-blocking I/O throughout.

---

## Table of Contents

- [claude\_webapi](#claude_webapi)
  - [Features](#features)
  - [Table of Contents](#table-of-contents)
  - [Installation](#installation)
  - [Authentication](#authentication)
  - [Usage](#usage)
    - [Initialization](#initialization)
    - [Generate content](#generate-content)
    - [Generate content with files](#generate-content-with-files)
    - [Upload file as bytes](#upload-file-as-bytes)
    - [Inline attachments](#inline-attachments)
    - [Multi-turn conversations](#multi-turn-conversations)
    - [Resume a previous conversation](#resume-a-previous-conversation)
    - [Delete a conversation](#delete-a-conversation)
    - [Streaming mode](#streaming-mode)
    - [Select a language model](#select-a-language-model)
    - [Download files from Claude's sandbox](#download-files-from-claudes-sandbox)
    - [List \& manage conversations](#list--manage-conversations)
    - [Check for multiple reply candidates](#check-for-multiple-reply-candidates)
  - [Logging](#logging)
  - [Error handling](#error-handling)
  - [References](#references)
  - [Stargazers](#stargazers)

---

## Installation

Requires Python **3.10** or higher.

```sh
pip install -U claude-webapi
```

---

## Authentication

1. Go to [claude.ai](https://claude.ai) and log in with your Google / email account.
2. Press **F12** → **Application** tab → **Cookies** → `https://claude.ai`.
3. Copy the value of the **`sessionKey`** cookie.
4. Your **`organization_id`** is visible in the `lastActiveOrg` cookie, or in the
   URL when you open a conversation: `https://claude.ai/chat/<org_uuid>/…`

> **Note:** Keep your `sessionKey` secret — it grants full access to your Claude account.

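Rather than pasting the cookie into source code, it is safer to load it from the environment. A minimal sketch (the variable names `CLAUDE_SESSION_KEY` and `CLAUDE_ORG_ID` are this example's choice, not something the library requires):

```python
import os

def load_credentials() -> tuple[str, str]:
    """Read the Claude session cookie and org UUID from the environment."""
    session_key = os.environ.get("CLAUDE_SESSION_KEY")
    org_id = os.environ.get("CLAUDE_ORG_ID")
    if not session_key or not org_id:
        raise RuntimeError(
            "Set CLAUDE_SESSION_KEY and CLAUDE_ORG_ID before starting the app."
        )
    return session_key, org_id
```

The returned pair can then be passed straight to `ClaudeClient(session_key, org_id)` as shown in the next section.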
---

## Usage

### Initialization

```python
import asyncio
from claude_webapi import ClaudeClient

SESSION_KEY = "sk-ant-…"        # your sessionKey cookie
ORGANIZATION_ID = "xxxxxxxx-…"  # your org UUID

async def main():
    client = ClaudeClient(SESSION_KEY, ORGANIZATION_ID)
    await client.init(timeout=30, auto_close=False, close_delay=300)
    # … use client …
    await client.close()

asyncio.run(main())
```

Or use it as an async context manager (automatically calls `init` and `close`):

```python
async def main():
    async with ClaudeClient(SESSION_KEY, ORGANIZATION_ID) as client:
        response = await client.generate_content("Hello!")
        print(response.text)
```

> **Tip:** In long-running services (bots, APIs) set `auto_close=True` and a
> reasonable `close_delay` so the HTTP session is cleaned up during idle periods.

---

### Generate content

```python
async def main():
    response = await client.generate_content("Explain quantum entanglement simply.")
    print(response.text)
```

> `print(response)` produces the same output — `ModelOutput.__str__` returns `text`.

---

### Generate content with files

Pass a list of local file paths alongside your prompt:

```python
from pathlib import Path

async def main():
    response = await client.generate_content(
        "Summarise this PDF and describe the chart image.",
        files=["report.pdf", Path("chart.png")],
    )
    print(response.text)
```

Supported file types mirror what Claude.ai accepts: PDFs, plain text, images (PNG/JPEG/GIF/WebP), CSV, and most common document formats.

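If you want to check a file's type before attaching it (for instance, to pick the `mime_type` argument used by `upload_file` below), the standard-library `mimetypes` module is usually enough; a quick sketch:

```python
import mimetypes

def guess_mime(filename: str) -> str:
    """Best-effort MIME type from a file name, with a binary fallback."""
    mime, _encoding = mimetypes.guess_type(filename)
    return mime or "application/octet-stream"

print(guess_mime("chart.png"))   # image/png
print(guess_mime("report.pdf"))  # application/pdf
```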
---

### Upload file as bytes

When the file lives in memory rather than on disk, pass raw bytes directly:

```python
async def main():
    data = b"col1,col2\n1,2\n3,4"
    chat = client.start_chat()
    fid = await client.upload_file(
        chat.cid,
        data=data,
        filename="data.csv",
        mime_type="text/csv",
    )
    r = await chat.send_message("Analyse the uploaded CSV.", files=[fid])
    print(r.text)
```

---

### Inline attachments

Pass pre-extracted text directly as an *attachment* — no upload round-trip needed:

```python
async def main():
    response = await client.generate_content(
        "Summarise the document below.",
        attachments=[{
            "extracted_content": "The quick brown fox…",
            "file_name": "notes.txt",
            "file_size": 1234,
            "file_type": "txt",
        }],
    )
    print(response.text)
```

`attachments` is supported on `generate_content`, `generate_content_stream`, and `ChatSession.send_message` / `send_message_stream`.

---

### Multi-turn conversations

Use `start_chat()` to create a `ChatSession` that automatically threads context across messages:

```python
async def main():
    chat = client.start_chat()

    r1 = await chat.send_message("My name is Alice.")
    print(r1.text)

    r2 = await chat.send_message("What is my name?")
    print(r2.text)  # → "Your name is Alice."
```

Files can be attached to any turn:

```python
r3 = await chat.send_message(
    "Analyse this spreadsheet and create a bar chart.",
    files=["sales_q1.csv"],
)
```

---

### Resume a previous conversation

Save `chat.metadata` and pass it back to `start_chat` to resume later — even after the Python process has exited:

```python
import json

async def main():
    chat = client.start_chat()
    await chat.send_message("Remember: the secret word is BANANA.")

    # Persist metadata
    saved = chat.metadata
    with open("session.json", "w") as f:
        json.dump(saved, f)

    # --- later, in a new process ---
    with open("session.json") as f:
        saved = json.load(f)

    previous_chat = client.start_chat(metadata=saved)
    r = await previous_chat.send_message("What was the secret word?")
    print(r.text)  # → "The secret word is BANANA."
```

---

### Delete a conversation

```python
async def main():
    chat = client.start_chat()
    await chat.send_message("This is temporary.")

    await client.delete_conversation(chat.cid)
    print(f"Deleted: {chat.cid}")
    # Or equivalently:
    # await chat.delete()
```

---

### Streaming mode

Get incremental output using `generate_content_stream` or `ChatSession.send_message_stream`.
The `text_delta` attribute on each chunk holds only the new characters since the last yield.

```python
async def main():
    async for chunk in client.generate_content_stream(
        "Write a 500-word short story about a time-travelling librarian."
    ):
        print(chunk.text_delta, end="", flush=True)
    print()
```

Inside a chat session:

```python
async def main():
    chat = client.start_chat()
    async for chunk in chat.send_message_stream("Explain async/await in Python."):
        print(chunk.text_delta, end="", flush=True)
    print()
    # Follow-up works normally — context is preserved
    r = await chat.send_message("Give me a code example.")
    print(r.text)
```

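If you also need the complete text once streaming finishes, accumulate the deltas as they arrive. A self-contained sketch; the fake generator below stands in for the client's stream, which yields chunk objects exposing `.text_delta`:

```python
import asyncio
from dataclasses import dataclass

@dataclass
class Chunk:
    text_delta: str  # stand-in for the chunk objects the client yields

async def fake_stream():
    for piece in ["Once ", "upon ", "a ", "time."]:
        yield Chunk(piece)

async def collect(stream) -> str:
    """Print deltas live and return the assembled full text."""
    parts = []
    async for chunk in stream:
        print(chunk.text_delta, end="", flush=True)  # live output
        parts.append(chunk.text_delta)               # keep for later
    print()
    return "".join(parts)

full_text = asyncio.run(collect(fake_stream()))
print(full_text)  # Once upon a time.
```

The same `collect` coroutine works unchanged on `client.generate_content_stream(...)` or `chat.send_message_stream(...)`, since both yield objects with a `text_delta` attribute.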
---

### Select a language model

Pass a `Model` enum member or a raw model string:

```python
from claude_webapi.constants import Model

async def main():
    # Using the enum
    r1 = await client.generate_content(
        "What model are you?",
        model=Model.OPUS,
    )
    print(r1.text)

    # Using a raw string (useful for models not in the enum)
    r2 = await client.generate_content(
        "What model are you?",
        model="claude-sonnet-4-6",
    )
    print(r2.text)

    # Per-session model
    chat = client.start_chat(model=Model.HAIKU)
    r3 = await chat.send_message("Fast reply please.")
    print(r3.text)
```

**Available models** (as of February 2026):

| Enum constant      | Model string                 |
|--------------------|------------------------------|
| `Model.SONNET`     | `claude-sonnet-4-6`          |
| `Model.OPUS`       | `claude-opus-4-6`            |
| `Model.HAIKU`      | `claude-haiku-4-5-20251001`  |
| `Model.SONNET_3_7` | `claude-3-7-sonnet-20250219` |
| `Model.SONNET_3_5` | `claude-3-5-sonnet-20241022` |
| `Model.HAIKU_3_5`  | `claude-3-5-haiku-20241022`  |
| `Model.OPUS_3`     | `claude-3-opus-20240229`     |

You can always pass a custom string to access models not listed above.

---

### Download files from Claude's sandbox

When Claude generates a file through its code interpreter (REPL), you can download it:

```python
async def main():
    chat = client.start_chat()
    await chat.send_message(
        "Generate a CSV with the first 20 Fibonacci numbers and save it as fib.csv"
    )

    local_path = await client.download_file(
        chat.cid, "fib.csv", dest="./downloads"
    )
    print(f"Saved to: {local_path}")
```

---

### List & manage conversations

```python
async def main():
    conversations = await client.list_conversations()
    for conv in conversations[:5]:
        print(conv["uuid"], conv.get("name", "(unnamed)"))

    # Rename a conversation
    await client.rename_conversation(conversations[0]["uuid"], "My renamed chat")
```

---

### Check for multiple reply candidates

Claude sometimes returns multiple reply candidates. You can inspect them and
select which one to continue the conversation from:

```python
async def main():
    chat = client.start_chat()
    response = await chat.send_message("Recommend a sci-fi novel.")

    for i, candidate in enumerate(response.candidates):
        print(f"--- Candidate {i} ---")
        print(candidate.text[:200])

    if len(response.candidates) > 1:
        chat.choose_candidate(index=1)  # use the second candidate going forward
        follow_up = await chat.send_message("Tell me more about that book.")
        print(follow_up.text)
```

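Which index to pass to `choose_candidate` is up to you. One simple policy is to continue from the longest reply; a sketch over plain strings (real candidates are objects, so you would apply it to `[c.text for c in response.candidates]`):

```python
def longest_index(texts: list[str]) -> int:
    """Index of the longest candidate text; ties go to the earliest one."""
    return max(range(len(texts)), key=lambda i: len(texts[i]))

candidates = ["Try Dune.", "Try Dune by Frank Herbert: a classic of the genre."]
print(longest_index(candidates))  # 1
```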
---

## Logging

```python
from claude_webapi import set_log_level

set_log_level("DEBUG")  # DEBUG | INFO | WARNING | ERROR | CRITICAL
```

---

## Error handling

```python
from claude_webapi import (
    ClaudeClient,
    AuthenticationError,
    APIError,
    QuotaExceededError,
    TimeoutError,
    ConversationNotFoundError,
)

async def main():
    try:
        response = await client.generate_content("Hello!")
    except AuthenticationError:
        print("Invalid or expired sessionKey — please re-authenticate.")
    except QuotaExceededError:
        print("Daily message limit reached. Try again later.")
    except TimeoutError:
        print("Request timed out.")
    except ConversationNotFoundError:
        print("Conversation UUID not found.")
    except APIError as e:
        print(f"API error {e.status_code}: {e}")
```

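For transient failures such as timeouts, a small retry helper with exponential backoff is often useful. This is a generic sketch, not part of the library; it retries any zero-argument coroutine factory:

```python
import asyncio

async def with_retries(make_call, attempts: int = 3, base_delay: float = 0.1):
    """Run `await make_call()`, retrying with exponential backoff on failure."""
    for attempt in range(attempts):
        try:
            return await make_call()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries; surface the last error
            await asyncio.sleep(base_delay * 2 ** attempt)

# Demo with a coroutine that fails twice, then succeeds:
calls = {"n": 0}

async def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient")
    return "ok"

result = asyncio.run(with_retries(flaky))
print(result)  # ok
```

In practice `make_call` could be `lambda: client.generate_content(prompt)`; keep `attempts` conservative, since each retry consumes quota.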
---

## References

- [Anthropic — Claude.ai](https://claude.ai)
- [Anthropic — Official API](https://docs.anthropic.com)
- [gemini_webapi](https://github.com/HanaokaYuzu/Gemini-API) — Inspiration for this project's interface design

---

## Stargazers

If this project helped you, please consider starring it ⭐