@magpiecloud/mags 1.8.13 → 1.8.15

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (43)
  1. package/README.md +95 -378
  2. package/bin/mags.js +196 -104
  3. package/index.js +6 -52
  4. package/package.json +22 -4
  5. package/API.md +0 -388
  6. package/Mags-API.postman_collection.json +0 -374
  7. package/QUICKSTART.md +0 -295
  8. package/deploy-page.sh +0 -171
  9. package/mags +0 -0
  10. package/mags.sh +0 -270
  11. package/nodejs/README.md +0 -197
  12. package/nodejs/bin/mags.js +0 -1146
  13. package/nodejs/index.js +0 -642
  14. package/nodejs/package.json +0 -42
  15. package/python/INTEGRATION.md +0 -800
  16. package/python/README.md +0 -161
  17. package/python/dist/magpie_mags-1.3.5-py3-none-any.whl +0 -0
  18. package/python/dist/magpie_mags-1.3.5.tar.gz +0 -0
  19. package/python/examples/demo.py +0 -181
  20. package/python/pyproject.toml +0 -39
  21. package/python/src/magpie_mags.egg-info/PKG-INFO +0 -182
  22. package/python/src/magpie_mags.egg-info/SOURCES.txt +0 -9
  23. package/python/src/magpie_mags.egg-info/dependency_links.txt +0 -1
  24. package/python/src/magpie_mags.egg-info/requires.txt +0 -1
  25. package/python/src/magpie_mags.egg-info/top_level.txt +0 -1
  26. package/python/src/mags/__init__.py +0 -6
  27. package/python/src/mags/client.py +0 -573
  28. package/python/test_sdk.py +0 -78
  29. package/skill.md +0 -153
  30. package/website/api.html +0 -1095
  31. package/website/claude-skill.html +0 -481
  32. package/website/cookbook/hn-marketing.html +0 -410
  33. package/website/cookbook/hn-marketing.sh +0 -42
  34. package/website/cookbook.html +0 -282
  35. package/website/env.js +0 -4
  36. package/website/index.html +0 -801
  37. package/website/llms.txt +0 -334
  38. package/website/login.html +0 -108
  39. package/website/mags.md +0 -210
  40. package/website/script.js +0 -453
  41. package/website/styles.css +0 -908
  42. package/website/tokens.html +0 -169
  43. package/website/usage.html +0 -185
@@ -1,800 +0,0 @@
# Integrating Mags into Python Applications

Mags gives your Python app on-demand Linux VMs that boot in ~300ms. Run untrusted code, build CI pipelines, host ephemeral services, or give your users their own sandboxed environments — all from a few lines of Python.

```bash
pip install magpie-mags
```

```python
from mags import Mags

m = Mags(api_token="your-token")
result = m.run_and_wait("echo 'Hello from a VM!'")
print(result["exit_code"])  # 0
```

---

## Table of Contents

- [Authentication](#authentication)
- [Pattern 1: Code Execution Sandbox](#pattern-1-code-execution-sandbox)
- [Pattern 2: AI Agent Tool Use](#pattern-2-ai-agent-tool-use)
- [Pattern 3: Background Job Processing](#pattern-3-background-job-processing)
- [Pattern 4: On-Demand Dev Environments](#pattern-4-on-demand-dev-environments)
- [Pattern 5: CI/CD Pipeline Steps](#pattern-5-cicd-pipeline-steps)
- [Pattern 6: Scheduled Tasks](#pattern-6-scheduled-tasks)
- [Pattern 7: Web App with Live Preview](#pattern-7-web-app-with-live-preview)
- [Working with Workspaces](#working-with-workspaces)
- [File Uploads](#file-uploads)
- [SSH Access](#ssh-access)
- [Error Handling](#error-handling)
- [Async / FastAPI Integration](#async--fastapi-integration)
- [Django Integration](#django-integration)
- [Flask Integration](#flask-integration)
- [Environment & Configuration](#environment--configuration)
- [VM Specs](#vm-specs)
- [Links](#links)

---

## Authentication

Set your API token once. All calls use it automatically.

```python
# Option 1: Pass directly
m = Mags(api_token="your-token")

# Option 2: Environment variable (recommended for production)
# export MAGS_API_TOKEN="your-token"
m = Mags()

# Option 3: Custom API endpoint (self-hosted)
m = Mags(api_url="https://your-instance.example.com")
```

---

## Pattern 1: Code Execution Sandbox

Run user-submitted code safely inside an isolated VM. The code never touches your infrastructure.

```python
import shlex

from mags import Mags, MagsError

m = Mags()

def execute_user_code(language, code, timeout=30):
    """Run untrusted user code in a sandboxed VM."""

    runners = {
        "python": f"python3 -c {shlex.quote(code)}",
        "javascript": f"node -e {shlex.quote(code)}",
        "bash": code,
    }

    script = runners.get(language)
    if not script:
        return {"error": f"Unsupported language: {language}"}

    try:
        result = m.run_and_wait(script, timeout=timeout)
        return {
            "exit_code": result["exit_code"],
            "output": [
                log["message"]
                for log in result["logs"]
                if log["source"] in ("stdout", "stderr")
            ],
        }
    except MagsError as e:
        return {"error": str(e)}
```

```python
# Usage
output = execute_user_code("python", "print(sum(range(100)))")
# {"exit_code": 0, "output": ["4950"]}
```

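The `shlex.quote` calls above are what keep user code from escaping the command string; without them, a payload containing a quote could inject arbitrary shell. A quick local illustration of what the quoting produces, no VM needed:

```python
import shlex

# A payload that would break out of naive single-quoting
payload = "print('hi'); import os"
command = f"python3 -c {shlex.quote(payload)}"

# shlex.quote wraps the payload so the shell sees it as one argument;
# shlex.split round-trips it back to the original string
print(command)
```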
### With package installation

```python
def run_with_packages(code, packages):
    """Run Python code with pip packages pre-installed."""
    install = f"pip install -q {' '.join(packages)}" if packages else "true"
    script = f"{install} && python3 -c {shlex.quote(code)}"
    return m.run_and_wait(script, timeout=60)

result = run_with_packages(
    "import pandas as pd; print(pd.DataFrame({'a': [1,2,3]}))",
    ["pandas"],
)
```

### With a pre-built base image

For repeated runs, avoid re-installing packages every time by creating a base workspace and syncing it:

```python
import time

# One-time setup: create a base workspace with common packages
job = m.run(
    "pip install pandas numpy requests flask scikit-learn",
    workspace_id="python-base",
    persistent=True,
)

# Wait for the VM to come up
for _ in range(60):
    status = m.status(job["request_id"])
    if status["status"] == "running":
        break
    time.sleep(1)

# Force sync — persists everything to S3 immediately
m.sync(job["request_id"])

# Every subsequent run inherits the base (read-only, no install needed)
result = m.run_and_wait(
    "python3 -c 'import pandas; print(pandas.__version__)'",
    base_workspace_id="python-base",
)

# Fork: load base, save changes to a new workspace
result = m.run_and_wait(
    "pip install torch",
    base_workspace_id="python-base",
    workspace_id="python-ml",
)
```

---

## Pattern 2: AI Agent Tool Use

Give LLM agents a sandboxed environment to execute code, install packages, and inspect results.

```python
from mags import Mags

m = Mags()

def agent_tool_execute(code, workspace_id="agent-workspace"):
    """Tool function for an AI agent to run code."""
    result = m.run_and_wait(
        code,
        workspace_id=workspace_id,
        timeout=60,
    )

    stdout = "\n".join(
        log["message"]
        for log in result["logs"]
        if log["source"] == "stdout"
    )
    stderr = "\n".join(
        log["message"]
        for log in result["logs"]
        if log["source"] == "stderr"
    )

    return {
        "success": result["exit_code"] == 0,
        "stdout": stdout,
        "stderr": stderr,
    }
```

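The stdout/stderr collation above is plain dictionary work, so it can be factored into a standalone helper and unit-tested without calling the API. A sketch; the result shape (a `logs` list of entries with `source` and `message` keys) follows the examples in this guide, and `split_logs` is a hypothetical helper:

```python
def split_logs(result: dict) -> dict:
    """Collate a Mags-style result's logs into stdout/stderr strings."""
    by_source = {"stdout": [], "stderr": []}
    for log in result.get("logs", []):
        if log["source"] in by_source:
            by_source[log["source"]].append(log["message"])
    return {
        "success": result.get("exit_code") == 0,
        "stdout": "\n".join(by_source["stdout"]),
        "stderr": "\n".join(by_source["stderr"]),
    }

# Works on a canned result, no VM required
sample = {
    "exit_code": 0,
    "logs": [
        {"source": "stdout", "message": "hello"},
        {"source": "stderr", "message": "warning"},
        {"source": "system", "message": "vm booted"},
    ],
}
print(split_logs(sample))
```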
### Claude / OpenAI tool definition

```python
tool_definition = {
    "name": "execute_code",
    "description": (
        "Execute shell commands or code in a persistent Linux VM. "
        "Installed packages and files persist between calls. "
        "The VM runs Alpine Linux with Python, Node.js, and common tools."
    ),
    "input_schema": {
        "type": "object",
        "properties": {
            "code": {
                "type": "string",
                "description": "Shell command(s) to execute",
            }
        },
        "required": ["code"],
    },
}

# In your agent loop:
if tool_call.name == "execute_code":
    result = agent_tool_execute(tool_call.input["code"])
    # Return result to the LLM
```

### Multi-turn agent with persistent workspace

```python
class AgentSandbox:
    """Per-session sandbox that persists state across tool calls."""

    def __init__(self, session_id):
        self.m = Mags()
        self.workspace_id = f"agent-{session_id}"

    def execute(self, code):
        return self.m.run_and_wait(
            code,
            workspace_id=self.workspace_id,
            timeout=60,
        )

    def upload_and_run(self, file_path, command):
        file_ids = self.m.upload_files([file_path])
        return self.m.run_and_wait(
            command,
            workspace_id=self.workspace_id,
            file_ids=file_ids,
            timeout=60,
        )

# Each user conversation gets its own sandbox
sandbox = AgentSandbox(session_id="user-123-conv-456")
sandbox.execute("pip install matplotlib")
sandbox.execute("python3 -c 'import matplotlib; print(matplotlib.__version__)'")
# matplotlib is still installed on the next call — workspace persists
```

---

## Pattern 3: Background Job Processing

Offload heavy or untrusted processing to VMs. Good for video transcoding, PDF generation, data pipelines, etc.

```python
from mags import Mags

m = Mags()

def process_video(video_url, output_format="mp4"):
    """Transcode a video in a sandboxed VM."""
    script = f"""
apk add ffmpeg
curl -sL '{video_url}' -o /tmp/input
ffmpeg -i /tmp/input -c:v libx264 /tmp/output.{output_format}
ls -la /tmp/output.*
"""
    return m.run_and_wait(script, timeout=300)


def run_data_pipeline(sql_query, workspace_id="etl-pipeline"):
    """Run a data processing pipeline with persistent deps."""
    # {sql_query!r} embeds the query as a valid Python string literal,
    # so quotes inside the query can't break the generated script
    script = f"""
python3 << 'PYEOF'
import sqlite3, json

conn = sqlite3.connect("/root/data.db")
cursor = conn.execute({sql_query!r})
rows = cursor.fetchall()
print(json.dumps(rows))
PYEOF
"""
    return m.run_and_wait(script, workspace_id=workspace_id, timeout=120)
```

### Fire-and-forget (don't block)

```python
def submit_job(script, name=None):
    """Submit a job without waiting. Check status later."""
    result = m.run(script, name=name)
    return result["request_id"]

def check_job(request_id):
    """Poll for completion."""
    status = m.status(request_id)
    if status["status"] in ("completed", "error"):
        logs = m.logs(request_id)
        return {**status, "logs": logs.get("logs", [])}
    return status

# Submit
job_id = submit_job("sleep 30 && echo done", name="background-task")

# Check later
result = check_job(job_id)
```
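Several patterns in this guide poll `m.status(...)` in a loop. That loop can be factored into a reusable helper that takes any status callable, which also makes it testable without the API. A sketch; `wait_until` is a hypothetical helper, not part of the SDK:

```python
import time

def wait_until(get_status, done_states=("completed", "error"),
               interval=1.0, max_attempts=60, sleep=time.sleep):
    """Poll get_status() until its "status" is in done_states.

    Returns the final status dict, or raises TimeoutError after
    max_attempts polls. `sleep` is injectable for testing.
    """
    for _ in range(max_attempts):
        status = get_status()
        if status["status"] in done_states:
            return status
        sleep(interval)
    raise TimeoutError(f"not done after {max_attempts} polls")

# Example with a fake status function (no API calls, no real sleeping)
responses = iter([{"status": "queued"}, {"status": "running"}, {"status": "completed"}])
final = wait_until(lambda: next(responses), sleep=lambda _: None)
print(final)  # {'status': 'completed'}
```

In production you would pass `lambda: m.status(job_id)` and keep the default `time.sleep`.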

---

## Pattern 4: On-Demand Dev Environments

Give each user a persistent sandbox with SSH and URL access.

```python
import time

from mags import Mags

m = Mags()

def create_dev_environment(user_id, template="python"):
    """Spin up a dev environment for a user."""
    templates = {
        "python": "apk add python3 py3-pip git && pip install ipython",
        "node": "apk add nodejs npm git && npm i -g yarn",
        "go": "apk add go git",
    }

    workspace_id = f"dev-{user_id}"
    setup_script = templates.get(template, templates["python"])

    # Create persistent VM with the dev environment
    job = m.run(
        setup_script,
        workspace_id=workspace_id,
        persistent=True,
        startup_command=setup_script.split("&&")[-1].strip(),
    )
    request_id = job["request_id"]

    # Wait for it to be ready
    for _ in range(30):
        status = m.status(request_id)
        if status["status"] == "running":
            break
        time.sleep(1)

    # Enable SSH access
    ssh = m.enable_access(request_id, port=22)

    return {
        "request_id": request_id,
        "workspace_id": workspace_id,
        "ssh_host": ssh["ssh_host"],
        "ssh_port": ssh["ssh_port"],
        "ssh_command": (
            f"ssh -p {ssh['ssh_port']} "
            f"-o StrictHostKeyChecking=no "
            f"root@{ssh['ssh_host']}"
        ),
    }
```

---

## Pattern 5: CI/CD Pipeline Steps

Run test suites, builds, or linting in isolated VMs.

```python
import time

from mags import Mags

m = Mags()

def run_tests(repo_url, branch="main", workspace_id=None):
    """Clone a repo and run its test suite."""
    script = f"""
apk add git nodejs npm
git clone --branch {branch} --depth 1 {repo_url} /tmp/repo
cd /tmp/repo
npm ci
npm test
"""
    result = m.run_and_wait(script, workspace_id=workspace_id, timeout=300)
    return {
        "passed": result["exit_code"] == 0,
        "duration_ms": result["duration_ms"],
        "output": [l["message"] for l in result["logs"] if l["source"] == "stdout"],
    }


def parallel_test_matrix(repo_url, versions):
    """Run tests against multiple Node.js versions in parallel."""
    jobs = {}
    for version in versions:
        script = f"""
apk add nodejs~={version} npm git
git clone --depth 1 {repo_url} /tmp/repo
cd /tmp/repo && npm ci && npm test
"""
        result = m.run(script, name=f"test-node-{version}")
        jobs[version] = result["request_id"]

    # Poll all jobs
    results = {}
    pending = set(jobs.keys())

    while pending:
        for version in list(pending):
            status = m.status(jobs[version])
            if status["status"] in ("completed", "error"):
                results[version] = {
                    "passed": status.get("exit_code") == 0,
                    "status": status["status"],
                }
                pending.discard(version)
        if pending:
            time.sleep(2)

    return results

# results = parallel_test_matrix("https://github.com/user/repo.git", ["18", "20", "22"])
```
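The `results` dict that `parallel_test_matrix` returns is easy to post-process, for example to gate a deploy on the whole matrix passing. A small, API-free sketch; `matrix_passed` is a hypothetical helper:

```python
def matrix_passed(results: dict) -> bool:
    """True only if every version in the matrix completed and passed."""
    return bool(results) and all(
        r["status"] == "completed" and r["passed"] for r in results.values()
    )

good = {"18": {"passed": True, "status": "completed"},
        "20": {"passed": True, "status": "completed"}}
bad = {**good, "22": {"passed": False, "status": "error"}}
print(matrix_passed(good), matrix_passed(bad))  # True False
```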

---

## Pattern 6: Scheduled Tasks

Use Mags cron jobs for recurring work.

```python
from mags import Mags

m = Mags()

# Run a health check every 5 minutes
cron = m.cron_create(
    name="health-check",
    cron_expression="*/5 * * * *",
    script='curl -sf https://myapp.com/health || echo "ALERT: health check failed"',
)

# Nightly database backup
m.cron_create(
    name="db-backup",
    cron_expression="0 2 * * *",
    script="pg_dump $DATABASE_URL | gzip > /root/backup-$(date +%F).sql.gz",
    workspace_id="backups",
)

# List all cron jobs
crons = m.cron_list()
for job in crons.get("cron_jobs", []):
    print(f"{job['name']}: {job['cron_expression']} (enabled={job['enabled']})")

# Pause a cron job
m.cron_update(cron["id"], enabled=False)

# Delete it
m.cron_delete(cron["id"])
```
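The cron expressions here use the standard five fields (minute, hour, day of month, month, day of week). If you accept expressions from users, a cheap structural check before calling `cron_create` catches the most common mistake, a wrong field count. A minimal sketch, not a full cron parser:

```python
def looks_like_cron(expr: str) -> bool:
    """Shallow check: exactly five whitespace-separated, non-empty fields."""
    fields = expr.split()
    return len(fields) == 5 and all(fields)

print(looks_like_cron("*/5 * * * *"))  # True
print(looks_like_cron("0 2 * * *"))    # True
print(looks_like_cron("* * * *"))      # False (only four fields)
```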

---

## Pattern 7: Web App with Live Preview

Deploy user code and give them a live URL.

```python
import time

from mags import Mags

m = Mags()

def deploy_preview(user_id, html_content):
    """Deploy HTML and return a live preview URL."""
    script = f"""
mkdir -p /root/site
cat > /root/site/index.html << 'HTMLEOF'
{html_content}
HTMLEOF
cd /root/site && python3 -m http.server 8080
"""
    job = m.run(
        script,
        workspace_id=f"preview-{user_id}",
        persistent=True,
        startup_command="cd /root/site && python3 -m http.server 8080",
    )

    # Wait for VM to start
    for _ in range(15):
        status = m.status(job["request_id"])
        if status["status"] == "running":
            break
        time.sleep(1)

    # Enable URL access
    access = m.enable_access(job["request_id"], port=8080)
    return {
        "request_id": job["request_id"],
        "url": access.get("url"),
        "status": "live",
    }

# preview = deploy_preview("user-42", "<h1>My App</h1>")
# print(preview["url"])  # https://abc123.apps.magpiecloud.com
```

The URL stays live. When the VM goes idle it sleeps automatically; the next visitor triggers an auto-wake (~3-5 seconds) and the page loads.

---

## Working with Workspaces

Workspaces are persistent filesystems backed by S3. Files, installed packages, and configs survive across VM restarts and sleep/wake cycles.

```python
# Create and populate a workspace
m.run_and_wait("pip install flask gunicorn", workspace_id="my-app")

# Run again — flask is already installed
m.run_and_wait("flask --version", workspace_id="my-app")

# Fork a workspace: start from base, save changes to a new workspace
m.run_and_wait(
    "pip install pandas",
    workspace_id="my-app-with-pandas",
    base_workspace_id="my-app",
)

# Ephemeral run: no workspace sync, fastest possible
m.run_and_wait("echo 'no persistence'", ephemeral=True)
```

| Mode | `workspace_id` | `base_workspace_id` | Behavior |
|------|----------------|---------------------|----------|
| Ephemeral | omit | omit | No persistence. Fastest. |
| Persistent | `"my-ws"` | omit | Read-write. Changes sync to S3. |
| Read-only base | omit | `"my-base"` | Base mounted read-only. Changes discarded. |
| Fork | `"fork-1"` | `"my-base"` | Starts from base, saves to `fork-1`. |
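The four modes in the table differ only in which of the two keyword arguments you pass. If your code selects the mode dynamically, a tiny helper keeps that mapping in one place. A sketch; `workspace_kwargs` is a hypothetical convenience, not an SDK function:

```python
def workspace_kwargs(workspace_id=None, base_workspace_id=None) -> dict:
    """Build run()/run_and_wait() kwargs for the four workspace modes."""
    kwargs = {}
    if workspace_id:
        kwargs["workspace_id"] = workspace_id            # persistent, or fork target
    if base_workspace_id:
        kwargs["base_workspace_id"] = base_workspace_id  # read-only base
    return kwargs

print(workspace_kwargs())                     # ephemeral: {}
print(workspace_kwargs("fork-1", "my-base"))  # fork: both keys set
```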

### Syncing workspaces

Workspaces sync to S3 automatically when a job completes. For persistent VMs (`persistent=True`), workspaces also sync every 30 seconds and on sleep.

Use `m.sync()` to force an immediate sync without stopping the VM — useful for persisting a base image you've just set up:

```python
import time

# Set up a base workspace on a persistent VM
job = m.run(
    "pip install pandas numpy scikit-learn",
    workspace_id="ml-base",
    persistent=True,
)

# Wait for the VM to come up
for _ in range(60):
    status = m.status(job["request_id"])
    if status["status"] == "running":
        break
    time.sleep(1)

# Force sync — base image is now available for other jobs
m.sync(job["request_id"])

# List and manage workspaces
workspaces = m.list_workspaces()
for ws in workspaces.get("workspaces", []):
    print(f"{ws['workspace_id']} — {ws['job_count']} jobs")

# Delete a workspace (removes stored data from S3)
m.delete_workspace("old-workspace")
```

---

## File Uploads

Upload local files into a VM before the script runs. Files appear in `/root/`.

```python
# Upload and use files
file_ids = m.upload_files(["model.pkl", "data.csv"])

result = m.run_and_wait(
    "python3 -c 'import pickle; m = pickle.load(open(\"/root/model.pkl\",\"rb\")); print(type(m))'",
    file_ids=file_ids,
    timeout=30,
)
```
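Since a missing local file only surfaces once the upload request is made, it can be worth validating paths first. A small sketch; `checked_paths` is a hypothetical helper:

```python
from pathlib import Path

def checked_paths(paths: list[str]) -> list[str]:
    """Raise early if any local file is missing before calling upload_files()."""
    missing = [p for p in paths if not Path(p).is_file()]
    if missing:
        raise FileNotFoundError(f"cannot upload, not found: {missing}")
    return paths

# file_ids = m.upload_files(checked_paths(["model.pkl", "data.csv"]))
```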

---

## SSH Access

Enable SSH to get a full interactive terminal or run remote commands programmatically.

```python
import os
import subprocess
import tempfile

from mags import Mags

m = Mags()

def ssh_command(request_id, command=None):
    """Run a command via SSH on a running VM."""
    access = m.enable_access(request_id, port=22)

    # Write private key to temp file
    key_file = tempfile.NamedTemporaryFile(mode="w", suffix=".pem", delete=False)
    key_file.write(access["ssh_private_key"])
    key_file.close()
    os.chmod(key_file.name, 0o600)

    ssh_cmd = [
        "ssh", "-i", key_file.name,
        "-p", str(access["ssh_port"]),
        "-o", "StrictHostKeyChecking=no",
        "-o", "UserKnownHostsFile=/dev/null",
        f"root@{access['ssh_host']}",
    ]

    if command:
        ssh_cmd.append(command)
        result = subprocess.run(ssh_cmd, capture_output=True, text=True, timeout=30)
        os.unlink(key_file.name)
        return {"stdout": result.stdout, "stderr": result.stderr, "returncode": result.returncode}

    # Interactive — caller runs subprocess.run(ssh_cmd) and deletes the key
    # file afterwards; unlinking it here would invalidate the returned command
    return ssh_cmd
```

---

## Error Handling

```python
from mags import Mags, MagsError

m = Mags()

try:
    result = m.run_and_wait("exit 1", timeout=10)
    if result["exit_code"] != 0:
        print("Script failed:", result["exit_code"])
        for log in result["logs"]:
            if log["source"] == "stderr":
                print(" ", log["message"])

except MagsError as e:
    # API-level errors (auth, not found, timeout, etc.)
    print(f"Mags error: {e}")
    if e.status_code == 401:
        print("Check your API token")
```
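For transient failures (network blips, rate limits), a retry wrapper with exponential backoff keeps calling code clean. A generic sketch: it retries any callable on a given exception type, so in practice you could pass `MagsError`; `with_retries` is a hypothetical helper:

```python
import time

def with_retries(fn, retries=3, base_delay=0.5,
                 retry_on=(Exception,), sleep=time.sleep):
    """Call fn(), retrying on retry_on with exponential backoff."""
    for attempt in range(retries):
        try:
            return fn()
        except retry_on:
            if attempt == retries - 1:
                raise
            sleep(base_delay * (2 ** attempt))  # 0.5s, 1s, 2s, ...

# Fails twice, then succeeds — no real sleeping in the demo
attempts = {"n": 0}
def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionError("transient")
    return "ok"

print(with_retries(flaky, sleep=lambda _: None))  # ok
```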

---

## Async / FastAPI Integration

The SDK uses synchronous `requests`. For async frameworks, run it in a thread pool:

```python
import asyncio
import shlex
from functools import partial

from fastapi import FastAPI
from mags import Mags

app = FastAPI()
m = Mags()

@app.post("/execute")
async def execute(code: str, language: str = "python"):
    # shlex.quote (not Python's repr) is what makes the code safe to
    # embed in a shell command
    runner = f"python3 -c {shlex.quote(code)}" if language == "python" else code

    # Run synchronous SDK in a thread to avoid blocking the event loop
    loop = asyncio.get_running_loop()
    result = await loop.run_in_executor(
        None,
        partial(m.run_and_wait, runner, timeout=30),
    )

    return {
        "exit_code": result["exit_code"],
        "output": [l["message"] for l in result["logs"] if l["source"] == "stdout"],
    }
```
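On Python 3.9+, `asyncio.to_thread` is a shorter way to push a blocking call onto the default executor than `run_in_executor` plus `partial`. A self-contained demo with a stand-in blocking function (no SDK or server required):

```python
import asyncio
import time

def blocking_call(x):
    """Stand-in for a synchronous SDK call such as m.run_and_wait(...)."""
    time.sleep(0.01)  # simulate network I/O
    return {"exit_code": 0, "input": x}

async def handler():
    # Equivalent to: loop.run_in_executor(None, partial(blocking_call, "echo hi"))
    return await asyncio.to_thread(blocking_call, "echo hi")

print(asyncio.run(handler()))  # {'exit_code': 0, 'input': 'echo hi'}
```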

---

## Django Integration

```python
# settings.py
import os

MAGS_API_TOKEN = os.environ.get("MAGS_API_TOKEN")

# services.py
from django.conf import settings
from mags import Mags

_client = None

def get_mags_client():
    global _client
    if _client is None:
        _client = Mags(api_token=settings.MAGS_API_TOKEN)
    return _client

# views.py
from django.http import JsonResponse
from .services import get_mags_client

def run_code(request):
    m = get_mags_client()
    code = request.POST.get("code", "echo hello")

    result = m.run_and_wait(code, timeout=30)

    return JsonResponse({
        "exit_code": result["exit_code"],
        "output": [l["message"] for l in result["logs"] if l["source"] == "stdout"],
    })
```

---

## Flask Integration

```python
from flask import Flask, request, jsonify
from mags import Mags

app = Flask(__name__)
m = Mags()

@app.post("/run")
def run():
    data = request.get_json()
    result = m.run_and_wait(
        data["script"],
        workspace_id=data.get("workspace_id"),
        timeout=data.get("timeout", 30),
    )
    return jsonify({
        "exit_code": result["exit_code"],
        "logs": result["logs"],
    })
```

---

## Environment & Configuration

| Env Variable | Description | Default |
|--------------|-------------|---------|
| `MAGS_API_TOKEN` | API token (required) | — |
| `MAGS_TOKEN` | Alias for `MAGS_API_TOKEN` | — |
| `MAGS_API_URL` | API base URL | `https://api.magpiecloud.com` |

```python
# All configuration options
m = Mags(
    api_token="...",                        # required
    api_url="https://api.magpiecloud.com",  # optional
    timeout=30,                             # default request timeout in seconds
)
```

---

## VM Specs

| Property | Value |
|----------|-------|
| OS | Alpine Linux |
| Shell | `/bin/sh` (ash) |
| Package manager | `apk add <package>` |
| User | `root` |
| Working directory | `/root` |
| Boot time | ~300ms from pool |
| Default timeout | 300 seconds |

Common packages: `python3`, `py3-pip`, `nodejs`, `npm`, `git`, `curl`, `jq`, `ffmpeg`, `go`, `rust`, `gcc`.

---

## Links

- **Website:** [mags.run](https://mags.run)
- **PyPI:** `pip install magpie-mags`
- **npm:** `npm install @magpiecloud/mags`
- **API Reference:** [API.md](../API.md)
- **CLI Quickstart:** [QUICKSTART.md](../QUICKSTART.md)