@magpiecloud/mags 1.5.1 → 1.6.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (42)
  1. package/API.md +381 -0
  2. package/Mags-API.postman_collection.json +374 -0
  3. package/QUICKSTART.md +283 -0
  4. package/README.md +287 -79
  5. package/bin/mags.js +161 -27
  6. package/deploy-page.sh +171 -0
  7. package/index.js +1 -163
  8. package/mags +0 -0
  9. package/mags.sh +270 -0
  10. package/nodejs/README.md +191 -0
  11. package/nodejs/bin/mags.js +1146 -0
  12. package/nodejs/index.js +326 -0
  13. package/nodejs/package.json +42 -0
  14. package/package.json +4 -15
  15. package/python/INTEGRATION.md +747 -0
  16. package/python/README.md +139 -0
  17. package/python/dist/magpie_mags-1.0.0-py3-none-any.whl +0 -0
  18. package/python/dist/magpie_mags-1.0.0.tar.gz +0 -0
  19. package/python/examples/demo.py +181 -0
  20. package/python/pyproject.toml +39 -0
  21. package/python/src/magpie_mags.egg-info/PKG-INFO +164 -0
  22. package/python/src/magpie_mags.egg-info/SOURCES.txt +9 -0
  23. package/python/src/magpie_mags.egg-info/dependency_links.txt +1 -0
  24. package/python/src/magpie_mags.egg-info/requires.txt +1 -0
  25. package/python/src/magpie_mags.egg-info/top_level.txt +1 -0
  26. package/python/src/mags/__init__.py +6 -0
  27. package/python/src/mags/client.py +283 -0
  28. package/skill.md +153 -0
  29. package/website/api.html +927 -0
  30. package/website/claude-skill.html +483 -0
  31. package/website/cookbook/hn-marketing.html +410 -0
  32. package/website/cookbook/hn-marketing.sh +50 -0
  33. package/website/cookbook.html +278 -0
  34. package/website/env.js +4 -0
  35. package/website/index.html +718 -0
  36. package/website/llms.txt +242 -0
  37. package/website/login.html +88 -0
  38. package/website/mags.md +171 -0
  39. package/website/script.js +425 -0
  40. package/website/styles.css +845 -0
  41. package/website/tokens.html +171 -0
  42. package/website/usage.html +187 -0
# Integrating Mags into Python Applications

Mags gives your Python app on-demand Linux VMs that boot in ~300ms. Run untrusted code, build CI pipelines, host ephemeral services, or give your users their own sandboxed environments — all from a few lines of Python.

```bash
pip install magpie-mags
```

```python
from mags import Mags

m = Mags(api_token="your-token")
result = m.run_and_wait("echo 'Hello from a VM!'")
print(result["exit_code"])  # 0
```
---

## Table of Contents

- [Authentication](#authentication)
- [Pattern 1: Code Execution Sandbox](#pattern-1-code-execution-sandbox)
- [Pattern 2: AI Agent Tool Use](#pattern-2-ai-agent-tool-use)
- [Pattern 3: Background Job Processing](#pattern-3-background-job-processing)
- [Pattern 4: On-Demand Dev Environments](#pattern-4-on-demand-dev-environments)
- [Pattern 5: CI/CD Pipeline Steps](#pattern-5-cicd-pipeline-steps)
- [Pattern 6: Scheduled Tasks](#pattern-6-scheduled-tasks)
- [Pattern 7: Web App with Live Preview](#pattern-7-web-app-with-live-preview)
- [Working with Workspaces](#working-with-workspaces)
- [File Uploads](#file-uploads)
- [SSH Access](#ssh-access)
- [Error Handling](#error-handling)
- [Async / FastAPI Integration](#async--fastapi-integration)
- [Django Integration](#django-integration)
- [Flask Integration](#flask-integration)
- [Environment & Configuration](#environment--configuration)

---
## Authentication

Set your API token once. All calls use it automatically.

```python
# Option 1: Pass directly
m = Mags(api_token="your-token")

# Option 2: Environment variable (recommended for production)
# export MAGS_API_TOKEN="your-token"
m = Mags()

# Option 3: Custom API endpoint (self-hosted)
m = Mags(api_url="https://your-instance.example.com")
```

---
## Pattern 1: Code Execution Sandbox

Run user-submitted code safely inside an isolated VM. The code never touches your infrastructure.

```python
import shlex

from mags import Mags, MagsError

m = Mags()

def execute_user_code(language, code, timeout=30):
    """Run untrusted user code in a sandboxed VM."""
    runners = {
        "python": f"python3 -c {shlex.quote(code)}",
        "javascript": f"node -e {shlex.quote(code)}",
        "bash": code,
    }

    script = runners.get(language)
    if not script:
        return {"error": f"Unsupported language: {language}"}

    try:
        result = m.run_and_wait(script, timeout=timeout)
        return {
            "exit_code": result["exit_code"],
            "output": [
                log["message"]
                for log in result["logs"]
                if log["source"] in ("stdout", "stderr")
            ],
        }
    except MagsError as e:
        return {"error": str(e)}
```

```python
# Usage
output = execute_user_code("python", "print(sum(range(100)))")
# {"exit_code": 0, "output": ["4950"]}
```
### With package installation

```python
def run_with_packages(code, packages):
    """Run Python code with pip packages pre-installed."""
    install = f"pip install -q {' '.join(packages)}" if packages else "true"
    script = f"{install} && python3 -c {shlex.quote(code)}"
    return m.run_and_wait(script, timeout=60)

result = run_with_packages(
    "import pandas as pd; print(pd.DataFrame({'a': [1,2,3]}))",
    ["pandas"],
)
```
### With a pre-built base image

For repeated runs, avoid re-installing packages every time by using a base workspace:

```python
# One-time setup: create a base workspace with common packages
m.run_and_wait(
    "pip install pandas numpy requests flask scikit-learn",
    workspace_id="python-base",
)

# Every subsequent run inherits the base (read-only, no install needed)
result = m.run_and_wait(
    "python3 -c 'import pandas; print(pandas.__version__)'",
    base_workspace_id="python-base",
)
```

---
## Pattern 2: AI Agent Tool Use

Give LLM agents a sandboxed environment to execute code, install packages, and inspect results.

```python
from mags import Mags

m = Mags()

def agent_tool_execute(code, workspace_id="agent-workspace"):
    """Tool function for an AI agent to run code."""
    result = m.run_and_wait(
        code,
        workspace_id=workspace_id,
        timeout=60,
    )

    stdout = "\n".join(
        log["message"]
        for log in result["logs"]
        if log["source"] == "stdout"
    )
    stderr = "\n".join(
        log["message"]
        for log in result["logs"]
        if log["source"] == "stderr"
    )

    return {
        "success": result["exit_code"] == 0,
        "stdout": stdout,
        "stderr": stderr,
    }
```
### Claude / OpenAI tool definition

```python
tool_definition = {
    "name": "execute_code",
    "description": (
        "Execute shell commands or code in a persistent Linux VM. "
        "Installed packages and files persist between calls. "
        "The VM runs Alpine Linux with Python, Node.js, and common tools."
    ),
    "input_schema": {
        "type": "object",
        "properties": {
            "code": {
                "type": "string",
                "description": "Shell command(s) to execute",
            }
        },
        "required": ["code"],
    },
}

# In your agent loop:
if tool_call.name == "execute_code":
    result = agent_tool_execute(tool_call.input["code"])
    # Return result to the LLM
```
### Multi-turn agent with persistent workspace

```python
class AgentSandbox:
    """Per-session sandbox that persists state across tool calls."""

    def __init__(self, session_id):
        self.m = Mags()
        self.workspace_id = f"agent-{session_id}"

    def execute(self, code):
        return self.m.run_and_wait(
            code,
            workspace_id=self.workspace_id,
            timeout=60,
        )

    def upload_and_run(self, file_path, command):
        file_ids = self.m.upload_files([file_path])
        return self.m.run_and_wait(
            command,
            workspace_id=self.workspace_id,
            file_ids=file_ids,
            timeout=60,
        )

# Each user conversation gets its own sandbox
sandbox = AgentSandbox(session_id="user-123-conv-456")
sandbox.execute("pip install matplotlib")
sandbox.execute("python3 -c 'import matplotlib; print(matplotlib.__version__)'")
# matplotlib is still installed on the next call — workspace persists
```

---
## Pattern 3: Background Job Processing

Offload heavy or untrusted processing to VMs. Good for video transcoding, PDF generation, data pipelines, etc.

```python
from mags import Mags

m = Mags()

def process_video(video_url, output_format="mp4"):
    """Transcode a video in a sandboxed VM."""
    script = f"""
apk add ffmpeg
curl -sL '{video_url}' -o /tmp/input
ffmpeg -i /tmp/input -c:v libx264 /tmp/output.{output_format}
ls -la /tmp/output.*
"""
    return m.run_and_wait(script, timeout=300)


def run_data_pipeline(sql_query, workspace_id="etl-pipeline"):
    """Run a data processing pipeline with persistent deps.

    Note: sql_query is interpolated verbatim into the script, so
    escape it if it comes from untrusted input.
    """
    script = f"""
python3 << 'PYEOF'
import sqlite3, json

conn = sqlite3.connect("/workspace/data.db")
cursor = conn.execute("{sql_query}")
rows = cursor.fetchall()
print(json.dumps(rows))
PYEOF
"""
    return m.run_and_wait(script, workspace_id=workspace_id, timeout=120)
```
### Fire-and-forget (don't block)

```python
def submit_job(script, name=None):
    """Submit a job without waiting. Check status later."""
    result = m.run(script, name=name)
    return result["request_id"]

def check_job(request_id):
    """Poll for completion."""
    status = m.status(request_id)
    if status["status"] in ("completed", "error"):
        logs = m.logs(request_id)
        return {**status, "logs": logs.get("logs", [])}
    return status

# Submit
job_id = submit_job("sleep 30 && echo done", name="background-task")

# Check later
result = check_job(job_id)
```
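When you do need to block on a submitted job, the submit/check pair can be wrapped in a small poller with a deadline. A minimal sketch, not part of the SDK: `status_fn` stands in for `m.status` (or `check_job`), and `sleep` is injectable so the loop is easy to test.

```python
import time

def poll_until_done(request_id, status_fn, interval=2.0, deadline=60.0, sleep=time.sleep):
    """Poll status_fn(request_id) until it reports a terminal state.

    status_fn is a stand-in for m.status. Returns the final status
    dict, or raises TimeoutError once the deadline is exceeded.
    """
    waited = 0.0
    while waited <= deadline:
        status = status_fn(request_id)
        if status["status"] in ("completed", "error"):
            return status
        sleep(interval)
        waited += interval
    raise TimeoutError(f"job {request_id} still running after {deadline}s")
```

In practice, `poll_until_done(job_id, m.status)` blocks until the job finishes or the deadline passes.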
---

## Pattern 4: On-Demand Dev Environments

Give each user a persistent sandbox with SSH and URL access.

```python
import time

from mags import Mags

m = Mags()

def create_dev_environment(user_id, template="python"):
    """Spin up a dev environment for a user."""
    templates = {
        "python": "apk add python3 py3-pip git && pip install ipython",
        "node": "apk add nodejs npm git && npm i -g yarn",
        "go": "apk add go git",
    }

    workspace_id = f"dev-{user_id}"
    setup_script = templates.get(template, templates["python"])

    # Create persistent VM with the dev environment
    job = m.run(
        setup_script,
        workspace_id=workspace_id,
        persistent=True,
        # Keep the last command of the setup script as the startup process
        startup_command=setup_script.split("&&")[-1].strip(),
    )
    request_id = job["request_id"]

    # Wait for it to be ready
    for _ in range(30):
        status = m.status(request_id)
        if status["status"] == "running":
            break
        time.sleep(1)

    # Enable SSH access
    ssh = m.enable_access(request_id, port=22)

    return {
        "request_id": request_id,
        "workspace_id": workspace_id,
        "ssh_host": ssh["ssh_host"],
        "ssh_port": ssh["ssh_port"],
        "ssh_command": (
            f"ssh -p {ssh['ssh_port']} "
            f"-o StrictHostKeyChecking=no "
            f"root@{ssh['ssh_host']}"
        ),
    }
```
---

## Pattern 5: CI/CD Pipeline Steps

Run test suites, builds, or linting in isolated VMs.

```python
import time

from mags import Mags

m = Mags()

def run_tests(repo_url, branch="main", workspace_id=None):
    """Clone a repo and run its test suite."""
    script = f"""
apk add git nodejs npm
git clone --branch {branch} --depth 1 {repo_url} /tmp/repo
cd /tmp/repo
npm ci
npm test
"""
    result = m.run_and_wait(script, workspace_id=workspace_id, timeout=300)
    return {
        "passed": result["exit_code"] == 0,
        "duration_ms": result["duration_ms"],
        "output": [l["message"] for l in result["logs"] if l["source"] == "stdout"],
    }


def parallel_test_matrix(repo_url, versions):
    """Run tests against multiple Node.js versions in parallel."""
    jobs = {}
    for version in versions:
        script = f"""
apk add nodejs~={version} npm git
git clone --depth 1 {repo_url} /tmp/repo
cd /tmp/repo && npm ci && npm test
"""
        result = m.run(script, name=f"test-node-{version}")
        jobs[version] = result["request_id"]

    # Poll all jobs
    results = {}
    pending = set(jobs.keys())

    while pending:
        for version in list(pending):
            status = m.status(jobs[version])
            if status["status"] in ("completed", "error"):
                results[version] = {
                    "passed": status.get("exit_code") == 0,
                    "status": status["status"],
                }
                pending.discard(version)
        if pending:
            time.sleep(2)

    return results

# results = parallel_test_matrix("https://github.com/user/repo.git", ["18", "20", "22"])
```
---

## Pattern 6: Scheduled Tasks

Use Mags cron jobs for recurring work.

```python
from mags import Mags

m = Mags()

# Run a health check every 5 minutes
cron = m.cron_create(
    name="health-check",
    cron_expression="*/5 * * * *",
    script='curl -sf https://myapp.com/health || echo "ALERT: health check failed"',
)

# Nightly database backup
m.cron_create(
    name="db-backup",
    cron_expression="0 2 * * *",
    script="pg_dump $DATABASE_URL | gzip > /workspace/backup-$(date +%F).sql.gz",
    workspace_id="backups",
)

# List all cron jobs
crons = m.cron_list()
for job in crons.get("cron_jobs", []):
    print(f"{job['name']}: {job['cron_expression']} (enabled={job['enabled']})")

# Pause a cron job
m.cron_update(cron["id"], enabled=False)

# Delete it
m.cron_delete(cron["id"])
```
---

## Pattern 7: Web App with Live Preview

Deploy user code and give them a live URL.

```python
import time

from mags import Mags

m = Mags()

def deploy_preview(user_id, html_content):
    """Deploy HTML and return a live preview URL.

    Note: html_content is written via a heredoc, so it must not
    contain a line consisting of HTMLEOF.
    """
    script = f"""
mkdir -p /workspace/site
cat > /workspace/site/index.html << 'HTMLEOF'
{html_content}
HTMLEOF
cd /workspace/site && python3 -m http.server 8080
"""
    job = m.run(
        script,
        workspace_id=f"preview-{user_id}",
        persistent=True,
        startup_command="cd /workspace/site && python3 -m http.server 8080",
    )

    # Wait for the VM to start
    for _ in range(15):
        status = m.status(job["request_id"])
        if status["status"] == "running":
            break
        time.sleep(1)

    # Enable URL access
    access = m.enable_access(job["request_id"], port=8080)
    return {
        "request_id": job["request_id"],
        "url": access.get("url"),
        "status": "live",
    }

# preview = deploy_preview("user-42", "<h1>My App</h1>")
# print(preview["url"])  # https://abc123.apps.magpiecloud.com
```

The URL stays live. When the VM goes idle it sleeps automatically; the next visitor triggers an auto-wake (~3-5 seconds) and the page loads.
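Because of that wake latency, the first request after a sleep may fail or hang briefly. A small retry helper can smooth this over; this is an illustrative sketch, not part of the SDK, and `fetch` is an injectable callable (e.g. `requests.get`) so the logic is testable without a live VM.

```python
import time

def fetch_with_wake(url, fetch, attempts=4, backoff=2.0, sleep=time.sleep):
    """Retry a request so the first hit after a sleep/wake cycle succeeds.

    fetch(url) should return a response or raise on connection failure
    (pass e.g. requests.get). sleep is injectable for tests.
    """
    last_err = None
    for attempt in range(attempts):
        try:
            return fetch(url)
        except Exception as err:  # e.g. connection refused while the VM wakes
            last_err = err
            sleep(backoff * (attempt + 1))  # 2s, 4s, 6s ... give the VM time to boot
    raise last_err
```

For example, `fetch_with_wake(preview["url"], requests.get)` rides out the ~3-5 second wake instead of surfacing a connection error to the first visitor.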
---

## Working with Workspaces

Workspaces are persistent filesystems backed by S3. Files, installed packages, and configs survive across VM restarts and sleep/wake cycles.

```python
# Create and populate a workspace
m.run_and_wait("pip install flask gunicorn", workspace_id="my-app")

# Run again — flask is already installed
m.run_and_wait("flask --version", workspace_id="my-app")

# Fork a workspace: start from base, save changes to a new workspace
m.run_and_wait(
    "pip install pandas",
    workspace_id="my-app-with-pandas",
    base_workspace_id="my-app",
)

# Ephemeral run: no workspace sync, fastest possible
m.run_and_wait("echo 'no persistence'", ephemeral=True)
```

| Mode | `workspace_id` | `base_workspace_id` | Behavior |
|------|----------------|---------------------|----------|
| Ephemeral | omit | omit | No persistence. Fastest. |
| Persistent | `"my-ws"` | omit | Read-write. Changes sync to S3. |
| Read-only base | omit | `"my-base"` | Base mounted read-only. Changes discarded. |
| Fork | `"fork-1"` | `"my-base"` | Starts from base, saves to `fork-1`. |
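The table maps directly onto keyword arguments, so the four modes can be captured in a small helper. This is an illustrative sketch, not part of the SDK; the mode names are made up here, only the keyword arguments come from the table above.

```python
def workspace_kwargs(mode, workspace_id=None, base_workspace_id=None):
    """Build run()/run_and_wait() keyword arguments for the four modes.

    Illustrative helper; the mode names are this sketch's own.
    """
    if mode == "ephemeral":
        return {"ephemeral": True}
    if mode == "persistent":
        return {"workspace_id": workspace_id}
    if mode == "readonly-base":
        return {"base_workspace_id": base_workspace_id}
    if mode == "fork":
        return {"workspace_id": workspace_id, "base_workspace_id": base_workspace_id}
    raise ValueError(f"unknown mode: {mode}")
```

Usage: `m.run_and_wait("make build", **workspace_kwargs("fork", "fork-1", "my-base"))`.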
---

## File Uploads

Upload local files into a VM before the script runs. Files appear in `/root/`.

```python
# Upload and use files
file_ids = m.upload_files(["model.pkl", "data.csv"])

result = m.run_and_wait(
    "python3 -c 'import pickle; m = pickle.load(open(\"/root/model.pkl\", \"rb\")); print(type(m))'",
    file_ids=file_ids,
    timeout=30,
)
```

---
## SSH Access

Enable SSH to get a full interactive terminal or run remote commands programmatically.

```python
import os
import subprocess
import tempfile

def ssh_command(request_id, command=None):
    """Run a command via SSH on a running VM."""
    access = m.enable_access(request_id, port=22)

    # Write the private key to a temp file
    key_file = tempfile.NamedTemporaryFile(mode="w", suffix=".pem", delete=False)
    key_file.write(access["ssh_private_key"])
    key_file.close()
    os.chmod(key_file.name, 0o600)

    ssh_cmd = [
        "ssh", "-i", key_file.name,
        "-p", str(access["ssh_port"]),
        "-o", "StrictHostKeyChecking=no",
        "-o", "UserKnownHostsFile=/dev/null",
        f"root@{access['ssh_host']}",
    ]

    if command:
        ssh_cmd.append(command)
        result = subprocess.run(ssh_cmd, capture_output=True, text=True, timeout=30)
        os.unlink(key_file.name)
        return {"stdout": result.stdout, "stderr": result.stderr, "returncode": result.returncode}

    # Interactive — caller runs subprocess.run(ssh_cmd) and should delete
    # key_file.name afterwards (the key must still exist when ssh is invoked)
    return ssh_cmd
```

---
## Error Handling

```python
from mags import Mags, MagsError

m = Mags()

try:
    result = m.run_and_wait("exit 1", timeout=10)
    if result["exit_code"] != 0:
        print("Script failed:", result["exit_code"])
        for log in result["logs"]:
            if log["source"] == "stderr":
                print("  ", log["message"])

except MagsError as e:
    # API-level errors (auth, not found, timeout, etc.)
    print(f"Mags error: {e}")
    if e.status_code == 401:
        print("Check your API token")
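Only some of those API errors are worth retrying: a 401 will never fix itself, while rate limits and server hiccups usually will. A generic backoff wrapper can encode that split; this is a sketch, and which status codes count as transient is an assumption, not something the SDK documents.

```python
import time

def with_retries(fn, retryable=(429, 500, 502, 503), attempts=3, sleep=time.sleep):
    """Call fn(), retrying when it raises an error whose status_code is retryable.

    Sketch only: treating 429/5xx as transient is an assumption, and any
    exception carrying a status_code attribute is handled, not just MagsError.
    """
    for attempt in range(attempts):
        try:
            return fn()
        except Exception as err:
            code = getattr(err, "status_code", None)
            if code not in retryable or attempt == attempts - 1:
                raise
            sleep(2 ** attempt)  # exponential backoff: 1s, 2s, 4s ...
```

Usage: `with_retries(lambda: m.run_and_wait("make build", timeout=60))`.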
---

## Async / FastAPI Integration

The SDK uses synchronous `requests`. For async frameworks, run it in a thread pool:

```python
import asyncio
import shlex
from functools import partial

from fastapi import FastAPI
from mags import Mags

app = FastAPI()
m = Mags()

@app.post("/execute")
async def execute(code: str, language: str = "python"):
    runner = f"python3 -c {shlex.quote(code)}" if language == "python" else code

    # Run the synchronous SDK in a thread to avoid blocking the event loop
    loop = asyncio.get_running_loop()
    result = await loop.run_in_executor(
        None,
        partial(m.run_and_wait, runner, timeout=30),
    )

    return {
        "exit_code": result["exit_code"],
        "output": [l["message"] for l in result["logs"] if l["source"] == "stdout"],
    }
```
---

## Django Integration

```python
# settings.py
import os

MAGS_API_TOKEN = os.environ.get("MAGS_API_TOKEN")

# services.py
from mags import Mags
from django.conf import settings

_client = None

def get_mags_client():
    global _client
    if _client is None:
        _client = Mags(api_token=settings.MAGS_API_TOKEN)
    return _client

# views.py
from django.http import JsonResponse
from .services import get_mags_client

def run_code(request):
    m = get_mags_client()
    code = request.POST.get("code", "echo hello")

    result = m.run_and_wait(code, timeout=30)

    return JsonResponse({
        "exit_code": result["exit_code"],
        "output": [l["message"] for l in result["logs"] if l["source"] == "stdout"],
    })
```
---

## Flask Integration

```python
from flask import Flask, request, jsonify
from mags import Mags

app = Flask(__name__)
m = Mags()

@app.post("/run")
def run():
    data = request.get_json()
    result = m.run_and_wait(
        data["script"],
        workspace_id=data.get("workspace_id"),
        timeout=data.get("timeout", 30),
    )
    return jsonify({
        "exit_code": result["exit_code"],
        "logs": result["logs"],
    })
```
---

## Environment & Configuration

| Env Variable | Description | Default |
|--------------|-------------|---------|
| `MAGS_API_TOKEN` | API token (required) | — |
| `MAGS_TOKEN` | Alias for `MAGS_API_TOKEN` | — |
| `MAGS_API_URL` | API base URL | `https://api.magpiecloud.com` |

```python
# All configuration options
m = Mags(
    api_token="...",                        # required
    api_url="https://api.magpiecloud.com",  # optional
    timeout=30,                             # default request timeout in seconds
)
```
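Given the `MAGS_TOKEN` alias in the table, the client presumably checks both variables when no token is passed. A sketch of that precedence (explicit argument first, then `MAGS_API_TOKEN`, then `MAGS_TOKEN`); the exact order inside the SDK is an assumption:

```python
import os

def resolve_token(explicit=None, environ=os.environ):
    """Resolve an API token: explicit argument, then MAGS_API_TOKEN, then MAGS_TOKEN.

    Mirrors the env-var table above; the precedence order is an assumption.
    """
    return explicit or environ.get("MAGS_API_TOKEN") or environ.get("MAGS_TOKEN")
```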
---

## VM Specs

| Property | Value |
|----------|-------|
| OS | Alpine Linux |
| Shell | `/bin/sh` (ash) |
| Package manager | `apk add <package>` |
| User | `root` |
| Working directory | `/root` |
| Boot time | ~300ms from pool |
| Default timeout | 300 seconds |

Common packages: `python3`, `py3-pip`, `nodejs`, `npm`, `git`, `curl`, `jq`, `ffmpeg`, `go`, `rust`, `gcc`.

---

## Links

- **Website:** [mags.run](https://mags.run)
- **PyPI:** `pip install magpie-mags`
- **npm:** `npm install @magpiecloud/mags`
- **API Reference:** [API.md](../API.md)
- **CLI Quickstart:** [QUICKSTART.md](../QUICKSTART.md)