livepilot 1.4.5 → 1.6.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -10,60 +10,81 @@
  [![License: MIT](https://img.shields.io/badge/License-MIT-blue.svg)](LICENSE)
  [![CI](https://github.com/dreamrec/LivePilot/actions/workflows/ci.yml/badge.svg)](https://github.com/dreamrec/LivePilot/actions/workflows/ci.yml)
  [![GitHub stars](https://img.shields.io/github/stars/dreamrec/LivePilot)](https://github.com/dreamrec/LivePilot/stargazers)
+ [![npm](https://img.shields.io/npm/v/livepilot)](https://www.npmjs.com/package/livepilot)

- **AI copilot for Ableton Live 12** — 104 MCP tools for music production, sound design, and mixing.
+ **AI copilot for Ableton Live 12** — 127 MCP tools, a deep device knowledge corpus, real-time audio analysis, and persistent technique memory.

- Talk to your DAW. Create tracks, program MIDI, load instruments, tweak parameters, arrange songs, and mix — all through natural language. LivePilot connects any MCP-compatible AI client (Claude, Cursor, VS Code Copilot) to Ableton Live and gives it full control over your session.
+ Most Ableton MCP servers give the AI tools to push buttons. LivePilot gives it three things on top of that:

- Every command goes through Ableton's official Live Object Model API. No hacks, no injection the same interface Ableton's own control surfaces use. Everything is deterministic and reversible with undo.
+ - **Knowledge** — A device atlas of 280+ instruments, 139 drum kits, and 350+ impulse responses. The AI doesn't guess device names or parameters. It looks them up.
+ - **Perception** — An M4L analyzer that reads the master bus in real-time: 8-band spectrum, RMS/peak metering, pitch tracking, key detection. The AI makes decisions based on what it hears, not just what's configured.
+ - **Memory** — A technique library that persists across sessions. The AI remembers how you built that bass sound, what swing you like on hi-hats, which reverb chain worked on vocals. It learns your taste over time.
+
+ These three layers sit on top of 127 deterministic MCP tools that cover transport, tracks, clips, MIDI, devices, scenes, mixing, browser, arrangement, and sample manipulation. Every command goes through Ableton's official Live Object Model API — the same interface Ableton's own control surfaces use. Everything is reversible with undo.

  ---

- ## Agent & Technique Memory
+ ## The Three Layers

- LivePilot is stateless by default 104 tools, deterministic execution, no hidden context. The agent layer adds **persistent state** on top: a technique memory system that stores production decisions as typed, searchable, replayable data structures with structured metadata.
+ Most MCP servers are a flat list of tools. LivePilot is a production system with three layers that work together.

- ### How it works
+ ### 1. Device Atlas — What the AI knows

- The memory system stores five technique types: `beat_pattern`, `device_chain`, `mix_template`, `preference`, and `browser_pin`. Each technique consists of three layers:
+ A structured knowledge corpus of 280+ Ableton devices, 139 drum kits, and 350+ impulse responses. When the AI needs to load an instrument, it doesn't hallucinate a name and hope for the best. It consults the atlas, finds the exact preset, and loads it by URI.

- | Layer | Contents | Purpose |
- |-------|----------|---------|
- | **Identity** | UUID, name, type, tags, timestamps, rating, replay count | Indexing, filtering, sorting |
- | **Qualities** | Structured analysis — summary, mood, genre tags, rhythm feel, harmonic character, sonic texture, production notes, reference points | Search ranking, agent context at decision time |
- | **Payload** | Raw data MIDI notes, device params, tempo, kit URIs, send levels | Exact replay or adaptation |
+ The atlas is organized by category (synths, drums, effects, samples) with metadata about each device: what it sounds like, what parameters matter, what presets are available. When you say "load something warm and analog", the AI can search the atlas for instruments tagged with those qualities and pick one that actually exists in your library.
+
+ This is the difference between an AI that says "I'll load Warm Analog Pad" (and crashes because it doesn't exist) and one that searches the drum kit index, finds "Kit-606 Tape.adg", and loads it by its real browser URI.
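A minimal sketch of what an atlas lookup could look like. The entry shape, tags, and URIs here are illustrative assumptions, not the shipped schema — only "Kit-606 Tape.adg" is a name the README itself uses.

```python
import json

# Hypothetical atlas index — entry shape and URIs are illustrative.
ATLAS = json.loads("""
[
  {"name": "Kit-606 Tape.adg", "category": "drums",
   "tags": ["warm", "analog", "606"],
   "uri": "query:Drums#Kit-606%20Tape.adg"},
  {"name": "Wurli Crunch.adv", "category": "keys",
   "tags": ["warm", "electric"],
   "uri": "query:Sounds#Wurli%20Crunch.adv"}
]
""")

def find_devices(query_tags):
    """Return atlas entries that carry every requested tag."""
    wanted = set(query_tags)
    return [entry for entry in ATLAS if wanted <= set(entry["tags"])]

# "load something warm and analog" → only presets that really exist:
matches = find_devices(["warm", "analog"])
print([m["name"] for m in matches])   # → ['Kit-606 Tape.adg']
```

The AI then loads the match by its `uri` instead of inventing a preset name.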
+
+ ### 2. Analyzer — What the AI hears
+
+ The LivePilot Analyzer is an M4L device that sits on the master track and feeds real-time audio data back to the AI:
+
+ - **8-band spectrum** — sub, low, low-mid, mid, high-mid, high, presence, air
+ - **RMS and peak metering** — true loudness, not just parameter values
+ - **Pitch tracking and key detection** — Krumhansl-Schmuckler algorithm on accumulated pitch data
+
+ This means the AI can verify its own work. After adding an EQ cut at 400 Hz, it reads the spectrum to confirm the cut actually reduced the mud. After loading a bass preset, it checks that the low end is present. Before writing a bass line, it detects the key of what's already playing.
+
+ Without the analyzer, the AI is working blind — it can set parameters but can't hear the result. With it, the AI closes the feedback loop.
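The feedback loop described above can be sketched in a few lines. The two helpers below are stubs standing in for real MCP tool calls (reading the spectrum, editing an EQ); the band name follows the 8-band layout listed above.

```python
# Stubbed state: pretend low-mid band reading around 350 Hz (linear amplitude).
state = {"low_mid": 0.62}

def get_master_spectrum():
    # Stub: the real value comes from the M4L analyzer over MCP.
    return dict(state)

def apply_eq_cut(band, gain_db):
    # Stub: a real call would edit an EQ device parameter, which then
    # changes what the analyzer reports on the next read.
    state[band] *= 10 ** (gain_db / 20)

before = get_master_spectrum()
if before["low_mid"] > 0.5:          # too much mud
    apply_eq_cut("low_mid", -3.0)
after = get_master_spectrum()         # re-read: verify, don't assume
assert after["low_mid"] < before["low_mid"]
print(round(after["low_mid"], 3))     # → 0.439
```

The point is the second read: the decision to stop (or cut again) is based on measured output, not on the parameter value that was set.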

- When you save a technique, the agent collects raw data from Ableton using existing tools (`get_notes`, `get_device_parameters`, etc.) and writes a structured qualities analysis. The qualities are what make search useful `memory_recall(query="dark heavy 808")` matches against mood, genre tags, sonic texture, and summary fields, not just names.
+ ### 3. Technique Memory — What the AI learns

- ### Three operating modes
+ The memory system (`memory_learn` / `memory_recall` / `memory_replay`) stores production decisions as structured, searchable, replayable data. Not just parameter snapshots — the full context: what genre, what mood, what made it work, what the signal chain was, what MIDI pattern drove it.

- | Mode | Trigger | Behavior |
- |------|---------|----------|
- | **Informed** (default) | Any creative task | Agent calls `memory_recall`, reads top results' qualities, lets them influence decisions (kit selection, parameter ranges, rhythmic density) without copying |
- | **Fresh** | "ignore my history" / "something new" | Agent skips memory entirely — uses only the shipped reference corpus and its own knowledge |
- | **Explicit recall** | "use that boom bap beat" / "load my reverb chain" | Direct retrieval via `memory_get` → `memory_replay` with `adapt=false` (exact) or `adapt=true` (variation) |
+ Five technique types: `beat_pattern`, `device_chain`, `mix_template`, `preference`, `browser_pin`. Each stores three layers of data:

- The agent consults memory by default but never constrains itself to it. Override is always one sentence away.
+ | Layer | Contents | Purpose |
+ |-------|----------|---------|
+ | **Identity** | UUID, name, type, tags, timestamps, rating | Indexing and filtering |
+ | **Qualities** | Mood, genre, rhythm feel, harmonic character, sonic texture, production notes | Search ranking and creative context |
+ | **Payload** | Raw MIDI notes, device params, tempo, kit URIs, send levels | Exact replay or adaptation |
+
+ The agent consults memory by default before creative decisions. `memory_recall(query="dark heavy 808")` matches against mood, genre, and texture — not just names. The results inform the AI's choices without constraining them. Say "ignore my history" and it works from a clean slate. Say "use that boom bap beat from last session" and it pulls the exact technique and replays it.
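The three-layer record and quality-based recall can be sketched as below. Field names mirror the table above; the word-overlap scoring is illustrative, not the shipped ranking.

```python
import uuid

# One technique record, three layers (identity / qualities / payload).
technique = {
    "identity": {"id": str(uuid.uuid4()), "name": "dark 808 groove",
                 "type": "beat_pattern", "tags": ["trap", "808"], "rating": 5},
    "qualities": {"mood": "dark heavy", "genre": "trap",
                  "rhythm_feel": "halftime", "sonic_texture": "subby 808"},
    "payload": {"tempo": 140,
                "notes": [{"pitch": 36, "start": 0.0, "velocity": 110}]},
}

def recall(query, library):
    """Rank techniques by how many query words hit the qualities layer."""
    words = set(query.lower().split())
    def score(t):
        text = " ".join(str(v) for v in t["qualities"].values()).lower()
        return sum(w in text for w in words)
    return sorted(library, key=score, reverse=True)

best = recall("dark heavy 808", [technique])[0]
print(best["identity"]["name"])   # → dark 808 groove
```

Search reads only the qualities layer; replay reads only the payload — which is what lets the same record serve both "find me something moody" and "rebuild exactly that beat".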

- ### Replay architecture
+ Over time, the library becomes a structured representation of your production taste: swing ranges, kit preferences, harmonic tendencies, arrangement density. The AI reads across this at decision time. New output is always generated; the memory shapes the generation.

- `memory_replay` does not execute Ableton commands directly. It returns a structured plan — an ordered list of tool calls (`search_browser`, `load_browser_item`, `create_clip`, `add_notes`, etc.) that the agent then executes through the existing MCP tools. This keeps the memory system decoupled from the Ableton connection and makes replay logic testable without a running DAW.
+ ### How the layers combine

- ### Building the corpus over time
+ "Make a boom bap beat at 86 BPM" triggers the full stack:

- The shipped plugin includes a reference corpus (~2,700 lines): genre-specific drum patterns, chord voicings, sound design recipes, mixing templates, and workflow patterns. This is the baseline — the agent is competent from the first session.
+ 1. **Atlas** — finds the right drum kit (not a guess, a real preset with real samples)
+ 2. **Memory** — recalls your previous boom bap patterns, checks your preferred swing amount and velocity curves
+ 3. **Tools** — creates tracks, loads instruments, programs MIDI, chains effects, sets levels
+ 4. **Analyzer** — reads the spectrum to verify the kick sits right, detects the key for the bass line, checks RMS to balance levels

- The technique memory extends this with user-specific data. As you save techniques, rate them, and tag them, the library becomes a structured representation of your production preferences. The agent reads across saved qualities at decision time — not to copy stored patterns, but to understand tendencies: swing ranges, kit preferences, harmonic language, arrangement density. New output is always generated; the memory informs the generation.
+ No other Ableton MCP server does this. Others have tools. LivePilot has tools + knowledge + perception + memory.

  ---

  ## What You Can Do

- - **Produce** — Create tracks, load instruments, program drum patterns, bass lines, chord progressions, and melodies
+ - **Produce** — Create tracks, load instruments from the atlas, program drum patterns, bass lines, chord progressions, and melodies — informed by your saved techniques
  - **Arrange** — Build full song structures in arrangement view with MIDI editing, cue points, automation, and timeline navigation
- - **Design sounds** — Browse Ableton's library, load presets, tweak every device parameter, chain effects
- - **Mix** — Set levels, panning, sends, and routing across all track types including return tracks and master. Run diagnostics to catch silent tracks and stale solos
- - **Remember and evolve** — Save techniques, build a personal style library, and let the agent learn your taste over time
+ - **Design sounds** — Browse Ableton's library, load presets, tweak every device parameter, chain effects, walk nested racks 6 levels deep
+ - **Mix with ears** — Set levels, panning, sends, and routing. Read the spectrum, check RMS, detect the key. The analyzer tells the AI what changed, not just what was set
+ - **Remember and evolve** — Save techniques, build a personal style library, replay past decisions exactly or as variations
+ - **Chop samples** — Load audio into Simpler, slice, reverse, crop, warp, and reprogram — all from conversation
  - **Iterate fast** — Transpose, humanize, quantize, duplicate, and reshape patterns through conversation

  ---
@@ -223,7 +244,7 @@ npx -y github:dreamrec/LivePilot --status

  ---

- ## 104 Tools Across 10 Domains
+ ## 127 Tools Across 11 Domains

  | Domain | Tools | What you can do |
  |--------|:-----:|-----------------|
@@ -233,10 +254,11 @@ npx -y github:dreamrec/LivePilot --status
  | **Notes** | 8 | Add/get/remove/modify MIDI notes, transpose, quantize, duplicate |
  | **Devices** | 12 | Load instruments & effects, tweak parameters, rack chains, presets — works on regular, return, and master tracks |
  | **Scenes** | 8 | Create, delete, duplicate, fire, rename, color, per-scene tempo |
- | **Mixing** | 8 | Volume, pan, sends, routing — return tracks and master fully supported |
+ | **Mixing** | 11 | Volume, pan, sends, routing, meters, mix snapshot — return tracks and master fully supported |
  | **Browser** | 4 | Search Ableton's library, browse categories, load presets |
  | **Arrangement** | 19 | Create clips, full MIDI note CRUD, cue points, recording, automation |
  | **Memory** | 8 | Save, recall, replay, and manage production techniques |
+ | **Analyzer** | 20 | Real-time spectral analysis, key detection, sample manipulation, warp markers, device introspection (requires M4L device) |

  <details>
  <summary><strong>Full tool list</strong></summary>
@@ -259,8 +281,8 @@ npx -y github:dreamrec/LivePilot --status
  ### Scenes (8)
  `get_scenes_info` · `create_scene` · `delete_scene` · `duplicate_scene` · `fire_scene` · `set_scene_name` · `set_scene_color` · `set_scene_tempo`

- ### Mixing (8)
- `set_track_volume` · `set_track_pan` · `set_track_send` · `get_return_tracks` · `get_master_track` · `set_master_volume` · `get_track_routing` · `set_track_routing`
+ ### Mixing (11)
+ `set_track_volume` · `set_track_pan` · `set_track_send` · `get_return_tracks` · `get_master_track` · `set_master_volume` · `get_track_routing` · `set_track_routing` · `get_track_meters` · `get_master_meters` · `get_mix_snapshot`

  ### Browser (4)
  `get_browser_tree` · `get_browser_items` · `search_browser` · `load_browser_item`
@@ -271,11 +293,14 @@ npx -y github:dreamrec/LivePilot --status
  ### Memory (8)
  `memory_learn` · `memory_recall` · `memory_get` · `memory_replay` · `memory_list` · `memory_favorite` · `memory_update` · `memory_delete`

+ ### Analyzer (20) — requires LivePilot Analyzer M4L device on master track
+ `get_master_spectrum` · `get_master_rms` · `get_detected_key` · `get_hidden_parameters` · `get_automation_state` · `walk_device_tree` · `get_clip_file_path` · `replace_simpler_sample` · `load_sample_to_simpler` · `get_simpler_slices` · `crop_simpler` · `reverse_simpler` · `warp_simpler` · `get_warp_markers` · `add_warp_marker` · `move_warp_marker` · `remove_warp_marker` · `scrub_clip` · `stop_scrub` · `get_display_values`
+
  </details>

  ---

- ## Claude Code Plugin
+ ## Plugin

  The plugin adds a skill, an autonomous agent, and 5 slash commands on top of the MCP tools.

@@ -295,45 +320,95 @@ claude plugin add github:dreamrec/LivePilot/plugin

  ### Producer Agent

- Autonomous agent that executes multi-step production tasks from high-level descriptions. Handles the full pipeline: session planning, track creation, instrument loading, MIDI programming, effect configuration, and mixingwith mandatory health checks between each stage to verify every track produces audible output.
+ Autonomous agent that handles multi-step production tasks end-to-end. "Make a lo-fi hip hop beat at 75 BPM" triggers a full pipeline: consult the technique memory for style context, search the device atlas for the right drum kit and instruments, create tracks, program MIDI, chain effects, set levels — then read the spectrum through the analyzer to verify everything sounds right.

- The agent ships with a 2,700-line reference corpus covering genre-specific drum patterns, chord voicings, sound design parameter recipes, mixing templates, and song structures. It consults the technique memory by default (see above), and can be overridden to work from a clean slate.
+ The agent ships with a 2,700-line reference corpus (drum patterns, chord voicings, sound design recipes, mixing templates) and consults the technique memory by default. Mandatory health checks between each stage verify every track produces audible output — the analyzer confirms what the meters suggest.

  ### Core Skill

- `livepilot-core` encodes operational discipline for the 104 tools: read state before writing, verify after every mutation, validate instrument loading (empty Drum Racks produce silence), never hallucinate device names (always `search_browser` first), use negative track indices for return tracks. Without it, an LLM with access to the tools will produce silent tracks and load wrong devices.
+ `livepilot-core` encodes the operational discipline that connects all three layers. It teaches the AI to consult the device atlas before loading instruments, read the analyzer after mixing moves, check technique memory before creative decisions, and verify every mutation through state reads. It enforces the rules that prevent silent failures: never load empty Drum Racks, never hallucinate device names, always verify audio output. Without it, an LLM with access to the tools will produce silent tracks and load wrong devices.
+
+ ---
+
+ ## M4L Analyzer
+
+ The LivePilot Analyzer (`LivePilot_Analyzer.amxd`) gives the AI ears. Drop it on the master track and 20 additional tools unlock: 8-band spectral analysis, RMS/peak metering, Krumhansl-Schmuckler key detection, plus deep LOM access for sample manipulation, warp markers, device introspection, and human-readable parameter display values.
+
+ All 107 core tools work without it. The analyzer is what turns LivePilot from a remote control into a feedback loop — the AI can set an EQ curve and then read the spectrum to verify the result.
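Krumhansl-Schmuckler key detection — the algorithm named above — is small enough to sketch in full: correlate a 12-bin pitch-class histogram against the published K-S major and minor profiles at all 12 rotations and keep the best match. This is a generic textbook implementation, not the shipped one.

```python
# Published Krumhansl-Schmuckler key profiles (C-rooted).
MAJOR = [6.35, 2.23, 3.48, 2.33, 4.38, 4.09, 2.52, 5.19, 2.39, 3.66, 2.29, 2.88]
MINOR = [6.33, 2.68, 3.52, 5.38, 2.60, 3.53, 2.54, 4.75, 3.98, 2.69, 3.34, 3.17]
NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def correlate(a, b):
    """Pearson correlation between two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = (sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b)) ** 0.5
    return num / den

def detect_key(histogram):
    """histogram: 12 pitch-class weights, index 0 = C."""
    best = None
    for tonic in range(12):
        rotated = histogram[tonic:] + histogram[:tonic]  # tonic-relative view
        for profile, mode in ((MAJOR, "major"), (MINOR, "minor")):
            r = correlate(rotated, profile)
            if best is None or r > best[0]:
                best = (r, f"{NAMES[tonic]} {mode}")
    return best[1]

# Accumulated pitch data weighted toward C and G:
hist = [3, 0, 1, 0, 1, 1, 0, 2, 0, 1, 0, 1]
print(detect_key(hist))   # → C major
```

In the real device the histogram is accumulated from the pitch tracker's output over several bars, which is why `get_detected_key` needs some playback time before it answers.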

  ---

- ## How It Compares
-
- | Feature | LivePilot | [AbletonMCP](https://github.com/ahujasid/ableton-mcp) | [Ableton MCP Extended](https://github.com/uisato/ableton-mcp-extended) |
- |---------|:---------:|:---------:|:---------:|
- | **Tools** | 104 | ~20 | ~50 |
- | **Arrangement view** | Full (clips, notes, cue points, automation) | No | Partial (automation "not perfect yet") |
- | **MIDI note editing** | Full CRUD with note IDs, probability, velocity deviation | Basic add/get | Add/get/modify |
- | **Device control** | Load, params, batch edit, rack chains, presets, Simpler modes | Load, basic params | Load, params |
- | **Browser search** | Tree navigation, path filtering, URI-based loading | Basic search | Search with categories |
- | **Mixing** | Volume, pan, sends, routing, master, diagnostics | Volume, pan | Volume, pan, sends |
- | **Undo support** | Full (begin/end_undo_step wrapping) | No | Partial |
- | **Session diagnostics** | Built-in health checks (armed tracks, solos, silent tracks) | No | No |
- | **Per-note probability** | Yes (Live 12 API) | No | No |
- | **Plugin/skills** | Claude Code plugin with 5 commands + producer agent | No | No |
- | **Voice generation** | No | No | Yes (ElevenLabs) |
- | **UDP low-latency mode** | No (TCP, reliable) | No | Yes (experimental) |
- | **Protocol** | JSON/TCP, single-client, structured errors | JSON/TCP | JSON/TCP + UDP |
- | **Installation** | Auto-detect CLI (`--install`) | Manual copy | Manual copy |
- | **Live version** | Live 12 (modern note API) | Live 11+ | Live 11+ |
- | **License** | MIT | MIT | MIT |
-
- LivePilot focuses on **comprehensive, deterministic control** with safety nets (undo wrapping, diagnostics, verification patterns). It trades real-time parameter streaming (Extended's UDP mode) and external service integration (Extended's ElevenLabs) for deeper coverage of Ableton's core operations — especially arrangement, device management, and MIDI editing with Live 12's modern note API.
+ ## The Landscape
+
+ There are **15+ MCP servers for Ableton Live** as of March 2026. Here's how the major ones compare:
+
+ ### At a Glance
+
+ | | [LivePilot](https://github.com/dreamrec/LivePilot) | [AbletonMCP](https://github.com/ahujasid/ableton-mcp) | [MCP Extended](https://github.com/uisato/ableton-mcp-extended) | [Ableton Copilot](https://github.com/xiaolaa2/ableton-copilot-mcp) | [AbletonBridge](https://github.com/hidingwill/AbletonBridge) | [Producer Pal](https://github.com/adamjmurray/producer-pal) |
+ |---|:-:|:-:|:-:|:-:|:-:|:-:|
+ | **Tools** | 127 | ~20 | ~35 | ~45 | 322 | ~25 |
+ | **Device knowledge** | 280+ devices | -- | -- | -- | -- | -- |
+ | **Audio analysis** | Spectrum/RMS/key | -- | -- | -- | Metering | -- |
+ | **Technique memory** | Persistent | -- | -- | -- | -- | -- |
+ | **Stars** | new | 2.3k | 139 | 72 | 13 | 103 |
+ | **Language** | Python | Python | Python | TypeScript | Python | TypeScript |
+ | **Active** | Yes | Slow | Yes | Yes | Yes | Yes |
+
+ ### Feature Comparison
+
+ | Capability | LivePilot | AbletonMCP | Extended | Copilot | Bridge | Producer Pal |
+ |---|:-:|:-:|:-:|:-:|:-:|:-:|
+ | Transport | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+ | Tracks (MIDI/audio/return) | ✅ | Partial | ✅ | ✅ | ✅ | ✅ |
+ | Session clips | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+ | **Arrangement view** | ✅ | — | — | ✅ | ? | ? |
+ | **Arrangement automation** | ✅ | — | — | — | ? | — |
+ | MIDI notes (add/get) | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
+ | **MIDI notes (modify/delete by ID)** | ✅ | — | — | ✅ | ? | — |
+ | **Per-note probability** | ✅ | — | — | — | — | — |
+ | Device loading | ✅ | ✅ | ✅ | ✅ | ✅ | ? |
+ | Device parameters | ✅ | Basic | ✅ | ✅ | ✅ | ? |
+ | **Batch parameter editing** | ✅ | — | — | — | ? | — |
+ | **Rack chains** | ✅ | — | — | — | ✅ | — |
+ | Browser (tree/search/URI) | ✅ | Basic | ✅ | ✅ | ✅ | — |
+ | **Plugin browser (AU/VST)** | ✅ | — | — | — | ? | — |
+ | Mixing (vol/pan/sends) | ✅ | Basic | ✅ | Basic | ✅ | ? |
+ | **Master track control** | ✅ | — | — | — | ✅ | — |
+ | Scenes | ✅ | — | ✅ | ? | ✅ | ✅ |
+ | **Undo wrapping** | ✅ | — | Partial | — | ? | — |
+ | **Session diagnostics** | ✅ | — | — | — | — | — |
+ | **Technique memory** | ✅ | — | — | — | — | — |
+ | **AI plugin (skills/agent)** | ✅ | — | — | — | — | — |
+ | **Device Atlas (built-in)** | ✅ | — | — | — | — | — |
+ | **Auto-detect installer** | ✅ | — | — | ✅ | — | — |
+ | Snapshots/rollback | — | — | — | ✅ | — | — |
+ | Voice generation | — | — | — | — | ✅ | — |
+ | **Real-time DSP analysis** | ✅ | — | — | — | ✅ | — |
+ | M4L-native install | — | — | — | — | — | ✅ |
+ | Multi-LLM support | Any MCP | Claude | Claude | Any MCP | Any MCP | Multi |
+
+ ### Also Notable
+
+ - **[Simon-Kansara](https://github.com/Simon-Kansara/ableton-live-mcp-server)** (369★) — OSC-based, exhaustive address mapping, inactive since 2025
+ - **[jpoindexter](https://github.com/jpoindexter/ableton-mcp)** — 200+ tools, triple interface (MCP + REST + M4L), 13 scales
+ - **[cafeTechne](https://github.com/cafeTechne/ableton-11-mcp-for-windows-codex-and-antigravity)** — 220+ tools, Windows/Codex optimized, Live 11 focused
+ - **[FabianTinkl](https://github.com/FabianTinkl/AbletonMCP)** — AI-powered chord/melody generation, genre-specific composition
+ - **[nozomi-koborinai](https://github.com/nozomi-koborinai/ableton-osc-mcp)** — Only Go implementation, uses Google Genkit
+
+ ### Where LivePilot Fits
+
+ Every server on this list gives the AI tools to control Ableton. LivePilot is the only one that also gives it **knowledge** (device atlas with 280+ devices, 139 kits, 350+ IRs), **perception** (real-time spectrum, RMS, key detection from the M4L analyzer), and **memory** (persistent technique library that accumulates production decisions across sessions).
+
+ The practical difference: other servers let the AI set a parameter. LivePilot lets the AI choose the right parameter based on what device is loaded (atlas), verify the result by reading the audio output (analyzer), and remember the technique for next time (memory).
+
+ AbletonBridge has more raw tools (322 vs 127). Producer Pal has the easiest install (drag a .amxd). The original AbletonMCP has the community (2.3k stars). LivePilot has the deepest integration — tools that execute, knowledge that informs, perception that verifies, and memory that accumulates.

  ---

  ## Architecture

  ```
- Claude / AI Client
+ AI Client
  │ MCP Protocol (stdio)
 
  ┌─────────────────────┐
@@ -367,7 +442,7 @@ All commands execute on Ableton's main thread via `schedule_message` — the sam
  | Max for Live devices | — | Yes |
  | Third-party VST/AU plugins | Yes | — |

- **Requirements:** Ableton Live 12 · Python 3.10+ · Node.js 18+
+ **Requirements:** Ableton Live 12 · Python 3.9+ · Node.js 18+

  ---

@@ -0,0 +1,161 @@
+ # LivePilot Analyzer — Max for Live Build Guide
+
+ Step-by-step instructions to build the `.amxd` device in Max for Live.
+ The device analyzes the master bus audio and streams data to LivePilot.
+
+ ## Prerequisites
+
+ - Ableton Live 12 Suite (includes Max for Live)
+ - The `livepilot_bridge.js` file from this directory
+
+ ## Step 1: Create New Device
+
+ 1. In Live, go to an empty Audio track
+ 2. Click **Create** → **Max Audio Effect** (or drag "Max Audio Effect" from browser)
+ 3. Click the **pencil icon** on the device title bar to open the Max editor
+
+ ## Step 2: Audio Pass-Through
+
+ The device MUST pass audio through unchanged.
+
+ 1. You'll see `[plugin~]` and `[plugout~]` already connected
+ 2. Verify: left outlet of `plugin~` → left inlet of `plugout~`
+ 3. Verify: right outlet of `plugin~` → right inlet of `plugout~`
+
+ ## Step 3: Mono Sum for Analysis
+
+ We tap the audio for analysis without affecting the pass-through.
+
+ 1. Add object: `[+~]` (adds L+R to mono)
+ 2. Connect: `plugin~` left outlet → `[+~]` left inlet
+ 3. Connect: `plugin~` right outlet → `[+~]` right inlet
+ 4. Add object: `[*~ 0.5]` (scale to prevent clipping)
+ 5. Connect: `[+~]` outlet → `[*~ 0.5]` inlet
+
+ ## Step 4: 8-Band Spectrum Analysis
+
+ 1. Add object: `[fffb~ 8]` (fast 8-band filter bank)
+ 2. Connect: `[*~ 0.5]` outlet → `[fffb~ 8]` inlet
+ 3. Set `fffb~` frequencies in Inspector or via message:
+    - Band 1: 40 Hz (sub)
+    - Band 2: 130 Hz (low)
+    - Band 3: 350 Hz (low-mid)
+    - Band 4: 1000 Hz (mid)
+    - Band 5: 3000 Hz (high-mid)
+    - Band 6: 6000 Hz (high)
+    - Band 7: 10000 Hz (presence)
+    - Band 8: 16000 Hz (air)
+
+    To set: add `[loadmess 40 130 350 1000 3000 6000 10000 16000]` → `[fffb~ 8]` right inlet
+
+ 4. For each of the 8 outlets of `[fffb~ 8]`:
+    - Add `[abs~]` (rectify to positive)
+    - Add `[snapshot~ 200]` (sample at 5 Hz)
+
+ 5. Add `[pack f f f f f f f f]` and connect all 8 `[snapshot~]` outlets to it
+ 6. Add `[prepend /spectrum]` → connect from `[pack]`
+ 7. Add `[udpsend 127.0.0.1 9880]` → connect from `[prepend]`
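On the receiving side, Max's `udpsend` transmits OSC-framed packets, so the `/spectrum` stream can be decoded with a few lines of stdlib Python. This sketch builds one sample packet and parses it; in practice you would bind a UDP socket to `127.0.0.1:9880` and decode each incoming datagram the same way. The parser handles only the flat all-float messages this device emits.

```python
import struct

def pad4(b: bytes) -> bytes:
    """OSC strings are null-terminated and padded to a 4-byte boundary."""
    return b + b"\x00" * (4 - len(b) % 4)

def parse_osc_floats(packet):
    """Return (address, [floats]) from a flat OSC float message."""
    end = packet.index(b"\x00")
    address = packet[:end].decode()
    offset = (end + 4) & ~3                       # skip address padding
    tag_end = packet.index(b"\x00", offset)
    tags = packet[offset + 1:tag_end].decode()    # drop the leading ','
    offset = (tag_end + 4) & ~3                   # skip type-tag padding
    values = struct.unpack(f">{len(tags)}f",
                           packet[offset:offset + 4 * len(tags)])
    return address, list(values)

# What [pack] → [prepend /spectrum] → [udpsend] would put on the wire:
bands = [0.4, 0.5, 0.3, 0.2, 0.15, 0.1, 0.08, 0.05]
packet = pad4(b"/spectrum") + pad4(b"," + b"f" * 8) + struct.pack(">8f", *bands)

address, decoded = parse_osc_floats(packet)
print(address, [round(v, 2) for v in decoded])
```

The bridge's own `udpreceive` path works the same way in reverse: commands arrive on port 9881, responses go back out on 9880.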
+
+ ## Step 5: Peak and RMS Metering
+
+ 1. Add `[peakamp~ 200]` → connect from `[*~ 0.5]`
+ 2. `[peakamp~]` reports its peak as a float every 200 ms — no `[snapshot~]` needed
+ 3. Add `[prepend /peak]` → `[udpsend]` (same udpsend as spectrum)
+
+ 4. Add `[average~ 200 rms]` → connect from `[*~ 0.5]`
+ 5. Add `[snapshot~ 200]` → connect from `[average~]` (its output is a signal)
+ 6. Add `[prepend /rms]` → `[udpsend]`
+
+ ## Step 6: Pitch Tracking
+
+ 1. Add `[sigmund~ pitch env @npts 2048]`
+ 2. Connect from `[*~ 0.5]` outlet
+ 3. Left outlet (pitch as MIDI note) → wire to JS (see Step 7; `sigmund~` outputs float messages, so no `[snapshot~]` is needed)
+ 4. Right outlet (envelope/amplitude) → wire to JS
75
+
76
+ ## Step 7: JavaScript Bridge
77
+
78
+ 1. Add `[js livepilot_bridge.js]`
79
+ - Copy `livepilot_bridge.js` into the same folder as the `.amxd`
80
+ - Or use Max's File Preferences to add the m4l_device folder
81
+
82
+ 2. Add `[live.thisdevice]`
83
+ - Connect its left outlet (bang on load) → `[js]` inlet
84
+
85
+ 3. Add `[udpreceive 9881]` (incoming commands from MCP server)
86
+ - Connect outlet → `[js]` inlet (messages route via `anything()`)
87
+
88
+ 4. Connect `[js]` outlet 0 → `[udpsend 127.0.0.1 9880]` (responses)
89
+
90
+ 5. Connect pitch tracking to JS for key detection:
91
+ - `[sigmund~]` pitch → `[prepend pitch_in]` → `[js]` inlet
92
+ - Wire amplitude after pitch: pack both into the prepend
93
+
94
+ Specifically:
95
+ - `[sigmund~ pitch env]` left outlet → first inlet of `[pack f f]`
96
+ - `[sigmund~ pitch env]` right outlet → second inlet of `[pack f f]`
97
+ - `[pack f f]` → `[prepend pitch_in]` → `[js livepilot_bridge.js]` inlet
98
+
+ ## Step 8: UI (Optional but Recommended)
+
+ ### Status LED
+ 1. Add `[live.text]` — set to "Connected" in Inspector
+ 2. Connect `[js]` outlet 1 → route "status" messages to `[live.text]`
+
+ ### Spectrum Display (cosmetic)
+ 1. Add `[multislider]` — 8 sliders, vertical, range 0-1
+ 2. Connect the same `[pack f f f f f f f f]` from Step 4 → `[multislider]`
+ 3. Set size to ~100 × 40 px, no border, dark theme colors
+
+ ### Key Display
+ 1. Add `[live.text]` or `[comment]` — shows detected key
+ 2. Route "key" messages from `[js]` outlet 1
+
+ ### LivePilot Branding
+ 1. Add `[fpic]` with a small LivePilot logo PNG (white on dark)
+ 2. Or add `[comment]` with text "LivePilot Analyzer"
+
+ ### Device Size
+ - In Max Inspector, set presentation mode dimensions: **258 × 80 px** (standard M4L width)
+ - Switch to Presentation Mode (Cmd+Alt+E) and arrange UI elements
+
+ ## Step 9: Save and Install
+
+ 1. Click **Save** (Cmd+S) in Max editor
+ 2. Name it `LivePilot_Analyzer.amxd`
+ 3. Save to: `~/Music/Ableton/User Library/Presets/Audio Effects/Max Audio Effect/`
+ 4. Close the Max editor
+ 5. The device now appears in Live's browser under Audio Effects → Max Audio Effect
+
+ ## Step 10: Test
+
+ 1. Drop `LivePilot Analyzer` on the **master track**
+ 2. Play some audio
+ 3. In Claude Code, run: `get_master_spectrum` — should return 8 band values
+ 4. Run: `get_master_rms` — should return RMS and peak
+ 5. After 8+ bars: `get_detected_key` — should return key and scale
+
+ ## Signal Flow Summary
+
+ ```
+ ┌─────────────────────────────────────────────────┐
+ │ LivePilot_Analyzer.amxd │
+ │ │
+ plugin~ ──┤──L+R──► plugout~ (pass-through) │
+ │ │
+ │──L+R──► +~ ──► *~ 0.5 ──┬──► fffb~ 8 ──► UDP │
+ │ ├──► peakamp~ ──► UDP │
+ │ ├──► average~ ──► UDP │
+ │ └──► sigmund~ ──► JS │
+ │ │
+ │ udpreceive 9881 ──► JS ──► udpsend 9880 │
+ │ live.thisdevice ──► JS │
+ └─────────────────────────────────────────────────┘
+ ```
+
+ ## Troubleshooting
+
+ - **"JS file not found"**: Copy `livepilot_bridge.js` to the same folder as the `.amxd`, or add the folder to Max's File Preferences
+ - **No spectrum data**: Check that audio is playing through the master, and `udpsend` is targeting `127.0.0.1 9880`
+ - **High CPU**: Remove any `spectroscope~` or `meter~` objects — these are GUI-heavy. Our analysis uses no GUI.
+ - **Clicks/artifacts**: The `+~` and analysis chain should NOT feed back into `plugout~`. Only the direct plugin~ → plugout~ connection carries audio.