@viji-dev/core 0.3.26 → 0.3.28

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/dist/docs-api.js CHANGED
@@ -1,7 +1,7 @@
  export const docsApi = {
  "version": "1.0.0",
- "coreVersion": "0.3.26",
- "generatedAt": "2026-03-30T19:41:22.735Z",
+ "coreVersion": "0.3.27",
+ "generatedAt": "2026-04-04T20:34:40.458Z",
  "navigation": [
  {
  "id": "getting-started",
@@ -314,6 +314,11 @@ export const docsApi = {
  "id": "native-ext-sensors",
  "title": "Device Sensors",
  "path": "native/external-devices/sensors"
+ },
+ {
+ "id": "native-ext-audio",
+ "title": "Device Audio",
+ "path": "native/external-devices/audio"
  }
  ]
  }
@@ -566,6 +571,11 @@ export const docsApi = {
  "id": "p5-ext-sensors",
  "title": "Device Sensors",
  "path": "p5/external-devices/sensors"
+ },
+ {
+ "id": "p5-ext-audio",
+ "title": "Device Audio",
+ "path": "p5/external-devices/audio"
  }
  ]
  }
@@ -798,6 +808,11 @@ export const docsApi = {
  "id": "shader-ext-sensors",
  "title": "Sensor Uniforms",
  "path": "shader/external-devices/sensors"
+ },
+ {
+ "id": "shader-ext-audio",
+ "title": "Audio Uniforms",
+ "path": "shader/external-devices/audio"
  }
  ]
  },
@@ -832,7 +847,7 @@ export const docsApi = {
  },
  {
  "type": "text",
- "markdown": "A few things to notice:\n\n- **[`viji.slider()`](/native/parameters/slider)** creates a UI slider the user can adjust — defined once at the top level, read via `.value` inside `render()`.\n- **[`viji.deltaTime`](/native/timing)** is the time since the last frame in seconds — use it with an accumulator (`angle +=`) for smooth, frame-rate-independent animation that doesn't jump when you change parameters.\n- **[`viji.width`](/native/canvas-context) / [`viji.height`](/native/canvas-context)** keep your scene resolution-agnostic.\n- **`render(viji)`** is called every frame. This is where you draw.\n\n## What You Can Access\n\nEverything is available through the `viji` object:\n\n| Category | What It Gives You |\n|----------|------------------|\n| **Canvas** | [`viji.canvas`](/native/canvas-context), [`viji.width`](/native/canvas-context), [`viji.height`](/native/canvas-context) |\n| **Timing** | [`viji.time`](/native/timing), [`viji.deltaTime`](/native/timing), [`viji.frameCount`](/native/timing), [`viji.fps`](/native/timing) |\n| **Parameters** | [`viji.slider()`](/native/parameters/slider), [`viji.color()`](/native/parameters/color), [`viji.toggle()`](/native/parameters/toggle), [`viji.select()`](/native/parameters/select), [`viji.number()`](/native/parameters/number), [`viji.text()`](/native/parameters/text), [`viji.image()`](/native/parameters/image), [`viji.button()`](/native/parameters/button) |\n| **Audio** | [Volume, frequency bands, beat detection, spectral analysis, FFT & waveform data](/native/audio) |\n| **Video & CV** | [Video frames, face detection, hand tracking, pose estimation, body segmentation](/native/video) |\n| **Interaction** | Unified pointer, mouse buttons & wheel, keyboard state, multi-touch with pressure & velocity |\n| **Sensors** | Accelerometer, gyroscope, device orientation |\n\n## Three Ways to Create\n\nViji supports three rendering modes:\n\n| Renderer | Best For | Entry Point |\n|----------|----------|-------------|\n| 
**Native** | Full control — Canvas 2D, WebGL, Three.js | `render(viji)` |\n| **P5.js** | Artists familiar with Processing / P5.js | `render(viji, p5)` |\n| **Shader** | GPU effects, raymarching, generative patterns | `void main()` in GLSL |\n\nAll three share the same audio, video, parameter, and interaction APIs. See [Renderers Overview](../renderers-overview/) for how each works.\n\n## Next Steps\n\n- [Renderers Overview](../renderers-overview/) — how to choose and use each renderer\n- [Best Practices](../best-practices/) — essential patterns for robust, performant scenes\n- [Common Mistakes](../common-mistakes/) — pitfalls to avoid\n- [Native Quick Start](/native/quickstart) — build with JavaScript and full canvas control\n- [P5 Quick Start](/p5/quickstart) — build with the familiar P5.js API\n- [Shader Quick Start](/shader/quickstart) — build with GLSL fragment shaders"
+ "markdown": "A few things to notice:\n\n- **[`viji.slider()`](/native/parameters/slider)** creates a UI slider the user can adjust — defined once at the top level, read via `.value` inside `render()`.\n- **[`viji.deltaTime`](/native/timing)** is the time since the last frame in seconds — use it with an accumulator (`angle +=`) for smooth, frame-rate-independent animation that doesn't jump when you change parameters.\n- **[`viji.width`](/native/canvas-context) / [`viji.height`](/native/canvas-context)** keep your scene resolution-agnostic.\n- **`render(viji)`** is called every frame. This is where you draw.\n\n## What You Can Access\n\nEverything is available through the `viji` object:\n\n| Category | What It Gives You |\n|----------|------------------|\n| **Canvas** | [`viji.canvas`](/native/canvas-context), [`viji.width`](/native/canvas-context), [`viji.height`](/native/canvas-context) |\n| **Timing** | [`viji.time`](/native/timing), [`viji.deltaTime`](/native/timing), [`viji.frameCount`](/native/timing), [`viji.fps`](/native/timing) |\n| **Parameters** | [`viji.slider()`](/native/parameters/slider), [`viji.color()`](/native/parameters/color), [`viji.toggle()`](/native/parameters/toggle), [`viji.select()`](/native/parameters/select), [`viji.number()`](/native/parameters/number), [`viji.text()`](/native/parameters/text), [`viji.image()`](/native/parameters/image), [`viji.button()`](/native/parameters/button) |\n| **Audio** | [Volume, bands, beat, spectral, FFT & waveform](/native/audio). **Multi-stream:** the host can supply extra sources as `viji.audioStreams[]` and device mic as `device.audio` (same analysis as main audio except beat/BPM, which stay on `viji.audio` only). 
|\n| **Video & CV** | [Video frames, face detection, hand tracking, pose estimation, body segmentation](/native/video) |\n| **Interaction** | Unified pointer, mouse buttons & wheel, keyboard state, multi-touch with pressure & velocity |\n| **Sensors** | Accelerometer, gyroscope, device orientation |\n\n## Three Ways to Create\n\nViji supports three rendering modes:\n\n| Renderer | Best For | Entry Point |\n|----------|----------|-------------|\n| **Native** | Full control — Canvas 2D, WebGL, Three.js | `render(viji)` |\n| **P5.js** | Artists familiar with Processing / P5.js | `render(viji, p5)` |\n| **Shader** | GPU effects, raymarching, generative patterns | `void main()` in GLSL |\n\nAll three share the same audio, video, parameter, and interaction APIs. See [Renderers Overview](../renderers-overview/) for how each works.\n\n## Next Steps\n\n- [Renderers Overview](../renderers-overview/) — how to choose and use each renderer\n- [Best Practices](../best-practices/) — essential patterns for robust, performant scenes\n- [Common Mistakes](../common-mistakes/) — pitfalls to avoid\n- [Native Quick Start](/native/quickstart) — build with JavaScript and full canvas control\n- [P5 Quick Start](/p5/quickstart) — build with the familiar P5.js API\n- [Shader Quick Start](/shader/quickstart) — build with GLSL fragment shaders"
  }
  ]
  },
@@ -843,7 +858,7 @@ export const docsApi = {
  "content": [
  {
  "type": "text",
- "markdown": "# Renderers Overview\n\nViji supports three rendering modes. Each produces visuals on the same canvas and shares the same Artist API for parameters, audio, video, interaction, and sensors. The difference is the language and paradigm you use to draw.\n\n## Choosing a Renderer\n\nThe renderer is selected by a **comment directive** at the top of your scene code:\n\n```javascript\n// @renderer p5 → P5.js renderer\n// @renderer shader → Shader renderer\n// (no directive) → Native renderer (default)\n```\n\nIf no `@renderer` directive is present, the scene runs in **Native** mode. You can also write `// @renderer native` for clarity, but it's not required.\n\n> [!IMPORTANT]\n> P5 and shader scenes must declare their renderer type as the first comment:\n> ```\n> // @renderer p5\n> ```\n> or\n> ```\n> // @renderer shader\n> ```\n> Without this directive, the scene defaults to the native renderer.\n\n---\n\n## Native Renderer\n\nThe native renderer gives you direct access to the canvas. Write standard JavaScript (or TypeScript) and use any rendering approach: Canvas 2D, WebGL, or external libraries like Three.js.\n\n**Entry point:** `render(viji)`"
+ "markdown": "# Renderers Overview\n\nViji supports three rendering modes. Each produces visuals on the same canvas and shares the same Artist API for parameters, audio (main stream plus optional **additional audio streams** from the host), video, interaction, and sensors. The difference is the language and paradigm you use to draw.\n\n## Choosing a Renderer\n\nThe renderer is selected by a **comment directive** at the top of your scene code:\n\n```javascript\n// @renderer p5 → P5.js renderer (2D default canvas)\n// @renderer p5 webgl → P5.js with WEBGL main canvas (3D / shaders)\n// @renderer shader → Shader renderer\n// (no directive) → Native renderer (default)\n```\n\nIf no `@renderer` directive is present, the scene runs in **Native** mode. You can also write `// @renderer native` for clarity, but it's not required.\n\n> [!IMPORTANT]\n> P5 and shader scenes must declare their renderer type as the first comment — for example:\n> ```\n> // @renderer p5\n> ```\n> For P5’s **WEBGL** main canvas, use `// @renderer p5 webgl` as the first line instead of `// @renderer p5`.\n> Shader scenes:\n> ```\n> // @renderer shader\n> ```\n> Without a matching directive, the scene defaults to the native renderer.\n\n---\n\n## Native Renderer\n\nThe native renderer gives you direct access to the canvas. Write standard JavaScript (or TypeScript) and use any rendering approach: Canvas 2D, WebGL, or external libraries like Three.js.\n\n**Entry point:** `render(viji)`"
  },
  {
  "type": "live-example",
@@ -863,17 +878,23 @@ export const docsApi = {
  },
  {
  "type": "text",
- "markdown": "See [External Libraries](/native/external-libraries) for detailed patterns with Three.js and other libraries.\n\n---\n\n## P5.js Renderer\n\nThe P5.js renderer provides the familiar Processing/P5.js creative coding API. Viji loads P5.js automatically when you use `// @renderer p5` — no installation or setup required.\n\n**Entry points:** `render(viji, p5)` (required), `setup(viji, p5)` (optional)"
+ "markdown": "See [External Libraries](/native/external-libraries) for detailed patterns with Three.js and other libraries.\n\n---\n\n## P5.js Renderer\n\nThe P5.js renderer provides the familiar Processing/P5.js creative coding API. Viji loads P5.js automatically when you use `// @renderer p5` — no installation or setup required. Add **`webgl`** after `p5` on that same line (`// @renderer p5 webgl`) to use P5’s WEBGL renderer for 3D and shader-based drawing on the main canvas.\n\n**Entry points:** `render(viji, p5)` (required), `setup(viji, p5)` (optional)"
  },
  {
  "type": "live-example",
- "title": "P5.js Renderer",
+ "title": "P5.js — 2D",
  "sceneCode": "// @renderer p5\r\n\r\nconst speed = viji.slider(2, { min: 0.5, max: 10, label: 'Speed' });\r\n\r\nlet angle = 0;\r\n\r\nfunction setup(viji, p5) {\r\n p5.colorMode(p5.HSB);\r\n}\r\n\r\nfunction render(viji, p5) {\r\n angle += speed.value * viji.deltaTime;\r\n\r\n p5.background(0);\r\n\r\n const x = p5.width / 2 + p5.cos(angle) * p5.width * 0.3;\r\n const y = p5.height / 2 + p5.sin(angle) * p5.height * 0.3;\r\n const size = p5.min(p5.width, p5.height) * 0.1;\r\n\r\n p5.fill(viji.time * 50 % 360, 80, 100);\r\n p5.noStroke();\r\n p5.circle(x, y, size);\r\n}\r\n",
  "sceneFile": "renderers-p5.scene.js"
  },
+ {
+ "type": "live-example",
+ "title": "P5.js — WEBGL",
+ "sceneCode": "// @renderer p5 webgl\n\nconst speed = viji.slider(1.2, { min: 0.2, max: 4, step: 0.1, label: 'Spin' });\n\nfunction setup(viji, p5) {\n p5.angleMode(p5.RADIANS);\n}\n\nfunction render(viji, p5) {\n p5.background(24, 18, 40);\n p5.ambientLight(120);\n p5.directionalLight(255, 255, 255, 0.3, -0.4, -0.85);\n\n const t = viji.time * speed.value;\n const r = p5.min(p5.width, p5.height) * 0.22;\n\n p5.push();\n p5.rotateX(t * 0.7);\n p5.rotateY(t);\n p5.normalMaterial();\n p5.box(r);\n p5.pop();\n}\n",
+ "sceneFile": "renderers-p5-webgl.scene.js"
+ },
  {
  "type": "text",
- "markdown": "> [!WARNING]\n> Viji uses P5 in **instance mode**. All P5 functions require the `p5.` prefix:\n> ```javascript\n> // Correct\n> p5.background(0);\n> p5.circle(p5.width / 2, p5.height / 2, 100);\n>\n> // Wrong — will throw ReferenceError\n> background(0);\n> circle(width / 2, height / 2, 100);\n> ```\n\n**Key characteristics:**\n\n- **`setup()` is optional.** Use it for one-time configuration like `p5.colorMode()`. If you don't need it, omit it entirely.\n- **`render()` replaces `draw()`.** P5's built-in draw loop is disabled; Viji calls your `render()` function each frame.\n- **No `createCanvas()`.** The canvas is created and managed by Viji.\n- **Viji APIs for input.** Use [`viji.pointer`](/p5/pointer) for cross-device interactions, or [`viji.mouse`](/p5/mouse), [`viji.keyboard`](/p5/keyboard), [`viji.touches`](/p5/touch) for device-specific access — instead of P5's `mouseX`, `keyIsPressed`, etc.\n- **No `preload()`.** Load assets using Viji's [`viji.image()`](/native/parameters/image) parameter, or use `fetch()` in `setup()`.\n\nIf you have existing P5.js sketches, see [Converting P5 Sketches](/p5/converting-sketches) for a step-by-step migration guide.\n\n---\n\n## Shader Renderer\n\nThe shader renderer lets you write GLSL fragment shaders that run directly on the GPU. Viji automatically injects all uniform declarations — you write only your helper functions and `void main()`.\n\n**Entry point:** `void main()` (GLSL)"
+ "markdown": "> [!WARNING]\n> Viji uses P5 in **instance mode**. All P5 functions require the `p5.` prefix:\n> ```javascript\n> // Correct\n> p5.background(0);\n> p5.circle(p5.width / 2, p5.height / 2, 100);\n>\n> // Wrong — will throw ReferenceError\n> background(0);\n> circle(width / 2, height / 2, 100);\n> ```\n\n**Key characteristics:**\n\n- **`setup()` is optional.** Use it for one-time configuration like `p5.colorMode()`. If you don't need it, omit it entirely.\n- **`render()` replaces `draw()`.** P5's built-in draw loop is disabled; Viji calls your `render()` function each frame.\n- **No `createCanvas()`.** The canvas is created and managed by Viji. Mode (2D vs WEBGL) comes from the directive: `// @renderer p5` or `// @renderer p5 webgl` — never call `createCanvas(..., p5.WEBGL)` yourself.\n- **Viji APIs for input.** Use [`viji.pointer`](/p5/pointer) for cross-device interactions, or [`viji.mouse`](/p5/mouse), [`viji.keyboard`](/p5/keyboard), [`viji.touches`](/p5/touch) for device-specific access — instead of P5's `mouseX`, `keyIsPressed`, etc.\n- **No `preload()`.** Load assets using Viji's [`viji.image()`](/native/parameters/image) parameter, or use `fetch()` in `setup()`.\n\nIf you have existing P5.js sketches, see [Converting P5 Sketches](/p5/converting-sketches) for a step-by-step migration guide.\n\n---\n\n## Shader Renderer\n\nThe shader renderer lets you write GLSL fragment shaders that run directly on the GPU. Viji automatically injects all uniform declarations — you write only your helper functions and `void main()`.\n\n**Entry point:** `void main()` (GLSL)"
  },
  {
  "type": "live-example",
@@ -883,7 +904,7 @@ export const docsApi = {
  },
  {
  "type": "text",
- "markdown": "> [!NOTE]\n> The Viji shader renderer automatically injects `precision mediump float;` and all `uniform` declarations — both built-in uniforms (`u_resolution`, `u_time`, etc.) and parameter uniforms from `@viji-*` directives. Write only your helper functions and `void main() { ... }`. Do NOT redeclare `precision` or any uniforms — they will conflict.\n\n**Key characteristics:**\n\n- **Fragment shader only.** Viji renders a fullscreen quad; your shader defines the color of every pixel.\n- **GLSL ES 1.00 by default.** If you add `#version 300 es` as the first line, Viji switches to WebGL 2. Note that ES 3.00 requires `out vec4` for output instead of `gl_FragColor`, and `texture()` instead of `texture2D()`. ES 1.00 is recommended for maximum compatibility.\n- **Built-in uniforms** like `u_time`, `u_resolution`, `u_mouse`, `u_audioVolume`, `u_video`, and many more are always available — no declaration needed.\n- **Parameters via comments.** Declare parameters with `// @viji-TYPE:uniformName key:value` syntax. They become uniforms automatically.\n- **Accumulators for smooth animation.** Use `// @viji-accumulator:phase rate:speed` instead of `u_time * speed` — the value grows smoothly without jumping when the rate parameter changes.\n- **No `u_` prefix for your parameters.** The `u_` prefix is reserved for Viji's built-in uniforms. 
Name your parameters descriptively: `speed`, `colorMix`, `intensity`.\n\nIf you have existing Shadertoy shaders, see [Shadertoy Compatibility](/shader/shadertoy) for a compatibility layer that lets you paste code with minimal changes.\n\n---\n\n## Comparison\n\n| | Native | P5.js | Shader |\n|---|--------|-------|--------|\n| **Language** | JavaScript / TypeScript | JavaScript with P5 API | GLSL ES 1.00 (or 3.00 with `#version 300 es`) |\n| **Directive** | None (default) | `// @renderer p5` | `// @renderer shader` |\n| **Entry point** | `render(viji)` | `render(viji, p5)` | `void main()` |\n| **Setup** | Top-level code + `await` | Optional `setup(viji, p5)` | N/A |\n| **Canvas access** | [`viji.useContext('2d'/'webgl'/'webgl2')`](/native/canvas-context) | P5 drawing functions | Automatic fullscreen quad |\n| **External libraries** | Yes (`await import(...)`) | P5.js only | No |\n| **Best for** | Full control, WebGL, Three.js | Familiar P5 workflows | GPU effects, raymarching |\n| **Parameters** | [`viji.slider()`](/native/parameters/slider), etc. | [`viji.slider()`](/native/parameters/slider), etc. | `// @viji-slider:name ...` |\n\n## Next Steps\n\n- [Native Quick Start](/native/quickstart) — build your first native scene\n- [P5 Quick Start](/p5/quickstart) — build your first P5.js scene\n- [Shader Quick Start](/shader/quickstart) — build your first shader\n- [Best Practices](../best-practices/) — essential patterns all artists should follow"
+ "markdown": "> [!NOTE]\n> The Viji shader renderer automatically injects `precision mediump float;` and all `uniform` declarations — both built-in uniforms (`u_resolution`, `u_time`, etc.) and parameter uniforms from `@viji-*` directives. Write only your helper functions and `void main() { ... }`. Do NOT redeclare `precision` or any uniforms — they will conflict.\n\n**Key characteristics:**\n\n- **Fragment shader only.** Viji renders a fullscreen quad; your shader defines the color of every pixel.\n- **GLSL ES 1.00 by default.** If you add `#version 300 es` as the first line, Viji switches to WebGL 2. Note that ES 3.00 requires `out vec4` for output instead of `gl_FragColor`, and `texture()` instead of `texture2D()`. ES 1.00 is recommended for maximum compatibility.\n- **Built-in uniforms** like `u_time`, `u_resolution`, `u_mouse`, `u_audioVolume`, `u_video`, per-stream video (`u_videoStream*`), and per-stream audio scalars (`u_audioStreamCount`, `u_audioStream{i}*`) when the host provides extra streams, and many more are always available — no declaration needed.\n- **Parameters via comments.** Declare parameters with `// @viji-TYPE:uniformName key:value` syntax. They become uniforms automatically.\n- **Accumulators for smooth animation.** Use `// @viji-accumulator:phase rate:speed` instead of `u_time * speed` — the value grows smoothly without jumping when the rate parameter changes.\n- **No `u_` prefix for your parameters.** The `u_` prefix is reserved for Viji's built-in uniforms. 
Name your parameters descriptively: `speed`, `colorMix`, `intensity`.\n\nIf you have existing Shadertoy shaders, see [Shadertoy Compatibility](/shader/shadertoy) for a compatibility layer that lets you paste code with minimal changes.\n\n---\n\n## Comparison\n\n| | Native | P5.js | Shader |\n|---|--------|-------|--------|\n| **Language** | JavaScript / TypeScript | JavaScript with P5 API | GLSL ES 1.00 (or 3.00 with `#version 300 es`) |\n| **Directive** | None (default) | `// @renderer p5` or `// @renderer p5 webgl` | `// @renderer shader` |\n| **Entry point** | `render(viji)` | `render(viji, p5)` | `void main()` |\n| **Setup** | Top-level code + `await` | Optional `setup(viji, p5)` | N/A |\n| **Canvas access** | [`viji.useContext('2d'/'webgl'/'webgl2')`](/native/canvas-context) | P5 drawing functions | Automatic fullscreen quad |\n| **External libraries** | Yes (`await import(...)`) | P5.js only | No |\n| **Best for** | Full control, WebGL, Three.js | Familiar P5 workflows | GPU effects, raymarching |\n| **Parameters** | [`viji.slider()`](/native/parameters/slider), etc. | [`viji.slider()`](/native/parameters/slider), etc. | `// @viji-slider:name ...` |\n\n## Next Steps\n\n- [Native Quick Start](/native/quickstart) — build your first native scene\n- [P5 Quick Start](/p5/quickstart) — build your first P5.js scene\n- [Shader Quick Start](/shader/quickstart) — build your first shader\n- [Best Practices](../best-practices/) — essential patterns all artists should follow"
  }
  ]
  },
@@ -894,7 +915,7 @@ export const docsApi = {
  "content": [
  {
  "type": "text",
- "markdown": "# Best Practices\n\nThese practices apply to all three renderers (Native, P5, Shader). Following them ensures your scenes look correct at any resolution, run smoothly at any frame rate, and work reliably across devices.\n\n---\n\n## Use `viji.time` and `viji.deltaTime` for Animation\n\nViji provides two timing values. Use the right one for the job:\n\n- **[`viji.time`](/native/timing)** — seconds since the scene started. Use this for most animations (oscillations, rotations, color cycling). This is the most common choice.\n- **[`viji.deltaTime`](/native/timing)** — seconds since the last frame. Use this when you need to accumulate values smoothly regardless of frame rate (movement, physics, fading).\n\n```javascript\n// viji.time — animation that looks identical regardless of frame rate\nconst angle = viji.time * speed.value;\nconst x = Math.cos(angle) * radius;\n\n// viji.deltaTime — accumulation that stays smooth at any FPS\nposition += velocity * viji.deltaTime;\nopacity -= fadeRate * viji.deltaTime;\n```\n\nFor shaders, the equivalents are `u_time` and `u_deltaTime`. When animation speed is driven by a parameter, use an [**accumulator**](/shader/parameters/accumulator) to avoid jumps:\n\n```glsl\n// Instead of: float wave = sin(u_time * speed); ← jumps when slider moves\n// @viji-accumulator:phase rate:speed\nfloat wave = sin(phase + uv.x * 10.0); // smooth at any slider value\n```\n\n> [!NOTE]\n> Always use [`viji.time`](/native/timing) or [`viji.deltaTime`](/native/timing) for animation. Never count frames or assume a specific frame rate — the host application may run your scene at different rates (`full` or `half` mode) or the actual FPS may vary by device.\n\n---\n\n## Design for Any Resolution\n\nThe host application controls your scene's resolution. It may change at any time (window resize, resolution scaling for performance, high-DPI displays). 
Never hardcode pixel values.\n\n**Use [`viji.width`](/native/canvas-context) and [`viji.height`](/native/canvas-context)** for all positioning and sizing:\n\n```javascript\n// Good — scales to any resolution\nconst centerX = viji.width / 2;\nconst centerY = viji.height / 2;\nconst radius = Math.min(viji.width, viji.height) * 0.1;\n\n// Bad — breaks at different resolutions\nconst centerX = 960;\nconst centerY = 540;\nconst radius = 50;\n```\n\nFor parameters that control sizes, use normalized values (0–1) and multiply by canvas dimensions:\n\n```javascript\nconst size = viji.slider(0.15, { min: 0.02, max: 0.5, label: 'Size' });\n\nfunction render(viji) {\n const pixelSize = size.value * Math.min(viji.width, viji.height);\n}\n```\n\nFor shaders, use `u_resolution`:\n\n```glsl\nvec2 uv = gl_FragCoord.xy / u_resolution; // normalized 0–1 coordinates\n```\n\n> [!NOTE]\n> Always use [`viji.width`](/native/canvas-context) and [`viji.height`](/native/canvas-context) for positioning and sizing, and [`viji.deltaTime`](/native/timing) for frame-rate-independent animation. Never hardcode pixel values or assume a specific frame rate.\n\n---\n\n## Declare Parameters at the Top Level\n\nParameter functions ([`viji.slider()`](/native/parameters/slider), [`viji.color()`](/native/parameters/color), etc.) register controls with the host application. They must be called **once**, at the top level of your scene code — never inside `render()`.\n\n```javascript\n// Correct — declared once at top level\nconst speed = viji.slider(1, { min: 0.1, max: 5, label: 'Speed' });\nconst bgColor = viji.color('#1a1a2e', { label: 'Background' });\n\nfunction render(viji) {\n // Read current values inside render\n const s = speed.value;\n const bg = bgColor.value;\n}\n```\n\n> [!NOTE]\n> Parameters must be defined at the top level of your scene, not inside `render()`. They are registered once during initialization. 
Defining them inside `render()` would re-register the parameter every frame, resetting its value to the default and making user changes ineffective.\n\n---\n\n## Avoid Allocations in the Render Loop\n\nCreating objects, arrays, or strings inside `render()` triggers garbage collection, causing frame drops and stuttering. Pre-allocate at the top level and reuse.\n\n> [!TIP]\n> Avoid allocating objects, arrays, or strings inside `render()`. Pre-allocate at the top level and reuse them:\n> ```javascript\n> // Good — pre-allocated\n> const pos = { x: 0, y: 0 };\n> function render(viji) {\n> pos.x = viji.width / 2;\n> pos.y = viji.height / 2;\n> }\n>\n> // Bad — creates a new object every frame\n> function render(viji) {\n> const pos = { x: viji.width / 2, y: viji.height / 2 };\n> }\n> ```\n\nThis is especially important for particle systems, arrays of positions, or any data structure that persists across frames.\n\n---\n\n## No DOM APIs (but `fetch` Is Fine)\n\nYour scene runs in a Web Worker. Standard DOM APIs are not available:\n\n- No `window`, `document`, `Image()`, `localStorage`\n- No `createElement`, `querySelector`, `addEventListener`\n\nHowever, **`fetch()` works** and can be used to load JSON, text, or other data from external URLs:\n\n```javascript\n// This works — fetch is available in workers\nconst response = await fetch('https://cdn.example.com/data.json');\nconst data = await response.json();\n```\n\nFor images, use Viji's [`viji.image()`](/native/parameters/image) parameter — the host application handles file selection and transfers the image to the worker.\n\n> [!WARNING]\n> Scenes run in a Web Worker — there is no `window`, `document`, `Image()`, `localStorage`, or any DOM API. All inputs (audio, video, images) are provided through the Viji API. Note: `fetch()` IS available and can be used to load external data (JSON, etc.) 
from CDNs.\n\n---\n\n## Guard Audio and Video with `isConnected`\n\nAudio and video streams are provided by the host and may not always be available. Always check `isConnected` before using audio or video data:\n\n```javascript\nfunction render(viji) {\n if (viji.audio.isConnected) {\n const bass = viji.audio.bands.low;\n // ... use audio data\n }\n\n if (viji.video.isConnected && viji.video.currentFrame) {\n ctx.drawImage(viji.video.currentFrame, 0, 0, viji.width, viji.height);\n }\n}\n```\n\nWithout this guard, your scene would reference undefined or zero values when no audio/video source is connected.\n\n---\n\n## Be Mindful of Computer Vision Costs\n\nCV features (face detection, hand tracking, pose detection, etc.) are powerful but expensive. Each feature runs ML inference in its own WebGL context.\n\n| Feature | Relative Cost | Notes |\n|---------|--------------|-------|\n| Face Detection | Low | Bounding box + basic landmarks only |\n| Face Mesh | Medium-High | 468 facial landmarks |\n| Emotion Detection | High | 7 expressions + 52 blendshape coefficients |\n| Hand Tracking | Medium | Up to 2 hands, 21 landmarks each |\n| Pose Detection | Medium | 33 body landmarks |\n| Body Segmentation | High | Per-pixel mask, large tensor output |\n\n> [!WARNING]\n> **WebGL Context Limits:** Each CV feature requires its own WebGL context for ML inference. Browsers typically allow 8-16 active WebGL contexts. Enabling too many CV features simultaneously can cause context eviction, potentially breaking the scene's own rendering. Use only the CV features you need.\n\n**Don't enable CV features by default.** Instead, expose a toggle parameter so users can activate them on capable devices:\n\n> [!TIP]\n> **Best practice:** Don't enable CV features by default. 
Instead, expose a toggle parameter so users can activate them on capable devices:\n> ```javascript\n> const useFace = viji.toggle(false, { label: 'Enable Face Detection', category: 'video' });\n> if (useFace.value) {\n> await viji.video.cv.enableFaceDetection(true);\n> }\n> ```\n\n---\n\n## Pick One Canvas Context Type\n\nThe native renderer lets you choose between 2D and WebGL contexts via [`viji.useContext()`](/native/canvas-context). Options are `'2d'`, `'webgl'` (WebGL 1), and `'webgl2'` (WebGL 2). Pick one and stick with it.\n\n> [!WARNING]\n> A canvas only supports one context type. If you call [`useContext('2d')`](/native/canvas-context) and later call [`useContext('webgl')`](/native/canvas-context) (or vice versa), the second call returns `null`. Choose one context type and use it for the entire scene.\n\n---\n\n## Related\n\n- [Common Mistakes](../common-mistakes/) — specific wrong/right code examples\n- [Renderers Overview](../renderers-overview/) — choosing the right renderer"
+ "markdown": "# Best Practices\n\nThese practices apply to all three renderers (Native, P5, Shader). Following them ensures your scenes look correct at any resolution, run smoothly at any frame rate, and work reliably across devices.\n\n---\n\n## Use `viji.time` and `viji.deltaTime` for Animation\n\nViji provides two timing values. Use the right one for the job:\n\n- **[`viji.time`](/native/timing)** — seconds since the scene started. Use this for most animations (oscillations, rotations, color cycling). This is the most common choice.\n- **[`viji.deltaTime`](/native/timing)** — seconds since the last frame. Use this when you need to accumulate values smoothly regardless of frame rate (movement, physics, fading).\n\n```javascript\n// viji.time — animation that looks identical regardless of frame rate\nconst angle = viji.time * speed.value;\nconst x = Math.cos(angle) * radius;\n\n// viji.deltaTime — accumulation that stays smooth at any FPS\nposition += velocity * viji.deltaTime;\nopacity -= fadeRate * viji.deltaTime;\n```\n\nFor shaders, the equivalents are `u_time` and `u_deltaTime`. When animation speed is driven by a parameter, use an [**accumulator**](/shader/parameters/accumulator) to avoid jumps:\n\n```glsl\n// Instead of: float wave = sin(u_time * speed); ← jumps when slider moves\n// @viji-accumulator:phase rate:speed\nfloat wave = sin(phase + uv.x * 10.0); // smooth at any slider value\n```\n\n> [!NOTE]\n> Always use [`viji.time`](/native/timing) or [`viji.deltaTime`](/native/timing) for animation. Never count frames or assume a specific frame rate — the host application may run your scene at different rates (`full` or `half` mode) or the actual FPS may vary by device.\n\n---\n\n## Design for Any Resolution\n\nThe host application controls your scene's resolution. It may change at any time (window resize, resolution scaling for performance, high-DPI displays). 
Never hardcode pixel values.\n\n**Use [`viji.width`](/native/canvas-context) and [`viji.height`](/native/canvas-context)** for all positioning and sizing:\n\n```javascript\n// Good — scales to any resolution\nconst centerX = viji.width / 2;\nconst centerY = viji.height / 2;\nconst radius = Math.min(viji.width, viji.height) * 0.1;\n\n// Bad — breaks at different resolutions\nconst centerX = 960;\nconst centerY = 540;\nconst radius = 50;\n```\n\nFor parameters that control sizes, use normalized values (0–1) and multiply by canvas dimensions:\n\n```javascript\nconst size = viji.slider(0.15, { min: 0.02, max: 0.5, label: 'Size' });\n\nfunction render(viji) {\n const pixelSize = size.value * Math.min(viji.width, viji.height);\n}\n```\n\nFor shaders, use `u_resolution`:\n\n```glsl\nvec2 uv = gl_FragCoord.xy / u_resolution; // normalized 0–1 coordinates\n```\n\n> [!NOTE]\n> Always use [`viji.width`](/native/canvas-context) and [`viji.height`](/native/canvas-context) for positioning and sizing, and [`viji.deltaTime`](/native/timing) for frame-rate-independent animation. Never hardcode pixel values or assume a specific frame rate.\n\n---\n\n## Declare Parameters at the Top Level\n\nParameter functions ([`viji.slider()`](/native/parameters/slider), [`viji.color()`](/native/parameters/color), etc.) register controls with the host application. They must be called **once**, at the top level of your scene code — never inside `render()`.\n\n```javascript\n// Correct — declared once at top level\nconst speed = viji.slider(1, { min: 0.1, max: 5, label: 'Speed' });\nconst bgColor = viji.color('#1a1a2e', { label: 'Background' });\n\nfunction render(viji) {\n // Read current values inside render\n const s = speed.value;\n const bg = bgColor.value;\n}\n```\n\n> [!NOTE]\n> Parameters must be defined at the top level of your scene, not inside `render()`. They are registered once during initialization. 
Defining them inside `render()` would re-register the parameter every frame, resetting its value to the default and making user changes ineffective.\n\n---\n\n## Avoid Allocations in the Render Loop\n\nCreating objects, arrays, or strings inside `render()` triggers garbage collection, causing frame drops and stuttering. Pre-allocate at the top level and reuse.\n\n> [!TIP]\n> Avoid allocating objects, arrays, or strings inside `render()`. Pre-allocate at the top level and reuse them:\n> ```javascript\n> // Good — pre-allocated\n> const pos = { x: 0, y: 0 };\n> function render(viji) {\n> pos.x = viji.width / 2;\n> pos.y = viji.height / 2;\n> }\n>\n> // Bad — creates a new object every frame\n> function render(viji) {\n> const pos = { x: viji.width / 2, y: viji.height / 2 };\n> }\n> ```\n\nThis is especially important for particle systems, arrays of positions, or any data structure that persists across frames.\n\n---\n\n## No DOM APIs (but `fetch` Is Fine)\n\nYour scene runs in a Web Worker. Standard DOM APIs are not available:\n\n- No `window`, `document`, `Image()`, `localStorage`\n- No `createElement`, `querySelector`, `addEventListener`\n\nHowever, **`fetch()` works** and can be used to load JSON, text, or other data from external URLs:\n\n```javascript\n// This works — fetch is available in workers\nconst response = await fetch('https://cdn.example.com/data.json');\nconst data = await response.json();\n```\n\nFor images, use Viji's [`viji.image()`](/native/parameters/image) parameter — the host application handles file selection and transfers the image to the worker.\n\n> [!WARNING]\n> Scenes run in a Web Worker — there is no `window`, `document`, `Image()`, `localStorage`, or any DOM API. All inputs (audio, video, images) are provided through the Viji API. Note: `fetch()` IS available and can be used to load external data (JSON, etc.) 
from CDNs.\n\n---\n\n## Guard Audio and Video with `isConnected`\n\nAudio and video streams are provided by the host and may not always be available. Always check `isConnected` before using audio or video data. This applies to the main audio stream (`viji.audio`), to each entry of `viji.audioStreams` you read from, and to `device.audio` when iterating devices:\n\n```javascript\nfunction render(viji) {\n if (viji.audio.isConnected) {\n const bass = viji.audio.bands.low;\n // ... use audio data\n }\n\n if (viji.video.isConnected && viji.video.currentFrame) {\n ctx.drawImage(viji.video.currentFrame, 0, 0, viji.width, viji.height);\n }\n}\n```\n\nWithout this guard, your scene would reference undefined or zero values when no audio/video source is connected.\n\n---\n\n## Be Mindful of Computer Vision Costs\n\nCV features (face detection, hand tracking, pose detection, etc.) are powerful but expensive. Each feature runs ML inference in its own WebGL context.\n\n| Feature | Relative Cost | Notes |\n|---------|--------------|-------|\n| Face Detection | Low | Bounding box + basic landmarks only |\n| Face Mesh | Medium-High | 468 facial landmarks |\n| Emotion Detection | High | 7 expressions + 52 blendshape coefficients |\n| Hand Tracking | Medium | Up to 2 hands, 21 landmarks each |\n| Pose Detection | Medium | 33 body landmarks |\n| Body Segmentation | High | Per-pixel mask, large tensor output |\n\n> [!WARNING]\n> **WebGL Context Limits:** Each CV feature requires its own WebGL context for ML inference. Browsers typically allow 8-16 active WebGL contexts. Enabling too many CV features simultaneously can cause context eviction, potentially breaking the scene's own rendering. Use only the CV features you need.\n\n> [!TIP]\n> **Best practice:** Don't enable CV features by default. 
Instead, expose a toggle parameter so users can activate them on capable devices:\n> ```javascript\n> const useFace = viji.toggle(false, { label: 'Enable Face Detection', category: 'video' });\n> if (useFace.value) {\n> await viji.video.cv.enableFaceDetection(true);\n> }\n> ```\n\n---\n\n## Pick One Canvas Context Type\n\nThe native renderer lets you choose between 2D and WebGL contexts via [`viji.useContext()`](/native/canvas-context). Options are `'2d'`, `'webgl'` (WebGL 1), and `'webgl2'` (WebGL 2). Pick one and stick with it.\n\n> [!WARNING]\n> A canvas only supports one context type. If you call [`useContext('2d')`](/native/canvas-context) and later call [`useContext('webgl')`](/native/canvas-context) (or vice versa), the second call returns `null`. Choose one context type and use it for the entire scene.\n\n---\n\n## Related\n\n- [Common Mistakes](../common-mistakes/) — specific wrong/right code examples\n- [Renderers Overview](../renderers-overview/) — choosing the right renderer"
  }
  ]
  },
@@ -905,7 +926,7 @@ export const docsApi = {
  "content": [
  {
  "type": "text",
- "markdown": "# Common Mistakes\r\n\r\nThis page collects the most frequent mistakes artists make when writing Viji scenes. Each section shows the wrong approach and the correct alternative.\r\n\r\n---\r\n\r\n## Using DOM APIs\r\n\r\nScenes run in a Web Worker. There is no DOM.\r\n\r\n```javascript\r\n// Wrong — DOM APIs don't exist in workers\r\nconst img = new Image();\r\nimg.src = 'photo.jpg';\r\n\r\ndocument.createElement('canvas');\r\nwindow.innerWidth;\r\nlocalStorage.setItem('key', 'value');\r\n```\r\n\r\n```javascript\r\n// Right — use Viji's API for inputs\r\nconst photo = viji.image(null, { label: 'Photo' });\r\n\r\n// Use viji.canvas, viji.width, viji.height instead\r\n// Use fetch() for loading external data:\r\nconst data = await fetch('https://cdn.example.com/data.json').then(r => r.json());\r\n```\r\n\r\n---\r\n\r\n## Declaring Parameters Inside `render()`\r\n\r\nParameter functions register UI controls with the host. Calling them in `render()` re-registers the parameter every frame, resetting its value to the default and making user changes ineffective.\r\n\r\n```javascript\r\n// Wrong — re-registers the slider every frame, resetting its value\r\nfunction render(viji) {\r\n const speed = viji.slider(1, { min: 0, max: 5, label: 'Speed' });\r\n // ...\r\n}\r\n```\r\n\r\n```javascript\r\n// Right — declare once at top level, read .value in render()\r\nconst speed = viji.slider(1, { min: 0, max: 5, label: 'Speed' });\r\n\r\nfunction render(viji) {\r\n const s = speed.value;\r\n // ...\r\n}\r\n```\r\n\r\n---\r\n\r\n## Forgetting `.value` on Parameters\r\n\r\nParameter objects are not raw values. 
You need to access `.value` to get the current value.\r\n\r\n```javascript\r\n// Wrong — uses the parameter object, not its value\r\nconst radius = viji.slider(50, { min: 10, max: 200, label: 'Radius' });\r\n\r\nfunction render(viji) {\r\n ctx.arc(x, y, radius, 0, Math.PI * 2); // radius is an object, not a number\r\n}\r\n```\r\n\r\n```javascript\r\n// Right — access .value\r\nfunction render(viji) {\r\n ctx.arc(x, y, radius.value, 0, Math.PI * 2);\r\n}\r\n```\r\n\r\n---\r\n\r\n## Hardcoding Pixel Values\r\n\r\nThe host controls your scene's resolution. Hardcoded values break at different sizes.\r\n\r\n```javascript\r\n// Wrong — only looks right at one specific resolution\r\nfunction render(viji) {\r\n ctx.arc(960, 540, 50, 0, Math.PI * 2);\r\n}\r\n```\r\n\r\n```javascript\r\n// Right — adapts to any resolution\r\nfunction render(viji) {\r\n const cx = viji.width / 2;\r\n const cy = viji.height / 2;\r\n const r = Math.min(viji.width, viji.height) * 0.05;\r\n ctx.arc(cx, cy, r, 0, Math.PI * 2);\r\n}\r\n```\r\n\r\n---\r\n\r\n## Frame-Rate-Dependent Animation\r\n\r\nCounting frames or using fixed increments makes animation speed depend on the device's frame rate.\r\n\r\n```javascript\r\n// Wrong — faster on 120Hz displays, slower on 30Hz\r\nlet angle = 0;\r\nfunction render(viji) {\r\n angle += 0.02;\r\n}\r\n```\r\n\r\n```javascript\r\n// Right — use viji.time for consistent speed regardless of FPS\r\nfunction render(viji) {\r\n const angle = viji.time * speed.value;\r\n}\r\n\r\n// Or use viji.deltaTime for accumulation\r\nlet position = 0;\r\nfunction render(viji) {\r\n position += velocity * viji.deltaTime;\r\n}\r\n```\r\n\r\n---\r\n\r\n## Allocating Objects in `render()`\r\n\r\nCreating new objects every frame causes garbage collection pauses.\r\n\r\n```javascript\r\n// Wrong — new object every frame\r\nfunction render(viji) {\r\n const particles = [];\r\n for (let i = 0; i < 100; i++) {\r\n particles.push({ x: Math.random() * viji.width, y: Math.random() * 
viji.height });\r\n }\r\n}\r\n```\r\n\r\n```javascript\r\n// Right — pre-allocate and reuse\r\nconst particles = Array.from({ length: 100 }, () => ({ x: 0, y: 0 }));\r\n\r\nfunction render(viji) {\r\n for (const p of particles) {\r\n p.x = Math.random() * viji.width;\r\n p.y = Math.random() * viji.height;\r\n }\r\n}\r\n```\r\n\r\n---\r\n\r\n## P5: Missing the `p5.` Prefix\r\n\r\nViji runs P5 in **instance mode**. All P5 functions must be called on the `p5` object.\r\n\r\n```javascript\r\n// @renderer p5\r\n\r\n// Wrong — global P5 functions don't exist\r\nfunction render(viji, p5) {\r\n background(0); // ReferenceError\r\n fill(255, 0, 0); // ReferenceError\r\n circle(width / 2, height / 2, 100); // ReferenceError\r\n}\r\n```\r\n\r\n```javascript\r\n// @renderer p5\r\n\r\n// Right — use p5. prefix for P5 functions, viji.* for dimensions\r\nfunction render(viji, p5) {\r\n p5.background(0);\r\n p5.fill(255, 0, 0);\r\n p5.circle(viji.width / 2, viji.height / 2, 100);\r\n}\r\n```\r\n\r\n---\r\n\r\n## P5: Using `draw()` Instead of `render()`\r\n\r\nP5's built-in draw loop is disabled in Viji. Your function must be named `render`, not `draw`.\r\n\r\n```javascript\r\n// @renderer p5\r\n\r\n// Wrong — Viji never calls draw()\r\nfunction draw(viji, p5) {\r\n p5.background(0);\r\n}\r\n```\r\n\r\n```javascript\r\n// @renderer p5\r\n\r\n// Right — Viji calls render() every frame\r\nfunction render(viji, p5) {\r\n p5.background(0);\r\n}\r\n```\r\n\r\n---\r\n\r\n## P5: Calling `createCanvas()`\r\n\r\nThe canvas is created and managed by Viji. 
Calling `createCanvas()` creates a second canvas that won't be displayed.\r\n\r\n```javascript\r\n// @renderer p5\r\n\r\n// Wrong — creates a separate, invisible canvas\r\nfunction setup(viji, p5) {\r\n p5.createCanvas(800, 600);\r\n}\r\n```\r\n\r\n```javascript\r\n// @renderer p5\r\n\r\n// Right — canvas is already provided, just configure settings\r\nfunction setup(viji, p5) {\r\n p5.colorMode(p5.HSB);\r\n}\r\n```\r\n\r\n---\r\n\r\n## P5: Using Event Callbacks\r\n\r\nP5 event callbacks like `mousePressed()`, `keyPressed()`, `touchStarted()` do not work in Viji's worker environment. Use Viji's interaction APIs instead.\r\n\r\n```javascript\r\n// @renderer p5\r\n\r\n// Wrong — these callbacks are never called\r\nfunction mousePressed() {\r\n console.log('clicked');\r\n}\r\n```\r\n\r\n```javascript\r\n// @renderer p5\r\n\r\n// Right — check Viji's interaction state in render()\r\nfunction render(viji, p5) {\r\n if (viji.pointer.wasPressed) {\r\n console.log('clicked');\r\n }\r\n}\r\n```\r\n\r\n---\r\n\r\n## Shader: Redeclaring Auto-Injected Code\r\n\r\nViji auto-injects `precision`, all built-in uniform declarations, and all parameter uniforms from `@viji-*` directives. Redeclaring any of them causes compilation errors.\r\n\r\n```glsl\r\n// @renderer shader\r\n\r\n// Wrong — these are already injected by Viji\r\nprecision mediump float;\r\nuniform vec2 u_resolution;\r\nuniform float u_time;\r\n\r\nvoid main() {\r\n vec2 uv = gl_FragCoord.xy / u_resolution;\r\n gl_FragColor = vec4(uv, sin(u_time), 1.0);\r\n}\r\n```\r\n\r\n```glsl\r\n// @renderer shader\r\n\r\n// Right — just write your code, uniforms are available automatically\r\nvoid main() {\r\n vec2 uv = gl_FragCoord.xy / u_resolution;\r\n gl_FragColor = vec4(uv, sin(u_time), 1.0);\r\n}\r\n```\r\n\r\n---\r\n\r\n## Shader: Using `u_` Prefix for Parameters\r\n\r\nThe `u_` prefix is reserved for Viji's built-in uniforms. 
Using it for your parameters risks naming collisions.\r\n\r\n```glsl\r\n// Wrong — u_ prefix is reserved\r\n// @viji-slider:u_speed label:\"Speed\" default:1.0\r\n```\r\n\r\n```glsl\r\n// Right — use descriptive names without u_ prefix\r\n// @viji-slider:speed label:\"Speed\" default:1.0\r\n```\r\n\r\n---\r\n\r\n## Shader: Missing `@renderer shader`\r\n\r\nWithout the directive, your GLSL code is treated as JavaScript and will throw syntax errors.\r\n\r\n```glsl\r\n// Wrong — no directive, treated as JavaScript\r\nvoid main() {\r\n gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);\r\n}\r\n```\r\n\r\n```glsl\r\n// Right — directive tells Viji to use the shader renderer\r\n// @renderer shader\r\n\r\nvoid main() {\r\n gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);\r\n}\r\n```\r\n\r\n---\r\n\r\n## Shader: Using Block Comments for `@viji-*` Parameters\r\n\r\nThe `@viji-*` parameter declarations only work with single-line `//` comments. Block comments `/* */` are silently ignored.\r\n\r\n```glsl\r\n// @renderer shader\r\n\r\n// Wrong — block comments are not parsed for parameters\r\n/* @viji-slider:speed label:\"Speed\" default:1.0 min:0.0 max:5.0 */\r\n\r\nvoid main() {\r\n gl_FragColor = vec4(speed, 0.0, 0.0, 1.0); // speed is undefined\r\n}\r\n```\r\n\r\n```glsl\r\n// @renderer shader\r\n\r\n// Right — use single-line comments for parameter declarations\r\n// @viji-slider:speed label:\"Speed\" default:1.0 min:0.0 max:5.0\r\n\r\nvoid main() {\r\n gl_FragColor = vec4(speed, 0.0, 0.0, 1.0);\r\n}\r\n```\r\n\r\n> [!NOTE]\r\n> The `@renderer` directive supports both `//` and `/* */` styles, but `@viji-*` parameter declarations require `//`.\r\n\r\n---\r\n\r\n## Shader: Using `u_time * speed` for Parameter-Driven Animation\r\n\r\nMultiplying `u_time` by a parameter causes the entire phase to jump when the slider moves, because the full history is recalculated instantly.\r\n\r\n```glsl\r\n// Wrong — animation jumps when speed slider changes\r\n// @viji-slider:speed label:\"Speed\" 
default:1.0 min:0.1 max:5.0\r\nvoid main() {\r\n float wave = sin(u_time * speed);\r\n gl_FragColor = vec4(vec3(wave * 0.5 + 0.5), 1.0);\r\n}\r\n```\r\n\r\n```glsl\r\n// Right — accumulator integrates smoothly, no jumps\r\n// @viji-slider:speed label:\"Speed\" default:1.0 min:0.1 max:5.0\r\n// @viji-accumulator:phase rate:speed\r\nvoid main() {\r\n float wave = sin(phase);\r\n gl_FragColor = vec4(vec3(wave * 0.5 + 0.5), 1.0);\r\n}\r\n```\r\n\r\n---\r\n\r\n## Not Checking `isConnected` for Audio/Video\r\n\r\nAudio and video streams may not be available. Accessing their properties without checking `isConnected` gives meaningless zero values with no indication that something is missing.\r\n\r\n```javascript\r\n// Wrong — no guard, silently uses zero values\r\nfunction render(viji) {\r\n const bass = viji.audio.bands.low;\r\n ctx.drawImage(viji.video.currentFrame, 0, 0);\r\n}\r\n```\r\n\r\n```javascript\r\n// Right — check connection state first\r\nfunction render(viji) {\r\n if (viji.audio.isConnected) {\r\n const bass = viji.audio.bands.low;\r\n // ... 
react to audio\r\n }\r\n\r\n if (viji.video.isConnected && viji.video.currentFrame) {\r\n ctx.drawImage(viji.video.currentFrame, 0, 0, viji.width, viji.height);\r\n }\r\n}\r\n```\r\n\r\n---\r\n\r\n## Enabling All CV Features by Default\r\n\r\nEnabling CV features without user consent wastes resources on devices that can't handle it, and risks WebGL context loss.\r\n\r\n```javascript\r\n// Wrong — activates expensive CV on every device\r\nawait viji.video.cv.enableFaceDetection(true);\r\nawait viji.video.cv.enableHandTracking(true);\r\nawait viji.video.cv.enablePoseDetection(true);\r\nawait viji.video.cv.enableBodySegmentation(true);\r\n```\r\n\r\n```javascript\r\n// Right — let the user opt in\r\nconst useFace = viji.toggle(false, { label: 'Enable Face Tracking', category: 'video' });\r\nconst useHands = viji.toggle(false, { label: 'Enable Hand Tracking', category: 'video' });\r\n\r\nfunction render(viji) {\r\n if (useFace.value) await viji.video.cv.enableFaceDetection(true);\r\n else await viji.video.cv.enableFaceDetection(false);\r\n\r\n if (useHands.value) await viji.video.cv.enableHandTracking(true);\r\n else await viji.video.cv.enableHandTracking(false);\r\n}\r\n```\r\n\r\n---\r\n\r\n## Related\r\n\r\n- [Best Practices](../best-practices/) — positive guidance for writing robust scenes\r\n- [Renderers Overview](../renderers-overview/) — choosing the right renderer\r\n- [Audio](/native/audio) — audio connection and analysis API\r\n- [Video & CV](/native/video) — video stream and computer vision features"
+ "markdown": "# Common Mistakes\r\n\r\nThis page collects the most frequent mistakes artists make when writing Viji scenes. Each section shows the wrong approach and the correct alternative.\r\n\r\n---\r\n\r\n## Using DOM APIs\r\n\r\nScenes run in a Web Worker. There is no DOM.\r\n\r\n```javascript\r\n// Wrong — DOM APIs don't exist in workers\r\nconst img = new Image();\r\nimg.src = 'photo.jpg';\r\n\r\ndocument.createElement('canvas');\r\nwindow.innerWidth;\r\nlocalStorage.setItem('key', 'value');\r\n```\r\n\r\n```javascript\r\n// Right — use Viji's API for inputs\r\nconst photo = viji.image(null, { label: 'Photo' });\r\n\r\n// Use viji.canvas, viji.width, viji.height instead\r\n// Use fetch() for loading external data:\r\nconst data = await fetch('https://cdn.example.com/data.json').then(r => r.json());\r\n```\r\n\r\n---\r\n\r\n## Declaring Parameters Inside `render()`\r\n\r\nParameter functions register UI controls with the host. Calling them in `render()` re-registers the parameter every frame, resetting its value to the default and making user changes ineffective.\r\n\r\n```javascript\r\n// Wrong — re-registers the slider every frame, resetting its value\r\nfunction render(viji) {\r\n const speed = viji.slider(1, { min: 0, max: 5, label: 'Speed' });\r\n // ...\r\n}\r\n```\r\n\r\n```javascript\r\n// Right — declare once at top level, read .value in render()\r\nconst speed = viji.slider(1, { min: 0, max: 5, label: 'Speed' });\r\n\r\nfunction render(viji) {\r\n const s = speed.value;\r\n // ...\r\n}\r\n```\r\n\r\n---\r\n\r\n## Forgetting `.value` on Parameters\r\n\r\nParameter objects are not raw values. 
You need to access `.value` to get the current value.\r\n\r\n```javascript\r\n// Wrong — uses the parameter object, not its value\r\nconst radius = viji.slider(50, { min: 10, max: 200, label: 'Radius' });\r\n\r\nfunction render(viji) {\r\n ctx.arc(x, y, radius, 0, Math.PI * 2); // radius is an object, not a number\r\n}\r\n```\r\n\r\n```javascript\r\n// Right — access .value\r\nfunction render(viji) {\r\n ctx.arc(x, y, radius.value, 0, Math.PI * 2);\r\n}\r\n```\r\n\r\n---\r\n\r\n## Hardcoding Pixel Values\r\n\r\nThe host controls your scene's resolution. Hardcoded values break at different sizes.\r\n\r\n```javascript\r\n// Wrong — only looks right at one specific resolution\r\nfunction render(viji) {\r\n ctx.arc(960, 540, 50, 0, Math.PI * 2);\r\n}\r\n```\r\n\r\n```javascript\r\n// Right — adapts to any resolution\r\nfunction render(viji) {\r\n const cx = viji.width / 2;\r\n const cy = viji.height / 2;\r\n const r = Math.min(viji.width, viji.height) * 0.05;\r\n ctx.arc(cx, cy, r, 0, Math.PI * 2);\r\n}\r\n```\r\n\r\n---\r\n\r\n## Frame-Rate-Dependent Animation\r\n\r\nCounting frames or using fixed increments makes animation speed depend on the device's frame rate.\r\n\r\n```javascript\r\n// Wrong — faster on 120Hz displays, slower on 30Hz\r\nlet angle = 0;\r\nfunction render(viji) {\r\n angle += 0.02;\r\n}\r\n```\r\n\r\n```javascript\r\n// Right — use viji.time for consistent speed regardless of FPS\r\nfunction render(viji) {\r\n const angle = viji.time * speed.value;\r\n}\r\n\r\n// Or use viji.deltaTime for accumulation\r\nlet position = 0;\r\nfunction render(viji) {\r\n position += velocity * viji.deltaTime;\r\n}\r\n```\r\n\r\n---\r\n\r\n## Allocating Objects in `render()`\r\n\r\nCreating new objects every frame causes garbage collection pauses.\r\n\r\n```javascript\r\n// Wrong — new object every frame\r\nfunction render(viji) {\r\n const particles = [];\r\n for (let i = 0; i < 100; i++) {\r\n particles.push({ x: Math.random() * viji.width, y: Math.random() * 
viji.height });\r\n }\r\n}\r\n```\r\n\r\n```javascript\r\n// Right — pre-allocate and reuse\r\nconst particles = Array.from({ length: 100 }, () => ({ x: 0, y: 0 }));\r\n\r\nfunction render(viji) {\r\n for (const p of particles) {\r\n p.x = Math.random() * viji.width;\r\n p.y = Math.random() * viji.height;\r\n }\r\n}\r\n```\r\n\r\n---\r\n\r\n## P5: Missing the `p5.` Prefix\r\n\r\nViji runs P5 in **instance mode**. All P5 functions must be called on the `p5` object.\r\n\r\n```javascript\r\n// @renderer p5\r\n\r\n// Wrong — global P5 functions don't exist\r\nfunction render(viji, p5) {\r\n background(0); // ReferenceError\r\n fill(255, 0, 0); // ReferenceError\r\n circle(width / 2, height / 2, 100); // ReferenceError\r\n}\r\n```\r\n\r\n```javascript\r\n// @renderer p5\r\n\r\n// Right — use p5. prefix for P5 functions, viji.* for dimensions\r\nfunction render(viji, p5) {\r\n p5.background(0);\r\n p5.fill(255, 0, 0);\r\n p5.circle(viji.width / 2, viji.height / 2, 100);\r\n}\r\n```\r\n\r\n---\r\n\r\n## P5: Using `draw()` Instead of `render()`\r\n\r\nP5's built-in draw loop is disabled in Viji. Your function must be named `render`, not `draw`.\r\n\r\n```javascript\r\n// @renderer p5\r\n\r\n// Wrong — Viji never calls draw()\r\nfunction draw(viji, p5) {\r\n p5.background(0);\r\n}\r\n```\r\n\r\n```javascript\r\n// @renderer p5\r\n\r\n// Right — Viji calls render() every frame\r\nfunction render(viji, p5) {\r\n p5.background(0);\r\n}\r\n```\r\n\r\n---\r\n\r\n## P5: Calling `createCanvas()`\r\n\r\nThe canvas is created and managed by Viji. Calling `createCanvas()` creates a second canvas that won't be displayed. 
**WEBGL mode** is not enabled this way — use the directive `// @renderer p5 webgl` on the first line instead of passing `p5.WEBGL` here.\r\n\r\n```javascript\r\n// @renderer p5\r\n\r\n// Wrong — creates a separate, invisible canvas\r\nfunction setup(viji, p5) {\r\n p5.createCanvas(800, 600);\r\n}\r\n```\r\n\r\n```javascript\r\n// @renderer p5\r\n\r\n// Wrong — main canvas already exists; this does not switch it to WEBGL\r\nfunction setup(viji, p5) {\r\n p5.createCanvas(p5.width, p5.height, p5.WEBGL);\r\n}\r\n```\r\n\r\n```javascript\r\n// @renderer p5\r\n\r\n// Right — canvas is already provided, just configure settings\r\nfunction setup(viji, p5) {\r\n p5.colorMode(p5.HSB);\r\n}\r\n```\r\n\r\n```javascript\r\n// @renderer p5 webgl\r\n\r\n// Right — WEBGL is selected by the first-line directive; never call createCanvas\r\nfunction setup(viji, p5) {\r\n p5.angleMode(p5.RADIANS);\r\n}\r\n```\r\n\r\n---\r\n\r\n## P5: Expecting WEBGL without `// @renderer p5 webgl`\r\n\r\n`// @renderer p5` alone gives a **2D** main canvas. For 3D / WEBGL, the first comment must include **`webgl`** after `p5`.\r\n\r\n```javascript\r\n// @renderer p5\r\n\r\n// Wrong — this scene is 2D only; box() / WEBGL lighting won't work as intended\r\nfunction render(viji, p5) {\r\n p5.normalMaterial();\r\n p5.box(100);\r\n}\r\n```\r\n\r\n```javascript\r\n// @renderer p5 webgl\r\n\r\n// Right — Viji creates the main canvas in WEBGL mode\r\nfunction render(viji, p5) {\r\n p5.background(32);\r\n p5.ambientLight(100);\r\n p5.normalMaterial();\r\n p5.box(100);\r\n}\r\n```\r\n\r\n---\r\n\r\n## P5: Using Event Callbacks\r\n\r\nP5 event callbacks like `mousePressed()`, `keyPressed()`, `touchStarted()` do not work in Viji's worker environment. 
Use Viji's interaction APIs instead.\r\n\r\n```javascript\r\n// @renderer p5\r\n\r\n// Wrong — these callbacks are never called\r\nfunction mousePressed() {\r\n console.log('clicked');\r\n}\r\n```\r\n\r\n```javascript\r\n// @renderer p5\r\n\r\n// Right — check Viji's interaction state in render()\r\nfunction render(viji, p5) {\r\n if (viji.pointer.wasPressed) {\r\n console.log('clicked');\r\n }\r\n}\r\n```\r\n\r\n---\r\n\r\n## Shader: Redeclaring Auto-Injected Code\r\n\r\nViji auto-injects `precision`, all built-in uniform declarations, and all parameter uniforms from `@viji-*` directives. Redeclaring any of them causes compilation errors.\r\n\r\n```glsl\r\n// @renderer shader\r\n\r\n// Wrong — these are already injected by Viji\r\nprecision mediump float;\r\nuniform vec2 u_resolution;\r\nuniform float u_time;\r\n\r\nvoid main() {\r\n vec2 uv = gl_FragCoord.xy / u_resolution;\r\n gl_FragColor = vec4(uv, sin(u_time), 1.0);\r\n}\r\n```\r\n\r\n```glsl\r\n// @renderer shader\r\n\r\n// Right — just write your code, uniforms are available automatically\r\nvoid main() {\r\n vec2 uv = gl_FragCoord.xy / u_resolution;\r\n gl_FragColor = vec4(uv, sin(u_time), 1.0);\r\n}\r\n```\r\n\r\n---\r\n\r\n## Shader: Using `u_` Prefix for Parameters\r\n\r\nThe `u_` prefix is reserved for Viji's built-in uniforms. 
Using it for your parameters risks naming collisions.\r\n\r\n```glsl\r\n// Wrong — u_ prefix is reserved\r\n// @viji-slider:u_speed label:\"Speed\" default:1.0\r\n```\r\n\r\n```glsl\r\n// Right — use descriptive names without u_ prefix\r\n// @viji-slider:speed label:\"Speed\" default:1.0\r\n```\r\n\r\n---\r\n\r\n## Shader: Missing `@renderer shader`\r\n\r\nWithout the directive, your GLSL code is treated as JavaScript and will throw syntax errors.\r\n\r\n```glsl\r\n// Wrong — no directive, treated as JavaScript\r\nvoid main() {\r\n gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);\r\n}\r\n```\r\n\r\n```glsl\r\n// Right — directive tells Viji to use the shader renderer\r\n// @renderer shader\r\n\r\nvoid main() {\r\n gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);\r\n}\r\n```\r\n\r\n---\r\n\r\n## Shader: Using Block Comments for `@viji-*` Parameters\r\n\r\nThe `@viji-*` parameter declarations only work with single-line `//` comments. Block comments `/* */` are silently ignored.\r\n\r\n```glsl\r\n// @renderer shader\r\n\r\n// Wrong — block comments are not parsed for parameters\r\n/* @viji-slider:speed label:\"Speed\" default:1.0 min:0.0 max:5.0 */\r\n\r\nvoid main() {\r\n gl_FragColor = vec4(speed, 0.0, 0.0, 1.0); // speed is undefined\r\n}\r\n```\r\n\r\n```glsl\r\n// @renderer shader\r\n\r\n// Right — use single-line comments for parameter declarations\r\n// @viji-slider:speed label:\"Speed\" default:1.0 min:0.0 max:5.0\r\n\r\nvoid main() {\r\n gl_FragColor = vec4(speed, 0.0, 0.0, 1.0);\r\n}\r\n```\r\n\r\n> [!NOTE]\r\n> The `@renderer` directive supports both `//` and `/* */` styles, but `@viji-*` parameter declarations require `//`.\r\n\r\n---\r\n\r\n## Shader: Using `u_time * speed` for Parameter-Driven Animation\r\n\r\nMultiplying `u_time` by a parameter causes the entire phase to jump when the slider moves, because the full history is recalculated instantly.\r\n\r\n```glsl\r\n// Wrong — animation jumps when speed slider changes\r\n// @viji-slider:speed label:\"Speed\" 
default:1.0 min:0.1 max:5.0\r\nvoid main() {\r\n float wave = sin(u_time * speed);\r\n gl_FragColor = vec4(vec3(wave * 0.5 + 0.5), 1.0);\r\n}\r\n```\r\n\r\n```glsl\r\n// Right — accumulator integrates smoothly, no jumps\r\n// @viji-slider:speed label:\"Speed\" default:1.0 min:0.1 max:5.0\r\n// @viji-accumulator:phase rate:speed\r\nvoid main() {\r\n float wave = sin(phase);\r\n gl_FragColor = vec4(vec3(wave * 0.5 + 0.5), 1.0);\r\n}\r\n```\r\n\r\n---\r\n\r\n## Not Checking `isConnected` for Audio/Video\r\n\r\nAudio and video streams may not be available. Accessing their properties without checking `isConnected` gives meaningless zero values with no indication that something is missing.\r\n\r\n```javascript\r\n// Wrong — no guard, silently uses zero values\r\nfunction render(viji) {\r\n const bass = viji.audio.bands.low;\r\n ctx.drawImage(viji.video.currentFrame, 0, 0);\r\n}\r\n```\r\n\r\n```javascript\r\n// Right — check connection state first\r\nfunction render(viji) {\r\n if (viji.audio.isConnected) {\r\n const bass = viji.audio.bands.low;\r\n // ... 
react to audio\r\n }\r\n\r\n if (viji.video.isConnected && viji.video.currentFrame) {\r\n ctx.drawImage(viji.video.currentFrame, 0, 0, viji.width, viji.height);\r\n }\r\n}\r\n```\r\n\r\n---\r\n\r\n## Enabling All CV Features by Default\r\n\r\nEnabling CV features without user consent wastes resources on devices that can't handle it, and risks WebGL context loss.\r\n\r\n```javascript\r\n// Wrong — activates expensive CV on every device\r\nawait viji.video.cv.enableFaceDetection(true);\r\nawait viji.video.cv.enableHandTracking(true);\r\nawait viji.video.cv.enablePoseDetection(true);\r\nawait viji.video.cv.enableBodySegmentation(true);\r\n```\r\n\r\n```javascript\r\n// Right — let the user opt in\r\nconst useFace = viji.toggle(false, { label: 'Enable Face Tracking', category: 'video' });\r\nconst useHands = viji.toggle(false, { label: 'Enable Hand Tracking', category: 'video' });\r\n\r\nfunction render(viji) {\r\n // render() is not async, so call without await (fire-and-forget)\r\n if (useFace.value) viji.video.cv.enableFaceDetection(true);\r\n else viji.video.cv.enableFaceDetection(false);\r\n\r\n if (useHands.value) viji.video.cv.enableHandTracking(true);\r\n else viji.video.cv.enableHandTracking(false);\r\n}\r\n```\r\n\r\n---\r\n\r\n## Related\r\n\r\n- [Best Practices](../best-practices/) — positive guidance for writing robust scenes\r\n- [Renderers Overview](../renderers-overview/) — choosing the right renderer\r\n- [Audio](/native/audio) — audio connection and analysis API\r\n- [Video & CV](/native/video) — video stream and computer vision features"
  }
  ]
  },
@@ -916,7 +937,7 @@ export const docsApi = {
  "content": [
  {
  "type": "text",
- "markdown": "# Create Your First Scene\n\nNew to Viji? This prompt turns an AI assistant into a creative coding guide that helps you choose the right renderer and build your first scene — even if you've never written code before.\n\n## How It Works\n\n1. Copy the prompt below and paste it into your AI assistant (ChatGPT, Claude, etc.).\n2. Describe what you want to create — as simple or detailed as you like.\n3. The AI will ask questions, recommend a renderer, and generate a complete scene.\n\n## Renderer Quick Comparison\n\n| | Native | P5 | Shader |\n|---|---|---|---|\n| **Language** | JavaScript (Canvas 2D / WebGL) | JavaScript + P5.js | GLSL (GPU fragment shader) |\n| **Best for** | Full control, Three.js, generative art | Creative coding, familiar P5 API, shapes & colors | GPU effects, patterns, raymarching, post-processing |\n| **Learning curve** | Medium | Low (if you know P5) | Medium–High |\n| **External libraries** | Yes (Three.js, etc.) | P5.js built-in | No |\n| **3D support** | Yes (WebGL, Three.js) | No (2D only) | Yes (raymarching, SDF) |\n\n## The Prompt\n\n````\nYou are a creative coding assistant for the Viji platform. Your job is to help artists create interactive visual scenes — even if they have no coding experience.\n\n## YOUR BEHAVIOR\n\n1. Ask the artist what they want to create. If their description is vague, ask clarifying questions:\n - What kind of visual? (patterns, shapes, particles, video effects, 3D, etc.)\n - Should it react to audio/music?\n - Should it use a camera/video?\n - Should it respond to mouse/touch/device tilt?\n - What mood or style? (abstract, organic, geometric, glitchy, minimal, etc.)\n2. 
Assess their experience level and recommend a renderer:\n - **No coding experience** → recommend **P5** (most approachable for beginners)\n - **Knows JavaScript/Canvas** → recommend **Native** (maximum control)\n - **Wants GPU effects, patterns, or has shader experience** → recommend **Shader**\n - **Wants 3D with Three.js or custom WebGL** → recommend **Native**\n - **Knows P5.js/Processing** → recommend **P5**\n3. Generate a complete, working scene with parameters for everything the artist might want to adjust.\n4. Explain what the code does in simple terms.\n5. Suggest ways to iterate and improve.\n\n## RENDERER DECISION MATRIX\n\n- **Native**: Full control over Canvas 2D or WebGL. Supports `await import()` for external libraries like Three.js. Best for custom renderers, particle systems with CPU logic, complex state machines, or Three.js 3D scenes.\n- **P5**: Uses P5.js v1.9.4 with familiar `setup()`/`render()` pattern. Best for creative coding with shapes, colors, transforms, text. No WEBGL mode — 2D only.\n- **Shader**: GLSL fragment shader on a GPU fullscreen quad. Best for generative patterns, fractals, raymarching, SDF scenes, audio-reactive gradients, video post-processing. Extremely fast — runs entirely on the GPU.\n\n## REFERENCE (for AI assistants with web access)\n\nFor the latest API documentation, type definitions, and all uniform details:\n- Complete docs (all pages + examples): https://unpkg.com/@viji-dev/core/dist/docs-api.js\n- TypeScript API types: https://unpkg.com/@viji-dev/core/dist/artist-global.d.ts\n- Shader uniforms reference: https://unpkg.com/@viji-dev/core/dist/shader-uniforms.js\n- NPM package: https://www.npmjs.com/package/@viji-dev/core\n\n## VIJI ARCHITECTURE (all renderers)\n\n- Scenes run in a **Web Worker** with an **OffscreenCanvas**. 
There is NO DOM access.\n- The global `viji` object provides everything: canvas, timing, audio, video, CV, input, sensors, parameters.\n- **Top-level code** runs once (parameters, state, imports).\n- **`render(viji)` / `render(viji, p5)` / `void main()`** runs every frame.\n- `fetch()` is available. `window`, `document`, `Image()` are NOT.\n\n## SCENE STRUCTURE PER RENDERER\n\n### Native\n```javascript\nconst speed = viji.slider(1, { min: 0.1, max: 5, label: 'Speed' });\nlet angle = 0;\nfunction render(viji) {\n const ctx = viji.useContext('2d');\n angle += speed.value * viji.deltaTime;\n ctx.clearRect(0, 0, viji.width, viji.height);\n // draw with ctx...\n}\n```\n\n### P5\n```javascript\n// @renderer p5\nconst speed = viji.slider(1, { min: 0.1, max: 5, label: 'Speed' });\nlet angle = 0;\nfunction setup(viji, p5) { p5.colorMode(p5.HSB, 360, 100, 100); }\nfunction render(viji, p5) {\n angle += speed.value * viji.deltaTime;\n p5.background(0);\n // draw with p5.circle(), p5.rect(), etc. (all need p5. 
prefix)\n}\n```\n\n### Shader (GLSL)\n```glsl\n// @renderer shader\n// @viji-slider:speed label:\"Speed\" default:1.0 min:0.1 max:5.0\n// @viji-accumulator:phase rate:speed\nvoid main() {\n vec2 uv = gl_FragCoord.xy / u_resolution;\n gl_FragColor = vec4(uv, sin(phase) * 0.5 + 0.5, 1.0);\n}\n```\n\n## PARAMETERS (all renderers)\n\nArtists control scenes through parameters — declared once, shown as UI controls.\n\n**Native/P5 syntax** (top-level):\n```javascript\nviji.slider(default, { min, max, step, label, group, category }) // → .value: number\nviji.color(default, { label }) // → .value: '#rrggbb'\nviji.toggle(default, { label }) // → .value: boolean\nviji.select(default, { options: [...], label }) // → .value: string|number\nviji.number(default, { min, max, step, label }) // → .value: number\nviji.text(default, { label, maxLength }) // → .value: string\nviji.image(null, { label }) // → .value: ImageBitmap|null\nviji.button({ label }) // → .value: boolean (1 frame)\n```\n\n**Shader syntax** (comment directives):\n```glsl\n// @viji-slider:name label:\"Label\" default:1.0 min:0.0 max:5.0 → uniform float name;\n// @viji-color:name label:\"Color\" default:#ff6600 → uniform vec3 name;\n// @viji-toggle:name label:\"Toggle\" default:false → uniform bool name;\n// @viji-select:name label:\"Mode\" default:0 options:[\"A\",\"B\"] → uniform int name;\n// @viji-number:name label:\"Count\" default:10.0 min:1.0 max:100.0 → uniform float name;\n// @viji-image:name label:\"Texture\" → uniform sampler2D name;\n// @viji-button:name label:\"Reset\" → uniform bool name;\n// @viji-accumulator:name rate:speed → uniform float name;\n```\n\n## AUDIO — `viji.audio` / Shader uniforms\n\nALWAYS check `isConnected` / `u_audioVolume > 0.0` before using audio data.\n\nKey members (Native/P5):\n- `isConnected`, `volume.current`, `volume.peak`, `volume.smoothed`\n- `bands.low`, `bands.lowMid`, `bands.mid`, `bands.highMid`, `bands.high` (+ smoothed variants)\n- `beat.kick`, `beat.snare`, 
`beat.hat`, `beat.any` (+ `.triggers.kick` etc. for single-frame)\n- `beat.bpm`, `beat.confidence`, `beat.isLocked`\n- `spectral.brightness`, `spectral.flatness`\n- `getFrequencyData()`, `getWaveform()`\n\nKey shader uniforms: `u_audioVolume`, `u_audioLow`–`u_audioHigh`, `u_audioKick`–`u_audioAny`, `u_audioKickTrigger`–`u_audioAnyTrigger`, `u_audioBPM`, `u_audioBrightness`, `u_audioFlatness`, `u_audioFFT`, `u_audioWaveform`.\n\n## VIDEO & CV — `viji.video` / Shader uniforms\n\nALWAYS check `isConnected` / `u_videoConnected` first.\n\nKey members (Native/P5):\n- `isConnected`, `currentFrame`, `frameWidth`, `frameHeight`, `frameRate`, `getFrameData()`\n- CV toggle: `cv.enableFaceDetection(bool)`, `cv.enableFaceMesh(bool)`, `cv.enableEmotionDetection(bool)`, `cv.enableHandTracking(bool)`, `cv.enablePoseDetection(bool)`, `cv.enableBodySegmentation(bool)`\n- NEVER enable CV by default — use toggle parameters.\n- Data: `faces[]` (FaceData), `hands[]` (HandData), `pose` (PoseData|null), `segmentation` (SegmentationData|null)\n\nKey shader uniforms: `u_video`, `u_videoResolution`, `u_videoConnected`, `u_faceCount`, `u_face0*`, `u_handCount`, `u_leftHand*`, `u_rightHand*`, `u_poseDetected`, `u_pose*Position`, `u_segmentationMask`.\n\n## INPUT\n\n**Pointer** (unified): `viji.pointer.x/y`, `isDown`, `wasPressed`, `wasReleased`, `isInCanvas` / Shader: `u_pointer`, `u_pointerDown`, `u_pointerWasPressed`\n**Mouse**: `viji.mouse.x/y`, `isPressed`, `leftButton`, `wheelDelta` / Shader: `u_mouse`, `u_mousePressed`, `u_mouseWheel`\n**Keyboard**: `viji.keyboard.isPressed(key)`, `wasPressed(key)`, `activeKeys`, `shift/ctrl/alt/meta` / Shader: `u_keySpace`, `u_keyW/A/S/D`, `u_keyUp/Down/Left/Right`, `u_keyboard` texture\n**Touch**: `viji.touches.count`, `points[]`, `started[]`, `primary` / Shader: `u_touchCount`, `u_touch0`–`u_touch4`\n\n## SENSORS & EXTERNAL DEVICES\n\n**Device sensors**: `viji.device.motion` (acceleration, rotationRate), `viji.device.orientation` (alpha, beta, gamma) 
/ Shader: `u_deviceAcceleration`, `u_deviceOrientation`\n**External devices**: `viji.devices[]` (id, name, motion, orientation, video) / Shader: `u_device0`–`u_device7` textures, sensors, connection status\n**Streams**: `viji.streams[]` (host-provided additional video sources) / Shader: `u_stream0`–`u_stream7`\n\n## CRITICAL RULES (all renderers)\n\n1. NEVER access `window`, `document`, `Image()`, `localStorage`. `fetch()` IS available.\n2. ALWAYS declare parameters at the TOP LEVEL, never inside `render()` / `main()`.\n3. ALWAYS use `viji.width`/`viji.height` or `u_resolution` — NEVER hardcode pixel sizes.\n4. ALWAYS use `viji.deltaTime` / `u_deltaTime` / `@viji-accumulator` for animation — NEVER count frames.\n5. NEVER allocate objects/arrays inside `render()` — pre-allocate at top level.\n6. ALWAYS check `isConnected` / connection uniforms before using audio or video data.\n7. NEVER enable CV features by default — use toggle parameters.\n8. In P5: ALWAYS prefix every P5 function/constant with `p5.`. NEVER use `createCanvas()`.\n9. In shaders: NEVER redeclare precision, built-in uniforms, or parameter uniforms. NEVER use `u_` prefix for parameter names.\n\n## FOR ADVANCED FEATURES\n\nWhen the artist needs the full API surface, use the renderer-specific prompts:\n- **Native**: Use the \"Prompt: Native Scenes\" page for the complete API reference\n- **P5**: Use the \"Prompt: P5 Scenes\" page for the complete API reference + P5 mapping\n- **Shader**: Use the \"Prompt: Shader Scenes\" page for all 270+ uniforms and directive details\n\nNow help the artist create their Viji scene. Start by asking what they want to build.\n````\n\n## Usage\n\n1. Copy the entire prompt block above.\n2. Paste it into your AI assistant.\n3. Describe what you want — even something simple like \"colorful circles that react to music.\"\n4. The AI will guide you through choosing a renderer and building a scene.\n\n> [!TIP]\n> Don't worry about technical details — the AI will handle those. 
Focus on describing what you want to **see** and **feel**. Mention colors, motion, mood, and what should drive the visuals (music, camera, mouse movement, etc.).\n\n## Next Steps\n\nOnce you've created your first scene and want more control, use the full renderer-specific prompts:\n\n- [Prompt: Native Scenes](/ai-prompts/native-prompt) — exhaustive Native API prompt\n- [Prompt: P5 Scenes](/ai-prompts/p5-prompt) — exhaustive P5 API prompt\n- [Prompt: Shader Scenes](/ai-prompts/shader-prompt) — exhaustive Shader API prompt\n- [Prompting Tips](/ai-prompts/prompting-tips) — how to get better results from AI\n\n## Related\n\n- [Overview](/getting-started/overview) — what Viji is and how it works\n- [Best Practices](/getting-started/best-practices) — essential patterns for robust scenes\n- [Common Mistakes](/getting-started/common-mistakes) — pitfalls to avoid"
940
+ "markdown": "# Create Your First Scene\n\nNew to Viji? This prompt turns an AI assistant into a creative coding guide that helps you choose the right renderer and build your first scene — even if you've never written code before.\n\n## How It Works\n\n1. Copy the prompt below and paste it into your AI assistant (ChatGPT, Claude, etc.).\n2. Describe what you want to create — as simple or detailed as you like.\n3. The AI will ask questions, recommend a renderer, and generate a complete scene.\n\n## Renderer Quick Comparison\n\n| | Native | P5 | Shader |\n|---|---|---|---|\n| **Language** | JavaScript (Canvas 2D / WebGL) | JavaScript + P5.js | GLSL (GPU fragment shader) |\n| **Best for** | Full control, Three.js, generative art | Creative coding, familiar P5 API, shapes & colors | GPU effects, patterns, raymarching, post-processing |\n| **Learning curve** | Medium | Low (if you know P5) | Medium–High |\n| **External libraries** | Yes (Three.js, etc.) | P5.js built-in | No |\n| **3D support** | Yes (WebGL, Three.js) | Yes (via `// @renderer p5 webgl`) | Yes (raymarching, SDF) |\n\n## The Prompt\n\n````\nYou are a creative coding assistant for the Viji platform. Your job is to help artists create interactive visual scenes — even if they have no coding experience.\n\n## YOUR BEHAVIOR\n\n1. Ask the artist what they want to create. If their description is vague, ask clarifying questions:\n - What kind of visual? (patterns, shapes, particles, video effects, 3D, etc.)\n - Should it react to audio/music?\n - Should it use a camera/video?\n - Should it respond to mouse/touch/device tilt?\n - What mood or style? (abstract, organic, geometric, glitchy, minimal, etc.)\n2. 
Assess their experience level and recommend a renderer:\n - **No coding experience** → recommend **P5** (most approachable for beginners)\n - **Knows JavaScript/Canvas** → recommend **Native** (maximum control)\n - **Wants GPU effects, patterns, or has shader experience** → recommend **Shader**\n - **Wants 3D with Three.js or custom WebGL** → recommend **Native**\n - **Knows P5.js/Processing** → recommend **P5**\n3. Generate a complete, working scene with parameters for everything the artist might want to adjust.\n4. Explain what the code does in simple terms.\n5. Suggest ways to iterate and improve.\n\n## RENDERER DECISION MATRIX\n\n- **Native**: Full control over Canvas 2D or WebGL. Supports `await import()` for external libraries like Three.js. Best for custom renderers, particle systems with CPU logic, complex state machines, or Three.js 3D scenes.\n- **P5**: Uses P5.js v1.9.4 with familiar `setup()`/`render()` pattern. Best for creative coding with shapes, colors, transforms, text. Use `// @renderer p5` for a 2D canvas, or `// @renderer p5 webgl` for P5’s WEBGL / 3D mode on the main canvas (never call `createCanvas` yourself).\n- **Shader**: GLSL fragment shader on a GPU fullscreen quad. Best for generative patterns, fractals, raymarching, SDF scenes, audio-reactive gradients, video post-processing. Extremely fast — runs entirely on the GPU.\n\n## REFERENCE (for AI assistants with web access)\n\nFor the latest API documentation, type definitions, and all uniform details:\n- Complete docs (all pages + examples): https://unpkg.com/@viji-dev/core/dist/docs-api.js\n- TypeScript API types: https://unpkg.com/@viji-dev/core/dist/artist-global.d.ts\n- Shader uniforms reference: https://unpkg.com/@viji-dev/core/dist/shader-uniforms.js\n- NPM package: https://www.npmjs.com/package/@viji-dev/core\n\n## VIJI ARCHITECTURE (all renderers)\n\n- Scenes run in a **Web Worker** with an **OffscreenCanvas**. 
There is NO DOM access.\n- The global `viji` object provides everything: canvas, timing, audio, video, CV, input, sensors, parameters.\n- **Top-level code** runs once (parameters, state, imports).\n- **`render(viji)` / `render(viji, p5)` / `void main()`** runs every frame.\n- `fetch()` is available. `window`, `document`, `Image()` are NOT.\n\n## SCENE STRUCTURE PER RENDERER\n\n### Native\n```javascript\nconst speed = viji.slider(1, { min: 0.1, max: 5, label: 'Speed' });\nlet angle = 0;\nfunction render(viji) {\n const ctx = viji.useContext('2d');\n angle += speed.value * viji.deltaTime;\n ctx.clearRect(0, 0, viji.width, viji.height);\n // draw with ctx...\n}\n```\n\n### P5\n```javascript\n// @renderer p5\nconst speed = viji.slider(1, { min: 0.1, max: 5, label: 'Speed' });\nlet angle = 0;\nfunction setup(viji, p5) { p5.colorMode(p5.HSB, 360, 100, 100); }\nfunction render(viji, p5) {\n angle += speed.value * viji.deltaTime;\n p5.background(0);\n // draw with p5.circle(), p5.rect(), etc. (all need p5. 
prefix)\n}\n```\n\n### Shader (GLSL)\n```glsl\n// @renderer shader\n// @viji-slider:speed label:\"Speed\" default:1.0 min:0.1 max:5.0\n// @viji-accumulator:phase rate:speed\nvoid main() {\n vec2 uv = gl_FragCoord.xy / u_resolution;\n gl_FragColor = vec4(uv, sin(phase) * 0.5 + 0.5, 1.0);\n}\n```\n\n## PARAMETERS (all renderers)\n\nArtists control scenes through parameters — declared once, shown as UI controls.\n\n**Native/P5 syntax** (top-level):\n```javascript\nviji.slider(default, { min, max, step, label, group, category }) // → .value: number\nviji.color(default, { label }) // → .value: '#rrggbb'\nviji.toggle(default, { label }) // → .value: boolean\nviji.select(default, { options: [...], label }) // → .value: string|number\nviji.number(default, { min, max, step, label }) // → .value: number\nviji.text(default, { label, maxLength }) // → .value: string\nviji.image(null, { label }) // → .value: ImageBitmap|null\nviji.button({ label }) // → .value: boolean (1 frame)\n```\n\n**Shader syntax** (comment directives):\n```glsl\n// @viji-slider:name label:\"Label\" default:1.0 min:0.0 max:5.0 → uniform float name;\n// @viji-color:name label:\"Color\" default:#ff6600 → uniform vec3 name;\n// @viji-toggle:name label:\"Toggle\" default:false → uniform bool name;\n// @viji-select:name label:\"Mode\" default:0 options:[\"A\",\"B\"] → uniform int name;\n// @viji-number:name label:\"Count\" default:10.0 min:1.0 max:100.0 → uniform float name;\n// @viji-image:name label:\"Texture\" → uniform sampler2D name;\n// @viji-button:name label:\"Reset\" → uniform bool name;\n// @viji-accumulator:name rate:speed → uniform float name;\n```\n\n## AUDIO — `viji.audio` / Shader uniforms\n\nALWAYS check `isConnected` / `u_audioVolume > 0.0` before using audio data.\n\nKey members (Native/P5):\n- `isConnected`, `volume.current`, `volume.peak`, `volume.smoothed`\n- `bands.low`, `bands.lowMid`, `bands.mid`, `bands.highMid`, `bands.high` (+ smoothed variants)\n- `beat.kick`, `beat.snare`, 
`beat.hat`, `beat.any` (+ `.triggers.kick` etc. for single-frame)\n- `beat.bpm`, `beat.confidence`, `beat.isLocked`\n- `spectral.brightness`, `spectral.flatness`\n- `getFrequencyData()`, `getWaveform()`\n\nKey shader uniforms: `u_audioVolume`, `u_audioLow`–`u_audioHigh`, `u_audioKick`–`u_audioAny`, `u_audioKickTrigger`–`u_audioAnyTrigger`, `u_audioBPM`, `u_audioBrightness`, `u_audioFlatness`, `u_audioFFT`, `u_audioWaveform`.\n\n**Additional audio streams** (`viji.audioStreams[]`, and `device.audio` on each `viji.devices[]` entry): lightweight **`AudioStreamAPI`** — `isConnected`, `volume` (current/peak/smoothed), `bands` (low/lowMid/mid/highMid/high + smoothed variants), `spectral` (brightness/flatness), `getFrequencyData()`, `getWaveform()`. **No** beat detection, BPM, triggers, or events. Shader: `u_audioStreamCount`, `u_audioStream{i}Connected`, `u_audioStream{i}Volume`, `Low`, `LowMid`, `Mid`, `HighMid`, `High`, `Brightness`, `Flatness` for `i` = 0–7. **No** per-stream FFT or waveform textures (`u_audioFFT` / `u_audioWaveform` apply only to the main audio source).\n\n## VIDEO & CV — `viji.video` / Shader uniforms\n\nALWAYS check `isConnected` / `u_videoConnected` first.\n\nKey members (Native/P5):\n- `isConnected`, `currentFrame`, `frameWidth`, `frameHeight`, `frameRate`, `getFrameData()`\n- CV toggle: `cv.enableFaceDetection(bool)`, `cv.enableFaceMesh(bool)`, `cv.enableEmotionDetection(bool)`, `cv.enableHandTracking(bool)`, `cv.enablePoseDetection(bool)`, `cv.enableBodySegmentation(bool)`\n- NEVER enable CV by default — use toggle parameters.\n- Data: `faces[]` (FaceData), `hands[]` (HandData), `pose` (PoseData|null), `segmentation` (SegmentationData|null)\n\nKey shader uniforms: `u_video`, `u_videoResolution`, `u_videoConnected`, `u_faceCount`, `u_face0*`, `u_handCount`, `u_leftHand*`, `u_rightHand*`, `u_poseDetected`, `u_pose*Position`, `u_segmentationMask`.\n\n## INPUT\n\n**Pointer** (unified): `viji.pointer.x/y`, `isDown`, `wasPressed`, `wasReleased`, 
`isInCanvas` / Shader: `u_pointer`, `u_pointerDown`, `u_pointerWasPressed`\n**Mouse**: `viji.mouse.x/y`, `isPressed`, `leftButton`, `wheelDelta` / Shader: `u_mouse`, `u_mousePressed`, `u_mouseWheel`\n**Keyboard**: `viji.keyboard.isPressed(key)`, `wasPressed(key)`, `activeKeys`, `shift/ctrl/alt/meta` / Shader: `u_keySpace`, `u_keyW/A/S/D`, `u_keyUp/Down/Left/Right`, `u_keyboard` texture\n**Touch**: `viji.touches.count`, `points[]`, `started[]`, `primary` / Shader: `u_touchCount`, `u_touch0`–`u_touch4`\n\n## SENSORS & EXTERNAL DEVICES\n\n**Device sensors**: `viji.device.motion` (acceleration, rotationRate), `viji.device.orientation` (alpha, beta, gamma) / Shader: `u_deviceAcceleration`, `u_deviceOrientation`\n**External devices**: `viji.devices[]` (id, name, motion, orientation, video, audio) / Shader: `u_device0`–`u_device7` textures, sensors, connection status; device audio uses `u_audioStream{i}*` scalars (see Streams)\n**Streams**: `viji.videoStreams[]` (host-provided additional video) / Shader: `u_videoStream0`–`u_videoStream7`. **`viji.audioStreams[]`** (host-provided additional audio) / Shader: `u_audioStreamCount`, `u_audioStream{i}Connected`, `u_audioStream{i}Volume` / `Low` / `LowMid` / `Mid` / `HighMid` / `High` / `Brightness` / `Flatness` for `i` = 0–7 (scalars only — no per-stream FFT/waveform textures)\n\n## CRITICAL RULES (all renderers)\n\n1. NEVER access `window`, `document`, `Image()`, `localStorage`. `fetch()` IS available.\n2. ALWAYS declare parameters at the TOP LEVEL, never inside `render()` / `main()`.\n3. ALWAYS use `viji.width`/`viji.height` or `u_resolution` — NEVER hardcode pixel sizes.\n4. ALWAYS use `viji.deltaTime` / `u_deltaTime` / `@viji-accumulator` for animation — NEVER count frames.\n5. NEVER allocate objects/arrays inside `render()` — pre-allocate at top level.\n6. ALWAYS check `isConnected` / connection uniforms before using audio or video data.\n7. NEVER enable CV features by default — use toggle parameters.\n8. 
In P5: ALWAYS prefix every P5 function/constant with `p5.`. NEVER use `createCanvas()`.\n9. In shaders: NEVER redeclare precision, built-in uniforms, or parameter uniforms. NEVER use `u_` prefix for parameter names.\n\n## FOR ADVANCED FEATURES\n\nWhen the artist needs the full API surface, use the renderer-specific prompts:\n- **Native**: Use the \"Prompt: Native Scenes\" page for the complete API reference\n- **P5**: Use the \"Prompt: P5 Scenes\" page for the complete API reference + P5 mapping\n- **Shader**: Use the \"Prompt: Shader Scenes\" page for all 270+ uniforms and directive details\n\nNow help the artist create their Viji scene. Start by asking what they want to build.\n````\n\n## Usage\n\n1. Copy the entire prompt block above.\n2. Paste it into your AI assistant.\n3. Describe what you want — even something simple like \"colorful circles that react to music.\"\n4. The AI will guide you through choosing a renderer and building a scene.\n\n> [!TIP]\n> Don't worry about technical details — the AI will handle those. Focus on describing what you want to **see** and **feel**. Mention colors, motion, mood, and what should drive the visuals (music, camera, mouse movement, etc.).\n\n## Next Steps\n\nOnce you've created your first scene and want more control, use the full renderer-specific prompts:\n\n- [Prompt: Native Scenes](/ai-prompts/native-prompt) — exhaustive Native API prompt\n- [Prompt: P5 Scenes](/ai-prompts/p5-prompt) — exhaustive P5 API prompt\n- [Prompt: Shader Scenes](/ai-prompts/shader-prompt) — exhaustive Shader API prompt\n- [Prompting Tips](/ai-prompts/prompting-tips) — how to get better results from AI\n\n## Related\n\n- [Overview](/getting-started/overview) — what Viji is and how it works\n- [Best Practices](/getting-started/best-practices) — essential patterns for robust scenes\n- [Common Mistakes](/getting-started/common-mistakes) — pitfalls to avoid"
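Editor's note on the `viji.audioStreams[]` API added in this version: the prompt stresses guarding `isConnected` on every stream. A hedged sketch of that guard applied across all additional streams — the `combinedStreamVolume` helper is illustrative, not part of the package:

```javascript
// Illustrative helper — NOT part of the Viji API. Averages the smoothed
// volume of every connected additional audio stream, skipping disconnected
// entries as the docs require. Returns 0 when nothing is connected.
function combinedStreamVolume(streams) {
  let sum = 0;
  let connected = 0;
  for (const s of streams) {
    if (s.isConnected) {
      sum += s.volume.smoothed;
      connected++;
    }
  }
  return connected > 0 ? sum / connected : 0;
}

// In a scene: const level = combinedStreamVolume(viji.audioStreams);
```

Because `AudioStreamAPI` has no beat detection or triggers, an aggregate level like this is a reasonable stand-in for driving visuals from the extra streams.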
920
941
  }
921
942
  ]
922
943
  },
@@ -927,7 +948,7 @@ export const docsApi = {
927
948
  "content": [
928
949
  {
929
950
  "type": "text",
930
- "markdown": "# Prompt: Native Scenes\n\nCopy the prompt below and paste it into your AI assistant. Then describe the scene you want. The prompt gives the AI everything it needs about Viji to generate a correct, working native scene.\n\n## The Prompt\n\n````\nYou are generating a Viji native scene — a creative visual that runs inside an OffscreenCanvas Web Worker.\nArtists describe what they want; you produce complete, working scene code. Apply every rule below exactly.\n\n## REFERENCE (for AI assistants with web access)\n\nThis prompt is self-contained — all information needed is included below.\nFor the latest API documentation and type definitions:\n- Complete docs (all pages + examples): https://unpkg.com/@viji-dev/core/dist/docs-api.js\n- TypeScript API types: https://unpkg.com/@viji-dev/core/dist/artist-global.d.ts\n- NPM package: https://www.npmjs.com/package/@viji-dev/core\n\n## ARCHITECTURE\n\n- Scenes run in a **Web Worker** with an **OffscreenCanvas**. There is no DOM.\n- The global `viji` object provides canvas, timing, audio, video, CV, input, sensors, and parameters.\n- **Top-level code** runs once (initialization, parameter declarations, state, imports).\n- **`function render(viji) { ... }`** is called every frame. This is where you draw.\n- Optional **`async function setup(viji) { ... }`** runs once before the first `render`.\n- **Top-level `await`** is supported — you can dynamically import libraries.\n\n## RULES\n\n1. NEVER access `window`, `document`, `Image()`, `localStorage`, or any DOM API. `fetch()` and `await import()` ARE available.\n2. ALWAYS declare parameters at the TOP LEVEL, never inside `render()` or `setup()`:\n ```javascript\n const speed = viji.slider(1, { min: 0.1, max: 5, label: 'Speed' });\n function render(viji) { /* use speed.value */ }\n ```\n3. ALWAYS read parameters via `.value`: `speed.value`, `color.value`, `toggle.value`.\n4. ALWAYS use `viji.width` and `viji.height` for canvas dimensions. 
NEVER hardcode pixel sizes.\n5. ALWAYS use `viji.time` or `viji.deltaTime` for animation. NEVER count frames or assume a fixed frame rate.\n - `viji.time` — elapsed seconds, best for oscillations and direct time-based effects.\n - `viji.deltaTime` — seconds since last frame, best for accumulators: `angle += speed.value * viji.deltaTime;`\n6. NEVER allocate objects, arrays, or strings inside `render()`. Pre-allocate at the top level and reuse.\n7. ALWAYS call `viji.useContext()` to get a canvas context. Choose ONE type and use it for the entire scene:\n - `viji.useContext('2d')` — Canvas 2D\n - `viji.useContext('webgl')` — WebGL 1\n - `viji.useContext('webgl2')` — WebGL 2\n Calling a different type after the first returns `null`.\n8. ALWAYS check `viji.audio.isConnected` before using audio data.\n9. ALWAYS check `viji.video.isConnected && viji.video.currentFrame` before drawing video.\n10. NEVER enable CV features by default. Use a toggle parameter so the user can opt in:\n ```javascript\n const useFace = viji.toggle(false, { label: 'Enable Face Detection', category: 'video' });\n // In render:\n if (useFace.value) await viji.video.cv.enableFaceDetection(true);\n else await viji.video.cv.enableFaceDetection(false);\n ```\n11. Be mindful of WebGL context limits — each CV feature uses its own WebGL context for ML. Enabling too many can cause context loss.\n12. For external libraries, use dynamic import with a pinned version:\n ```javascript\n const THREE = await import('https://esm.sh/three@0.160.0');\n ```\n Pass `viji.canvas` to the library's renderer. 
ALWAYS pass `false` as the third argument to Three.js `setSize()`.\n\n## COMPLETE API REFERENCE\n\n### Canvas & Context\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `viji.canvas` | `OffscreenCanvas` | The canvas element |\n| `viji.useContext('2d')` | `OffscreenCanvasRenderingContext2D` | Get 2D context |\n| `viji.useContext('webgl')` | `WebGLRenderingContext` | Get WebGL 1 context |\n| `viji.useContext('webgl2')` | `WebGL2RenderingContext` | Get WebGL 2 context |\n| `viji.ctx` | `OffscreenCanvasRenderingContext2D` | Shortcut (after useContext('2d')) |\n| `viji.gl` | `WebGLRenderingContext` | Shortcut (after useContext('webgl')) |\n| `viji.width` | `number` | Current canvas width in pixels |\n| `viji.height` | `number` | Current canvas height in pixels |\n\n### Timing\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `viji.time` | `number` | Seconds since scene start |\n| `viji.deltaTime` | `number` | Seconds since last frame |\n| `viji.frameCount` | `number` | Total frames rendered |\n| `viji.fps` | `number` | Current frames per second |\n\n### Parameters\n\nDeclare at top level. Read `.value` inside `render()`. All support `{ label, description?, group?, category? }`.\nCategory values: `'audio'`, `'video'`, `'interaction'`, `'general'`.\n\n```javascript\nviji.slider(default, { min?, max?, step?, label, group?, category? }) // { value: number }\nviji.color(default, { label, group?, category? }) // { value: '#rrggbb' }\nviji.toggle(default, { label, group?, category? }) // { value: boolean }\nviji.select(default, { options: [...], label, group?, category? }) // { value: string|number }\nviji.number(default, { min?, max?, step?, label, group?, category? }) // { value: number }\nviji.text(default, { label, group?, category?, maxLength? }) // { value: string }\nviji.image(null, { label, group?, category? }) // { value: ImageBitmap|null }\nviji.button({ label, description?, group?, category? 
}) // { value: boolean } (true one frame)\n```\n\n### Audio — `viji.audio`\n\nALWAYS check `viji.audio.isConnected` first.\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `isConnected` | `boolean` | Whether audio source is active |\n| `volume.current` | `number` | RMS volume 0–1 |\n| `volume.peak` | `number` | Peak amplitude 0–1 |\n| `volume.smoothed` | `number` | Smoothed volume (200ms decay) |\n| `bands.low` | `number` | 20–120 Hz energy 0–1 |\n| `bands.lowMid` | `number` | 120–400 Hz energy 0–1 |\n| `bands.mid` | `number` | 400–1600 Hz energy 0–1 |\n| `bands.highMid` | `number` | 1600–6000 Hz energy 0–1 |\n| `bands.high` | `number` | 6000–16000 Hz energy 0–1 |\n| `bands.lowSmoothed` … `bands.highSmoothed` | `number` | Smoothed variants of each band |\n| `beat.kick` | `number` | Kick energy 0–1 |\n| `beat.snare` | `number` | Snare energy 0–1 |\n| `beat.hat` | `number` | Hi-hat energy 0–1 |\n| `beat.any` | `number` | Any beat energy 0–1 |\n| `beat.kickSmoothed` … `beat.anySmoothed` | `number` | Smoothed beat values |\n| `beat.triggers.kick` | `boolean` | True on kick frame |\n| `beat.triggers.snare` | `boolean` | True on snare frame |\n| `beat.triggers.hat` | `boolean` | True on hat frame |\n| `beat.triggers.any` | `boolean` | True on any beat frame |\n| `beat.events` | `Array<{type,time,strength}>` | Recent beat events |\n| `beat.bpm` | `number` | Estimated BPM (60–240) |\n| `beat.confidence` | `number` | BPM tracking confidence 0–1 |\n| `beat.isLocked` | `boolean` | True when BPM is locked |\n| `spectral.brightness` | `number` | Spectral centroid 0–1 |\n| `spectral.flatness` | `number` | Spectral flatness 0–1 |\n| `getFrequencyData()` | `Uint8Array` | Raw FFT bins (0–255) |\n| `getWaveform()` | `Float32Array` | Time-domain waveform (−1 to 1) |\n\n### Video — `viji.video`\n\nALWAYS check `viji.video.isConnected` first. 
Check `currentFrame` before drawing.\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `isConnected` | `boolean` | Whether video source is active |\n| `currentFrame` | `OffscreenCanvas\\|ImageBitmap\\|null` | Current video frame |\n| `frameWidth` | `number` | Frame width in pixels |\n| `frameHeight` | `number` | Frame height in pixels |\n| `frameRate` | `number` | Video frame rate |\n| `getFrameData()` | `ImageData\\|null` | Pixel data for CPU access |\n\nDraw video: `ctx.drawImage(viji.video.currentFrame, 0, 0, viji.width, viji.height)`\n\n### Computer Vision — `viji.video.cv` & `viji.video.faces/hands/pose/segmentation`\n\nEnable features via toggle parameters (NEVER enable by default):\n\n```javascript\nawait viji.video.cv.enableFaceDetection(true/false);\nawait viji.video.cv.enableFaceMesh(true/false);\nawait viji.video.cv.enableEmotionDetection(true/false);\nawait viji.video.cv.enableHandTracking(true/false);\nawait viji.video.cv.enablePoseDetection(true/false);\nawait viji.video.cv.enableBodySegmentation(true/false);\nviji.video.cv.getActiveFeatures(); // CVFeature[]\nviji.video.cv.isProcessing(); // boolean\n```\n\n**`viji.video.faces: FaceData[]`**\nEach face: `id` (number), `bounds` ({x,y,width,height}), `center` ({x,y}), `confidence` (0–1), `landmarks` ({x,y,z?}[]), `expressions` ({neutral,happy,sad,angry,surprised,disgusted,fearful} all 0–1), `headPose` ({pitch,yaw,roll}), `blendshapes` (52 ARKit coefficients: browDownLeft, browDownRight, browInnerUp, browOuterUpLeft, browOuterUpRight, cheekPuff, cheekSquintLeft, cheekSquintRight, eyeBlinkLeft, eyeBlinkRight, eyeLookDownLeft, eyeLookDownRight, eyeLookInLeft, eyeLookInRight, eyeLookOutLeft, eyeLookOutRight, eyeLookUpLeft, eyeLookUpRight, eyeSquintLeft, eyeSquintRight, eyeWideLeft, eyeWideRight, jawForward, jawLeft, jawOpen, jawRight, mouthClose, mouthDimpleLeft, mouthDimpleRight, mouthFrownLeft, mouthFrownRight, mouthFunnel, mouthLeft, mouthLowerDownLeft, mouthLowerDownRight, 
mouthPressLeft, mouthPressRight, mouthPucker, mouthRight, mouthRollLower, mouthRollUpper, mouthShrugLower, mouthShrugUpper, mouthSmileLeft, mouthSmileRight, mouthStretchLeft, mouthStretchRight, mouthUpperUpLeft, mouthUpperUpRight, noseSneerLeft, noseSneerRight, tongueOut — all 0–1).\n\n**`viji.video.hands: HandData[]`**\nEach hand: `id` (number), `handedness` ('left'|'right'), `confidence` (0–1), `bounds` ({x,y,width,height}), `landmarks` ({x,y,z}[], 21 points), `palm` ({x,y,z}), `gestures` ({fist,openPalm,peace,thumbsUp,thumbsDown,pointing,iLoveYou} all 0–1 confidence).\n\n**`viji.video.pose: PoseData | null`**\n`confidence` (0–1), `landmarks` ({x,y,z,visibility}[], 33 points), plus body-part arrays: `face` ({x,y}[]), `torso`, `leftArm`, `rightArm`, `leftLeg`, `rightLeg`.\n\n**`viji.video.segmentation: SegmentationData | null`**\n`mask` (Uint8Array, 0=background 255=person), `width`, `height`.\n\n### Input — Pointer (unified mouse/touch) — `viji.pointer`\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `x`, `y` | `number` | Position in pixels |\n| `deltaX`, `deltaY` | `number` | Movement since last frame |\n| `isDown` | `boolean` | True if pressed/touching |\n| `wasPressed` | `boolean` | True on press frame |\n| `wasReleased` | `boolean` | True on release frame |\n| `isInCanvas` | `boolean` | True if inside canvas |\n| `type` | `string` | `'mouse'`, `'touch'`, or `'none'` |\n\n### Input — Mouse — `viji.mouse`\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `x`, `y` | `number` | Position in pixels |\n| `isInCanvas` | `boolean` | Inside canvas bounds |\n| `isPressed` | `boolean` | Any button pressed |\n| `leftButton`, `rightButton`, `middleButton` | `boolean` | Specific buttons |\n| `deltaX`, `deltaY` | `number` | Movement delta |\n| `wheelDelta` | `number` | Scroll wheel delta |\n| `wheelX`, `wheelY` | `number` | Horizontal/vertical scroll |\n| `wasPressed`, `wasReleased`, `wasMoved` | `boolean` | Frame-edge events 
|\n\n### Input — Keyboard — `viji.keyboard`\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `isPressed(key)` | `boolean` | True while key is held |\n| `wasPressed(key)` | `boolean` | True on key-down frame |\n| `wasReleased(key)` | `boolean` | True on key-up frame |\n| `activeKeys` | `Set<string>` | Currently held keys |\n| `pressedThisFrame` | `Set<string>` | Keys pressed this frame |\n| `releasedThisFrame` | `Set<string>` | Keys released this frame |\n| `lastKeyPressed` | `string` | Most recent key-down |\n| `lastKeyReleased` | `string` | Most recent key-up |\n| `shift`, `ctrl`, `alt`, `meta` | `boolean` | Modifier states |\n\n### Input — Touch — `viji.touches`\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `count` | `number` | Active touch count |\n| `points` | `TouchPoint[]` | All active touches |\n| `started` | `TouchPoint[]` | Touches started this frame |\n| `moved` | `TouchPoint[]` | Touches moved this frame |\n| `ended` | `TouchPoint[]` | Touches ended this frame |\n| `primary` | `TouchPoint\\|null` | First active touch |\n\n**TouchPoint:** `id`, `x`, `y`, `pressure`, `radius`, `radiusX`, `radiusY`, `rotationAngle`, `force`, `isInCanvas`, `deltaX`, `deltaY`, `velocity` ({x,y}), `isNew`, `isActive`, `isEnding`.\n\n### Device Sensors — `viji.device`\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `motion` | `DeviceMotionData\\|null` | Accelerometer/gyroscope |\n| `orientation` | `DeviceOrientationData\\|null` | Device orientation |\n\n**DeviceMotionData:** `acceleration` ({x,y,z} m/s²), `accelerationIncludingGravity`, `rotationRate` ({alpha,beta,gamma} deg/s), `interval` (ms).\n**DeviceOrientationData:** `alpha` (0–360° compass), `beta` (−180–180° tilt), `gamma` (−90–90° tilt), `absolute` (boolean).\n\n### External Devices — `viji.devices`\n\nArray of connected external devices. 
Each `DeviceState`:\n`id` (string), `name` (string), `motion` (DeviceMotionData|null), `orientation` (DeviceOrientationData|null), `video` (VideoAPI|null — same as viji.video but without CV).\n\n### Streams — `viji.streams`\n\n`VideoAPI[]` — additional video sources provided by the host application (used by the compositor for scene mixing). May be empty. Each element has the same shape as `viji.video`.\n\n### External Libraries\n\n```javascript\nconst THREE = await import('https://esm.sh/three@0.160.0');\nconst renderer = new THREE.WebGLRenderer({ canvas: viji.canvas, antialias: true });\nrenderer.setSize(viji.width, viji.height, false); // false = no CSS styles\n```\n\nALWAYS pin library versions. ALWAYS pass `viji.canvas` to the renderer. Handle resize in `render()`.\n\n## BEST PRACTICES\n\n1. Use `viji.deltaTime` accumulators for parameter-driven animation to prevent jumps:\n ```javascript\n let angle = 0;\n function render(viji) { angle += speed.value * viji.deltaTime; }\n ```\n2. Guard audio/video with `isConnected` checks.\n3. Pre-allocate all objects/arrays at top level — never inside `render()`.\n4. For CV, use toggle parameters — never enable by default.\n5. 
For WebGL scenes with Three.js, handle resize by comparing `viji.width/height` with previous values.\n\n## TEMPLATE\n\n```javascript\nconst bgColor = viji.color('#1a1a2e', { label: 'Background' });\nconst speed = viji.slider(1, { min: 0.1, max: 5, label: 'Speed' });\nconst count = viji.slider(12, { min: 3, max: 30, step: 1, label: 'Count' });\n\nlet angle = 0;\n\nfunction render(viji) {\n const ctx = viji.useContext('2d');\n angle += speed.value * viji.deltaTime;\n\n ctx.fillStyle = bgColor.value;\n ctx.fillRect(0, 0, viji.width, viji.height);\n\n const cx = viji.width / 2;\n const cy = viji.height / 2;\n const radius = Math.min(viji.width, viji.height) * 0.3;\n const dotSize = Math.min(viji.width, viji.height) * 0.02;\n const n = Math.floor(count.value);\n\n for (let i = 0; i < n; i++) {\n const a = angle + (i / n) * Math.PI * 2;\n const x = cx + Math.cos(a) * radius;\n const y = cy + Math.sin(a) * radius;\n const hue = (i / n) * 360;\n ctx.beginPath();\n ctx.arc(x, y, dotSize, 0, Math.PI * 2);\n ctx.fillStyle = `hsl(${hue}, 80%, 60%)`;\n ctx.fill();\n }\n}\n```\n\nNow generate a Viji native scene based on the artist's description below. Return ONLY the scene code.\nFollow all rules. Use `viji.deltaTime` for animation. Use parameters for anything the user might want to adjust. Check `isConnected` before using audio or video.\n````\n\n## Usage\n\n1. Copy the entire prompt block above.\n2. Paste it into your AI assistant (ChatGPT, Claude, etc.).\n3. After the prompt, describe the scene you want — be as specific as you like.\n4. 
The AI will return a complete Viji native scene.\n\n> [!TIP]\n> For better results, mention which data sources you want (audio, video, camera, mouse) and what kind of controls the user should have (sliders, toggles, color pickers).\n\n## Related\n\n- [Create Your First Scene](/ai-prompts/create-first-scene) — guided prompt for beginners\n- [Prompting Tips](/ai-prompts/prompting-tips) — how to get better results from AI\n- [Native Quick Start](/native/quickstart) — your first Viji native scene\n- [Native API Reference](/native/api-reference) — full API reference\n- [Best Practices](/getting-started/best-practices) — essential patterns for robust scenes\n- [Common Mistakes](/getting-started/common-mistakes) — pitfalls to avoid"
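The audio reference above documents band ranges (e.g. `bands.low` covering 20–120 Hz) alongside raw `getFrequencyData()` bins. How such a range maps onto FFT bins can be sketched in plain JavaScript — the 48 kHz sample rate and 1024-bin count here are assumptions for illustration only; the docs don't specify Viji's internals:

```javascript
// Sketch: mapping a frequency range (e.g. bands.low, 20–120 Hz) onto the
// raw bins returned by getFrequencyData(). Sample rate and bin count are
// hypothetical — Viji's actual analysis parameters are not documented.
function bandEnergy(bins, loHz, hiHz, sampleRate = 48000) {
  const binWidth = (sampleRate / 2) / bins.length; // Hz covered per FFT bin
  const lo = Math.max(0, Math.floor(loHz / binWidth));
  const hi = Math.min(bins.length - 1, Math.ceil(hiHz / binWidth));
  let sum = 0;
  for (let i = lo; i <= hi; i++) sum += bins[i];    // bins are 0–255
  return sum / ((hi - lo + 1) * 255);               // normalized to 0–1
}

const fake = new Uint8Array(1024).fill(128);
console.log(bandEnergy(fake, 20, 120).toFixed(3)); // ≈ 0.502
```

A scene would normally just read `viji.audio.bands.low` directly; the helper only illustrates where such a 0–1 number comes from.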
+ "markdown": "# Prompt: Native Scenes\n\nCopy the prompt below and paste it into your AI assistant. Then describe the scene you want. The prompt gives the AI everything it needs about Viji to generate a correct, working native scene.\n\n## The Prompt\n\n````\nYou are generating a Viji native scene — a creative visual that runs inside an OffscreenCanvas Web Worker.\nArtists describe what they want; you produce complete, working scene code. Apply every rule below exactly.\n\n## REFERENCE (for AI assistants with web access)\n\nThis prompt is self-contained — all information needed is included below.\nFor the latest API documentation and type definitions:\n- Complete docs (all pages + examples): https://unpkg.com/@viji-dev/core/dist/docs-api.js\n- TypeScript API types: https://unpkg.com/@viji-dev/core/dist/artist-global.d.ts\n- NPM package: https://www.npmjs.com/package/@viji-dev/core\n\n## ARCHITECTURE\n\n- Scenes run in a **Web Worker** with an **OffscreenCanvas**. There is no DOM.\n- The global `viji` object provides canvas, timing, audio, video, CV, input, sensors, and parameters.\n- **Top-level code** runs once (initialization, parameter declarations, state, imports).\n- **`function render(viji) { ... }`** is called every frame. This is where you draw.\n- Optional **`async function setup(viji) { ... }`** runs once before the first `render`.\n- **Top-level `await`** is supported — you can dynamically import libraries.\n\n## RULES\n\n1. NEVER access `window`, `document`, `Image()`, `localStorage`, or any DOM API. `fetch()` and `await import()` ARE available.\n2. ALWAYS declare parameters at the TOP LEVEL, never inside `render()` or `setup()`:\n ```javascript\n const speed = viji.slider(1, { min: 0.1, max: 5, label: 'Speed' });\n function render(viji) { /* use speed.value */ }\n ```\n3. ALWAYS read parameters via `.value`: `speed.value`, `color.value`, `toggle.value`.\n4. ALWAYS use `viji.width` and `viji.height` for canvas dimensions. 
NEVER hardcode pixel sizes.\n5. ALWAYS use `viji.time` or `viji.deltaTime` for animation. NEVER count frames or assume a fixed frame rate.\n - `viji.time` — elapsed seconds, best for oscillations and direct time-based effects.\n - `viji.deltaTime` — seconds since last frame, best for accumulators: `angle += speed.value * viji.deltaTime;`\n6. NEVER allocate objects, arrays, or strings inside `render()`. Pre-allocate at the top level and reuse.\n7. ALWAYS call `viji.useContext()` to get a canvas context. Choose ONE type and use it for the entire scene:\n - `viji.useContext('2d')` — Canvas 2D\n - `viji.useContext('webgl')` — WebGL 1\n - `viji.useContext('webgl2')` — WebGL 2\n Calling a different type after the first returns `null`.\n8. ALWAYS check `viji.audio.isConnected` before using audio data.\n9. ALWAYS check `viji.video.isConnected && viji.video.currentFrame` before drawing video.\n10. NEVER enable CV features by default. Use a toggle parameter so the user can opt in:\n ```javascript\n const useFace = viji.toggle(false, { label: 'Enable Face Detection', category: 'video' });\n // In render:\n if (useFace.value) await viji.video.cv.enableFaceDetection(true);\n else await viji.video.cv.enableFaceDetection(false);\n ```\n11. Be mindful of WebGL context limits — each CV feature uses its own WebGL context for ML. Enabling too many can cause context loss.\n12. For external libraries, use dynamic import with a pinned version:\n ```javascript\n const THREE = await import('https://esm.sh/three@0.160.0');\n ```\n Pass `viji.canvas` to the library's renderer. 
ALWAYS pass `false` as the third argument to Three.js `setSize()`.\n\n## COMPLETE API REFERENCE\n\n### Canvas & Context\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `viji.canvas` | `OffscreenCanvas` | The canvas element |\n| `viji.useContext('2d')` | `OffscreenCanvasRenderingContext2D` | Get 2D context |\n| `viji.useContext('webgl')` | `WebGLRenderingContext` | Get WebGL 1 context |\n| `viji.useContext('webgl2')` | `WebGL2RenderingContext` | Get WebGL 2 context |\n| `viji.ctx` | `OffscreenCanvasRenderingContext2D` | Shortcut (after useContext('2d')) |\n| `viji.gl` | `WebGLRenderingContext` | Shortcut (after useContext('webgl')) |\n| `viji.width` | `number` | Current canvas width in pixels |\n| `viji.height` | `number` | Current canvas height in pixels |\n\n### Timing\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `viji.time` | `number` | Seconds since scene start |\n| `viji.deltaTime` | `number` | Seconds since last frame |\n| `viji.frameCount` | `number` | Total frames rendered |\n| `viji.fps` | `number` | Current frames per second |\n\n### Parameters\n\nDeclare at top level. Read `.value` inside `render()`. All support `{ label, description?, group?, category? }`.\nCategory values: `'audio'`, `'video'`, `'interaction'`, `'general'`.\n\n```javascript\nviji.slider(default, { min?, max?, step?, label, group?, category? }) // { value: number }\nviji.color(default, { label, group?, category? }) // { value: '#rrggbb' }\nviji.toggle(default, { label, group?, category? }) // { value: boolean }\nviji.select(default, { options: [...], label, group?, category? }) // { value: string|number }\nviji.number(default, { min?, max?, step?, label, group?, category? }) // { value: number }\nviji.text(default, { label, group?, category?, maxLength? }) // { value: string }\nviji.image(null, { label, group?, category? }) // { value: ImageBitmap|null }\nviji.button({ label, description?, group?, category? 
}) // { value: boolean } (true one frame)\n```\n\n### Audio — `viji.audio`\n\nALWAYS check `viji.audio.isConnected` first.\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `isConnected` | `boolean` | Whether audio source is active |\n| `volume.current` | `number` | RMS volume 0–1 |\n| `volume.peak` | `number` | Peak amplitude 0–1 |\n| `volume.smoothed` | `number` | Smoothed volume (200ms decay) |\n| `bands.low` | `number` | 20–120 Hz energy 0–1 |\n| `bands.lowMid` | `number` | 120–400 Hz energy 0–1 |\n| `bands.mid` | `number` | 400–1600 Hz energy 0–1 |\n| `bands.highMid` | `number` | 1600–6000 Hz energy 0–1 |\n| `bands.high` | `number` | 6000–16000 Hz energy 0–1 |\n| `bands.lowSmoothed` … `bands.highSmoothed` | `number` | Smoothed variants of each band |\n| `beat.kick` | `number` | Kick energy 0–1 |\n| `beat.snare` | `number` | Snare energy 0–1 |\n| `beat.hat` | `number` | Hi-hat energy 0–1 |\n| `beat.any` | `number` | Any beat energy 0–1 |\n| `beat.kickSmoothed` … `beat.anySmoothed` | `number` | Smoothed beat values |\n| `beat.triggers.kick` | `boolean` | True on kick frame |\n| `beat.triggers.snare` | `boolean` | True on snare frame |\n| `beat.triggers.hat` | `boolean` | True on hat frame |\n| `beat.triggers.any` | `boolean` | True on any beat frame |\n| `beat.events` | `Array<{type,time,strength}>` | Recent beat events |\n| `beat.bpm` | `number` | Estimated BPM (60–240) |\n| `beat.confidence` | `number` | BPM tracking confidence 0–1 |\n| `beat.isLocked` | `boolean` | True when BPM is locked |\n| `spectral.brightness` | `number` | Spectral centroid 0–1 |\n| `spectral.flatness` | `number` | Spectral flatness 0–1 |\n| `getFrequencyData()` | `Uint8Array` | Raw FFT bins (0–255) |\n| `getWaveform()` | `Float32Array` | Time-domain waveform (−1 to 1) |\n\n**`viji.audioStreams` & `device.audio`:** Host and external devices may expose additional sources as **`AudioStreamAPI`** — same `isConnected`, `volume`, `bands` (+ smoothed), `spectral`, 
`getFrequencyData()`, and `getWaveform()` as above, but **no** `beat`, BPM, triggers, or events (lightweight subset).\n\n### Video — `viji.video`\n\nALWAYS check `viji.video.isConnected` first. Check `currentFrame` before drawing.\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `isConnected` | `boolean` | Whether video source is active |\n| `currentFrame` | `OffscreenCanvas\\|ImageBitmap\\|null` | Current video frame |\n| `frameWidth` | `number` | Frame width in pixels |\n| `frameHeight` | `number` | Frame height in pixels |\n| `frameRate` | `number` | Video frame rate |\n| `getFrameData()` | `ImageData\\|null` | Pixel data for CPU access |\n\nDraw video: `ctx.drawImage(viji.video.currentFrame, 0, 0, viji.width, viji.height)`\n\n### Computer Vision — `viji.video.cv` & `viji.video.faces/hands/pose/segmentation`\n\nEnable features via toggle parameters (NEVER enable by default):\n\n```javascript\nawait viji.video.cv.enableFaceDetection(true/false);\nawait viji.video.cv.enableFaceMesh(true/false);\nawait viji.video.cv.enableEmotionDetection(true/false);\nawait viji.video.cv.enableHandTracking(true/false);\nawait viji.video.cv.enablePoseDetection(true/false);\nawait viji.video.cv.enableBodySegmentation(true/false);\nviji.video.cv.getActiveFeatures(); // CVFeature[]\nviji.video.cv.isProcessing(); // boolean\n```\n\n**`viji.video.faces: FaceData[]`**\nEach face: `id` (number), `bounds` ({x,y,width,height}), `center` ({x,y}), `confidence` (0–1), `landmarks` ({x,y,z?}[]), `expressions` ({neutral,happy,sad,angry,surprised,disgusted,fearful} all 0–1), `headPose` ({pitch,yaw,roll}), `blendshapes` (52 ARKit coefficients: browDownLeft, browDownRight, browInnerUp, browOuterUpLeft, browOuterUpRight, cheekPuff, cheekSquintLeft, cheekSquintRight, eyeBlinkLeft, eyeBlinkRight, eyeLookDownLeft, eyeLookDownRight, eyeLookInLeft, eyeLookInRight, eyeLookOutLeft, eyeLookOutRight, eyeLookUpLeft, eyeLookUpRight, eyeSquintLeft, eyeSquintRight, eyeWideLeft, eyeWideRight, 
jawForward, jawLeft, jawOpen, jawRight, mouthClose, mouthDimpleLeft, mouthDimpleRight, mouthFrownLeft, mouthFrownRight, mouthFunnel, mouthLeft, mouthLowerDownLeft, mouthLowerDownRight, mouthPressLeft, mouthPressRight, mouthPucker, mouthRight, mouthRollLower, mouthRollUpper, mouthShrugLower, mouthShrugUpper, mouthSmileLeft, mouthSmileRight, mouthStretchLeft, mouthStretchRight, mouthUpperUpLeft, mouthUpperUpRight, noseSneerLeft, noseSneerRight, tongueOut — all 0–1).\n\n**`viji.video.hands: HandData[]`**\nEach hand: `id` (number), `handedness` ('left'|'right'), `confidence` (0–1), `bounds` ({x,y,width,height}), `landmarks` ({x,y,z}[], 21 points), `palm` ({x,y,z}), `gestures` ({fist,openPalm,peace,thumbsUp,thumbsDown,pointing,iLoveYou} all 0–1 confidence).\n\n**`viji.video.pose: PoseData | null`**\n`confidence` (0–1), `landmarks` ({x,y,z,visibility}[], 33 points), plus body-part arrays: `face` ({x,y}[]), `torso`, `leftArm`, `rightArm`, `leftLeg`, `rightLeg`.\n\n**`viji.video.segmentation: SegmentationData | null`**\n`mask` (Uint8Array, 0=background 255=person), `width`, `height`.\n\n### Input — Pointer (unified mouse/touch) — `viji.pointer`\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `x`, `y` | `number` | Position in pixels |\n| `deltaX`, `deltaY` | `number` | Movement since last frame |\n| `isDown` | `boolean` | True if pressed/touching |\n| `wasPressed` | `boolean` | True on press frame |\n| `wasReleased` | `boolean` | True on release frame |\n| `isInCanvas` | `boolean` | True if inside canvas |\n| `type` | `string` | `'mouse'`, `'touch'`, or `'none'` |\n\n### Input — Mouse — `viji.mouse`\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `x`, `y` | `number` | Position in pixels |\n| `isInCanvas` | `boolean` | Inside canvas bounds |\n| `isPressed` | `boolean` | Any button pressed |\n| `leftButton`, `rightButton`, `middleButton` | `boolean` | Specific buttons |\n| `deltaX`, `deltaY` | `number` | Movement delta |\n| 
`wheelDelta` | `number` | Scroll wheel delta |\n| `wheelX`, `wheelY` | `number` | Horizontal/vertical scroll |\n| `wasPressed`, `wasReleased`, `wasMoved` | `boolean` | Frame-edge events |\n\n### Input — Keyboard — `viji.keyboard`\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `isPressed(key)` | `boolean` | True while key is held |\n| `wasPressed(key)` | `boolean` | True on key-down frame |\n| `wasReleased(key)` | `boolean` | True on key-up frame |\n| `activeKeys` | `Set<string>` | Currently held keys |\n| `pressedThisFrame` | `Set<string>` | Keys pressed this frame |\n| `releasedThisFrame` | `Set<string>` | Keys released this frame |\n| `lastKeyPressed` | `string` | Most recent key-down |\n| `lastKeyReleased` | `string` | Most recent key-up |\n| `shift`, `ctrl`, `alt`, `meta` | `boolean` | Modifier states |\n\n### Input — Touch — `viji.touches`\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `count` | `number` | Active touch count |\n| `points` | `TouchPoint[]` | All active touches |\n| `started` | `TouchPoint[]` | Touches started this frame |\n| `moved` | `TouchPoint[]` | Touches moved this frame |\n| `ended` | `TouchPoint[]` | Touches ended this frame |\n| `primary` | `TouchPoint\\|null` | First active touch |\n\n**TouchPoint:** `id`, `x`, `y`, `pressure`, `radius`, `radiusX`, `radiusY`, `rotationAngle`, `force`, `isInCanvas`, `deltaX`, `deltaY`, `velocity` ({x,y}), `isNew`, `isActive`, `isEnding`.\n\n### Device Sensors — `viji.device`\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `motion` | `DeviceMotionData\\|null` | Accelerometer/gyroscope |\n| `orientation` | `DeviceOrientationData\\|null` | Device orientation |\n\n**DeviceMotionData:** `acceleration` ({x,y,z} m/s²), `accelerationIncludingGravity`, `rotationRate` ({alpha,beta,gamma} deg/s), `interval` (ms).\n**DeviceOrientationData:** `alpha` (0–360° compass), `beta` (−180–180° tilt), `gamma` (−90–90° tilt), `absolute` (boolean).\n\n### 
External Devices — `viji.devices`\n\nArray of connected external devices. Each `DeviceState`:\n`id` (string), `name` (string), `motion` (DeviceMotionData|null), `orientation` (DeviceOrientationData|null), `video` (VideoAPI|null — same as viji.video but without CV), `audio` (AudioStreamAPI|null — lightweight analysis only; no beat/BPM/triggers).\n\n### Streams — `viji.videoStreams`\n\n`VideoAPI[]` — additional video sources provided by the host application (used by the compositor for scene mixing). May be empty. Each element has the same shape as `viji.video`.\n\n### Streams — `viji.audioStreams`\n\n`AudioStreamAPI[]` — additional audio sources from the host (e.g. multi-source mixing). May be empty. Lightweight interface: volume, bands, spectral features, `getFrequencyData()`, `getWaveform()` — **not** the full `AudioAPI` (no beat detection, BPM, triggers, or events).\n\n### External Libraries\n\n```javascript\nconst THREE = await import('https://esm.sh/three@0.160.0');\nconst renderer = new THREE.WebGLRenderer({ canvas: viji.canvas, antialias: true });\nrenderer.setSize(viji.width, viji.height, false); // false = no CSS styles\n```\n\nALWAYS pin library versions. ALWAYS pass `viji.canvas` to the renderer. Handle resize in `render()`.\n\n## BEST PRACTICES\n\n1. Use `viji.deltaTime` accumulators for parameter-driven animation to prevent jumps:\n ```javascript\n let angle = 0;\n function render(viji) { angle += speed.value * viji.deltaTime; }\n ```\n2. Guard audio/video with `isConnected` checks.\n3. Pre-allocate all objects/arrays at top level — never inside `render()`.\n4. For CV, use toggle parameters — never enable by default.\n5. 
For WebGL scenes with Three.js, handle resize by comparing `viji.width/height` with previous values.\n\n## TEMPLATE\n\n```javascript\nconst bgColor = viji.color('#1a1a2e', { label: 'Background' });\nconst speed = viji.slider(1, { min: 0.1, max: 5, label: 'Speed' });\nconst count = viji.slider(12, { min: 3, max: 30, step: 1, label: 'Count' });\n\nlet angle = 0;\n\nfunction render(viji) {\n const ctx = viji.useContext('2d');\n angle += speed.value * viji.deltaTime;\n\n ctx.fillStyle = bgColor.value;\n ctx.fillRect(0, 0, viji.width, viji.height);\n\n const cx = viji.width / 2;\n const cy = viji.height / 2;\n const radius = Math.min(viji.width, viji.height) * 0.3;\n const dotSize = Math.min(viji.width, viji.height) * 0.02;\n const n = Math.floor(count.value);\n\n for (let i = 0; i < n; i++) {\n const a = angle + (i / n) * Math.PI * 2;\n const x = cx + Math.cos(a) * radius;\n const y = cy + Math.sin(a) * radius;\n const hue = (i / n) * 360;\n ctx.beginPath();\n ctx.arc(x, y, dotSize, 0, Math.PI * 2);\n ctx.fillStyle = `hsl(${hue}, 80%, 60%)`;\n ctx.fill();\n }\n}\n```\n\nNow generate a Viji native scene based on the artist's description below. Return ONLY the scene code.\nFollow all rules. Use `viji.deltaTime` for animation. Use parameters for anything the user might want to adjust. Check `isConnected` before using audio or video.\n````\n\n## Usage\n\n1. Copy the entire prompt block above.\n2. Paste it into your AI assistant (ChatGPT, Claude, etc.).\n3. After the prompt, describe the scene you want — be as specific as you like.\n4. 
The AI will return a complete Viji native scene.\n\n> [!TIP]\n> For better results, mention which data sources you want (audio, video, camera, mouse) and what kind of controls the user should have (sliders, toggles, color pickers).\n\n## Related\n\n- [Create Your First Scene](/ai-prompts/create-first-scene) — guided prompt for beginners\n- [Prompting Tips](/ai-prompts/prompting-tips) — how to get better results from AI\n- [Native Quick Start](/native/quickstart) — your first Viji native scene\n- [Native API Reference](/native/api-reference) — full API reference\n- [Best Practices](/getting-started/best-practices) — essential patterns for robust scenes\n- [Common Mistakes](/getting-started/common-mistakes) — pitfalls to avoid"
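Rule 5 and best practice 1 in the prompt above insist on `viji.deltaTime` accumulators rather than frame counting. Why that yields frame-rate-independent motion can be verified with a standalone sketch — no `viji` object is needed, since the loop below plays its role:

```javascript
// Sketch: the deltaTime accumulator pattern is frame-rate independent.
// Simulating the same elapsed second at 30 fps and 60 fps accumulates
// the same angle, because each step scales by dt.
function simulate(fps, speed, seconds) {
  const dt = 1 / fps;          // what viji.deltaTime would report each frame
  let angle = 0;
  for (let i = 0; i < fps * seconds; i++) {
    angle += speed * dt;       // the accumulator pattern from the rules
  }
  return angle;
}

const a30 = simulate(30, 2, 1);
const a60 = simulate(60, 2, 1);
console.log(a30.toFixed(6), a60.toFixed(6)); // both ≈ 2.000000
```

A frame-counting version (`angle += speed * 1`) would instead run twice as fast at 60 fps, which is exactly what rule 5 forbids.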
  }
  ]
  },
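Both prompts describe `volume.smoothed` as having a "200ms decay" without giving a formula. One plausible shape — an instant attack with a decay toward the raw value — can be sketched as follows; the implementation is purely an assumption for illustration, as the docs state only the decay time:

```javascript
// Hypothetical sketch of the smoothing behind volume.smoothed.
// Attack is instant; decay moves toward the raw value over ~decaySeconds.
function makeSmoother(decaySeconds = 0.2) {
  let smoothed = 0;
  return (raw, dt) => {
    if (raw > smoothed) {
      smoothed = raw;                                       // jump up instantly
    } else {
      smoothed += (raw - smoothed) * Math.min(1, dt / decaySeconds);
    }
    return smoothed;
  };
}

const smooth = makeSmoother();
smooth(1.0, 1 / 60);             // a peak arrives → jumps straight to 1.0
const after = smooth(0.0, 0.1);  // 100 ms of silence later: halfway decayed
console.log(after);              // 0.5
```

Scenes should prefer the built-in `volume.smoothed` (or the `*Smoothed` band and beat values); this sketch only shows why smoothed values lag pleasantly behind the raw signal.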
@@ -938,7 +959,7 @@ export const docsApi = {
  "content": [
  {
  "type": "text",
- "markdown": "# Prompt: P5 Scenes\n\nCopy the prompt below and paste it into your AI assistant. Then describe the scene you want. The prompt gives the AI everything it needs about Viji's P5 renderer to generate a correct, working scene.\n\n## The Prompt\n\n````\nYou are generating a Viji P5.js scene — a creative visual that runs inside an OffscreenCanvas Web Worker using P5.js.\nArtists describe what they want; you produce complete, working scene code. Apply every rule below exactly.\n\n## REFERENCE (for AI assistants with web access)\n\nThis prompt is self-contained — all information needed is included below.\nFor the latest API documentation and type definitions:\n- Complete docs (all pages + examples): https://unpkg.com/@viji-dev/core/dist/docs-api.js\n- TypeScript API types: https://unpkg.com/@viji-dev/core/dist/artist-global.d.ts\n- NPM package: https://www.npmjs.com/package/@viji-dev/core\n\n## ARCHITECTURE\n\n- Scenes run in a **Web Worker** with an **OffscreenCanvas**. There is no DOM.\n- Viji automatically loads **P5.js v1.9.4** when you use `// @renderer p5`.\n- The global `viji` object provides canvas, timing, audio, video, CV, input, sensors, and parameters.\n- **Top-level code** runs once (initialization, parameter declarations, state).\n- **`function render(viji, p5) { ... }`** is called every frame. This is where you draw.\n- Optional **`function setup(viji, p5) { ... }`** runs once for configuration (e.g., `p5.colorMode()`).\n- P5 runs in **instance mode** — every P5 function and constant requires the `p5.` prefix.\n\n## RULES\n\n1. ALWAYS add `// @renderer p5` as the very first line.\n2. ALWAYS use `render(viji, p5)` — not `draw()`. ALWAYS use `setup(viji, p5)` — not `setup()`.\n3. 
ALWAYS prefix every P5 function and constant with `p5.`:\n - `background(0)` → `p5.background(0)`\n - `fill(255)` → `p5.fill(255)`\n - `PI` → `p5.PI`, `TWO_PI` → `p5.TWO_PI`, `HSB` → `p5.HSB`\n - `createVector(1, 0)` → `p5.createVector(1, 0)`\n - `map(v, 0, 1, 0, 255)` → `p5.map(v, 0, 1, 0, 255)`\n - `noise(x)` → `p5.noise(x)`, `random()` → `p5.random()`\n This applies to ALL P5 functions and constants without exception.\n4. NEVER call `createCanvas()`. The canvas is created and managed by Viji.\n5. NEVER use `preload()`. Use `viji.image(null, { label: 'Name' })` for images, or `fetch()` in `setup()`.\n6. NEVER use P5 event callbacks: `mousePressed()`, `mouseDragged()`, `mouseReleased()`, `keyPressed()`, `keyReleased()`, `keyTyped()`, `touchStarted()`, `touchMoved()`, `touchEnded()`. Check state in `render()`:\n - `mouseIsPressed` → `viji.pointer.isDown` or `viji.mouse.isPressed`\n - `mouseX` / `mouseY` → `viji.pointer.x` / `viji.pointer.y` or `viji.mouse.x` / `viji.mouse.y`\n - `keyIsPressed` → `viji.keyboard.isPressed('keyName')`\n - For press-edge detection: `viji.pointer.wasPressed` / `viji.pointer.wasReleased`.\n7. NEVER use `loadImage()`, `loadFont()`, `loadJSON()`, `loadModel()`, `loadShader()`. Use `viji.image()` or `fetch()`.\n8. NEVER use `p5.frameRate()`, `p5.save()`, `p5.saveCanvas()`, `p5.saveFrames()`.\n9. NEVER use `createCapture()`, `createVideo()`. Use `viji.video.*` instead.\n10. NEVER use `p5.dom` or `p5.sound` libraries. Use Viji parameters for UI and `viji.audio.*` for audio.\n11. NEVER access `window`, `document`, `Image()`, or `localStorage`. `fetch()` IS available.\n12. ALWAYS declare parameters at the TOP LEVEL, never inside `render()` or `setup()`.\n13. ALWAYS read parameters via `.value`: `size.value`, `color.value`, `toggle.value`.\n14. ALWAYS use `viji.width` and `viji.height` for canvas dimensions. NEVER hardcode pixel sizes.\n15. 
ALWAYS use `viji.deltaTime` for frame-rate-independent animation:\n ```javascript\n let angle = 0;\n function render(viji, p5) { angle += speed.value * viji.deltaTime; }\n ```\n16. NEVER allocate objects, arrays, or strings inside `render()`. Pre-allocate at the top level and reuse.\n17. For image parameters displayed with P5, use `.p5` (not `.value`) with `p5.image()`:\n ```javascript\n const photo = viji.image(null, { label: 'Photo' });\n function render(viji, p5) {\n if (photo.value) p5.image(photo.p5, 0, 0, viji.width, viji.height);\n }\n ```\n18. For video frames with P5, use the drawing context directly:\n ```javascript\n if (viji.video.isConnected && viji.video.currentFrame) {\n p5.drawingContext.drawImage(viji.video.currentFrame, 0, 0, viji.width, viji.height);\n }\n ```\n19. `p5.createGraphics()` works (creates OffscreenCanvas internally). Use for off-screen buffers.\n20. Fonts: `p5.textFont()` only with CSS generic names (`monospace`, `serif`, `sans-serif`). `loadFont()` is NOT available.\n21. `p5.tint()` and `p5.blendMode()` work normally.\n22. WEBGL mode is NOT supported. Only use 2D rendering.\n23. `p5.pixelDensity()` defaults to 1 in the worker. `p5.loadPixels()` and `p5.pixels[]` work.\n24. ALWAYS check `viji.audio.isConnected` before using audio data.\n25. ALWAYS check `viji.video.isConnected && viji.video.currentFrame` before drawing video.\n26. NEVER enable CV features by default — use toggle parameters for user opt-in.\n27. 
`viji.useContext()` is NOT available in P5 scenes — the canvas is managed by P5.\n\n## COMPLETE API REFERENCE\n\nAll `viji.*` members are identical to the native renderer (same object, same types).\n\n### Canvas & Timing\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `viji.canvas` | `OffscreenCanvas` | The canvas element (managed by P5) |\n| `viji.width` | `number` | Current canvas width in pixels |\n| `viji.height` | `number` | Current canvas height in pixels |\n| `viji.time` | `number` | Seconds since scene start |\n| `viji.deltaTime` | `number` | Seconds since last frame |\n| `viji.frameCount` | `number` | Total frames rendered |\n| `viji.fps` | `number` | Current frames per second |\n\nNote: `viji.useContext()` is NOT available in P5. The canvas context is managed by P5 internally.\n\n### Parameters\n\nDeclare at top level. Read `.value` inside `render()`. All support `{ label, description?, group?, category? }`.\nCategory values: `'audio'`, `'video'`, `'interaction'`, `'general'`.\n\n```javascript\nviji.slider(default, { min?, max?, step?, label, group?, category? }) // { value: number }\nviji.color(default, { label, group?, category? }) // { value: '#rrggbb' }\nviji.toggle(default, { label, group?, category? }) // { value: boolean }\nviji.select(default, { options: [...], label, group?, category? }) // { value: string|number }\nviji.number(default, { min?, max?, step?, label, group?, category? }) // { value: number }\nviji.text(default, { label, group?, category?, maxLength? }) // { value: string }\nviji.image(null, { label, group?, category? }) // { value: ImageBitmap|null, p5: P5Image }\nviji.button({ label, description?, group?, category? 
}) // { value: boolean } (true one frame)\n```\n\n### Audio — `viji.audio`\n\nALWAYS check `viji.audio.isConnected` first.\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `isConnected` | `boolean` | Whether audio source is active |\n| `volume.current` | `number` | RMS volume 0–1 |\n| `volume.peak` | `number` | Peak amplitude 0–1 |\n| `volume.smoothed` | `number` | Smoothed volume (200ms decay) |\n| `bands.low` | `number` | 20–120 Hz energy 0–1 |\n| `bands.lowMid` | `number` | 120–400 Hz energy 0–1 |\n| `bands.mid` | `number` | 400–1600 Hz energy 0–1 |\n| `bands.highMid` | `number` | 1600–6000 Hz energy 0–1 |\n| `bands.high` | `number` | 6000–16000 Hz energy 0–1 |\n| `bands.lowSmoothed` … `bands.highSmoothed` | `number` | Smoothed variants of each band |\n| `beat.kick` | `number` | Kick energy 0–1 |\n| `beat.snare` | `number` | Snare energy 0–1 |\n| `beat.hat` | `number` | Hi-hat energy 0–1 |\n| `beat.any` | `number` | Any beat energy 0–1 |\n| `beat.kickSmoothed` … `beat.anySmoothed` | `number` | Smoothed beat values |\n| `beat.triggers.kick` | `boolean` | True on kick frame |\n| `beat.triggers.snare` | `boolean` | True on snare frame |\n| `beat.triggers.hat` | `boolean` | True on hat frame |\n| `beat.triggers.any` | `boolean` | True on any beat frame |\n| `beat.events` | `Array<{type,time,strength}>` | Recent beat events |\n| `beat.bpm` | `number` | Estimated BPM (60–240) |\n| `beat.confidence` | `number` | BPM tracking confidence 0–1 |\n| `beat.isLocked` | `boolean` | True when BPM is locked |\n| `spectral.brightness` | `number` | Spectral centroid 0–1 |\n| `spectral.flatness` | `number` | Spectral flatness 0–1 |\n| `getFrequencyData()` | `Uint8Array` | Raw FFT bins (0–255) |\n| `getWaveform()` | `Float32Array` | Time-domain waveform (−1 to 1) |\n\n### Video — `viji.video`\n\nALWAYS check `viji.video.isConnected` first. 
Check `currentFrame` before drawing.\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `isConnected` | `boolean` | Whether video source is active |\n| `currentFrame` | `OffscreenCanvas\\|ImageBitmap\\|null` | Current video frame |\n| `frameWidth` | `number` | Frame width in pixels |\n| `frameHeight` | `number` | Frame height in pixels |\n| `frameRate` | `number` | Video frame rate |\n| `getFrameData()` | `ImageData\\|null` | Pixel data for CPU access |\n\nDraw video with P5: `p5.drawingContext.drawImage(viji.video.currentFrame, 0, 0, viji.width, viji.height)`\n\n### Computer Vision — `viji.video.cv` & `viji.video.faces/hands/pose/segmentation`\n\nEnable features via toggle parameters (NEVER enable by default):\n\n```javascript\nawait viji.video.cv.enableFaceDetection(true/false);\nawait viji.video.cv.enableFaceMesh(true/false);\nawait viji.video.cv.enableEmotionDetection(true/false);\nawait viji.video.cv.enableHandTracking(true/false);\nawait viji.video.cv.enablePoseDetection(true/false);\nawait viji.video.cv.enableBodySegmentation(true/false);\nviji.video.cv.getActiveFeatures(); // CVFeature[]\nviji.video.cv.isProcessing(); // boolean\n```\n\n**`viji.video.faces: FaceData[]`**\nEach face: `id` (number), `bounds` ({x,y,width,height}), `center` ({x,y}), `confidence` (0–1), `landmarks` ({x,y,z?}[]), `expressions` ({neutral,happy,sad,angry,surprised,disgusted,fearful} all 0–1), `headPose` ({pitch,yaw,roll}), `blendshapes` (52 ARKit coefficients: browDownLeft, browDownRight, browInnerUp, browOuterUpLeft, browOuterUpRight, cheekPuff, cheekSquintLeft, cheekSquintRight, eyeBlinkLeft, eyeBlinkRight, eyeLookDownLeft, eyeLookDownRight, eyeLookInLeft, eyeLookInRight, eyeLookOutLeft, eyeLookOutRight, eyeLookUpLeft, eyeLookUpRight, eyeSquintLeft, eyeSquintRight, eyeWideLeft, eyeWideRight, jawForward, jawLeft, jawOpen, jawRight, mouthClose, mouthDimpleLeft, mouthDimpleRight, mouthFrownLeft, mouthFrownRight, mouthFunnel, mouthLeft, mouthLowerDownLeft, 
mouthLowerDownRight, mouthPressLeft, mouthPressRight, mouthPucker, mouthRight, mouthRollLower, mouthRollUpper, mouthShrugLower, mouthShrugUpper, mouthSmileLeft, mouthSmileRight, mouthStretchLeft, mouthStretchRight, mouthUpperUpLeft, mouthUpperUpRight, noseSneerLeft, noseSneerRight, tongueOut — all 0–1).\n\n**`viji.video.hands: HandData[]`**\nEach hand: `id` (number), `handedness` ('left'|'right'), `confidence` (0–1), `bounds` ({x,y,width,height}), `landmarks` ({x,y,z}[], 21 points), `palm` ({x,y,z}), `gestures` ({fist,openPalm,peace,thumbsUp,thumbsDown,pointing,iLoveYou} all 0–1 confidence).\n\n**`viji.video.pose: PoseData | null`**\n`confidence` (0–1), `landmarks` ({x,y,z,visibility}[], 33 points), plus body-part arrays: `face` ({x,y}[]), `torso`, `leftArm`, `rightArm`, `leftLeg`, `rightLeg`.\n\n**`viji.video.segmentation: SegmentationData | null`**\n`mask` (Uint8Array, 0=background 255=person), `width`, `height`.\n\n### Input — Pointer (unified mouse/touch) — `viji.pointer`\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `x`, `y` | `number` | Position in pixels |\n| `deltaX`, `deltaY` | `number` | Movement since last frame |\n| `isDown` | `boolean` | True if pressed/touching |\n| `wasPressed` | `boolean` | True on press frame |\n| `wasReleased` | `boolean` | True on release frame |\n| `isInCanvas` | `boolean` | True if inside canvas |\n| `type` | `string` | `'mouse'`, `'touch'`, or `'none'` |\n\n### Input — Mouse — `viji.mouse`\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `x`, `y` | `number` | Position in pixels |\n| `isInCanvas` | `boolean` | Inside canvas bounds |\n| `isPressed` | `boolean` | Any button pressed |\n| `leftButton`, `rightButton`, `middleButton` | `boolean` | Specific buttons |\n| `deltaX`, `deltaY` | `number` | Movement delta |\n| `wheelDelta` | `number` | Scroll wheel delta |\n| `wheelX`, `wheelY` | `number` | Horizontal/vertical scroll |\n| `wasPressed`, `wasReleased`, `wasMoved` | `boolean` | 
Frame-edge events |\n\n### Input — Keyboard — `viji.keyboard`\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `isPressed(key)` | `boolean` | True while key is held |\n| `wasPressed(key)` | `boolean` | True on key-down frame |\n| `wasReleased(key)` | `boolean` | True on key-up frame |\n| `activeKeys` | `Set<string>` | Currently held keys |\n| `pressedThisFrame` | `Set<string>` | Keys pressed this frame |\n| `releasedThisFrame` | `Set<string>` | Keys released this frame |\n| `lastKeyPressed` | `string` | Most recent key-down |\n| `lastKeyReleased` | `string` | Most recent key-up |\n| `shift`, `ctrl`, `alt`, `meta` | `boolean` | Modifier states |\n\n### Input — Touch — `viji.touches`\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `count` | `number` | Active touch count |\n| `points` | `TouchPoint[]` | All active touches |\n| `started` | `TouchPoint[]` | Touches started this frame |\n| `moved` | `TouchPoint[]` | Touches moved this frame |\n| `ended` | `TouchPoint[]` | Touches ended this frame |\n| `primary` | `TouchPoint\\|null` | First active touch |\n\n**TouchPoint:** `id`, `x`, `y`, `pressure`, `radius`, `radiusX`, `radiusY`, `rotationAngle`, `force`, `isInCanvas`, `deltaX`, `deltaY`, `velocity` ({x,y}), `isNew`, `isActive`, `isEnding`.\n\n### Device Sensors — `viji.device`\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `motion` | `DeviceMotionData\\|null` | Accelerometer/gyroscope |\n| `orientation` | `DeviceOrientationData\\|null` | Device orientation |\n\n**DeviceMotionData:** `acceleration` ({x,y,z} m/s²), `accelerationIncludingGravity`, `rotationRate` ({alpha,beta,gamma} deg/s), `interval` (ms).\n**DeviceOrientationData:** `alpha` (0–360° compass), `beta` (−180–180° tilt), `gamma` (−90–90° tilt), `absolute` (boolean).\n\n### External Devices — `viji.devices`\n\nArray of connected external devices. 
Each `DeviceState`:\n`id` (string), `name` (string), `motion` (DeviceMotionData|null), `orientation` (DeviceOrientationData|null), `video` (VideoAPI|null — same as viji.video but without CV).\n\n### Streams — `viji.streams`\n\n`VideoAPI[]` — additional video sources provided by the host application (used by the compositor for scene mixing). May be empty. Each element has the same shape as `viji.video`.\n\n## P5 ↔ VIJI MAPPING\n\n| Standard P5.js | Viji-P5 |\n|---|---|\n| `width` / `height` | `viji.width` / `viji.height` |\n| `mouseX` / `mouseY` | `viji.pointer.x` / `viji.pointer.y` |\n| `mouseIsPressed` | `viji.pointer.isDown` |\n| `mouseButton === LEFT` | `viji.mouse.leftButton` |\n| `keyIsPressed` | `viji.keyboard.isPressed('keyName')` |\n| `key` | `viji.keyboard.lastKeyPressed` |\n| `frameCount` | Use `viji.time` or `viji.deltaTime` accumulator |\n| `frameRate(n)` | Remove — host controls frame rate |\n| `createCanvas(w, h)` | Remove — canvas is provided |\n| `preload()` | Remove — use `viji.image()` or `fetch()` in `setup()` |\n| `loadImage(url)` | `viji.image(null, { label: 'Image' })` |\n| `save()` | Remove — host handles capture |\n\n## BEST PRACTICES\n\n1. Use `viji.deltaTime` accumulators for smooth, frame-rate-independent animation.\n2. Guard audio/video with `isConnected` checks.\n3. Pre-allocate all objects/arrays at top level — never inside `render()`.\n4. For CV, use toggle parameters — never enable by default.\n5. Use `p5.drawingContext.drawImage()` for video frames (faster than wrapping).\n6. 
Use `p5.createGraphics()` for off-screen buffers when needed.\n\n## TEMPLATE\n\n```javascript\n// @renderer p5\n\nconst bgColor = viji.color('#1a1a2e', { label: 'Background' });\nconst speed = viji.slider(1, { min: 0.1, max: 5, label: 'Speed' });\nconst count = viji.slider(8, { min: 3, max: 30, step: 1, label: 'Count' });\n\nlet angle = 0;\n\nfunction setup(viji, p5) {\n p5.colorMode(p5.HSB, 360, 100, 100);\n}\n\nfunction render(viji, p5) {\n angle += speed.value * viji.deltaTime;\n\n p5.background(bgColor.value);\n\n const cx = viji.width / 2;\n const cy = viji.height / 2;\n const radius = p5.min(viji.width, viji.height) * 0.3;\n const dotSize = p5.min(viji.width, viji.height) * 0.04;\n const n = p5.floor(count.value);\n\n p5.noStroke();\n for (let i = 0; i < n; i++) {\n const a = angle + (i / n) * p5.TWO_PI;\n const x = cx + p5.cos(a) * radius;\n const y = cy + p5.sin(a) * radius;\n p5.fill((i / n) * 360, 80, 90);\n p5.circle(x, y, dotSize);\n }\n}\n```\n\nNow generate a Viji P5 scene based on the artist's description below. Return ONLY the scene code.\nFollow all rules. Use `// @renderer p5` as the first line. Prefix ALL P5 functions with `p5.`. Use `viji.deltaTime` for animation. Use parameters for anything adjustable. Check `isConnected` before using audio or video.\n````\n\n## Usage\n\n1. Copy the entire prompt block above.\n2. Paste it into your AI assistant (ChatGPT, Claude, etc.).\n3. After the prompt, describe the scene you want.\n4. The AI will return a complete Viji P5 scene.\n\n> [!TIP]\n> For better results, mention which data sources you want (audio, video, camera, mouse) and what kind of controls the user should have. 
If you have existing P5 sketches to convert, use the [Convert: P5 Sketches](/ai-prompts/convert-p5) prompt instead.\n\n## Related\n\n- [Create Your First Scene](/ai-prompts/create-first-scene) — guided prompt for beginners\n- [Prompting Tips](/ai-prompts/prompting-tips) — how to get better results from AI\n- [Convert: P5 Sketches](/ai-prompts/convert-p5) — convert existing P5 sketches to Viji\n- [P5 Quick Start](/p5/quickstart) — your first Viji P5 scene\n- [P5 API Reference](/p5/api-reference) — full API reference\n- [Drawing with P5](/p5/drawing) — Viji-specific P5 drawing guide\n- [p5js.org Reference](https://p5js.org/reference/) — full P5.js documentation"
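The mapping table above replaces P5's event callbacks (`mousePressed()` etc.) with per-frame state polling on `viji.pointer`. A minimal sketch of that polling pattern, using a hypothetical stub (`makePointerStub` is not part of the Viji API — the real `viji.pointer` is supplied by the worker runtime and models only the fields documented above):

```javascript
// Stub standing in for the worker-provided `viji` object. Assumption:
// only the documented pointer fields (isDown/wasPressed/wasReleased)
// are modeled; `_set` is a test hook, not real Viji API.
function makePointerStub() {
  let down = false, prevDown = false;
  return {
    pointer: {
      get isDown() { return down; },
      get wasPressed() { return down && !prevDown; },   // press-edge frame
      get wasReleased() { return !down && prevDown; },  // release-edge frame
    },
    _set(next) { prevDown = down; down = next; },
  };
}

// Scene-side usage: check state inside render(), never via mousePressed().
function render(viji, state) {
  if (viji.pointer.wasPressed) state.clicks += 1; // fires once per press
  if (viji.pointer.isDown) state.heldFrames += 1; // fires every held frame
}

const viji = makePointerStub();
const state = { clicks: 0, heldFrames: 0 };
for (const pressed of [false, true, true, false, true, false]) {
  viji._set(pressed);
  render(viji, state);
}
console.log(state.clicks, state.heldFrames); // 2 press edges, 3 held frames
```

The edge flags (`wasPressed`/`wasReleased`) are true for exactly one frame, which is why the prompt routes one-shot actions through them rather than through `isDown`.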
962
+ "markdown": "# Prompt: P5 Scenes\n\nCopy the prompt below and paste it into your AI assistant. Then describe the scene you want. The prompt gives the AI everything it needs about Viji's P5 renderer to generate a correct, working scene.\n\n## The Prompt\n\n````\nYou are generating a Viji P5.js scene — a creative visual that runs inside an OffscreenCanvas Web Worker using P5.js.\nArtists describe what they want; you produce complete, working scene code. Apply every rule below exactly.\n\n## REFERENCE (for AI assistants with web access)\n\nThis prompt is self-contained — all information needed is included below.\nFor the latest API documentation and type definitions:\n- Complete docs (all pages + examples): https://unpkg.com/@viji-dev/core/dist/docs-api.js\n- TypeScript API types: https://unpkg.com/@viji-dev/core/dist/artist-global.d.ts\n- NPM package: https://www.npmjs.com/package/@viji-dev/core\n\n## ARCHITECTURE\n\n- Scenes run in a **Web Worker** with an **OffscreenCanvas**. There is no DOM.\n- Viji automatically loads **P5.js v1.9.4** when you use `// @renderer p5` or `// @renderer p5 webgl`.\n- The global `viji` object provides canvas, timing, audio, video, CV, input, sensors, and parameters.\n- **Top-level code** runs once (initialization, parameter declarations, state).\n- **`function render(viji, p5) { ... }`** is called every frame. This is where you draw.\n- Optional **`function setup(viji, p5) { ... }`** runs once for configuration (e.g., `p5.colorMode()`).\n- P5 runs in **instance mode** — every P5 function and constant requires the `p5.` prefix.\n\n## RULES\n\n1. ALWAYS add `// @renderer p5` (2D) or `// @renderer p5 webgl` (WEBGL) as the very first line, matching the scene’s needs.\n2. ALWAYS use `render(viji, p5)` — not `draw()`. ALWAYS use `setup(viji, p5)` — not `setup()`.\n3. 
ALWAYS prefix every P5 function and constant with `p5.`:\n - `background(0)` → `p5.background(0)`\n - `fill(255)` → `p5.fill(255)`\n - `PI` → `p5.PI`, `TWO_PI` → `p5.TWO_PI`, `HSB` → `p5.HSB`\n - `createVector(1, 0)` → `p5.createVector(1, 0)`\n - `map(v, 0, 1, 0, 255)` → `p5.map(v, 0, 1, 0, 255)`\n - `noise(x)` → `p5.noise(x)`, `random()` → `p5.random()`\n This applies to ALL P5 functions and constants without exception.\n4. NEVER call `createCanvas()`. The canvas is created and managed by Viji.\n5. NEVER use `preload()`. Use `viji.image(null, { label: 'Name' })` for images, or `fetch()` in `setup()`.\n6. NEVER use P5 event callbacks: `mousePressed()`, `mouseDragged()`, `mouseReleased()`, `keyPressed()`, `keyReleased()`, `keyTyped()`, `touchStarted()`, `touchMoved()`, `touchEnded()`. Check state in `render()`:\n - `mouseIsPressed` → `viji.pointer.isDown` or `viji.mouse.isPressed`\n - `mouseX` / `mouseY` → `viji.pointer.x` / `viji.pointer.y` or `viji.mouse.x` / `viji.mouse.y`\n - `keyIsPressed` → `viji.keyboard.isPressed('keyName')`\n - For press-edge detection: `viji.pointer.wasPressed` / `viji.pointer.wasReleased`.\n7. NEVER use `loadImage()`, `loadFont()`, `loadJSON()`, `loadModel()`, `loadShader()`. Use `viji.image()` or `fetch()`.\n8. NEVER use `p5.frameRate()`, `p5.save()`, `p5.saveCanvas()`, `p5.saveFrames()`.\n9. NEVER use `createCapture()`, `createVideo()`. Use `viji.video.*` instead.\n10. NEVER use `p5.dom` or `p5.sound` libraries. Use Viji parameters for UI and `viji.audio.*` for audio.\n11. NEVER access `window`, `document`, `Image()`, or `localStorage`. `fetch()` IS available.\n12. ALWAYS declare parameters at the TOP LEVEL, never inside `render()` or `setup()`.\n13. ALWAYS read parameters via `.value`: `size.value`, `color.value`, `toggle.value`.\n14. ALWAYS use `viji.width` and `viji.height` for canvas dimensions. NEVER hardcode pixel sizes.\n15. 
ALWAYS use `viji.deltaTime` for frame-rate-independent animation:\n ```javascript\n let angle = 0;\n function render(viji, p5) { angle += speed.value * viji.deltaTime; }\n ```\n16. NEVER allocate objects, arrays, or strings inside `render()`. Pre-allocate at the top level and reuse.\n17. For image parameters displayed with P5, use `.p5` (not `.value`) with `p5.image()`:\n ```javascript\n const photo = viji.image(null, { label: 'Photo' });\n function render(viji, p5) {\n if (photo.value) p5.image(photo.p5, 0, 0, viji.width, viji.height);\n }\n ```\n18. For video frames: in **2D** (`// @renderer p5`) you may use `p5.image(viji.video.currentFrame, ...)` or `p5.drawingContext.drawImage(...)`. In **WEBGL** (`// @renderer p5 webgl`), use `p5.image(viji.video.currentFrame, ...)` only — `p5.drawingContext` is WebGL, not Canvas 2D.\n ```javascript\n if (viji.video.isConnected && viji.video.currentFrame) {\n p5.image(viji.video.currentFrame, 0, 0, viji.width, viji.height);\n }\n ```\n19. `p5.createGraphics()` works (creates OffscreenCanvas internally). Use for off-screen buffers.\n20. Fonts: `p5.textFont()` only with CSS generic names (`monospace`, `serif`, `sans-serif`). `loadFont()` is NOT available.\n21. `p5.tint()` and `p5.blendMode()` work normally.\n22. **Canvas mode:** Use `// @renderer p5` for a **2D** main canvas. For **WEBGL / 3D**, the first line MUST be `// @renderer p5 webgl`. NEVER call `createCanvas()` or `createCanvas(..., p5.WEBGL)` — Viji creates the canvas in the correct mode.\n23. In **WEBGL** scenes, `p5.drawingContext` is a WebGL context — never use Canvas 2D–only APIs on it. Use P5 3D drawing, `p5.image()` / textures for images and video.\n24. `p5.createGraphics(w, h)` is **2D only**. `createGraphics(w, h, p5.WEBGL)` is NOT supported.\n25. `p5.pixelDensity()` defaults to 1 in the worker. `p5.loadPixels()` and `p5.pixels[]` work (2D scenes; WEBGL pixel readback follows P5.js rules).\n26. 
ALWAYS check `viji.audio.isConnected` before using audio data.\n27. ALWAYS check `viji.video.isConnected && viji.video.currentFrame` before drawing video.\n28. NEVER enable CV features by default — use toggle parameters for user opt-in.\n29. `viji.useContext()` is NOT available in P5 scenes — the canvas is managed by P5.\n\n## COMPLETE API REFERENCE\n\nAll `viji.*` members are identical to the native renderer (same object, same types).\n\n### Canvas & Timing\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `viji.canvas` | `OffscreenCanvas` | The canvas element (managed by P5) |\n| `viji.width` | `number` | Current canvas width in pixels |\n| `viji.height` | `number` | Current canvas height in pixels |\n| `viji.time` | `number` | Seconds since scene start |\n| `viji.deltaTime` | `number` | Seconds since last frame |\n| `viji.frameCount` | `number` | Total frames rendered |\n| `viji.fps` | `number` | Current frames per second |\n\nNote: `viji.useContext()` is NOT available in P5. The canvas context is managed by P5 internally.\n\n### Parameters\n\nDeclare at top level. Read `.value` inside `render()`. All support `{ label, description?, group?, category? }`.\nCategory values: `'audio'`, `'video'`, `'interaction'`, `'general'`.\n\n```javascript\nviji.slider(default, { min?, max?, step?, label, group?, category? }) // { value: number }\nviji.color(default, { label, group?, category? }) // { value: '#rrggbb' }\nviji.toggle(default, { label, group?, category? }) // { value: boolean }\nviji.select(default, { options: [...], label, group?, category? }) // { value: string|number }\nviji.number(default, { min?, max?, step?, label, group?, category? }) // { value: number }\nviji.text(default, { label, group?, category?, maxLength? }) // { value: string }\nviji.image(null, { label, group?, category? }) // { value: ImageBitmap|null, p5: P5Image }\nviji.button({ label, description?, group?, category? 
}) // { value: boolean } (true one frame)\n```\n\n### Audio — `viji.audio`\n\nALWAYS check `viji.audio.isConnected` first.\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `isConnected` | `boolean` | Whether audio source is active |\n| `volume.current` | `number` | RMS volume 0–1 |\n| `volume.peak` | `number` | Peak amplitude 0–1 |\n| `volume.smoothed` | `number` | Smoothed volume (200ms decay) |\n| `bands.low` | `number` | 20–120 Hz energy 0–1 |\n| `bands.lowMid` | `number` | 120–400 Hz energy 0–1 |\n| `bands.mid` | `number` | 400–1600 Hz energy 0–1 |\n| `bands.highMid` | `number` | 1600–6000 Hz energy 0–1 |\n| `bands.high` | `number` | 6000–16000 Hz energy 0–1 |\n| `bands.lowSmoothed` … `bands.highSmoothed` | `number` | Smoothed variants of each band |\n| `beat.kick` | `number` | Kick energy 0–1 |\n| `beat.snare` | `number` | Snare energy 0–1 |\n| `beat.hat` | `number` | Hi-hat energy 0–1 |\n| `beat.any` | `number` | Any beat energy 0–1 |\n| `beat.kickSmoothed` … `beat.anySmoothed` | `number` | Smoothed beat values |\n| `beat.triggers.kick` | `boolean` | True on kick frame |\n| `beat.triggers.snare` | `boolean` | True on snare frame |\n| `beat.triggers.hat` | `boolean` | True on hat frame |\n| `beat.triggers.any` | `boolean` | True on any beat frame |\n| `beat.events` | `Array<{type,time,strength}>` | Recent beat events |\n| `beat.bpm` | `number` | Estimated BPM (60–240) |\n| `beat.confidence` | `number` | BPM tracking confidence 0–1 |\n| `beat.isLocked` | `boolean` | True when BPM is locked |\n| `spectral.brightness` | `number` | Spectral centroid 0–1 |\n| `spectral.flatness` | `number` | Spectral flatness 0–1 |\n| `getFrequencyData()` | `Uint8Array` | Raw FFT bins (0–255) |\n| `getWaveform()` | `Float32Array` | Time-domain waveform (−1 to 1) |\n\n**`viji.audioStreams` & `device.audio`:** Host and external devices may expose additional sources as **`AudioStreamAPI`** — same `isConnected`, `volume`, `bands` (+ smoothed), `spectral`, 
`getFrequencyData()`, and `getWaveform()` as above, but **no** `beat`, BPM, triggers, or events (lightweight subset).\n\n### Video — `viji.video`\n\nALWAYS check `viji.video.isConnected` first. Check `currentFrame` before drawing.\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `isConnected` | `boolean` | Whether video source is active |\n| `currentFrame` | `OffscreenCanvas\\|ImageBitmap\\|null` | Current video frame |\n| `frameWidth` | `number` | Frame width in pixels |\n| `frameHeight` | `number` | Frame height in pixels |\n| `frameRate` | `number` | Video frame rate |\n| `getFrameData()` | `ImageData\\|null` | Pixel data for CPU access |\n\nDraw video with P5: `p5.drawingContext.drawImage(viji.video.currentFrame, 0, 0, viji.width, viji.height)`\n\n### Computer Vision — `viji.video.cv` & `viji.video.faces/hands/pose/segmentation`\n\nEnable features via toggle parameters (NEVER enable by default):\n\n```javascript\nawait viji.video.cv.enableFaceDetection(true/false);\nawait viji.video.cv.enableFaceMesh(true/false);\nawait viji.video.cv.enableEmotionDetection(true/false);\nawait viji.video.cv.enableHandTracking(true/false);\nawait viji.video.cv.enablePoseDetection(true/false);\nawait viji.video.cv.enableBodySegmentation(true/false);\nviji.video.cv.getActiveFeatures(); // CVFeature[]\nviji.video.cv.isProcessing(); // boolean\n```\n\n**`viji.video.faces: FaceData[]`**\nEach face: `id` (number), `bounds` ({x,y,width,height}), `center` ({x,y}), `confidence` (0–1), `landmarks` ({x,y,z?}[]), `expressions` ({neutral,happy,sad,angry,surprised,disgusted,fearful} all 0–1), `headPose` ({pitch,yaw,roll}), `blendshapes` (52 ARKit coefficients: browDownLeft, browDownRight, browInnerUp, browOuterUpLeft, browOuterUpRight, cheekPuff, cheekSquintLeft, cheekSquintRight, eyeBlinkLeft, eyeBlinkRight, eyeLookDownLeft, eyeLookDownRight, eyeLookInLeft, eyeLookInRight, eyeLookOutLeft, eyeLookOutRight, eyeLookUpLeft, eyeLookUpRight, eyeSquintLeft, eyeSquintRight, 
eyeWideLeft, eyeWideRight, jawForward, jawLeft, jawOpen, jawRight, mouthClose, mouthDimpleLeft, mouthDimpleRight, mouthFrownLeft, mouthFrownRight, mouthFunnel, mouthLeft, mouthLowerDownLeft, mouthLowerDownRight, mouthPressLeft, mouthPressRight, mouthPucker, mouthRight, mouthRollLower, mouthRollUpper, mouthShrugLower, mouthShrugUpper, mouthSmileLeft, mouthSmileRight, mouthStretchLeft, mouthStretchRight, mouthUpperUpLeft, mouthUpperUpRight, noseSneerLeft, noseSneerRight, tongueOut — all 0–1).\n\n**`viji.video.hands: HandData[]`**\nEach hand: `id` (number), `handedness` ('left'|'right'), `confidence` (0–1), `bounds` ({x,y,width,height}), `landmarks` ({x,y,z}[], 21 points), `palm` ({x,y,z}), `gestures` ({fist,openPalm,peace,thumbsUp,thumbsDown,pointing,iLoveYou} all 0–1 confidence).\n\n**`viji.video.pose: PoseData | null`**\n`confidence` (0–1), `landmarks` ({x,y,z,visibility}[], 33 points), plus body-part arrays: `face` ({x,y}[]), `torso`, `leftArm`, `rightArm`, `leftLeg`, `rightLeg`.\n\n**`viji.video.segmentation: SegmentationData | null`**\n`mask` (Uint8Array, 0=background 255=person), `width`, `height`.\n\n### Input — Pointer (unified mouse/touch) — `viji.pointer`\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `x`, `y` | `number` | Position in pixels |\n| `deltaX`, `deltaY` | `number` | Movement since last frame |\n| `isDown` | `boolean` | True if pressed/touching |\n| `wasPressed` | `boolean` | True on press frame |\n| `wasReleased` | `boolean` | True on release frame |\n| `isInCanvas` | `boolean` | True if inside canvas |\n| `type` | `string` | `'mouse'`, `'touch'`, or `'none'` |\n\n### Input — Mouse — `viji.mouse`\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `x`, `y` | `number` | Position in pixels |\n| `isInCanvas` | `boolean` | Inside canvas bounds |\n| `isPressed` | `boolean` | Any button pressed |\n| `leftButton`, `rightButton`, `middleButton` | `boolean` | Specific buttons |\n| `deltaX`, `deltaY` | `number` | 
Movement delta |\n| `wheelDelta` | `number` | Scroll wheel delta |\n| `wheelX`, `wheelY` | `number` | Horizontal/vertical scroll |\n| `wasPressed`, `wasReleased`, `wasMoved` | `boolean` | Frame-edge events |\n\n### Input — Keyboard — `viji.keyboard`\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `isPressed(key)` | `boolean` | True while key is held |\n| `wasPressed(key)` | `boolean` | True on key-down frame |\n| `wasReleased(key)` | `boolean` | True on key-up frame |\n| `activeKeys` | `Set<string>` | Currently held keys |\n| `pressedThisFrame` | `Set<string>` | Keys pressed this frame |\n| `releasedThisFrame` | `Set<string>` | Keys released this frame |\n| `lastKeyPressed` | `string` | Most recent key-down |\n| `lastKeyReleased` | `string` | Most recent key-up |\n| `shift`, `ctrl`, `alt`, `meta` | `boolean` | Modifier states |\n\n### Input — Touch — `viji.touches`\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `count` | `number` | Active touch count |\n| `points` | `TouchPoint[]` | All active touches |\n| `started` | `TouchPoint[]` | Touches started this frame |\n| `moved` | `TouchPoint[]` | Touches moved this frame |\n| `ended` | `TouchPoint[]` | Touches ended this frame |\n| `primary` | `TouchPoint\\|null` | First active touch |\n\n**TouchPoint:** `id`, `x`, `y`, `pressure`, `radius`, `radiusX`, `radiusY`, `rotationAngle`, `force`, `isInCanvas`, `deltaX`, `deltaY`, `velocity` ({x,y}), `isNew`, `isActive`, `isEnding`.\n\n### Device Sensors — `viji.device`\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `motion` | `DeviceMotionData\\|null` | Accelerometer/gyroscope |\n| `orientation` | `DeviceOrientationData\\|null` | Device orientation |\n\n**DeviceMotionData:** `acceleration` ({x,y,z} m/s²), `accelerationIncludingGravity`, `rotationRate` ({alpha,beta,gamma} deg/s), `interval` (ms).\n**DeviceOrientationData:** `alpha` (0–360° compass), `beta` (−180–180° tilt), `gamma` (−90–90° tilt), `absolute` 
(boolean).\n\n### External Devices — `viji.devices`\n\nArray of connected external devices. Each `DeviceState`:\n`id` (string), `name` (string), `motion` (DeviceMotionData|null), `orientation` (DeviceOrientationData|null), `video` (VideoAPI|null — same as viji.video but without CV).\n\n### Streams — `viji.videoStreams`\n\n`VideoAPI[]` — additional video sources provided by the host application (used by the compositor for scene mixing). May be empty. Each element has the same shape as `viji.video`.\n\n## P5 ↔ VIJI MAPPING\n\n| Standard P5.js | Viji-P5 |\n|---|---|\n| `width` / `height` | `viji.width` / `viji.height` |\n| `mouseX` / `mouseY` | `viji.pointer.x` / `viji.pointer.y` |\n| `mouseIsPressed` | `viji.pointer.isDown` |\n| `mouseButton === LEFT` | `viji.mouse.leftButton` |\n| `keyIsPressed` | `viji.keyboard.isPressed('keyName')` |\n| `key` | `viji.keyboard.lastKeyPressed` |\n| `frameCount` | Use `viji.time` or `viji.deltaTime` accumulator |\n| `frameRate(n)` | Remove — host controls frame rate |\n| `createCanvas(w, h)` | Remove — canvas is provided |\n| `preload()` | Remove — use `viji.image()` or `fetch()` in `setup()` |\n| `loadImage(url)` | `viji.image(null, { label: 'Image' })` |\n| `save()` | Remove — host handles capture |\n\n## BEST PRACTICES\n\n1. Use `viji.deltaTime` accumulators for smooth, frame-rate-independent animation.\n2. Guard audio/video with `isConnected` checks.\n3. Pre-allocate all objects/arrays at top level — never inside `render()`.\n4. For CV, use toggle parameters — never enable by default.\n5. Use `p5.drawingContext.drawImage()` for video frames (faster than wrapping).\n6. 
Use `p5.createGraphics()` for off-screen buffers when needed.\n\n## TEMPLATE\n\n```javascript\n// @renderer p5\n\nconst bgColor = viji.color('#1a1a2e', { label: 'Background' });\nconst speed = viji.slider(1, { min: 0.1, max: 5, label: 'Speed' });\nconst count = viji.slider(8, { min: 3, max: 30, step: 1, label: 'Count' });\n\nlet angle = 0;\n\nfunction setup(viji, p5) {\n p5.colorMode(p5.HSB, 360, 100, 100);\n}\n\nfunction render(viji, p5) {\n angle += speed.value * viji.deltaTime;\n\n p5.background(bgColor.value);\n\n const cx = viji.width / 2;\n const cy = viji.height / 2;\n const radius = p5.min(viji.width, viji.height) * 0.3;\n const dotSize = p5.min(viji.width, viji.height) * 0.04;\n const n = p5.floor(count.value);\n\n p5.noStroke();\n for (let i = 0; i < n; i++) {\n const a = angle + (i / n) * p5.TWO_PI;\n const x = cx + p5.cos(a) * radius;\n const y = cy + p5.sin(a) * radius;\n p5.fill((i / n) * 360, 80, 90);\n p5.circle(x, y, dotSize);\n }\n}\n```\n\nNow generate a Viji P5 scene based on the artist's description below. Return ONLY the scene code.\nFollow all rules. Use `// @renderer p5` (2D) or `// @renderer p5 webgl` (WEBGL) as the first line. Prefix ALL P5 functions with `p5.`. Use `viji.deltaTime` for animation. Use parameters for anything adjustable. Check `isConnected` before using audio or video.\n````\n\n## Usage\n\n1. Copy the entire prompt block above.\n2. Paste it into your AI assistant (ChatGPT, Claude, etc.).\n3. After the prompt, describe the scene you want.\n4. The AI will return a complete Viji P5 scene.\n\n> [!TIP]\n> For better results, mention which data sources you want (audio, video, camera, mouse) and what kind of controls the user should have. 
If you have existing P5 sketches to convert, use the [Convert: P5 Sketches](/ai-prompts/convert-p5) prompt instead.\n\n## Related\n\n- [Create Your First Scene](/ai-prompts/create-first-scene) — guided prompt for beginners\n- [Prompting Tips](/ai-prompts/prompting-tips) — how to get better results from AI\n- [Convert: P5 Sketches](/ai-prompts/convert-p5) — convert existing P5 sketches to Viji\n- [P5 Quick Start](/p5/quickstart) — your first Viji P5 scene\n- [P5 API Reference](/p5/api-reference) — full API reference\n- [Drawing with P5](/p5/drawing) — Viji-specific P5 drawing guide\n- [p5js.org Reference](https://p5js.org/reference/) — full P5.js documentation"
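The prompt's closing instruction ("Check `isConnected` before using audio or video") implies a graceful-fallback shape. A minimal sketch of that guard, with hypothetical stub objects in place of the worker-provided `viji.audio` (the helper name `pulseRadius` and the stubs are illustrative, not Viji API; field names follow the audio table above):

```javascript
// Map the documented smoothed low band (0–1) to a pulse size, falling
// back to a calm idle value when no audio source is connected.
function pulseRadius(audio, base) {
  if (!audio.isConnected) return base;         // ALWAYS guard first
  return base * (1 + audio.bands.lowSmoothed); // 0–1 energy scales size
}

// Stub audio states. Assumption: only the fields used above are modeled.
const disconnected = { isConnected: false, bands: { lowSmoothed: 0 } };
const bassHeavy = { isConnected: true, bands: { lowSmoothed: 0.5 } };

console.log(pulseRadius(disconnected, 100)); // 100 (idle fallback)
console.log(pulseRadius(bassHeavy, 100));    // 150 (bass-driven pulse)
```

Scenes built this way still render something sensible before the user connects an audio source, which is why the guard is a hard rule rather than a style preference.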
942
963
  }
943
964
  ]
944
965
  },
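Both versions of the P5 prompt above insist on `viji.deltaTime` accumulators (`angle += speed.value * viji.deltaTime`) over `frameCount`-based motion. The reason can be checked numerically: the accumulated angle depends only on elapsed time, not on how many frames it took. A minimal sketch with plain numbers in place of the live timing fields (`accumulate` is an illustrative helper, not Viji API):

```javascript
// Advance an angle over `frames` frames of `dt` seconds each — the same
// pattern as `angle += speed.value * viji.deltaTime` in the template.
function accumulate(speed, dt, frames) {
  let angle = 0;
  for (let i = 0; i < frames; i++) angle += speed * dt;
  return angle;
}

// One simulated second at 30 fps vs 60 fps: same elapsed time, same angle.
const at30 = accumulate(2.0, 1 / 30, 30);
const at60 = accumulate(2.0, 1 / 60, 60);
console.log(at30.toFixed(4), at60.toFixed(4)); // both 2.0000

// A per-frame increment (`angle += speed` each frame, i.e. frameCount-style)
// would instead yield 60 vs 120 — double the speed at double the frame rate.
```

The same property is what keeps the animation from jumping when a speed slider changes mid-run: the slider only scales future increments, never the already-accumulated phase.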
@@ -949,7 +970,7 @@ export const docsApi = {
949
970
  "content": [
950
971
  {
951
972
  "type": "text",
952
- "markdown": "# Prompt: Shader Scenes\n\nCopy the prompt below and paste it into your AI assistant. Then describe the shader effect you want. The prompt gives the AI everything it needs about Viji's shader renderer to generate a correct, working scene.\n\n## The Prompt\n\n````\nYou are generating a Viji GLSL shader scene — a fragment shader that runs on a fullscreen quad inside a Web Worker.\nArtists describe what they want; you produce complete, working GLSL code. Apply every rule below exactly.\n\n## REFERENCE (for AI assistants with web access)\n\nThis prompt is self-contained — all information needed is included below.\nFor the latest API documentation and type definitions:\n- Complete docs (all pages + examples): https://unpkg.com/@viji-dev/core/dist/docs-api.js\n- TypeScript API types: https://unpkg.com/@viji-dev/core/dist/artist-global.d.ts\n- Shader uniforms reference: https://unpkg.com/@viji-dev/core/dist/shader-uniforms.js\n- NPM package: https://www.npmjs.com/package/@viji-dev/core\n\n## ARCHITECTURE\n\n- Viji renders a **fullscreen quad**. Your shader defines the color of every pixel.\n- Viji **auto-injects** `precision mediump float;` and ALL uniform declarations — both built-in uniforms and parameter uniforms from `@viji-*` directives.\n- You write only helper functions and `void main() { ... }`.\n- **GLSL ES 1.00** by default. Add `#version 300 es` as the very first line for ES 3.00.\n- ES 3.00 requires `out vec4 fragColor;` (before `main`) and `fragColor = ...` instead of `gl_FragColor`.\n- ES 3.00 uses `texture()` instead of `texture2D()`.\n- If the shader uses `fwidth`, Viji auto-injects `#extension GL_OES_standard_derivatives : enable`.\n\n## RULES\n\n1. ALWAYS add `// @renderer shader` as the first line (or after `#version 300 es` if using ES 3.00).\n2. NEVER declare `precision mediump float;` or `precision highp float;` — Viji auto-injects precision.\n3. NEVER redeclare built-in uniforms (`u_time`, `u_resolution`, `u_mouse`, etc.) 
— they are auto-injected.\n4. NEVER redeclare parameter uniforms — they are auto-generated from `@viji-*` directives.\n5. NEVER use the `u_` prefix for your own parameter names — it is reserved for built-in uniforms. Name parameters descriptively: `speed`, `colorMix`, `intensity`.\n6. `@viji-*` parameter directives ONLY work with `//` comments. NEVER use `/* */` for directives.\n7. ALWAYS use `@viji-accumulator` instead of `u_time * speed` for parameter-driven animation — this prevents jumps when sliders change:\n ```glsl\n // @viji-slider:speed label:\"Speed\" default:1.0 min:0.1 max:5.0\n // @viji-accumulator:phase rate:speed\n float wave = sin(phase); // smooth, no jumps\n ```\n8. For `backbuffer` (previous frame), just reference it in code — Viji auto-detects and enables it.\n9. Remove any `#ifdef GL_ES` / `precision` blocks — Viji handles this.\n\n## COMPLETE UNIFORM REFERENCE\n\nAll uniforms below are always available — do NOT declare them.\n\n### Core\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_resolution` | `vec2` | Canvas width and height in pixels |\n| `u_time` | `float` | Elapsed seconds since scene start |\n| `u_deltaTime` | `float` | Seconds since last frame |\n| `u_frame` | `int` | Current frame number |\n| `u_fps` | `float` | Current frames per second |\n\n### Mouse\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_mouse` | `vec2` | Mouse position in pixels (WebGL coords: bottom-left origin) |\n| `u_mouseInCanvas` | `bool` | True if mouse is inside canvas |\n| `u_mousePressed` | `bool` | True if any mouse button is pressed |\n| `u_mouseLeft` | `bool` | True if left button is pressed |\n| `u_mouseRight` | `bool` | True if right button is pressed |\n| `u_mouseMiddle` | `bool` | True if middle button is pressed |\n| `u_mouseDelta` | `vec2` | Mouse movement delta per frame |\n| `u_mouseWheel` | `float` | Mouse wheel scroll delta |\n| `u_mouseWasPressed` | `bool` | True on the frame a button was 
pressed |\n| `u_mouseWasReleased` | `bool` | True on the frame a button was released |\n\n### Keyboard\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_keySpace` | `bool` | Spacebar |\n| `u_keyShift` | `bool` | Shift key |\n| `u_keyCtrl` | `bool` | Ctrl/Cmd key |\n| `u_keyAlt` | `bool` | Alt/Option key |\n| `u_keyW`, `u_keyA`, `u_keyS`, `u_keyD` | `bool` | WASD keys |\n| `u_keyUp`, `u_keyDown`, `u_keyLeft`, `u_keyRight` | `bool` | Arrow keys |\n| `u_keyboard` | `sampler2D` | Full keyboard state texture (256×3, LUMINANCE). Row 0: held, Row 1: pressed this frame, Row 2: toggle. Access: `texelFetch(u_keyboard, ivec2(keyCode, row), 0).r` |\n\n### Touch\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_touchCount` | `int` | Number of active touches (0–5) |\n| `u_touch0` – `u_touch4` | `vec2` | Touch point positions in pixels |\n\n### Pointer (unified mouse/touch)\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_pointer` | `vec2` | Primary input position in pixels (WebGL coords) |\n| `u_pointerDelta` | `vec2` | Primary input movement delta |\n| `u_pointerDown` | `bool` | True if primary input is active |\n| `u_pointerWasPressed` | `bool` | True on frame input became active |\n| `u_pointerWasReleased` | `bool` | True on frame input was released |\n| `u_pointerInCanvas` | `bool` | True if inside canvas |\n\n### Audio — Scalars\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_audioVolume` | `float` | RMS volume 0–1 |\n| `u_audioPeak` | `float` | Peak amplitude 0–1 |\n| `u_audioVolumeSmoothed` | `float` | Smoothed volume (200ms decay) |\n| `u_audioLow` | `float` | Low band 20–120 Hz |\n| `u_audioLowMid` | `float` | Low-mid 120–400 Hz |\n| `u_audioMid` | `float` | Mid 400–1600 Hz |\n| `u_audioHighMid` | `float` | High-mid 1600–6000 Hz |\n| `u_audioHigh` | `float` | High 6000–16000 Hz |\n| `u_audioLowSmoothed` – `u_audioHighSmoothed` | `float` | Smoothed band variants 
|\n| `u_audioKick` | `float` | Kick energy 0–1 |\n| `u_audioSnare` | `float` | Snare energy 0–1 |\n| `u_audioHat` | `float` | Hi-hat energy 0–1 |\n| `u_audioAny` | `float` | Any beat energy 0–1 |\n| `u_audioKickSmoothed` – `u_audioAnySmoothed` | `float` | Smoothed beat values |\n| `u_audioKickTrigger` | `bool` | True on kick beat frame |\n| `u_audioSnareTrigger` | `bool` | True on snare beat frame |\n| `u_audioHatTrigger` | `bool` | True on hat beat frame |\n| `u_audioAnyTrigger` | `bool` | True on any beat frame |\n| `u_audioBPM` | `float` | Estimated BPM (60–240) |\n| `u_audioConfidence` | `float` | Beat tracking confidence 0–1 |\n| `u_audioIsLocked` | `bool` | True when BPM is locked |\n| `u_audioBrightness` | `float` | Spectral brightness 0–1 |\n| `u_audioFlatness` | `float` | Spectral flatness 0–1 |\n\n### Audio — Textures\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_audioFFT` | `sampler2D` | FFT frequency spectrum (1024 bins, 0–255) |\n| `u_audioWaveform` | `sampler2D` | Time-domain waveform (−1 to 1) |\n\n### Video\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_video` | `sampler2D` | Current video frame texture |\n| `u_videoResolution` | `vec2` | Video frame size in pixels |\n| `u_videoFrameRate` | `float` | Video frame rate |\n| `u_videoConnected` | `bool` | True if video source is active |\n\n### CV — Face Detection\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_faceCount` | `int` | Number of detected faces (0–1) |\n| `u_face0Bounds` | `vec4` | Bounding box (x, y, width, height) normalized 0–1 |\n| `u_face0Center` | `vec2` | Face center (x, y) normalized 0–1 |\n| `u_face0HeadPose` | `vec3` | Head rotation (pitch, yaw, roll) in degrees |\n| `u_face0Confidence` | `float` | Detection confidence 0–1 |\n| `u_face0Neutral` – `u_face0Fearful` | `float` | 7 expression scores (neutral, happy, sad, angry, surprised, disgusted, fearful) |\n\n**52 Blendshape uniforms** (all 
`float`, 0–1, ARKit names prefixed with `u_face0`):\n`u_face0BrowDownLeft`, `u_face0BrowDownRight`, `u_face0BrowInnerUp`, `u_face0BrowOuterUpLeft`, `u_face0BrowOuterUpRight`, `u_face0CheekPuff`, `u_face0CheekSquintLeft`, `u_face0CheekSquintRight`, `u_face0EyeBlinkLeft`, `u_face0EyeBlinkRight`, `u_face0EyeLookDownLeft`, `u_face0EyeLookDownRight`, `u_face0EyeLookInLeft`, `u_face0EyeLookInRight`, `u_face0EyeLookOutLeft`, `u_face0EyeLookOutRight`, `u_face0EyeLookUpLeft`, `u_face0EyeLookUpRight`, `u_face0EyeSquintLeft`, `u_face0EyeSquintRight`, `u_face0EyeWideLeft`, `u_face0EyeWideRight`, `u_face0JawForward`, `u_face0JawLeft`, `u_face0JawOpen`, `u_face0JawRight`, `u_face0MouthClose`, `u_face0MouthDimpleLeft`, `u_face0MouthDimpleRight`, `u_face0MouthFrownLeft`, `u_face0MouthFrownRight`, `u_face0MouthFunnel`, `u_face0MouthLeft`, `u_face0MouthLowerDownLeft`, `u_face0MouthLowerDownRight`, `u_face0MouthPressLeft`, `u_face0MouthPressRight`, `u_face0MouthPucker`, `u_face0MouthRight`, `u_face0MouthRollLower`, `u_face0MouthRollUpper`, `u_face0MouthShrugLower`, `u_face0MouthShrugUpper`, `u_face0MouthSmileLeft`, `u_face0MouthSmileRight`, `u_face0MouthStretchLeft`, `u_face0MouthStretchRight`, `u_face0MouthUpperUpLeft`, `u_face0MouthUpperUpRight`, `u_face0NoseSneerLeft`, `u_face0NoseSneerRight`, `u_face0TongueOut`.\n\n### CV — Hands\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_handCount` | `int` | Number of detected hands (0–2) |\n| `u_leftHandPalm`, `u_rightHandPalm` | `vec3` | Palm position (x, y, z) |\n| `u_leftHandConfidence`, `u_rightHandConfidence` | `float` | Detection confidence 0–1 |\n| `u_leftHandBounds`, `u_rightHandBounds` | `vec4` | Bounding box normalized 0–1 |\n| `u_leftHandFist` – `u_leftHandILoveYou` | `float` | 7 left-hand gesture scores (fist, open, peace, thumbsUp, thumbsDown, pointing, iLoveYou) |\n| `u_rightHandFist` – `u_rightHandILoveYou` | `float` | 7 right-hand gesture scores |\n\n### CV — Pose\n\n| Uniform | Type | 
Description |\n|---------|------|-------------|\n| `u_poseDetected` | `bool` | True if a pose is detected |\n| `u_poseConfidence` | `float` | Detection confidence 0–1 |\n| `u_nosePosition` | `vec2` | Nose landmark (normalized 0–1) |\n| `u_leftShoulderPosition`, `u_rightShoulderPosition` | `vec2` | Shoulder positions |\n| `u_leftElbowPosition`, `u_rightElbowPosition` | `vec2` | Elbow positions |\n| `u_leftWristPosition`, `u_rightWristPosition` | `vec2` | Wrist positions |\n| `u_leftHipPosition`, `u_rightHipPosition` | `vec2` | Hip positions |\n| `u_leftKneePosition`, `u_rightKneePosition` | `vec2` | Knee positions |\n| `u_leftAnklePosition`, `u_rightAnklePosition` | `vec2` | Ankle positions |\n\n### CV — Body Segmentation\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_segmentationMask` | `sampler2D` | Segmentation mask (0=background, 1=person) |\n| `u_segmentationRes` | `vec2` | Mask resolution in pixels |\n\n### Device Sensors\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_deviceAcceleration` | `vec3` | Acceleration without gravity (m/s²) |\n| `u_deviceAccelerationGravity` | `vec3` | Acceleration with gravity (m/s²) |\n| `u_deviceRotationRate` | `vec3` | Rotation rate (deg/s) |\n| `u_deviceOrientation` | `vec3` | Orientation (alpha, beta, gamma) degrees |\n| `u_deviceOrientationAbsolute` | `bool` | True if using magnetometer |\n\n### External Devices\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_deviceCount` | `int` | Number of device video sources (0–8) |\n| `u_externalDeviceCount` | `int` | Number of external devices (0–8) |\n| `u_device0` – `u_device7` | `sampler2D` | Device camera textures |\n| `u_device0Resolution` – `u_device7Resolution` | `vec2` | Device camera resolutions |\n| `u_device0Connected` – `u_device7Connected` | `bool` | Device connection status |\n| `u_device0Acceleration` – `u_device7Acceleration` | `vec3` | Per-device acceleration |\n| 
`u_device0AccelerationGravity` – `u_device7AccelerationGravity` | `vec3` | Per-device acceleration w/ gravity |\n| `u_device0RotationRate` – `u_device7RotationRate` | `vec3` | Per-device rotation rate |\n| `u_device0Orientation` – `u_device7Orientation` | `vec3` | Per-device orientation |\n\n### Streams (Compositor)\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_streamCount` | `int` | Number of active streams (0–8) |\n| `u_stream0` – `u_stream7` | `sampler2D` | Stream textures |\n| `u_stream0Resolution` – `u_stream7Resolution` | `vec2` | Stream resolutions |\n| `u_stream0Connected` – `u_stream7Connected` | `bool` | Stream connection status |\n\nStreams are host-provided video sources used internally by the compositor.\n\n### Backbuffer\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `backbuffer` | `sampler2D` | Previous frame (auto-enabled when referenced) |\n\nNo `u_` prefix. RGBA 8-bit, LINEAR filtering, CLAMP_TO_EDGE wrapping. First frame samples as black. Content clears on canvas resize.\nSample: `texture2D(backbuffer, uv)` (ES 1.00) or `texture(backbuffer, uv)` (ES 3.00).\n\n## PARAMETER DIRECTIVES\n\nDeclare with `// @viji-TYPE:uniformName key:value ...` syntax. 
They become uniforms automatically.\n\n```glsl\n// @viji-slider:speed label:\"Speed\" default:1.0 min:0.1 max:5.0 step:0.1\n// → uniform float speed;\n\n// @viji-color:tint label:\"Tint\" default:#ff6600\n// → uniform vec3 tint; (RGB 0–1)\n\n// @viji-toggle:invert label:\"Invert\" default:false\n// → uniform bool invert;\n\n// @viji-select:mode label:\"Mode\" default:0 options:[\"Solid\",\"Gradient\",\"Noise\"]\n// → uniform int mode; (0-based index)\n\n// @viji-number:count label:\"Count\" default:10.0 min:1.0 max:100.0 step:1.0\n// → uniform float count;\n\n// @viji-image:tex label:\"Texture\"\n// → uniform sampler2D tex;\n\n// @viji-button:reset label:\"Reset\"\n// → uniform bool reset; (true for one frame on press)\n\n// @viji-accumulator:phase rate:speed\n// → uniform float phase; (CPU-side: += speed × deltaTime each frame)\n```\n\nAll directives support `group:\"GroupName\"` and `category:\"audio|video|interaction|general\"`.\n\n## TEMPLATE\n\n```glsl\n// @renderer shader\n// @viji-slider:speed label:\"Speed\" default:1.0 min:0.1 max:5.0\n// @viji-color:baseColor label:\"Color\" default:#ff6600\n// @viji-accumulator:phase rate:speed\n\nvoid main() {\n vec2 uv = gl_FragCoord.xy / u_resolution;\n\n float wave = sin(uv.x * 10.0 + phase) * 0.5 + 0.5;\n float pulse = 1.0 + u_audioLow * 0.5;\n vec3 color = baseColor * wave * pulse;\n\n gl_FragColor = vec4(color, 1.0);\n}\n```\n\nNow generate a Viji shader scene based on the artist's description below. Return ONLY the GLSL code.\nFollow all rules. Use `// @renderer shader` as the first line. Do NOT declare precision or uniforms. Use `@viji-accumulator` for parameter-driven animation. Use `@viji-slider/color/toggle` for artist controls.\n````\n\n## Usage\n\n1. Copy the entire prompt block above.\n2. Paste it into your AI assistant (ChatGPT, Claude, etc.).\n3. After the prompt, describe the shader effect you want.\n4. 
The AI will return a complete Viji shader scene.\n\n> [!TIP]\n> For better results, describe the visual effect you want (patterns, colors, motion), mention data sources (audio, video, mouse), and what controls the user should have. If you have existing Shadertoy shaders to convert, use the [Convert: Shadertoy](/ai-prompts/convert-shadertoy) prompt instead.\n\n## Related\n\n- [Create Your First Scene](/ai-prompts/create-first-scene) — guided prompt for beginners\n- [Prompting Tips](/ai-prompts/prompting-tips) — how to get better results from AI\n- [Convert: Shadertoy](/ai-prompts/convert-shadertoy) — convert existing Shadertoy shaders to Viji\n- [Shader Quick Start](/shader/quickstart) — your first Viji shader\n- [Shader API Reference](/shader/api-reference) — full uniform reference\n- [Backbuffer & Feedback](/shader/backbuffer) — previous-frame feedback effects\n- [Shadertoy Compatibility](/shader/shadertoy) — compatibility layer for Shadertoy code"
973
+ "markdown": "# Prompt: Shader Scenes\n\nCopy the prompt below and paste it into your AI assistant. Then describe the shader effect you want. The prompt gives the AI everything it needs about Viji's shader renderer to generate a correct, working scene.\n\n## The Prompt\n\n````\nYou are generating a Viji GLSL shader scene — a fragment shader that runs on a fullscreen quad inside a Web Worker.\nArtists describe what they want; you produce complete, working GLSL code. Apply every rule below exactly.\n\n## REFERENCE (for AI assistants with web access)\n\nThis prompt is self-contained — all information needed is included below.\nFor the latest API documentation and type definitions:\n- Complete docs (all pages + examples): https://unpkg.com/@viji-dev/core/dist/docs-api.js\n- TypeScript API types: https://unpkg.com/@viji-dev/core/dist/artist-global.d.ts\n- Shader uniforms reference: https://unpkg.com/@viji-dev/core/dist/shader-uniforms.js\n- NPM package: https://www.npmjs.com/package/@viji-dev/core\n\n## ARCHITECTURE\n\n- Viji renders a **fullscreen quad**. Your shader defines the color of every pixel.\n- Viji **auto-injects** `precision mediump float;` and ALL uniform declarations — both built-in uniforms and parameter uniforms from `@viji-*` directives.\n- You write only helper functions and `void main() { ... }`.\n- **GLSL ES 1.00** by default. Add `#version 300 es` as the very first line for ES 3.00.\n- ES 3.00 requires `out vec4 fragColor;` (before `main`) and `fragColor = ...` instead of `gl_FragColor`.\n- ES 3.00 uses `texture()` instead of `texture2D()`.\n- If the shader uses `fwidth`, Viji auto-injects `#extension GL_OES_standard_derivatives : enable`.\n\n## RULES\n\n1. ALWAYS add `// @renderer shader` as the first line (or after `#version 300 es` if using ES 3.00).\n2. NEVER declare `precision mediump float;` or `precision highp float;` — Viji auto-injects precision.\n3. NEVER redeclare built-in uniforms (`u_time`, `u_resolution`, `u_mouse`, etc.) 
— they are auto-injected.\n4. NEVER redeclare parameter uniforms — they are auto-generated from `@viji-*` directives.\n5. NEVER use the `u_` prefix for your own parameter names — it is reserved for built-in uniforms. Name parameters descriptively: `speed`, `colorMix`, `intensity`.\n6. `@viji-*` parameter directives ONLY work with `//` comments. NEVER use `/* */` for directives.\n7. ALWAYS use `@viji-accumulator` instead of `u_time * speed` for parameter-driven animation — this prevents jumps when sliders change:\n ```glsl\n // @viji-slider:speed label:\"Speed\" default:1.0 min:0.1 max:5.0\n // @viji-accumulator:phase rate:speed\n float wave = sin(phase); // smooth, no jumps\n ```\n8. For `backbuffer` (previous frame), just reference it in code — Viji auto-detects and enables it.\n9. Remove any `#ifdef GL_ES` / `precision` blocks — Viji handles this.\n\n## COMPLETE UNIFORM REFERENCE\n\nAll uniforms below are always available — do NOT declare them.\n\n### Core\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_resolution` | `vec2` | Canvas width and height in pixels |\n| `u_time` | `float` | Elapsed seconds since scene start |\n| `u_deltaTime` | `float` | Seconds since last frame |\n| `u_frame` | `int` | Current frame number |\n| `u_fps` | `float` | Current frames per second |\n\n### Mouse\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_mouse` | `vec2` | Mouse position in pixels (WebGL coords: bottom-left origin) |\n| `u_mouseInCanvas` | `bool` | True if mouse is inside canvas |\n| `u_mousePressed` | `bool` | True if any mouse button is pressed |\n| `u_mouseLeft` | `bool` | True if left button is pressed |\n| `u_mouseRight` | `bool` | True if right button is pressed |\n| `u_mouseMiddle` | `bool` | True if middle button is pressed |\n| `u_mouseDelta` | `vec2` | Mouse movement delta per frame |\n| `u_mouseWheel` | `float` | Mouse wheel scroll delta |\n| `u_mouseWasPressed` | `bool` | True on the frame a button was 
pressed |\n| `u_mouseWasReleased` | `bool` | True on the frame a button was released |\n\n### Keyboard\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_keySpace` | `bool` | Spacebar |\n| `u_keyShift` | `bool` | Shift key |\n| `u_keyCtrl` | `bool` | Ctrl/Cmd key |\n| `u_keyAlt` | `bool` | Alt/Option key |\n| `u_keyW`, `u_keyA`, `u_keyS`, `u_keyD` | `bool` | WASD keys |\n| `u_keyUp`, `u_keyDown`, `u_keyLeft`, `u_keyRight` | `bool` | Arrow keys |\n| `u_keyboard` | `sampler2D` | Full keyboard state texture (256×3, LUMINANCE). Row 0: held, Row 1: pressed this frame, Row 2: toggle. Access: `texelFetch(u_keyboard, ivec2(keyCode, row), 0).r` |\n\n### Touch\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_touchCount` | `int` | Number of active touches (0–5) |\n| `u_touch0` – `u_touch4` | `vec2` | Touch point positions in pixels |\n\n### Pointer (unified mouse/touch)\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_pointer` | `vec2` | Primary input position in pixels (WebGL coords) |\n| `u_pointerDelta` | `vec2` | Primary input movement delta |\n| `u_pointerDown` | `bool` | True if primary input is active |\n| `u_pointerWasPressed` | `bool` | True on frame input became active |\n| `u_pointerWasReleased` | `bool` | True on frame input was released |\n| `u_pointerInCanvas` | `bool` | True if inside canvas |\n\n### Audio — Scalars\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_audioVolume` | `float` | RMS volume 0–1 |\n| `u_audioPeak` | `float` | Peak amplitude 0–1 |\n| `u_audioVolumeSmoothed` | `float` | Smoothed volume (200ms decay) |\n| `u_audioLow` | `float` | Low band 20–120 Hz |\n| `u_audioLowMid` | `float` | Low-mid 120–400 Hz |\n| `u_audioMid` | `float` | Mid 400–1600 Hz |\n| `u_audioHighMid` | `float` | High-mid 1600–6000 Hz |\n| `u_audioHigh` | `float` | High 6000–16000 Hz |\n| `u_audioLowSmoothed` – `u_audioHighSmoothed` | `float` | Smoothed band variants 
|\n| `u_audioKick` | `float` | Kick energy 0–1 |\n| `u_audioSnare` | `float` | Snare energy 0–1 |\n| `u_audioHat` | `float` | Hi-hat energy 0–1 |\n| `u_audioAny` | `float` | Any beat energy 0–1 |\n| `u_audioKickSmoothed` – `u_audioAnySmoothed` | `float` | Smoothed beat values |\n| `u_audioKickTrigger` | `bool` | True on kick beat frame |\n| `u_audioSnareTrigger` | `bool` | True on snare beat frame |\n| `u_audioHatTrigger` | `bool` | True on hat beat frame |\n| `u_audioAnyTrigger` | `bool` | True on any beat frame |\n| `u_audioBPM` | `float` | Estimated BPM (60–240) |\n| `u_audioConfidence` | `float` | Beat tracking confidence 0–1 |\n| `u_audioIsLocked` | `bool` | True when BPM is locked |\n| `u_audioBrightness` | `float` | Spectral brightness 0–1 |\n| `u_audioFlatness` | `float` | Spectral flatness 0–1 |\n\n### Audio — Textures\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_audioFFT` | `sampler2D` | FFT frequency spectrum (1024 bins, 0–255) |\n| `u_audioWaveform` | `sampler2D` | Time-domain waveform (−1 to 1) |\n\n> [!NOTE]\n> **`u_audioFFT` / `u_audioWaveform` apply only to the main audio source.** Additional streams (host `audioStreams` and device audio) use **`u_audioStream{i}*`** float/bool uniforms only — see **Streams (Compositor)** → **Audio streams** in this reference.\n\n### Video\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_video` | `sampler2D` | Current video frame texture |\n| `u_videoResolution` | `vec2` | Video frame size in pixels |\n| `u_videoFrameRate` | `float` | Video frame rate |\n| `u_videoConnected` | `bool` | True if video source is active |\n\n### CV — Face Detection\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_faceCount` | `int` | Number of detected faces (0–1) |\n| `u_face0Bounds` | `vec4` | Bounding box (x, y, width, height) normalized 0–1 |\n| `u_face0Center` | `vec2` | Face center (x, y) normalized 0–1 |\n| `u_face0HeadPose` | `vec3` | Head 
rotation (pitch, yaw, roll) in degrees |\n| `u_face0Confidence` | `float` | Detection confidence 0–1 |\n| `u_face0Neutral` – `u_face0Fearful` | `float` | 7 expression scores (neutral, happy, sad, angry, surprised, disgusted, fearful) |\n\n**52 Blendshape uniforms** (all `float`, 0–1, ARKit names prefixed with `u_face0`):\n`u_face0BrowDownLeft`, `u_face0BrowDownRight`, `u_face0BrowInnerUp`, `u_face0BrowOuterUpLeft`, `u_face0BrowOuterUpRight`, `u_face0CheekPuff`, `u_face0CheekSquintLeft`, `u_face0CheekSquintRight`, `u_face0EyeBlinkLeft`, `u_face0EyeBlinkRight`, `u_face0EyeLookDownLeft`, `u_face0EyeLookDownRight`, `u_face0EyeLookInLeft`, `u_face0EyeLookInRight`, `u_face0EyeLookOutLeft`, `u_face0EyeLookOutRight`, `u_face0EyeLookUpLeft`, `u_face0EyeLookUpRight`, `u_face0EyeSquintLeft`, `u_face0EyeSquintRight`, `u_face0EyeWideLeft`, `u_face0EyeWideRight`, `u_face0JawForward`, `u_face0JawLeft`, `u_face0JawOpen`, `u_face0JawRight`, `u_face0MouthClose`, `u_face0MouthDimpleLeft`, `u_face0MouthDimpleRight`, `u_face0MouthFrownLeft`, `u_face0MouthFrownRight`, `u_face0MouthFunnel`, `u_face0MouthLeft`, `u_face0MouthLowerDownLeft`, `u_face0MouthLowerDownRight`, `u_face0MouthPressLeft`, `u_face0MouthPressRight`, `u_face0MouthPucker`, `u_face0MouthRight`, `u_face0MouthRollLower`, `u_face0MouthRollUpper`, `u_face0MouthShrugLower`, `u_face0MouthShrugUpper`, `u_face0MouthSmileLeft`, `u_face0MouthSmileRight`, `u_face0MouthStretchLeft`, `u_face0MouthStretchRight`, `u_face0MouthUpperUpLeft`, `u_face0MouthUpperUpRight`, `u_face0NoseSneerLeft`, `u_face0NoseSneerRight`, `u_face0TongueOut`.\n\n### CV — Hands\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_handCount` | `int` | Number of detected hands (0–2) |\n| `u_leftHandPalm`, `u_rightHandPalm` | `vec3` | Palm position (x, y, z) |\n| `u_leftHandConfidence`, `u_rightHandConfidence` | `float` | Detection confidence 0–1 |\n| `u_leftHandBounds`, `u_rightHandBounds` | `vec4` | Bounding box normalized 0–1 |\n| 
`u_leftHandFist` – `u_leftHandILoveYou` | `float` | 7 left-hand gesture scores (fist, open, peace, thumbsUp, thumbsDown, pointing, iLoveYou) |\n| `u_rightHandFist` – `u_rightHandILoveYou` | `float` | 7 right-hand gesture scores |\n\n### CV — Pose\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_poseDetected` | `bool` | True if a pose is detected |\n| `u_poseConfidence` | `float` | Detection confidence 0–1 |\n| `u_nosePosition` | `vec2` | Nose landmark (normalized 0–1) |\n| `u_leftShoulderPosition`, `u_rightShoulderPosition` | `vec2` | Shoulder positions |\n| `u_leftElbowPosition`, `u_rightElbowPosition` | `vec2` | Elbow positions |\n| `u_leftWristPosition`, `u_rightWristPosition` | `vec2` | Wrist positions |\n| `u_leftHipPosition`, `u_rightHipPosition` | `vec2` | Hip positions |\n| `u_leftKneePosition`, `u_rightKneePosition` | `vec2` | Knee positions |\n| `u_leftAnklePosition`, `u_rightAnklePosition` | `vec2` | Ankle positions |\n\n### CV — Body Segmentation\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_segmentationMask` | `sampler2D` | Segmentation mask (0=background, 1=person) |\n| `u_segmentationRes` | `vec2` | Mask resolution in pixels |\n\n### Device Sensors\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_deviceAcceleration` | `vec3` | Acceleration without gravity (m/s²) |\n| `u_deviceAccelerationGravity` | `vec3` | Acceleration with gravity (m/s²) |\n| `u_deviceRotationRate` | `vec3` | Rotation rate (deg/s) |\n| `u_deviceOrientation` | `vec3` | Orientation (alpha, beta, gamma) degrees |\n| `u_deviceOrientationAbsolute` | `bool` | True if using magnetometer |\n\n### External Devices\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_deviceCount` | `int` | Number of device video sources (0–8) |\n| `u_externalDeviceCount` | `int` | Number of external devices (0–8) |\n| `u_device0` – `u_device7` | `sampler2D` | Device camera textures |\n| 
`u_device0Resolution` – `u_device7Resolution` | `vec2` | Device camera resolutions |\n| `u_device0Connected` – `u_device7Connected` | `bool` | Device connection status |\n| `u_device0Acceleration` – `u_device7Acceleration` | `vec3` | Per-device acceleration |\n| `u_device0AccelerationGravity` – `u_device7AccelerationGravity` | `vec3` | Per-device acceleration w/ gravity |\n| `u_device0RotationRate` – `u_device7RotationRate` | `vec3` | Per-device rotation rate |\n| `u_device0Orientation` – `u_device7Orientation` | `vec3` | Per-device orientation |\n\n> [!NOTE]\n> **Device audio** (when an external device provides an audio source) is exposed as **`u_audioStream{i}*`** scalar uniforms — same per-slot names as compositor audio streams (`Connected`, `Volume`, band energies, `Brightness`, `Flatness` for `i` = 0–7). There are **no** per-device or per-stream FFT/waveform textures; only the **main** audio source gets `u_audioFFT` and `u_audioWaveform`.\n\n### Streams (Compositor)\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_videoStreamCount` | `int` | Number of active streams (0–8) |\n| `u_videoStream0` – `u_videoStream7` | `sampler2D` | Stream textures |\n| `u_videoStream0Resolution` – `u_videoStream7Resolution` | `vec2` | Stream resolutions |\n| `u_videoStream0Connected` – `u_videoStream7Connected` | `bool` | Stream connection status |\n\nStreams are host-provided video sources used internally by the compositor.\n\n#### Audio streams (additional sources)\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_audioStreamCount` | `int` | Number of active additional audio streams (0–8) |\n| `u_audioStream0Connected` – `u_audioStream7Connected` | `bool` | Whether that slot is actively providing audio |\n| `u_audioStream{i}Volume` | `float` | RMS-style volume 0–1 |\n| `u_audioStream{i}Low` – `u_audioStream{i}High` | `float` | Band energies 0–1 (`Low`, `LowMid`, `Mid`, `HighMid`, `High`) |\n| `u_audioStream{i}Brightness`, 
`u_audioStream{i}Flatness` | `float` | Spectral features 0–1 |\n\n(`i` = 0…7.) **Lightweight scalars only** — **no** `u_audioFFT` / `u_audioWaveform` per stream. Beat/BPM/trigger uniforms remain **main audio only** (`u_audioKick`, `u_audioBPM`, etc.).\n\n### Backbuffer\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `backbuffer` | `sampler2D` | Previous frame (auto-enabled when referenced) |\n\nNo `u_` prefix. RGBA 8-bit, LINEAR filtering, CLAMP_TO_EDGE wrapping. First frame samples as black. Content clears on canvas resize.\nSample: `texture2D(backbuffer, uv)` (ES 1.00) or `texture(backbuffer, uv)` (ES 3.00).\n\n## PARAMETER DIRECTIVES\n\nDeclare with `// @viji-TYPE:uniformName key:value ...` syntax. They become uniforms automatically.\n\n```glsl\n// @viji-slider:speed label:\"Speed\" default:1.0 min:0.1 max:5.0 step:0.1\n// → uniform float speed;\n\n// @viji-color:tint label:\"Tint\" default:#ff6600\n// → uniform vec3 tint; (RGB 0–1)\n\n// @viji-toggle:invert label:\"Invert\" default:false\n// → uniform bool invert;\n\n// @viji-select:mode label:\"Mode\" default:0 options:[\"Solid\",\"Gradient\",\"Noise\"]\n// → uniform int mode; (0-based index)\n\n// @viji-number:count label:\"Count\" default:10.0 min:1.0 max:100.0 step:1.0\n// → uniform float count;\n\n// @viji-image:tex label:\"Texture\"\n// → uniform sampler2D tex;\n\n// @viji-button:reset label:\"Reset\"\n// → uniform bool reset; (true for one frame on press)\n\n// @viji-accumulator:phase rate:speed\n// → uniform float phase; (CPU-side: += speed × deltaTime each frame)\n```\n\nAll directives support `group:\"GroupName\"` and `category:\"audio|video|interaction|general\"`.\n\n## TEMPLATE\n\n```glsl\n// @renderer shader\n// @viji-slider:speed label:\"Speed\" default:1.0 min:0.1 max:5.0\n// @viji-color:baseColor label:\"Color\" default:#ff6600\n// @viji-accumulator:phase rate:speed\n\nvoid main() {\n vec2 uv = gl_FragCoord.xy / u_resolution;\n\n float wave = sin(uv.x * 10.0 + phase) * 
0.5 + 0.5;\n float pulse = 1.0 + u_audioLow * 0.5;\n vec3 color = baseColor * wave * pulse;\n\n gl_FragColor = vec4(color, 1.0);\n}\n```\n\nNow generate a Viji shader scene based on the artist's description below. Return ONLY the GLSL code.\nFollow all rules. Use `// @renderer shader` as the first line. Do NOT declare precision or uniforms. Use `@viji-accumulator` for parameter-driven animation. Use `@viji-slider/color/toggle` for artist controls.\n````\n\n## Usage\n\n1. Copy the entire prompt block above.\n2. Paste it into your AI assistant (ChatGPT, Claude, etc.).\n3. After the prompt, describe the shader effect you want.\n4. The AI will return a complete Viji shader scene.\n\n> [!TIP]\n> For better results, describe the visual effect you want (patterns, colors, motion), mention data sources (audio, video, mouse), and what controls the user should have. If you have existing Shadertoy shaders to convert, use the [Convert: Shadertoy](/ai-prompts/convert-shadertoy) prompt instead.\n\n## Related\n\n- [Create Your First Scene](/ai-prompts/create-first-scene) — guided prompt for beginners\n- [Prompting Tips](/ai-prompts/prompting-tips) — how to get better results from AI\n- [Convert: Shadertoy](/ai-prompts/convert-shadertoy) — convert existing Shadertoy shaders to Viji\n- [Shader Quick Start](/shader/quickstart) — your first Viji shader\n- [Shader API Reference](/shader/api-reference) — full uniform reference\n- [Backbuffer & Feedback](/shader/backbuffer) — previous-frame feedback effects\n- [Shadertoy Compatibility](/shader/shadertoy) — compatibility layer for Shadertoy code"
953
974
  }
954
975
  ]
955
976
  },
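Both prompt documents in this diff insist on the same frame-rate-independence rule for parameter-driven animation: the shader rules' point 7 (`@viji-accumulator`, described as "CPU-side: += speed × deltaTime each frame") and the P5 rules' point 16 (a `deltaTime` accumulator instead of `frameCount * 0.01`). The arithmetic they describe can be sketched in plain JavaScript — an illustrative sketch only, not code from the package — to show why a mid-animation slider change stays smooth with an accumulator but jumps with the naive `time * speed` pattern:

```javascript
// Illustrative sketch (not part of @viji-dev/core): the per-frame update
// `phase += rate * deltaTime` stays continuous when the rate changes,
// while recomputing `time * speed` from scratch jumps.

function step(phase, rate, deltaTime) {
  return phase + rate * deltaTime; // integrate the current rate over one frame
}

const dt = 1 / 60; // assume a steady 60 fps frame time, in seconds
let phase = 0;     // accumulator value
let time = 0;      // elapsed time, as u_time / viji.time would report it

// 60 frames at speed 1.0: accumulator and naive pattern agree (~1.0).
for (let i = 0; i < 60; i++) {
  time += dt;
  phase = step(phase, 1.0, dt);
}

// The user drags the speed slider from 1.0 to 5.0; one more frame elapses.
time += dt;
phase = step(phase, 5.0, dt);

const naive = time * 5.0; // recomputed from scratch: leaps to ~5.08
// `phase` only grew by 5.0 * dt (~0.083), so sin(phase) shows no jump.
```

In the shader case Viji performs this update on the CPU and uploads the result as the accumulator uniform; the sketch only mirrors that arithmetic.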
@@ -971,7 +992,7 @@ export const docsApi = {
971
992
  "content": [
972
993
  {
973
994
  "type": "text",
974
- "markdown": "# Convert: P5 Sketches to Viji\r\n\r\nCopy the prompt below and paste it into your AI assistant along with the P5.js sketch you want to convert. The prompt contains all the rules the AI needs to produce a correct Viji-P5 scene.\r\n\r\n## The Prompt\r\n\r\n````\r\nYou are converting a standard P5.js sketch into a Viji-P5 scene.\r\nViji scenes run inside an OffscreenCanvas Web Worker. Apply every rule below exactly.\r\n\r\n## RULES\r\n\r\n1. ALWAYS add `// @renderer p5` as the very first line.\r\n2. ALWAYS rename `draw()` to `render(viji, p5)`.\r\n3. If `setup()` exists, change its signature to `setup(viji, p5)`. If it doesn't exist, do NOT add one.\r\n4. ALWAYS prefix every P5 function and constant with `p5.`:\r\n - `background(0)` → `p5.background(0)`\r\n - `fill(255)` → `p5.fill(255)`\r\n - `PI` → `p5.PI`, `TWO_PI` → `p5.TWO_PI`, `HSB` → `p5.HSB`\r\n - `createVector(1, 0)` → `p5.createVector(1, 0)`\r\n - `map(v, 0, 1, 0, 255)` → `p5.map(v, 0, 1, 0, 255)`\r\n - `noise(x)` → `p5.noise(x)`\r\n This applies to ALL P5 functions and constants without exception.\r\n5. NEVER call `createCanvas()`. The canvas is created and managed by Viji.\r\n6. NEVER use `preload()`. Use `viji.image(null, { label: 'Name' })` for images, or `fetch()` in an async `setup()` for data.\r\n7. NEVER use P5 event callbacks: `mousePressed()`, `mouseDragged()`, `mouseReleased()`, `keyPressed()`, `keyReleased()`, `keyTyped()`, `touchStarted()`, `touchMoved()`, `touchEnded()`. Instead, check state in `render()`:\r\n - `mouseIsPressed` → `viji.pointer.isDown` (works for both mouse and touch) or `viji.mouse.isPressed`\r\n - `mouseX` / `mouseY` → `viji.pointer.x` / `viji.pointer.y` (works for both mouse and touch) or `viji.mouse.x` / `viji.mouse.y`\r\n - `keyIsPressed` → `viji.keyboard.isPressed('keyName')`\r\n - For press-edge detection: use `viji.pointer.wasPressed` / `viji.pointer.wasReleased`.\r\n8. NEVER use `p5.frameRate()`, `p5.save()`, `p5.saveCanvas()`, `p5.saveFrames()`. 
These are host-level concerns.\r\n9. NEVER use `loadImage()`, `loadFont()`, `loadJSON()`, `loadModel()`, `loadShader()`. Use `viji.image()` or `fetch()`.\r\n10. NEVER use `createCapture()` or `createVideo()`. Use `viji.video.*` instead.\r\n11. NEVER use `p5.dom` or `p5.sound` libraries. Use Viji parameters for UI and `viji.audio.*` for audio.\r\n12. NEVER access `window`, `document`, `Image()`, or `localStorage`. `fetch()` IS available.\r\n13. ALWAYS declare parameters at the TOP LEVEL, never inside `render()` or `setup()`:\r\n ```javascript\r\n // CORRECT\r\n const size = viji.slider(50, { min: 10, max: 200, label: 'Size' });\r\n function render(viji, p5) { p5.circle(0, 0, size.value); }\r\n\r\n // WRONG — creates a new parameter every frame\r\n function render(viji, p5) { const size = viji.slider(50, { ... }); }\r\n ```\r\n14. ALWAYS read parameters via `.value`: `size.value`, `color.value`, `toggle.value`.\r\n15. ALWAYS use `viji.width` and `viji.height` for canvas dimensions. NEVER hardcode pixel sizes.\r\n16. ALWAYS use `viji.deltaTime` for frame-rate-independent animation. Replace `frameCount * 0.01` patterns with a deltaTime accumulator:\r\n ```javascript\r\n let angle = 0;\r\n function render(viji, p5) {\r\n angle += speed.value * viji.deltaTime;\r\n }\r\n ```\r\n17. NEVER allocate objects, arrays, or strings inside `render()`. Pre-allocate at the top level and reuse.\r\n18. 
For image parameters displayed with P5, use `photo.p5` (not `photo.value`) with `p5.image()`:\r\n ```javascript\r\n const photo = viji.image(null, { label: 'Photo' });\r\n function render(viji, p5) {\r\n if (photo.value) p5.image(photo.p5, 0, 0, viji.width, viji.height);\r\n }\r\n ```\r\n\r\n## API MAPPING\r\n\r\n| Standard P5.js | Viji-P5 |\r\n|---|---|\r\n| `width` / `height` | `viji.width` / `viji.height` |\r\n| `mouseX` / `mouseY` | `viji.pointer.x` / `viji.pointer.y` (or `viji.mouse.x` / `viji.mouse.y`) |\r\n| `mouseIsPressed` | `viji.pointer.isDown` (or `viji.mouse.isPressed`) |\r\n| `mouseButton === LEFT` | `viji.mouse.leftButton` |\r\n| `keyIsPressed` | `viji.keyboard.isPressed('keyName')` |\r\n| `key` | `viji.keyboard.lastKeyPressed` |\r\n| `frameCount` | Use `viji.time` or `viji.deltaTime` accumulator |\r\n| `frameRate(n)` | Remove — host controls frame rate |\r\n| `createCanvas(w, h)` | Remove — canvas is provided |\r\n| `preload()` | Remove — use `viji.image()` or `fetch()` in `setup()` |\r\n| `loadImage(url)` | `viji.image(null, { label: 'Image' })` |\r\n| `save()` | Remove — host uses `captureFrame()` |\r\n\r\n## PARAMETER TYPES\r\n\r\n```javascript\r\nviji.slider(default, { min, max, step, label, group, category }) // returns { value: number }\r\nviji.color(default, { label, group, category }) // returns { value: '#rrggbb' }\r\nviji.toggle(default, { label, group, category }) // returns { value: boolean }\r\nviji.select(default, { options: [...], label, group, category }) // returns { value: string }\r\nviji.number(default, { min, max, step, label, group, category }) // returns { value: number }\r\nviji.text(default, { label, group, category }) // returns { value: string }\r\nviji.image(default, { label, group, category }) // returns { value: ImageBitmap|null, p5: P5Image }\r\nviji.button({ label, description, group, category }) // returns { value: boolean }\r\n```\r\n\r\n## TEMPLATE\r\n\r\n```javascript\r\n// @renderer p5\r\n\r\nconst speed = 
viji.slider(1, { min: 0.1, max: 5, label: 'Speed' });\r\n\r\nlet angle = 0;\r\n\r\nfunction setup(viji, p5) {\r\n p5.colorMode(p5.HSB, 360, 100, 100);\r\n}\r\n\r\nfunction render(viji, p5) {\r\n angle += speed.value * viji.deltaTime;\r\n p5.background(0, 0, 10);\r\n const x = viji.width / 2 + p5.cos(angle) * viji.width * 0.3;\r\n const y = viji.height / 2 + p5.sin(angle) * viji.height * 0.3;\r\n p5.noStroke();\r\n p5.fill(angle * 30 % 360, 80, 100);\r\n p5.circle(x, y, viji.width * 0.05);\r\n}\r\n```\r\n\r\nNow convert the P5.js sketch I provide. Return ONLY the converted Viji-P5 scene code.\r\n````\r\n\r\n## Usage\r\n\r\n1. Copy the entire prompt block above.\r\n2. Paste it into your AI assistant.\r\n3. After the prompt, paste the P5.js sketch you want to convert.\r\n4. The AI will return a Viji-compatible scene.\r\n\r\nFor a detailed human-readable guide, see [Converting P5 Sketches](/p5/converting-sketches).\r\n\r\n## Related\r\n\r\n- [Converting P5 Sketches](/p5/converting-sketches) — step-by-step manual conversion guide\r\n- [Prompt: P5 Scenes](/ai-prompts/p5-prompt) — AI prompt for creating new P5 scenes from scratch\r\n- [P5 Quick Start](/p5/quickstart) — your first Viji-P5 scene"
+ "markdown": "# Convert: P5 Sketches to Viji\r\n\r\nCopy the prompt below and paste it into your AI assistant along with the P5.js sketch you want to convert. The prompt contains all the rules the AI needs to produce a correct Viji-P5 scene.\r\n\r\n## The Prompt\r\n\r\n````\r\nYou are converting a standard P5.js sketch into a Viji-P5 scene.\r\nViji scenes run inside an OffscreenCanvas Web Worker. Apply every rule below exactly.\r\n\r\n## RULES\r\n\r\n1. ALWAYS set the first line from the sketch's canvas mode: `// @renderer p5` for 2D (default), or `// @renderer p5 webgl` if the sketch used `createCanvas(w, h, WEBGL)` or 3D primitives on the main canvas. NEVER keep `createCanvas()` — Viji creates the canvas.\r\n2. ALWAYS rename `draw()` to `render(viji, p5)`.\r\n3. If `setup()` exists, change its signature to `setup(viji, p5)`. If it doesn't exist, do NOT add one.\r\n4. ALWAYS prefix every P5 function and constant with `p5.`:\r\n - `background(0)` → `p5.background(0)`\r\n - `fill(255)` → `p5.fill(255)`\r\n - `PI` → `p5.PI`, `TWO_PI` → `p5.TWO_PI`, `HSB` → `p5.HSB`\r\n - `createVector(1, 0)` → `p5.createVector(1, 0)`\r\n - `map(v, 0, 1, 0, 255)` → `p5.map(v, 0, 1, 0, 255)`\r\n - `noise(x)` → `p5.noise(x)`\r\n This applies to ALL P5 functions and constants without exception.\r\n5. NEVER call `createCanvas()`. The canvas is created and managed by Viji. WEBGL is selected only with `// @renderer p5 webgl`, not with `createCanvas(..., p5.WEBGL)`.\r\n6. NEVER use `preload()`. Use `viji.image(null, { label: 'Name' })` for images, or `fetch()` in an async `setup()` for data.\r\n7. NEVER use P5 event callbacks: `mousePressed()`, `mouseDragged()`, `mouseReleased()`, `keyPressed()`, `keyReleased()`, `keyTyped()`, `touchStarted()`, `touchMoved()`, `touchEnded()`. 
Instead, check state in `render()`:\r\n - `mouseIsPressed` → `viji.pointer.isDown` (works for both mouse and touch) or `viji.mouse.isPressed`\r\n - `mouseX` / `mouseY` → `viji.pointer.x` / `viji.pointer.y` (works for both mouse and touch) or `viji.mouse.x` / `viji.mouse.y`\r\n - `keyIsPressed` → `viji.keyboard.isPressed('keyName')`\r\n - For press-edge detection: use `viji.pointer.wasPressed` / `viji.pointer.wasReleased`.\r\n8. NEVER use `p5.frameRate()`, `p5.save()`, `p5.saveCanvas()`, `p5.saveFrames()`. These are host-level concerns.\r\n9. NEVER use `loadImage()`, `loadFont()`, `loadJSON()`, `loadModel()`, `loadShader()`. Use `viji.image()` or `fetch()`.\r\n10. NEVER use `createCapture()` or `createVideo()`. Use `viji.video.*` instead.\r\n11. NEVER use `p5.dom` or `p5.sound` libraries. Use Viji parameters for UI and `viji.audio.*` for audio.\r\n12. NEVER access `window`, `document`, `Image()`, or `localStorage`. `fetch()` IS available.\r\n13. ALWAYS declare parameters at the TOP LEVEL, never inside `render()` or `setup()`:\r\n ```javascript\r\n // CORRECT\r\n const size = viji.slider(50, { min: 10, max: 200, label: 'Size' });\r\n function render(viji, p5) { p5.circle(0, 0, size.value); }\r\n\r\n // WRONG — creates a new parameter every frame\r\n function render(viji, p5) { const size = viji.slider(50, { ... }); }\r\n ```\r\n14. ALWAYS read parameters via `.value`: `size.value`, `color.value`, `toggle.value`.\r\n15. ALWAYS use `viji.width` and `viji.height` for canvas dimensions. NEVER hardcode pixel sizes.\r\n16. ALWAYS use `viji.deltaTime` for frame-rate-independent animation. Replace `frameCount * 0.01` patterns with a deltaTime accumulator:\r\n ```javascript\r\n let angle = 0;\r\n function render(viji, p5) {\r\n angle += speed.value * viji.deltaTime;\r\n }\r\n ```\r\n17. NEVER allocate objects, arrays, or strings inside `render()`. Pre-allocate at the top level and reuse.\r\n18. 
For image parameters displayed with P5, use `photo.p5` (not `photo.value`) with `p5.image()`:\r\n ```javascript\r\n const photo = viji.image(null, { label: 'Photo' });\r\n function render(viji, p5) {\r\n if (photo.value) p5.image(photo.p5, 0, 0, viji.width, viji.height);\r\n }\r\n ```\r\n\r\n## API MAPPING\r\n\r\n| Standard P5.js | Viji-P5 |\r\n|---|---|\r\n| `width` / `height` | `viji.width` / `viji.height` |\r\n| `mouseX` / `mouseY` | `viji.pointer.x` / `viji.pointer.y` (or `viji.mouse.x` / `viji.mouse.y`) |\r\n| `mouseIsPressed` | `viji.pointer.isDown` (or `viji.mouse.isPressed`) |\r\n| `mouseButton === LEFT` | `viji.mouse.leftButton` |\r\n| `keyIsPressed` | `viji.keyboard.isPressed('keyName')` |\r\n| `key` | `viji.keyboard.lastKeyPressed` |\r\n| `frameCount` | Use `viji.time` or `viji.deltaTime` accumulator |\r\n| `frameRate(n)` | Remove — host controls frame rate |\r\n| `createCanvas(w, h)` / `createCanvas(w, h, WEBGL)` | Remove — use `// @renderer p5` or `// @renderer p5 webgl` |\r\n| `preload()` | Remove — use `viji.image()` or `fetch()` in `setup()` |\r\n| `loadImage(url)` | `viji.image(null, { label: 'Image' })` |\r\n| `save()` | Remove — host uses `captureFrame()` |\r\n\r\n## PARAMETER TYPES\r\n\r\n```javascript\r\nviji.slider(default, { min, max, step, label, group, category }) // returns { value: number }\r\nviji.color(default, { label, group, category }) // returns { value: '#rrggbb' }\r\nviji.toggle(default, { label, group, category }) // returns { value: boolean }\r\nviji.select(default, { options: [...], label, group, category }) // returns { value: string }\r\nviji.number(default, { min, max, step, label, group, category }) // returns { value: number }\r\nviji.text(default, { label, group, category }) // returns { value: string }\r\nviji.image(default, { label, group, category }) // returns { value: ImageBitmap|null, p5: P5Image }\r\nviji.button({ label, description, group, category }) // returns { value: boolean }\r\n```\r\n\r\n## 
TEMPLATE\r\n\r\n```javascript\r\n// @renderer p5\r\n\r\nconst speed = viji.slider(1, { min: 0.1, max: 5, label: 'Speed' });\r\n\r\nlet angle = 0;\r\n\r\nfunction setup(viji, p5) {\r\n p5.colorMode(p5.HSB, 360, 100, 100);\r\n}\r\n\r\nfunction render(viji, p5) {\r\n angle += speed.value * viji.deltaTime;\r\n p5.background(0, 0, 10);\r\n const x = viji.width / 2 + p5.cos(angle) * viji.width * 0.3;\r\n const y = viji.height / 2 + p5.sin(angle) * viji.height * 0.3;\r\n p5.noStroke();\r\n p5.fill(angle * 30 % 360, 80, 100);\r\n p5.circle(x, y, viji.width * 0.05);\r\n}\r\n```\r\n\r\nNow convert the P5.js sketch I provide. Return ONLY the converted Viji-P5 scene code.\r\n````\r\n\r\n## Usage\r\n\r\n1. Copy the entire prompt block above.\r\n2. Paste it into your AI assistant.\r\n3. After the prompt, paste the P5.js sketch you want to convert.\r\n4. The AI will return a Viji-compatible scene.\r\n\r\nFor a detailed human-readable guide, see [Converting P5 Sketches](/p5/converting-sketches).\r\n\r\n## Related\r\n\r\n- [Converting P5 Sketches](/p5/converting-sketches) — step-by-step manual conversion guide\r\n- [Prompt: P5 Scenes](/ai-prompts/p5-prompt) — AI prompt for creating new P5 scenes from scratch\r\n- [P5 Quick Start](/p5/quickstart) — your first Viji-P5 scene"
  }
  ]
  },
@@ -1035,7 +1056,7 @@ export const docsApi = {
  "content": [
  {
  "type": "text",
- "markdown": "# API Reference\n\nThis page lists every property and method available on the `viji` object passed to your scene functions. Use it as a quick lookup — each entry links to its dedicated documentation page for full details, examples, and patterns.\n\nNew to Viji? Start with the [Quick Start](/native/quickstart) instead.\n\n## Entry Points\n\n```javascript\nfunction setup(viji) {\n // Called once when the scene starts (optional)\n}\n\nfunction render(viji) {\n // Called every frame\n}\n```\n\n## Canvas & Context\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.canvas`](/native/canvas-context) | `OffscreenCanvas` | The rendering canvas | [Canvas & Context](/native/canvas-context) |\n| [`viji.ctx`](/native/canvas-context) | `OffscreenCanvasRenderingContext2D` | 2D context (after `useContext('2d')`) | [Canvas & Context](/native/canvas-context) |\n| [`viji.gl`](/native/canvas-context) | `WebGLRenderingContext \\| WebGL2RenderingContext` | WebGL context (after `useContext('webgl'\\|'webgl2')`) | [Canvas & Context](/native/canvas-context) |\n| [`viji.width`](/native/canvas-context) | `number` | Canvas width in pixels | [Canvas & Context](/native/canvas-context) |\n| [`viji.height`](/native/canvas-context) | `number` | Canvas height in pixels | [Canvas & Context](/native/canvas-context) |\n| [`viji.useContext(type)`](/native/canvas-context) | `Method` | Request a rendering context: `'2d'`, `'webgl'`, `'webgl2'` | [Canvas & Context](/native/canvas-context) |\n\n## Timing\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.time`](/native/timing) | `number` | Seconds elapsed since the scene started | [Timing](/native/timing) |\n| [`viji.deltaTime`](/native/timing) | `number` | Seconds since the previous frame | [Timing](/native/timing) |\n| [`viji.frameCount`](/native/timing) | `number` | Monotonically increasing frame counter | [Timing](/native/timing) |\n| 
[`viji.fps`](/native/timing) | `number` | Target FPS based on the host's frame rate mode | [Timing](/native/timing) |\n\n## Parameters\n\nAll parameter methods are called at the top level of your scene file. Read `.value` inside `render()` to get the current value.\n\n| Method | Returns | `.value` Type | Details |\n|--------|---------|---------------|---------|\n| [`viji.slider(default, config)`](/native/parameters/slider) | `SliderParameter` | `number` | [Slider](/native/parameters/slider) |\n| [`viji.color(default, config)`](/native/parameters/color) | `ColorParameter` | `string` (hex) | [Color](/native/parameters/color) |\n| [`viji.toggle(default, config)`](/native/parameters/toggle) | `ToggleParameter` | `boolean` | [Toggle](/native/parameters/toggle) |\n| [`viji.select(default, config)`](/native/parameters/select) | `SelectParameter` | `string \\| number` | [Select](/native/parameters/select) |\n| [`viji.number(default, config)`](/native/parameters/number) | `NumberParameter` | `number` | [Number](/native/parameters/number) |\n| [`viji.text(default, config)`](/native/parameters/text) | `TextParameter` | `string` | [Text](/native/parameters/text) |\n| [`viji.image(null, config)`](/native/parameters/image) | `ImageParameter` | `ImageBitmap \\| null` | [Image](/native/parameters/image) |\n| [`viji.button(config)`](/native/parameters/button) | `ButtonParameter` | `boolean` (true for one frame) | [Button](/native/parameters/button) |\n\nSee [Parameters Overview](/native/parameters) for the declaration pattern, [Grouping](/native/parameters/grouping) and [Categories](/native/parameters/categories) for organization.\n\n## Audio\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.audio.isConnected`](/native/audio) | `boolean` | Whether an audio source is active | [Overview](/native/audio) |\n| [`viji.audio.volume.current`](/native/audio/volume) | `number` | Current RMS volume 0–1 | [Volume](/native/audio/volume) |\n| 
[`viji.audio.volume.peak`](/native/audio/volume) | `number` | Peak volume 0–1 | [Volume](/native/audio/volume) |\n| [`viji.audio.volume.smoothed`](/native/audio/volume) | `number` | Smoothed volume 0–1 | [Volume](/native/audio/volume) |\n| [`viji.audio.bands.low`](/native/audio/bands) | `number` | Low frequency band energy (20–120 Hz) | [Frequency Bands](/native/audio/bands) |\n| [`viji.audio.bands.lowMid`](/native/audio/bands) | `number` | Low-mid band energy (120–500 Hz) | [Frequency Bands](/native/audio/bands) |\n| [`viji.audio.bands.mid`](/native/audio/bands) | `number` | Mid band energy (500–2 kHz) | [Frequency Bands](/native/audio/bands) |\n| [`viji.audio.bands.highMid`](/native/audio/bands) | `number` | High-mid band energy (2–6 kHz) | [Frequency Bands](/native/audio/bands) |\n| [`viji.audio.bands.high`](/native/audio/bands) | `number` | High band energy (6–16 kHz) | [Frequency Bands](/native/audio/bands) |\n| [`viji.audio.bands.lowSmoothed`](/native/audio/bands) | `number` | Smoothed low band | [Frequency Bands](/native/audio/bands) |\n| [`viji.audio.bands.lowMidSmoothed`](/native/audio/bands) | `number` | Smoothed low-mid band | [Frequency Bands](/native/audio/bands) |\n| [`viji.audio.bands.midSmoothed`](/native/audio/bands) | `number` | Smoothed mid band | [Frequency Bands](/native/audio/bands) |\n| [`viji.audio.bands.highMidSmoothed`](/native/audio/bands) | `number` | Smoothed high-mid band | [Frequency Bands](/native/audio/bands) |\n| [`viji.audio.bands.highSmoothed`](/native/audio/bands) | `number` | Smoothed high band | [Frequency Bands](/native/audio/bands) |\n| [`viji.audio.beat.kick`](/native/audio/beat) | `number` | Kick beat energy | [Beat Detection](/native/audio/beat) |\n| [`viji.audio.beat.snare`](/native/audio/beat) | `number` | Snare beat energy | [Beat Detection](/native/audio/beat) |\n| [`viji.audio.beat.hat`](/native/audio/beat) | `number` | Hi-hat beat energy | [Beat Detection](/native/audio/beat) |\n| 
[`viji.audio.beat.any`](/native/audio/beat) | `number` | Combined beat energy | [Beat Detection](/native/audio/beat) |\n| [`viji.audio.beat.kickSmoothed`](/native/audio/beat) | `number` | Smoothed kick | [Beat Detection](/native/audio/beat) |\n| [`viji.audio.beat.snareSmoothed`](/native/audio/beat) | `number` | Smoothed snare | [Beat Detection](/native/audio/beat) |\n| [`viji.audio.beat.hatSmoothed`](/native/audio/beat) | `number` | Smoothed hi-hat | [Beat Detection](/native/audio/beat) |\n| [`viji.audio.beat.anySmoothed`](/native/audio/beat) | `number` | Smoothed combined | [Beat Detection](/native/audio/beat) |\n| [`viji.audio.beat.triggers.kick`](/native/audio/beat) | `boolean` | Kick trigger (true for one frame) | [Beat Detection](/native/audio/beat) |\n| [`viji.audio.beat.triggers.snare`](/native/audio/beat) | `boolean` | Snare trigger (true for one frame) | [Beat Detection](/native/audio/beat) |\n| [`viji.audio.beat.triggers.hat`](/native/audio/beat) | `boolean` | Hi-hat trigger (true for one frame) | [Beat Detection](/native/audio/beat) |\n| [`viji.audio.beat.triggers.any`](/native/audio/beat) | `boolean` | Any beat trigger (true for one frame) | [Beat Detection](/native/audio/beat) |\n| [`viji.audio.beat.events`](/native/audio/beat) | `Array<{ type, time, strength }>` | Beat events this frame | [Beat Detection](/native/audio/beat) |\n| [`viji.audio.beat.bpm`](/native/audio/beat) | `number` | Tracked BPM | [Beat Detection](/native/audio/beat) |\n| [`viji.audio.beat.confidence`](/native/audio/beat) | `number` | Beat-tracker confidence 0–1 | [Beat Detection](/native/audio/beat) |\n| [`viji.audio.beat.isLocked`](/native/audio/beat) | `boolean` | Whether beat tracking is locked | [Beat Detection](/native/audio/beat) |\n| [`viji.audio.spectral.brightness`](/native/audio/spectral) | `number` | Spectral brightness 0–1 | [Spectral Analysis](/native/audio/spectral) |\n| [`viji.audio.spectral.flatness`](/native/audio/spectral) | `number` | Spectral flatness 0–1 | 
[Spectral Analysis](/native/audio/spectral) |\n| [`viji.audio.getFrequencyData()`](/native/audio/frequency-data) | `() => Uint8Array` | Raw FFT frequency bins (0–255) | [Frequency Data](/native/audio/frequency-data) |\n| [`viji.audio.getWaveform()`](/native/audio/waveform) | `() => Float32Array` | Time-domain waveform (-1 to 1) | [Waveform](/native/audio/waveform) |\n\n## Video & CV\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.video.isConnected`](/native/video) | `boolean` | Whether a video source is active | [Overview](/native/video) |\n| [`viji.video.currentFrame`](/native/video/basics) | `OffscreenCanvas \\| ImageBitmap \\| null` | Current video frame | [Video Basics](/native/video/basics) |\n| [`viji.video.frameWidth`](/native/video/basics) | `number` | Frame width in pixels | [Video Basics](/native/video/basics) |\n| [`viji.video.frameHeight`](/native/video/basics) | `number` | Frame height in pixels | [Video Basics](/native/video/basics) |\n| [`viji.video.frameRate`](/native/video/basics) | `number` | Video frame rate | [Video Basics](/native/video/basics) |\n| [`viji.video.getFrameData()`](/native/video/basics) | `() => ImageData \\| null` | Pixel data for the current frame | [Video Basics](/native/video/basics) |\n| [`viji.video.faces`](/native/video/face-detection) | `FaceData[]` | Detected faces | [Face Detection](/native/video/face-detection) |\n| [`viji.video.hands`](/native/video/hand-tracking) | `HandData[]` | Detected hands | [Hand Tracking](/native/video/hand-tracking) |\n| [`viji.video.pose`](/native/video/pose-detection) | `PoseData \\| null` | Detected body pose | [Pose Detection](/native/video/pose-detection) |\n| [`viji.video.segmentation`](/native/video/body-segmentation) | `SegmentationData \\| null` | Body segmentation mask | [Body Segmentation](/native/video/body-segmentation) |\n| [`viji.video.cv.enableFaceDetection(enabled)`](/native/video/face-detection) | `(boolean) => Promise<void>` 
| Enable/disable face detection | [Face Detection](/native/video/face-detection) |\n| [`viji.video.cv.enableFaceMesh(enabled)`](/native/video/face-mesh) | `(boolean) => Promise<void>` | Enable/disable face mesh | [Face Mesh](/native/video/face-mesh) |\n| [`viji.video.cv.enableEmotionDetection(enabled)`](/native/video/emotion-detection) | `(boolean) => Promise<void>` | Enable/disable emotion detection | [Emotion Detection](/native/video/emotion-detection) |\n| [`viji.video.cv.enableHandTracking(enabled)`](/native/video/hand-tracking) | `(boolean) => Promise<void>` | Enable/disable hand tracking | [Hand Tracking](/native/video/hand-tracking) |\n| [`viji.video.cv.enablePoseDetection(enabled)`](/native/video/pose-detection) | `(boolean) => Promise<void>` | Enable/disable pose detection | [Pose Detection](/native/video/pose-detection) |\n| [`viji.video.cv.enableBodySegmentation(enabled)`](/native/video/body-segmentation) | `(boolean) => Promise<void>` | Enable/disable body segmentation | [Body Segmentation](/native/video/body-segmentation) |\n| [`viji.video.cv.getActiveFeatures()`](/native/video) | `() => CVFeature[]` | List of active CV features | [Overview](/native/video) |\n| [`viji.video.cv.isProcessing()`](/native/video) | `() => boolean` | Whether CV is currently processing | [Overview](/native/video) |\n\n## Mouse\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.mouse.x`](/native/mouse) | `number` | Cursor X position in pixels | [Mouse](/native/mouse) |\n| [`viji.mouse.y`](/native/mouse) | `number` | Cursor Y position in pixels | [Mouse](/native/mouse) |\n| [`viji.mouse.isInCanvas`](/native/mouse) | `boolean` | Whether cursor is inside the canvas | [Mouse](/native/mouse) |\n| [`viji.mouse.isPressed`](/native/mouse) | `boolean` | Whether any button is pressed | [Mouse](/native/mouse) |\n| [`viji.mouse.leftButton`](/native/mouse) | `boolean` | Left button state | [Mouse](/native/mouse) |\n| 
[`viji.mouse.rightButton`](/native/mouse) | `boolean` | Right button state | [Mouse](/native/mouse) |\n| [`viji.mouse.middleButton`](/native/mouse) | `boolean` | Middle button state | [Mouse](/native/mouse) |\n| [`viji.mouse.deltaX`](/native/mouse) | `number` | Horizontal movement since last frame | [Mouse](/native/mouse) |\n| [`viji.mouse.deltaY`](/native/mouse) | `number` | Vertical movement since last frame | [Mouse](/native/mouse) |\n| [`viji.mouse.wheelDelta`](/native/mouse) | `number` | Scroll wheel delta | [Mouse](/native/mouse) |\n| [`viji.mouse.wheelX`](/native/mouse) | `number` | Horizontal scroll delta | [Mouse](/native/mouse) |\n| [`viji.mouse.wheelY`](/native/mouse) | `number` | Vertical scroll delta | [Mouse](/native/mouse) |\n| [`viji.mouse.wasPressed`](/native/mouse) | `boolean` | True for one frame when pressed | [Mouse](/native/mouse) |\n| [`viji.mouse.wasReleased`](/native/mouse) | `boolean` | True for one frame when released | [Mouse](/native/mouse) |\n| [`viji.mouse.wasMoved`](/native/mouse) | `boolean` | True for one frame when moved | [Mouse](/native/mouse) |\n\n## Keyboard\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.keyboard.isPressed(key)`](/native/keyboard) | `(string) => boolean` | Whether a key is currently held | [Keyboard](/native/keyboard) |\n| [`viji.keyboard.wasPressed(key)`](/native/keyboard) | `(string) => boolean` | True for one frame when pressed | [Keyboard](/native/keyboard) |\n| [`viji.keyboard.wasReleased(key)`](/native/keyboard) | `(string) => boolean` | True for one frame when released | [Keyboard](/native/keyboard) |\n| [`viji.keyboard.activeKeys`](/native/keyboard) | `Set<string>` | All currently held keys | [Keyboard](/native/keyboard) |\n| [`viji.keyboard.pressedThisFrame`](/native/keyboard) | `Set<string>` | Keys pressed this frame | [Keyboard](/native/keyboard) |\n| [`viji.keyboard.releasedThisFrame`](/native/keyboard) | `Set<string>` | Keys released this frame | 
[Keyboard](/native/keyboard) |\n| [`viji.keyboard.lastKeyPressed`](/native/keyboard) | `string` | Most recently pressed key | [Keyboard](/native/keyboard) |\n| [`viji.keyboard.lastKeyReleased`](/native/keyboard) | `string` | Most recently released key | [Keyboard](/native/keyboard) |\n| [`viji.keyboard.shift`](/native/keyboard) | `boolean` | Shift key state | [Keyboard](/native/keyboard) |\n| [`viji.keyboard.ctrl`](/native/keyboard) | `boolean` | Ctrl/Cmd key state | [Keyboard](/native/keyboard) |\n| [`viji.keyboard.alt`](/native/keyboard) | `boolean` | Alt/Option key state | [Keyboard](/native/keyboard) |\n| [`viji.keyboard.meta`](/native/keyboard) | `boolean` | Meta/Win key state | [Keyboard](/native/keyboard) |\n\n## Touch\n\n> **Note:** The property is `viji.touches` (plural), not `viji.touch`.\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.touches.points`](/native/touch) | `TouchPoint[]` | All active touch points | [Touch](/native/touch) |\n| [`viji.touches.count`](/native/touch) | `number` | Number of active touches | [Touch](/native/touch) |\n| [`viji.touches.started`](/native/touch) | `TouchPoint[]` | Touch points that started this frame | [Touch](/native/touch) |\n| [`viji.touches.moved`](/native/touch) | `TouchPoint[]` | Touch points that moved this frame | [Touch](/native/touch) |\n| [`viji.touches.ended`](/native/touch) | `TouchPoint[]` | Touch points that ended this frame | [Touch](/native/touch) |\n| [`viji.touches.primary`](/native/touch) | `TouchPoint \\| null` | The first active touch point | [Touch](/native/touch) |\n\n**`TouchPoint` fields:** `id`, `x`, `y`, `pressure`, `radius`, `radiusX`, `radiusY`, `rotationAngle`, `force` (numbers); `isInCanvas`, `isNew`, `isActive`, `isEnding` (booleans); `deltaX`, `deltaY` (numbers); `velocity` `{ x, y }`.\n\n## Pointer (Unified)\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.pointer.x`](/native/pointer) | 
`number` | Primary pointer X position | [Pointer](/native/pointer) |\n| [`viji.pointer.y`](/native/pointer) | `number` | Primary pointer Y position | [Pointer](/native/pointer) |\n| [`viji.pointer.deltaX`](/native/pointer) | `number` | Horizontal movement since last frame | [Pointer](/native/pointer) |\n| [`viji.pointer.deltaY`](/native/pointer) | `number` | Vertical movement since last frame | [Pointer](/native/pointer) |\n| [`viji.pointer.isDown`](/native/pointer) | `boolean` | Whether the pointer is active (click or touch) | [Pointer](/native/pointer) |\n| [`viji.pointer.wasPressed`](/native/pointer) | `boolean` | True for one frame when pressed | [Pointer](/native/pointer) |\n| [`viji.pointer.wasReleased`](/native/pointer) | `boolean` | True for one frame when released | [Pointer](/native/pointer) |\n| [`viji.pointer.isInCanvas`](/native/pointer) | `boolean` | Whether pointer is inside the canvas | [Pointer](/native/pointer) |\n| [`viji.pointer.type`](/native/pointer) | `'mouse' \\| 'touch' \\| 'none'` | Current input source | [Pointer](/native/pointer) |\n\n## Device Sensors\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.device.motion`](/native/sensors) | `DeviceMotionData \\| null` | Accelerometer and gyroscope data | [Device Sensors](/native/sensors) |\n| [`viji.device.orientation`](/native/sensors) | `DeviceOrientationData \\| null` | Device orientation (alpha, beta, gamma) | [Device Sensors](/native/sensors) |\n\n**`DeviceMotionData`:** `acceleration` `{ x, y, z }`, `accelerationIncludingGravity` `{ x, y, z }`, `rotationRate` `{ alpha, beta, gamma }`, `interval`.\n\n**`DeviceOrientationData`:** `alpha`, `beta`, `gamma` (numbers or null), `absolute` (boolean).\n\n## External Devices\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.devices`](/native/external-devices) | `DeviceState[]` | Connected external devices | [Overview](/native/external-devices) |\n| 
[`viji.devices[i].id`](/native/external-devices) | `string` | Unique device identifier | [Overview](/native/external-devices) |\n| [`viji.devices[i].name`](/native/external-devices) | `string` | User-friendly device name | [Overview](/native/external-devices) |\n| [`viji.devices[i].motion`](/native/external-devices/sensors) | `DeviceMotionData \\| null` | Device accelerometer/gyroscope | [Device Sensors](/native/external-devices/sensors) |\n| [`viji.devices[i].orientation`](/native/external-devices/sensors) | `DeviceOrientationData \\| null` | Device orientation | [Device Sensors](/native/external-devices/sensors) |\n| [`viji.devices[i].video`](/native/external-devices/video) | `VideoAPI \\| null` | Device camera video | [Device Video](/native/external-devices/video) |\n\n## Streams\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `viji.streams` | `VideoAPI[]` | Additional video streams provided by the host |\n\nEach element has the same shape as [`viji.video`](/native/video). Streams are additional video sources injected by the host application — they are used internally by Viji's compositor for mixing multiple scenes together. The array may be empty if the host does not provide additional streams. Your scene can read them the same way it reads `viji.video`.\n\n## Related\n\n- [Quick Start](/native/quickstart) — getting started with the Native renderer\n- [Best Practices](/getting-started/best-practices) — essential patterns for all renderers\n- [Common Mistakes](/getting-started/common-mistakes) — pitfalls to avoid\n- [P5 API Reference](/p5/api-reference) — the same API in the P5 renderer\n- [Shader API Reference](/shader/api-reference) — built-in uniforms for shaders"
+ "markdown": "# API Reference\n\nThis page lists every property and method available on the `viji` object passed to your scene functions. Use it as a quick lookup — each entry links to its dedicated documentation page for full details, examples, and patterns.\n\nNew to Viji? Start with the [Quick Start](/native/quickstart) instead.\n\n## Entry Points\n\n```javascript\nfunction setup(viji) {\n // Called once when the scene starts (optional)\n}\n\nfunction render(viji) {\n // Called every frame\n}\n```\n\n## Canvas & Context\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.canvas`](/native/canvas-context) | `OffscreenCanvas` | The rendering canvas | [Canvas & Context](/native/canvas-context) |\n| [`viji.ctx`](/native/canvas-context) | `OffscreenCanvasRenderingContext2D` | 2D context (after `useContext('2d')`) | [Canvas & Context](/native/canvas-context) |\n| [`viji.gl`](/native/canvas-context) | `WebGLRenderingContext \\| WebGL2RenderingContext` | WebGL context (after `useContext('webgl'\\|'webgl2')`) | [Canvas & Context](/native/canvas-context) |\n| [`viji.width`](/native/canvas-context) | `number` | Canvas width in pixels | [Canvas & Context](/native/canvas-context) |\n| [`viji.height`](/native/canvas-context) | `number` | Canvas height in pixels | [Canvas & Context](/native/canvas-context) |\n| [`viji.useContext(type)`](/native/canvas-context) | `Method` | Request a rendering context: `'2d'`, `'webgl'`, `'webgl2'` | [Canvas & Context](/native/canvas-context) |\n\n## Timing\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.time`](/native/timing) | `number` | Seconds elapsed since the scene started | [Timing](/native/timing) |\n| [`viji.deltaTime`](/native/timing) | `number` | Seconds since the previous frame | [Timing](/native/timing) |\n| [`viji.frameCount`](/native/timing) | `number` | Monotonically increasing frame counter | [Timing](/native/timing) |\n| 
[`viji.fps`](/native/timing) | `number` | Target FPS based on the host's frame rate mode | [Timing](/native/timing) |\n\n## Parameters\n\nAll parameter methods are called at the top level of your scene file. Read `.value` inside `render()` to get the current value.\n\n| Method | Returns | `.value` Type | Details |\n|--------|---------|---------------|---------|\n| [`viji.slider(default, config)`](/native/parameters/slider) | `SliderParameter` | `number` | [Slider](/native/parameters/slider) |\n| [`viji.color(default, config)`](/native/parameters/color) | `ColorParameter` | `string` (hex) | [Color](/native/parameters/color) |\n| [`viji.toggle(default, config)`](/native/parameters/toggle) | `ToggleParameter` | `boolean` | [Toggle](/native/parameters/toggle) |\n| [`viji.select(default, config)`](/native/parameters/select) | `SelectParameter` | `string \\| number` | [Select](/native/parameters/select) |\n| [`viji.number(default, config)`](/native/parameters/number) | `NumberParameter` | `number` | [Number](/native/parameters/number) |\n| [`viji.text(default, config)`](/native/parameters/text) | `TextParameter` | `string` | [Text](/native/parameters/text) |\n| [`viji.image(null, config)`](/native/parameters/image) | `ImageParameter` | `ImageBitmap \\| null` | [Image](/native/parameters/image) |\n| [`viji.button(config)`](/native/parameters/button) | `ButtonParameter` | `boolean` (true for one frame) | [Button](/native/parameters/button) |\n\nSee [Parameters Overview](/native/parameters) for the declaration pattern, [Grouping](/native/parameters/grouping) and [Categories](/native/parameters/categories) for organization.\n\n## Audio\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.audio.isConnected`](/native/audio) | `boolean` | Whether an audio source is active | [Overview](/native/audio) |\n| [`viji.audio.volume.current`](/native/audio/volume) | `number` | Current RMS volume 0–1 | [Volume](/native/audio/volume) |\n| 
[`viji.audio.volume.peak`](/native/audio/volume) | `number` | Peak volume 0–1 | [Volume](/native/audio/volume) |\n| [`viji.audio.volume.smoothed`](/native/audio/volume) | `number` | Smoothed volume 0–1 | [Volume](/native/audio/volume) |\n| [`viji.audio.bands.low`](/native/audio/bands) | `number` | Low frequency band energy (20–120 Hz) | [Frequency Bands](/native/audio/bands) |\n| [`viji.audio.bands.lowMid`](/native/audio/bands) | `number` | Low-mid band energy (120–500 Hz) | [Frequency Bands](/native/audio/bands) |\n| [`viji.audio.bands.mid`](/native/audio/bands) | `number` | Mid band energy (500–2 kHz) | [Frequency Bands](/native/audio/bands) |\n| [`viji.audio.bands.highMid`](/native/audio/bands) | `number` | High-mid band energy (2–6 kHz) | [Frequency Bands](/native/audio/bands) |\n| [`viji.audio.bands.high`](/native/audio/bands) | `number` | High band energy (6–16 kHz) | [Frequency Bands](/native/audio/bands) |\n| [`viji.audio.bands.lowSmoothed`](/native/audio/bands) | `number` | Smoothed low band | [Frequency Bands](/native/audio/bands) |\n| [`viji.audio.bands.lowMidSmoothed`](/native/audio/bands) | `number` | Smoothed low-mid band | [Frequency Bands](/native/audio/bands) |\n| [`viji.audio.bands.midSmoothed`](/native/audio/bands) | `number` | Smoothed mid band | [Frequency Bands](/native/audio/bands) |\n| [`viji.audio.bands.highMidSmoothed`](/native/audio/bands) | `number` | Smoothed high-mid band | [Frequency Bands](/native/audio/bands) |\n| [`viji.audio.bands.highSmoothed`](/native/audio/bands) | `number` | Smoothed high band | [Frequency Bands](/native/audio/bands) |\n| [`viji.audio.beat.kick`](/native/audio/beat) | `number` | Kick beat energy | [Beat Detection](/native/audio/beat) |\n| [`viji.audio.beat.snare`](/native/audio/beat) | `number` | Snare beat energy | [Beat Detection](/native/audio/beat) |\n| [`viji.audio.beat.hat`](/native/audio/beat) | `number` | Hi-hat beat energy | [Beat Detection](/native/audio/beat) |\n| 
[`viji.audio.beat.any`](/native/audio/beat) | `number` | Combined beat energy | [Beat Detection](/native/audio/beat) |\n| [`viji.audio.beat.kickSmoothed`](/native/audio/beat) | `number` | Smoothed kick | [Beat Detection](/native/audio/beat) |\n| [`viji.audio.beat.snareSmoothed`](/native/audio/beat) | `number` | Smoothed snare | [Beat Detection](/native/audio/beat) |\n| [`viji.audio.beat.hatSmoothed`](/native/audio/beat) | `number` | Smoothed hi-hat | [Beat Detection](/native/audio/beat) |\n| [`viji.audio.beat.anySmoothed`](/native/audio/beat) | `number` | Smoothed combined | [Beat Detection](/native/audio/beat) |\n| [`viji.audio.beat.triggers.kick`](/native/audio/beat) | `boolean` | Kick trigger (true for one frame) | [Beat Detection](/native/audio/beat) |\n| [`viji.audio.beat.triggers.snare`](/native/audio/beat) | `boolean` | Snare trigger (true for one frame) | [Beat Detection](/native/audio/beat) |\n| [`viji.audio.beat.triggers.hat`](/native/audio/beat) | `boolean` | Hi-hat trigger (true for one frame) | [Beat Detection](/native/audio/beat) |\n| [`viji.audio.beat.triggers.any`](/native/audio/beat) | `boolean` | Any beat trigger (true for one frame) | [Beat Detection](/native/audio/beat) |\n| [`viji.audio.beat.events`](/native/audio/beat) | `Array<{ type, time, strength }>` | Beat events this frame | [Beat Detection](/native/audio/beat) |\n| [`viji.audio.beat.bpm`](/native/audio/beat) | `number` | Tracked BPM | [Beat Detection](/native/audio/beat) |\n| [`viji.audio.beat.confidence`](/native/audio/beat) | `number` | Beat-tracker confidence 0–1 | [Beat Detection](/native/audio/beat) |\n| [`viji.audio.beat.isLocked`](/native/audio/beat) | `boolean` | Whether beat tracking is locked | [Beat Detection](/native/audio/beat) |\n| [`viji.audio.spectral.brightness`](/native/audio/spectral) | `number` | Spectral brightness 0–1 | [Spectral Analysis](/native/audio/spectral) |\n| [`viji.audio.spectral.flatness`](/native/audio/spectral) | `number` | Spectral flatness 0–1 | 
[Spectral Analysis](/native/audio/spectral) |\n| [`viji.audio.getFrequencyData()`](/native/audio/frequency-data) | `() => Uint8Array` | Raw FFT frequency bins (0–255) | [Frequency Data](/native/audio/frequency-data) |\n| [`viji.audio.getWaveform()`](/native/audio/waveform) | `() => Float32Array` | Time-domain waveform (-1 to 1) | [Waveform](/native/audio/waveform) |\n\n## Video & CV\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.video.isConnected`](/native/video) | `boolean` | Whether a video source is active | [Overview](/native/video) |\n| [`viji.video.currentFrame`](/native/video/basics) | `OffscreenCanvas \\| ImageBitmap \\| null` | Current video frame | [Video Basics](/native/video/basics) |\n| [`viji.video.frameWidth`](/native/video/basics) | `number` | Frame width in pixels | [Video Basics](/native/video/basics) |\n| [`viji.video.frameHeight`](/native/video/basics) | `number` | Frame height in pixels | [Video Basics](/native/video/basics) |\n| [`viji.video.frameRate`](/native/video/basics) | `number` | Video frame rate | [Video Basics](/native/video/basics) |\n| [`viji.video.getFrameData()`](/native/video/basics) | `() => ImageData \\| null` | Pixel data for the current frame | [Video Basics](/native/video/basics) |\n| [`viji.video.faces`](/native/video/face-detection) | `FaceData[]` | Detected faces | [Face Detection](/native/video/face-detection) |\n| [`viji.video.hands`](/native/video/hand-tracking) | `HandData[]` | Detected hands | [Hand Tracking](/native/video/hand-tracking) |\n| [`viji.video.pose`](/native/video/pose-detection) | `PoseData \\| null` | Detected body pose | [Pose Detection](/native/video/pose-detection) |\n| [`viji.video.segmentation`](/native/video/body-segmentation) | `SegmentationData \\| null` | Body segmentation mask | [Body Segmentation](/native/video/body-segmentation) |\n| [`viji.video.cv.enableFaceDetection(enabled)`](/native/video/face-detection) | `(boolean) => Promise<void>` 
| Enable/disable face detection | [Face Detection](/native/video/face-detection) |\n| [`viji.video.cv.enableFaceMesh(enabled)`](/native/video/face-mesh) | `(boolean) => Promise<void>` | Enable/disable face mesh | [Face Mesh](/native/video/face-mesh) |\n| [`viji.video.cv.enableEmotionDetection(enabled)`](/native/video/emotion-detection) | `(boolean) => Promise<void>` | Enable/disable emotion detection | [Emotion Detection](/native/video/emotion-detection) |\n| [`viji.video.cv.enableHandTracking(enabled)`](/native/video/hand-tracking) | `(boolean) => Promise<void>` | Enable/disable hand tracking | [Hand Tracking](/native/video/hand-tracking) |\n| [`viji.video.cv.enablePoseDetection(enabled)`](/native/video/pose-detection) | `(boolean) => Promise<void>` | Enable/disable pose detection | [Pose Detection](/native/video/pose-detection) |\n| [`viji.video.cv.enableBodySegmentation(enabled)`](/native/video/body-segmentation) | `(boolean) => Promise<void>` | Enable/disable body segmentation | [Body Segmentation](/native/video/body-segmentation) |\n| [`viji.video.cv.getActiveFeatures()`](/native/video) | `() => CVFeature[]` | List of active CV features | [Overview](/native/video) |\n| [`viji.video.cv.isProcessing()`](/native/video) | `() => boolean` | Whether CV is currently processing | [Overview](/native/video) |\n\n## Mouse\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.mouse.x`](/native/mouse) | `number` | Cursor X position in pixels | [Mouse](/native/mouse) |\n| [`viji.mouse.y`](/native/mouse) | `number` | Cursor Y position in pixels | [Mouse](/native/mouse) |\n| [`viji.mouse.isInCanvas`](/native/mouse) | `boolean` | Whether cursor is inside the canvas | [Mouse](/native/mouse) |\n| [`viji.mouse.isPressed`](/native/mouse) | `boolean` | Whether any button is pressed | [Mouse](/native/mouse) |\n| [`viji.mouse.leftButton`](/native/mouse) | `boolean` | Left button state | [Mouse](/native/mouse) |\n| 
[`viji.mouse.rightButton`](/native/mouse) | `boolean` | Right button state | [Mouse](/native/mouse) |\n| [`viji.mouse.middleButton`](/native/mouse) | `boolean` | Middle button state | [Mouse](/native/mouse) |\n| [`viji.mouse.deltaX`](/native/mouse) | `number` | Horizontal movement since last frame | [Mouse](/native/mouse) |\n| [`viji.mouse.deltaY`](/native/mouse) | `number` | Vertical movement since last frame | [Mouse](/native/mouse) |\n| [`viji.mouse.wheelDelta`](/native/mouse) | `number` | Scroll wheel delta | [Mouse](/native/mouse) |\n| [`viji.mouse.wheelX`](/native/mouse) | `number` | Horizontal scroll delta | [Mouse](/native/mouse) |\n| [`viji.mouse.wheelY`](/native/mouse) | `number` | Vertical scroll delta | [Mouse](/native/mouse) |\n| [`viji.mouse.wasPressed`](/native/mouse) | `boolean` | True for one frame when pressed | [Mouse](/native/mouse) |\n| [`viji.mouse.wasReleased`](/native/mouse) | `boolean` | True for one frame when released | [Mouse](/native/mouse) |\n| [`viji.mouse.wasMoved`](/native/mouse) | `boolean` | True for one frame when moved | [Mouse](/native/mouse) |\n\n## Keyboard\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.keyboard.isPressed(key)`](/native/keyboard) | `(string) => boolean` | Whether a key is currently held | [Keyboard](/native/keyboard) |\n| [`viji.keyboard.wasPressed(key)`](/native/keyboard) | `(string) => boolean` | True for one frame when pressed | [Keyboard](/native/keyboard) |\n| [`viji.keyboard.wasReleased(key)`](/native/keyboard) | `(string) => boolean` | True for one frame when released | [Keyboard](/native/keyboard) |\n| [`viji.keyboard.activeKeys`](/native/keyboard) | `Set<string>` | All currently held keys | [Keyboard](/native/keyboard) |\n| [`viji.keyboard.pressedThisFrame`](/native/keyboard) | `Set<string>` | Keys pressed this frame | [Keyboard](/native/keyboard) |\n| [`viji.keyboard.releasedThisFrame`](/native/keyboard) | `Set<string>` | Keys released this frame | 
[Keyboard](/native/keyboard) |\n| [`viji.keyboard.lastKeyPressed`](/native/keyboard) | `string` | Most recently pressed key | [Keyboard](/native/keyboard) |\n| [`viji.keyboard.lastKeyReleased`](/native/keyboard) | `string` | Most recently released key | [Keyboard](/native/keyboard) |\n| [`viji.keyboard.shift`](/native/keyboard) | `boolean` | Shift key state | [Keyboard](/native/keyboard) |\n| [`viji.keyboard.ctrl`](/native/keyboard) | `boolean` | Ctrl/Cmd key state | [Keyboard](/native/keyboard) |\n| [`viji.keyboard.alt`](/native/keyboard) | `boolean` | Alt/Option key state | [Keyboard](/native/keyboard) |\n| [`viji.keyboard.meta`](/native/keyboard) | `boolean` | Meta/Win key state | [Keyboard](/native/keyboard) |\n\n## Touch\n\n> **Note:** The property is `viji.touches` (plural), not `viji.touch`.\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.touches.points`](/native/touch) | `TouchPoint[]` | All active touch points | [Touch](/native/touch) |\n| [`viji.touches.count`](/native/touch) | `number` | Number of active touches | [Touch](/native/touch) |\n| [`viji.touches.started`](/native/touch) | `TouchPoint[]` | Touch points that started this frame | [Touch](/native/touch) |\n| [`viji.touches.moved`](/native/touch) | `TouchPoint[]` | Touch points that moved this frame | [Touch](/native/touch) |\n| [`viji.touches.ended`](/native/touch) | `TouchPoint[]` | Touch points that ended this frame | [Touch](/native/touch) |\n| [`viji.touches.primary`](/native/touch) | `TouchPoint \\| null` | The first active touch point | [Touch](/native/touch) |\n\n**`TouchPoint` fields:** `id`, `x`, `y`, `pressure`, `radius`, `radiusX`, `radiusY`, `rotationAngle`, `force` (numbers); `isInCanvas`, `isNew`, `isActive`, `isEnding` (booleans); `deltaX`, `deltaY` (numbers); `velocity` `{ x, y }`.\n\n## Pointer (Unified)\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.pointer.x`](/native/pointer) | 
`number` | Primary pointer X position | [Pointer](/native/pointer) |\n| [`viji.pointer.y`](/native/pointer) | `number` | Primary pointer Y position | [Pointer](/native/pointer) |\n| [`viji.pointer.deltaX`](/native/pointer) | `number` | Horizontal movement since last frame | [Pointer](/native/pointer) |\n| [`viji.pointer.deltaY`](/native/pointer) | `number` | Vertical movement since last frame | [Pointer](/native/pointer) |\n| [`viji.pointer.isDown`](/native/pointer) | `boolean` | Whether the pointer is active (click or touch) | [Pointer](/native/pointer) |\n| [`viji.pointer.wasPressed`](/native/pointer) | `boolean` | True for one frame when pressed | [Pointer](/native/pointer) |\n| [`viji.pointer.wasReleased`](/native/pointer) | `boolean` | True for one frame when released | [Pointer](/native/pointer) |\n| [`viji.pointer.isInCanvas`](/native/pointer) | `boolean` | Whether pointer is inside the canvas | [Pointer](/native/pointer) |\n| [`viji.pointer.type`](/native/pointer) | `'mouse' \\| 'touch' \\| 'none'` | Current input source | [Pointer](/native/pointer) |\n\n## Device Sensors\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.device.motion`](/native/sensors) | `DeviceMotionData \\| null` | Accelerometer and gyroscope data | [Device Sensors](/native/sensors) |\n| [`viji.device.orientation`](/native/sensors) | `DeviceOrientationData \\| null` | Device orientation (alpha, beta, gamma) | [Device Sensors](/native/sensors) |\n\n**`DeviceMotionData`:** `acceleration` `{ x, y, z }`, `accelerationIncludingGravity` `{ x, y, z }`, `rotationRate` `{ alpha, beta, gamma }`, `interval`.\n\n**`DeviceOrientationData`:** `alpha`, `beta`, `gamma` (numbers or null), `absolute` (boolean).\n\n## External Devices\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.devices`](/native/external-devices) | `DeviceState[]` | Connected external devices | [Overview](/native/external-devices) |\n| 
[`viji.devices[i].id`](/native/external-devices) | `string` | Unique device identifier | [Overview](/native/external-devices) |\n| [`viji.devices[i].name`](/native/external-devices) | `string` | User-friendly device name | [Overview](/native/external-devices) |\n| [`viji.devices[i].motion`](/native/external-devices/sensors) | `DeviceMotionData \| null` | Device accelerometer/gyroscope | [Device Sensors](/native/external-devices/sensors) |\n| [`viji.devices[i].orientation`](/native/external-devices/sensors) | `DeviceOrientationData \| null` | Device orientation | [Device Sensors](/native/external-devices/sensors) |\n| [`viji.devices[i].video`](/native/external-devices/video) | `VideoAPI \| null` | Device camera video | [Device Video](/native/external-devices/video) |\n| [`viji.devices[i].audio`](/native/external-devices/audio) | `AudioStreamAPI \| null` | Device audio stream | [Device Audio](/native/external-devices/audio) |\n\n## Streams\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `viji.videoStreams` | `VideoAPI[]` | Additional video streams provided by the host |\n| `viji.audioStreams` | `AudioStreamAPI[]` | Additional audio streams provided by the host |\n\nEach `videoStreams` element has the same shape as [`viji.video`](/native/video). Each `audioStreams` element provides lightweight audio analysis (volume, frequency bands, spectral features, and raw FFT/waveform data) but does **not** include beat detection or BPM tracking — see [`viji.audio`](/native/audio) for full analysis on the main stream.\n\nBoth arrays may be empty if the host does not provide additional streams. 
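Reading these lightweight streams defensively can be sketched like this (the `averageStreamVolume` helper is illustrative, not part of the API; it relies only on the documented `isConnected` and `volume.smoothed` fields):

```javascript
// Illustrative helper (not part of the Viji API): average the smoothed
// volume across the host-provided extra audio streams.
// viji.audioStreams may be empty, so fall back to 0.
function averageStreamVolume(viji) {
  const connected = viji.audioStreams.filter((s) => s.isConnected);
  if (connected.length === 0) return 0;
  const total = connected.reduce((sum, s) => sum + s.volume.smoothed, 0);
  return total / connected.length;
}
```

The same guard pattern applies to `device.audio` on external devices.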
Audio streams from external devices appear on `device.audio`, not in `viji.audioStreams`.\n\n## Related\n\n- [Quick Start](/native/quickstart) — getting started with the Native renderer\n- [Best Practices](/getting-started/best-practices) — essential patterns for all renderers\n- [Common Mistakes](/getting-started/common-mistakes) — pitfalls to avoid\n- [P5 API Reference](/p5/api-reference) — the same API in the P5 renderer\n- [Shader API Reference](/shader/api-reference) — built-in uniforms for shaders"
1039
1060
  }
1040
1061
  ]
1041
1062
  },
@@ -1354,7 +1375,7 @@ export const docsApi = {
1354
1375
  },
1355
1376
  {
1356
1377
  "type": "text",
1357
- "markdown": "## Related\n\n- [Connection & Lifecycle](connection/)\n- [Volume](volume/)\n- [Frequency Bands](bands/)\n- [Beat Detection](beat/)\n- [Spectral Analysis](spectral/)\n- [Frequency Data](frequency-data/)\n- [Waveform](waveform/)\n- [P5 Audio](/p5/audio)\n- [Shader Audio Uniforms](/shader/audio)"
1378
+ "markdown": "## Additional Audio Streams\n\nBeyond the main `viji.audio` stream, your scene may receive additional audio sources through `viji.audioStreams[]` and `device.audio`:\n\n- **`viji.audioStreams[i]`** — Additional audio streams provided by the host (e.g., for multi-source mixing). Each provides lightweight analysis: volume, frequency bands, and spectral features.\n- **`device.audio`** — Audio from externally connected devices. Same lightweight interface.\n\nThese use the `AudioStreamAPI` interface — a subset of the full `AudioAPI` documented on this page. They include volume, bands, spectral data, frequency data, and waveform, but do **not** include beat detection, BPM, triggers, or events.\n\nSee [Device Audio](/native/external-devices/audio) for device-specific audio documentation.\n\n## Related\n\n- [Connection & Lifecycle](connection/)\n- [Volume](volume/)\n- [Frequency Bands](bands/)\n- [Beat Detection](beat/)\n- [Spectral Analysis](spectral/)\n- [Frequency Data](frequency-data/)\n- [Waveform](waveform/)\n- [P5 Audio](/p5/audio)\n- [Shader Audio Uniforms](/shader/audio)"
1358
1379
  }
1359
1380
  ]
1360
1381
  },
@@ -1396,7 +1417,7 @@ export const docsApi = {
1396
1417
  },
1397
1418
  {
1398
1419
  "type": "text",
1399
- "markdown": "## Related\n\n- [Audio Overview](../)\n- [Connection & Lifecycle](../connection/)\n- [Frequency Bands](../bands/)\n- [Beat Detection](../beat/)"
1420
+ "markdown": "> [!NOTE]\n> The same volume data (`current`, `peak`, `smoothed`) is available on additional audio streams via `viji.audioStreams[i].volume` and on device audio via `device.audio.volume`.\n\n## Related\n\n- [Audio Overview](../)\n- [Connection & Lifecycle](../connection/)\n- [Frequency Bands](../bands/)\n- [Beat Detection](../beat/)"
1400
1421
  }
1401
1422
  ]
1402
1423
  },
@@ -1417,7 +1438,7 @@ export const docsApi = {
1417
1438
  },
1418
1439
  {
1419
1440
  "type": "text",
1420
- "markdown": "## Related\n\n- [Audio Overview](../)\n- [Volume](../volume/)\n- [Beat Detection](../beat/)\n- [Frequency Data](../frequency-data/)\n- [Spectral Analysis](../spectral/)"
1441
+ "markdown": "> [!NOTE]\n> The same band data is available on additional audio streams via `viji.audioStreams[i].bands` and on device audio via `device.audio.bands`.\n\n## Related\n\n- [Audio Overview](../)\n- [Volume](../volume/)\n- [Beat Detection](../beat/)\n- [Frequency Data](../frequency-data/)\n- [Spectral Analysis](../spectral/)"
1421
1442
  }
1422
1443
  ]
1423
1444
  },
@@ -1438,7 +1459,7 @@ export const docsApi = {
1438
1459
  },
1439
1460
  {
1440
1461
  "type": "text",
1441
- "markdown": "## Related\n\n- [Audio Overview](../)\n- [Volume](../volume/)\n- [Frequency Bands](../bands/)\n- [Spectral Analysis](../spectral/)"
1462
+ "markdown": "> [!NOTE]\n> Beat detection is only available on the main audio stream (`viji.audio`). Additional audio streams (`viji.audioStreams[]`) and device audio (`device.audio`) do **not** include beat detection, BPM, triggers, or events.\n\n## Related\n\n- [Audio Overview](../)\n- [Volume](../volume/)\n- [Frequency Bands](../bands/)\n- [Spectral Analysis](../spectral/)"
1442
1463
  }
1443
1464
  ]
1444
1465
  },
@@ -1459,7 +1480,7 @@ export const docsApi = {
1459
1480
  },
1460
1481
  {
1461
1482
  "type": "text",
1462
- "markdown": "## Related\n\n- [Audio Overview](../)\n- [Frequency Bands](../bands/)\n- [Frequency Data](../frequency-data/)\n- [Volume](../volume/)"
1483
+ "markdown": "> [!NOTE]\n> The same spectral data is available on additional audio streams via `viji.audioStreams[i].spectral` and `device.audio.spectral`.\n\n## Related\n\n- [Audio Overview](../)\n- [Frequency Bands](../bands/)\n- [Frequency Data](../frequency-data/)\n- [Volume](../volume/)"
1463
1484
  }
1464
1485
  ]
1465
1486
  },
@@ -1480,7 +1501,7 @@ export const docsApi = {
1480
1501
  },
1481
1502
  {
1482
1503
  "type": "text",
1483
- "markdown": "## Related\n\n- [Audio Overview](../)\n- [Frequency Bands](../bands/)\n- [Waveform](../waveform/)\n- [Spectral Analysis](../spectral/)"
1504
+ "markdown": "> [!NOTE]\n> Frequency data is also available on additional audio streams via `viji.audioStreams[i].getFrequencyData()` and `device.audio.getFrequencyData()`.\n\n## Related\n\n- [Audio Overview](../)\n- [Frequency Bands](../bands/)\n- [Waveform](../waveform/)\n- [Spectral Analysis](../spectral/)"
1484
1505
  }
1485
1506
  ]
1486
1507
  },
@@ -1501,7 +1522,7 @@ export const docsApi = {
1501
1522
  },
1502
1523
  {
1503
1524
  "type": "text",
1504
- "markdown": "## Related\n\n- [Audio Overview](../)\n- [Frequency Data](../frequency-data/)\n- [Volume](../volume/)\n- [Frequency Bands](../bands/)"
1525
+ "markdown": "> [!NOTE]\n> Waveform data is also available on additional audio streams via `viji.audioStreams[i].getWaveform()` and `device.audio.getWaveform()`.\n\n## Related\n\n- [Audio Overview](../)\n- [Frequency Data](../frequency-data/)\n- [Volume](../volume/)\n- [Frequency Bands](../bands/)"
1505
1526
  }
1506
1527
  ]
1507
1528
  },
@@ -1848,7 +1869,7 @@ export const docsApi = {
1848
1869
  "content": [
1849
1870
  {
1850
1871
  "type": "text",
1851
- "markdown": "# External Devices\n\n`viji.devices` provides access to externally connected devices (phones, tablets, or other hardware) linked to your installation through the host platform.\n\n> [!NOTE]\n> External devices are managed entirely by the host application. Artists cannot control device connections — you only read the current state each render cycle. Devices appear and disappear from the array dynamically as they connect and disconnect.\n\n## API Reference\n\n### DeviceState (`viji.devices[i]`)\n\n| Property | Type | Description |\n|----------|------|-------------|\n| `id` | `string` | Unique device identifier (assigned by host) |\n| `name` | `string` | User-friendly device name (assigned by host) |\n| `motion` | `DeviceMotionData \\| null` | Device accelerometer and gyroscope data |\n| `orientation` | `DeviceOrientationData \\| null` | Device spatial orientation |\n| `video` | `VideoAPI \\| null` | Device camera video feed, or `null` if no camera connected |\n\n### Device Limits\n\nUp to **8 external devices** can be connected simultaneously. The `viji.devices` array contains only currently connected devices.\n\n## Default Values\n\n- `viji.devices` → `[]` (empty array) when no devices are connected\n- `device.motion` → `null` when the device has no sensor data\n- `device.orientation` → `null` when the device has no orientation data\n- `device.video` → `null` when the device has no camera stream\n\n## Guard Patterns\n\nAlways check array length and null properties:\n\n```javascript\nfunction render(viji) {\n if (viji.devices.length === 0) return;\n\n for (const device of viji.devices) {\n // Check for video\n if (device.video?.isConnected) {\n // Draw device camera feed\n }\n\n // Check for sensors\n if (device.motion?.acceleration) {\n // Use device acceleration\n }\n }\n}\n```\n\n## Basic Example"
1872
+ "markdown": "# External Devices\n\n`viji.devices` provides access to externally connected devices (phones, tablets, or other hardware) linked to your installation through the host platform.\n\n> [!NOTE]\n> External devices are managed entirely by the host application. Artists cannot control device connections — you only read the current state each render cycle. Devices appear and disappear from the array dynamically as they connect and disconnect.\n\n## API Reference\n\n### DeviceState (`viji.devices[i]`)\n\n| Property | Type | Description |\n|----------|------|-------------|\n| `id` | `string` | Unique device identifier (assigned by host) |\n| `name` | `string` | User-friendly device name (assigned by host) |\n| `motion` | `DeviceMotionData \\| null` | Device accelerometer and gyroscope data |\n| `orientation` | `DeviceOrientationData \\| null` | Device spatial orientation |\n| `video` | `VideoAPI \\| null` | Device camera video feed, or `null` if no camera connected |\n| `audio` | `AudioStreamAPI \\| null` | Device audio stream, or `null` if no audio connected |\n\n### Device Limits\n\nUp to **8 external devices** can be connected simultaneously. 
The `viji.devices` array contains only currently connected devices.\n\n## Default Values\n\n- `viji.devices` → `[]` (empty array) when no devices are connected\n- `device.motion` → `null` when the device has no sensor data\n- `device.orientation` → `null` when the device has no orientation data\n- `device.video` → `null` when the device has no camera stream\n- `device.audio` → `null` when the device has no audio stream\n\n## Guard Patterns\n\nAlways check array length and null properties:\n\n```javascript\nfunction render(viji) {\n if (viji.devices.length === 0) return;\n\n for (const device of viji.devices) {\n // Check for video\n if (device.video?.isConnected) {\n // Draw device camera feed\n }\n\n // Check for audio\n if (device.audio?.isConnected) {\n // Use device audio data\n }\n\n // Check for sensors\n if (device.motion?.acceleration) {\n // Use device acceleration\n }\n }\n}\n```\n\n## Basic Example"
1852
1873
  },
1853
1874
  {
1854
1875
  "type": "live-example",
@@ -1861,7 +1882,7 @@ export const docsApi = {
1861
1882
  },
1862
1883
  {
1863
1884
  "type": "text",
1864
- "markdown": "## Common Patterns\n\n### Display Device Count\n\n```javascript\nfunction render(viji) {\n const ctx = viji.canvas.getContext('2d');\n const count = viji.devices.length;\n\n ctx.fillStyle = '#111';\n ctx.fillRect(0, 0, viji.canvas.width, viji.canvas.height);\n\n ctx.fillStyle = '#fff';\n ctx.font = '24px sans-serif';\n ctx.textAlign = 'center';\n ctx.fillText(\n `${count} device${count !== 1 ? 's' : ''} connected`,\n viji.canvas.width / 2,\n viji.canvas.height / 2\n );\n}\n```\n\n### Find Device by Name\n\n```javascript\nfunction render(viji) {\n const phone = viji.devices.find(d => d.name.includes('Phone'));\n if (phone) {\n // Use phone-specific data\n }\n}\n```\n\n### Iterate All Devices\n\n```javascript\nfunction render(viji) {\n viji.devices.forEach((device, index) => {\n const hasVideo = device.video?.isConnected ?? false;\n const hasSensors = device.motion !== null;\n // Render device status at position based on index\n });\n}\n```\n\n## What's Available on Each Device\n\n| Feature | Access | Notes |\n|---------|--------|-------|\n| **Identity** | `device.id`, `device.name` | Always available |\n| **Sensors** | `device.motion`, `device.orientation` | See [Device Sensors](sensors/) |\n| **Video** | `device.video` | See [Device Video](video/) |\n\n> [!WARNING]\n> Device video does **not** support Computer Vision (CV) features. CV processing (face detection, hand tracking, etc.) is only available on the main video stream (`viji.video`). The `device.video` object provides video frames only.\n\n## Related\n\n- [Device Video](video/) — accessing camera feeds from connected devices\n- [Device Sensors](sensors/) — accelerometer and orientation from connected devices\n- [Device Sensors (Internal)](../sensors/) — sensors from the device running the scene\n- [P5 External Devices](/p5/external-devices) — same API in the P5 renderer\n- [Shader External Device Uniforms](/shader/external-devices) — GLSL uniforms for external devices"
1885
+ "markdown": "## Common Patterns\n\n### Display Device Count\n\n```javascript\nfunction render(viji) {\n const ctx = viji.canvas.getContext('2d');\n const count = viji.devices.length;\n\n ctx.fillStyle = '#111';\n ctx.fillRect(0, 0, viji.canvas.width, viji.canvas.height);\n\n ctx.fillStyle = '#fff';\n ctx.font = '24px sans-serif';\n ctx.textAlign = 'center';\n ctx.fillText(\n `${count} device${count !== 1 ? 's' : ''} connected`,\n viji.canvas.width / 2,\n viji.canvas.height / 2\n );\n}\n```\n\n### Find Device by Name\n\n```javascript\nfunction render(viji) {\n const phone = viji.devices.find(d => d.name.includes('Phone'));\n if (phone) {\n // Use phone-specific data\n }\n}\n```\n\n### Iterate All Devices\n\n```javascript\nfunction render(viji) {\n viji.devices.forEach((device, index) => {\n const hasVideo = device.video?.isConnected ?? false;\n const hasSensors = device.motion !== null;\n // Render device status at position based on index\n });\n}\n```\n\n## What's Available on Each Device\n\n| Feature | Access | Notes |\n|---------|--------|-------|\n| **Identity** | `device.id`, `device.name` | Always available |\n| **Sensors** | `device.motion`, `device.orientation` | See [Device Sensors](sensors/) |\n| **Video** | `device.video` | See [Device Video](video/) |\n| **Audio** | `device.audio` | See [Device Audio](audio/) |\n\n> [!WARNING]\n> Device video does **not** support Computer Vision (CV) features. CV processing (face detection, hand tracking, etc.) is only available on the main video stream (`viji.video`). The `device.video` object provides video frames only.\n\n> [!NOTE]\n> Device audio provides **lightweight analysis** only — volume, frequency bands, and spectral features. 
Beat detection, BPM tracking, beat triggers, and beat events are only available on the main audio stream (`viji.audio`).\n\n## Related\n\n- [Device Audio](audio/) — audio analysis from connected devices\n- [Device Video](video/) — accessing camera feeds from connected devices\n- [Device Sensors](sensors/) — accelerometer and orientation from connected devices\n- [Device Sensors (Internal)](../sensors/) — sensors from the device running the scene\n- [P5 External Devices](/p5/external-devices) — same API in the P5 renderer\n- [Shader External Device Uniforms](/shader/external-devices) — GLSL uniforms for external devices"
1865
1886
  }
1866
1887
  ]
1867
1888
  },
@@ -1913,6 +1934,17 @@ export const docsApi = {
1913
1934
  }
1914
1935
  ]
1915
1936
  },
1937
+ "native-ext-audio": {
1938
+ "id": "native-ext-audio",
1939
+ "title": "Device Audio",
1940
+ "description": "Lightweight audio analysis from externally connected devices — volume, bands, spectral features, and raw FFT/waveform via AudioStreamAPI.",
1941
+ "content": [
1942
+ {
1943
+ "type": "text",
1944
+ "markdown": "# Device Audio\n\nEach entry in `viji.devices` may expose **`device.audio`**: an [`AudioStreamAPI`](../../audio/) (or `null` when the host has not attached an audio source for that device).\n\n## Behavior\n\n- Check **`device.audio?.isConnected`** before reading values.\n- **Lightweight only:** `isConnected`, `volume`, `bands` (including smoothed), `spectral`, `getFrequencyData()`, `getWaveform()` — same subset as [`viji.audioStreams`](../../audio/#additional-audio-streams) entries.\n- **Not available:** beat energy, triggers, BPM, or beat events (those exist only on the main [`viji.audio`](../../audio/) stream).\n\n## Related\n\n- [External Devices — Overview](../)\n- [Native Audio](../../audio/)\n- [Shader Audio Uniforms](/shader/external-devices/audio)"
1945
+ }
1946
+ ]
1947
+ },
1916
1948
  "p5-quickstart": {
1917
1949
  "id": "p5-quickstart",
1918
1950
  "title": "p5-quickstart",
@@ -1920,7 +1952,7 @@ export const docsApi = {
1920
1952
  "content": [
1921
1953
  {
1922
1954
  "type": "text",
1923
- "markdown": "# P5.js Quick Start\r\n\r\nThe P5.js renderer gives you the familiar Processing/P5.js drawing API. Viji loads P5.js automatically — no installation needed.\r\n\r\n> [!IMPORTANT]\r\n> P5 and shader scenes must declare their renderer type as the first comment:\r\n> ```\r\n> // @renderer p5\r\n> ```\r\n> Without this directive, the scene defaults to the native renderer.\r\n\r\n## Your First Scene"
1955
+ "markdown": "# P5.js Quick Start\r\n\r\nThe P5.js renderer gives you the familiar Processing/P5.js drawing API. Viji loads P5.js automatically — no installation needed.\r\n\r\n> [!IMPORTANT]\r\n> P5 and shader scenes must declare their renderer type as the first comment:\r\n> ```\r\n> // @renderer p5\r\n> ```\r\n> For **WEBGL** on the main canvas, use `// @renderer p5 webgl` instead. Without a P5 directive, the scene defaults to the native renderer.\r\n\r\n## Your First Scene"
1924
1956
  },
1925
1957
  {
1926
1958
  "type": "live-example",
@@ -1941,7 +1973,7 @@ export const docsApi = {
1941
1973
  "content": [
1942
1974
  {
1943
1975
  "type": "text",
1944
- "markdown": "# API Reference\n\nThis page lists every property and method available on the `viji` object passed to your P5 scene functions. The `viji` API is identical to the [Native renderer](/native/api-reference) — the difference is that P5 scenes also receive a `p5` instance as the second argument.\n\nNew to Viji P5? Start with the [Quick Start](/p5/quickstart) instead.\n\n## Entry Points\n\n```javascript\n// @renderer p5\n\nfunction setup(viji, p5) {\n // Called once when the scene starts (optional)\n}\n\nfunction render(viji, p5) {\n // Called every frame\n}\n```\n\nThe `p5` parameter is a full [P5.js](https://p5js.org/reference/) instance (v1.9.4) in instance mode. All P5 drawing methods (`p5.rect()`, `p5.fill()`, `p5.ellipse()`, etc.) are accessed through it. See [Drawing with P5](/p5/drawing) for Viji-specific drawing patterns.\n\n## Canvas & Context\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.canvas`](/p5/canvas-resolution) | `OffscreenCanvas` | The rendering canvas | [Canvas & Resolution](/p5/canvas-resolution) |\n| [`viji.width`](/p5/canvas-resolution) | `number` | Canvas width in pixels | [Canvas & Resolution](/p5/canvas-resolution) |\n| [`viji.height`](/p5/canvas-resolution) | `number` | Canvas height in pixels | [Canvas & Resolution](/p5/canvas-resolution) |\n\n> [!WARNING]\n> `viji.useContext()` is **not available** in P5 scenes. The canvas and 2D rendering context are managed by P5 internally. 
Calling `useContext()` would conflict with P5's rendering pipeline.\n\n## Timing\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.time`](/p5/timing) | `number` | Seconds elapsed since the scene started | [Timing](/p5/timing) |\n| [`viji.deltaTime`](/p5/timing) | `number` | Seconds since the previous frame | [Timing](/p5/timing) |\n| [`viji.frameCount`](/p5/timing) | `number` | Monotonically increasing frame counter | [Timing](/p5/timing) |\n| [`viji.fps`](/p5/timing) | `number` | Target FPS based on the host's frame rate mode | [Timing](/p5/timing) |\n\n## Parameters\n\nAll parameter methods are called at the top level of your scene file. Read `.value` inside `render()` to get the current value.\n\n| Method | Returns | `.value` Type | Details |\n|--------|---------|---------------|---------|\n| [`viji.slider(default, config)`](/p5/parameters/slider) | `SliderParameter` | `number` | [Slider](/p5/parameters/slider) |\n| [`viji.color(default, config)`](/p5/parameters/color) | `ColorParameter` | `string` (hex) | [Color](/p5/parameters/color) |\n| [`viji.toggle(default, config)`](/p5/parameters/toggle) | `ToggleParameter` | `boolean` | [Toggle](/p5/parameters/toggle) |\n| [`viji.select(default, config)`](/p5/parameters/select) | `SelectParameter` | `string \\| number` | [Select](/p5/parameters/select) |\n| [`viji.number(default, config)`](/p5/parameters/number) | `NumberParameter` | `number` | [Number](/p5/parameters/number) |\n| [`viji.text(default, config)`](/p5/parameters/text) | `TextParameter` | `string` | [Text](/p5/parameters/text) |\n| [`viji.image(null, config)`](/p5/parameters/image) | `ImageParameter` | `ImageBitmap \\| null` | [Image](/p5/parameters/image) |\n| [`viji.button(config)`](/p5/parameters/button) | `ButtonParameter` | `boolean` (true for one frame) | [Button](/p5/parameters/button) |\n\n> [!NOTE]\n> Image parameters have a `.value.p5` property for use with `p5.image()`. 
See [Drawing with P5 — Image Parameters](/p5/drawing) for the pattern.\n\nSee [Parameters Overview](/p5/parameters) for the declaration pattern, [Grouping](/p5/parameters/grouping) and [Categories](/p5/parameters/categories) for organization.\n\n## Audio\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.audio.isConnected`](/p5/audio) | `boolean` | Whether an audio source is active | [Overview](/p5/audio) |\n| [`viji.audio.volume.current`](/p5/audio/volume) | `number` | Current RMS volume 0–1 | [Volume](/p5/audio/volume) |\n| [`viji.audio.volume.peak`](/p5/audio/volume) | `number` | Peak volume 0–1 | [Volume](/p5/audio/volume) |\n| [`viji.audio.volume.smoothed`](/p5/audio/volume) | `number` | Smoothed volume 0–1 | [Volume](/p5/audio/volume) |\n| [`viji.audio.bands.low`](/p5/audio/bands) | `number` | Low frequency band energy (20–120 Hz) | [Frequency Bands](/p5/audio/bands) |\n| [`viji.audio.bands.lowMid`](/p5/audio/bands) | `number` | Low-mid band energy (120–500 Hz) | [Frequency Bands](/p5/audio/bands) |\n| [`viji.audio.bands.mid`](/p5/audio/bands) | `number` | Mid band energy (500–2 kHz) | [Frequency Bands](/p5/audio/bands) |\n| [`viji.audio.bands.highMid`](/p5/audio/bands) | `number` | High-mid band energy (2–6 kHz) | [Frequency Bands](/p5/audio/bands) |\n| [`viji.audio.bands.high`](/p5/audio/bands) | `number` | High band energy (6–16 kHz) | [Frequency Bands](/p5/audio/bands) |\n| [`viji.audio.bands.lowSmoothed`](/p5/audio/bands) | `number` | Smoothed low band | [Frequency Bands](/p5/audio/bands) |\n| [`viji.audio.bands.lowMidSmoothed`](/p5/audio/bands) | `number` | Smoothed low-mid band | [Frequency Bands](/p5/audio/bands) |\n| [`viji.audio.bands.midSmoothed`](/p5/audio/bands) | `number` | Smoothed mid band | [Frequency Bands](/p5/audio/bands) |\n| [`viji.audio.bands.highMidSmoothed`](/p5/audio/bands) | `number` | Smoothed high-mid band | [Frequency Bands](/p5/audio/bands) |\n| 
[`viji.audio.bands.highSmoothed`](/p5/audio/bands) | `number` | Smoothed high band | [Frequency Bands](/p5/audio/bands) |\n| [`viji.audio.beat.kick`](/p5/audio/beat) | `number` | Kick beat energy | [Beat Detection](/p5/audio/beat) |\n| [`viji.audio.beat.snare`](/p5/audio/beat) | `number` | Snare beat energy | [Beat Detection](/p5/audio/beat) |\n| [`viji.audio.beat.hat`](/p5/audio/beat) | `number` | Hi-hat beat energy | [Beat Detection](/p5/audio/beat) |\n| [`viji.audio.beat.any`](/p5/audio/beat) | `number` | Combined beat energy | [Beat Detection](/p5/audio/beat) |\n| [`viji.audio.beat.kickSmoothed`](/p5/audio/beat) | `number` | Smoothed kick | [Beat Detection](/p5/audio/beat) |\n| [`viji.audio.beat.snareSmoothed`](/p5/audio/beat) | `number` | Smoothed snare | [Beat Detection](/p5/audio/beat) |\n| [`viji.audio.beat.hatSmoothed`](/p5/audio/beat) | `number` | Smoothed hi-hat | [Beat Detection](/p5/audio/beat) |\n| [`viji.audio.beat.anySmoothed`](/p5/audio/beat) | `number` | Smoothed combined | [Beat Detection](/p5/audio/beat) |\n| [`viji.audio.beat.triggers.kick`](/p5/audio/beat) | `boolean` | Kick trigger (true for one frame) | [Beat Detection](/p5/audio/beat) |\n| [`viji.audio.beat.triggers.snare`](/p5/audio/beat) | `boolean` | Snare trigger (true for one frame) | [Beat Detection](/p5/audio/beat) |\n| [`viji.audio.beat.triggers.hat`](/p5/audio/beat) | `boolean` | Hi-hat trigger (true for one frame) | [Beat Detection](/p5/audio/beat) |\n| [`viji.audio.beat.triggers.any`](/p5/audio/beat) | `boolean` | Any beat trigger (true for one frame) | [Beat Detection](/p5/audio/beat) |\n| [`viji.audio.beat.events`](/p5/audio/beat) | `Array<{ type, time, strength }>` | Beat events this frame | [Beat Detection](/p5/audio/beat) |\n| [`viji.audio.beat.bpm`](/p5/audio/beat) | `number` | Tracked BPM | [Beat Detection](/p5/audio/beat) |\n| [`viji.audio.beat.confidence`](/p5/audio/beat) | `number` | Beat-tracker confidence 0–1 | [Beat Detection](/p5/audio/beat) |\n| 
[`viji.audio.beat.isLocked`](/p5/audio/beat) | `boolean` | Whether beat tracking is locked | [Beat Detection](/p5/audio/beat) |\n| [`viji.audio.spectral.brightness`](/p5/audio/spectral) | `number` | Spectral brightness 0–1 | [Spectral Analysis](/p5/audio/spectral) |\n| [`viji.audio.spectral.flatness`](/p5/audio/spectral) | `number` | Spectral flatness 0–1 | [Spectral Analysis](/p5/audio/spectral) |\n| [`viji.audio.getFrequencyData()`](/p5/audio/frequency-data) | `() => Uint8Array` | Raw FFT frequency bins (0–255) | [Frequency Data](/p5/audio/frequency-data) |\n| [`viji.audio.getWaveform()`](/p5/audio/waveform) | `() => Float32Array` | Time-domain waveform (-1 to 1) | [Waveform](/p5/audio/waveform) |\n\n## Video & CV\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.video.isConnected`](/p5/video) | `boolean` | Whether a video source is active | [Overview](/p5/video) |\n| [`viji.video.currentFrame`](/p5/video/basics) | `OffscreenCanvas \\| ImageBitmap \\| null` | Current video frame | [Video Basics](/p5/video/basics) |\n| [`viji.video.frameWidth`](/p5/video/basics) | `number` | Frame width in pixels | [Video Basics](/p5/video/basics) |\n| [`viji.video.frameHeight`](/p5/video/basics) | `number` | Frame height in pixels | [Video Basics](/p5/video/basics) |\n| [`viji.video.frameRate`](/p5/video/basics) | `number` | Video frame rate | [Video Basics](/p5/video/basics) |\n| [`viji.video.getFrameData()`](/p5/video/basics) | `() => ImageData \\| null` | Pixel data for the current frame | [Video Basics](/p5/video/basics) |\n| [`viji.video.faces`](/p5/video/face-detection) | `FaceData[]` | Detected faces | [Face Detection](/p5/video/face-detection) |\n| [`viji.video.hands`](/p5/video/hand-tracking) | `HandData[]` | Detected hands | [Hand Tracking](/p5/video/hand-tracking) |\n| [`viji.video.pose`](/p5/video/pose-detection) | `PoseData \\| null` | Detected body pose | [Pose Detection](/p5/video/pose-detection) |\n| 
[`viji.video.segmentation`](/p5/video/body-segmentation) | `SegmentationData \\| null` | Body segmentation mask | [Body Segmentation](/p5/video/body-segmentation) |\n| [`viji.video.cv.enableFaceDetection(enabled)`](/p5/video/face-detection) | `(boolean) => Promise<void>` | Enable/disable face detection | [Face Detection](/p5/video/face-detection) |\n| [`viji.video.cv.enableFaceMesh(enabled)`](/p5/video/face-mesh) | `(boolean) => Promise<void>` | Enable/disable face mesh | [Face Mesh](/p5/video/face-mesh) |\n| [`viji.video.cv.enableEmotionDetection(enabled)`](/p5/video/emotion-detection) | `(boolean) => Promise<void>` | Enable/disable emotion detection | [Emotion Detection](/p5/video/emotion-detection) |\n| [`viji.video.cv.enableHandTracking(enabled)`](/p5/video/hand-tracking) | `(boolean) => Promise<void>` | Enable/disable hand tracking | [Hand Tracking](/p5/video/hand-tracking) |\n| [`viji.video.cv.enablePoseDetection(enabled)`](/p5/video/pose-detection) | `(boolean) => Promise<void>` | Enable/disable pose detection | [Pose Detection](/p5/video/pose-detection) |\n| [`viji.video.cv.enableBodySegmentation(enabled)`](/p5/video/body-segmentation) | `(boolean) => Promise<void>` | Enable/disable body segmentation | [Body Segmentation](/p5/video/body-segmentation) |\n| [`viji.video.cv.getActiveFeatures()`](/p5/video) | `() => CVFeature[]` | List of active CV features | [Overview](/p5/video) |\n| [`viji.video.cv.isProcessing()`](/p5/video) | `() => boolean` | Whether CV is currently processing | [Overview](/p5/video) |\n\n## Mouse\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.mouse.x`](/p5/mouse) | `number` | Cursor X position in pixels | [Mouse](/p5/mouse) |\n| [`viji.mouse.y`](/p5/mouse) | `number` | Cursor Y position in pixels | [Mouse](/p5/mouse) |\n| [`viji.mouse.isInCanvas`](/p5/mouse) | `boolean` | Whether cursor is inside the canvas | [Mouse](/p5/mouse) |\n| [`viji.mouse.isPressed`](/p5/mouse) | `boolean` | 
Whether any button is pressed | [Mouse](/p5/mouse) |\n| [`viji.mouse.leftButton`](/p5/mouse) | `boolean` | Left button state | [Mouse](/p5/mouse) |\n| [`viji.mouse.rightButton`](/p5/mouse) | `boolean` | Right button state | [Mouse](/p5/mouse) |\n| [`viji.mouse.middleButton`](/p5/mouse) | `boolean` | Middle button state | [Mouse](/p5/mouse) |\n| [`viji.mouse.deltaX`](/p5/mouse) | `number` | Horizontal movement since last frame | [Mouse](/p5/mouse) |\n| [`viji.mouse.deltaY`](/p5/mouse) | `number` | Vertical movement since last frame | [Mouse](/p5/mouse) |\n| [`viji.mouse.wheelDelta`](/p5/mouse) | `number` | Scroll wheel delta | [Mouse](/p5/mouse) |\n| [`viji.mouse.wheelX`](/p5/mouse) | `number` | Horizontal scroll delta | [Mouse](/p5/mouse) |\n| [`viji.mouse.wheelY`](/p5/mouse) | `number` | Vertical scroll delta | [Mouse](/p5/mouse) |\n| [`viji.mouse.wasPressed`](/p5/mouse) | `boolean` | True for one frame when pressed | [Mouse](/p5/mouse) |\n| [`viji.mouse.wasReleased`](/p5/mouse) | `boolean` | True for one frame when released | [Mouse](/p5/mouse) |\n| [`viji.mouse.wasMoved`](/p5/mouse) | `boolean` | True for one frame when moved | [Mouse](/p5/mouse) |\n\n## Keyboard\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.keyboard.isPressed(key)`](/p5/keyboard) | `(string) => boolean` | Whether a key is currently held | [Keyboard](/p5/keyboard) |\n| [`viji.keyboard.wasPressed(key)`](/p5/keyboard) | `(string) => boolean` | True for one frame when pressed | [Keyboard](/p5/keyboard) |\n| [`viji.keyboard.wasReleased(key)`](/p5/keyboard) | `(string) => boolean` | True for one frame when released | [Keyboard](/p5/keyboard) |\n| [`viji.keyboard.activeKeys`](/p5/keyboard) | `Set<string>` | All currently held keys | [Keyboard](/p5/keyboard) |\n| [`viji.keyboard.pressedThisFrame`](/p5/keyboard) | `Set<string>` | Keys pressed this frame | [Keyboard](/p5/keyboard) |\n| [`viji.keyboard.releasedThisFrame`](/p5/keyboard) | `Set<string>` | 
Keys released this frame | [Keyboard](/p5/keyboard) |\n| [`viji.keyboard.lastKeyPressed`](/p5/keyboard) | `string` | Most recently pressed key | [Keyboard](/p5/keyboard) |\n| [`viji.keyboard.lastKeyReleased`](/p5/keyboard) | `string` | Most recently released key | [Keyboard](/p5/keyboard) |\n| [`viji.keyboard.shift`](/p5/keyboard) | `boolean` | Shift key state | [Keyboard](/p5/keyboard) |\n| [`viji.keyboard.ctrl`](/p5/keyboard) | `boolean` | Ctrl/Cmd key state | [Keyboard](/p5/keyboard) |\n| [`viji.keyboard.alt`](/p5/keyboard) | `boolean` | Alt/Option key state | [Keyboard](/p5/keyboard) |\n| [`viji.keyboard.meta`](/p5/keyboard) | `boolean` | Meta/Win key state | [Keyboard](/p5/keyboard) |\n\n## Touch\n\n> **Note:** The property is `viji.touches` (plural), not `viji.touch`.\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.touches.points`](/p5/touch) | `TouchPoint[]` | All active touch points | [Touch](/p5/touch) |\n| [`viji.touches.count`](/p5/touch) | `number` | Number of active touches | [Touch](/p5/touch) |\n| [`viji.touches.started`](/p5/touch) | `TouchPoint[]` | Touch points that started this frame | [Touch](/p5/touch) |\n| [`viji.touches.moved`](/p5/touch) | `TouchPoint[]` | Touch points that moved this frame | [Touch](/p5/touch) |\n| [`viji.touches.ended`](/p5/touch) | `TouchPoint[]` | Touch points that ended this frame | [Touch](/p5/touch) |\n| [`viji.touches.primary`](/p5/touch) | `TouchPoint \\| null` | The first active touch point | [Touch](/p5/touch) |\n\n**`TouchPoint` fields:** `id`, `x`, `y`, `pressure`, `radius`, `radiusX`, `radiusY`, `rotationAngle`, `force` (numbers); `isInCanvas`, `isNew`, `isActive`, `isEnding` (booleans); `deltaX`, `deltaY` (numbers); `velocity` `{ x, y }`.\n\n## Pointer (Unified)\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.pointer.x`](/p5/pointer) | `number` | Primary pointer X position | [Pointer](/p5/pointer) |\n| 
[`viji.pointer.y`](/p5/pointer) | `number` | Primary pointer Y position | [Pointer](/p5/pointer) |\n| [`viji.pointer.deltaX`](/p5/pointer) | `number` | Horizontal movement since last frame | [Pointer](/p5/pointer) |\n| [`viji.pointer.deltaY`](/p5/pointer) | `number` | Vertical movement since last frame | [Pointer](/p5/pointer) |\n| [`viji.pointer.isDown`](/p5/pointer) | `boolean` | Whether the pointer is active (click or touch) | [Pointer](/p5/pointer) |\n| [`viji.pointer.wasPressed`](/p5/pointer) | `boolean` | True for one frame when pressed | [Pointer](/p5/pointer) |\n| [`viji.pointer.wasReleased`](/p5/pointer) | `boolean` | True for one frame when released | [Pointer](/p5/pointer) |\n| [`viji.pointer.isInCanvas`](/p5/pointer) | `boolean` | Whether pointer is inside the canvas | [Pointer](/p5/pointer) |\n| [`viji.pointer.type`](/p5/pointer) | `'mouse' \\| 'touch' \\| 'none'` | Current input source | [Pointer](/p5/pointer) |\n\n## Device Sensors\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.device.motion`](/p5/sensors) | `DeviceMotionData \\| null` | Accelerometer and gyroscope data | [Device Sensors](/p5/sensors) |\n| [`viji.device.orientation`](/p5/sensors) | `DeviceOrientationData \\| null` | Device orientation (alpha, beta, gamma) | [Device Sensors](/p5/sensors) |\n\n**`DeviceMotionData`:** `acceleration` `{ x, y, z }`, `accelerationIncludingGravity` `{ x, y, z }`, `rotationRate` `{ alpha, beta, gamma }`, `interval`.\n\n**`DeviceOrientationData`:** `alpha`, `beta`, `gamma` (numbers or null), `absolute` (boolean).\n\n## External Devices\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.devices`](/p5/external-devices) | `DeviceState[]` | Connected external devices | [Overview](/p5/external-devices) |\n| [`viji.devices[i].id`](/p5/external-devices) | `string` | Unique device identifier | [Overview](/p5/external-devices) |\n| 
[`viji.devices[i].name`](/p5/external-devices) | `string` | User-friendly device name | [Overview](/p5/external-devices) |\n| [`viji.devices[i].motion`](/p5/external-devices/sensors) | `DeviceMotionData \\| null` | Device accelerometer/gyroscope | [Device Sensors](/p5/external-devices/sensors) |\n| [`viji.devices[i].orientation`](/p5/external-devices/sensors) | `DeviceOrientationData \\| null` | Device orientation | [Device Sensors](/p5/external-devices/sensors) |\n| [`viji.devices[i].video`](/p5/external-devices/video) | `VideoAPI \\| null` | Device camera video | [Device Video](/p5/external-devices/video) |\n\n## Streams\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `viji.streams` | `VideoAPI[]` | Additional video streams provided by the host |\n\nEach element has the same shape as [`viji.video`](/p5/video). Streams are additional video sources injected by the host application — they are used internally by Viji's compositor for mixing multiple scenes together. The array may be empty if the host does not provide additional streams. Your scene can read them the same way it reads `viji.video`.\n\n## Related\n\n- [Quick Start](/p5/quickstart) — getting started with the P5 renderer\n- [Scene Structure](/p5/scene-structure) — setup/render pattern and instance mode\n- [Drawing with P5](/p5/drawing) — Viji-specific drawing patterns\n- [Best Practices](/getting-started/best-practices) — essential patterns for all renderers\n- [Native API Reference](/native/api-reference) — the same API in the Native renderer\n- [Shader API Reference](/shader/api-reference) — built-in uniforms for shaders\n- [P5.js Reference](https://p5js.org/reference/) — official P5.js documentation"
+ "markdown": "# API Reference\n\nThis page lists every property and method available on the `viji` object passed to your P5 scene functions. The `viji` API is identical to the [Native renderer](/native/api-reference) — the difference is that P5 scenes also receive a `p5` instance as the second argument.\n\nNew to Viji P5? Start with the [Quick Start](/p5/quickstart) instead.\n\n## Entry Points\n\n```javascript\n// @renderer p5\n\nfunction setup(viji, p5) {\n // Called once when the scene starts (optional)\n}\n\nfunction render(viji, p5) {\n // Called every frame\n}\n```\n\nThe `p5` parameter is a full [P5.js](https://p5js.org/reference/) instance (v1.9.4) in instance mode. All P5 drawing methods (`p5.rect()`, `p5.fill()`, `p5.ellipse()`, etc.) are accessed through it. See [Drawing with P5](/p5/drawing) for Viji-specific drawing patterns.\n\n## Canvas & Context\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.canvas`](/p5/canvas-resolution) | `OffscreenCanvas` | The rendering canvas | [Canvas & Resolution](/p5/canvas-resolution) |\n| [`viji.width`](/p5/canvas-resolution) | `number` | Canvas width in pixels | [Canvas & Resolution](/p5/canvas-resolution) |\n| [`viji.height`](/p5/canvas-resolution) | `number` | Canvas height in pixels | [Canvas & Resolution](/p5/canvas-resolution) |\n\n> [!WARNING]\n> `viji.useContext()` is **not available** in P5 scenes. The canvas and its rendering context (2D or WEBGL, depending on the directive) are managed by P5 internally. 
Calling `useContext()` would conflict with P5's rendering pipeline.\n\n## Timing\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.time`](/p5/timing) | `number` | Seconds elapsed since the scene started | [Timing](/p5/timing) |\n| [`viji.deltaTime`](/p5/timing) | `number` | Seconds since the previous frame | [Timing](/p5/timing) |\n| [`viji.frameCount`](/p5/timing) | `number` | Monotonically increasing frame counter | [Timing](/p5/timing) |\n| [`viji.fps`](/p5/timing) | `number` | Target FPS based on the host's frame rate mode | [Timing](/p5/timing) |\n\n## Parameters\n\nAll parameter methods are called at the top level of your scene file. Read `.value` inside `render()` to get the current value.\n\n| Method | Returns | `.value` Type | Details |\n|--------|---------|---------------|---------|\n| [`viji.slider(default, config)`](/p5/parameters/slider) | `SliderParameter` | `number` | [Slider](/p5/parameters/slider) |\n| [`viji.color(default, config)`](/p5/parameters/color) | `ColorParameter` | `string` (hex) | [Color](/p5/parameters/color) |\n| [`viji.toggle(default, config)`](/p5/parameters/toggle) | `ToggleParameter` | `boolean` | [Toggle](/p5/parameters/toggle) |\n| [`viji.select(default, config)`](/p5/parameters/select) | `SelectParameter` | `string \\| number` | [Select](/p5/parameters/select) |\n| [`viji.number(default, config)`](/p5/parameters/number) | `NumberParameter` | `number` | [Number](/p5/parameters/number) |\n| [`viji.text(default, config)`](/p5/parameters/text) | `TextParameter` | `string` | [Text](/p5/parameters/text) |\n| [`viji.image(null, config)`](/p5/parameters/image) | `ImageParameter` | `ImageBitmap \\| null` | [Image](/p5/parameters/image) |\n| [`viji.button(config)`](/p5/parameters/button) | `ButtonParameter` | `boolean` (true for one frame) | [Button](/p5/parameters/button) |\n\n> [!NOTE]\n> Image parameters have a `.value.p5` property for use with `p5.image()`. 
See [Drawing with P5 — Image Parameters](/p5/drawing) for the pattern.\n\nSee [Parameters Overview](/p5/parameters) for the declaration pattern, [Grouping](/p5/parameters/grouping) and [Categories](/p5/parameters/categories) for organization.\n\n## Audio\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.audio.isConnected`](/p5/audio) | `boolean` | Whether an audio source is active | [Overview](/p5/audio) |\n| [`viji.audio.volume.current`](/p5/audio/volume) | `number` | Current RMS volume 0–1 | [Volume](/p5/audio/volume) |\n| [`viji.audio.volume.peak`](/p5/audio/volume) | `number` | Peak volume 0–1 | [Volume](/p5/audio/volume) |\n| [`viji.audio.volume.smoothed`](/p5/audio/volume) | `number` | Smoothed volume 0–1 | [Volume](/p5/audio/volume) |\n| [`viji.audio.bands.low`](/p5/audio/bands) | `number` | Low frequency band energy (20–120 Hz) | [Frequency Bands](/p5/audio/bands) |\n| [`viji.audio.bands.lowMid`](/p5/audio/bands) | `number` | Low-mid band energy (120–500 Hz) | [Frequency Bands](/p5/audio/bands) |\n| [`viji.audio.bands.mid`](/p5/audio/bands) | `number` | Mid band energy (500–2 kHz) | [Frequency Bands](/p5/audio/bands) |\n| [`viji.audio.bands.highMid`](/p5/audio/bands) | `number` | High-mid band energy (2–6 kHz) | [Frequency Bands](/p5/audio/bands) |\n| [`viji.audio.bands.high`](/p5/audio/bands) | `number` | High band energy (6–16 kHz) | [Frequency Bands](/p5/audio/bands) |\n| [`viji.audio.bands.lowSmoothed`](/p5/audio/bands) | `number` | Smoothed low band | [Frequency Bands](/p5/audio/bands) |\n| [`viji.audio.bands.lowMidSmoothed`](/p5/audio/bands) | `number` | Smoothed low-mid band | [Frequency Bands](/p5/audio/bands) |\n| [`viji.audio.bands.midSmoothed`](/p5/audio/bands) | `number` | Smoothed mid band | [Frequency Bands](/p5/audio/bands) |\n| [`viji.audio.bands.highMidSmoothed`](/p5/audio/bands) | `number` | Smoothed high-mid band | [Frequency Bands](/p5/audio/bands) |\n| 
[`viji.audio.bands.highSmoothed`](/p5/audio/bands) | `number` | Smoothed high band | [Frequency Bands](/p5/audio/bands) |\n| [`viji.audio.beat.kick`](/p5/audio/beat) | `number` | Kick beat energy | [Beat Detection](/p5/audio/beat) |\n| [`viji.audio.beat.snare`](/p5/audio/beat) | `number` | Snare beat energy | [Beat Detection](/p5/audio/beat) |\n| [`viji.audio.beat.hat`](/p5/audio/beat) | `number` | Hi-hat beat energy | [Beat Detection](/p5/audio/beat) |\n| [`viji.audio.beat.any`](/p5/audio/beat) | `number` | Combined beat energy | [Beat Detection](/p5/audio/beat) |\n| [`viji.audio.beat.kickSmoothed`](/p5/audio/beat) | `number` | Smoothed kick | [Beat Detection](/p5/audio/beat) |\n| [`viji.audio.beat.snareSmoothed`](/p5/audio/beat) | `number` | Smoothed snare | [Beat Detection](/p5/audio/beat) |\n| [`viji.audio.beat.hatSmoothed`](/p5/audio/beat) | `number` | Smoothed hi-hat | [Beat Detection](/p5/audio/beat) |\n| [`viji.audio.beat.anySmoothed`](/p5/audio/beat) | `number` | Smoothed combined | [Beat Detection](/p5/audio/beat) |\n| [`viji.audio.beat.triggers.kick`](/p5/audio/beat) | `boolean` | Kick trigger (true for one frame) | [Beat Detection](/p5/audio/beat) |\n| [`viji.audio.beat.triggers.snare`](/p5/audio/beat) | `boolean` | Snare trigger (true for one frame) | [Beat Detection](/p5/audio/beat) |\n| [`viji.audio.beat.triggers.hat`](/p5/audio/beat) | `boolean` | Hi-hat trigger (true for one frame) | [Beat Detection](/p5/audio/beat) |\n| [`viji.audio.beat.triggers.any`](/p5/audio/beat) | `boolean` | Any beat trigger (true for one frame) | [Beat Detection](/p5/audio/beat) |\n| [`viji.audio.beat.events`](/p5/audio/beat) | `Array<{ type, time, strength }>` | Beat events this frame | [Beat Detection](/p5/audio/beat) |\n| [`viji.audio.beat.bpm`](/p5/audio/beat) | `number` | Tracked BPM | [Beat Detection](/p5/audio/beat) |\n| [`viji.audio.beat.confidence`](/p5/audio/beat) | `number` | Beat-tracker confidence 0–1 | [Beat Detection](/p5/audio/beat) |\n| 
[`viji.audio.beat.isLocked`](/p5/audio/beat) | `boolean` | Whether beat tracking is locked | [Beat Detection](/p5/audio/beat) |\n| [`viji.audio.spectral.brightness`](/p5/audio/spectral) | `number` | Spectral brightness 0–1 | [Spectral Analysis](/p5/audio/spectral) |\n| [`viji.audio.spectral.flatness`](/p5/audio/spectral) | `number` | Spectral flatness 0–1 | [Spectral Analysis](/p5/audio/spectral) |\n| [`viji.audio.getFrequencyData()`](/p5/audio/frequency-data) | `() => Uint8Array` | Raw FFT frequency bins (0–255) | [Frequency Data](/p5/audio/frequency-data) |\n| [`viji.audio.getWaveform()`](/p5/audio/waveform) | `() => Float32Array` | Time-domain waveform (-1 to 1) | [Waveform](/p5/audio/waveform) |\n\n## Video & CV\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.video.isConnected`](/p5/video) | `boolean` | Whether a video source is active | [Overview](/p5/video) |\n| [`viji.video.currentFrame`](/p5/video/basics) | `OffscreenCanvas \\| ImageBitmap \\| null` | Current video frame | [Video Basics](/p5/video/basics) |\n| [`viji.video.frameWidth`](/p5/video/basics) | `number` | Frame width in pixels | [Video Basics](/p5/video/basics) |\n| [`viji.video.frameHeight`](/p5/video/basics) | `number` | Frame height in pixels | [Video Basics](/p5/video/basics) |\n| [`viji.video.frameRate`](/p5/video/basics) | `number` | Video frame rate | [Video Basics](/p5/video/basics) |\n| [`viji.video.getFrameData()`](/p5/video/basics) | `() => ImageData \\| null` | Pixel data for the current frame | [Video Basics](/p5/video/basics) |\n| [`viji.video.faces`](/p5/video/face-detection) | `FaceData[]` | Detected faces | [Face Detection](/p5/video/face-detection) |\n| [`viji.video.hands`](/p5/video/hand-tracking) | `HandData[]` | Detected hands | [Hand Tracking](/p5/video/hand-tracking) |\n| [`viji.video.pose`](/p5/video/pose-detection) | `PoseData \\| null` | Detected body pose | [Pose Detection](/p5/video/pose-detection) |\n| 
[`viji.video.segmentation`](/p5/video/body-segmentation) | `SegmentationData \\| null` | Body segmentation mask | [Body Segmentation](/p5/video/body-segmentation) |\n| [`viji.video.cv.enableFaceDetection(enabled)`](/p5/video/face-detection) | `(boolean) => Promise<void>` | Enable/disable face detection | [Face Detection](/p5/video/face-detection) |\n| [`viji.video.cv.enableFaceMesh(enabled)`](/p5/video/face-mesh) | `(boolean) => Promise<void>` | Enable/disable face mesh | [Face Mesh](/p5/video/face-mesh) |\n| [`viji.video.cv.enableEmotionDetection(enabled)`](/p5/video/emotion-detection) | `(boolean) => Promise<void>` | Enable/disable emotion detection | [Emotion Detection](/p5/video/emotion-detection) |\n| [`viji.video.cv.enableHandTracking(enabled)`](/p5/video/hand-tracking) | `(boolean) => Promise<void>` | Enable/disable hand tracking | [Hand Tracking](/p5/video/hand-tracking) |\n| [`viji.video.cv.enablePoseDetection(enabled)`](/p5/video/pose-detection) | `(boolean) => Promise<void>` | Enable/disable pose detection | [Pose Detection](/p5/video/pose-detection) |\n| [`viji.video.cv.enableBodySegmentation(enabled)`](/p5/video/body-segmentation) | `(boolean) => Promise<void>` | Enable/disable body segmentation | [Body Segmentation](/p5/video/body-segmentation) |\n| [`viji.video.cv.getActiveFeatures()`](/p5/video) | `() => CVFeature[]` | List of active CV features | [Overview](/p5/video) |\n| [`viji.video.cv.isProcessing()`](/p5/video) | `() => boolean` | Whether CV is currently processing | [Overview](/p5/video) |\n\n## Mouse\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.mouse.x`](/p5/mouse) | `number` | Cursor X position in pixels | [Mouse](/p5/mouse) |\n| [`viji.mouse.y`](/p5/mouse) | `number` | Cursor Y position in pixels | [Mouse](/p5/mouse) |\n| [`viji.mouse.isInCanvas`](/p5/mouse) | `boolean` | Whether cursor is inside the canvas | [Mouse](/p5/mouse) |\n| [`viji.mouse.isPressed`](/p5/mouse) | `boolean` | 
Whether any button is pressed | [Mouse](/p5/mouse) |\n| [`viji.mouse.leftButton`](/p5/mouse) | `boolean` | Left button state | [Mouse](/p5/mouse) |\n| [`viji.mouse.rightButton`](/p5/mouse) | `boolean` | Right button state | [Mouse](/p5/mouse) |\n| [`viji.mouse.middleButton`](/p5/mouse) | `boolean` | Middle button state | [Mouse](/p5/mouse) |\n| [`viji.mouse.deltaX`](/p5/mouse) | `number` | Horizontal movement since last frame | [Mouse](/p5/mouse) |\n| [`viji.mouse.deltaY`](/p5/mouse) | `number` | Vertical movement since last frame | [Mouse](/p5/mouse) |\n| [`viji.mouse.wheelDelta`](/p5/mouse) | `number` | Scroll wheel delta | [Mouse](/p5/mouse) |\n| [`viji.mouse.wheelX`](/p5/mouse) | `number` | Horizontal scroll delta | [Mouse](/p5/mouse) |\n| [`viji.mouse.wheelY`](/p5/mouse) | `number` | Vertical scroll delta | [Mouse](/p5/mouse) |\n| [`viji.mouse.wasPressed`](/p5/mouse) | `boolean` | True for one frame when pressed | [Mouse](/p5/mouse) |\n| [`viji.mouse.wasReleased`](/p5/mouse) | `boolean` | True for one frame when released | [Mouse](/p5/mouse) |\n| [`viji.mouse.wasMoved`](/p5/mouse) | `boolean` | True for one frame when moved | [Mouse](/p5/mouse) |\n\n## Keyboard\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.keyboard.isPressed(key)`](/p5/keyboard) | `(string) => boolean` | Whether a key is currently held | [Keyboard](/p5/keyboard) |\n| [`viji.keyboard.wasPressed(key)`](/p5/keyboard) | `(string) => boolean` | True for one frame when pressed | [Keyboard](/p5/keyboard) |\n| [`viji.keyboard.wasReleased(key)`](/p5/keyboard) | `(string) => boolean` | True for one frame when released | [Keyboard](/p5/keyboard) |\n| [`viji.keyboard.activeKeys`](/p5/keyboard) | `Set<string>` | All currently held keys | [Keyboard](/p5/keyboard) |\n| [`viji.keyboard.pressedThisFrame`](/p5/keyboard) | `Set<string>` | Keys pressed this frame | [Keyboard](/p5/keyboard) |\n| [`viji.keyboard.releasedThisFrame`](/p5/keyboard) | `Set<string>` | 
Keys released this frame | [Keyboard](/p5/keyboard) |\n| [`viji.keyboard.lastKeyPressed`](/p5/keyboard) | `string` | Most recently pressed key | [Keyboard](/p5/keyboard) |\n| [`viji.keyboard.lastKeyReleased`](/p5/keyboard) | `string` | Most recently released key | [Keyboard](/p5/keyboard) |\n| [`viji.keyboard.shift`](/p5/keyboard) | `boolean` | Shift key state | [Keyboard](/p5/keyboard) |\n| [`viji.keyboard.ctrl`](/p5/keyboard) | `boolean` | Ctrl/Cmd key state | [Keyboard](/p5/keyboard) |\n| [`viji.keyboard.alt`](/p5/keyboard) | `boolean` | Alt/Option key state | [Keyboard](/p5/keyboard) |\n| [`viji.keyboard.meta`](/p5/keyboard) | `boolean` | Meta/Win key state | [Keyboard](/p5/keyboard) |\n\n## Touch\n\n> **Note:** The property is `viji.touches` (plural), not `viji.touch`.\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.touches.points`](/p5/touch) | `TouchPoint[]` | All active touch points | [Touch](/p5/touch) |\n| [`viji.touches.count`](/p5/touch) | `number` | Number of active touches | [Touch](/p5/touch) |\n| [`viji.touches.started`](/p5/touch) | `TouchPoint[]` | Touch points that started this frame | [Touch](/p5/touch) |\n| [`viji.touches.moved`](/p5/touch) | `TouchPoint[]` | Touch points that moved this frame | [Touch](/p5/touch) |\n| [`viji.touches.ended`](/p5/touch) | `TouchPoint[]` | Touch points that ended this frame | [Touch](/p5/touch) |\n| [`viji.touches.primary`](/p5/touch) | `TouchPoint \\| null` | The first active touch point | [Touch](/p5/touch) |\n\n**`TouchPoint` fields:** `id`, `x`, `y`, `pressure`, `radius`, `radiusX`, `radiusY`, `rotationAngle`, `force` (numbers); `isInCanvas`, `isNew`, `isActive`, `isEnding` (booleans); `deltaX`, `deltaY` (numbers); `velocity` `{ x, y }`.\n\n## Pointer (Unified)\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.pointer.x`](/p5/pointer) | `number` | Primary pointer X position | [Pointer](/p5/pointer) |\n| 
[`viji.pointer.y`](/p5/pointer) | `number` | Primary pointer Y position | [Pointer](/p5/pointer) |\n| [`viji.pointer.deltaX`](/p5/pointer) | `number` | Horizontal movement since last frame | [Pointer](/p5/pointer) |\n| [`viji.pointer.deltaY`](/p5/pointer) | `number` | Vertical movement since last frame | [Pointer](/p5/pointer) |\n| [`viji.pointer.isDown`](/p5/pointer) | `boolean` | Whether the pointer is active (click or touch) | [Pointer](/p5/pointer) |\n| [`viji.pointer.wasPressed`](/p5/pointer) | `boolean` | True for one frame when pressed | [Pointer](/p5/pointer) |\n| [`viji.pointer.wasReleased`](/p5/pointer) | `boolean` | True for one frame when released | [Pointer](/p5/pointer) |\n| [`viji.pointer.isInCanvas`](/p5/pointer) | `boolean` | Whether pointer is inside the canvas | [Pointer](/p5/pointer) |\n| [`viji.pointer.type`](/p5/pointer) | `'mouse' \\| 'touch' \\| 'none'` | Current input source | [Pointer](/p5/pointer) |\n\n## Device Sensors\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.device.motion`](/p5/sensors) | `DeviceMotionData \\| null` | Accelerometer and gyroscope data | [Device Sensors](/p5/sensors) |\n| [`viji.device.orientation`](/p5/sensors) | `DeviceOrientationData \\| null` | Device orientation (alpha, beta, gamma) | [Device Sensors](/p5/sensors) |\n\n**`DeviceMotionData`:** `acceleration` `{ x, y, z }`, `accelerationIncludingGravity` `{ x, y, z }`, `rotationRate` `{ alpha, beta, gamma }`, `interval`.\n\n**`DeviceOrientationData`:** `alpha`, `beta`, `gamma` (numbers or null), `absolute` (boolean).\n\n## External Devices\n\n| Member | Type | Description | Details |\n|--------|------|-------------|---------|\n| [`viji.devices`](/p5/external-devices) | `DeviceState[]` | Connected external devices | [Overview](/p5/external-devices) |\n| [`viji.devices[i].id`](/p5/external-devices) | `string` | Unique device identifier | [Overview](/p5/external-devices) |\n| 
[`viji.devices[i].name`](/p5/external-devices) | `string` | User-friendly device name | [Overview](/p5/external-devices) |\n| [`viji.devices[i].motion`](/p5/external-devices/sensors) | `DeviceMotionData \\| null` | Device accelerometer/gyroscope | [Device Sensors](/p5/external-devices/sensors) |\n| [`viji.devices[i].orientation`](/p5/external-devices/sensors) | `DeviceOrientationData \\| null` | Device orientation | [Device Sensors](/p5/external-devices/sensors) |\n| [`viji.devices[i].video`](/p5/external-devices/video) | `VideoAPI \\| null` | Device camera video | [Device Video](/p5/external-devices/video) |\n| [`viji.devices[i].audio`](/p5/external-devices/audio) | `AudioStreamAPI \\| null` | Device audio stream | [Device Audio](/p5/external-devices/audio) |\n\n## Streams\n\n| Member | Type | Description |\n|--------|------|-------------|\n| `viji.videoStreams` | `VideoAPI[]` | Additional video streams provided by the host |\n| `viji.audioStreams` | `AudioStreamAPI[]` | Additional audio streams provided by the host |\n\nEach `videoStreams` element has the same shape as [`viji.video`](/p5/video). Each `audioStreams` element provides lightweight audio analysis (volume, frequency bands, spectral features) but does **not** include beat detection or BPM tracking — see [`viji.audio`](/p5/audio) for full analysis on the main stream.\n\nBoth arrays may be empty if the host does not provide additional streams. 
Audio streams from external devices appear on `device.audio`, not in `viji.audioStreams`.\n\n## Related\n\n- [Quick Start](/p5/quickstart) — getting started with the P5 renderer\n- [Scene Structure](/p5/scene-structure) — setup/render pattern and instance mode\n- [Drawing with P5](/p5/drawing) — Viji-specific drawing patterns\n- [Best Practices](/getting-started/best-practices) — essential patterns for all renderers\n- [Native API Reference](/native/api-reference) — the same API in the Native renderer\n- [Shader API Reference](/shader/api-reference) — built-in uniforms for shaders\n- [P5.js Reference](https://p5js.org/reference/) — official P5.js documentation"
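The one-frame flags documented in the tables above (`wasPressed`, `wasReleased`) can be reproduced in plain JavaScript when porting or unit-testing interaction logic outside the Viji runtime. A minimal sketch; `EdgeDetector` is a hypothetical helper for illustration, not part of the Viji API:

```javascript
// Edge detector mirroring the one-frame wasPressed / wasReleased
// semantics of viji.mouse and viji.pointer. Hypothetical helper
// for illustration; not part of the Viji API.
class EdgeDetector {
  constructor() {
    this.isDown = false;
    this.wasPressed = false;
    this.wasReleased = false;
  }

  // Call once per frame with the current "down" state.
  update(down) {
    this.wasPressed = down && !this.isDown;   // true for exactly one frame
    this.wasReleased = !down && this.isDown;  // true for exactly one frame
    this.isDown = down;
  }
}

const pointer = new EdgeDetector();
pointer.update(true);             // frame 1: press begins
console.log(pointer.wasPressed);  // true
pointer.update(true);             // frame 2: still held
console.log(pointer.wasPressed);  // false
pointer.update(false);            // frame 3: released
console.log(pointer.wasReleased); // true
```

The same pattern generalizes to `viji.keyboard`, where `pressedThisFrame` and `releasedThisFrame` play the per-key role of `wasPressed` and `wasReleased`.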
  }
  ]
  },
@@ -1952,7 +1984,7 @@ export const docsApi = {
  "content": [
  {
  "type": "text",
- "markdown": "# Scene Structure\n\nA P5 scene in Viji follows a specific lifecycle. This page covers the `@renderer p5` directive, the `setup()` and `render()` functions, instance mode, and how P5 scenes differ from standard sketches.\n\n## The `@renderer` Directive\n\n> [!IMPORTANT]\n> P5 and shader scenes must declare their renderer type as the first comment:\n> ```\n> // @renderer p5\n> ```\n> or\n> ```\n> // @renderer shader\n> ```\n> Without this directive, the scene defaults to the native renderer.\n\n## Scene Lifecycle\n\nA P5 scene has three parts: top-level code, an optional `setup()`, and a required `render()`:\n\n```javascript\n// @renderer p5\n\n// 1. Top level — runs once: parameters, constants, state\nconst speed = viji.slider(1, { min: 0.1, max: 5, label: 'Speed' });\nlet angle = 0;\n\n// 2. setup(viji, p5) — optional, runs once after P5 initializes\nfunction setup(viji, p5) {\n p5.colorMode(p5.HSB);\n}\n\n// 3. render(viji, p5) — called every frame\nfunction render(viji, p5) {\n p5.background(0);\n angle += speed.value * viji.deltaTime;\n p5.circle(viji.width / 2, viji.height / 2, 100);\n}\n```\n\n### Top Level\n\nTop-level code runs once when the scene is first loaded. Use it for:\n\n- **Parameter declarations** — `viji.slider()`, `viji.color()`, `viji.toggle()`, etc.\n- **Constants** — precomputed values, lookup tables\n- **Mutable state** — variables that accumulate across frames\n- **Dynamic imports** — top-level `await` is supported (e.g., `const lib = await import('https://esm.sh/...')`)\n\n### `setup(viji, p5)` — Optional\n\nRuns once after P5 has initialized. Use it for one-time P5 configuration:\n\n```javascript\nfunction setup(viji, p5) {\n p5.colorMode(p5.HSB, 360, 100, 100);\n p5.textFont('monospace');\n p5.noStroke();\n}\n```\n\nIf you don't need any P5 configuration, omit `setup()` entirely. 
Unlike standard P5, **there is no `createCanvas()` call** — the canvas is already created and sized by Viji.\n\n### `render(viji, p5)` — Required\n\nCalled every frame. This replaces P5's `draw()` function. Both arguments are always provided:\n\n| Argument | Type | Description |\n|----------|------|-------------|\n| `viji` | `VijiAPI` | Full Viji API — timing, audio, video, parameters, input |\n| `p5` | P5 instance | Full P5.js API in instance mode |"
+ "markdown": "# Scene Structure\n\nA P5 scene in Viji follows a specific lifecycle. This page covers the `@renderer p5` directive (2D or WEBGL), the `setup()` and `render()` functions, instance mode, and how P5 scenes differ from standard sketches.\n\n## The `@renderer` Directive\n\n> [!IMPORTANT]\n> P5 and shader scenes must declare their renderer type as the first comment:\n> ```\n> // @renderer p5\n> ```\n> For a **WEBGL** main canvas (3D / P5 shaders on the primary surface), use:\n> ```\n> // @renderer p5 webgl\n> ```\n> Shader scenes use:\n> ```\n> // @renderer shader\n> ```\n> Without a matching directive, the scene defaults to the native renderer.\n\n## Scene Lifecycle\n\nA P5 scene has three parts: top-level code, an optional `setup()`, and a required `render()`:\n\n```javascript\n// @renderer p5\n\n// 1. Top level — runs once: parameters, constants, state\nconst speed = viji.slider(1, { min: 0.1, max: 5, label: 'Speed' });\nlet angle = 0;\n\n// 2. setup(viji, p5) — optional, runs once after P5 initializes\nfunction setup(viji, p5) {\n p5.colorMode(p5.HSB);\n}\n\n// 3. render(viji, p5) — called every frame\nfunction render(viji, p5) {\n p5.background(0);\n angle += speed.value * viji.deltaTime;\n p5.circle(viji.width / 2, viji.height / 2, 100);\n}\n```\n\n### Top Level\n\nTop-level code runs once when the scene is first loaded. Use it for:\n\n- **Parameter declarations** — `viji.slider()`, `viji.color()`, `viji.toggle()`, etc.\n- **Constants** — precomputed values, lookup tables\n- **Mutable state** — variables that accumulate across frames\n- **Dynamic imports** — top-level `await` is supported (e.g., `const lib = await import('https://esm.sh/...')`)\n\n### `setup(viji, p5)` — Optional\n\nRuns once after P5 has initialized. 
Use it for one-time P5 configuration:\n\n```javascript\nfunction setup(viji, p5) {\n p5.colorMode(p5.HSB, 360, 100, 100);\n p5.textFont('monospace');\n p5.noStroke();\n}\n```\n\nIf you don't need any P5 configuration, omit `setup()` entirely. Unlike standard P5, **there is no `createCanvas()` call** — the canvas is already created and sized by Viji.\n\n### `render(viji, p5)` — Required\n\nCalled every frame. This replaces P5's `draw()` function. Both arguments are always provided:\n\n| Argument | Type | Description |\n|----------|------|-------------|\n| `viji` | `VijiAPI` | Full Viji API — timing, audio, video, parameters, input |\n| `p5` | P5 instance | Full P5.js API in instance mode |"
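The accumulator line in `render()` above (`angle += speed.value * viji.deltaTime`) is what makes motion frame-rate independent. The effect can be checked outside the runtime with simulated frame times; `simulate` below is an illustrative plain-JavaScript sketch, not a Viji API:

```javascript
// Simulate the deltaTime accumulator pattern at two frame rates.
// After the same elapsed time the accumulated angle matches,
// which is why render() code should scale increments by deltaTime.
function simulate(fps, seconds, speed) {
  const deltaTime = 1 / fps; // seconds per frame, like viji.deltaTime
  let angle = 0;
  for (let frame = 0; frame < fps * seconds; frame++) {
    angle += speed * deltaTime; // same shape as the render() line
  }
  return angle;
}

const at60 = simulate(60, 2, 1.5); // 2 seconds at 60 fps
const at30 = simulate(30, 2, 1.5); // 2 seconds at 30 fps
console.log(at60.toFixed(4)); // 3.0000
console.log(at30.toFixed(4)); // 3.0000
```

Had the loop added a fixed `speed * 0.016` per frame instead, halving the frame rate would halve the apparent speed; scaling by `deltaTime` avoids that bug.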
  },
  {
  "type": "live-example",
@@ -1973,7 +2005,7 @@ export const docsApi = {
  "content": [
  {
  "type": "text",
- "markdown": "# Canvas & Resolution\n\nIn the P5 renderer, the canvas and its rendering context are managed for you. You draw with P5 functions — no need to call `viji.useContext()`. This page covers how resolution works, what `viji.width` and `viji.height` mean, and how to build layouts that adapt to any canvas size.\n\n## Canvas Management\n\nViji creates the canvas and passes it to P5 automatically. Key differences from standard P5.js:\n\n- **No `createCanvas()`.** The canvas already exists. Calling `p5.createCanvas()` is unnecessary and should be avoided.\n- **No `resizeCanvas()`.** When the host resizes the canvas, Viji handles the resize and updates P5 internally. Your `render()` function is always called with the correct dimensions.\n- **P5 owns the rendering context.** You don't call `viji.useContext()` — P5 creates its own 2D context on the provided canvas.\n\n## Resolution Properties\n\n| Property | Type | Description |\n|----------|------|-------------|\n| `viji.width` | `number` | Current canvas width in pixels |\n| `viji.height` | `number` | Current canvas height in pixels |\n| `p5.width` | `number` | Same value — P5's internal width |\n| `p5.height` | `number` | Same value — P5's internal height |\n| `viji.canvas` | `OffscreenCanvas` | The underlying canvas (rarely needed in P5 scenes) |\n\n`viji.width` and `p5.width` are always in sync — they reflect the same canvas. Use whichever feels natural, but `viji.width` is the canonical source across all renderers.\n\n## Resolution-Agnostic Layouts\n\n> [!NOTE]\n> Always use `viji.width` and `viji.height` for positioning and sizing, and `viji.deltaTime` for frame-rate-independent animation. Never hardcode pixel values or assume a specific frame rate.\n\nThe canvas can be any size — from a small preview to a fullscreen 4K display. 
Position and scale everything relative to `viji.width` and `viji.height`:\n\n```javascript\n// @renderer p5\n\nfunction render(viji, p5) {\n const cx = viji.width / 2;\n const cy = viji.height / 2;\n const r = Math.min(viji.width, viji.height) * 0.3;\n p5.circle(cx, cy, r * 2);\n}\n```"
+ "markdown": "# Canvas & Resolution\n\nIn the P5 renderer, the canvas and its rendering context are managed for you. You draw with P5 functions — no need to call `viji.useContext()`. This page covers how resolution works, what `viji.width` and `viji.height` mean, and how to build layouts that adapt to any canvas size.\n\n## Canvas Management\n\nViji creates the canvas and passes it to P5 automatically. Key differences from standard P5.js:\n\n- **No `createCanvas()`.** The canvas already exists. Calling `p5.createCanvas()` is unnecessary and should be avoided.\n- **No `resizeCanvas()`.** When the host resizes the canvas, Viji handles the resize and updates P5 internally. Your `render()` function is always called with the correct dimensions.\n- **P5 owns the rendering context.** You don't call `viji.useContext()` — P5 creates its own rendering context (2D or WEBGL, depending on the directive) on the provided canvas.\n\n## Resolution Properties\n\n| Property | Type | Description |\n|----------|------|-------------|\n| `viji.width` | `number` | Current canvas width in pixels |\n| `viji.height` | `number` | Current canvas height in pixels |\n| `p5.width` | `number` | Same value — P5's internal width |\n| `p5.height` | `number` | Same value — P5's internal height |\n| `viji.canvas` | `OffscreenCanvas` | The underlying canvas (rarely needed in P5 scenes) |\n\n`viji.width` and `p5.width` are always in sync — they reflect the same canvas. Use whichever feels natural, but `viji.width` is the canonical source across all renderers.\n\n## Resolution-Agnostic Layouts\n\n> [!NOTE]\n> Always use `viji.width` and `viji.height` for positioning and sizing, and `viji.deltaTime` for frame-rate-independent animation. Never hardcode pixel values or assume a specific frame rate.\n\nThe canvas can be any size — from a small preview to a fullscreen 4K display. 
Position and scale everything relative to `viji.width` and `viji.height`:\n\n```javascript\n// @renderer p5\n\nfunction render(viji, p5) {\n const cx = viji.width / 2;\n const cy = viji.height / 2;\n const r = Math.min(viji.width, viji.height) * 0.3;\n p5.circle(cx, cy, r * 2);\n}\n```"
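The positioning arithmetic above can be factored into a pure function and sanity-checked at several canvas sizes. An illustrative helper using the same math; `centeredCircle` is not a Viji API:

```javascript
// Centered circle whose radius is 30% of the shorter canvas side,
// the same math as the render() example above.
function centeredCircle(width, height) {
  return {
    cx: width / 2,
    cy: height / 2,
    r: Math.min(width, height) * 0.3,
  };
}

// The layout adapts from a small preview to a 4K canvas.
console.log(centeredCircle(400, 300).r);   // 30% of the 300px short side
console.log(centeredCircle(3840, 2160).r); // 30% of the 2160px short side
```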
  },
  {
  "type": "live-example",
@@ -1994,7 +2026,7 @@ export const docsApi = {
  "content": [
  {
  "type": "text",
- "markdown": "# Drawing with P5\n\nViji gives you a full P5.js instance in every P5 scene. All standard P5 drawing functions — shapes, colors, transforms, typography, pixel manipulation, math utilities — work as documented in the official reference.\n\n> **P5.js Reference**: Viji loads **P5.js v1.9.4**. For the full drawing API, see the [P5.js Reference](https://p5js.org/reference/).\n\nThis page covers only what is **different or specific to Viji** — how to draw images and video, off-screen buffers, font limitations, and what is not supported.\n\n## Instance Mode\n\n> [!WARNING]\n> Viji uses P5 in **instance mode**. All P5 functions require the `p5.` prefix:\n> ```javascript\n> // Correct\n> p5.background(0);\n> p5.circle(p5.width / 2, p5.height / 2, 100);\n>\n> // Wrong — will throw ReferenceError\n> background(0);\n> circle(width / 2, height / 2, 100);\n> ```\n\nConstants are also namespaced: `p5.PI`, `p5.TWO_PI`, `p5.HSB`, `p5.CENTER`, `p5.BLEND`, etc.\n\n## Drawing Images\n\n### Image Parameters\n\nUse `.value.p5` (not `.value`) when passing image parameters to `p5.image()`:\n\n```javascript\n// @renderer p5\n\nconst tex = viji.image(null, { label: 'Texture' });\n\nfunction render(viji, p5) {\n p5.background(0);\n if (tex.value) {\n p5.image(tex.value.p5, 0, 0, p5.width, p5.height);\n }\n}\n```\n\n> [!WARNING]\n> Passing `tex.value` directly to `p5.image()` will not work. The raw `ImageBitmap` is not P5-compatible. Always use `.value.p5`.\n\nThe `.p5` wrapper is cached — accessing it multiple times per frame has no overhead. See [Image Parameter](../parameters/image/) for full details.\n\n### Video Frames\n\nVideo frames are automatically wrapped for P5 compatibility. 
Pass `viji.video.currentFrame` directly to `p5.image()`:\n\n```javascript\n// @renderer p5\n\nfunction render(viji, p5) {\n p5.background(0);\n if (viji.video.isConnected && viji.video.currentFrame) {\n p5.image(viji.video.currentFrame, 0, 0, p5.width, p5.height);\n }\n}\n```\n\nDevice video frames work the same way:\n\n```javascript\nfor (const device of viji.devices) {\n if (device.video?.isConnected && device.video.currentFrame) {\n p5.image(device.video.currentFrame, x, y, w, h);\n }\n}\n```\n\nSee [Video Basics](../video/basics/) for aspect-ratio-correct drawing and `getFrameData()`.\n\n### Tint\n\n`p5.tint()` works as expected for coloring or fading images:\n\n```javascript\n// @renderer p5\n\nconst tex = viji.image(null, { label: 'Image' });\nconst fade = viji.slider(255, { min: 0, max: 255, label: 'Fade' });\n\nfunction render(viji, p5) {\n p5.background(0);\n if (tex.value) {\n p5.tint(255, fade.value);\n p5.image(tex.value.p5, 0, 0, p5.width, p5.height);\n p5.noTint();\n }\n}\n```\n\n## Off-Screen Buffers\n\n`p5.createGraphics(w, h)` works in Viji. Each call creates a real `OffscreenCanvas` buffer you can draw to independently and then composite onto the main canvas:\n\n```javascript\n// @renderer p5\n\nlet buffer;\n\nfunction setup(viji, p5) {\n buffer = p5.createGraphics(p5.width, p5.height);\n}\n\nfunction render(viji, p5) {\n buffer.background(0, 10);\n buffer.noStroke();\n buffer.fill(255);\n buffer.ellipse(\n buffer.width / 2 + p5.sin(viji.time) * 100,\n buffer.height / 2,\n 20, 20\n );\n\n p5.image(buffer, 0, 0);\n}\n```\n\n> [!NOTE]\n> Off-screen buffers created with `createGraphics()` are always 2D. There is no WEBGL support for off-screen buffers.\n\n## Fonts\n\n`p5.loadFont()` is not available in the worker environment. 
Use system fonts instead:\n\n```javascript\n// @renderer p5\n\nfunction setup(viji, p5) {\n p5.textFont('monospace');\n}\n\nfunction render(viji, p5) {\n p5.background(0);\n p5.fill(255);\n p5.textSize(24);\n p5.textAlign(p5.CENTER, p5.CENTER);\n p5.text('Hello, Viji', p5.width / 2, p5.height / 2);\n}\n```\n\nAvailable system font families: `'monospace'`, `'sans-serif'`, `'serif'`. You can also try specific system fonts like `'Courier New'`, `'Arial'`, `'Georgia'`, but availability depends on the device.\n\n## Blend Modes\n\n`p5.blendMode()` works as expected. All standard P5 blend modes are available:\n\n```javascript\np5.blendMode(p5.ADD);\np5.blendMode(p5.MULTIPLY);\np5.blendMode(p5.SCREEN);\np5.blendMode(p5.BLEND); // default\n```\n\n## Known Limitations\n\n| Feature | Status | Alternative |\n|---------|--------|-------------|\n| WEBGL / 3D mode | Not supported | Use the [Shader renderer](/shader/quickstart) or [Three.js via Native](/native/external-libraries) |\n| `p5.loadFont()` | Not available | Use system fonts: `p5.textFont('monospace')` |\n| `p5.loadImage()` | Not available | Use [`viji.image()`](../parameters/image/) parameters |\n| `p5.createCapture()` | Not available | Use [`viji.video`](../video/) |\n\n> [!NOTE]\n> For the full list of unavailable P5 features (event callbacks, `save()`, `frameRate()`, etc.), see [Converting P5 Sketches](../converting-sketches/).\n\n## Basic Example"
+ "markdown": "# Drawing with P5\n\nViji gives you a full P5.js instance in every P5 scene. All standard P5 drawing functions — shapes, colors, transforms, typography, pixel manipulation, math utilities — work as documented in the official reference.\n\n> **P5.js Reference**: Viji loads **P5.js v1.9.4**. For the full drawing API, see the [P5.js Reference](https://p5js.org/reference/).\n\nThis page covers only what is **different or specific to Viji** — how to draw images and video, off-screen buffers, font limitations, and what is not supported.\n\n## Instance Mode\n\n> [!WARNING]\n> Viji uses P5 in **instance mode**. All P5 functions require the `p5.` prefix:\n> ```javascript\n> // Correct\n> p5.background(0);\n> p5.circle(p5.width / 2, p5.height / 2, 100);\n>\n> // Wrong — will throw ReferenceError\n> background(0);\n> circle(width / 2, height / 2, 100);\n> ```\n\nConstants are also namespaced: `p5.PI`, `p5.TWO_PI`, `p5.HSB`, `p5.CENTER`, `p5.BLEND`, etc.\n\n## Drawing Images\n\n### Image Parameters\n\nUse `.value.p5` (not `.value`) when passing image parameters to `p5.image()`:\n\n```javascript\n// @renderer p5\n\nconst tex = viji.image(null, { label: 'Texture' });\n\nfunction render(viji, p5) {\n p5.background(0);\n if (tex.value) {\n p5.image(tex.value.p5, 0, 0, p5.width, p5.height);\n }\n}\n```\n\n> [!WARNING]\n> Passing `tex.value` directly to `p5.image()` will not work. The raw `ImageBitmap` is not P5-compatible. Always use `.value.p5`.\n\nThe `.p5` wrapper is cached — accessing it multiple times per frame has no overhead. See [Image Parameter](../parameters/image/) for full details.\n\n### Video Frames\n\nVideo frames are automatically wrapped for P5 compatibility. 
Pass `viji.video.currentFrame` directly to `p5.image()`:\n\n```javascript\n// @renderer p5\n\nfunction render(viji, p5) {\n p5.background(0);\n if (viji.video.isConnected && viji.video.currentFrame) {\n p5.image(viji.video.currentFrame, 0, 0, p5.width, p5.height);\n }\n}\n```\n\nDevice video frames work the same way:\n\n```javascript\nfor (const device of viji.devices) {\n if (device.video?.isConnected && device.video.currentFrame) {\n p5.image(device.video.currentFrame, x, y, w, h);\n }\n}\n```\n\nSee [Video Basics](../video/basics/) for aspect-ratio-correct drawing and `getFrameData()`.\n\n### Tint\n\n`p5.tint()` works as expected for coloring or fading images:\n\n```javascript\n// @renderer p5\n\nconst tex = viji.image(null, { label: 'Image' });\nconst fade = viji.slider(255, { min: 0, max: 255, label: 'Fade' });\n\nfunction render(viji, p5) {\n p5.background(0);\n if (tex.value) {\n p5.tint(255, fade.value);\n p5.image(tex.value.p5, 0, 0, p5.width, p5.height);\n p5.noTint();\n }\n}\n```\n\n## Off-Screen Buffers\n\n`p5.createGraphics(w, h)` works in Viji. Each call creates a real `OffscreenCanvas` buffer you can draw to independently and then composite onto the main canvas:\n\n```javascript\n// @renderer p5\n\nlet buffer;\n\nfunction setup(viji, p5) {\n buffer = p5.createGraphics(p5.width, p5.height);\n}\n\nfunction render(viji, p5) {\n buffer.background(0, 10);\n buffer.noStroke();\n buffer.fill(255);\n buffer.ellipse(\n buffer.width / 2 + p5.sin(viji.time) * 100,\n buffer.height / 2,\n 20, 20\n );\n\n p5.image(buffer, 0, 0);\n}\n```\n\n> [!NOTE]\n> Off-screen buffers from `createGraphics(w, h)` are **2D only**. `createGraphics(w, h, p5.WEBGL)` is not supported.\n\n## Fonts\n\n`p5.loadFont()` is not available in the worker environment. 
Use system fonts instead:\n\n```javascript\n// @renderer p5\n\nfunction setup(viji, p5) {\n p5.textFont('monospace');\n}\n\nfunction render(viji, p5) {\n p5.background(0);\n p5.fill(255);\n p5.textSize(24);\n p5.textAlign(p5.CENTER, p5.CENTER);\n p5.text('Hello, Viji', p5.width / 2, p5.height / 2);\n}\n```\n\nAvailable system font families: `'monospace'`, `'sans-serif'`, `'serif'`. You can also try specific system fonts like `'Courier New'`, `'Arial'`, `'Georgia'`, but availability depends on the device.\n\n## Blend Modes\n\n`p5.blendMode()` works as expected. All standard P5 blend modes are available:\n\n```javascript\np5.blendMode(p5.ADD);\np5.blendMode(p5.MULTIPLY);\np5.blendMode(p5.SCREEN);\np5.blendMode(p5.BLEND); // default\n```\n\n## WEBGL Main Canvas\n\nDeclare WEBGL mode on the **first line** of the scene, in the same position as the standard P5 renderer directive:\n\n```javascript\n// @renderer p5 webgl\n```\n\nViji creates the main canvas with P5's WEBGL renderer. You still **must not** call `createCanvas()` — the directive selects 2D vs WEBGL, not your `setup()` code.\n\nIn WEBGL mode, `p5.drawingContext` is a **WebGL** context. Do not use Canvas 2D–only APIs on it. 
Prefer P5’s 3D drawing APIs, `p5.image()` / textures for images and video (including `viji.video.currentFrame` and image parameters’ `.p5` wrapper), and follow the [P5.js WEBGL reference](https://p5js.org/reference/#/p5/WEBGL).\n\n---\n\n## Known Limitations\n\n| Feature | Status | Alternative |\n|---------|--------|-------------|\n| WEBGL main canvas | Supported via `// @renderer p5 webgl` | Do not call `createCanvas(..., p5.WEBGL)` |\n| WEBGL `createGraphics` | Not supported | Use 2D `createGraphics(w, h)` only, or [Shader](/shader/quickstart) / [Three.js (Native)](/native/external-libraries) for advanced GPU pipelines |\n| `p5.loadFont()` | Not available | Use system fonts: `p5.textFont('monospace')` |\n| `p5.loadImage()` | Not available | Use [`viji.image()`](../parameters/image/) parameters |\n| `p5.createCapture()` | Not available | Use [`viji.video`](../video/) |\n\n> [!NOTE]\n> For the full list of unavailable P5 features (event callbacks, `save()`, `frameRate()`, etc.), see [Converting P5 Sketches](../converting-sketches/).\n\n## Basic Example"
  },
  {
  "type": "live-example",
@@ -2015,7 +2047,7 @@ export const docsApi = {
  "content": [
  {
  "type": "text",
- "markdown": "# Converting P5 Sketches\r\n\r\nThis guide shows how to take any standard P5.js sketch and convert it into a Viji scene. The changes are mechanical — once you learn the pattern, converting takes a few minutes.\r\n\r\n> [!TIP]\r\n> Want an AI to do it for you? See [Convert: P5 Sketches](/ai-prompts/convert-p5) for a ready-to-paste prompt that applies all the rules below automatically.\r\n\r\n## Quick Reference\r\n\r\n| Standard P5.js | Viji-P5 |\r\n|---|---|\r\n| `function setup() { ... }` | `function setup(viji, p5) { ... }` |\r\n| `function draw() { ... }` | `function render(viji, p5) { ... }` |\r\n| `createCanvas(800, 600)` | Remove — canvas is provided |\r\n| `background(0)` | `p5.background(0)` |\r\n| `ellipse(x, y, d)` | `p5.ellipse(x, y, d)` |\r\n| `mouseX`, `mouseY` | [`viji.pointer.x`](/p5/pointer), [`viji.pointer.y`](/p5/pointer) (or [`viji.mouse.x`](/p5/mouse), [`viji.mouse.y`](/p5/mouse)) |\r\n| `keyIsPressed` | [`viji.keyboard.isPressed('a')`](/p5/keyboard) |\r\n| `width`, `height` | `viji.width`, `viji.height` |\r\n| `frameRate(30)` | Remove — host controls frame rate |\r\n| `preload()` | Remove — use `viji.image()` or `fetch()` in `setup()` |\r\n| `save()` / `saveCanvas()` | Remove — host-side `captureFrame()` |\r\n| `loadImage('url')` | `viji.image(null, { label: 'Image' })` |\r\n\r\n## Step by Step\r\n\r\n### 1. Add the renderer directive\r\n\r\nAdd `// @renderer p5` as the very first line:\r\n\r\n```javascript\r\n// @renderer p5\r\n```\r\n\r\n> [!IMPORTANT]\r\n> Without `// @renderer p5`, the scene defaults to the native renderer and the `p5` parameter will be `undefined`.\r\n\r\n### 2. 
Rename `draw()` to `render(viji, p5)`\r\n\r\nStandard P5:\r\n```javascript\r\nfunction draw() {\r\n background(0);\r\n ellipse(width / 2, height / 2, 100);\r\n}\r\n```\r\n\r\nViji-P5:\r\n```javascript\r\nfunction render(viji, p5) {\r\n p5.background(0);\r\n p5.ellipse(viji.width / 2, viji.height / 2, 100);\r\n}\r\n```\r\n\r\nBoth `viji` and `p5` are required parameters. `viji` gives access to the Viji API; `p5` is the P5.js instance.\r\n\r\n### 3. Add the `p5.` prefix to all P5 functions\r\n\r\n> [!WARNING]\r\n> Viji uses P5 in **instance mode**. Every P5 function and constant needs the `p5.` prefix. This is the most common source of errors during conversion.\r\n\r\n```javascript\r\n// Standard P5.js (global mode)\r\ncolorMode(HSB);\r\nfill(255, 80, 100);\r\nrect(10, 10, 50, 50);\r\nlet v = createVector(1, 0);\r\n\r\n// Viji-P5 (instance mode)\r\np5.colorMode(p5.HSB);\r\np5.fill(255, 80, 100);\r\np5.rect(10, 10, 50, 50);\r\nlet v = p5.createVector(1, 0);\r\n```\r\n\r\nThis applies to constants too: `PI` → `p5.PI`, `TWO_PI` → `p5.TWO_PI`, `HALF_PI` → `p5.HALF_PI`, `HSB` → `p5.HSB`, `WEBGL` → `p5.WEBGL`.\r\n\r\n### 4. Remove `createCanvas()`\r\n\r\nViji creates and manages the canvas for you. Remove any `createCanvas()` call:\r\n\r\n```javascript\r\n// Standard P5.js\r\nfunction setup() {\r\n createCanvas(800, 600);\r\n}\r\n\r\n// Viji-P5 — no createCanvas() needed\r\nfunction setup(viji, p5) {\r\n p5.colorMode(p5.HSB);\r\n}\r\n```\r\n\r\nFor resolution-agnostic sizing, use `viji.width` and `viji.height` instead of hardcoded values.\r\n\r\n### 5. Replace P5 input globals with Viji APIs\r\n\r\nP5's built-in input variables (`mouseX`, `mouseY`, `keyIsPressed`, etc.) are not available in the worker environment. Use the Viji API instead. 
For most position/click interactions, [`viji.pointer`](/p5/pointer) works across both mouse and touch:\r\n\r\n```javascript\r\n// Standard P5.js\r\nfunction draw() {\r\n if (mouseIsPressed) {\r\n ellipse(mouseX, mouseY, 50);\r\n }\r\n if (keyIsPressed && key === 'r') {\r\n background(255, 0, 0);\r\n }\r\n}\r\n\r\n// Viji-P5\r\nfunction render(viji, p5) {\r\n if (viji.pointer.isDown) {\r\n p5.ellipse(viji.pointer.x, viji.pointer.y, 50);\r\n }\r\n if (viji.keyboard.isPressed('r')) {\r\n p5.background(255, 0, 0);\r\n }\r\n}\r\n```\r\n\r\n### 6. Remove event callbacks\r\n\r\nP5 event callbacks (`mousePressed()`, `mouseDragged()`, `keyPressed()`, etc.) do not work in the worker environment. Check state in `render()` instead:\r\n\r\n```javascript\r\n// Standard P5.js\r\nfunction mousePressed() {\r\n particles.push(new Particle(mouseX, mouseY));\r\n}\r\n\r\n// Viji-P5 — track state manually\r\nlet wasPressed = false;\r\n\r\nfunction render(viji, p5) {\r\n if (viji.mouse.leftButton && !wasPressed) {\r\n particles.push(new Particle(viji.mouse.x, viji.mouse.y));\r\n }\r\n wasPressed = viji.mouse.leftButton;\r\n}\r\n```\r\n\r\n### 7. Replace `preload()` and `loadImage()`\r\n\r\nThere is no `preload()` phase in Viji. For images, use Viji's image parameter or `fetch()` in `setup()`:\r\n\r\n```javascript\r\n// Standard P5.js\r\nlet img;\r\nfunction preload() {\r\n img = loadImage('photo.jpg');\r\n}\r\nfunction draw() {\r\n image(img, 0, 0);\r\n}\r\n\r\n// Viji-P5 — use image parameter\r\nconst photo = viji.image(null, { label: 'Photo' });\r\n\r\nfunction render(viji, p5) {\r\n if (photo.value) {\r\n p5.image(photo.p5, 0, 0, viji.width, viji.height);\r\n }\r\n}\r\n```\r\n\r\n> [!NOTE]\r\n> Use `photo.p5` (not `photo.value`) when passing images to P5 drawing functions like `p5.image()`. 
The `.p5` property provides a P5-compatible wrapper around the raw image data.\r\n\r\nFor JSON or text data, use `fetch()` in an async `setup()`:\r\n\r\n```javascript\r\nlet data = null;\r\n\r\nasync function setup(viji, p5) {\r\n const response = await fetch('https://cdn.example.com/data.json');\r\n data = await response.json();\r\n}\r\n```\r\n\r\n### 8. Replace `save()` and `frameRate()`\r\n\r\nThese host-level concerns are handled outside the scene:\r\n\r\n- **Saving frames**: The host application uses `core.captureFrame()`.\r\n- **Frame rate**: The host controls it via `core.setFrameRate()`.\r\n\r\nSimply remove these calls from your scene code.\r\n\r\n## Complete Conversion Example\r\n\r\nHere is the same scene implemented both ways, followed by the live Viji version:\r\n\r\n**Standard P5.js:**\r\n\r\n```javascript\r\nfunction setup() {\r\n createCanvas(400, 400);\r\n colorMode(HSB, 360, 100, 100, 100);\r\n}\r\n\r\nfunction draw() {\r\n background(0, 0, 10);\r\n let count = 8;\r\n let radius = 120;\r\n for (let i = 0; i < count; i++) {\r\n let a = frameCount * 0.02 + (i / count) * TWO_PI;\r\n let x = width / 2 + cos(a) * radius;\r\n let y = height / 2 + sin(a) * radius;\r\n noStroke();\r\n fill(255, 150, 0);\r\n circle(x, y, 16);\r\n }\r\n}\r\n```\r\n\r\n**Converted Viji-P5:**"
+ "markdown": "# Converting P5 Sketches\r\n\r\nThis guide shows how to take any standard P5.js sketch and convert it into a Viji scene. The changes are mechanical — once you learn the pattern, converting takes a few minutes.\r\n\r\n> [!TIP]\r\n> Want an AI to do it for you? See [Convert: P5 Sketches](/ai-prompts/convert-p5) for a ready-to-paste prompt that applies all the rules below automatically.\r\n\r\n## Quick Reference\r\n\r\n| Standard P5.js | Viji-P5 |\r\n|---|---|\r\n| `function setup() { ... }` | `function setup(viji, p5) { ... }` |\r\n| `function draw() { ... }` | `function render(viji, p5) { ... }` |\r\n| `createCanvas(800, 600)` | Remove — canvas is provided |\r\n| `background(0)` | `p5.background(0)` |\r\n| `ellipse(x, y, d)` | `p5.ellipse(x, y, d)` |\r\n| `mouseX`, `mouseY` | [`viji.pointer.x`](/p5/pointer), [`viji.pointer.y`](/p5/pointer) (or [`viji.mouse.x`](/p5/mouse), [`viji.mouse.y`](/p5/mouse)) |\r\n| `keyIsPressed` | [`viji.keyboard.isPressed('a')`](/p5/keyboard) |\r\n| `width`, `height` | `viji.width`, `viji.height` |\r\n| `frameRate(30)` | Remove — host controls frame rate |\r\n| `preload()` | Remove — use `viji.image()` or `fetch()` in `setup()` |\r\n| `save()` / `saveCanvas()` | Remove — host-side `captureFrame()` |\r\n| `loadImage('url')` | `viji.image(null, { label: 'Image' })` |\r\n\r\n## Step by Step\r\n\r\n### 1. Add the renderer directive\r\n\r\nAdd `// @renderer p5` as the very first line for a **2D** canvas:\r\n\r\n```javascript\r\n// @renderer p5\r\n```\r\n\r\nIf the sketch used `createCanvas(w, h, WEBGL)` or relies on the WEBGL renderer for the main canvas, use:\r\n\r\n```javascript\r\n// @renderer p5 webgl\r\n```\r\n\r\nDo not keep `createCanvas()` in the converted scene — Viji creates the canvas in the mode chosen by the directive.\r\n\r\n> [!IMPORTANT]\r\n> Without `// @renderer p5` (or `// @renderer p5 webgl`), the scene defaults to the native renderer and the `p5` parameter will be `undefined`.\r\n\r\n### 2. 
Rename `draw()` to `render(viji, p5)`\r\n\r\nStandard P5:\r\n```javascript\r\nfunction draw() {\r\n background(0);\r\n ellipse(width / 2, height / 2, 100);\r\n}\r\n```\r\n\r\nViji-P5:\r\n```javascript\r\nfunction render(viji, p5) {\r\n p5.background(0);\r\n p5.ellipse(viji.width / 2, viji.height / 2, 100);\r\n}\r\n```\r\n\r\nBoth `viji` and `p5` are required parameters. `viji` gives access to the Viji API; `p5` is the P5.js instance.\r\n\r\n### 3. Add the `p5.` prefix to all P5 functions\r\n\r\n> [!WARNING]\r\n> Viji uses P5 in **instance mode**. Every P5 function and constant needs the `p5.` prefix. This is the most common source of errors during conversion.\r\n\r\n```javascript\r\n// Standard P5.js (global mode)\r\ncolorMode(HSB);\r\nfill(255, 80, 100);\r\nrect(10, 10, 50, 50);\r\nlet v = createVector(1, 0);\r\n\r\n// Viji-P5 (instance mode)\r\np5.colorMode(p5.HSB);\r\np5.fill(255, 80, 100);\r\np5.rect(10, 10, 50, 50);\r\nlet v = p5.createVector(1, 0);\r\n```\r\n\r\nThis applies to constants too: `PI` → `p5.PI`, `TWO_PI` → `p5.TWO_PI`, `HALF_PI` → `p5.HALF_PI`, `HSB` → `p5.HSB`, `WEBGL` → `p5.WEBGL`.\r\n\r\n### 4. Remove `createCanvas()`\r\n\r\nViji creates and manages the canvas for you. Remove any `createCanvas()` call:\r\n\r\n```javascript\r\n// Standard P5.js\r\nfunction setup() {\r\n createCanvas(800, 600);\r\n}\r\n\r\n// Viji-P5 — no createCanvas() needed\r\nfunction setup(viji, p5) {\r\n p5.colorMode(p5.HSB);\r\n}\r\n```\r\n\r\nFor resolution-agnostic sizing, use `viji.width` and `viji.height` instead of hardcoded values.\r\n\r\n### 5. Replace P5 input globals with Viji APIs\r\n\r\nP5's built-in input variables (`mouseX`, `mouseY`, `keyIsPressed`, etc.) are not available in the worker environment. Use the Viji API instead. 
For most position/click interactions, [`viji.pointer`](/p5/pointer) works across both mouse and touch:\r\n\r\n```javascript\r\n// Standard P5.js\r\nfunction draw() {\r\n if (mouseIsPressed) {\r\n ellipse(mouseX, mouseY, 50);\r\n }\r\n if (keyIsPressed && key === 'r') {\r\n background(255, 0, 0);\r\n }\r\n}\r\n\r\n// Viji-P5\r\nfunction render(viji, p5) {\r\n if (viji.pointer.isDown) {\r\n p5.ellipse(viji.pointer.x, viji.pointer.y, 50);\r\n }\r\n if (viji.keyboard.isPressed('r')) {\r\n p5.background(255, 0, 0);\r\n }\r\n}\r\n```\r\n\r\n### 6. Remove event callbacks\r\n\r\nP5 event callbacks (`mousePressed()`, `mouseDragged()`, `keyPressed()`, etc.) do not work in the worker environment. Check state in `render()` instead:\r\n\r\n```javascript\r\n// Standard P5.js\r\nfunction mousePressed() {\r\n particles.push(new Particle(mouseX, mouseY));\r\n}\r\n\r\n// Viji-P5 — track state manually\r\nlet wasPressed = false;\r\n\r\nfunction render(viji, p5) {\r\n if (viji.mouse.leftButton && !wasPressed) {\r\n particles.push(new Particle(viji.mouse.x, viji.mouse.y));\r\n }\r\n wasPressed = viji.mouse.leftButton;\r\n}\r\n```\r\n\r\n### 7. Replace `preload()` and `loadImage()`\r\n\r\nThere is no `preload()` phase in Viji. For images, use Viji's image parameter or `fetch()` in `setup()`:\r\n\r\n```javascript\r\n// Standard P5.js\r\nlet img;\r\nfunction preload() {\r\n img = loadImage('photo.jpg');\r\n}\r\nfunction draw() {\r\n image(img, 0, 0);\r\n}\r\n\r\n// Viji-P5 — use image parameter\r\nconst photo = viji.image(null, { label: 'Photo' });\r\n\r\nfunction render(viji, p5) {\r\n if (photo.value) {\r\n p5.image(photo.p5, 0, 0, viji.width, viji.height);\r\n }\r\n}\r\n```\r\n\r\n> [!NOTE]\r\n> Use `photo.p5` (not `photo.value`) when passing images to P5 drawing functions like `p5.image()`. 
The `.p5` property provides a P5-compatible wrapper around the raw image data.\r\n\r\nFor JSON or text data, use `fetch()` in an async `setup()`:\r\n\r\n```javascript\r\nlet data = null;\r\n\r\nasync function setup(viji, p5) {\r\n const response = await fetch('https://cdn.example.com/data.json');\r\n data = await response.json();\r\n}\r\n```\r\n\r\n### 8. Replace `save()` and `frameRate()`\r\n\r\nThese host-level concerns are handled outside the scene:\r\n\r\n- **Saving frames**: The host application uses `core.captureFrame()`.\r\n- **Frame rate**: The host controls it via `core.setFrameRate()`.\r\n\r\nSimply remove these calls from your scene code.\r\n\r\n## Complete Conversion Example\r\n\r\nHere is the same scene implemented both ways, followed by the live Viji version:\r\n\r\n**Standard P5.js:**\r\n\r\n```javascript\r\nfunction setup() {\r\n createCanvas(400, 400);\r\n colorMode(HSB, 360, 100, 100, 100);\r\n}\r\n\r\nfunction draw() {\r\n background(0, 0, 10);\r\n let count = 8;\r\n let radius = 120;\r\n for (let i = 0; i < count; i++) {\r\n let a = frameCount * 0.02 + (i / count) * TWO_PI;\r\n let x = width / 2 + cos(a) * radius;\r\n let y = height / 2 + sin(a) * radius;\r\n noStroke();\r\n fill(255, 150, 0);\r\n circle(x, y, 16);\r\n }\r\n}\r\n```\r\n\r\n**Converted Viji-P5:**"
2019
2051
  },
2020
2052
  {
2021
2053
  "type": "live-example",
@@ -2292,7 +2324,7 @@ export const docsApi = {
2292
2324
  },
2293
2325
  {
2294
2326
  "type": "text",
2295
- "markdown": "## Related\n\n- [Connection & Lifecycle](connection/)\n- [Volume](volume/)\n- [Frequency Bands](bands/)\n- [Beat Detection](beat/)\n- [Spectral Analysis](spectral/)\n- [Frequency Data](frequency-data/)\n- [Waveform](waveform/)\n- [Native Audio](/native/audio)\n- [Shader Audio Uniforms](/shader/audio)"
2327
+ "markdown": "## Additional Audio Streams\n\nBeyond the main `viji.audio` stream, your scene may receive additional audio sources through `viji.audioStreams[]` and `device.audio`:\n\n- **`viji.audioStreams[i]`** — Additional audio streams provided by the host (e.g., for multi-source mixing). Each provides lightweight analysis: volume, frequency bands, and spectral features.\n- **`device.audio`** — Audio from externally connected devices. Same lightweight interface.\n\nThese use the `AudioStreamAPI` interface — a subset of the full `AudioAPI` documented on this page. They include volume, bands, spectral data, frequency data, and waveform, but do **not** include beat detection, BPM, triggers, or events.\n\nSee [Device Audio](/p5/external-devices/audio) for device-specific audio documentation.\n\n## Related\n\n- [Connection & Lifecycle](connection/)\n- [Volume](volume/)\n- [Frequency Bands](bands/)\n- [Beat Detection](beat/)\n- [Spectral Analysis](spectral/)\n- [Frequency Data](frequency-data/)\n- [Waveform](waveform/)\n- [Native Audio](/native/audio)\n- [Shader Audio Uniforms](/shader/audio)"
2296
2328
  }
2297
2329
  ]
2298
2330
  },
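The "Additional Audio Streams" section added in the hunk above introduces two lightweight source collections, `viji.audioStreams[]` and `device.audio`. A minimal guard-pattern sketch that enumerates both (assuming each entry exposes the `isConnected` flag listed in the `AudioStreamAPI` subset; the helper name is ours):

```javascript
// @renderer p5

// Gather every lightweight audio source the host has attached:
// additional streams plus per-device audio. Both use the
// AudioStreamAPI subset (no beat detection, BPM, or events).
function connectedStreamSources(viji) {
  const fromStreams = viji.audioStreams.filter(s => s?.isConnected);
  const fromDevices = viji.devices
    .map(d => d.audio)
    .filter(a => a?.isConnected);
  return [...fromStreams, ...fromDevices];
}

function render(viji, p5) {
  const sources = connectedStreamSources(viji);
  p5.background(17);
  p5.fill(255);
  p5.textAlign(p5.CENTER, p5.CENTER);
  p5.text(
    `${sources.length} lightweight audio source(s)`,
    viji.width / 2, viji.height / 2
  );
}
```

This is a sketch of the guard pattern only; the drawing is illustrative.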
@@ -2334,7 +2366,7 @@ export const docsApi = {
2334
2366
  },
2335
2367
  {
2336
2368
  "type": "text",
2337
- "markdown": "## Related\n\n- [Audio Overview](../)\n- [Connection & Lifecycle](../connection/)\n- [Frequency Bands](../bands/)\n- [Beat Detection](../beat/)"
2369
+ "markdown": "> [!NOTE]\n> P5 scenes use the same `viji` audio API as native scenes. The same volume data (`current`, `peak`, `smoothed`) is available on additional audio streams via `viji.audioStreams[i].volume` and on device audio via `device.audio.volume`.\n\n## Related\n\n- [Audio Overview](../)\n- [Connection & Lifecycle](../connection/)\n- [Frequency Bands](../bands/)\n- [Beat Detection](../beat/)"
2338
2370
  }
2339
2371
  ]
2340
2372
  },
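The note added above states that `current`, `peak`, and `smoothed` volume readings are available per stream via `viji.audioStreams[i].volume`. A small sketch comparing raw and smoothed levels side by side (the two-bar layout is illustrative only):

```javascript
// @renderer p5

// Map a 0–1 volume reading to the y coordinate of a bar's top edge.
function volumeBarY(value, height) {
  const v = Math.max(0, Math.min(1, value));
  return (1 - v) * height;
}

function render(viji, p5) {
  const stream = viji.audioStreams[0];
  if (!stream?.isConnected) return;

  p5.background(17);
  // Raw level on the left half, smoothed on the right.
  const yRaw = volumeBarY(stream.volume.current, viji.height);
  const ySmooth = volumeBarY(stream.volume.smoothed, viji.height);
  p5.rect(0, yRaw, viji.width / 2, viji.height - yRaw);
  p5.rect(viji.width / 2, ySmooth, viji.width / 2, viji.height - ySmooth);
}
```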
@@ -2355,7 +2387,7 @@ export const docsApi = {
2355
2387
  },
2356
2388
  {
2357
2389
  "type": "text",
2358
- "markdown": "## Related\n\n- [Audio Overview](../)\n- [Volume](../volume/)\n- [Beat Detection](../beat/)\n- [Frequency Data](../frequency-data/)\n- [Spectral Analysis](../spectral/)"
2390
+ "markdown": "> [!NOTE]\n> The same band data is available on additional audio streams via `viji.audioStreams[i].bands` and on device audio via `device.audio.bands`.\n\n## Related\n\n- [Audio Overview](../)\n- [Volume](../volume/)\n- [Beat Detection](../beat/)\n- [Frequency Data](../frequency-data/)\n- [Spectral Analysis](../spectral/)"
2359
2391
  }
2360
2392
  ]
2361
2393
  },
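The note above makes the same band data available on streams via `viji.audioStreams[i].bands`. A hedged sketch driving a bar from overall band energy; the five property names (`low`, `lowMid`, `mid`, `highMid`, `high`) are assumed to match the main `viji.audio.bands` object and should be checked against the bands page:

```javascript
// @renderer p5

// Average the five named bands into a single 0–1 energy value.
// Band property names are an assumption, mirroring the main stream.
function averageBandEnergy(bands) {
  const values = [bands.low, bands.lowMid, bands.mid, bands.highMid, bands.high];
  return values.reduce((sum, v) => sum + v, 0) / values.length;
}

function render(viji, p5) {
  const stream = viji.audioStreams[0];
  if (!stream?.isConnected) return;

  p5.background(17);
  p5.fill(255);
  // Bar height follows the stream's overall band energy.
  const h = averageBandEnergy(stream.bands) * viji.height;
  p5.rect(0, viji.height - h, viji.width, h);
}
```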
@@ -2376,7 +2408,7 @@ export const docsApi = {
2376
2408
  },
2377
2409
  {
2378
2410
  "type": "text",
2379
- "markdown": "## Related\n\n- [Audio Overview](../)\n- [Volume](../volume/)\n- [Frequency Bands](../bands/)\n- [Spectral Analysis](../spectral/)"
2411
+ "markdown": "> [!NOTE]\n> Beat detection is only available on the main audio stream (`viji.audio`). Additional audio streams (`viji.audioStreams[]`) and device audio (`device.audio`) do **not** include beat detection, BPM, triggers, or events.\n\n## Related\n\n- [Audio Overview](../)\n- [Volume](../volume/)\n- [Frequency Bands](../bands/)\n- [Spectral Analysis](../spectral/)"
2380
2412
  }
2381
2413
  ]
2382
2414
  },
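Because the note above restricts beat detection to the main `viji.audio` stream, a scene that wants a pulse from an additional stream has to approximate one itself. A crude stand-in using a rising-edge threshold on the low band; the `bands.low` name and the 0.6 threshold are assumptions, and this is nowhere near real beat tracking:

```javascript
// @renderer p5

// Fire once each time a value rises through the threshold.
// A rough substitute only; real beats exist solely on viji.audio.
function makeRisingEdgeDetector(threshold) {
  let wasAbove = false;
  return (value) => {
    const isAbove = value >= threshold;
    const fired = isAbove && !wasAbove;
    wasAbove = isAbove;
    return fired;
  };
}

const lowPulse = makeRisingEdgeDetector(0.6);

function render(viji, p5) {
  const stream = viji.audioStreams[0];
  if (!stream?.isConnected) return;

  if (lowPulse(stream.bands.low)) {
    p5.background(255); // flash on a crude low-band pulse
  } else {
    p5.background(17);
  }
}
```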
@@ -2397,7 +2429,7 @@ export const docsApi = {
2397
2429
  },
2398
2430
  {
2399
2431
  "type": "text",
2400
- "markdown": "## Related\n\n- [Audio Overview](../)\n- [Frequency Bands](../bands/)\n- [Frequency Data](../frequency-data/)\n- [Volume](../volume/)"
2432
+ "markdown": "> [!NOTE]\n> The same spectral data is available on additional audio streams via `viji.audioStreams[i].spectral` and `device.audio.spectral`.\n\n## Related\n\n- [Audio Overview](../)\n- [Frequency Bands](../bands/)\n- [Frequency Data](../frequency-data/)\n- [Volume](../volume/)"
2401
2433
  }
2402
2434
  ]
2403
2435
  },
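The note above exposes spectral data on streams via `viji.audioStreams[i].spectral`. A hedged sketch mapping it to color; the `brightness` and `flatness` property names are inferred from the shader uniforms (`u_audioBrightness`, `u_audioFlatness`) and should be verified against the spectral page:

```javascript
// @renderer p5

// Map spectral brightness (0–1) to a hue in degrees; pure and testable.
function brightnessToHue(brightness) {
  return Math.max(0, Math.min(1, brightness)) * 360;
}

function render(viji, p5) {
  const stream = viji.audioStreams[0];
  if (!stream?.isConnected) return;

  p5.colorMode(p5.HSB, 360, 100, 100);
  // Property names assumed from the shader uniform list.
  p5.background(
    brightnessToHue(stream.spectral.brightness),
    80,
    20 + stream.spectral.flatness * 60
  );
}
```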
@@ -2418,7 +2450,7 @@ export const docsApi = {
2418
2450
  },
2419
2451
  {
2420
2452
  "type": "text",
2421
- "markdown": "## Related\n\n- [Audio Overview](../)\n- [Frequency Bands](../bands/)\n- [Waveform](../waveform/)\n- [Spectral Analysis](../spectral/)"
2453
+ "markdown": "> [!NOTE]\n> Frequency data is also available on additional audio streams via `viji.audioStreams[i].getFrequencyData()` and `device.audio.getFrequencyData()`.\n\n## Related\n\n- [Audio Overview](../)\n- [Frequency Bands](../bands/)\n- [Waveform](../waveform/)\n- [Spectral Analysis](../spectral/)"
2422
2454
  }
2423
2455
  ]
2424
2456
  },
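The note above makes `getFrequencyData()` available on streams. A spectrum-bar sketch, assuming the method returns an array-like of 0–255 bin magnitudes (as the FFT texture description suggests); the downsampling helper is ours:

```javascript
// @renderer p5

// Downsample an array-like of 0–255 FFT bins into `count` bar heights (0–1).
function binsToBars(bins, count) {
  const bars = [];
  const step = Math.floor(bins.length / count);
  for (let i = 0; i < count; i++) {
    let sum = 0;
    for (let j = 0; j < step; j++) sum += bins[i * step + j];
    bars.push(sum / step / 255);
  }
  return bars;
}

function render(viji, p5) {
  const stream = viji.audioStreams[0];
  if (!stream?.isConnected) return;

  p5.background(17);
  p5.fill(255);
  const bars = binsToBars(stream.getFrequencyData(), 32);
  const w = viji.width / bars.length;
  bars.forEach((b, i) => {
    p5.rect(i * w, viji.height * (1 - b), w - 2, viji.height * b);
  });
}
```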
@@ -2439,7 +2471,7 @@ export const docsApi = {
2439
2471
  },
2440
2472
  {
2441
2473
  "type": "text",
2442
- "markdown": "## Related\n\n- [Audio Overview](../)\n- [Frequency Data](../frequency-data/)\n- [Volume](../volume/)\n- [Frequency Bands](../bands/)"
2474
+ "markdown": "> [!NOTE]\n> Waveform data is also available on additional audio streams via `viji.audioStreams[i].getWaveform()` and `device.audio.getWaveform()`.\n\n## Related\n\n- [Audio Overview](../)\n- [Frequency Data](../frequency-data/)\n- [Volume](../volume/)\n- [Frequency Bands](../bands/)"
2443
2475
  }
2444
2476
  ]
2445
2477
  },
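The note above makes `getWaveform()` available on streams. An oscilloscope-style sketch, assuming samples in the -1…1 range (as the waveform texture description suggests):

```javascript
// @renderer p5

// Convert one waveform sample (-1…1) to a y pixel inside the canvas.
function sampleToY(sample, height) {
  return (1 - (sample + 1) / 2) * height;
}

function render(viji, p5) {
  const stream = viji.audioStreams[0];
  if (!stream?.isConnected) return;

  p5.background(17);
  p5.noFill();
  p5.stroke(255);
  p5.beginShape();
  const samples = stream.getWaveform();
  for (let i = 0; i < samples.length; i++) {
    const x = (i / (samples.length - 1)) * viji.width;
    p5.vertex(x, sampleToY(samples[i], viji.height));
  }
  p5.endShape();
}
```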
@@ -2786,7 +2818,7 @@ export const docsApi = {
2786
2818
  "content": [
2787
2819
  {
2788
2820
  "type": "text",
2789
- "markdown": "# External Devices\n\n`viji.devices` provides access to externally connected devices (phones, tablets, or other hardware) linked to your installation through the host platform.\n\n> [!NOTE]\n> External devices are managed entirely by the host application. Artists cannot control device connections — you only read the current state each render cycle. Devices appear and disappear from the array dynamically as they connect and disconnect.\n\n## API Reference\n\n### DeviceState (`viji.devices[i]`)\n\n| Property | Type | Description |\n|----------|------|-------------|\n| `id` | `string` | Unique device identifier (assigned by host) |\n| `name` | `string` | User-friendly device name (assigned by host) |\n| `motion` | `DeviceMotionData \\| null` | Device accelerometer and gyroscope data |\n| `orientation` | `DeviceOrientationData \\| null` | Device spatial orientation |\n| `video` | `VideoAPI \\| null` | Device camera video feed, or `null` if no camera connected |\n\n### Device Limits\n\nUp to **8 external devices** can be connected simultaneously. The `viji.devices` array contains only currently connected devices.\n\n## Default Values\n\n- `viji.devices` → `[]` (empty array) when no devices are connected\n- `device.motion` → `null` when the device has no sensor data\n- `device.orientation` → `null` when the device has no orientation data\n- `device.video` → `null` when the device has no camera stream\n\n## Guard Patterns\n\nAlways check array length and null properties:\n\n```javascript\n// @renderer p5\n\nfunction render(viji, p5) {\n if (viji.devices.length === 0) return;\n\n for (const device of viji.devices) {\n if (device.video?.isConnected && device.video.currentFrame) {\n p5.image(device.video.currentFrame, 0, 0);\n }\n\n if (device.motion?.acceleration) {\n // Use device acceleration\n }\n }\n}\n```\n\n## Basic Example"
2821
+ "markdown": "# External Devices\n\n`viji.devices` provides access to externally connected devices (phones, tablets, or other hardware) linked to your installation through the host platform.\n\n> [!NOTE]\n> External devices are managed entirely by the host application. Artists cannot control device connections — you only read the current state each render cycle. Devices appear and disappear from the array dynamically as they connect and disconnect.\n\n## API Reference\n\n### DeviceState (`viji.devices[i]`)\n\n| Property | Type | Description |\n|----------|------|-------------|\n| `id` | `string` | Unique device identifier (assigned by host) |\n| `name` | `string` | User-friendly device name (assigned by host) |\n| `motion` | `DeviceMotionData \\| null` | Device accelerometer and gyroscope data |\n| `orientation` | `DeviceOrientationData \\| null` | Device spatial orientation |\n| `video` | `VideoAPI \\| null` | Device camera video feed, or `null` if no camera connected |\n| `audio` | `AudioStreamAPI \\| null` | Device audio stream, or `null` if no audio connected |\n\n### Device Limits\n\nUp to **8 external devices** can be connected simultaneously. 
The `viji.devices` array contains only currently connected devices.\n\n## Default Values\n\n- `viji.devices` → `[]` (empty array) when no devices are connected\n- `device.motion` → `null` when the device has no sensor data\n- `device.orientation` → `null` when the device has no orientation data\n- `device.video` → `null` when the device has no camera stream\n- `device.audio` → `null` when the device has no audio stream\n\n## Guard Patterns\n\nAlways check array length and null properties:\n\n```javascript\n// @renderer p5\n\nfunction render(viji, p5) {\n if (viji.devices.length === 0) return;\n\n for (const device of viji.devices) {\n if (device.video?.isConnected && device.video.currentFrame) {\n p5.image(device.video.currentFrame, 0, 0);\n }\n\n // Check for audio\n if (device.audio?.isConnected) {\n // Use device audio data\n }\n\n if (device.motion?.acceleration) {\n // Use device acceleration\n }\n }\n}\n```\n\n## Basic Example"
2790
2822
  },
2791
2823
  {
2792
2824
  "type": "live-example",
@@ -2799,7 +2831,7 @@ export const docsApi = {
2799
2831
  },
2800
2832
  {
2801
2833
  "type": "text",
2802
- "markdown": "## Common Patterns\n\n### Display Device Count\n\n```javascript\n// @renderer p5\n\nfunction render(viji, p5) {\n const count = viji.devices.length;\n\n p5.background(17);\n p5.fill(255);\n p5.textAlign(p5.CENTER, p5.CENTER);\n p5.textSize(24);\n p5.text(\n `${count} device${count !== 1 ? 's' : ''} connected`,\n p5.width / 2, p5.height / 2\n );\n}\n```\n\n### Find Device by Name\n\n```javascript\n// @renderer p5\n\nfunction render(viji, p5) {\n const phone = viji.devices.find(d => d.name.includes('Phone'));\n if (phone) {\n // Use phone-specific data\n }\n}\n```\n\n### Iterate All Devices\n\n```javascript\n// @renderer p5\n\nfunction render(viji, p5) {\n viji.devices.forEach((device, index) => {\n const hasVideo = device.video?.isConnected ?? false;\n const hasSensors = device.motion !== null;\n // Render device status at position based on index\n });\n}\n```\n\n## What's Available on Each Device\n\n| Feature | Access | Notes |\n|---------|--------|-------|\n| **Identity** | `device.id`, `device.name` | Always available |\n| **Sensors** | `device.motion`, `device.orientation` | See [Device Sensors](sensors/) |\n| **Video** | `device.video` | See [Device Video](video/) |\n\n> [!WARNING]\n> Device video does **not** support Computer Vision (CV) features. CV processing (face detection, hand tracking, etc.) is only available on the main video stream (`viji.video`). The `device.video` object provides video frames only.\n\n## Related\n\n- [Device Video](video/) — accessing camera feeds from connected devices\n- [Device Sensors](sensors/) — accelerometer and orientation from connected devices\n- [Device Sensors (Internal)](../sensors/) — sensors from the device running the scene\n- [Native External Devices](/native/external-devices) — same API in the Native renderer\n- [Shader External Device Uniforms](/shader/external-devices) — GLSL uniforms for external devices"
2834
+ "markdown": "## Common Patterns\n\n### Display Device Count\n\n```javascript\n// @renderer p5\n\nfunction render(viji, p5) {\n const count = viji.devices.length;\n\n p5.background(17);\n p5.fill(255);\n p5.textAlign(p5.CENTER, p5.CENTER);\n p5.textSize(24);\n p5.text(\n `${count} device${count !== 1 ? 's' : ''} connected`,\n p5.width / 2, p5.height / 2\n );\n}\n```\n\n### Find Device by Name\n\n```javascript\n// @renderer p5\n\nfunction render(viji, p5) {\n const phone = viji.devices.find(d => d.name.includes('Phone'));\n if (phone) {\n // Use phone-specific data\n }\n}\n```\n\n### Iterate All Devices\n\n```javascript\n// @renderer p5\n\nfunction render(viji, p5) {\n viji.devices.forEach((device, index) => {\n const hasVideo = device.video?.isConnected ?? false;\n const hasSensors = device.motion !== null;\n // Render device status at position based on index\n });\n}\n```\n\n## What's Available on Each Device\n\n| Feature | Access | Notes |\n|---------|--------|-------|\n| **Identity** | `device.id`, `device.name` | Always available |\n| **Sensors** | `device.motion`, `device.orientation` | See [Device Sensors](sensors/) |\n| **Video** | `device.video` | See [Device Video](video/) |\n| **Audio** | `device.audio` | See [Device Audio](audio/) |\n\n> [!WARNING]\n> Device video does **not** support Computer Vision (CV) features. CV processing (face detection, hand tracking, etc.) is only available on the main video stream (`viji.video`). The `device.video` object provides video frames only.\n\n> [!NOTE]\n> Device audio provides **lightweight analysis** only — volume, frequency bands, and spectral features. 
Beat detection, BPM tracking, and onset events are only available on the main audio stream (`viji.audio`).\n\n## Related\n\n- [Device Audio](audio/) — audio analysis from connected devices\n- [Device Video](video/) — accessing camera feeds from connected devices\n- [Device Sensors](sensors/) — accelerometer and orientation from connected devices\n- [Device Sensors (Internal)](../sensors/) — sensors from the device running the scene\n- [Native External Devices](/native/external-devices) — same API in the Native renderer\n- [Shader External Device Uniforms](/shader/external-devices) — GLSL uniforms for external devices"
2803
2835
  }
2804
2836
  ]
2805
2837
  },
@@ -2851,6 +2883,17 @@ export const docsApi = {
2851
2883
  }
2852
2884
  ]
2853
2885
  },
2886
+ "p5-ext-audio": {
2887
+ "id": "p5-ext-audio",
2888
+ "title": "Device Audio",
2889
+ "description": "Lightweight audio analysis from externally connected devices in P5 scenes — volume, bands, spectral features, and raw FFT/waveform via AudioStreamAPI.",
2890
+ "content": [
2891
+ {
2892
+ "type": "text",
2893
+ "markdown": "# Device Audio\n\nEach entry in `viji.devices` may expose **`device.audio`**: an [`AudioStreamAPI`](../../audio/) (or `null` when the host has not attached an audio source for that device).\n\n## Behavior\n\n- Check **`device.audio?.isConnected`** before reading values.\n- **Lightweight only:** `isConnected`, `volume`, `bands` (including smoothed), `spectral`, `getFrequencyData()`, `getWaveform()` — same subset as [`viji.audioStreams`](../../audio/#additional-audio-streams) entries.\n- **Not available:** beat energy, triggers, BPM, or beat events (those exist only on the main [`viji.audio`](../../audio/) stream).\n\n## Related\n\n- [External Devices — Overview](../)\n- [P5 Audio](../../audio/)\n- [Shader: Audio stream uniforms](/shader/api-reference#audio-streams)"
2894
+ }
2895
+ ]
2896
+ },
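The new Device Audio page added above ships without a code sample. A minimal guard-pattern sketch using only documented subset members (`device.audio?.isConnected`, `device.audio.volume.smoothed`); the slot-spacing helper is ours:

```javascript
// @renderer p5

// Evenly space device visuals across the canvas width.
function deviceSlotX(index, count, width) {
  return ((index + 1) * width) / (count + 1);
}

function render(viji, p5) {
  p5.background(17);

  viji.devices.forEach((device, i) => {
    // The host may not have attached audio for this device.
    if (!device.audio?.isConnected) return;

    // Lightweight data only: no beat, BPM, or triggers on device audio.
    const d = device.audio.volume.smoothed * viji.height * 0.5;
    p5.ellipse(
      deviceSlotX(i, viji.devices.length, viji.width),
      viji.height / 2, d
    );
  });
}
```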
2854
2897
  "shader-quickstart": {
2855
2898
  "id": "shader-quickstart",
2856
2899
  "title": "shader-quickstart",
@@ -2879,7 +2922,7 @@ export const docsApi = {
2879
2922
  "content": [
2880
2923
  {
2881
2924
  "type": "text",
2882
- "markdown": "# API Reference\n\nViji auto-injects 160+ uniforms into every shader scene. This page is the complete list — use it as a quick lookup. Each entry links to its dedicated documentation page for full details and examples.\n\nAll uniforms listed below are always declared in the shader preamble (except [`backbuffer`](/shader/backbuffer), which is conditional). When data is not available, uniforms hold default values (zeros, false, or empty textures). Your shader compiles once with all declarations present — you do not need to conditionally declare them.\n\nNew to Viji shaders? Start with [Shader Basics](/shader/basics) instead.\n\n## Core / Timing\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| [`u_time`](/shader/timing) | `float` | Seconds elapsed since the scene started | [Timing](/shader/timing) |\n| [`u_deltaTime`](/shader/timing) | `float` | Seconds since the previous frame | [Timing](/shader/timing) |\n| [`u_frame`](/shader/timing) | `int` | Frame index (monotonically increasing) | [Timing](/shader/timing) |\n| [`u_fps`](/shader/timing) | `float` | Target FPS based on host's frame rate mode | [Timing](/shader/timing) |\n\n## Resolution\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| [`u_resolution`](/shader/resolution) | `vec2` | Canvas width and height in pixels | [Resolution](/shader/resolution) |\n\n## Mouse\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| [`u_mouse`](/shader/mouse) | `vec2` | Cursor position in pixels (WebGL Y-flipped) | [Mouse](/shader/mouse) |\n| [`u_mouseInCanvas`](/shader/mouse) | `bool` | Whether cursor is inside the canvas | [Mouse](/shader/mouse) |\n| [`u_mousePressed`](/shader/mouse) | `bool` | Whether any button is pressed | [Mouse](/shader/mouse) |\n| [`u_mouseLeft`](/shader/mouse) | `bool` | Left button state | [Mouse](/shader/mouse) |\n| [`u_mouseRight`](/shader/mouse) | `bool` | Right 
button state | [Mouse](/shader/mouse) |\n| [`u_mouseMiddle`](/shader/mouse) | `bool` | Middle button state | [Mouse](/shader/mouse) |\n| [`u_mouseDelta`](/shader/mouse) | `vec2` | Pixel movement this frame (Y-flipped) | [Mouse](/shader/mouse) |\n| [`u_mouseWheel`](/shader/mouse) | `float` | Scroll delta this frame | [Mouse](/shader/mouse) |\n| [`u_mouseWasPressed`](/shader/mouse) | `bool` | True for one frame when pressed | [Mouse](/shader/mouse) |\n| [`u_mouseWasReleased`](/shader/mouse) | `bool` | True for one frame when released | [Mouse](/shader/mouse) |\n\n## Keyboard\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| [`u_keySpace`](/shader/keyboard) | `bool` | Space bar | [Keyboard](/shader/keyboard) |\n| [`u_keyShift`](/shader/keyboard) | `bool` | Shift key | [Keyboard](/shader/keyboard) |\n| [`u_keyCtrl`](/shader/keyboard) | `bool` | Ctrl/Cmd key | [Keyboard](/shader/keyboard) |\n| [`u_keyAlt`](/shader/keyboard) | `bool` | Alt/Option key | [Keyboard](/shader/keyboard) |\n| [`u_keyW`](/shader/keyboard) | `bool` | W key | [Keyboard](/shader/keyboard) |\n| [`u_keyA`](/shader/keyboard) | `bool` | A key | [Keyboard](/shader/keyboard) |\n| [`u_keyS`](/shader/keyboard) | `bool` | S key | [Keyboard](/shader/keyboard) |\n| [`u_keyD`](/shader/keyboard) | `bool` | D key | [Keyboard](/shader/keyboard) |\n| [`u_keyUp`](/shader/keyboard) | `bool` | Up arrow | [Keyboard](/shader/keyboard) |\n| [`u_keyDown`](/shader/keyboard) | `bool` | Down arrow | [Keyboard](/shader/keyboard) |\n| [`u_keyLeft`](/shader/keyboard) | `bool` | Left arrow | [Keyboard](/shader/keyboard) |\n| [`u_keyRight`](/shader/keyboard) | `bool` | Right arrow | [Keyboard](/shader/keyboard) |\n| [`u_keyboard`](/shader/keyboard) | `sampler2D` | 256×3 LUMINANCE texture (row 0: held, row 1: pressed, row 2: toggle) | [Keyboard](/shader/keyboard) |\n\n## Touch\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| 
[`u_touchCount`](/shader/touch) | `int` | Number of active touches (0–5) | [Touch](/shader/touch) |\n| [`u_touch0`](/shader/touch) | `vec2` | Touch point 0 position (pixels, Y-flipped) | [Touch](/shader/touch) |\n| [`u_touch1`](/shader/touch) | `vec2` | Touch point 1 position | [Touch](/shader/touch) |\n| [`u_touch2`](/shader/touch) | `vec2` | Touch point 2 position | [Touch](/shader/touch) |\n| [`u_touch3`](/shader/touch) | `vec2` | Touch point 3 position | [Touch](/shader/touch) |\n| [`u_touch4`](/shader/touch) | `vec2` | Touch point 4 position | [Touch](/shader/touch) |\n\n## Pointer (Unified)\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| [`u_pointer`](/shader/pointer) | `vec2` | Primary pointer position (pixels, Y-flipped) | [Pointer](/shader/pointer) |\n| [`u_pointerDelta`](/shader/pointer) | `vec2` | Movement delta (Y-flipped) | [Pointer](/shader/pointer) |\n| [`u_pointerDown`](/shader/pointer) | `bool` | Whether pointer is active (click or touch) | [Pointer](/shader/pointer) |\n| [`u_pointerWasPressed`](/shader/pointer) | `bool` | True for one frame when pressed | [Pointer](/shader/pointer) |\n| [`u_pointerWasReleased`](/shader/pointer) | `bool` | True for one frame when released | [Pointer](/shader/pointer) |\n| [`u_pointerInCanvas`](/shader/pointer) | `bool` | Whether pointer is inside the canvas | [Pointer](/shader/pointer) |\n\n## Audio — Scalars\n\n### Volume\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| [`u_audioVolume`](/shader/audio/volume) | `float` | Current RMS volume 0–1 | [Volume](/shader/audio/volume) |\n| [`u_audioPeak`](/shader/audio/volume) | `float` | Peak volume 0–1 | [Volume](/shader/audio/volume) |\n| [`u_audioVolumeSmoothed`](/shader/audio/volume) | `float` | Smoothed volume (~200ms decay) | [Volume](/shader/audio/volume) |\n\n### Frequency Bands\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| 
[`u_audioLow`](/shader/audio/bands) | `float` | Low band energy (20–120 Hz) | [Bands](/shader/audio/bands) |\n| [`u_audioLowMid`](/shader/audio/bands) | `float` | Low-mid band energy (120–500 Hz) | [Bands](/shader/audio/bands) |\n| [`u_audioMid`](/shader/audio/bands) | `float` | Mid band energy (500–2 kHz) | [Bands](/shader/audio/bands) |\n| [`u_audioHighMid`](/shader/audio/bands) | `float` | High-mid band energy (2–6 kHz) | [Bands](/shader/audio/bands) |\n| [`u_audioHigh`](/shader/audio/bands) | `float` | High band energy (6–16 kHz) | [Bands](/shader/audio/bands) |\n| [`u_audioLowSmoothed`](/shader/audio/bands) | `float` | Smoothed low band | [Bands](/shader/audio/bands) |\n| [`u_audioLowMidSmoothed`](/shader/audio/bands) | `float` | Smoothed low-mid band | [Bands](/shader/audio/bands) |\n| [`u_audioMidSmoothed`](/shader/audio/bands) | `float` | Smoothed mid band | [Bands](/shader/audio/bands) |\n| [`u_audioHighMidSmoothed`](/shader/audio/bands) | `float` | Smoothed high-mid band | [Bands](/shader/audio/bands) |\n| [`u_audioHighSmoothed`](/shader/audio/bands) | `float` | Smoothed high band | [Bands](/shader/audio/bands) |\n\n### Beat Detection\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| [`u_audioKick`](/shader/audio/beat) | `float` | Kick beat energy | [Beat](/shader/audio/beat) |\n| [`u_audioSnare`](/shader/audio/beat) | `float` | Snare beat energy | [Beat](/shader/audio/beat) |\n| [`u_audioHat`](/shader/audio/beat) | `float` | Hi-hat beat energy | [Beat](/shader/audio/beat) |\n| [`u_audioAny`](/shader/audio/beat) | `float` | Combined beat energy | [Beat](/shader/audio/beat) |\n| [`u_audioKickSmoothed`](/shader/audio/beat) | `float` | Smoothed kick | [Beat](/shader/audio/beat) |\n| [`u_audioSnareSmoothed`](/shader/audio/beat) | `float` | Smoothed snare | [Beat](/shader/audio/beat) |\n| [`u_audioHatSmoothed`](/shader/audio/beat) | `float` | Smoothed hi-hat | [Beat](/shader/audio/beat) |\n| 
[`u_audioAnySmoothed`](/shader/audio/beat) | `float` | Smoothed combined | [Beat](/shader/audio/beat) |\n| [`u_audioKickTrigger`](/shader/audio/beat) | `bool` | Kick trigger (true for one frame) | [Beat](/shader/audio/beat) |\n| [`u_audioSnareTrigger`](/shader/audio/beat) | `bool` | Snare trigger (true for one frame) | [Beat](/shader/audio/beat) |\n| [`u_audioHatTrigger`](/shader/audio/beat) | `bool` | Hi-hat trigger (true for one frame) | [Beat](/shader/audio/beat) |\n| [`u_audioAnyTrigger`](/shader/audio/beat) | `bool` | Any beat trigger (true for one frame) | [Beat](/shader/audio/beat) |\n| [`u_audioBPM`](/shader/audio/beat) | `float` | Tracked BPM | [Beat](/shader/audio/beat) |\n| [`u_audioConfidence`](/shader/audio/beat) | `float` | Beat-tracker confidence 0–1 | [Beat](/shader/audio/beat) |\n| [`u_audioIsLocked`](/shader/audio/beat) | `bool` | Whether beat tracking is locked | [Beat](/shader/audio/beat) |\n\n### Spectral Analysis\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| [`u_audioBrightness`](/shader/audio/spectral) | `float` | Spectral brightness 0–1 | [Spectral](/shader/audio/spectral) |\n| [`u_audioFlatness`](/shader/audio/spectral) | `float` | Spectral flatness 0–1 | [Spectral](/shader/audio/spectral) |\n\n## Audio — Textures\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| [`u_audioFFT`](/shader/audio/fft) | `sampler2D` | FFT as 1D LUMINANCE strip (bin count × 1, values 0–255) | [FFT Texture](/shader/audio/fft) |\n| [`u_audioWaveform`](/shader/audio/waveform) | `sampler2D` | Time-domain waveform as 1D LUMINANCE strip (-1…1 mapped to 0–255) | [Waveform Texture](/shader/audio/waveform) |\n\n## Video\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| [`u_video`](/shader/video/basics) | `sampler2D` | Main video frame texture | [Video Basics](/shader/video/basics) |\n| [`u_videoResolution`](/shader/video/basics) | `vec2` | 
Video frame size in pixels (0,0 if disconnected) | [Video Basics](/shader/video/basics) |\n| [`u_videoFrameRate`](/shader/video/basics) | `float` | Video frame rate | [Video Basics](/shader/video/basics) |\n\n## CV — Face Detection\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| [`u_faceCount`](/shader/video/face-detection) | `int` | Number of detected faces | [Face Detection](/shader/video/face-detection) |\n| [`u_face0Bounds`](/shader/video/face-detection) | `vec4` | Bounding box (x, y, w, h) normalized 0–1 | [Face Detection](/shader/video/face-detection) |\n| [`u_face0Center`](/shader/video/face-detection) | `vec2` | Face center normalized 0–1 | [Face Detection](/shader/video/face-detection) |\n| [`u_face0HeadPose`](/shader/video/face-mesh) | `vec3` | Pitch, yaw, roll in degrees | [Face Mesh](/shader/video/face-mesh) |\n| [`u_face0Confidence`](/shader/video/face-detection) | `float` | Detection confidence | [Face Detection](/shader/video/face-detection) |\n\n### Expressions\n\n| Uniform | Type | Details |\n|---------|------|---------|\n| [`u_face0Neutral`](/shader/video/emotion-detection) | `float` | [Emotion Detection](/shader/video/emotion-detection) |\n| [`u_face0Happy`](/shader/video/emotion-detection) | `float` | [Emotion Detection](/shader/video/emotion-detection) |\n| [`u_face0Sad`](/shader/video/emotion-detection) | `float` | [Emotion Detection](/shader/video/emotion-detection) |\n| [`u_face0Angry`](/shader/video/emotion-detection) | `float` | [Emotion Detection](/shader/video/emotion-detection) |\n| [`u_face0Surprised`](/shader/video/emotion-detection) | `float` | [Emotion Detection](/shader/video/emotion-detection) |\n| [`u_face0Disgusted`](/shader/video/emotion-detection) | `float` | [Emotion Detection](/shader/video/emotion-detection) |\n| [`u_face0Fearful`](/shader/video/emotion-detection) | `float` | [Emotion Detection](/shader/video/emotion-detection) |\n\n### Blendshapes (52 uniforms)\n\nAll 
blendshapes are `float` values 0–1, following the ARKit naming convention. See [Face Mesh](/shader/video/face-mesh) for the full list and usage.\n\n| Uniform | Uniform | Uniform |\n|---------|---------|---------|\n| `u_face0BrowDownLeft` | `u_face0BrowDownRight` | `u_face0BrowInnerUp` |\n| `u_face0BrowOuterUpLeft` | `u_face0BrowOuterUpRight` | `u_face0CheekPuff` |\n| `u_face0CheekSquintLeft` | `u_face0CheekSquintRight` | `u_face0EyeBlinkLeft` |\n| `u_face0EyeBlinkRight` | `u_face0EyeLookDownLeft` | `u_face0EyeLookDownRight` |\n| `u_face0EyeLookInLeft` | `u_face0EyeLookInRight` | `u_face0EyeLookOutLeft` |\n| `u_face0EyeLookOutRight` | `u_face0EyeLookUpLeft` | `u_face0EyeLookUpRight` |\n| `u_face0EyeSquintLeft` | `u_face0EyeSquintRight` | `u_face0EyeWideLeft` |\n| `u_face0EyeWideRight` | `u_face0JawForward` | `u_face0JawLeft` |\n| `u_face0JawOpen` | `u_face0JawRight` | `u_face0MouthClose` |\n| `u_face0MouthDimpleLeft` | `u_face0MouthDimpleRight` | `u_face0MouthFrownLeft` |\n| `u_face0MouthFrownRight` | `u_face0MouthFunnel` | `u_face0MouthLeft` |\n| `u_face0MouthLowerDownLeft` | `u_face0MouthLowerDownRight` | `u_face0MouthPressLeft` |\n| `u_face0MouthPressRight` | `u_face0MouthPucker` | `u_face0MouthRight` |\n| `u_face0MouthRollLower` | `u_face0MouthRollUpper` | `u_face0MouthShrugLower` |\n| `u_face0MouthShrugUpper` | `u_face0MouthSmileLeft` | `u_face0MouthSmileRight` |\n| `u_face0MouthStretchLeft` | `u_face0MouthStretchRight` | `u_face0MouthUpperUpLeft` |\n| `u_face0MouthUpperUpRight` | `u_face0NoseSneerLeft` | `u_face0NoseSneerRight` |\n| `u_face0TongueOut` | | |\n\n## CV — Hand Tracking\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| [`u_handCount`](/shader/video/hand-tracking) | `int` | Number of detected hands (0–2) | [Hand Tracking](/shader/video/hand-tracking) |\n| [`u_leftHandPalm`](/shader/video/hand-tracking) | `vec3` | Left hand palm position | [Hand Tracking](/shader/video/hand-tracking) |\n| 
[`u_rightHandPalm`](/shader/video/hand-tracking) | `vec3` | Right hand palm position | [Hand Tracking](/shader/video/hand-tracking) |\n| [`u_leftHandConfidence`](/shader/video/hand-tracking) | `float` | Left hand confidence | [Hand Tracking](/shader/video/hand-tracking) |\n| [`u_rightHandConfidence`](/shader/video/hand-tracking) | `float` | Right hand confidence | [Hand Tracking](/shader/video/hand-tracking) |\n| [`u_leftHandBounds`](/shader/video/hand-tracking) | `vec4` | Left hand bounding box (x, y, w, h) | [Hand Tracking](/shader/video/hand-tracking) |\n| [`u_rightHandBounds`](/shader/video/hand-tracking) | `vec4` | Right hand bounding box (x, y, w, h) | [Hand Tracking](/shader/video/hand-tracking) |\n\n### Gesture Scores (per hand)\n\nAll gesture uniforms are `float` values 0–1. Replace `left` with `right` for the other hand.\n\n| Uniform | Description | Details |\n|---------|-------------|---------|\n| [`u_leftHandFist`](/shader/video/hand-tracking) | Fist gesture confidence | [Hand Tracking](/shader/video/hand-tracking) |\n| [`u_leftHandOpenPalm`](/shader/video/hand-tracking) | Open palm confidence | [Hand Tracking](/shader/video/hand-tracking) |\n| [`u_leftHandPeace`](/shader/video/hand-tracking) | Peace/V-sign confidence | [Hand Tracking](/shader/video/hand-tracking) |\n| [`u_leftHandThumbsUp`](/shader/video/hand-tracking) | Thumbs up confidence | [Hand Tracking](/shader/video/hand-tracking) |\n| [`u_leftHandThumbsDown`](/shader/video/hand-tracking) | Thumbs down confidence | [Hand Tracking](/shader/video/hand-tracking) |\n| [`u_leftHandPointing`](/shader/video/hand-tracking) | Pointing confidence | [Hand Tracking](/shader/video/hand-tracking) |\n| [`u_leftHandILoveYou`](/shader/video/hand-tracking) | I Love You sign confidence | [Hand Tracking](/shader/video/hand-tracking) |\n\n## CV — Pose Detection\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| [`u_poseDetected`](/shader/video/pose-detection) | `bool` | 
Whether a body pose is detected | [Pose Detection](/shader/video/pose-detection) |\n| [`u_poseConfidence`](/shader/video/pose-detection) | `float` | Pose detection confidence | [Pose Detection](/shader/video/pose-detection) |\n| [`u_nosePosition`](/shader/video/pose-detection) | `vec2` | Nose landmark position | [Pose Detection](/shader/video/pose-detection) |\n| [`u_leftShoulderPosition`](/shader/video/pose-detection) | `vec2` | Left shoulder position | [Pose Detection](/shader/video/pose-detection) |\n| [`u_rightShoulderPosition`](/shader/video/pose-detection) | `vec2` | Right shoulder position | [Pose Detection](/shader/video/pose-detection) |\n| [`u_leftElbowPosition`](/shader/video/pose-detection) | `vec2` | Left elbow position | [Pose Detection](/shader/video/pose-detection) |\n| [`u_rightElbowPosition`](/shader/video/pose-detection) | `vec2` | Right elbow position | [Pose Detection](/shader/video/pose-detection) |\n| [`u_leftWristPosition`](/shader/video/pose-detection) | `vec2` | Left wrist position | [Pose Detection](/shader/video/pose-detection) |\n| [`u_rightWristPosition`](/shader/video/pose-detection) | `vec2` | Right wrist position | [Pose Detection](/shader/video/pose-detection) |\n| [`u_leftHipPosition`](/shader/video/pose-detection) | `vec2` | Left hip position | [Pose Detection](/shader/video/pose-detection) |\n| [`u_rightHipPosition`](/shader/video/pose-detection) | `vec2` | Right hip position | [Pose Detection](/shader/video/pose-detection) |\n| [`u_leftKneePosition`](/shader/video/pose-detection) | `vec2` | Left knee position | [Pose Detection](/shader/video/pose-detection) |\n| [`u_rightKneePosition`](/shader/video/pose-detection) | `vec2` | Right knee position | [Pose Detection](/shader/video/pose-detection) |\n| [`u_leftAnklePosition`](/shader/video/pose-detection) | `vec2` | Left ankle position | [Pose Detection](/shader/video/pose-detection) |\n| [`u_rightAnklePosition`](/shader/video/pose-detection) | `vec2` | Right ankle position | [Pose 
Detection](/shader/video/pose-detection) |\n\n## CV — Body Segmentation\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| [`u_segmentationMask`](/shader/video/body-segmentation) | `sampler2D` | Body mask (LUMINANCE: 0 = background, 1 = person) | [Segmentation](/shader/video/body-segmentation) |\n| [`u_segmentationRes`](/shader/video/body-segmentation) | `vec2` | Mask dimensions in pixels | [Segmentation](/shader/video/body-segmentation) |\n\n## Device Sensors\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| [`u_deviceAcceleration`](/shader/sensors) | `vec3` | Acceleration without gravity (m/s²) | [Sensor Uniforms](/shader/sensors) |\n| [`u_deviceAccelerationGravity`](/shader/sensors) | `vec3` | Acceleration with gravity (m/s²) | [Sensor Uniforms](/shader/sensors) |\n| [`u_deviceRotationRate`](/shader/sensors) | `vec3` | Gyroscope: alpha, beta, gamma (deg/s) | [Sensor Uniforms](/shader/sensors) |\n| [`u_deviceOrientation`](/shader/sensors) | `vec3` | Orientation: alpha, beta, gamma (degrees) | [Sensor Uniforms](/shader/sensors) |\n| [`u_deviceOrientationAbsolute`](/shader/sensors) | `bool` | Whether orientation is magnetometer-based | [Sensor Uniforms](/shader/sensors) |\n\n## External Devices — Video\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| [`u_deviceCount`](/shader/external-devices) | `int` | Number of devices with active cameras (0–8) | [Overview](/shader/external-devices) |\n| [`u_device0`](/shader/external-devices/video) – `u_device7` | `sampler2D` | Device camera frame texture | [Video Textures](/shader/external-devices/video) |\n| [`u_device0Resolution`](/shader/external-devices/video) – `u_device7Resolution` | `vec2` | Device camera frame size | [Video Textures](/shader/external-devices/video) |\n| [`u_device0Connected`](/shader/external-devices/video) – `u_device7Connected` | `bool` | Whether device camera is active | 
[Video Textures](/shader/external-devices/video) |\n\n## External Devices — Sensors\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| [`u_externalDeviceCount`](/shader/external-devices) | `int` | Number of connected external devices (0–8) | [Overview](/shader/external-devices) |\n| [`u_device0Acceleration`](/shader/external-devices/sensors) – `u_device7Acceleration` | `vec3` | Per-device acceleration without gravity | [Sensor Uniforms](/shader/external-devices/sensors) |\n| [`u_device0AccelerationGravity`](/shader/external-devices/sensors) – `u_device7AccelerationGravity` | `vec3` | Per-device acceleration with gravity | [Sensor Uniforms](/shader/external-devices/sensors) |\n| [`u_device0RotationRate`](/shader/external-devices/sensors) – `u_device7RotationRate` | `vec3` | Per-device rotation rate | [Sensor Uniforms](/shader/external-devices/sensors) |\n| [`u_device0Orientation`](/shader/external-devices/sensors) – `u_device7Orientation` | `vec3` | Per-device orientation angles | [Sensor Uniforms](/shader/external-devices/sensors) |\n\n> [!NOTE]\n> `u_device{i}` (sampler2D) is the **camera texture** for device slot `i`. `u_device{i}Acceleration` and similar are the **IMU sensors** for the same device — different data, same index.\n\n## Streams (Compositor)\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_streamCount` | `int` | Number of active streams (0–8) |\n| `u_stream0` – `u_stream7` | `sampler2D` | Stream frame textures |\n| `u_stream0Resolution` – `u_stream7Resolution` | `vec2` | Stream frame sizes in pixels |\n| `u_stream0Connected` – `u_stream7Connected` | `bool` | Whether stream has an active frame |\n\nStreams are additional video sources injected by the host application — they are used internally by Viji's compositor for mixing multiple scenes together. When no streams are provided, `u_streamCount` is `0` and the textures sample as black. 
Each stream works the same way as [`u_video`](/shader/video/basics).\n\n## Backbuffer\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| [`backbuffer`](/shader/backbuffer) | `sampler2D` | Previous frame texture (feedback effects) | [Backbuffer & Feedback](/shader/backbuffer) |\n\n> [!WARNING]\n> `backbuffer` is **conditional** — it is only injected if the word `backbuffer` appears anywhere in your shader source (including comments). It has no `u_` prefix. See [Backbuffer & Feedback](/shader/backbuffer) for details.\n\n## Parameter Directives\n\nDeclare parameters using `// @viji-TYPE:uniformName key:value ...` comments. Each directive generates a uniform and a UI control in the host.\n\n| Directive | Uniform Type | UI Control | Details |\n|-----------|-------------|------------|---------|\n| [`@viji-slider`](/shader/parameters/slider) | `float` | Numeric slider | [Slider](/shader/parameters/slider) |\n| [`@viji-number`](/shader/parameters/number) | `float` | Numeric input | [Number](/shader/parameters/number) |\n| [`@viji-color`](/shader/parameters/color) | `vec3` | Color picker (hex → RGB 0–1) | [Color](/shader/parameters/color) |\n| [`@viji-toggle`](/shader/parameters/toggle) | `bool` | On/off switch | [Toggle](/shader/parameters/toggle) |\n| [`@viji-select`](/shader/parameters/select) | `int` | Dropdown (0-based option index) | [Select](/shader/parameters/select) |\n| [`@viji-image`](/shader/parameters/image) | `sampler2D` | Image upload | [Image](/shader/parameters/image) |\n| [`@viji-button`](/shader/parameters/button) | `bool` | Momentary button (true for one frame) | [Button](/shader/parameters/button) |\n| [`@viji-accumulator`](/shader/parameters/accumulator) | `float` | CPU-side: `+= rate × deltaTime` | [Accumulator](/shader/parameters/accumulator) |\n\nSee [Parameters Overview](/shader/parameters) for syntax, [Grouping](/shader/parameters/grouping) and [Categories](/shader/parameters/categories) for 
organization.\n\n## Related\n\n- [Shader Basics](/shader/basics) — auto-injection, GLSL versions, `@renderer shader`\n- [Shader Quick Start](/shader/quickstart) — getting started with shader scenes\n- [Best Practices](/getting-started/best-practices) — essential patterns for all renderers\n- [Native API Reference](/native/api-reference) — JavaScript API for the Native renderer\n- [P5 API Reference](/p5/api-reference) — JavaScript API for the P5 renderer"
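The FFT and waveform texel conventions in the Audio — Textures table above can be sketched in plain JavaScript. These helpers are illustrative only; the function names are not part of the Viji API, and actual sampling happens in GLSL via `texture2D`:

```javascript
// Illustrative helpers, not part of the Viji API.

// u_audioWaveform stores each sample's -1…1 value mapped to a byte 0–255.
// Decoding a fetched byte back to the signed sample:
function decodeWaveformByte(byte) {
  return (byte / 255) * 2 - 1; // 0 -> -1, 255 -> 1
}

// u_audioFFT is a (binCount x 1) strip. To read bin i at its texel
// center, sample at x = (i + 0.5) / binCount.
function fftBinToU(i, binCount) {
  return (i + 0.5) / binCount;
}
```

The same half-texel offset applies to any of the 1D LUMINANCE strips (FFT, waveform, keyboard rows) when you need exact per-bin reads rather than interpolated values.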
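The Parameter Directives table notes that `@viji-color` delivers its picker value to the shader as an RGB `vec3` in the 0–1 range. A minimal JavaScript sketch of that hex-to-vec3 conversion (hypothetical helper, not Viji source code):

```javascript
// Hypothetical sketch of the hex -> RGB 0–1 mapping a @viji-color
// uniform receives; not part of the Viji implementation.
function hexToVec3(hex) {
  const n = parseInt(hex.replace("#", ""), 16);
  return [((n >> 16) & 255) / 255, ((n >> 8) & 255) / 255, (n & 255) / 255];
}
```

For example, `hexToVec3("#ff8800")` yields `[1, 136/255, 0]`, matching the 0–1 component range the shader sees.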
+ "markdown": "# API Reference\n\nViji auto-injects 160+ uniforms into every shader scene. This page is the complete list — use it as a quick lookup. Each entry links to its dedicated documentation page for full details and examples.\n\nAll uniforms listed below are always declared in the shader preamble (except [`backbuffer`](/shader/backbuffer), which is conditional). When data is not available, uniforms hold default values (zeros, false, or empty textures). Your shader compiles once with all declarations present — you do not need to conditionally declare them.\n\nNew to Viji shaders? Start with [Shader Basics](/shader/basics) instead.\n\n## Core / Timing\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| [`u_time`](/shader/timing) | `float` | Seconds elapsed since the scene started | [Timing](/shader/timing) |\n| [`u_deltaTime`](/shader/timing) | `float` | Seconds since the previous frame | [Timing](/shader/timing) |\n| [`u_frame`](/shader/timing) | `int` | Frame index (monotonically increasing) | [Timing](/shader/timing) |\n| [`u_fps`](/shader/timing) | `float` | Target FPS based on host's frame rate mode | [Timing](/shader/timing) |\n\n## Resolution\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| [`u_resolution`](/shader/resolution) | `vec2` | Canvas width and height in pixels | [Resolution](/shader/resolution) |\n\n## Mouse\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| [`u_mouse`](/shader/mouse) | `vec2` | Cursor position in pixels (WebGL Y-flipped) | [Mouse](/shader/mouse) |\n| [`u_mouseInCanvas`](/shader/mouse) | `bool` | Whether cursor is inside the canvas | [Mouse](/shader/mouse) |\n| [`u_mousePressed`](/shader/mouse) | `bool` | Whether any button is pressed | [Mouse](/shader/mouse) |\n| [`u_mouseLeft`](/shader/mouse) | `bool` | Left button state | [Mouse](/shader/mouse) |\n| [`u_mouseRight`](/shader/mouse) | `bool` | Right 
button state | [Mouse](/shader/mouse) |\n| [`u_mouseMiddle`](/shader/mouse) | `bool` | Middle button state | [Mouse](/shader/mouse) |\n| [`u_mouseDelta`](/shader/mouse) | `vec2` | Pixel movement this frame (Y-flipped) | [Mouse](/shader/mouse) |\n| [`u_mouseWheel`](/shader/mouse) | `float` | Scroll delta this frame | [Mouse](/shader/mouse) |\n| [`u_mouseWasPressed`](/shader/mouse) | `bool` | True for one frame when pressed | [Mouse](/shader/mouse) |\n| [`u_mouseWasReleased`](/shader/mouse) | `bool` | True for one frame when released | [Mouse](/shader/mouse) |\n\n## Keyboard\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| [`u_keySpace`](/shader/keyboard) | `bool` | Space bar | [Keyboard](/shader/keyboard) |\n| [`u_keyShift`](/shader/keyboard) | `bool` | Shift key | [Keyboard](/shader/keyboard) |\n| [`u_keyCtrl`](/shader/keyboard) | `bool` | Ctrl/Cmd key | [Keyboard](/shader/keyboard) |\n| [`u_keyAlt`](/shader/keyboard) | `bool` | Alt/Option key | [Keyboard](/shader/keyboard) |\n| [`u_keyW`](/shader/keyboard) | `bool` | W key | [Keyboard](/shader/keyboard) |\n| [`u_keyA`](/shader/keyboard) | `bool` | A key | [Keyboard](/shader/keyboard) |\n| [`u_keyS`](/shader/keyboard) | `bool` | S key | [Keyboard](/shader/keyboard) |\n| [`u_keyD`](/shader/keyboard) | `bool` | D key | [Keyboard](/shader/keyboard) |\n| [`u_keyUp`](/shader/keyboard) | `bool` | Up arrow | [Keyboard](/shader/keyboard) |\n| [`u_keyDown`](/shader/keyboard) | `bool` | Down arrow | [Keyboard](/shader/keyboard) |\n| [`u_keyLeft`](/shader/keyboard) | `bool` | Left arrow | [Keyboard](/shader/keyboard) |\n| [`u_keyRight`](/shader/keyboard) | `bool` | Right arrow | [Keyboard](/shader/keyboard) |\n| [`u_keyboard`](/shader/keyboard) | `sampler2D` | 256×3 LUMINANCE texture (row 0: held, row 1: pressed, row 2: toggle) | [Keyboard](/shader/keyboard) |\n\n## Touch\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| 
[`u_touchCount`](/shader/touch) | `int` | Number of active touches (0–5) | [Touch](/shader/touch) |\n| [`u_touch0`](/shader/touch) | `vec2` | Touch point 0 position (pixels, Y-flipped) | [Touch](/shader/touch) |\n| [`u_touch1`](/shader/touch) | `vec2` | Touch point 1 position | [Touch](/shader/touch) |\n| [`u_touch2`](/shader/touch) | `vec2` | Touch point 2 position | [Touch](/shader/touch) |\n| [`u_touch3`](/shader/touch) | `vec2` | Touch point 3 position | [Touch](/shader/touch) |\n| [`u_touch4`](/shader/touch) | `vec2` | Touch point 4 position | [Touch](/shader/touch) |\n\n## Pointer (Unified)\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| [`u_pointer`](/shader/pointer) | `vec2` | Primary pointer position (pixels, Y-flipped) | [Pointer](/shader/pointer) |\n| [`u_pointerDelta`](/shader/pointer) | `vec2` | Movement delta (Y-flipped) | [Pointer](/shader/pointer) |\n| [`u_pointerDown`](/shader/pointer) | `bool` | Whether pointer is active (click or touch) | [Pointer](/shader/pointer) |\n| [`u_pointerWasPressed`](/shader/pointer) | `bool` | True for one frame when pressed | [Pointer](/shader/pointer) |\n| [`u_pointerWasReleased`](/shader/pointer) | `bool` | True for one frame when released | [Pointer](/shader/pointer) |\n| [`u_pointerInCanvas`](/shader/pointer) | `bool` | Whether pointer is inside the canvas | [Pointer](/shader/pointer) |\n\n## Audio — Scalars\n\n### Volume\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| [`u_audioVolume`](/shader/audio/volume) | `float` | Current RMS volume 0–1 | [Volume](/shader/audio/volume) |\n| [`u_audioPeak`](/shader/audio/volume) | `float` | Peak volume 0–1 | [Volume](/shader/audio/volume) |\n| [`u_audioVolumeSmoothed`](/shader/audio/volume) | `float` | Smoothed volume (~200ms decay) | [Volume](/shader/audio/volume) |\n\n### Frequency Bands\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| 
[`u_audioLow`](/shader/audio/bands) | `float` | Low band energy (20–120 Hz) | [Bands](/shader/audio/bands) |\n| [`u_audioLowMid`](/shader/audio/bands) | `float` | Low-mid band energy (120–500 Hz) | [Bands](/shader/audio/bands) |\n| [`u_audioMid`](/shader/audio/bands) | `float` | Mid band energy (500–2 kHz) | [Bands](/shader/audio/bands) |\n| [`u_audioHighMid`](/shader/audio/bands) | `float` | High-mid band energy (2–6 kHz) | [Bands](/shader/audio/bands) |\n| [`u_audioHigh`](/shader/audio/bands) | `float` | High band energy (6–16 kHz) | [Bands](/shader/audio/bands) |\n| [`u_audioLowSmoothed`](/shader/audio/bands) | `float` | Smoothed low band | [Bands](/shader/audio/bands) |\n| [`u_audioLowMidSmoothed`](/shader/audio/bands) | `float` | Smoothed low-mid band | [Bands](/shader/audio/bands) |\n| [`u_audioMidSmoothed`](/shader/audio/bands) | `float` | Smoothed mid band | [Bands](/shader/audio/bands) |\n| [`u_audioHighMidSmoothed`](/shader/audio/bands) | `float` | Smoothed high-mid band | [Bands](/shader/audio/bands) |\n| [`u_audioHighSmoothed`](/shader/audio/bands) | `float` | Smoothed high band | [Bands](/shader/audio/bands) |\n\n### Beat Detection\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| [`u_audioKick`](/shader/audio/beat) | `float` | Kick beat energy | [Beat](/shader/audio/beat) |\n| [`u_audioSnare`](/shader/audio/beat) | `float` | Snare beat energy | [Beat](/shader/audio/beat) |\n| [`u_audioHat`](/shader/audio/beat) | `float` | Hi-hat beat energy | [Beat](/shader/audio/beat) |\n| [`u_audioAny`](/shader/audio/beat) | `float` | Combined beat energy | [Beat](/shader/audio/beat) |\n| [`u_audioKickSmoothed`](/shader/audio/beat) | `float` | Smoothed kick | [Beat](/shader/audio/beat) |\n| [`u_audioSnareSmoothed`](/shader/audio/beat) | `float` | Smoothed snare | [Beat](/shader/audio/beat) |\n| [`u_audioHatSmoothed`](/shader/audio/beat) | `float` | Smoothed hi-hat | [Beat](/shader/audio/beat) |\n| 
[`u_audioAnySmoothed`](/shader/audio/beat) | `float` | Smoothed combined | [Beat](/shader/audio/beat) |\n| [`u_audioKickTrigger`](/shader/audio/beat) | `bool` | Kick trigger (true for one frame) | [Beat](/shader/audio/beat) |\n| [`u_audioSnareTrigger`](/shader/audio/beat) | `bool` | Snare trigger (true for one frame) | [Beat](/shader/audio/beat) |\n| [`u_audioHatTrigger`](/shader/audio/beat) | `bool` | Hi-hat trigger (true for one frame) | [Beat](/shader/audio/beat) |\n| [`u_audioAnyTrigger`](/shader/audio/beat) | `bool` | Any beat trigger (true for one frame) | [Beat](/shader/audio/beat) |\n| [`u_audioBPM`](/shader/audio/beat) | `float` | Tracked BPM | [Beat](/shader/audio/beat) |\n| [`u_audioConfidence`](/shader/audio/beat) | `float` | Beat-tracker confidence 0–1 | [Beat](/shader/audio/beat) |\n| [`u_audioIsLocked`](/shader/audio/beat) | `bool` | Whether beat tracking is locked | [Beat](/shader/audio/beat) |\n\n### Spectral Analysis\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| [`u_audioBrightness`](/shader/audio/spectral) | `float` | Spectral brightness 0–1 | [Spectral](/shader/audio/spectral) |\n| [`u_audioFlatness`](/shader/audio/spectral) | `float` | Spectral flatness 0–1 | [Spectral](/shader/audio/spectral) |\n\n## Audio — Textures\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| [`u_audioFFT`](/shader/audio/fft) | `sampler2D` | FFT as 1D LUMINANCE strip (bin count × 1, values 0–255) | [FFT Texture](/shader/audio/fft) |\n| [`u_audioWaveform`](/shader/audio/waveform) | `sampler2D` | Time-domain waveform as 1D LUMINANCE strip (-1…1 mapped to 0–255) | [Waveform Texture](/shader/audio/waveform) |\n\n## Video\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| [`u_video`](/shader/video/basics) | `sampler2D` | Main video frame texture | [Video Basics](/shader/video/basics) |\n| [`u_videoResolution`](/shader/video/basics) | `vec2` | 
Video frame size in pixels (0,0 if disconnected) | [Video Basics](/shader/video/basics) |\n| [`u_videoFrameRate`](/shader/video/basics) | `float` | Video frame rate | [Video Basics](/shader/video/basics) |\n\n## CV — Face Detection\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| [`u_faceCount`](/shader/video/face-detection) | `int` | Number of detected faces | [Face Detection](/shader/video/face-detection) |\n| [`u_face0Bounds`](/shader/video/face-detection) | `vec4` | Bounding box (x, y, w, h) normalized 0–1 | [Face Detection](/shader/video/face-detection) |\n| [`u_face0Center`](/shader/video/face-detection) | `vec2` | Face center normalized 0–1 | [Face Detection](/shader/video/face-detection) |\n| [`u_face0HeadPose`](/shader/video/face-mesh) | `vec3` | Pitch, yaw, roll in degrees | [Face Mesh](/shader/video/face-mesh) |\n| [`u_face0Confidence`](/shader/video/face-detection) | `float` | Detection confidence | [Face Detection](/shader/video/face-detection) |\n\n### Expressions\n\n| Uniform | Type | Details |\n|---------|------|---------|\n| [`u_face0Neutral`](/shader/video/emotion-detection) | `float` | [Emotion Detection](/shader/video/emotion-detection) |\n| [`u_face0Happy`](/shader/video/emotion-detection) | `float` | [Emotion Detection](/shader/video/emotion-detection) |\n| [`u_face0Sad`](/shader/video/emotion-detection) | `float` | [Emotion Detection](/shader/video/emotion-detection) |\n| [`u_face0Angry`](/shader/video/emotion-detection) | `float` | [Emotion Detection](/shader/video/emotion-detection) |\n| [`u_face0Surprised`](/shader/video/emotion-detection) | `float` | [Emotion Detection](/shader/video/emotion-detection) |\n| [`u_face0Disgusted`](/shader/video/emotion-detection) | `float` | [Emotion Detection](/shader/video/emotion-detection) |\n| [`u_face0Fearful`](/shader/video/emotion-detection) | `float` | [Emotion Detection](/shader/video/emotion-detection) |\n\n### Blendshapes (52 uniforms)\n\nAll 
blendshapes are `float` values 0–1, following the ARKit naming convention. See [Face Mesh](/shader/video/face-mesh) for the full list and usage.\n\n| Uniform | Uniform | Uniform |\n|---------|---------|---------|\n| `u_face0BrowDownLeft` | `u_face0BrowDownRight` | `u_face0BrowInnerUp` |\n| `u_face0BrowOuterUpLeft` | `u_face0BrowOuterUpRight` | `u_face0CheekPuff` |\n| `u_face0CheekSquintLeft` | `u_face0CheekSquintRight` | `u_face0EyeBlinkLeft` |\n| `u_face0EyeBlinkRight` | `u_face0EyeLookDownLeft` | `u_face0EyeLookDownRight` |\n| `u_face0EyeLookInLeft` | `u_face0EyeLookInRight` | `u_face0EyeLookOutLeft` |\n| `u_face0EyeLookOutRight` | `u_face0EyeLookUpLeft` | `u_face0EyeLookUpRight` |\n| `u_face0EyeSquintLeft` | `u_face0EyeSquintRight` | `u_face0EyeWideLeft` |\n| `u_face0EyeWideRight` | `u_face0JawForward` | `u_face0JawLeft` |\n| `u_face0JawOpen` | `u_face0JawRight` | `u_face0MouthClose` |\n| `u_face0MouthDimpleLeft` | `u_face0MouthDimpleRight` | `u_face0MouthFrownLeft` |\n| `u_face0MouthFrownRight` | `u_face0MouthFunnel` | `u_face0MouthLeft` |\n| `u_face0MouthLowerDownLeft` | `u_face0MouthLowerDownRight` | `u_face0MouthPressLeft` |\n| `u_face0MouthPressRight` | `u_face0MouthPucker` | `u_face0MouthRight` |\n| `u_face0MouthRollLower` | `u_face0MouthRollUpper` | `u_face0MouthShrugLower` |\n| `u_face0MouthShrugUpper` | `u_face0MouthSmileLeft` | `u_face0MouthSmileRight` |\n| `u_face0MouthStretchLeft` | `u_face0MouthStretchRight` | `u_face0MouthUpperUpLeft` |\n| `u_face0MouthUpperUpRight` | `u_face0NoseSneerLeft` | `u_face0NoseSneerRight` |\n| `u_face0TongueOut` | | |\n\n## CV — Hand Tracking\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| [`u_handCount`](/shader/video/hand-tracking) | `int` | Number of detected hands (0–2) | [Hand Tracking](/shader/video/hand-tracking) |\n| [`u_leftHandPalm`](/shader/video/hand-tracking) | `vec3` | Left hand palm position | [Hand Tracking](/shader/video/hand-tracking) |\n| 
[`u_rightHandPalm`](/shader/video/hand-tracking) | `vec3` | Right hand palm position | [Hand Tracking](/shader/video/hand-tracking) |\n| [`u_leftHandConfidence`](/shader/video/hand-tracking) | `float` | Left hand confidence | [Hand Tracking](/shader/video/hand-tracking) |\n| [`u_rightHandConfidence`](/shader/video/hand-tracking) | `float` | Right hand confidence | [Hand Tracking](/shader/video/hand-tracking) |\n| [`u_leftHandBounds`](/shader/video/hand-tracking) | `vec4` | Left hand bounding box (x, y, w, h) | [Hand Tracking](/shader/video/hand-tracking) |\n| [`u_rightHandBounds`](/shader/video/hand-tracking) | `vec4` | Right hand bounding box (x, y, w, h) | [Hand Tracking](/shader/video/hand-tracking) |\n\n### Gesture Scores (per hand)\n\nAll gesture uniforms are `float` values 0–1. Replace `left` with `right` for the other hand.\n\n| Uniform | Description | Details |\n|---------|-------------|---------|\n| [`u_leftHandFist`](/shader/video/hand-tracking) | Fist gesture confidence | [Hand Tracking](/shader/video/hand-tracking) |\n| [`u_leftHandOpenPalm`](/shader/video/hand-tracking) | Open palm confidence | [Hand Tracking](/shader/video/hand-tracking) |\n| [`u_leftHandPeace`](/shader/video/hand-tracking) | Peace/V-sign confidence | [Hand Tracking](/shader/video/hand-tracking) |\n| [`u_leftHandThumbsUp`](/shader/video/hand-tracking) | Thumbs up confidence | [Hand Tracking](/shader/video/hand-tracking) |\n| [`u_leftHandThumbsDown`](/shader/video/hand-tracking) | Thumbs down confidence | [Hand Tracking](/shader/video/hand-tracking) |\n| [`u_leftHandPointing`](/shader/video/hand-tracking) | Pointing confidence | [Hand Tracking](/shader/video/hand-tracking) |\n| [`u_leftHandILoveYou`](/shader/video/hand-tracking) | I Love You sign confidence | [Hand Tracking](/shader/video/hand-tracking) |\n\n## CV — Pose Detection\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| [`u_poseDetected`](/shader/video/pose-detection) | `bool` | 
Whether a body pose is detected | [Pose Detection](/shader/video/pose-detection) |\n| [`u_poseConfidence`](/shader/video/pose-detection) | `float` | Pose detection confidence | [Pose Detection](/shader/video/pose-detection) |\n| [`u_nosePosition`](/shader/video/pose-detection) | `vec2` | Nose landmark position | [Pose Detection](/shader/video/pose-detection) |\n| [`u_leftShoulderPosition`](/shader/video/pose-detection) | `vec2` | Left shoulder position | [Pose Detection](/shader/video/pose-detection) |\n| [`u_rightShoulderPosition`](/shader/video/pose-detection) | `vec2` | Right shoulder position | [Pose Detection](/shader/video/pose-detection) |\n| [`u_leftElbowPosition`](/shader/video/pose-detection) | `vec2` | Left elbow position | [Pose Detection](/shader/video/pose-detection) |\n| [`u_rightElbowPosition`](/shader/video/pose-detection) | `vec2` | Right elbow position | [Pose Detection](/shader/video/pose-detection) |\n| [`u_leftWristPosition`](/shader/video/pose-detection) | `vec2` | Left wrist position | [Pose Detection](/shader/video/pose-detection) |\n| [`u_rightWristPosition`](/shader/video/pose-detection) | `vec2` | Right wrist position | [Pose Detection](/shader/video/pose-detection) |\n| [`u_leftHipPosition`](/shader/video/pose-detection) | `vec2` | Left hip position | [Pose Detection](/shader/video/pose-detection) |\n| [`u_rightHipPosition`](/shader/video/pose-detection) | `vec2` | Right hip position | [Pose Detection](/shader/video/pose-detection) |\n| [`u_leftKneePosition`](/shader/video/pose-detection) | `vec2` | Left knee position | [Pose Detection](/shader/video/pose-detection) |\n| [`u_rightKneePosition`](/shader/video/pose-detection) | `vec2` | Right knee position | [Pose Detection](/shader/video/pose-detection) |\n| [`u_leftAnklePosition`](/shader/video/pose-detection) | `vec2` | Left ankle position | [Pose Detection](/shader/video/pose-detection) |\n| [`u_rightAnklePosition`](/shader/video/pose-detection) | `vec2` | Right ankle position | [Pose 
Detection](/shader/video/pose-detection) |\n\n## CV — Body Segmentation\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| [`u_segmentationMask`](/shader/video/body-segmentation) | `sampler2D` | Body mask (LUMINANCE: 0 = background, 1 = person) | [Segmentation](/shader/video/body-segmentation) |\n| [`u_segmentationRes`](/shader/video/body-segmentation) | `vec2` | Mask dimensions in pixels | [Segmentation](/shader/video/body-segmentation) |\n\n## Device Sensors\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| [`u_deviceAcceleration`](/shader/sensors) | `vec3` | Acceleration without gravity (m/s²) | [Sensor Uniforms](/shader/sensors) |\n| [`u_deviceAccelerationGravity`](/shader/sensors) | `vec3` | Acceleration with gravity (m/s²) | [Sensor Uniforms](/shader/sensors) |\n| [`u_deviceRotationRate`](/shader/sensors) | `vec3` | Gyroscope: alpha, beta, gamma (deg/s) | [Sensor Uniforms](/shader/sensors) |\n| [`u_deviceOrientation`](/shader/sensors) | `vec3` | Orientation: alpha, beta, gamma (degrees) | [Sensor Uniforms](/shader/sensors) |\n| [`u_deviceOrientationAbsolute`](/shader/sensors) | `bool` | Whether orientation is magnetometer-based | [Sensor Uniforms](/shader/sensors) |\n\n## External Devices — Video\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| [`u_deviceCount`](/shader/external-devices) | `int` | Number of devices with active cameras (0–8) | [Overview](/shader/external-devices) |\n| [`u_device0`](/shader/external-devices/video) – `u_device7` | `sampler2D` | Device camera frame texture | [Video Textures](/shader/external-devices/video) |\n| [`u_device0Resolution`](/shader/external-devices/video) – `u_device7Resolution` | `vec2` | Device camera frame size | [Video Textures](/shader/external-devices/video) |\n| [`u_device0Connected`](/shader/external-devices/video) – `u_device7Connected` | `bool` | Whether device camera is active | 
[Video Textures](/shader/external-devices/video) |\n\n> [!NOTE]\n> Device audio is accessed through the `u_audioStream{i}*` uniforms (the same pool as additional audio streams; the host manages the index range).\n\n## External Devices — Sensors\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| [`u_externalDeviceCount`](/shader/external-devices) | `int` | Number of connected external devices (0–8) | [Overview](/shader/external-devices) |\n| [`u_device0Acceleration`](/shader/external-devices/sensors) – `u_device7Acceleration` | `vec3` | Per-device acceleration without gravity | [Sensor Uniforms](/shader/external-devices/sensors) |\n| [`u_device0AccelerationGravity`](/shader/external-devices/sensors) – `u_device7AccelerationGravity` | `vec3` | Per-device acceleration with gravity | [Sensor Uniforms](/shader/external-devices/sensors) |\n| [`u_device0RotationRate`](/shader/external-devices/sensors) – `u_device7RotationRate` | `vec3` | Per-device rotation rate | [Sensor Uniforms](/shader/external-devices/sensors) |\n| [`u_device0Orientation`](/shader/external-devices/sensors) – `u_device7Orientation` | `vec3` | Per-device orientation angles | [Sensor Uniforms](/shader/external-devices/sensors) |\n\n> [!NOTE]\n> `u_device{i}` (sampler2D) is the **camera texture** for device slot `i`. 
`u_device{i}Acceleration` and similar are the **IMU sensors** for the same device — different data, same index.\n\n## Streams (Compositor)\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_videoStreamCount` | `int` | Number of active streams (0–8) |\n| `u_videoStream0` – `u_videoStream7` | `sampler2D` | Stream frame textures |\n| `u_videoStream0Resolution` – `u_videoStream7Resolution` | `vec2` | Stream frame sizes in pixels |\n| `u_videoStream0Connected` – `u_videoStream7Connected` | `bool` | Whether stream has an active frame |\n\nStreams are additional video sources injected by the host application — they are used internally by Viji's compositor for mixing multiple scenes together. When no streams are provided, `u_videoStreamCount` is `0` and the textures sample as black. Each stream works the same way as [`u_video`](/shader/video/basics).\n\n## Audio Streams\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_audioStreamCount` | `int` | Number of active audio streams (0–8) |\n| `u_audioStream0Connected` – `u_audioStream7Connected` | `bool` | Whether stream is actively providing audio |\n| `u_audioStream0Volume` – `u_audioStream7Volume` | `float` | Stream volume (0–1) |\n| `u_audioStream0Low` – `u_audioStream7Low` | `float` | Low frequency band energy (0–1) |\n| `u_audioStream0LowMid` – `u_audioStream7LowMid` | `float` | Low-mid band energy (0–1) |\n| `u_audioStream0Mid` – `u_audioStream7Mid` | `float` | Mid band energy (0–1) |\n| `u_audioStream0HighMid` – `u_audioStream7HighMid` | `float` | High-mid band energy (0–1) |\n| `u_audioStream0High` – `u_audioStream7High` | `float` | High band energy (0–1) |\n| `u_audioStream0Brightness` – `u_audioStream7Brightness` | `float` | Spectral brightness (0–1) |\n| `u_audioStream0Flatness` – `u_audioStream7Flatness` | `float` | Spectral flatness (0–1) |\n\nAdditional audio streams provide lightweight scalar uniforms only — **no** FFT or waveform textures. 
Beat detection, BPM, and onset uniforms are only available on the main audio stream (`u_audioVolume`, `u_audioBPM`, etc.).\n\nWhen no audio streams are provided, `u_audioStreamCount` is `0` and all per-stream uniforms default to `0.0` / `false`.\n\n## Backbuffer\n\n| Uniform | Type | Description | Details |\n|---------|------|-------------|---------|\n| [`backbuffer`](/shader/backbuffer) | `sampler2D` | Previous frame texture (feedback effects) | [Backbuffer & Feedback](/shader/backbuffer) |\n\n> [!WARNING]\n> `backbuffer` is **conditional** — it is only injected if the word `backbuffer` appears anywhere in your shader source (including comments). It has no `u_` prefix. See [Backbuffer & Feedback](/shader/backbuffer) for details.\n\n## Parameter Directives\n\nDeclare parameters using `// @viji-TYPE:uniformName key:value ...` comments. Each directive generates a uniform and a UI control in the host.\n\n| Directive | Uniform Type | UI Control | Details |\n|-----------|-------------|------------|---------|\n| [`@viji-slider`](/shader/parameters/slider) | `float` | Numeric slider | [Slider](/shader/parameters/slider) |\n| [`@viji-number`](/shader/parameters/number) | `float` | Numeric input | [Number](/shader/parameters/number) |\n| [`@viji-color`](/shader/parameters/color) | `vec3` | Color picker (hex → RGB 0–1) | [Color](/shader/parameters/color) |\n| [`@viji-toggle`](/shader/parameters/toggle) | `bool` | On/off switch | [Toggle](/shader/parameters/toggle) |\n| [`@viji-select`](/shader/parameters/select) | `int` | Dropdown (0-based option index) | [Select](/shader/parameters/select) |\n| [`@viji-image`](/shader/parameters/image) | `sampler2D` | Image upload | [Image](/shader/parameters/image) |\n| [`@viji-button`](/shader/parameters/button) | `bool` | Momentary button (true for one frame) | [Button](/shader/parameters/button) |\n| [`@viji-accumulator`](/shader/parameters/accumulator) | `float` | CPU-side: `+= rate × deltaTime` | 
[Accumulator](/shader/parameters/accumulator) |\n\nSee [Parameters Overview](/shader/parameters) for syntax, and [Grouping](/shader/parameters/grouping) and [Categories](/shader/parameters/categories) for organization.\n\n## Related\n\n- [Shader Basics](/shader/basics) — auto-injection, GLSL versions, `@renderer shader`\n- [Shader Quick Start](/shader/quickstart) — getting started with shader scenes\n- [Best Practices](/getting-started/best-practices) — essential patterns for all renderers\n- [Native API Reference](/native/api-reference) — JavaScript API for the Native renderer\n- [P5 API Reference](/p5/api-reference) — JavaScript API for the P5 renderer"
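The parameter-directive mechanism described in the API reference above can be sketched in a few lines. This is an illustrative example only: the `min`/`max`/`default` keys and the `u_threshold` name are assumptions for illustration, not copied from the directive pages; the Slider directive page documents the actual key set.

```glsl
// @renderer shader

// Illustrative slider directive — generates a float uniform plus a UI
// slider in the host. Key names below are assumed for this sketch.
// @viji-slider:u_threshold min:0.0 max:1.0 default:0.5

void main() {
    vec2 uv = gl_FragCoord.xy / u_resolution;
    // Hard threshold on the horizontal axis, controlled by the slider.
    vec3 col = uv.x < u_threshold ? vec3(1.0) : vec3(0.1);
    gl_FragColor = vec4(col, 1.0);
}
```

Note that `u_threshold` is never declared in the source: per the auto-injection behavior described in Shader Basics, the uniform generated by the directive is injected by Viji.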
2883
2926
  }
2884
2927
  ]
2885
2928
  },
@@ -3228,7 +3271,7 @@ export const docsApi = {
3228
3271
  },
3229
3272
  {
3230
3273
  "type": "text",
3231
- "markdown": "## Related\n\n- [Volume](volume/)\n- [Frequency Bands](bands/)\n- [Beat Detection](beat/)\n- [Spectral Analysis](spectral/)\n- [FFT Texture](fft/)\n- [Waveform Texture](waveform/)\n- [Native Audio](/native/audio)\n- [P5 Audio](/p5/audio)"
3274
+ "markdown": "## Additional Audio Stream Uniforms\n\nBeyond the main audio uniforms (`u_audioVolume`, `u_audioBPM`, etc.), additional audio streams provide per-stream scalar uniforms:\n\n- `u_audioStreamCount` — number of active additional audio streams\n- `u_audioStream{i}Volume`, `u_audioStream{i}Low`, etc. — per-stream analysis values\n\nThese are **lightweight** — no FFT textures (`u_audioFFT`) or waveform textures (`u_audioWaveform`) per stream. Beat detection and BPM uniforms are only available for the main audio source.\n\nSee [Audio Streams](/shader/api-reference#audio-streams) in the API reference for the complete uniform list.\n\n## Related\n\n- [Volume](volume/)\n- [Frequency Bands](bands/)\n- [Beat Detection](beat/)\n- [Spectral Analysis](spectral/)\n- [FFT Texture](fft/)\n- [Waveform Texture](waveform/)\n- [Native Audio](/native/audio)\n- [P5 Audio](/p5/audio)"
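The main-versus-additional-stream split described above can be made concrete with a minimal fragment-shader sketch. It assumes the main `u_audioVolume` uniform and the per-stream scalars listed in the API reference; the color constants and weights are arbitrary:

```glsl
// @renderer shader

void main() {
    // Full analysis (beat, BPM, FFT textures) exists only for the main
    // source; here we just use its volume for overall brightness.
    float base = u_audioVolume * 0.8;

    // Additional streams expose lightweight scalars only — always gate
    // reads on the Connected flag.
    float extra = u_audioStream0Connected ? u_audioStream0Volume : 0.0;

    // Tint the scene by the first additional stream's level.
    vec3 col = vec3(base) + vec3(0.2, 0.4, 0.8) * extra;
    gl_FragColor = vec4(col, 1.0);
}
```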
3232
3275
  }
3233
3276
  ]
3234
3277
  },
@@ -3249,7 +3292,7 @@ export const docsApi = {
3249
3292
  },
3250
3293
  {
3251
3294
  "type": "text",
3252
- "markdown": "## Related\n\n- [Audio Uniforms Overview](../)\n- [Frequency Bands](../bands/)\n- [Beat Detection](../beat/)"
3295
+ "markdown": "> [!NOTE]\n> Additional audio streams provide per-stream volume via `u_audioStream{i}Volume` uniforms (i=0–7).\n\n## Related\n\n- [Audio Uniforms Overview](../)\n- [Frequency Bands](../bands/)\n- [Beat Detection](../beat/)"
3253
3296
  }
3254
3297
  ]
3255
3298
  },
@@ -3270,7 +3313,7 @@ export const docsApi = {
3270
3313
  },
3271
3314
  {
3272
3315
  "type": "text",
3273
- "markdown": "## Related\n\n- [Audio Uniforms Overview](../)\n- [Volume](../volume/)\n- [Beat Detection](../beat/)\n- [FFT Texture](../fft/)\n- [Spectral Analysis](../spectral/)"
3316
+ "markdown": "> [!NOTE]\n> Additional audio streams provide per-stream band values via `u_audioStream{i}Low`, `u_audioStream{i}LowMid`, `u_audioStream{i}Mid`, `u_audioStream{i}HighMid`, `u_audioStream{i}High` uniforms.\n\n## Related\n\n- [Audio Uniforms Overview](../)\n- [Volume](../volume/)\n- [Beat Detection](../beat/)\n- [FFT Texture](../fft/)\n- [Spectral Analysis](../spectral/)"
3274
3317
  }
3275
3318
  ]
3276
3319
  },
@@ -3291,7 +3334,7 @@ export const docsApi = {
3291
3334
  },
3292
3335
  {
3293
3336
  "type": "text",
3294
- "markdown": "## Related\n\n- [Audio Uniforms Overview](../)\n- [Volume](../volume/)\n- [Frequency Bands](../bands/)\n- [Spectral Analysis](../spectral/)"
3337
+ "markdown": "> [!NOTE]\n> Beat and BPM uniforms (`u_audioKick`, `u_audioBPM`, etc.) are only available for the main audio source. Additional audio stream uniforms (`u_audioStream{i}*`) do not include beat data.\n\n## Related\n\n- [Audio Uniforms Overview](../)\n- [Volume](../volume/)\n- [Frequency Bands](../bands/)\n- [Spectral Analysis](../spectral/)"
3295
3338
  }
3296
3339
  ]
3297
3340
  },
@@ -3312,7 +3355,7 @@ export const docsApi = {
3312
3355
  },
3313
3356
  {
3314
3357
  "type": "text",
3315
- "markdown": "## Related\n\n- [Audio Uniforms Overview](../)\n- [Frequency Bands](../bands/)\n- [FFT Texture](../fft/)\n- [Volume](../volume/)"
3358
+ "markdown": "> [!NOTE]\n> Additional audio streams provide per-stream spectral values via `u_audioStream{i}Brightness` and `u_audioStream{i}Flatness` uniforms.\n\n## Related\n\n- [Audio Uniforms Overview](../)\n- [Frequency Bands](../bands/)\n- [FFT Texture](../fft/)\n- [Volume](../volume/)"
3316
3359
  }
3317
3360
  ]
3318
3361
  },
@@ -3333,7 +3376,7 @@ export const docsApi = {
3333
3376
  },
3334
3377
  {
3335
3378
  "type": "text",
3336
- "markdown": "## Related\n\n- [Audio Uniforms Overview](../)\n- [Frequency Bands](../bands/)\n- [Waveform Texture](../waveform/)\n- [Spectral Analysis](../spectral/)"
3379
+ "markdown": "> [!NOTE]\n> The `u_audioFFT` texture is only available for the main audio source. Additional audio streams (`u_audioStream{i}*`) provide scalar band values only, not per-bin FFT data.\n\n## Related\n\n- [Audio Uniforms Overview](../)\n- [Frequency Bands](../bands/)\n- [Waveform Texture](../waveform/)\n- [Spectral Analysis](../spectral/)"
3337
3380
  }
3338
3381
  ]
3339
3382
  },
@@ -3354,7 +3397,7 @@ export const docsApi = {
3354
3397
  },
3355
3398
  {
3356
3399
  "type": "text",
3357
- "markdown": "## Related\n\n- [Audio Uniforms Overview](../)\n- [FFT Texture](../fft/)\n- [Volume](../volume/)\n- [Frequency Bands](../bands/)"
3400
+ "markdown": "> [!NOTE]\n> The `u_audioWaveform` texture is only available for the main audio source. Additional audio streams do not have per-stream waveform textures.\n\n## Related\n\n- [Audio Uniforms Overview](../)\n- [FFT Texture](../fft/)\n- [Volume](../volume/)\n- [Frequency Bands](../bands/)"
3358
3401
  }
3359
3402
  ]
3360
3403
  },
@@ -3673,11 +3716,11 @@ export const docsApi = {
3673
3716
  "shader-ext-overview": {
3674
3717
  "id": "shader-ext-overview",
3675
3718
  "title": "External Device Uniforms — Overview",
3676
- "description": "GLSL uniforms for accessing external devices — video textures, sensor data, and connection status from connected hardware.",
3719
+ "description": "GLSL uniforms for accessing external devices — video textures, sensor data, device audio via the shared audio-stream uniform pool, and connection status from connected hardware.",
3677
3720
  "content": [
3678
3721
  {
3679
3722
  "type": "text",
3680
- "markdown": "# External Device Uniforms\n\nThe external device uniforms expose video textures and sensor data from devices connected to your installation (phones, tablets, or other hardware).\n\n> [!NOTE]\n> External devices are managed by the host. Devices appear and disappear dynamically. Use the count uniforms to determine how many devices are available each frame.\n\n## Two Separate Count Uniforms\n\nExternal devices have **two distinct count uniforms** that may differ in value:\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_deviceCount` | `int` | Number of devices with **active camera streams** (0–8) |\n| `u_externalDeviceCount` | `int` | Number of **connected external devices** (0–8) |\n\nA device may be connected (counted in `u_externalDeviceCount`) but have no camera (not counted in `u_deviceCount`). Always use the appropriate count for the data you're accessing.\n\n## Uniform Summary\n\n### Video Uniforms (per device, indices 0–7)\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_device{i}` | `sampler2D` | Device camera texture |\n| `u_device{i}Resolution` | `vec2` | Camera resolution in pixels |\n| `u_device{i}Connected` | `bool` | `true` when camera stream is active |\n\nSee [Video Textures](video/) for full usage details.\n\n### Sensor Uniforms (per device, indices 0–7)\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_device{i}Acceleration` | `vec3` | Acceleration without gravity (m/s²) |\n| `u_device{i}AccelerationGravity` | `vec3` | Acceleration with gravity (m/s²) |\n| `u_device{i}RotationRate` | `vec3` | Rotation rate (deg/s) |\n| `u_device{i}Orientation` | `vec3` | Orientation (alpha, beta, gamma degrees) |\n\nSee [Sensor Uniforms](sensors/) for full usage details.\n\n## Default Values\n\nWhen no external devices are connected:\n- `u_deviceCount` → `0`\n- `u_externalDeviceCount` → `0`\n- All per-device uniforms → `vec2(0.0)` / `vec3(0.0)` / `false`\n- Disconnected 
device textures sample as black\n\n## Basic Example"
3723
+ "markdown": "# External Device Uniforms\n\nThe external device uniforms expose video textures, sensor data, and (through the shared `u_audioStream{i}*` pool) lightweight audio analysis from devices connected to your installation (phones, tablets, or other hardware).\n\n> [!NOTE]\n> External devices are managed by the host. Devices appear and disappear dynamically. Use the count uniforms to determine how many devices are available each frame.\n\n## Two Separate Count Uniforms\n\nExternal devices have **two distinct count uniforms** that may differ in value:\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_deviceCount` | `int` | Number of devices with **active camera streams** (0–8) |\n| `u_externalDeviceCount` | `int` | Number of **connected external devices** (0–8) |\n\nA device may be connected (counted in `u_externalDeviceCount`) but have no camera (not counted in `u_deviceCount`). Always use the appropriate count for the data you're accessing.\n\n## Uniform Summary\n\n### Video Uniforms (per device, indices 0–7)\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_device{i}` | `sampler2D` | Device camera texture |\n| `u_device{i}Resolution` | `vec2` | Camera resolution in pixels |\n| `u_device{i}Connected` | `bool` | `true` when camera stream is active |\n\nSee [Video Textures](video/) for full usage details.\n\n### Sensor Uniforms (per device, indices 0–7)\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_device{i}Acceleration` | `vec3` | Acceleration without gravity (m/s²) |\n| `u_device{i}AccelerationGravity` | `vec3` | Acceleration with gravity (m/s²) |\n| `u_device{i}RotationRate` | `vec3` | Rotation rate (deg/s) |\n| `u_device{i}Orientation` | `vec3` | Orientation (alpha, beta, gamma degrees) |\n\nSee [Sensor Uniforms](sensors/) for full usage details.\n\n## Default Values\n\nWhen no external devices are connected:\n- `u_deviceCount` → `0`\n- `u_externalDeviceCount` → `0`\n- All 
per-device uniforms → `vec2(0.0)` / `vec3(0.0)` / `false`\n- Disconnected device textures sample as black\n\n## Basic Example"
3681
3724
  },
3682
3725
  {
3683
3726
  "type": "live-example",
@@ -3742,6 +3785,17 @@ export const docsApi = {
3742
3785
  }
3743
3786
  ]
3744
3787
  },
3788
+ "shader-ext-audio": {
3789
+ "id": "shader-ext-audio",
3790
+ "title": "Audio Uniforms",
3791
+ "description": "GLSL uniforms for audio analysis from externally connected devices — volume, frequency bands, and spectral features.",
3792
+ "content": [
3793
+ {
3794
+ "type": "text",
3795
+ "markdown": "# Device Audio Uniforms\n\nDevice audio from externally connected devices is accessed through the same `u_audioStream{i}*` uniform pool as additional audio streams. The host automatically maps each device's audio to a stream index.\n\n> [!NOTE]\n> Device audio provides **lightweight scalar uniforms** only — volume, frequency bands, and spectral features. Beat detection, BPM, and onset uniforms (`u_audioKick`, `u_audioBPM`, etc.) are only available for the main audio source. There are no per-stream FFT or waveform textures.\n\n## Uniform Reference\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_audioStreamCount` | `int` | Total number of active audio streams, including device audio (0–8) |\n\n### Per-Stream Uniforms (indices 0–7)\n\n| Uniform | Type | Description |\n|---------|------|-------------|\n| `u_audioStream{i}Connected` | `bool` | `true` when stream is actively providing audio |\n| `u_audioStream{i}Volume` | `float` | Audio volume (0–1) |\n| `u_audioStream{i}Low` | `float` | Low frequency band energy (0–1) |\n| `u_audioStream{i}LowMid` | `float` | Low-mid band energy (0–1) |\n| `u_audioStream{i}Mid` | `float` | Mid band energy (0–1) |\n| `u_audioStream{i}HighMid` | `float` | High-mid band energy (0–1) |\n| `u_audioStream{i}High` | `float` | High band energy (0–1) |\n| `u_audioStream{i}Brightness` | `float` | Spectral brightness (0–1) |\n| `u_audioStream{i}Flatness` | `float` | Spectral flatness (0–1) |\n\nAll 8 slots are always declared. 
Use `u_audioStream{i}Connected` to check availability before reading values.\n\n## Default Values\n\n- `u_audioStreamCount` → `0` when no audio streams are connected\n- `u_audioStream{i}Connected` → `false` for unused slots\n- All `float` uniforms → `0.0` for unused or disconnected slots\n\n## Common Patterns\n\n### React to First Device's Audio Volume\n\n```glsl\n// @renderer shader\n\nvoid main() {\n vec2 uv = gl_FragCoord.xy / u_resolution;\n\n float brightness = 0.05;\n if (u_audioStream0Connected) {\n brightness += u_audioStream0Volume * 0.8;\n }\n\n vec3 col = vec3(brightness);\n gl_FragColor = vec4(col, 1.0);\n}\n```\n\n### Color-Mapped Frequency Bands from Device Audio\n\n```glsl\n// @renderer shader\n\nvoid main() {\n vec2 uv = gl_FragCoord.xy / u_resolution;\n vec3 col = vec3(0.04);\n\n if (u_audioStream0Connected) {\n float low = u_audioStream0Low;\n float mid = u_audioStream0Mid;\n float high = u_audioStream0High;\n\n col = vec3(low * 0.8, mid * 0.6, high * 0.9);\n col += 0.05;\n }\n\n gl_FragColor = vec4(col, 1.0);\n}\n```\n\n### Multi-Device Audio Visualizer\n\n```glsl\n// @renderer shader\n\nfloat getStreamVolume(int idx) {\n if (idx == 0 && u_audioStream0Connected) return u_audioStream0Volume;\n if (idx == 1 && u_audioStream1Connected) return u_audioStream1Volume;\n if (idx == 2 && u_audioStream2Connected) return u_audioStream2Volume;\n if (idx == 3 && u_audioStream3Connected) return u_audioStream3Volume;\n return 0.0;\n}\n\nvoid main() {\n vec2 uv = gl_FragCoord.xy / u_resolution;\n vec3 col = vec3(0.04);\n\n int count = u_audioStreamCount;\n if (count > 0) {\n float barWidth = 1.0 / float(count);\n int idx = int(floor(uv.x / barWidth));\n float vol = getStreamVolume(idx);\n\n if (uv.y < vol) {\n float hue = float(idx) / float(count);\n col = vec3(hue, 0.7, 0.9);\n }\n }\n\n gl_FragColor = vec4(col, 1.0);\n}\n```\n\n## Related\n\n- [External Device Uniforms — Overview](../) — count uniforms and naming conventions\n- [External Device Video 
Textures](../video/) — camera textures from connected devices\n- [External Device Sensor Uniforms](../sensors/) — accelerometer and orientation from connected devices\n- [Audio Uniforms — Overview](../../audio/) — main audio stream uniforms (`u_audioVolume`, `u_audioBPM`, etc.)\n- [Audio Streams](/shader/api-reference#audio-streams) — complete audio stream uniform reference\n- [Native Device Audio](/native/external-devices/audio) — full JavaScript API\n- [P5 Device Audio](/p5/external-devices/audio) — full JavaScript API in P5 renderer"
3796
+ }
3797
+ ]
3798
+ },
3745
3799
  "shader-backbuffer": {
3746
3800
  "id": "shader-backbuffer",
3747
3801
  "title": "Backbuffer & Feedback",