@viji-dev/core 0.3.21 → 0.3.23

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
# Viji Core Package (`@viji-dev/core`)

**Universal execution engine for Viji Creative scenes**

A powerful, secure, and feature-rich JavaScript/TypeScript library that provides the foundation for creative scene execution across all Viji platform contexts. The core offers the same IFrame + WebWorker execution model in every context, with comprehensive parameter management, audio/video analysis, user interaction handling, and performance optimization.

## 🚀 Features

### ✅ **Core Execution Engine**
- **Secure IFrame + WebWorker Architecture**: Complete isolation with controlled communication
- **Multi-Instance Support**: Concurrent instances for main scenes and previews
- **Automatic Resource Management**: Memory leak prevention and cleanup

### ✅ **Parameter System**
- **Declarative Parameter Definition**: Define parameters once with automatic UI generation
- **Proxy-Based Access**: Fast parameter access in render loops
- **Category-Based Organization**: Audio, video, interaction, and general parameters
- **Real-time Validation**: Type safety and range checking
- **Capability-Aware UI**: Parameters shown based on active features

### ✅ **Audio Analysis**
- **Real-time Audio Processing**: Volume, frequency analysis, and beat detection
- **Custom Frequency Bands**: Bass, mid, treble, and custom band analysis
- **Multiple Input Sources**: Microphone, audio files, and screen capture
- **Audio-Reactive Scenes**: Make scenes respond to audio input

### ✅ **Video Analysis**
- **Real-time Video Processing**: Frame analysis in a separate WebWorker
- **Multiple Input Sources**: Camera, video files, and screen capture
- **Video-Reactive Scenes**: Make scenes respond to video motion and brightness
- **Frame Data Access**: Raw video frame data for custom analysis

### ✅ **User Interaction**
- **Mouse Tracking**: Position, buttons, movement, and scroll wheel
- **Keyboard Input**: Key states, modifiers, and event handling
- **Touch Support**: Multi-touch with gesture detection
- **Device Sensors**: Motion, orientation, and geolocation (internal + external devices)
- **Canvas-Coordinate Mapping**: Accurate input positioning

### ✅ **Performance Optimization**
- **Configurable Frame Rates**: Full (60fps) or half (30fps) modes
- **Resolution Scaling**: Fractional or explicit canvas dimensions
- **Adaptive Performance**: Automatic optimization based on hardware
- **Memory Management**: Efficient resource pooling and cleanup

## 📦 Installation

```bash
npm install @viji-dev/core
```

## 🎯 Quick Start

### Basic Scene Creation

```typescript
import { VijiCore } from '@viji-dev/core';

// Artist scene code
const sceneCode = `
  // Define parameters using helper functions
  const color = viji.color('#ff6b6b', {
    label: 'Shape Color',
    description: 'Color of the animated shape',
    group: 'appearance'
  });

  const size = viji.slider(50, {
    min: 10,
    max: 150,
    step: 5,
    label: 'Shape Size',
    description: 'Size of the animated shape',
    group: 'appearance'
  });

  const speed = viji.slider(1.0, {
    min: 0.1,
    max: 3.0,
    step: 0.1,
    label: 'Animation Speed',
    description: 'Speed of the animation',
    group: 'animation'
  });

  // Main render function
  function render(viji) {
    const ctx = viji.useContext('2d');

    // Clear canvas
    ctx.fillStyle = '#2c3e50';
    ctx.fillRect(0, 0, viji.width, viji.height);

    // Animated shape
    const time = viji.time * speed.value;
    const x = viji.width / 2 + Math.sin(time) * 100;
    const y = viji.height / 2 + Math.cos(time) * 100;

    ctx.fillStyle = color.value;
    ctx.beginPath();
    ctx.arc(x, y, size.value / 2, 0, Math.PI * 2);
    ctx.fill();
  }
`;

// Create core instance
const core = new VijiCore({
  hostContainer: document.getElementById('scene-container'),
  sceneCode: sceneCode,
  frameRateMode: 'full',
  allowUserInteraction: true
});

// Initialize and start rendering
await core.initialize();
console.log('Scene is running!');
```

## 🔧 Integration API

### Core Configuration

The `VijiCoreConfig` interface defines all available configuration options:

```typescript
interface VijiCoreConfig {
  // Required configuration
  hostContainer: HTMLElement;       // Container element for the scene
  sceneCode: string;                // Artist JavaScript code with render function

  // Performance configuration
  frameRateMode?: 'full' | 'half';  // 'full' = 60fps, 'half' = 30fps

  // Input streams
  audioStream?: MediaStream;        // Audio input for analysis
  videoStream?: MediaStream;        // Video input (main, CV enabled)
  videoStreams?: MediaStream[];     // Additional video streams (no CV)

  // Feature toggles
  noInputs?: boolean;               // Disable all input processing
  allowUserInteraction?: boolean;   // Enable mouse/keyboard/touch events
  allowDeviceInteraction?: boolean; // Enable device sensors (motion/orientation/geolocation)
}
```

### Instance Management

#### Creation and Initialization

```typescript
// Create core instance
const core = new VijiCore({
  hostContainer: document.getElementById('scene-container'),
  sceneCode: sceneCode,
  frameRateMode: 'full',
  allowUserInteraction: true
});

// Initialize the core (required before use)
await core.initialize();

// Check if core is ready for operations
if (core.ready) {
  console.log('Core is ready for use');
}

// Get current configuration
const config = core.configuration;
console.log('Current frame rate mode:', config.frameRateMode);
```

#### Performance Control

```typescript
// Frame rate control
await core.setFrameRate('full'); // Set to 60fps mode
await core.setFrameRate('half'); // Set to 30fps mode

// Resolution control
await core.setResolution(0.75);  // Set to 75% of container size
await core.setResolution(0.5);   // Set to 50% for performance
await core.updateResolution();   // Auto-detect container size changes

// Get performance statistics
const stats = core.getStats();
console.log('Current FPS:', stats.frameRate.effectiveRefreshRate);
console.log('Canvas size:', stats.resolution);
console.log('Scale factor:', stats.scale);
console.log('Parameter count:', stats.parameterCount);
```
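The feature list advertises adaptive performance; the core's built-in heuristics are not documented here, but a host can build its own adaptation loop on top of `getStats()`, `setFrameRate()`, and `setResolution()`. The sketch below is a hypothetical policy, not part of the API; the FPS thresholds and scale steps are illustrative assumptions:

```typescript
// Hypothetical adaptation policy: derive a frame-rate mode and resolution
// scale from a measured FPS. All thresholds here are assumptions.
type PerfSettings = { frameRateMode: 'full' | 'half'; scale: number };

function adaptSettings(measuredFps: number, current: PerfSettings): PerfSettings {
  if (measuredFps < 25) {
    // Struggling badly: halve the frame rate first, then shrink the canvas.
    if (current.frameRateMode === 'full') {
      return { frameRateMode: 'half', scale: current.scale };
    }
    return { frameRateMode: 'half', scale: Math.max(0.5, current.scale - 0.25) };
  }
  if (measuredFps > 55 && current.scale < 1) {
    // Plenty of headroom: restore resolution before anything else.
    return { frameRateMode: current.frameRateMode, scale: Math.min(1, current.scale + 0.25) };
  }
  return current; // stable: change nothing
}

console.log(adaptSettings(20, { frameRateMode: 'full', scale: 1 }));
// → { frameRateMode: 'half', scale: 1 }
```

Run periodically, this would feed back into the core, e.g. `await core.setFrameRate(next.frameRateMode); await core.setResolution(next.scale);` with `measuredFps` taken from `core.getStats().frameRate.effectiveRefreshRate`.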

#### Debug and Development

```typescript
// Enable debug logging
core.setDebugMode(true);

// Check debug mode status
const isDebugEnabled = core.getDebugMode();

// Debug mode provides detailed logging for:
// - Initialization process
// - Communication between components
// - Parameter system operations
// - Audio/video stream processing
// - Performance statistics
```

### Parameter Management

The parameter system provides a powerful way to create interactive scenes with automatic UI generation.

#### Parameter Definition and Access

```typescript
// Listen for parameter definitions from artist code
core.onParametersDefined((groups) => {
  console.log('Parameters available:', groups);

  // Each group contains:
  // - groupName: string
  // - category: 'audio' | 'video' | 'interaction' | 'general'
  // - description: string
  // - parameters: Record<string, ParameterDefinition>

  // Generate UI based on parameter groups
  generateParameterUI(groups);
});

// Set individual parameter values
await core.setParameter('color', '#ff0000');
await core.setParameter('size', 75);
await core.setParameter('enabled', true);

// Set multiple parameters efficiently
await core.setParameters({
  'color': '#00ff00',
  'size': 100,
  'speed': 2.0,
  'enabled': false
});

// Get current parameter values
const values = core.getParameterValues();
const color = core.getParameter('color');

// Listen for parameter changes
core.onParameterChange('size', (value) => {
  console.log('Size parameter changed to:', value);
});

// Listen for parameter errors
core.onParameterError((error) => {
  console.error('Parameter error:', error.message);
  console.error('Error code:', error.code);
});
```
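`generateParameterUI` above is a host-side helper you would write yourself, not part of the core API. A minimal sketch of what it might do, assuming each group carries the `groupName`, `category`, `description`, and `parameters` fields listed in the comments (the `ParameterDefinition` shape used here, `type`/`label`/`default`, is an assumption for illustration):

```typescript
// Hypothetical shapes, inferred from the comments in the snippet above.
interface ParameterDefinition { type: string; label?: string; default?: unknown }
interface ParameterGroup {
  groupName: string;
  category: 'audio' | 'video' | 'interaction' | 'general';
  description: string;
  parameters: Record<string, ParameterDefinition>;
}

// Flatten groups into a list of UI control descriptors. A real host would
// map each descriptor onto a widget (slider, color picker, toggle, ...).
function generateParameterUI(groups: ParameterGroup[]): string[] {
  const controls: string[] = [];
  for (const group of groups) {
    for (const [name, def] of Object.entries(group.parameters)) {
      controls.push(`${group.groupName}/${name}: ${def.type} (${def.label ?? name})`);
    }
  }
  return controls;
}

console.log(generateParameterUI([{
  groupName: 'appearance',
  category: 'general',
  description: 'Look and feel',
  parameters: { color: { type: 'color', label: 'Shape Color' } }
}]));
// → ['appearance/color: color (Shape Color)']
```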
258
-
259
- #### Capability-Aware Parameters
260
-
261
- ```typescript
262
- // Get all parameter groups (unfiltered, use for saving scene parameters)
263
- const allGroups = core.getAllParameterGroups();
264
-
265
- // Get parameter groups filtered by active capabilities (for UI)
266
- const visibleGroups = core.getVisibleParameterGroups();
267
-
268
- // Check current capabilities
269
- const capabilities = core.getCapabilities();
270
- console.log('Audio available:', capabilities.hasAudio);
271
- console.log('Video available:', capabilities.hasVideo);
272
- console.log('Interaction enabled:', capabilities.hasInteraction);
273
-
274
- // Check if specific parameter category is active
275
- const isAudioActive = core.isCategoryActive('audio');
276
- const isVideoActive = core.isCategoryActive('video');
277
-
278
- // Parameters are automatically categorized:
279
- // - 'audio': Only shown when audio stream is connected
280
- // - 'video': Only shown when video stream is connected
281
- // - 'interaction': Only shown when user interaction is enabled
282
- // - 'general': Always available
283
- ```
284
-
285
- ### Audio and Video Integration
286
-
287
- #### Audio Stream Management
288
-
289
- ```typescript
290
- // Set audio stream for analysis
291
- const audioStream = await navigator.mediaDevices.getUserMedia({
292
- audio: {
293
- echoCancellation: false,
294
- noiseSuppression: false,
295
- autoGainControl: false
296
- }
297
- });
298
- await core.setAudioStream(audioStream);
299
-
300
- // Configure audio analysis with the new namespace API
301
- // Basic control
302
- core.audio.setSensitivity(1.5); // Increase sensitivity (0.5-2.0)
303
-
304
- // Tap tempo for manual BPM control
305
- core.audio.beat.tap(); // Tap to set tempo
306
- core.audio.beat.clearTaps(); // Clear tap history
307
- const tapCount = core.audio.beat.getTapCount();
308
-
309
- // Beat mode control
310
- core.audio.beat.setMode('auto'); // Automatic beat detection (default)
311
- core.audio.beat.setMode('manual'); // Manual BPM control
312
- core.audio.beat.setBPM(120); // Set manual BPM (60-240)
313
- const currentBPM = core.audio.beat.getBPM();
314
-
315
- // Beat phase control for fine-tuning
316
- core.audio.beat.nudge(0.1); // Nudge phase forward
317
- core.audio.beat.resetPhase(); // Reset phase to zero
318
-
319
- // Advanced configuration (optional)
320
- core.audio.advanced.setFFTSize(2048); // FFT resolution (higher = more accurate)
321
- core.audio.advanced.setSmoothing(0.8); // Smoothing factor (0-1)
322
- core.audio.advanced.setAutoGain(true); // Auto-gain normalization
323
- core.audio.advanced.setBeatDetection(true); // Enable beat detection
324
- core.audio.advanced.setOnsetDetection(true); // Enable onset detection
325
-
326
- // Get current audio state
327
- const state = core.audio.getState();
328
- console.log('BPM:', state.currentBPM);
329
- console.log('Confidence:', state.confidence);
330
- console.log('Mode:', state.mode);
331
- console.log('Is Locked:', state.isLocked);
332
-
333
- // Get current audio stream
334
- const currentStream = core.getAudioStream();
335
-
336
- // Disconnect audio
337
- await core.setAudioStream(null);
338
- ```
339
-
340
- #### Video Stream Management
341
-
342
- ```typescript
343
- // Set video stream for analysis
344
- const videoStream = await navigator.mediaDevices.getUserMedia({
345
- video: {
346
- width: { ideal: 640 },
347
- height: { ideal: 480 },
348
- frameRate: { ideal: 30 }
349
- }
350
- });
351
- await core.setVideoStream(videoStream);
352
-
353
- // Video analysis includes:
354
- // - Real-time frame processing
355
- // - Frame data access for custom analysis
356
- // - Brightness and motion detection
357
- // - Custom computer vision processing
358
-
359
- // Disconnect video
360
- await core.setVideoStream(null);
361
- ```
362
-
363
- #### Interaction Management
364
-
365
- ```typescript
366
- // Enable or disable user interactions at runtime
367
- await core.setInteractionEnabled(true); // Enable mouse, keyboard, and touch
368
- await core.setInteractionEnabled(false); // Disable all interactions
369
-
370
- // Get current interaction state
371
- const isInteractionEnabled = core.getInteractionEnabled();
372
-
373
- // Interaction state affects:
374
- // - Mouse, keyboard, and touch event processing
375
- // - Parameter visibility (interaction category parameters)
376
- // - Scene behavior that depends on user input
377
-
378
- // Note: Interaction state is separate from initialization config
379
- // You can toggle interactions regardless of initial allowUserInteraction value
380
- // The interaction system is always available for runtime control
381
- ```
382
-
383
- #### Capability Change Monitoring
384
-
385
- ```typescript
386
- // Listen for capability changes
387
- core.onCapabilitiesChange((capabilities) => {
388
- console.log('Capabilities updated:', capabilities);
389
-
390
- // Update UI based on new capabilities
391
- if (capabilities.hasAudio) {
392
- showAudioControls();
393
- } else {
394
- hideAudioControls();
395
- }
396
-
397
- if (capabilities.hasVideo) {
398
- showVideoControls();
399
- } else {
400
- hideVideoControls();
401
- }
402
-
403
- if (capabilities.hasInteraction) {
404
- showInteractionControls();
405
- } else {
406
- hideInteractionControls();
407
- }
408
- });
409
- ```
410
-
411
- ### Event Handling and Lifecycle
412
-
413
- #### Core Lifecycle Events
414
-
415
- ```typescript
416
- // Core is ready for operations
417
- if (core.ready) {
418
- // All systems initialized and running
419
- console.log('Core is fully operational');
420
- }
421
-
422
- // Check if parameters are initialized
423
- if (core.parametersReady) {
424
- // Parameter system is ready
425
- console.log('Parameters are available');
426
- }
427
- ```
428
-
429
- #### Cleanup and Resource Management
430
-
431
- ```typescript
432
- // Destroy instance and clean up all resources
433
- await core.destroy();
434
-
435
- // This automatically:
436
- // - Stops all rendering loops
437
- // - Disconnects audio/video streams
438
- // - Cleans up WebWorker and IFrame
439
- // - Releases all event listeners
440
- // - Clears parameter system
441
- // - Frees memory resources
442
- ```
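Hosts that swap scenes in and out (previews, editors) should pair every `initialize()` with a `destroy()`, including on the error path. A sketch of that discipline as a generic helper; the helper itself is not part of the core API, and the `VijiCore` usage shown in comments is illustrative:

```typescript
// Run an async action against a resource, guaranteeing cleanup even when
// the action throws. This is the pattern initialize()/destroy() calls for.
async function withResource<T, R>(
  create: () => Promise<T>,
  destroy: (resource: T) => Promise<void>,
  use: (resource: T) => Promise<R>
): Promise<R> {
  const resource = await create();
  try {
    return await use(resource);
  } finally {
    await destroy(resource); // runs on success and on error alike
  }
}

// With VijiCore this would read (illustrative, not tested here):
// await withResource(
//   async () => { const c = new VijiCore(config); await c.initialize(); return c; },
//   (c) => c.destroy(),
//   async (c) => { /* drive the scene */ }
// );
```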
443
-
444
- ## 🎨 Artist API
445
-
446
- The artist API provides a comprehensive set of tools for creating interactive, audio-reactive, and video-responsive scenes.
447
-
448
- ### Canvas and Rendering
449
-
450
- ```typescript
451
- function render(viji) {
452
- // Get canvas contexts
453
- const ctx = viji.useContext('2d'); // 2D rendering context
454
- const gl = viji.useContext('webgl'); // WebGL rendering context
455
-
456
- // Canvas properties
457
- viji.canvas; // OffscreenCanvas object
458
- viji.width; // Canvas width in pixels
459
- viji.height; // Canvas height in pixels
460
- viji.pixelRatio; // Device pixel ratio for crisp rendering
461
-
462
- // Example: Draw a responsive circle
463
- const centerX = viji.width / 2;
464
- const centerY = viji.height / 2;
465
- const radius = Math.min(viji.width, viji.height) * 0.1;
466
-
467
- ctx.fillStyle = '#ff6b6b';
468
- ctx.beginPath();
469
- ctx.arc(centerX, centerY, radius, 0, Math.PI * 2);
470
- ctx.fill();
471
- }
472
- ```
473
-
474
- ### Timing Information
475
-
476
- The timing system provides FPS-independent timing data for smooth animations:
477
-
478
- ```typescript
479
- function render(viji) {
480
- // Timing data (FPS independent)
481
- viji.time; // Elapsed time in seconds since scene start
482
- viji.deltaTime; // Time since last frame in seconds
483
- viji.frameCount; // Total number of frames rendered
484
- viji.fps; // Current frames per second
485
-
486
- // Example: Smooth animation regardless of frame rate
487
- const animationSpeed = 2.0; // rotations per second
488
- const rotation = (viji.time * animationSpeed * Math.PI * 2) % (Math.PI * 2);
489
-
490
- ctx.save();
491
- ctx.translate(viji.width / 2, viji.height / 2);
492
- ctx.rotate(rotation);
493
- ctx.fillRect(-25, -25, 50, 50);
494
- ctx.restore();
495
- }
496
- ```
497
-
498
- ### Parameter System
499
-
500
- The parameter system allows artists to define interactive parameters that automatically generate UI controls.
501
-
502
- #### Parameter Definition
503
-
504
- ```typescript
505
- // Define parameters (call once outside render loop)
506
- const color = viji.color('#ff6b6b', {
507
- label: 'Primary Color',
508
- description: 'Main color for shapes',
509
- group: 'appearance',
510
- category: 'general'
511
- });
512
-
513
- const size = viji.slider(50, {
514
- min: 10,
515
- max: 150,
516
- step: 5,
517
- label: 'Shape Size',
518
- description: 'Size of shapes in pixels',
519
- group: 'appearance',
520
- category: 'general'
521
- });
522
-
523
- const speed = viji.slider(1.0, {
524
- min: 0.1,
525
- max: 3.0,
526
- step: 0.1,
527
- label: 'Animation Speed',
528
- description: 'Speed of animation in rotations per second',
529
- group: 'animation',
530
- category: 'general'
531
- });
532
-
533
- const useAudio = viji.toggle(false, {
534
- label: 'Audio Reactive',
535
- description: 'Make shapes react to audio input',
536
- group: 'audio',
537
- category: 'audio'
538
- });
539
-
540
- const shapeType = viji.select('circle', {
541
- options: ['circle', 'square', 'triangle', 'star'],
542
- label: 'Shape Type',
543
- description: 'Type of shape to draw',
544
- group: 'appearance',
545
- category: 'general'
546
- });
547
-
548
- const title = viji.text('My Scene', {
549
- label: 'Scene Title',
550
- description: 'Title displayed in the scene',
551
- group: 'text',
552
- category: 'general',
553
- maxLength: 50
554
- });
555
-
556
- const particleCount = viji.number(5, {
557
- min: 1,
558
- max: 20,
559
- step: 1,
560
- label: 'Particle Count',
561
- description: 'Number of particles to render',
562
- group: 'animation',
563
- category: 'general'
564
- });
565
- ```
566
-
567
- #### Parameter Usage in Render Loop
568
-
569
- ```typescript
570
- function render(viji) {
571
- const ctx = viji.useContext('2d');
572
-
573
- // Fast parameter access (proxy-based)
574
- ctx.fillStyle = color.value; // Get current color value
575
- const radius = size.value / 2; // Get current size value
576
- const animationSpeed = speed.value; // Get current speed value
577
-
578
- // Clear canvas
579
- ctx.fillStyle = '#2c3e50';
580
- ctx.fillRect(0, 0, viji.width, viji.height);
581
-
582
- // Draw title
583
- ctx.fillStyle = 'white';
584
- ctx.font = '20px Arial';
585
- ctx.textAlign = 'center';
586
- ctx.fillText(title.value, viji.width / 2, 30);
587
-
588
- // Draw particles
589
- for (let i = 0; i < particleCount.value; i++) {
590
- const angle = (i / particleCount.value) * Math.PI * 2 + (viji.time * animationSpeed);
591
- const x = viji.width / 2 + Math.cos(angle) * 100;
592
- const y = viji.height / 2 + Math.sin(angle) * 100;
593
-
594
- ctx.fillStyle = color.value;
595
- ctx.beginPath();
596
-
597
- switch (shapeType.value) {
598
- case 'circle':
599
- ctx.arc(x, y, radius, 0, Math.PI * 2);
600
- break;
601
- case 'square':
602
- ctx.rect(x - radius, y - radius, radius * 2, radius * 2);
603
- break;
604
- case 'triangle':
605
- ctx.moveTo(x, y - radius);
606
- ctx.lineTo(x - radius, y + radius);
607
- ctx.lineTo(x + radius, y + radius);
608
- ctx.closePath();
609
- break;
610
- }
611
-
612
- ctx.fill();
613
- }
614
- }
615
- ```
616
-
617
- ### Audio Analysis
618
-
619
- The audio system provides real-time analysis of audio input with comprehensive frequency and volume data.
620
-
621
- #### Audio API Overview
622
-
623
- ```typescript
624
- function render(viji) {
625
- const audio = viji.audio;
626
-
627
- if (audio.isConnected) {
628
- // Volume analysis with smooth values
629
- const volume = audio.volume.current; // 0-1 current volume level (RMS-based)
630
- const peak = audio.volume.peak; // 0-1 peak amplitude (instant)
631
- const smooth = audio.volume.smoothed; // 0-1 smoothed volume (200ms decay)
632
-
633
- // Frequency bands (0-1 values) with instant and smooth versions
634
- const low = audio.bands.low; // 20-150 Hz (bass/kick range, instant)
635
- const lowSmoothed = audio.bands.lowSmoothed; // Smooth low frequency energy
636
- const lowMid = audio.bands.lowMid; // 150-400 Hz
637
- const mid = audio.bands.mid; // 400-2500 Hz (vocals, instruments)
638
- const highMid = audio.bands.highMid; // 2500-8000 Hz (cymbals, hi-hats)
639
- const high = audio.bands.high; // 8000-20000 Hz (air, brilliance)
640
-
641
- // Automatic beat detection with BPM tracking
642
- const beat = audio.beat;
643
-
644
- // Fast energy curves (300ms decay - primary for most visuals)
645
- const kickEnergy = beat.kick; // 0-1 kick drum energy
646
- const snareEnergy = beat.snare; // 0-1 snare energy
647
- const hatEnergy = beat.hat; // 0-1 hi-hat energy
648
- const anyEnergy = beat.any; // 0-1 any beat type energy
649
-
650
- // Smoothed energy curves (500ms decay - for slower animations)
651
- const kickSmoothed = beat.kickSmoothed;
652
- const snareSmoothed = beat.snareSmoothed;
653
- const anySmoothed = beat.anySmoothed;
654
-
655
- // Instant triggers (for advanced use cases)
656
- if (beat.triggers.kick) {
657
- // Kick drum detected this frame
658
- spawnParticle('kick');
659
- }
660
-
661
- // BPM and tempo information
662
- const bpm = beat.bpm; // Detected BPM (60-180)
663
- const phase = beat.phase; // Beat phase 0-1 within current beat
664
- const bar = beat.bar; // Current bar number (0-3 for 4/4)
665
- const confidence = beat.confidence; // Detection confidence 0-1
666
- const isLocked = beat.isLocked; // True when beat is locked
667
-
668
- // Spectral features for advanced audio-reactive effects
669
- const brightness = audio.spectral.brightness; // 0-1 spectral centroid
670
- const flatness = audio.spectral.flatness; // 0-1 spectral flatness (noisiness)
671
- const flux = audio.spectral.flux; // 0-1 spectral flux (change)
672
-
673
- // Raw frequency data (0-255 values)
674
- const frequencyData = audio.getFrequencyData();
675
-
676
- // Example 1: Smooth beat-reactive animation (primary pattern)
677
- const scale = 1 + kickEnergy * 0.5; // Smooth pulsing with kick
678
- const hue = lowSmoothed * 180; // Smooth color based on low frequencies
679
-
680
- ctx.save();
681
- ctx.translate(viji.width / 2, viji.height / 2);
682
- ctx.scale(scale, scale);
683
- ctx.fillStyle = `hsl(${hue}, 70%, 60%)`;
684
- ctx.fillRect(-25, -25, 50, 50);
685
- ctx.restore();
686
-
687
- // Example 2: Phase-synced rotation
688
- const rotation = phase * Math.PI * 2; // Rotate with beat phase
689
- ctx.rotate(rotation);
690
- }
691
- }
692
- ```
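The features list also mentions custom frequency bands. Beyond the built-in `audio.bands`, a scene can derive its own band from `getFrequencyData()`. A minimal sketch; the helper is not part of the API, and the bin-mapping defaults (48 kHz sample rate, FFT size 2048) are assumptions to adjust for your configuration:

```typescript
// Average the FFT bins covering [loHz, hiHz] into a single 0-1 value.
// Bin width = sampleRate / fftSize; both defaults below are assumptions.
function bandEnergy(
  frequencyData: Uint8Array, // 0-255 magnitudes, e.g. from audio.getFrequencyData()
  loHz: number,
  hiHz: number,
  sampleRate = 48000,
  fftSize = 2048
): number {
  const binHz = sampleRate / fftSize;
  const lo = Math.max(0, Math.floor(loHz / binHz));
  const hi = Math.min(frequencyData.length - 1, Math.ceil(hiHz / binHz));
  if (hi < lo) return 0;
  let sum = 0;
  for (let i = lo; i <= hi; i++) sum += frequencyData[i];
  return sum / ((hi - lo + 1) * 255); // normalize to 0-1
}

// Inside render(), a custom sub-bass band might look like:
// const sub = bandEnergy(audio.getFrequencyData(), 30, 60);
```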
693
-
694
- #### Audio-Reactive Scene Example
695
-
696
- ```typescript
697
- // Define audio-reactive parameters
698
- const audioReactive = viji.toggle(true, {
699
- label: 'Audio Reactive',
700
- description: 'Make shapes react to audio',
701
- group: 'audio',
702
- category: 'audio'
703
- });
704
-
705
- const volumeSensitivity = viji.slider(1.0, {
706
- min: 0.1,
707
- max: 5.0,
708
- step: 0.1,
709
- label: 'Volume Sensitivity',
710
- description: 'How sensitive shapes are to volume',
711
- group: 'audio',
712
- category: 'audio'
713
- });
714
-
715
- const bassReactivity = viji.slider(1.0, {
716
- min: 0,
717
- max: 3.0,
718
- step: 0.1,
719
- label: 'Bass Reactivity',
720
- description: 'How much shapes react to bass',
721
- group: 'audio',
722
- category: 'audio'
723
- });
724
-
725
- function render(viji) {
726
- const ctx = viji.useContext('2d');
727
- const audio = viji.audio;
728
-
729
- // Clear canvas
730
- ctx.fillStyle = '#2c3e50';
731
- ctx.fillRect(0, 0, viji.width, viji.height);
732
-
733
- if (audioReactive.value && audio.isConnected) {
734
- // Audio-reactive animation
735
- const volume = audio.volume.current * volumeSensitivity.value;
736
- const bass = audio.bands.low * bassReactivity.value;
737
-
738
- // Scale based on volume
739
- const scale = 1 + volume;
740
-
741
- // Color based on bass
742
- const hue = 200 + (bass * 160); // Blue to purple range
743
-
744
- // Position based on frequency distribution
745
- const x = viji.width * (audio.bands.mid + audio.bands.high) / 2;
746
- const y = viji.height * (1 - audio.bands.low);
747
-
748
- ctx.save();
749
- ctx.translate(x, y);
750
- ctx.scale(scale, scale);
751
- ctx.fillStyle = `hsl(${hue}, 80%, 60%)`;
752
- ctx.beginPath();
753
- ctx.arc(0, 0, 30, 0, Math.PI * 2);
754
- ctx.fill();
755
- ctx.restore();
756
- }
757
- }
758
- ```
759
-
760
- ### Video Analysis
761
-
762
- The video system provides real-time video frame analysis with frame data access for custom processing.
763
-
764
- #### Video API Overview
765
-
766
- ```typescript
767
- function render(viji) {
768
- const video = viji.video;
769
-
770
- if (video.isConnected) {
771
- // Video properties
772
- const frameWidth = video.frameWidth;
773
- const frameHeight = video.frameHeight;
774
- const frameRate = video.frameRate;
775
-
776
- // Current video frame (OffscreenCanvas)
777
- if (video.currentFrame) {
778
- // Draw video frame as background
779
- ctx.globalAlpha = 0.3;
780
- ctx.drawImage(video.currentFrame, 0, 0, viji.width, viji.height);
781
- ctx.globalAlpha = 1.0;
782
- }
783
-
784
- // Frame data for custom analysis
785
- const frameData = video.getFrameData();
786
-
787
- // Example: Custom video analysis
788
- if (frameData) {
789
- // Access raw pixel data for custom processing
790
- const imageData = frameData.data;
791
- const width = frameData.width;
792
- const height = frameData.height;
793
-
794
- // Example: Calculate average brightness
795
- let totalBrightness = 0;
796
- for (let i = 0; i < imageData.length; i += 4) {
797
- const r = imageData[i];
798
- const g = imageData[i + 1];
799
- const b = imageData[i + 2];
800
- totalBrightness += (r + g + b) / 3;
801
- }
802
- const averageBrightness = totalBrightness / (imageData.length / 4);
803
-
804
- // Use brightness for effects
805
- const brightness = averageBrightness / 255; // Normalize to 0-1
806
-
807
- // Create brightness-reactive animation
808
- ctx.fillStyle = `rgba(255, 255, 255, ${brightness * 0.5})`;
809
- ctx.fillRect(0, 0, viji.width, viji.height);
810
- }
811
- }
812
- }
813
- ```
814
-
815
- #### Video-Reactive Scene Example
816
-
817
- ```typescript
818
- // Define video-reactive parameters
819
- const videoReactive = viji.toggle(true, {
820
- label: 'Video Reactive',
821
- description: 'Make shapes react to video',
822
- group: 'video',
823
- category: 'video'
824
- });
825
-
826
- const motionSensitivity = viji.slider(1.0, {
827
- min: 0.1,
828
- max: 3.0,
829
- step: 0.1,
830
- label: 'Motion Sensitivity',
831
- description: 'How sensitive shapes are to video changes',
832
- group: 'video',
833
- category: 'video'
834
- });
835
-
836
- function render(viji) {
837
- const ctx = viji.useContext('2d');
838
- const video = viji.video;
839
-
840
- if (videoReactive.value && video.isConnected) {
841
- // Video-reactive animation using frame data
842
- const frameData = video.getFrameData();
843
-
844
- if (frameData) {
845
- // Simple motion detection (compare with previous frame)
846
- // This is a basic example - you can implement more sophisticated analysis
847
- const imageData = frameData.data;
848
- let motionEnergy = 0;
849
-
850
- // Calculate motion energy (simplified)
851
- for (let i = 0; i < imageData.length; i += 4) {
852
- const brightness = (imageData[i] + imageData[i + 1] + imageData[i + 2]) / 3;
853
- motionEnergy += brightness;
854
- }
855
-
856
- const normalizedMotion = (motionEnergy / (imageData.length / 4)) / 255;
857
- const scale = 1 + (normalizedMotion * motionSensitivity.value);
858
-
859
- // Create motion-reactive shapes
860
- ctx.save();
861
- ctx.translate(viji.width / 2, viji.height / 2);
862
- ctx.scale(scale, scale);
863
- ctx.fillStyle = `hsl(${normalizedMotion * 360}, 70%, 60%)`;
864
- ctx.beginPath();
865
- ctx.arc(0, 0, 30, 0, Math.PI * 2);
866
- ctx.fill();
867
- ctx.restore();
868
- }
869
- }
870
- }
871
- ```
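The brightness measure above only approximates motion. Actual motion detection diffs consecutive frames; a minimal sketch below, with the caller keeping the previous frame in a scene-scope variable (an assumption about how you structure your scene, not a core feature):

```typescript
// Frame-differencing motion estimate: mean absolute per-pixel brightness
// change between two RGBA frames, normalized to 0-1.
function motionBetween(prev: Uint8ClampedArray, curr: Uint8ClampedArray): number {
  const pixels = Math.floor(Math.min(prev.length, curr.length) / 4);
  if (pixels === 0) return 0;
  let diff = 0;
  for (let i = 0; i < pixels * 4; i += 4) {
    const a = (prev[i] + prev[i + 1] + prev[i + 2]) / 3; // previous brightness
    const b = (curr[i] + curr[i + 1] + curr[i + 2]) / 3; // current brightness
    diff += Math.abs(a - b);
  }
  return diff / (pixels * 255);
}

// In a scene (illustrative):
// let prevFrame = null;
// function render(viji) {
//   const fd = viji.video.getFrameData();
//   if (fd && prevFrame) {
//     const motion = motionBetween(prevFrame, fd.data); // 0 = static, 1 = full change
//   }
//   if (fd) prevFrame = new Uint8ClampedArray(fd.data); // copy for next frame
// }
```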
872
-
873
- ### User Interaction
874
-
875
- The interaction system provides comprehensive support for mouse, keyboard, and touch input.
876
-
877
- #### Mouse Interaction
878
-
879
- ```typescript
880
- function render(viji) {
881
- const mouse = viji.mouse;
882
-
883
- // Mouse position (canvas coordinates)
884
- if (mouse.isInCanvas) {
885
- const x = mouse.x; // Current X coordinate
886
- const y = mouse.y; // Current Y coordinate
887
-
888
- // Mouse movement
889
- const deltaX = mouse.deltaX; // X movement since last frame
890
- const deltaY = mouse.deltaY; // Y movement since last frame
891
- const velocity = mouse.velocity; // Smoothed velocity { x, y }
892
-
893
- // Mouse buttons
894
- const isPressed = mouse.isPressed; // Any button currently pressed
895
- const leftButton = mouse.leftButton; // Left button state
896
- const rightButton = mouse.rightButton; // Right button state
897
- const middleButton = mouse.middleButton; // Middle button state
898
-
899
- // Frame-based events
900
- const wasPressed = mouse.wasPressed; // Button was pressed this frame
901
- const wasReleased = mouse.wasReleased; // Button was released this frame
902
- const wasMoved = mouse.wasMoved; // Mouse moved this frame
903
-
904
- // Scroll wheel
905
- const wheelDelta = mouse.wheelDelta; // Combined wheel delta
906
- const wheelX = mouse.wheelX; // Horizontal wheel delta
907
- const wheelY = mouse.wheelY; // Vertical wheel delta
908
-
909
- // Example: Mouse-reactive animation
910
- ctx.fillStyle = leftButton ? 'red' : 'blue';
911
- ctx.beginPath();
912
- ctx.arc(x, y, 20 + Math.abs(velocity.x + velocity.y), 0, Math.PI * 2);
913
- ctx.fill();
914
- }
915
- }
916
- ```
917
-
918
- #### Keyboard Interaction
919
-
920
- ```typescript
921
- function render(viji) {
922
- const keyboard = viji.keyboard;
923
-
924
- // Key state queries
925
- if (keyboard.isPressed('w')) {
926
- // W key is currently pressed
927
- console.log('W key is held down');
928
- }
929
-
930
- if (keyboard.wasPressed('space')) {
931
- // Space was pressed this frame
932
- console.log('Space was pressed!');
933
- }
934
-
935
- if (keyboard.wasReleased('escape')) {
936
- // Escape was released this frame
937
- console.log('Escape was released!');
938
- }
939
-
940
- // Active key tracking
941
- const activeKeys = keyboard.activeKeys; // Set of currently pressed keys
942
- const pressedThisFrame = keyboard.pressedThisFrame; // Set of keys pressed this frame
943
- const releasedThisFrame = keyboard.releasedThisFrame; // Set of keys released this frame
944
-
945
- // Modifier keys
946
- const shift = keyboard.shift; // Shift key is held
947
- const ctrl = keyboard.ctrl; // Ctrl key is held
948
- const alt = keyboard.alt; // Alt key is held
949
- const meta = keyboard.meta; // Meta/Cmd key is held
950
-
951
- // Recent activity
952
- const lastKeyPressed = keyboard.lastKeyPressed; // Last key that was pressed
953
- const lastKeyReleased = keyboard.lastKeyReleased; // Last key that was released
954
-
955
- // Example: Keyboard-controlled movement
956
- let moveX = 0;
957
- let moveY = 0;
958
-
959
- if (keyboard.isPressed('w') || keyboard.isPressed('W')) moveY -= 5;
960
- if (keyboard.isPressed('s') || keyboard.isPressed('S')) moveY += 5;
961
- if (keyboard.isPressed('a') || keyboard.isPressed('A')) moveX -= 5;
962
- if (keyboard.isPressed('d') || keyboard.isPressed('D')) moveX += 5;
963
-
964
- // Apply movement
965
- ctx.save();
966
- ctx.translate(moveX, moveY);
967
- ctx.fillStyle = 'green';
968
- ctx.fillRect(0, 0, 50, 50);
969
- ctx.restore();
970
- }
971
- ```
972
-
973
- #### Touch Interaction
974
-
975
- ```typescript
976
- function render(viji) {
977
- const touches = viji.touches;
- const ctx = viji.useContext('2d');
978
-
979
- // Touch points
980
- for (const touch of touches.points) {
981
- const x = touch.x; // Touch X coordinate
982
- const y = touch.y; // Touch Y coordinate
983
- const pressure = touch.pressure; // Pressure (0-1)
984
- const radius = touch.radius; // Touch radius
985
- const id = touch.id; // Unique touch ID
986
-
987
- // Movement
988
- const deltaX = touch.deltaX; // X movement since last frame
989
- const deltaY = touch.deltaY; // Y movement since last frame
990
- const velocity = touch.velocity; // Movement velocity { x, y }
991
-
992
- // Lifecycle
993
- const isNew = touch.isNew; // Touch started this frame
994
- const isActive = touch.isActive; // Touch is currently active
995
- const isEnding = touch.isEnding; // Touch ending this frame
996
-
997
- // Draw touch point
998
- ctx.fillStyle = isNew ? 'red' : isEnding ? 'yellow' : 'blue';
999
- ctx.beginPath();
1000
- ctx.arc(x, y, radius * 2, 0, Math.PI * 2);
1001
- ctx.fill();
1002
- }
1003
-
1004
- // Touch events
1005
- const started = touches.started; // Touches that started this frame
1006
- const moved = touches.moved; // Touches that moved this frame
1007
- const ended = touches.ended; // Touches that ended this frame
1008
-
1009
- // Primary touch (first touch point)
1010
- const primary = touches.primary; // Primary touch point or null
1011
-
1012
- // Touch gestures
1013
- const gestures = touches.gestures;
1014
-
1015
- if (gestures.isPinching) {
1016
- const scale = gestures.pinchScale; // Current pinch scale
1017
- const delta = gestures.pinchDelta; // Scale change since last frame
1018
-
1019
- // React to pinch gesture
1020
- ctx.save();
1021
- ctx.scale(scale, scale);
1022
- ctx.fillStyle = 'purple';
1023
- ctx.fillRect(0, 0, 100, 100);
1024
- ctx.restore();
1025
- }
1026
-
1027
- if (gestures.isRotating) {
1028
- const angle = gestures.rotationAngle; // Current rotation angle
1029
- const delta = gestures.rotationDelta; // Rotation change since last frame
1030
-
1031
- // React to rotation gesture
1032
- ctx.save();
1033
- ctx.rotate(angle);
1034
- ctx.fillStyle = 'orange';
1035
- ctx.fillRect(-25, -25, 50, 50);
1036
- ctx.restore();
1037
- }
1038
-
1039
- if (gestures.isPanning) {
1040
- const panDelta = gestures.panDelta; // Pan movement { x, y }
1041
-
1042
- // React to pan gesture
1043
- ctx.save();
1044
- ctx.translate(panDelta.x, panDelta.y);
1045
- ctx.fillStyle = 'cyan';
1046
- ctx.fillRect(0, 0, 50, 50);
1047
- ctx.restore();
1048
- }
1049
-
1050
- if (gestures.isTapping) {
1051
- const tapCount = gestures.tapCount; // Number of taps
1052
- const tapPosition = gestures.tapPosition; // { x, y } tap position
1053
-
1054
- // React to tap gesture
1055
- if (tapPosition) {
1056
- ctx.fillStyle = 'lime';
1057
- ctx.beginPath();
1058
- ctx.arc(tapPosition.x, tapPosition.y, 30, 0, Math.PI * 2);
1059
- ctx.fill();
1060
- }
1061
- }
1062
- }
1063
- ```
1064
-
1065
- #### Device Sensors (Motion, Orientation, Geolocation)
1066
-
1067
- ```typescript
1068
- function render(viji) {
1069
- // Internal device (current device running the scene)
1070
- const device = viji.device;
- const ctx = viji.useContext('2d');
1071
-
1072
- // Check if device motion is available
1073
- if (device.motion?.acceleration) {
1074
- const accelX = device.motion.acceleration.x; // m/s² (without gravity)
1075
-
1076
- // Example: Shake detection
1077
- const magnitude = Math.sqrt(
1078
- accelX**2 + device.motion.acceleration.y**2 + device.motion.acceleration.z**2
1079
- );
1080
- if (magnitude > 15) {
1081
- console.log('Device shaken!');
1082
- }
1083
- }
1084
-
1085
- // Check if device orientation is available
1086
- if (device.orientation.gamma !== null) {
1087
- const tiltLR = device.orientation.gamma; // -90 to 90 (left/right tilt)
1088
- const tiltFB = device.orientation.beta; // -180 to 180 (front/back tilt)
1089
-
1090
- // Example: Tilt-based control
1091
- const moveX = (tiltLR / 90) * 5;
1092
- ctx.save();
1093
- ctx.translate(moveX, 0);
1094
- ctx.fillStyle = 'red';
1095
- ctx.fillRect(viji.width/2 - 25, viji.height/2 - 25, 50, 50);
1096
- ctx.restore();
1097
- }
1098
-
1099
- // Check if geolocation is available
1100
- if (device.geolocation.latitude !== null) {
1101
- const lat = device.geolocation.latitude;
1102
- const lon = device.geolocation.longitude;
1103
-
1104
- ctx.fillStyle = 'white';
1105
- ctx.fillText(`Location: ${lat.toFixed(4)}, ${lon.toFixed(4)}`, 10, 30);
1106
- }
1107
-
1108
- // External connected devices (WebRTC/Sockets)
1109
- viji.devices.forEach((device, index) => {
1110
- if (device.orientation.gamma !== null) {
1111
- // Multi-device control example
1112
- const x = (device.orientation.gamma / 90 + 1) * viji.width / 2;
1113
- ctx.fillStyle = `hsl(${index * 60}, 70%, 60%)`;
1114
- ctx.beginPath();
1115
- ctx.arc(x, 100 + index * 60, 25, 0, Math.PI * 2);
1116
- ctx.fill();
1117
- }
1118
- });
1119
- }
1120
- ```
1121
-
1122
- **Device Sensor Features:**
1123
- - **Motion Sensors**: Accelerometer and gyroscope data for shake detection and motion tracking
1124
- - **Orientation Sensors**: Device tilt for tilt-based controls and compass heading
1125
- - **Geolocation**: GPS coordinates for location-based content
1126
- - **Multi-Device Support**: Connect multiple external devices via WebRTC for installations
1127
- - **Graceful Degradation**: Automatic handling of missing sensors or permissions
1128
-
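The tilt-mapping and shake-detection math used in the sensor example above can be factored into small pure helpers. The function names here are illustrative, not part of the Viji API:

```javascript
// Hypothetical helpers (not part of the Viji API) for the patterns above.

// Map a gamma tilt (-90..90) to a horizontal canvas position.
function tiltToX(gamma, canvasWidth) {
  const clamped = Math.max(-90, Math.min(90, gamma));
  return ((clamped / 90) + 1) * canvasWidth / 2;
}

// Detect a shake from acceleration components (m/s², gravity excluded).
function isShake(ax, ay, az, threshold = 15) {
  return Math.hypot(ax, ay, az) > threshold;
}
```

Keeping this logic in pure functions makes sensor-driven scenes easy to unit-test without a device attached.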
1129
- **See `docs/20-device-sensor-system.md` for complete device sensor documentation including WebRTC integration and multi-device installations.**
1130
-
1131
- ## 🎨 P5.js Support
1132
-
1133
- Viji Core supports **P5.js** as an optional rendering library. P5.js provides familiar creative coding APIs while maintaining all Viji features including audio reactivity, video processing, and parameter management.
1134
-
1135
- ### Enabling P5.js Mode
1136
-
1137
- Add a single comment at the top of your scene code:
1138
-
1139
- ```javascript
1140
- // @renderer p5
1141
-
1142
- function setup(viji, p5) {
1143
- p5.colorMode(p5.HSB);
1144
- }
1145
-
1146
- function render(viji, p5) {
1147
- p5.background(220);
1148
- p5.fill(255, 0, 0);
1149
- p5.ellipse(viji.mouse.x, viji.mouse.y, 50, 50);
1150
- }
1151
- ```
1152
-
1153
- ### What Works
1154
-
1155
- - ✅ All P5.js drawing functions (shapes, colors, transforms, typography)
1156
- - ✅ P5.js math utilities (`noise()`, `random()`, `map()`, `lerp()`)
1157
- - ✅ P5.js vectors (`p5.Vector` class)
1158
- - ✅ WebGL mode (`p5.WEBGL`)
1159
- - ✅ Full Viji integration (audio, video, parameters, interaction)
1160
-
1161
- ### What Doesn't Work
1162
-
1163
- - ❌ p5.dom (use Viji parameters instead)
1164
- - ❌ p5.sound (use `viji.audio` instead)
1165
- - ❌ P5.js events (use `viji.mouse`/`keyboard`/`touches` instead)
1166
- - ❌ Direct image loading (use Viji image parameters instead)
1167
-
1168
- ### Audio Reactive P5.js Example
1169
-
1170
- ```javascript
1171
- // @renderer p5
1172
-
1173
- const audioReactive = viji.toggle(true, {
1174
- label: 'Audio Reactive',
1175
- category: 'audio'
1176
- });
1177
-
1178
- const bassReactivity = viji.slider(1.0, {
1179
- min: 0,
1180
- max: 3.0,
1181
- label: 'Bass Reactivity',
1182
- category: 'audio'
1183
- });
1184
-
1185
- function render(viji, p5) {
1186
- p5.background(0);
1187
-
1188
- if (audioReactive.value && viji.audio.isConnected) {
1189
- const bass = viji.audio.bands.low;
1190
- const volume = viji.audio.volume.current;
1191
-
1192
- const hue = bass * 360 * bassReactivity.value;
1193
- const size = 100 + volume * 200;
1194
-
1195
- p5.colorMode(p5.HSB);
1196
- p5.fill(hue, 80, 100);
1197
- p5.ellipse(p5.width / 2, p5.height / 2, size, size);
1198
- }
1199
- }
1200
- ```
1201
-
1202
- ### Image Parameters
1203
-
1204
- ```javascript
1205
- // @renderer p5
1206
-
1207
- const bgImage = viji.image(null, {
1208
- label: 'Background Image',
1209
- group: 'media',
1210
- accept: 'image/*'
1211
- });
1212
-
1213
- function render(viji, p5) {
1214
- p5.background(220);
1215
-
1216
- if (bgImage.value) {
1217
- p5.image(bgImage.value, 0, 0, p5.width, p5.height);
1218
- }
1219
-
1220
- // Draw on top of image
1221
- p5.fill(255, 0, 0, 128);
1222
- p5.ellipse(viji.mouse.x, viji.mouse.y, 50, 50);
1223
- }
1224
- ```
1225
-
1226
- ### Getting Renderer Type
1227
-
1228
- ```typescript
1229
- const core = new VijiCore({
1230
- hostContainer: container,
1231
- sceneCode: sceneCode
1232
- });
1233
-
1234
- await core.initialize();
1235
-
1236
- // Check which renderer is in use (reported in stats alongside FPS and resolution)
1237
- const stats = core.getStats();
1238
- const rendererType = stats.rendererType; // 'native' | 'p5'
1239
- ```
1240
-
1241
- ### Loading Image Parameters
1242
-
1243
- ```typescript
1244
- core.onParametersDefined((groups) => {
1245
- groups.forEach(group => {
1246
- Object.entries(group.parameters).forEach(([name, def]) => {
1247
- if (def.type === 'image') {
1248
- // Create file picker
1249
- const input = createFileInput(name, def);
1250
- input.addEventListener('change', async (e) => {
1251
- const file = e.target.files[0];
1252
- if (file) {
1253
- await core.setParameter(name, file); // Unified API handles images automatically
1254
- }
1255
- });
1256
- }
1257
- });
1258
- });
1259
- });
1260
- ```
1261
-
1262
- **See `docs/16-p5js-integration.md` for comprehensive documentation including migration guides, troubleshooting, and advanced examples.**
1263
-
1264
- ## 🏗️ Architecture
1265
-
1266
- ### Security Model
1267
-
1268
- The core implements a comprehensive security model to ensure safe execution of artist code:
1269
-
1270
- - **IFrame Isolation**: Complete separation from host environment with sandboxed execution
1271
- - **WebWorker Sandboxing**: Artist code runs with controlled API access only
1272
- - **Blob URL Creation**: Secure worker and IFrame creation from blob URLs
1273
- - **Resource Protection**: Memory leaks and errors cannot affect main application
1274
- - **Controlled Communication**: Optimized message passing with validation
1275
-
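The blob-URL pattern mentioned above can be sketched in a few lines. This is illustrative only, not Viji's actual implementation:

```javascript
// Minimal sketch: bundle worker source as a string, wrap it in a Blob,
// and load it through an object URL instead of a network-hosted script.
const workerSource = `self.onmessage = (e) => self.postMessage(e.data);`;
const blob = new Blob([workerSource], { type: 'application/javascript' });
const workerUrl = URL.createObjectURL(blob);
// In a browser: const worker = new Worker(workerUrl);
// Revoke the URL once the worker is created to avoid leaking the blob:
// URL.revokeObjectURL(workerUrl);
```

Because the worker has no same-origin script URL, it runs only the code the host handed it, which is what makes the communication channel controllable.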
1276
- ### Performance Features
1277
-
1278
- The core provides extensive performance optimization capabilities:
1279
-
1280
- - **Configurable Frame Rates**: Full (60fps) or half (30fps) modes for performance tuning
1281
- - **Resolution Scaling**: Fractional (0.1-1.0) or explicit canvas dimensions
1282
- - **Adaptive Optimization**: Automatic performance tuning based on hardware capabilities
1283
- - **Efficient Communication**: Optimized message passing between components
1284
- - **Memory Management**: Automatic resource cleanup and memory leak prevention
1285
-
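The resolution option above accepts either a fraction or explicit dimensions. A sketch of how such a value could be resolved against a container size (the helper is illustrative, not the package's internal logic):

```javascript
// Resolve a resolution setting into concrete canvas dimensions.
// `resolution` is either a fraction (0.1-1.0) of the container size
// or an explicit { width, height } object.
function resolveResolution(resolution, containerWidth, containerHeight) {
  if (typeof resolution === 'number') {
    const scale = Math.max(0.1, Math.min(1.0, resolution)); // clamp to documented range
    return {
      width: Math.round(containerWidth * scale),
      height: Math.round(containerHeight * scale)
    };
  }
  return { width: resolution.width, height: resolution.height };
}
```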
1286
- ### Multi-Instance Support
1287
-
1288
- The core supports multiple concurrent instances for complex applications:
1289
-
1290
- ```typescript
1291
- // Main scene with full features
1292
- const mainCore = new VijiCore({
1293
- hostContainer: document.getElementById('main-scene'),
1294
- resolution: { width: 1920, height: 1080 },
1295
- frameRateMode: 'full',
1296
- allowUserInteraction: true,
1297
- audioStream: sharedAudioStream,
1298
- videoStream: sharedVideoStream
1299
- });
1300
-
1301
- // Preview instance with reduced features
1302
- const previewCore = new VijiCore({
1303
- hostContainer: document.getElementById('preview'),
1304
- resolution: 0.25, // 25% resolution for performance
1305
- frameRateMode: 'half',
1306
- noInputs: true,
1307
- allowUserInteraction: false,
1308
- audioStream: sharedAudioStream // Shared efficiently across instances
1309
- });
1310
-
1311
- // Thumbnail instance for gallery view
1312
- const thumbnailCore = new VijiCore({
1313
- hostContainer: document.getElementById('thumbnail'),
1314
- resolution: 0.1, // 10% resolution
1315
- frameRateMode: 'half',
1316
- noInputs: true,
1317
- allowUserInteraction: false
1318
- });
1319
-
1320
- // To change scenes, create a new instance and destroy the old one
1321
- const newCore = new VijiCore({
1322
- hostContainer: document.getElementById('scene-host'),
1323
- sceneCode: newSceneCode,
1324
- audioStream: sharedAudioStream,
1325
- videoStream: sharedVideoStream
1326
- });
1327
-
1328
- // Automatic comprehensive cleanup when destroyed
1329
- await oldCore.destroy();
1330
- ```
1331
-
1332
- ## 🔍 Error Handling
1333
-
1334
- The core provides comprehensive error handling with detailed error information:
1335
-
1336
- ```typescript
1337
- import { VijiCoreError } from '@viji-dev/core';
1338
-
1339
- try {
1340
- const core = new VijiCore(config);
1341
- await core.initialize();
1342
- } catch (error) {
1343
- if (error instanceof VijiCoreError) {
1344
- console.error(`Core error [${error.code}]:`, error.message);
1345
- console.error('Error context:', error.context);
1346
-
1347
- // Handle specific error types
1348
- switch (error.code) {
1349
- case 'INVALID_CONFIG':
1350
- console.error('Configuration is invalid:', error.context);
1351
- break;
1352
- case 'INITIALIZATION_ERROR':
1353
- console.error('Failed to initialize core:', error.context);
1354
- break;
1355
- case 'CORE_NOT_READY':
1356
- console.error('Core is not ready for operations');
1357
- break;
1358
- case 'INSTANCE_DESTROYED':
1359
- console.error('Core instance has been destroyed');
1360
- break;
1361
- case 'PARAMETERS_NOT_INITIALIZED':
1362
- console.error('Parameter system not yet initialized');
1363
- break;
1364
- case 'UNKNOWN_PARAMETER':
1365
- console.error('Parameter not found:', error.context);
1366
- break;
1367
- }
1368
- } else {
1369
- console.error('Unexpected error:', error);
1370
- }
1371
- }
1372
- ```
1373
-
1374
- **Common Error Codes:**
1375
- - `INVALID_CONFIG` - Configuration validation failed
1376
- - `INITIALIZATION_ERROR` - Failed to initialize core components
1377
- - `CORE_NOT_READY` - Operation attempted before ready
1378
- - `INSTANCE_DESTROYED` - Operation attempted after destroy
1379
- - `PARAMETERS_NOT_INITIALIZED` - Parameters not yet available
1380
- - `UNKNOWN_PARAMETER` - Parameter not found
1381
- - `CONCURRENT_INITIALIZATION` - Multiple initialization attempts
1382
- - `MANAGER_NOT_READY` - Internal component not available
1383
-
1384
- ## 🧪 Development
1385
-
1386
- ```bash
1387
- # Install dependencies
1388
- npm install
1389
-
1390
- # Build the package
1391
- npm run build
1392
-
1393
- # Run tests
1394
- npm test
1395
-
1396
- # Development build (watch mode)
1397
- npm run dev
1398
-
1399
- # Type checking
1400
- npm run type-check
1401
-
1402
- # Linting
1403
- npm run lint
1404
- ```
1405
-
1406
- ## 📚 Examples
1407
-
1408
- The package includes comprehensive examples in the `/example` directory:
1409
-
1410
- - **Basic Scene Creation**: Simple animated shapes with parameters
1411
- - **Audio-Reactive Scenes**: Scenes that respond to audio input
1412
- - **Video-Reactive Scenes**: Scenes that respond to video analysis
1413
- - **Interactive Scenes**: Mouse, keyboard, and touch interaction
1414
- - **Parameter System**: Complete parameter definition and UI generation
1415
- - **Multi-Instance**: Multiple concurrent scene instances
1416
-
1417
- ## 🎯 Use Cases
1418
-
1419
- ### Platform Integration
1420
- The core integrates seamlessly with the Viji platform, providing scene execution while the platform handles UI, user management, and social features.
1421
-
1422
- ### SDK Development
1423
- The core serves as the execution foundation for the Viji SDK, ensuring identical behavior between development and platform environments.
1424
-
1425
- ### Standalone Applications
1426
- Use the core directly in custom applications for creative scene rendering with full feature support.
1427
-
1428
- ## 📄 License
1429
-
1430
- Copyright (c) 2025 Artem Verkhovskiy and Dmitry Manoilenko.
1431
- All rights reserved - see the LICENSE file for details.
1432
-
1433
- ## Contributor License Agreement
1434
-
1435
- By contributing, you agree to the [CLA](./CLA.md).
1436
- Please also confirm your agreement by filling out this short [form](https://forms.gle/A49WTz8nj5b99Yev7).
1437
-
1438
- ---
1439
-
1
+ # Viji Core Package (`@viji-dev/core`)
2
+
3
+ **Universal execution engine for Viji Creative scenes**
4
+
5
+ A powerful, secure, and feature-rich JavaScript/TypeScript library that provides the foundation for creative scene execution across all Viji platform contexts. The core offers identical IFrame + WebWorker execution with comprehensive parameter management, audio/video analysis, user interaction handling, and performance optimization.
6
+
7
+ ## 🚀 Features
8
+
9
+ ### ✅ **Core Execution Engine**
10
+ - **Secure IFrame + WebWorker Architecture**: Complete isolation with controlled communication
11
+ - **Multi-Instance Support**: Concurrent instances for main scenes and previews
12
+ - **Automatic Resource Management**: Memory leak prevention and cleanup
13
+
14
+ ### ✅ **Parameter System**
15
+ - **Declarative Parameter Definition**: Define parameters once with automatic UI generation
16
+ - **Proxy-Based Access**: Fast parameter access in render loops
17
+ - **Category-Based Organization**: Audio, video, interaction, and general parameters
18
+ - **Real-time Validation**: Type safety and range checking
19
+ - **Capability-Aware UI**: Parameters shown based on active features
20
+
21
+ ### ✅ **Audio Analysis**
22
+ - **Real-time Audio Processing**: Volume, frequency analysis, and beat detection
23
+ - **Custom Frequency Bands**: Bass, mid, treble, and custom band analysis
24
+ - **Multiple Input Sources**: Microphone, audio files, and screen capture
25
+ - **Audio-Reactive Scenes**: Make scenes respond to audio input
26
+
27
+ ### ✅ **Video Analysis**
28
+ - **Real-time Video Processing**: Frame analysis in separate WebWorker
29
+ - **Multiple Input Sources**: Camera, video files, and screen capture
30
+ - **Video-Reactive Scenes**: Make scenes respond to video motion and brightness
31
+ - **Frame Data Access**: Raw video frame data for custom analysis
32
+
33
+ ### ✅ **User Interaction**
34
+ - **Mouse Tracking**: Position, buttons, movement, and scroll wheel
35
+ - **Keyboard Input**: Key states, modifiers, and event handling
36
+ - **Touch Support**: Multi-touch with gesture detection
37
+ - **Device Sensors**: Motion, orientation, and geolocation (internal + external devices)
38
+ - **Canvas-Coordinate Mapping**: Accurate input positioning
39
+
40
+ ### ✅ **Performance Optimization**
41
+ - **Configurable Frame Rates**: Full (60fps) or half (30fps) modes
42
+ - **Resolution Scaling**: Fractional or explicit canvas dimensions
43
+ - **Adaptive Performance**: Automatic optimization based on hardware
44
+ - **Memory Management**: Efficient resource pooling and cleanup
45
+
46
+ ## 📦 Installation
47
+
48
+ ```bash
49
+ npm install @viji-dev/core
50
+ ```
51
+
52
+ ## 🎯 Quick Start
53
+
54
+ ### Basic Scene Creation
55
+
56
+ ```typescript
57
+ import { VijiCore } from '@viji-dev/core';
58
+
59
+ // Artist scene code
60
+ const sceneCode = `
61
+ // Define parameters using helper functions
62
+ const color = viji.color('#ff6b6b', {
63
+ label: 'Shape Color',
64
+ description: 'Color of the animated shape',
65
+ group: 'appearance'
66
+ });
67
+
68
+ const size = viji.slider(50, {
69
+ min: 10,
70
+ max: 150,
71
+ step: 5,
72
+ label: 'Shape Size',
73
+ description: 'Size of the animated shape',
74
+ group: 'appearance'
75
+ });
76
+
77
+ const speed = viji.slider(1.0, {
78
+ min: 0.1,
79
+ max: 3.0,
80
+ step: 0.1,
81
+ label: 'Animation Speed',
82
+ description: 'Speed of the animation',
83
+ group: 'animation'
84
+ });
85
+
86
+ // Main render function
87
+ function render(viji) {
88
+ const ctx = viji.useContext('2d');
89
+
90
+ // Clear canvas
91
+ ctx.fillStyle = '#2c3e50';
92
+ ctx.fillRect(0, 0, viji.width, viji.height);
93
+
94
+ // Animated shape
95
+ const time = viji.time * speed.value;
96
+ const x = viji.width / 2 + Math.sin(time) * 100;
97
+ const y = viji.height / 2 + Math.cos(time) * 100;
98
+
99
+ ctx.fillStyle = color.value;
100
+ ctx.beginPath();
101
+ ctx.arc(x, y, size.value / 2, 0, Math.PI * 2);
102
+ ctx.fill();
103
+ }
104
+ `;
105
+
106
+ // Create core instance
107
+ const core = new VijiCore({
108
+ hostContainer: document.getElementById('scene-container'),
109
+ sceneCode: sceneCode,
110
+ frameRateMode: 'full',
111
+ allowUserInteraction: true
112
+ });
113
+
114
+ // Initialize and start rendering
115
+ await core.initialize();
116
+ console.log('Scene is running!');
117
+ ```
118
+
119
+ ## 🔧 Integration API
120
+
121
+ ### Core Configuration
122
+
123
+ The `VijiCoreConfig` interface defines all available configuration options:
124
+
125
+ ```typescript
126
+ interface VijiCoreConfig {
127
+ // Required configuration
128
+ hostContainer: HTMLElement; // Container element for the scene
129
+ sceneCode: string; // Artist JavaScript code with render function
130
+
131
+ // Performance configuration
132
+ frameRateMode?: 'full' | 'half'; // 'full' = 60fps, 'half' = 30fps
133
+
134
+ // Input streams
135
+ audioStream?: MediaStream; // Audio input for analysis
136
+ videoStream?: MediaStream; // Video input (main, CV enabled)
137
+ videoStreams?: MediaStream[]; // Additional video streams (no CV)
138
+
139
+ // Feature toggles
140
+ noInputs?: boolean; // Disable all input processing
141
+ allowUserInteraction?: boolean; // Enable mouse/keyboard/touch events
142
+ allowDeviceInteraction?: boolean; // Enable device sensors (motion/orientation/geolocation)
143
+ }
144
+ ```
145
+
146
+ ### Instance Management
147
+
148
+ #### Creation and Initialization
149
+
150
+ ```typescript
151
+ // Create core instance
152
+ const core = new VijiCore({
153
+ hostContainer: document.getElementById('scene-container'),
154
+ sceneCode: sceneCode,
155
+ frameRateMode: 'full',
156
+ allowUserInteraction: true
157
+ });
158
+
159
+ // Initialize the core (required before use)
160
+ await core.initialize();
161
+
162
+ // Check if core is ready for operations
163
+ if (core.ready) {
164
+ console.log('Core is ready for use');
165
+ }
166
+
167
+ // Get current configuration
168
+ const config = core.configuration;
169
+ console.log('Current frame rate mode:', config.frameRateMode);
170
+ ```
171
+
172
+ #### Performance Control
173
+
174
+ ```typescript
175
+ // Frame rate control
176
+ await core.setFrameRate('full'); // Set to 60fps mode
177
+ await core.setFrameRate('half'); // Set to 30fps mode
178
+
179
+ // Resolution control
180
+ await core.setResolution(0.75); // Set to 75% of container size
181
+ await core.setResolution(0.5); // Set to 50% for performance
182
+ await core.updateResolution(); // Auto-detect container size changes
183
+
184
+ // Get performance statistics
185
+ const stats = core.getStats();
186
+ console.log('Current FPS:', stats.frameRate.effectiveRefreshRate);
187
+ console.log('Canvas size:', stats.resolution);
188
+ console.log('Scale factor:', stats.scale);
189
+ console.log('Parameter count:', stats.parameterCount);
190
+ ```
191
+
192
+ #### Debug and Development
193
+
194
+ ```typescript
195
+ // Enable debug logging
196
+ core.setDebugMode(true);
197
+
198
+ // Check debug mode status
199
+ const isDebugEnabled = core.getDebugMode();
200
+
201
+ // Debug mode provides detailed logging for:
202
+ // - Initialization process
203
+ // - Communication between components
204
+ // - Parameter system operations
205
+ // - Audio/video stream processing
206
+ // - Performance statistics
207
+ ```
208
+
209
+ ### Parameter Management
210
+
211
+ The parameter system provides a powerful way to create interactive scenes with automatic UI generation.
212
+
213
+ #### Parameter Definition and Access
214
+
215
+ ```typescript
216
+ // Listen for parameter definitions from artist code
217
+ core.onParametersDefined((groups) => {
218
+ console.log('Parameters available:', groups);
219
+
220
+ // Each group contains:
221
+ // - groupName: string
222
+ // - category: 'audio' | 'video' | 'interaction' | 'general'
223
+ // - description: string
224
+ // - parameters: Record<string, ParameterDefinition>
225
+
226
+ // Generate UI based on parameter groups
227
+ generateParameterUI(groups);
228
+ });
229
+
230
+ // Set individual parameter values
231
+ await core.setParameter('color', '#ff0000');
232
+ await core.setParameter('size', 75);
233
+ await core.setParameter('enabled', true);
234
+
235
+ // Set multiple parameters efficiently
236
+ await core.setParameters({
237
+ 'color': '#00ff00',
238
+ 'size': 100,
239
+ 'speed': 2.0,
240
+ 'enabled': false
241
+ });
242
+
243
+ // Get current parameter values
244
+ const values = core.getParameterValues();
245
+ const color = core.getParameter('color');
246
+
247
+ // Listen for parameter changes
248
+ core.onParameterChange('size', (value) => {
249
+ console.log('Size parameter changed to:', value);
250
+ });
251
+
252
+ // Listen for parameter errors
253
+ core.onParameterError((error) => {
254
+ console.error('Parameter error:', error.message);
255
+ console.error('Error code:', error.code);
256
+ });
257
+ ```
258
+
259
+ #### Capability-Aware Parameters
260
+
261
+ ```typescript
262
+ // Get all parameter groups (unfiltered, use for saving scene parameters)
263
+ const allGroups = core.getAllParameterGroups();
264
+
265
+ // Get parameter groups filtered by active capabilities (for UI)
266
+ const visibleGroups = core.getVisibleParameterGroups();
267
+
268
+ // Check current capabilities
269
+ const capabilities = core.getCapabilities();
270
+ console.log('Audio available:', capabilities.hasAudio);
271
+ console.log('Video available:', capabilities.hasVideo);
272
+ console.log('Interaction enabled:', capabilities.hasInteraction);
273
+
274
+ // Check if specific parameter category is active
275
+ const isAudioActive = core.isCategoryActive('audio');
276
+ const isVideoActive = core.isCategoryActive('video');
277
+
278
+ // Parameters are automatically categorized:
279
+ // - 'audio': Only shown when audio stream is connected
280
+ // - 'video': Only shown when video stream is connected
281
+ // - 'interaction': Only shown when user interaction is enabled
282
+ // - 'general': Always available
283
+ ```
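The category rules above amount to a simple filter over parameter groups. This is a sketch of the filtering idea, not the core's internal implementation:

```javascript
// Return only the parameter groups whose category is currently active.
function filterGroupsByCapabilities(groups, capabilities) {
  const active = {
    audio: capabilities.hasAudio,
    video: capabilities.hasVideo,
    interaction: capabilities.hasInteraction,
    general: true // always available
  };
  return groups.filter((group) => active[group.category] === true);
}
```

This mirrors the difference between `getAllParameterGroups()` (save everything) and `getVisibleParameterGroups()` (show only what is usable right now).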
284
+
285
+ ### Audio and Video Integration
286
+
287
+ #### Audio Stream Management
288
+
289
+ ```typescript
290
+ // Set audio stream for analysis
291
+ const audioStream = await navigator.mediaDevices.getUserMedia({
292
+ audio: {
293
+ echoCancellation: false,
294
+ noiseSuppression: false,
295
+ autoGainControl: false
296
+ }
297
+ });
298
+ await core.setAudioStream(audioStream);
299
+
300
+ // Configure audio analysis with the new namespace API
301
+ // Basic control
302
+ core.audio.setSensitivity(1.5); // Increase sensitivity (0.5-2.0)
303
+
304
+ // Tap tempo for manual BPM control
305
+ core.audio.beat.tap(); // Tap to set tempo
306
+ core.audio.beat.clearTaps(); // Clear tap history
307
+ const tapCount = core.audio.beat.getTapCount();
308
+
309
+ // Beat mode control
310
+ core.audio.beat.setMode('auto'); // Automatic beat detection (default)
311
+ core.audio.beat.setMode('manual'); // Manual BPM control
312
+ core.audio.beat.setBPM(120); // Set manual BPM (60-240)
313
+ const currentBPM = core.audio.beat.getBPM();
314
+
315
+ // Beat phase control for fine-tuning
316
+ core.audio.beat.nudge(0.1); // Nudge phase forward
317
+ core.audio.beat.resetPhase(); // Reset phase to zero
318
+
319
+ // Advanced configuration (optional)
320
+ core.audio.advanced.setFFTSize(2048); // FFT resolution (higher = more accurate)
321
+ core.audio.advanced.setSmoothing(0.8); // Smoothing factor (0-1)
322
+ core.audio.advanced.setAutoGain(true); // Auto-gain normalization
323
+ core.audio.advanced.setBeatDetection(true); // Enable beat detection
324
+ core.audio.advanced.setOnsetDetection(true); // Enable onset detection
325
+
326
+ // Get current audio state
327
+ const state = core.audio.getState();
328
+ console.log('BPM:', state.currentBPM);
329
+ console.log('Confidence:', state.confidence);
330
+ console.log('Mode:', state.mode);
331
+ console.log('Is Locked:', state.isLocked);
332
+
333
+ // Get current audio stream
334
+ const currentStream = core.getAudioStream();
335
+
336
+ // Disconnect audio
337
+ await core.setAudioStream(null);
338
+ ```
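Tap tempo boils down to averaging the intervals between recent taps. A sketch of that computation (illustrative; `core.audio.beat.tap()` handles this internally):

```javascript
// Estimate BPM from a list of tap timestamps in milliseconds.
function bpmFromTaps(tapTimesMs) {
  if (tapTimesMs.length < 2) return null; // need at least two taps
  let totalInterval = 0;
  for (let i = 1; i < tapTimesMs.length; i++) {
    totalInterval += tapTimesMs[i] - tapTimesMs[i - 1];
  }
  const avgIntervalMs = totalInterval / (tapTimesMs.length - 1);
  const bpm = 60000 / avgIntervalMs;
  return Math.max(60, Math.min(240, Math.round(bpm))); // clamp to the 60-240 range
}
```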
339
+
340
+ #### Video Stream Management
341
+
342
+ ```typescript
343
+ // Set video stream for analysis
344
+ const videoStream = await navigator.mediaDevices.getUserMedia({
345
+ video: {
346
+ width: { ideal: 640 },
347
+ height: { ideal: 480 },
348
+ frameRate: { ideal: 30 }
349
+ }
350
+ });
351
+ await core.setVideoStream(videoStream);
352
+
353
+ // Video analysis includes:
354
+ // - Real-time frame processing
355
+ // - Frame data access for custom analysis
356
+ // - Brightness and motion detection
357
+ // - Custom computer vision processing
358
+
359
+ // Disconnect video
360
+ await core.setVideoStream(null);
361
+ ```
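Custom analysis on raw frame data usually reduces pixels to a single signal. A sketch of average brightness over an RGBA buffer (illustrative; the exact frame format exposed to scenes may differ):

```javascript
// Average brightness (0-1) of an RGBA pixel buffer (Uint8ClampedArray-style).
function averageBrightness(rgba) {
  let sum = 0;
  const pixelCount = rgba.length / 4;
  for (let i = 0; i < rgba.length; i += 4) {
    // Perceptual luma weights for R, G, B; alpha is ignored.
    sum += 0.2126 * rgba[i] + 0.7152 * rgba[i + 1] + 0.0722 * rgba[i + 2];
  }
  return sum / pixelCount / 255;
}
```

Tracking this value across frames (e.g. comparing it to the previous frame's result) is the simplest form of the motion detection mentioned above.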
362
+
363
+ #### Interaction Management
364
+
365
+ ```typescript
366
+ // Enable or disable user interactions at runtime
367
+ await core.setInteractionEnabled(true); // Enable mouse, keyboard, and touch
368
+ await core.setInteractionEnabled(false); // Disable all interactions
369
+
370
+ // Get current interaction state
371
+ const isInteractionEnabled = core.getInteractionEnabled();
372
+
373
+ // Interaction state affects:
374
+ // - Mouse, keyboard, and touch event processing
375
+ // - Parameter visibility (interaction category parameters)
376
+ // - Scene behavior that depends on user input
377
+
378
+ // Note: Interaction state is separate from initialization config
379
+ // You can toggle interactions regardless of initial allowUserInteraction value
380
+ // The interaction system is always available for runtime control
381
+ ```
382
+
383
+ #### Capability Change Monitoring
384
+
385
+ ```typescript
386
+ // Listen for capability changes
387
+ core.onCapabilitiesChange((capabilities) => {
388
+ console.log('Capabilities updated:', capabilities);
389
+
390
+ // Update UI based on new capabilities
391
+ if (capabilities.hasAudio) {
392
+ showAudioControls();
393
+ } else {
394
+ hideAudioControls();
395
+ }
396
+
397
+ if (capabilities.hasVideo) {
398
+ showVideoControls();
399
+ } else {
400
+ hideVideoControls();
401
+ }
402
+
403
+ if (capabilities.hasInteraction) {
404
+ showInteractionControls();
405
+ } else {
406
+ hideInteractionControls();
407
+ }
408
+ });
409
+ ```
410
+
411
+ ### Event Handling and Lifecycle
412
+
413
+ #### Core Lifecycle Events
414
+
415
+ ```typescript
416
+ // Core is ready for operations
417
+ if (core.ready) {
418
+ // All systems initialized and running
419
+ console.log('Core is fully operational');
420
+ }
421
+
422
+ // Check if parameters are initialized
423
+ if (core.parametersReady) {
424
+ // Parameter system is ready
425
+ console.log('Parameters are available');
426
+ }
427
+ ```
428
+
429
+ #### Cleanup and Resource Management
430
+
431
+ ```typescript
432
+ // Destroy instance and clean up all resources
433
+ await core.destroy();
434
+
435
+ // This automatically:
436
+ // - Stops all rendering loops
437
+ // - Disconnects audio/video streams
438
+ // - Cleans up WebWorker and IFrame
439
+ // - Releases all event listeners
440
+ // - Clears parameter system
441
+ // - Frees memory resources
442
+ ```
443
+
444
+ ## 🎨 Artist API
445
+
446
+ The artist API provides a comprehensive set of tools for creating interactive, audio-reactive, and video-responsive scenes.
447
+
448
+ ### Canvas and Rendering
449
+
450
+ ```typescript
451
+ function render(viji) {
452
+ // Get canvas contexts
453
+ const ctx = viji.useContext('2d'); // 2D rendering context
454
+ const gl = viji.useContext('webgl'); // WebGL rendering context
455
+
456
+ // Canvas properties
457
+ viji.canvas; // OffscreenCanvas object
458
+ viji.width; // Canvas width in pixels
459
+ viji.height; // Canvas height in pixels
460
+ viji.pixelRatio; // Device pixel ratio for crisp rendering
461
+
462
+ // Example: Draw a responsive circle
463
+ const centerX = viji.width / 2;
464
+ const centerY = viji.height / 2;
465
+ const radius = Math.min(viji.width, viji.height) * 0.1;
466
+
467
+ ctx.fillStyle = '#ff6b6b';
468
+ ctx.beginPath();
469
+ ctx.arc(centerX, centerY, radius, 0, Math.PI * 2);
470
+ ctx.fill();
471
+ }
472
+ ```
473
+
474
+ ### Timing Information
475
+
476
+ The timing system provides FPS-independent timing data for smooth animations:
477
+
478
+ ```typescript
479
+ function render(viji) {
+ const ctx = viji.useContext('2d');
480
+ // Timing data (FPS independent)
481
+ viji.time; // Elapsed time in seconds since scene start
482
+ viji.deltaTime; // Time since last frame in seconds
483
+ viji.frameCount; // Total number of frames rendered
484
+ viji.fps; // Current frames per second
485
+
486
+ // Example: Smooth animation regardless of frame rate
487
+ const animationSpeed = 2.0; // rotations per second
488
+ const rotation = (viji.time * animationSpeed * Math.PI * 2) % (Math.PI * 2);
489
+
490
+ ctx.save();
491
+ ctx.translate(viji.width / 2, viji.height / 2);
492
+ ctx.rotate(rotation);
493
+ ctx.fillRect(-25, -25, 50, 50);
494
+ ctx.restore();
495
+ }
496
+ ```
497
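
Because `deltaTime` varies frame to frame, linear "move a fixed fraction per frame" smoothing behaves differently at 30fps vs 60fps. An exponential damping helper fixes this; note that `damp` is a plain utility sketched here for illustration, not part of the Viji API:

```javascript
// Exponential damping: moves `current` toward `target` at a rate that is
// independent of frame rate, using deltaTime from the render loop.
// Higher `lambda` means a snappier response.
function damp(current, target, lambda, deltaTime) {
  return target + (current - target) * Math.exp(-lambda * deltaTime);
}

// Inside render(viji), e.g. to smoothly follow the mouse:
//   smoothedX = damp(smoothedX, viji.mouse.x, 8, viji.deltaTime);
```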
+
498
+ ### Parameter System
499
+
500
+ The parameter system allows artists to define interactive parameters that automatically generate UI controls.
501
+
502
+ #### Parameter Definition
503
+
504
+ ```typescript
505
+ // Define parameters (call once outside render loop)
506
+ const color = viji.color('#ff6b6b', {
507
+ label: 'Primary Color',
508
+ description: 'Main color for shapes',
509
+ group: 'appearance',
510
+ category: 'general'
511
+ });
512
+
513
+ const size = viji.slider(50, {
514
+ min: 10,
515
+ max: 150,
516
+ step: 5,
517
+ label: 'Shape Size',
518
+ description: 'Size of shapes in pixels',
519
+ group: 'appearance',
520
+ category: 'general'
521
+ });
522
+
523
+ const speed = viji.slider(1.0, {
524
+ min: 0.1,
525
+ max: 3.0,
526
+ step: 0.1,
527
+ label: 'Animation Speed',
528
+ description: 'Speed of animation in rotations per second',
529
+ group: 'animation',
530
+ category: 'general'
531
+ });
532
+
533
+ const useAudio = viji.toggle(false, {
534
+ label: 'Audio Reactive',
535
+ description: 'Make shapes react to audio input',
536
+ group: 'audio',
537
+ category: 'audio'
538
+ });
539
+
540
+ const shapeType = viji.select('circle', {
541
+ options: ['circle', 'square', 'triangle', 'star'],
542
+ label: 'Shape Type',
543
+ description: 'Type of shape to draw',
544
+ group: 'appearance',
545
+ category: 'general'
546
+ });
547
+
548
+ const title = viji.text('My Scene', {
549
+ label: 'Scene Title',
550
+ description: 'Title displayed in the scene',
551
+ group: 'text',
552
+ category: 'general',
553
+ maxLength: 50
554
+ });
555
+
556
+ const particleCount = viji.number(5, {
557
+ min: 1,
558
+ max: 20,
559
+ step: 1,
560
+ label: 'Particle Count',
561
+ description: 'Number of particles to render',
562
+ group: 'animation',
563
+ category: 'general'
564
+ });
565
+ ```
566
+
567
+ #### Parameter Usage in Render Loop
568
+
569
+ ```typescript
570
+ function render(viji) {
571
+ const ctx = viji.useContext('2d');
572
+
573
+ // Fast parameter access (proxy-based)
574
+ ctx.fillStyle = color.value; // Get current color value
575
+ const radius = size.value / 2; // Get current size value
576
+ const animationSpeed = speed.value; // Get current speed value
577
+
578
+ // Clear canvas
579
+ ctx.fillStyle = '#2c3e50';
580
+ ctx.fillRect(0, 0, viji.width, viji.height);
581
+
582
+ // Draw title
583
+ ctx.fillStyle = 'white';
584
+ ctx.font = '20px Arial';
585
+ ctx.textAlign = 'center';
586
+ ctx.fillText(title.value, viji.width / 2, 30);
587
+
588
+ // Draw particles
589
+ for (let i = 0; i < particleCount.value; i++) {
590
+ const angle = (i / particleCount.value) * Math.PI * 2 + (viji.time * animationSpeed);
591
+ const x = viji.width / 2 + Math.cos(angle) * 100;
592
+ const y = viji.height / 2 + Math.sin(angle) * 100;
593
+
594
+ ctx.fillStyle = color.value;
595
+ ctx.beginPath();
596
+
597
+ switch (shapeType.value) {
598
+ case 'circle':
599
+ ctx.arc(x, y, radius, 0, Math.PI * 2);
600
+ break;
601
+ case 'square':
602
+ ctx.rect(x - radius, y - radius, radius * 2, radius * 2);
603
+ break;
604
+ case 'triangle':
605
+ ctx.moveTo(x, y - radius);
606
+ ctx.lineTo(x - radius, y + radius);
607
+ ctx.lineTo(x + radius, y + radius);
608
+ ctx.closePath();
609
+ break;
610
+ }
611
+
612
+ ctx.fill();
613
+ }
614
+ }
615
+ ```
616
+
617
+ ### Audio Analysis
618
+
619
+ The audio system provides real-time analysis of audio input with comprehensive frequency and volume data.
620
+
621
+ #### Audio API Overview
622
+
623
+ ```typescript
624
+ function render(viji) {
+ const ctx = viji.useContext('2d');
625
+ const audio = viji.audio;
626
+
627
+ if (audio.isConnected) {
628
+ // Volume analysis with smooth values
629
+ const volume = audio.volume.current; // 0-1 current volume level (RMS-based)
630
+ const peak = audio.volume.peak; // 0-1 peak amplitude (instant)
631
+ const smooth = audio.volume.smoothed; // 0-1 smoothed volume (200ms decay)
632
+
633
+ // Frequency bands (0-1 values) with instant and smooth versions
634
+ const low = audio.bands.low; // 20-150 Hz (bass/kick range, instant)
635
+ const lowSmoothed = audio.bands.lowSmoothed; // Smooth low frequency energy
636
+ const lowMid = audio.bands.lowMid; // 150-400 Hz
637
+ const mid = audio.bands.mid; // 400-2500 Hz (vocals, instruments)
638
+ const highMid = audio.bands.highMid; // 2500-8000 Hz (cymbals, hi-hats)
639
+ const high = audio.bands.high; // 8000-20000 Hz (air, brilliance)
640
+
641
+ // Automatic beat detection with BPM tracking
642
+ const beat = audio.beat;
643
+
644
+ // Fast energy curves (300ms decay - primary for most visuals)
645
+ const kickEnergy = beat.kick; // 0-1 kick drum energy
646
+ const snareEnergy = beat.snare; // 0-1 snare energy
647
+ const hatEnergy = beat.hat; // 0-1 hi-hat energy
648
+ const anyEnergy = beat.any; // 0-1 any beat type energy
649
+
650
+ // Smoothed energy curves (500ms decay - for slower animations)
651
+ const kickSmoothed = beat.kickSmoothed;
652
+ const snareSmoothed = beat.snareSmoothed;
653
+ const anySmoothed = beat.anySmoothed;
654
+
655
+ // Instant triggers (for advanced use cases)
656
+ if (beat.triggers.kick) {
657
+ // Kick drum detected this frame
658
+ spawnParticle('kick');
659
+ }
660
+
661
+ // BPM and tempo information
662
+ const bpm = beat.bpm; // Detected BPM (60-180)
663
+ const phase = beat.phase; // Beat phase 0-1 within current beat
664
+ const bar = beat.bar; // Current bar number (0-3 for 4/4)
665
+ const confidence = beat.confidence; // Detection confidence 0-1
666
+ const isLocked = beat.isLocked; // True when beat is locked
667
+
668
+ // Spectral features for advanced audio-reactive effects
669
+ const brightness = audio.spectral.brightness; // 0-1 spectral centroid
670
+ const flatness = audio.spectral.flatness; // 0-1 spectral flatness (noisiness)
671
+ const flux = audio.spectral.flux; // 0-1 spectral flux (change)
672
+
673
+ // Raw frequency data (0-255 values)
674
+ const frequencyData = audio.getFrequencyData();
675
+
676
+ // Example 1: Smooth beat-reactive animation (primary pattern)
677
+ const scale = 1 + kickEnergy * 0.5; // Smooth pulsing with kick
678
+ const hue = lowSmoothed * 180; // Smooth color based on low frequencies
679
+
680
+ ctx.save();
681
+ ctx.translate(viji.width / 2, viji.height / 2);
682
+ ctx.scale(scale, scale);
683
+ ctx.fillStyle = `hsl(${hue}, 70%, 60%)`;
684
+ ctx.fillRect(-25, -25, 50, 50);
685
+ ctx.restore();
686
+
687
+ // Example 2: Phase-synced rotation
688
+ const rotation = phase * Math.PI * 2; // Rotate with beat phase
689
+ ctx.rotate(rotation);
690
+ }
691
+ }
692
+ ```
693
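
The smoothed values above (`volume.smoothed`, `beat.kick`, `beat.kickSmoothed`) can be thought of as decaying energy envelopes: they jump to a new peak instantly, then fall off over the stated decay time. The sketch below illustrates that behavior with an exponential decay; it is an illustration of the concept, not the engine's actual implementation:

```javascript
// Decaying energy envelope: instant attack, exponential release.
// `decaySeconds` plays the role of the 200-500ms decay times listed above.
function createEnvelope(decaySeconds) {
  let level = 0;
  return {
    update(input, deltaTime) {
      level *= Math.exp(-deltaTime / decaySeconds); // exponential decay
      if (input > level) level = input;             // instant attack on a new peak
      return level;
    },
  };
}
```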
+
694
+ #### Audio-Reactive Scene Example
695
+
696
+ ```typescript
697
+ // Define audio-reactive parameters
698
+ const audioReactive = viji.toggle(true, {
699
+ label: 'Audio Reactive',
700
+ description: 'Make shapes react to audio',
701
+ group: 'audio',
702
+ category: 'audio'
703
+ });
704
+
705
+ const volumeSensitivity = viji.slider(1.0, {
706
+ min: 0.1,
707
+ max: 5.0,
708
+ step: 0.1,
709
+ label: 'Volume Sensitivity',
710
+ description: 'How sensitive shapes are to volume',
711
+ group: 'audio',
712
+ category: 'audio'
713
+ });
714
+
715
+ const bassReactivity = viji.slider(1.0, {
716
+ min: 0,
717
+ max: 3.0,
718
+ step: 0.1,
719
+ label: 'Bass Reactivity',
720
+ description: 'How much shapes react to bass',
721
+ group: 'audio',
722
+ category: 'audio'
723
+ });
724
+
725
+ function render(viji) {
726
+ const ctx = viji.useContext('2d');
727
+ const audio = viji.audio;
728
+
729
+ // Clear canvas
730
+ ctx.fillStyle = '#2c3e50';
731
+ ctx.fillRect(0, 0, viji.width, viji.height);
732
+
733
+ if (audioReactive.value && audio.isConnected) {
734
+ // Audio-reactive animation
735
+ const volume = audio.volume.current * volumeSensitivity.value;
736
+ const bass = audio.bands.low * bassReactivity.value;
737
+
738
+ // Scale based on volume
739
+ const scale = 1 + volume;
740
+
741
+ // Color based on bass
742
+ const hue = 200 + (bass * 160); // Blue to purple range
743
+
744
+ // Position based on frequency distribution
745
+ const x = viji.width * (audio.bands.mid + audio.bands.high) / 2;
746
+ const y = viji.height * (1 - audio.bands.low);
747
+
748
+ ctx.save();
749
+ ctx.translate(x, y);
750
+ ctx.scale(scale, scale);
751
+ ctx.fillStyle = `hsl(${hue}, 80%, 60%)`;
752
+ ctx.beginPath();
753
+ ctx.arc(0, 0, 30, 0, Math.PI * 2);
754
+ ctx.fill();
755
+ ctx.restore();
756
+ }
757
+ }
758
+ ```
759
+
760
+ ### Video Analysis
761
+
762
+ The video system provides real-time video frame analysis with frame data access for custom processing.
763
+
764
+ #### Video API Overview
765
+
766
+ ```typescript
767
+ function render(viji) {
+ const ctx = viji.useContext('2d');
768
+ const video = viji.video;
769
+
770
+ if (video.isConnected) {
771
+ // Video properties
772
+ const frameWidth = video.frameWidth;
773
+ const frameHeight = video.frameHeight;
774
+ const frameRate = video.frameRate;
775
+
776
+ // Current video frame (OffscreenCanvas)
777
+ if (video.currentFrame) {
778
+ // Draw video frame as background
779
+ ctx.globalAlpha = 0.3;
780
+ ctx.drawImage(video.currentFrame, 0, 0, viji.width, viji.height);
781
+ ctx.globalAlpha = 1.0;
782
+ }
783
+
784
+ // Frame data for custom analysis
785
+ const frameData = video.getFrameData();
786
+
787
+ // Example: Custom video analysis
788
+ if (frameData) {
789
+ // Access raw pixel data for custom processing
790
+ const imageData = frameData.data;
791
+ const width = frameData.width;
792
+ const height = frameData.height;
793
+
794
+ // Example: Calculate average brightness
795
+ let totalBrightness = 0;
796
+ for (let i = 0; i < imageData.length; i += 4) {
797
+ const r = imageData[i];
798
+ const g = imageData[i + 1];
799
+ const b = imageData[i + 2];
800
+ totalBrightness += (r + g + b) / 3;
801
+ }
802
+ const averageBrightness = totalBrightness / (imageData.length / 4);
803
+
804
+ // Use brightness for effects
805
+ const brightness = averageBrightness / 255; // Normalize to 0-1
806
+
807
+ // Create brightness-reactive animation
808
+ ctx.fillStyle = `rgba(255, 255, 255, ${brightness * 0.5})`;
809
+ ctx.fillRect(0, 0, viji.width, viji.height);
810
+ }
811
+ }
812
+ }
813
+ ```
814
+
815
+ #### Video-Reactive Scene Example
816
+
817
+ ```typescript
818
+ // Define video-reactive parameters
819
+ const videoReactive = viji.toggle(true, {
820
+ label: 'Video Reactive',
821
+ description: 'Make shapes react to video',
822
+ group: 'video',
823
+ category: 'video'
824
+ });
825
+
826
+ const motionSensitivity = viji.slider(1.0, {
827
+ min: 0.1,
828
+ max: 3.0,
829
+ step: 0.1,
830
+ label: 'Motion Sensitivity',
831
+ description: 'How sensitive shapes are to video changes',
832
+ group: 'video',
833
+ category: 'video'
834
+ });
835
+
836
+ function render(viji) {
837
+ const ctx = viji.useContext('2d');
838
+ const video = viji.video;
839
+
840
+ if (videoReactive.value && video.isConnected) {
841
+ // Video-reactive animation using frame data
842
+ const frameData = video.getFrameData();
843
+
844
+ if (frameData) {
845
+ // Approximate "motion" via overall frame brightness (a simple stand-in)
846
+ // Real motion detection would compare pixels against the previous frame
847
+ const imageData = frameData.data;
848
+ let motionEnergy = 0;
849
+
850
+ // Calculate motion energy (simplified)
851
+ for (let i = 0; i < imageData.length; i += 4) {
852
+ const brightness = (imageData[i] + imageData[i + 1] + imageData[i + 2]) / 3;
853
+ motionEnergy += brightness;
854
+ }
855
+
856
+ const normalizedMotion = (motionEnergy / (imageData.length / 4)) / 255;
857
+ const scale = 1 + (normalizedMotion * motionSensitivity.value);
858
+
859
+ // Create motion-reactive shapes
860
+ ctx.save();
861
+ ctx.translate(viji.width / 2, viji.height / 2);
862
+ ctx.scale(scale, scale);
863
+ ctx.fillStyle = `hsl(${normalizedMotion * 360}, 70%, 60%)`;
864
+ ctx.beginPath();
865
+ ctx.arc(0, 0, 30, 0, Math.PI * 2);
866
+ ctx.fill();
867
+ ctx.restore();
868
+ }
869
+ }
870
+ }
871
+ ```
872
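
The example above uses overall brightness as a stand-in for motion. A true motion metric compares consecutive frames. The helper below is a plain function, not part of the Viji API: keep a copy of the previous frame's pixels and call it each frame with the `data` array from `video.getFrameData()`:

```javascript
// Mean per-pixel brightness difference between two RGBA frames, normalized
// to 0-1. Returns 0 until a previous frame of matching size is available.
function motionEnergy(prevPixels, currPixels) {
  if (!prevPixels || prevPixels.length !== currPixels.length) return 0;
  let diff = 0;
  for (let i = 0; i < currPixels.length; i += 4) {
    const prev = (prevPixels[i] + prevPixels[i + 1] + prevPixels[i + 2]) / 3;
    const curr = (currPixels[i] + currPixels[i + 1] + currPixels[i + 2]) / 3;
    diff += Math.abs(curr - prev);
  }
  return diff / (currPixels.length / 4) / 255;
}

// In render(viji): const energy = motionEnergy(previousPixels, frameData.data);
// then copy frameData.data into previousPixels for the next frame.
```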
+
873
+ ### User Interaction
874
+
875
+ The interaction system provides comprehensive support for mouse, keyboard, and touch input.
876
+
877
+ #### Mouse Interaction
878
+
879
+ ```typescript
880
+ function render(viji) {
+ const ctx = viji.useContext('2d');
881
+ const mouse = viji.mouse;
882
+
883
+ // Mouse position (canvas coordinates)
884
+ if (mouse.isInCanvas) {
885
+ const x = mouse.x; // Current X coordinate
886
+ const y = mouse.y; // Current Y coordinate
887
+
888
+ // Mouse movement
889
+ const deltaX = mouse.deltaX; // X movement since last frame
890
+ const deltaY = mouse.deltaY; // Y movement since last frame
891
+ const velocity = mouse.velocity; // Smoothed velocity { x, y }
892
+
893
+ // Mouse buttons
894
+ const isPressed = mouse.isPressed; // Any button currently pressed
895
+ const leftButton = mouse.leftButton; // Left button state
896
+ const rightButton = mouse.rightButton; // Right button state
897
+ const middleButton = mouse.middleButton; // Middle button state
898
+
899
+ // Frame-based events
900
+ const wasPressed = mouse.wasPressed; // Button was pressed this frame
901
+ const wasReleased = mouse.wasReleased; // Button was released this frame
902
+ const wasMoved = mouse.wasMoved; // Mouse moved this frame
903
+
904
+ // Scroll wheel
905
+ const wheelDelta = mouse.wheelDelta; // Combined wheel delta
906
+ const wheelX = mouse.wheelX; // Horizontal wheel delta
907
+ const wheelY = mouse.wheelY; // Vertical wheel delta
908
+
909
+ // Example: Mouse-reactive animation
910
+ ctx.fillStyle = leftButton ? 'red' : 'blue';
911
+ ctx.beginPath();
912
+ ctx.arc(x, y, 20 + Math.abs(velocity.x) + Math.abs(velocity.y), 0, Math.PI * 2);
913
+ ctx.fill();
914
+ }
915
+ }
916
+ ```
917
+
918
+ #### Keyboard Interaction
919
+
920
+ ```typescript
921
+ function render(viji) {
+ const ctx = viji.useContext('2d');
922
+ const keyboard = viji.keyboard;
923
+
924
+ // Key state queries
925
+ if (keyboard.isPressed('w')) {
926
+ // W key is currently pressed
927
+ console.log('W key is held down');
928
+ }
929
+
930
+ if (keyboard.wasPressed('space')) {
931
+ // Space was pressed this frame
932
+ console.log('Space was pressed!');
933
+ }
934
+
935
+ if (keyboard.wasReleased('escape')) {
936
+ // Escape was released this frame
937
+ console.log('Escape was released!');
938
+ }
939
+
940
+ // Active key tracking
941
+ const activeKeys = keyboard.activeKeys; // Set of currently pressed keys
942
+ const pressedThisFrame = keyboard.pressedThisFrame; // Set of keys pressed this frame
943
+ const releasedThisFrame = keyboard.releasedThisFrame; // Set of keys released this frame
944
+
945
+ // Modifier keys
946
+ const shift = keyboard.shift; // Shift key is held
947
+ const ctrl = keyboard.ctrl; // Ctrl key is held
948
+ const alt = keyboard.alt; // Alt key is held
949
+ const meta = keyboard.meta; // Meta/Cmd key is held
950
+
951
+ // Recent activity
952
+ const lastKeyPressed = keyboard.lastKeyPressed; // Last key that was pressed
953
+ const lastKeyReleased = keyboard.lastKeyReleased; // Last key that was released
954
+
955
+ // Example: Keyboard-controlled movement
956
+ let moveX = 0;
957
+ let moveY = 0;
958
+
959
+ if (keyboard.isPressed('w') || keyboard.isPressed('W')) moveY -= 5;
960
+ if (keyboard.isPressed('s') || keyboard.isPressed('S')) moveY += 5;
961
+ if (keyboard.isPressed('a') || keyboard.isPressed('A')) moveX -= 5;
962
+ if (keyboard.isPressed('d') || keyboard.isPressed('D')) moveX += 5;
963
+
964
+ // Apply the per-frame offset (accumulate it into a position variable for persistent movement)
965
+ ctx.save();
966
+ ctx.translate(moveX, moveY);
967
+ ctx.fillStyle = 'green';
968
+ ctx.fillRect(0, 0, 50, 50);
969
+ ctx.restore();
970
+ }
971
+ ```
972
+
973
+ #### Touch Interaction
974
+
975
+ ```typescript
976
+ function render(viji) {
+ const ctx = viji.useContext('2d');
977
+ const touches = viji.touches;
978
+
979
+ // Touch points
980
+ for (const touch of touches.points) {
981
+ const x = touch.x; // Touch X coordinate
982
+ const y = touch.y; // Touch Y coordinate
983
+ const pressure = touch.pressure; // Pressure (0-1)
984
+ const radius = touch.radius; // Touch radius
985
+ const id = touch.id; // Unique touch ID
986
+
987
+ // Movement
988
+ const deltaX = touch.deltaX; // X movement since last frame
989
+ const deltaY = touch.deltaY; // Y movement since last frame
990
+ const velocity = touch.velocity; // Movement velocity { x, y }
991
+
992
+ // Lifecycle
993
+ const isNew = touch.isNew; // Touch started this frame
994
+ const isActive = touch.isActive; // Touch is currently active
995
+ const isEnding = touch.isEnding; // Touch ending this frame
996
+
997
+ // Draw touch point
998
+ ctx.fillStyle = isNew ? 'red' : isEnding ? 'yellow' : 'blue';
999
+ ctx.beginPath();
1000
+ ctx.arc(x, y, radius * 2, 0, Math.PI * 2);
1001
+ ctx.fill();
1002
+ }
1003
+
1004
+ // Touch events
1005
+ const started = touches.started; // Touches that started this frame
1006
+ const moved = touches.moved; // Touches that moved this frame
1007
+ const ended = touches.ended; // Touches that ended this frame
1008
+
1009
+ // Primary touch (first touch point)
1010
+ const primary = touches.primary; // Primary touch point or null
1011
+
1012
+ // Touch gestures
1013
+ const gestures = touches.gestures;
1014
+
1015
+ if (gestures.isPinching) {
1016
+ const scale = gestures.pinchScale; // Current pinch scale
1017
+ const delta = gestures.pinchDelta; // Scale change since last frame
1018
+
1019
+ // React to pinch gesture
1020
+ ctx.save();
1021
+ ctx.scale(scale, scale);
1022
+ ctx.fillStyle = 'purple';
1023
+ ctx.fillRect(0, 0, 100, 100);
1024
+ ctx.restore();
1025
+ }
1026
+
1027
+ if (gestures.isRotating) {
1028
+ const angle = gestures.rotationAngle; // Current rotation angle
1029
+ const delta = gestures.rotationDelta; // Rotation change since last frame
1030
+
1031
+ // React to rotation gesture
1032
+ ctx.save();
1033
+ ctx.rotate(angle);
1034
+ ctx.fillStyle = 'orange';
1035
+ ctx.fillRect(-25, -25, 50, 50);
1036
+ ctx.restore();
1037
+ }
1038
+
1039
+ if (gestures.isPanning) {
1040
+ const panDelta = gestures.panDelta; // Pan movement { x, y }
1041
+
1042
+ // React to pan gesture
1043
+ ctx.save();
1044
+ ctx.translate(panDelta.x, panDelta.y);
1045
+ ctx.fillStyle = 'cyan';
1046
+ ctx.fillRect(0, 0, 50, 50);
1047
+ ctx.restore();
1048
+ }
1049
+
1050
+ if (gestures.isTapping) {
1051
+ const tapCount = gestures.tapCount; // Number of taps
1052
+ const tapPosition = gestures.tapPosition; // { x, y } tap position
1053
+
1054
+ // React to tap gesture
1055
+ if (tapPosition) {
1056
+ ctx.fillStyle = 'lime';
1057
+ ctx.beginPath();
1058
+ ctx.arc(tapPosition.x, tapPosition.y, 30, 0, Math.PI * 2);
1059
+ ctx.fill();
1060
+ }
1061
+ }
1062
+ }
1063
+ ```
1064
+
1065
+ #### Device Sensors (Motion, Orientation, Geolocation)
1066
+
1067
+ ```typescript
1068
+ function render(viji) {
+ const ctx = viji.useContext('2d');
1069
+ // Internal device (current device running the scene)
1070
+ const device = viji.device;
1071
+
1072
+ // Check if device motion is available
1073
+ if (device.motion?.acceleration) {
1074
+ const accelX = device.motion.acceleration.x; // m/s² (without gravity)
1075
+
1076
+ // Example: Shake detection
1077
+ const magnitude = Math.sqrt(
1078
+ accelX**2 + device.motion.acceleration.y**2 + device.motion.acceleration.z**2
1079
+ );
1080
+ if (magnitude > 15) {
1081
+ console.log('Device shaken!');
1082
+ }
1083
+ }
1084
+
1085
+ // Check if device orientation is available
1086
+ if (device.orientation.gamma !== null) {
1087
+ const tiltLR = device.orientation.gamma; // -90 to 90 (left/right tilt)
1088
+ const tiltFB = device.orientation.beta; // -180 to 180 (front/back tilt)
1089
+
1090
+ // Example: Tilt-based control
1091
+ const moveX = (tiltLR / 90) * 5;
1092
+ ctx.save();
1093
+ ctx.translate(moveX, 0);
1094
+ ctx.fillStyle = 'red';
1095
+ ctx.fillRect(viji.width/2 - 25, viji.height/2 - 25, 50, 50);
1096
+ ctx.restore();
1097
+ }
1098
+
1099
+ // Check if geolocation is available
1100
+ if (device.geolocation.latitude !== null) {
1101
+ const lat = device.geolocation.latitude;
1102
+ const lon = device.geolocation.longitude;
1103
+
1104
+ ctx.fillStyle = 'white';
1105
+ ctx.fillText(`Location: ${lat.toFixed(4)}, ${lon.toFixed(4)}`, 10, 30);
1106
+ }
1107
+
1108
+ // External connected devices (WebRTC/Sockets)
1109
+ viji.devices.forEach((device, index) => {
1110
+ if (device.orientation.gamma !== null) {
1111
+ // Multi-device control example
1112
+ const x = (device.orientation.gamma / 90 + 1) * viji.width / 2;
1113
+ ctx.fillStyle = `hsl(${index * 60}, 70%, 60%)`;
1114
+ ctx.beginPath();
1115
+ ctx.arc(x, 100 + index * 60, 25, 0, Math.PI * 2);
1116
+ ctx.fill();
1117
+ }
1118
+ });
1119
+ }
1120
+ ```
1121
+
1122
+ **Device Sensor Features:**
1123
+ - **Motion Sensors**: Accelerometer and gyroscope data for shake detection and motion tracking
1124
+ - **Orientation Sensors**: Device tilt for tilt-based controls and compass heading
1125
+ - **Geolocation**: GPS coordinates for location-based content
1126
+ - **Multi-Device Support**: Connect multiple external devices via WebRTC for installations
1127
+ - **Graceful Degradation**: Automatic handling of missing sensors or permissions
1128
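
The shake check from the example above can be factored into a small reusable helper. This is a sketch, not part of the Viji API; the 15 m/s² threshold is the same illustrative value used in the example:

```javascript
// Returns true when the acceleration magnitude (without gravity) exceeds
// the threshold. Tolerates missing sensor data by returning false.
function isShake(acceleration, threshold = 15) {
  if (!acceleration) return false;
  const magnitude = Math.hypot(acceleration.x, acceleration.y, acceleration.z);
  return magnitude > threshold;
}

// In render(viji): if (isShake(viji.device.motion?.acceleration)) { ... }
```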
+
1129
+ **See `docs/20-device-sensor-system.md` for complete device sensor documentation including WebRTC integration and multi-device installations.**
1130
+
1131
+ ## 🎨 P5.js Support
1132
+
1133
+ Viji Core supports **P5.js** as an optional rendering library. P5.js provides familiar creative coding APIs while maintaining all Viji features including audio reactivity, video processing, and parameter management.
1134
+
1135
+ ### Enabling P5.js Mode
1136
+
1137
+ Add a single comment at the top of your scene code:
1138
+
1139
+ ```javascript
1140
+ // @renderer p5
1141
+
1142
+ function setup(viji, p5) {
1143
+ p5.colorMode(p5.HSB);
1144
+ }
1145
+
1146
+ function render(viji, p5) {
1147
+ p5.background(220);
1148
+ p5.fill(255, 0, 0);
1149
+ p5.ellipse(viji.mouse.x, viji.mouse.y, 50, 50);
1150
+ }
1151
+ ```
1152
+
1153
+ ### What Works
1154
+
1155
+ - ✅ All P5.js drawing functions (shapes, colors, transforms, typography)
1156
+ - ✅ P5.js math utilities (`noise()`, `random()`, `map()`, `lerp()`)
1157
+ - ✅ P5.js vectors (`p5.Vector` class)
1158
+ - ✅ WebGL mode (`p5.WEBGL`)
1159
+ - ✅ Full Viji integration (audio, video, parameters, interaction)
1160
+
1161
+ ### What Doesn't Work
1162
+
1163
+ - ❌ p5.dom (use Viji parameters instead)
1164
+ - ❌ p5.sound (use `viji.audio` instead)
1165
+ - ❌ P5.js events (use `viji.mouse`/`keyboard`/`touches` instead)
1166
+ - ❌ Direct image loading (use Viji image parameters instead)
1167
+
1168
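
Instead of P5.js event callbacks, Viji exposes per-frame state that you poll inside `render` (`viji.mouse.wasPressed`, `viji.keyboard.wasPressed(...)`). The helper below illustrates how such frame-based events can be derived from held state by edge detection; it is a standalone sketch for illustration, not how the engine is implemented:

```javascript
// Turns continuous "is the button down?" state into a one-frame
// "was it pressed this frame?" event, the pattern behind wasPressed.
function createEdgeDetector() {
  let previous = false;
  return function wasPressed(isDown) {
    const pressed = isDown && !previous; // true only on the down transition
    previous = isDown;
    return pressed;
  };
}

// In P5 mode, just poll Viji inside render(viji, p5):
//   if (viji.mouse.wasPressed) { /* replaces p5's mousePressed() */ }
```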
+ ### Audio Reactive P5.js Example
1169
+
1170
+ ```javascript
1171
+ // @renderer p5
1172
+
1173
+ const audioReactive = viji.toggle(true, {
1174
+ label: 'Audio Reactive',
1175
+ category: 'audio'
1176
+ });
1177
+
1178
+ const bassReactivity = viji.slider(1.0, {
1179
+ min: 0,
1180
+ max: 3.0,
1181
+ label: 'Bass Reactivity',
1182
+ category: 'audio'
1183
+ });
1184
+
1185
+ function render(viji, p5) {
1186
+ p5.background(0);
1187
+
1188
+ if (audioReactive.value && viji.audio.isConnected) {
1189
+ const bass = viji.audio.bands.low;
1190
+ const volume = viji.audio.volume.current;
1191
+
1192
+ const hue = bass * 360 * bassReactivity.value;
1193
+ const size = 100 + volume * 200;
1194
+
1195
+ p5.colorMode(p5.HSB);
1196
+ p5.fill(hue, 80, 100);
1197
+ p5.ellipse(p5.width / 2, p5.height / 2, size, size);
1198
+ }
1199
+ }
1200
+ ```
1201
+
1202
+ ### Image Parameters
1203
+
1204
+ ```javascript
1205
+ // @renderer p5
1206
+
1207
+ const bgImage = viji.image(null, {
1208
+ label: 'Background Image',
1209
+ group: 'media',
1210
+ accept: 'image/*'
1211
+ });
1212
+
1213
+ function render(viji, p5) {
1214
+ p5.background(220);
1215
+
1216
+ if (bgImage.value) {
1217
+ p5.image(bgImage.value, 0, 0, p5.width, p5.height);
1218
+ }
1219
+
1220
+ // Draw on top of image
1221
+ p5.fill(255, 0, 0, 128);
1222
+ p5.ellipse(viji.mouse.x, viji.mouse.y, 50, 50);
1223
+ }
1224
+ ```
1225
+
1226
+ ### Getting Renderer Type
1227
+
1228
+ ```typescript
1229
+ const core = new VijiCore({
1230
+ hostContainer: container,
1231
+ sceneCode: sceneCode
1232
+ });
1233
+
1234
+ await core.initialize();
1235
+
1236
+ // Check which renderer is in use (reported in runtime stats alongside FPS, resolution, etc.)
1237
+ const stats = core.getStats();
1238
+ const rendererType = stats.rendererType; // 'native' | 'p5'
1239
+ ```
1240
+
1241
+ ### Loading Image Parameters
1242
+
1243
+ ```typescript
1244
+ core.onParametersDefined((groups) => {
1245
+ groups.forEach(group => {
1246
+ Object.entries(group.parameters).forEach(([name, def]) => {
1247
+ if (def.type === 'image') {
1248
+ // Create file picker
1249
+ const input = createFileInput(name, def);
1250
+ input.addEventListener('change', async (e) => {
1251
+ const file = e.target.files[0];
1252
+ if (file) {
1253
+ await core.setParameter(name, file); // Unified API handles images automatically
1254
+ }
1255
+ });
1256
+ }
1257
+ });
1258
+ });
1259
+ });
1260
+ ```
1261
+
1262
+ **See `docs/16-p5js-integration.md` for comprehensive documentation including migration guides, troubleshooting, and advanced examples.**
1263
+
1264
+ ## 🏗️ Architecture
1265
+
1266
+ ### Security Model
1267
+
1268
+ The core implements a comprehensive security model to ensure safe execution of artist code:
1269
+
1270
+ - **IFrame Isolation**: Complete separation from host environment with sandboxed execution
1271
+ - **WebWorker Sandboxing**: Artist code runs with controlled API access only
1272
+ - **Blob URL Creation**: Secure worker and IFrame creation from blob URLs
1273
+ - **Resource Protection**: Memory leaks and errors cannot affect main application
1274
+ - **Controlled Communication**: Optimized message passing with validation
1275
+
1276
+ ### Performance Features
1277
+
1278
+ The core provides extensive performance optimization capabilities:
1279
+
1280
+ - **Configurable Frame Rates**: Full (60fps) or half (30fps) modes for performance tuning
1281
+ - **Resolution Scaling**: Fractional (0.1-1.0) or explicit canvas dimensions
1282
+ - **Adaptive Optimization**: Automatic performance tuning based on hardware capabilities
1283
+ - **Efficient Communication**: Optimized message passing between components
1284
+ - **Memory Management**: Automatic resource cleanup and memory leak prevention
1285
+
1286
+ ### Multi-Instance Support
1287
+
1288
+ The core supports multiple concurrent instances for complex applications:
1289
+
1290
+ ```typescript
1291
+ // Main scene with full features
1292
+ const mainCore = new VijiCore({
1293
+ hostContainer: document.getElementById('main-scene'),
1294
+ resolution: { width: 1920, height: 1080 },
1295
+ frameRateMode: 'full',
1296
+ allowUserInteraction: true,
1297
+ audioStream: sharedAudioStream,
1298
+ videoStream: sharedVideoStream
1299
+ });
1300
+
1301
+ // Preview instance with reduced features
1302
+ const previewCore = new VijiCore({
1303
+ hostContainer: document.getElementById('preview'),
1304
+ resolution: 0.25, // 25% resolution for performance
1305
+ frameRateMode: 'half',
1306
+ noInputs: true,
1307
+ allowUserInteraction: false,
1308
+ audioStream: sharedAudioStream // Shared efficiently across instances
1309
+ });
1310
+
1311
+ // Thumbnail instance for gallery view
1312
+ const thumbnailCore = new VijiCore({
1313
+ hostContainer: document.getElementById('thumbnail'),
1314
+ resolution: 0.1, // 10% resolution
1315
+ frameRateMode: 'half',
1316
+ noInputs: true,
1317
+ allowUserInteraction: false
1318
+ });
1319
+
1320
+ // To change scenes, create a new instance and destroy the old one
1321
+ const newCore = new VijiCore({
1322
+ hostContainer: document.getElementById('scene-host'),
1323
+ sceneCode: newSceneCode,
1324
+ audioStream: sharedAudioStream,
1325
+ videoStream: sharedVideoStream
1326
+ });
1327
+
1328
+ // Automatic comprehensive cleanup when destroyed
1329
+ await oldCore.destroy();
1330
+ ```
1331
+
1332
+ ## 🔍 Error Handling
1333
+
1334
+ The core provides comprehensive error handling with detailed error information:
1335
+
1336
+ ```typescript
1337
+ import { VijiCoreError } from '@viji-dev/core';
1338
+
1339
+ try {
1340
+ const core = new VijiCore(config);
1341
+ await core.initialize();
1342
+ } catch (error) {
1343
+ if (error instanceof VijiCoreError) {
1344
+ console.error(`Core error [${error.code}]:`, error.message);
1345
+ console.error('Error context:', error.context);
1346
+
1347
+ // Handle specific error types
1348
+ switch (error.code) {
1349
+ case 'INVALID_CONFIG':
1350
+ console.error('Configuration is invalid:', error.context);
1351
+ break;
1352
+ case 'INITIALIZATION_ERROR':
1353
+ console.error('Failed to initialize core:', error.context);
1354
+ break;
1355
+ case 'CORE_NOT_READY':
1356
+ console.error('Core is not ready for operations');
1357
+ break;
1358
+ case 'INSTANCE_DESTROYED':
1359
+ console.error('Core instance has been destroyed');
1360
+ break;
1361
+ case 'PARAMETERS_NOT_INITIALIZED':
1362
+ console.error('Parameter system not yet initialized');
1363
+ break;
1364
+ case 'UNKNOWN_PARAMETER':
1365
+ console.error('Parameter not found:', error.context);
1366
+ break;
1367
+ }
1368
+ } else {
1369
+ console.error('Unexpected error:', error);
1370
+ }
1371
+ }
1372
+ ```
1373
+
1374
+ **Common Error Codes:**
1375
+ - `INVALID_CONFIG` - Configuration validation failed
1376
+ - `INITIALIZATION_ERROR` - Failed to initialize core components
1377
+ - `CORE_NOT_READY` - Operation attempted before ready
1378
+ - `INSTANCE_DESTROYED` - Operation attempted after destroy
1379
+ - `PARAMETERS_NOT_INITIALIZED` - Parameters not yet available
1380
+ - `UNKNOWN_PARAMETER` - Parameter not found
1381
+ - `CONCURRENT_INITIALIZATION` - Multiple initialization attempts
1382
+ - `MANAGER_NOT_READY` - Internal component not available
1383
+
1384
+ ## 🧪 Development
1385
+
1386
+ ```bash
1387
+ # Install dependencies
1388
+ npm install
1389
+
1390
+ # Build the package
1391
+ npm run build
1392
+
1393
+ # Run tests
1394
+ npm test
1395
+
1396
+ # Development build (watch mode)
1397
+ npm run dev
1398
+
1399
+ # Type checking
1400
+ npm run type-check
1401
+
1402
+ # Linting
1403
+ npm run lint
1404
+ ```
1405
+
1406
+ ## 📚 Examples
1407
+
1408
+ The package includes comprehensive examples in the `/example` directory:
1409
+
1410
+ - **Basic Scene Creation**: Simple animated shapes with parameters
1411
+ - **Audio-Reactive Scenes**: Scenes that respond to audio input
1412
+ - **Video-Reactive Scenes**: Scenes that respond to video analysis
1413
+ - **Interactive Scenes**: Mouse, keyboard, and touch interaction
1414
+ - **Parameter System**: Complete parameter definition and UI generation
1415
+ - **Multi-Instance**: Multiple concurrent scene instances
1416
+
1417
+ ## 🎯 Use Cases
1418
+
1419
+ ### Platform Integration
1420
+ The core integrates seamlessly with the Viji platform, providing scene execution while the platform handles UI, user management, and social features.
1421
+
1422
+ ### SDK Development
1423
+ The core serves as the execution foundation for the Viji SDK, ensuring identical behavior between development and platform environments.
1424
+
1425
+ ### Standalone Applications
1426
+ Use the core directly in custom applications for creative scene rendering with full feature support.
1427
+
1428
+ ## 📄 License
1429
+
1430
+ Copyright (c) 2025 Artem Verkhovskiy and Dmitry Manoilenko.
1431
+ All rights reserved - see the LICENSE file for details.
1432
+
1433
+ ## Contributor License Agreement
1434
+
1435
+ By contributing, you agree to the [CLA](./CLA.md).
1436
+ Please also confirm your agreement by filling out this short [form](https://forms.gle/A49WTz8nj5b99Yev7).
1437
+
1438
+ ---
1439
+
1440
1440
  **Viji Core** - Universal execution engine for creative scenes across all contexts.