@stinkycomputing/web-live-player 0.1.0 → 0.1.2

package/README.md CHANGED
@@ -1,323 +1,402 @@
- # Web Live Player
-
- A framework-agnostic video streaming library with support for MoQ (Media over QUIC) and WebSocket backends.
-
- ## Features
-
- - **WebCodecs-based decoding** - Hardware-accelerated video decoding
- - **MoQ support** - Native Media over QUIC protocol support via `stinky-moq-js`
- - **Pluggable stream sources** - Use dependency injection to provide video data from any transport
- - **Frame scheduling** - Automatic buffering and drift correction for smooth playback
- - **No framework dependencies** - Works with vanilla JS, React, Three.js, or any other framework
-
- ## Installation
-
- ```bash
- npm install @stinkycomputing/web-live-player
- ```
-
- For MoQ support, also install:
-
- ```bash
- npm install stinky-moq-js
- ```
-
- ## Quick Start
-
- ### Using with MoQ (Standalone)
-
- ```typescript
- import { createPlayer, createStandaloneMoQSource } from '@stinkycomputing/web-live-player';
-
- // Create player
- const player = createPlayer({
-   preferredDecoder: 'webcodecs-hw',
-   bufferSizeFrames: 3,
- });
-
- // Create MoQ source
- const moqSource = createStandaloneMoQSource({
-   relayUrl: 'https://moq-relay.example.com',
-   namespace: 'live/stream',
-   subscriptions: [
-     { trackName: 'video', streamType: 'video' },
-   ],
- });
-
- // Connect and play
- await moqSource.connect();
- player.setStreamSource(moqSource);
- player.setTrackFilter('video');
- player.play();
-
- // Render loop
- function render(timestamp) {
-   const frame = player.getVideoFrame(timestamp);
-   if (frame) {
-     ctx.drawImage(frame, 0, 0);
-   }
-   requestAnimationFrame(render);
- }
- requestAnimationFrame(render);
- ```
-
- ### Using with Elmo's MoQSession
-
- ```typescript
- import { createPlayer } from '@stinkycomputing/web-live-player';
- import { MoQDiscoveryUtils } from '@elmo/core';
-
- // Find session from Elmo's node tree
- // MoQSessionNode implements IStreamSource directly
- const session = MoQDiscoveryUtils.findMoQSession(currentNode, 'my-session');
-
- // Create and configure player - session can be used directly as stream source
- const player = createPlayer();
- player.setStreamSource(session);
- player.setTrackFilter('video-track');
- player.play();
- ```
-
- ### Custom Stream Source
-
- ```typescript
- import { createPlayer, IStreamSource, BaseStreamSource } from '@stinkycomputing/web-live-player';
-
- class MyCustomSource extends BaseStreamSource {
-   async connect() {
-     // Your connection logic
-     this._connected = true;
-     this.emit('connected');
-   }
-
-   // Call this when you receive video data
-   handleVideoData(trackName: string, data: ParsedData) {
-     this.emit('data', {
-       trackName,
-       streamType: 'video',
-       data,
-     });
-   }
- }
-
- const source = new MyCustomSource();
- await source.connect();
-
- const player = createPlayer();
- player.setStreamSource(source);
- player.play();
- ```
-
- ## API Reference
-
- ### `createPlayer(config?)`
-
- Creates a new player instance.
-
- **Config options:**
- - `preferredDecoder`: `'webcodecs-hw'` | `'webcodecs-sw'` | `'wasm'` - Decoder preference
- - `bufferSizeFrames`: `number` - Target buffer size (default: 3)
- - `debugLogging`: `boolean` - Enable debug logging
-
- ### `LiveVideoPlayer`
-
- Main player class.
-
- **Methods:**
- - `setStreamSource(source: IStreamSource)` - Set the stream data source
- - `setTrackFilter(trackName: string)` - Filter for specific track
- - `play()` - Start playback
- - `pause()` - Pause playback
- - `getVideoFrame(timestampMs: number)` - Get frame for current render timestamp
- - `getStats()` - Get playback statistics
- - `dispose()` - Clean up resources
-
- **Events:**
- - `frame` - Emitted when a frame is decoded
- - `metadata` - Emitted when stream metadata is received
- - `statechange` - Emitted when player state changes
- - `error` - Emitted on errors
-
- ### `IStreamSource`
-
- Interface for stream data sources.
-
- **Events to emit:**
- - `data` - Stream data event with `{ trackName, streamType, data }`
- - `connected` - When connected
- - `disconnected` - When disconnected
- - `error` - On errors
-
- ## Rendering Frames to Canvas
-
- The player returns `VideoFrame` objects that can be rendered in multiple ways:
-
- ### Basic Canvas Rendering
-
- ```typescript
- const canvas = document.getElementById('video-canvas') as HTMLCanvasElement;
- const ctx = canvas.getContext('2d')!;
-
- function render(timestamp: number) {
-   const frame = player.getVideoFrame(timestamp);
-   if (frame) {
-     // Resize canvas to match video dimensions
-     if (canvas.width !== frame.displayWidth || canvas.height !== frame.displayHeight) {
-       canvas.width = frame.displayWidth;
-       canvas.height = frame.displayHeight;
-     }
-
-     // Draw the frame
-     ctx.drawImage(frame, 0, 0);
-
-     // IMPORTANT: Close the frame when done to release memory
-     frame.close();
-   }
-   requestAnimationFrame(render);
- }
- requestAnimationFrame(render);
- ```
-
- ### WebGL / Three.js Rendering
-
- For GPU-accelerated rendering (e.g., in Three.js):
-
- ```typescript
- // Create a texture
- const texture = new THREE.Texture();
- texture.minFilter = THREE.LinearFilter;
- texture.magFilter = THREE.LinearFilter;
- texture.colorSpace = THREE.SRGBColorSpace;
-
- // In your render loop
- function render(timestamp: number) {
-   const frame = player.getVideoFrame(timestamp);
-   if (frame) {
-     // Update texture with the VideoFrame
-     texture.image = frame;
-     texture.needsUpdate = true;
-
-     // Close previous frame if stored
-     if (lastFrame) lastFrame.close();
-     lastFrame = frame;
-   }
-
-   renderer.render(scene, camera);
-   requestAnimationFrame(render);
- }
- ```
-
- ### Handling YUV Frames (WASM Decoder)
-
- When using the WASM decoder, the library automatically converts YUV frames to `VideoFrame` objects using the browser's native I420 support. The GPU handles YUV→RGB conversion, so you can use the same rendering code regardless of decoder:
-
- ```typescript
- // The player always returns VideoFrame, even with WASM decoder
- const frame = player.getVideoFrame(timestamp);
- if (frame) {
-   ctx.drawImage(frame, 0, 0);
-   frame.close();
- }
- ```
-
- If you need raw YUV data for custom processing, you can access the `WasmDecoder` directly:
-
- ```typescript
- import { WasmDecoder } from '@stinkycomputing/web-live-player';
-
- const decoder = new WasmDecoder({
-   onFrameDecoded: (yuvFrame) => {
-     // yuvFrame has: { y, u, v, width, height, stride, chromaStride, chromaHeight, timestamp }
-     // Process raw YUV data here
-   },
- });
- ```
-
- ### Best Practices
-
- 1. **Always close VideoFrames** - Call `frame.close()` when done to prevent memory leaks
- 2. **Check for null frames** - `getVideoFrame()` returns null when no frame is ready
- 3. **Use performance.now()** - Pass accurate timestamps for proper frame scheduling
- 4. **Handle resize** - Update canvas dimensions when video dimensions change
-
- ## Bundler Configuration
-
- ### WASM Decoder (tinyh264)
-
- The WASM decoder uses `tinyh264` which requires special bundler configuration for its Web Worker and WASM assets.
-
- #### Vite
-
- Add the following to your `vite.config.ts`:
-
- ```typescript
- import { defineConfig } from 'vite';
-
- export default defineConfig({
-   // Handle tinyh264's .asset files as URLs
-   assetsInclude: ['**/*.asset'],
-
-   // Ensure worker files are bundled correctly
-   worker: {
-     format: 'es',
-   },
- });
- ```
-
- #### Webpack
-
- For Webpack, you may need to configure asset handling:
-
- ```javascript
- module.exports = {
-   module: {
-     rules: [
-       {
-         test: /\.asset$/,
-         type: 'asset/resource',
-       },
-     ],
-   },
- };
- ```
-
- ### WebCodecs Decoder (Recommended)
-
- If you only need WebCodecs-based decoding (hardware or software), no special bundler configuration is required. Simply use:
-
- ```typescript
- const player = createPlayer({
-   preferredDecoder: 'webcodecs-hw', // or 'webcodecs-sw'
- });
- ```
-
- ## Demo
-
- Run the demo application:
-
- ```bash
- cd video-player
- npm install
- npm run dev
- ```
-
- Open http://localhost:3001 to see the demo.
-
- ## Building
-
- Build the library:
-
- ```bash
- npm run build
- ```
-
- Build the demo:
-
- ```bash
- npm run build:demo
- ```
-
- ## License
-
- MIT
+ # Web Live Player
+
+ A framework-agnostic video streaming library for playing back **Sesame** video streams. Sesame is a video engine that delivers low-latency video over MoQ (Media over QUIC) and WebSocket transports.
+
+ ## Features
+
+ - **Sesame stream playback** - Native support for Sesame video engine streams
+ - **WebCodecs-based decoding** - Hardware-accelerated video decoding
+ - **MoQ support** - Native Media over QUIC protocol support via `stinky-moq-js`
+ - **Pluggable stream sources** - Use dependency injection to provide video data from any transport
+ - **Frame scheduling** - Automatic buffering and drift correction for smooth playback
+ - **Optimized file loading** - Range-based chunked loading for fast playback of large MP4 files
+ - **No framework dependencies** - Works with vanilla JS, React, Three.js, or any other framework
+
+ ## Installation
+
+ ```bash
+ npm install @stinkycomputing/web-live-player
+ ```
+
+ ## Quick Start
+
+ ### Using with MoQ (Standalone)
+
+ ```typescript
+ import { createPlayer, createStandaloneMoQSource } from '@stinkycomputing/web-live-player';
+
+ // Create player
+ const player = createPlayer({
+   preferredDecoder: 'webcodecs-hw',
+   bufferDelayMs: 100,
+ });
+
+ // Create MoQ source
+ const moqSource = createStandaloneMoQSource({
+   relayUrl: 'https://moq-relay.example.com',
+   namespace: 'live/stream',
+   subscriptions: [
+     { trackName: 'video', streamType: 'video' },
+     { trackName: 'audio', streamType: 'audio' },
+   ],
+ });
+
+ // Connect and play
+ await moqSource.connect();
+ player.setStreamSource(moqSource);
+ player.setTrackFilter('video');
+ player.play();
+
+ // Render loop
+ function render(timestamp: number) {
+   const frame = player.getVideoFrame(timestamp);
+   if (frame) {
+     ctx.drawImage(frame, 0, 0);
+     frame.close(); // Release the frame once drawn (see Best Practices)
+   }
+   requestAnimationFrame(render);
+ }
+ requestAnimationFrame(render);
+ ```
+
+ ### Custom Stream Source
+
+ ```typescript
+ import { createPlayer, IStreamSource, BaseStreamSource } from '@stinkycomputing/web-live-player';
+
+ class MyCustomSource extends BaseStreamSource {
+   async connect() {
+     // Your connection logic
+     this._connected = true;
+     this.emit('connected');
+   }
+
+   // Call this when you receive video data
+   handleVideoData(trackName: string, data: ParsedData) {
+     this.emit('data', {
+       trackName,
+       streamType: 'video',
+       data,
+     });
+   }
+ }
+
+ const source = new MyCustomSource();
+ await source.connect();
+
+ const player = createPlayer();
+ player.setStreamSource(source);
+ player.play();
+ ```
+
+ ### File Playback
+
+ For playing MP4 files from URLs or local files:
+
+ ```typescript
+ import { createFilePlayer } from '@stinkycomputing/web-live-player';
+
+ const filePlayer = createFilePlayer({
+   preferredDecoder: 'webcodecs-hw',
+   enableAudio: true,
+   debugLogging: false,
+   playMode: 'once', // or 'loop' for continuous playback
+ });
+
+ // Load from URL (with optimized chunked loading)
+ await filePlayer.loadFromUrl('https://example.com/video.mp4');
+
+ // Or load from File object (e.g., from file input)
+ const file = fileInput.files[0];
+ await filePlayer.loadFromFile(file);
+
+ // Play the file
+ filePlayer.play();
+
+ // Render loop
+ function render() {
+   const frame = filePlayer.getVideoFrame();
+   if (frame) {
+     ctx.drawImage(frame, 0, 0);
+     frame.close();
+   }
+   requestAnimationFrame(render);
+ }
+ requestAnimationFrame(render);
+
+ // Seek to position (in seconds)
+ await filePlayer.seek(30);
+
+ // Listen to events
+ filePlayer.on('ready', (info) => {
+   console.log(`Video loaded: ${info.width}x${info.height}, ${info.duration}s`);
+ });
+
+ filePlayer.on('progress', (loaded, total) => {
+   console.log(`Loading: ${(loaded / total * 100).toFixed(1)}%`);
+ });
+ ```
+
+ **Optimized Loading**: The file player uses HTTP Range requests to load large files in chunks (1MB each). This means:
+ - Playback starts as soon as metadata is available (~1-2MB typically)
+ - Remaining file loads in the background during playback
+ - 10-30x faster time-to-first-frame for large files
+ - Automatic fallback to full download if server doesn't support ranges
+
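The chunking arithmetic behind this scheme is simple to sketch. The helper below is illustrative only; `chunkRanges` and `toRangeHeader` are hypothetical names, not part of this package's API:

```typescript
// Hypothetical sketch of range-based chunk planning (not the library's code).
// Splits a file of `totalSize` bytes into HTTP Range requests of `chunkSize` bytes.
const CHUNK_SIZE = 1024 * 1024; // 1 MB, matching the chunk size described above

interface ByteRange {
  start: number;
  end: number; // inclusive, as in the HTTP Range header
}

function chunkRanges(totalSize: number, chunkSize: number = CHUNK_SIZE): ByteRange[] {
  const ranges: ByteRange[] = [];
  for (let start = 0; start < totalSize; start += chunkSize) {
    // Clamp the final chunk to the end of the file
    ranges.push({ start, end: Math.min(start + chunkSize, totalSize) - 1 });
  }
  return ranges;
}

// Each range maps to a request header like `Range: bytes=0-1048575`.
function toRangeHeader(r: ByteRange): string {
  return `bytes=${r.start}-${r.end}`;
}
```

A server that ignores the `Range` header simply responds `200` with the whole body, which is the fallback behavior described above.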
+ ## API Reference
+
+ ### `createPlayer(config?)`
+
+ Creates a new player instance.
+
+ **Config options:**
+ - `preferredDecoder`: `'webcodecs-hw'` | `'webcodecs-sw'` | `'wasm'` - Decoder preference (default: `'webcodecs-sw'`). Note: WASM decoder only supports H.264 Baseline profile.
+ - `bufferDelayMs`: `number` - Buffer delay in milliseconds (default: 100)
+ - `enableAudio`: `boolean` - Enable audio playback (default: true)
+ - `videoTrackName`: `string | null` - Video track name for MoQ streams (default: `'video'`)
+ - `audioTrackName`: `string | null` - Audio track name for MoQ streams (default: `'audio'`)
+ - `debugLogging`: `boolean` - Enable debug logging
+
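`bufferDelayMs` trades a fixed amount of latency for smoothness: frames are presented roughly that many milliseconds behind the live edge so network jitter doesn't starve the renderer. The selection idea can be sketched as a pure function — illustrative only; `pickFrame` is not part of the API, and the real scheduler also performs drift correction:

```typescript
// Illustrative sketch of delay-based frame selection (not the library's code).
// Present the newest buffered frame whose timestamp is at least
// `bufferDelayMs` behind "now", leaving a cushion against jitter.
interface TimedFrame<T> {
  timestampMs: number;
  frame: T;
}

function pickFrame<T>(
  buffer: TimedFrame<T>[], // sorted by ascending timestampMs
  nowMs: number,
  bufferDelayMs: number,
): T | null {
  const targetMs = nowMs - bufferDelayMs;
  let picked: T | null = null;
  for (const f of buffer) {
    if (f.timestampMs <= targetMs) picked = f.frame; // newest frame at/behind target
    else break;
  }
  return picked;
}
```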
+ ### `createFilePlayer(config?)`
+
+ Creates a file player instance for MP4 playback.
+
+ **Config options:**
+ - `preferredDecoder`: `'webcodecs-hw'` | `'webcodecs-sw'` | `'wasm'` - Decoder preference (default: `'webcodecs-sw'`)
+ - `enableAudio`: `boolean` - Enable audio playback (default: true)
+ - `audioContext`: `AudioContext` - Optional audio context (creates one if not provided)
+ - `playMode`: `'once'` | `'loop'` - Play mode (default: `'once'`)
+ - `debugLogging`: `boolean` - Enable debug logging
+
+ ### `FileVideoPlayer`
+
+ File player class.
+
+ **Methods:**
+ - `loadFromUrl(url: string)` - Load MP4 from URL (uses range-based chunked loading)
+ - `loadFromFile(file: File)` - Load MP4 from File object
+ - `play()` - Start playback
+ - `pause()` - Pause playback
+ - `seek(timeSeconds: number)` - Seek to position
+ - `getVideoFrame()` - Get current video frame for rendering
+ - `getPosition()` - Get current position in seconds
+ - `getDuration()` - Get duration in seconds
+ - `getStats()` - Get playback statistics
+ - `setVolume(volume: number)` - Set audio volume (0-1)
+ - `setPlayMode(mode: 'once' | 'loop')` - Set play mode
+ - `dispose()` - Clean up resources
+
+ **Events:**
+ - `ready` - Emitted when file is loaded and ready to play
+ - `progress` - Emitted during file loading with (loaded, total) bytes
+ - `statechange` - Emitted when player state changes
+ - `ended` - Emitted when playback ends (in 'once' mode)
+ - `loop` - Emitted when video loops (in 'loop' mode)
+ - `seeked` - Emitted after seeking completes
+ - `error` - Emitted on errors
+
+ ### `LiveVideoPlayer`
+
+ Main player class.
+
+ **Methods:**
+ - `setStreamSource(source: IStreamSource)` - Set the stream data source
+ - `setTrackFilter(trackName: string)` - Filter for specific track
+ - `connectToMoQRelay(relayUrl, namespace, options?)` - Connect directly to a MoQ relay
+ - `play()` - Start playback
+ - `pause()` - Pause playback
+ - `getVideoFrame(timestampMs: number)` - Get frame for current render timestamp
+ - `getStats()` - Get playback statistics
+ - `setVolume(volume: number)` - Set audio volume (0-1)
+ - `setDebugLogging(enabled: boolean)` - Enable/disable debug logging at runtime
+ - `dispose()` - Clean up resources
+
+ **Events:**
+ - `frame` - Emitted when a frame is decoded
+ - `metadata` - Emitted when stream metadata is received
+ - `statechange` - Emitted when player state changes
+ - `error` - Emitted on errors
+
+ ### `IStreamSource`
+
+ Interface for stream data sources.
+
+ **Events to emit:**
+ - `data` - Stream data event with `{ trackName, streamType, data }`
+ - `connected` - When connected
+ - `disconnected` - When disconnected
+ - `error` - On errors
+
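Any object that satisfies this contract can feed the player. The emitter half can be sketched in a few lines — illustrative only; the package's `BaseStreamSource` already provides this wiring, so in practice you extend it rather than writing your own:

```typescript
// Minimal emitter sketch of the IStreamSource event contract (illustrative only).
type Listener = (payload?: unknown) => void;

class TinySource {
  private listeners = new Map<string, Listener[]>();

  on(event: string, fn: Listener): void {
    const list = this.listeners.get(event) ?? [];
    list.push(fn);
    this.listeners.set(event, list);
  }

  emit(event: string, payload?: unknown): void {
    for (const fn of this.listeners.get(event) ?? []) fn(payload);
  }

  // A transport callback would funnel incoming media here, producing the
  // `{ trackName, streamType, data }` shape the player expects.
  pushVideo(trackName: string, data: unknown): void {
    this.emit('data', { trackName, streamType: 'video', data });
  }
}
```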
+ ## Rendering Frames to Canvas
+
+ The player returns `VideoFrame` objects that can be rendered in multiple ways:
+
+ ### Basic Canvas Rendering
+
+ ```typescript
+ const canvas = document.getElementById('video-canvas') as HTMLCanvasElement;
+ const ctx = canvas.getContext('2d')!;
+
+ function render(timestamp: number) {
+   const frame = player.getVideoFrame(timestamp);
+   if (frame) {
+     // Resize canvas to match video dimensions
+     if (canvas.width !== frame.displayWidth || canvas.height !== frame.displayHeight) {
+       canvas.width = frame.displayWidth;
+       canvas.height = frame.displayHeight;
+     }
+
+     // Draw the frame
+     ctx.drawImage(frame, 0, 0);
+
+     // IMPORTANT: Close the frame when done to release memory
+     frame.close();
+   }
+   requestAnimationFrame(render);
+ }
+ requestAnimationFrame(render);
+ ```
+
+ ### WebGL / Three.js Rendering
+
+ For GPU-accelerated rendering (e.g., in Three.js):
+
+ ```typescript
+ // Create a texture
+ const texture = new THREE.Texture();
+ texture.minFilter = THREE.LinearFilter;
+ texture.magFilter = THREE.LinearFilter;
+ texture.colorSpace = THREE.SRGBColorSpace;
+
+ // Keep a reference to the previous frame so it can be closed
+ let lastFrame: VideoFrame | null = null;
+
+ // In your render loop
+ function render(timestamp: number) {
+   const frame = player.getVideoFrame(timestamp);
+   if (frame) {
+     // Update texture with the VideoFrame
+     texture.image = frame;
+     texture.needsUpdate = true;
+
+     // Close previous frame if stored
+     if (lastFrame) lastFrame.close();
+     lastFrame = frame;
+   }
+
+   renderer.render(scene, camera);
+   requestAnimationFrame(render);
+ }
+ ```
+
+ ### Handling YUV Frames (WASM Decoder)
+
+ > **Note:** The WASM decoder only supports **H.264 Baseline profile**. For Main or High profile streams, use `'webcodecs-hw'` or `'webcodecs-sw'` instead.
+
+ When using the WASM decoder, the library automatically converts YUV frames to `VideoFrame` objects using the browser's native I420 support. The GPU handles YUV→RGB conversion, so you can use the same rendering code regardless of decoder:
+
+ ```typescript
+ // The player always returns VideoFrame, even with WASM decoder
+ const frame = player.getVideoFrame(timestamp);
+ if (frame) {
+   ctx.drawImage(frame, 0, 0);
+   frame.close();
+ }
+ ```
+
+ If you need raw YUV data for custom processing, you can access the `WasmDecoder` directly:
+
+ ```typescript
+ import { WasmDecoder } from '@stinkycomputing/web-live-player';
+
+ const decoder = new WasmDecoder({
+   onFrameDecoded: (yuvFrame) => {
+     // yuvFrame has: { y, u, v, width, height, stride, chromaStride, chromaHeight, timestamp }
+     // Process raw YUV data here
+   },
+ });
+ ```
+
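When working with the raw planes from `onFrameDecoded`, keep in mind that I420 subsamples chroma by 2× in both axes, and that decoders may pad rows — index the plane buffers with `stride`/`chromaStride` rather than `width`. The tightly-packed plane sizes follow from the luma dimensions; `i420PlaneSizes` below is a hypothetical helper for illustration:

```typescript
// Expected byte sizes of tightly-packed I420 planes (illustrative helper).
// With row padding, the actual buffers are stride * height bytes instead.
function i420PlaneSizes(width: number, height: number) {
  const chromaWidth = Math.ceil(width / 2);   // U/V are half-width
  const chromaHeight = Math.ceil(height / 2); // ...and half-height
  return {
    y: width * height,
    u: chromaWidth * chromaHeight,
    v: chromaWidth * chromaHeight,
  };
}
```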
+ ### Best Practices
+
+ 1. **Always close VideoFrames** - Call `frame.close()` when done to prevent memory leaks
+ 2. **Check for null frames** - `getVideoFrame()` returns null when no frame is ready
+ 3. **Use performance.now()** - Pass accurate timestamps for proper frame scheduling
+ 4. **Handle resize** - Update canvas dimensions when video dimensions change
+
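Practices 1 and 2 are easiest to get wrong when a frame is cached between render calls, as in the Three.js loop above. One way to centralize the bookkeeping is a small holder that closes whatever it replaces — a sketch only; `FrameHolder` is not part of the package:

```typescript
// Illustrative helper enforcing close-on-replace for VideoFrame-like objects.
interface Closeable {
  close(): void;
}

class FrameHolder<T extends Closeable> {
  private current: T | null = null;

  // Store a new frame (or null), closing the one it replaces.
  swap(next: T | null): T | null {
    if (this.current && this.current !== next) this.current.close();
    this.current = next;
    return next;
  }

  dispose(): void {
    this.swap(null);
  }
}
```

With this, the Three.js loop's manual `lastFrame` bookkeeping collapses to `holder.swap(frame)`, and `holder.dispose()` cleans up on teardown.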
323
+
324
+ ## Bundler Configuration
325
+
326
+ ### WASM Decoder (tinyh264)
327
+
328
+ The WASM decoder uses `tinyh264` which requires special bundler configuration for its Web Worker and WASM assets.
329
+
330
+ #### Vite
331
+
332
+ Add the following to your `vite.config.ts`:
333
+
334
+ ```typescript
335
+ import { defineConfig } from 'vite';
336
+
337
+ export default defineConfig({
338
+ // Handle tinyh264's .asset files as URLs
339
+ assetsInclude: ['**/*.asset'],
340
+
341
+ // Ensure worker files are bundled correctly
342
+ worker: {
343
+ format: 'es',
344
+ },
345
+ });
346
+ ```
347
+
348
+ #### Webpack
349
+
350
+ For Webpack, you may need to configure asset handling:
351
+
352
+ ```javascript
353
+ module.exports = {
354
+ module: {
355
+ rules: [
356
+ {
357
+ test: /\.asset$/,
358
+ type: 'asset/resource',
359
+ },
360
+ ],
361
+ },
362
+ };
363
+ ```
364
+
365
+ ### WebCodecs Decoder (Recommended)
366
+
367
+ If you only need WebCodecs-based decoding (hardware or software), no special bundler configuration is required. Simply use:
368
+
369
+ ```typescript
370
+ const player = createPlayer({
371
+ preferredDecoder: 'webcodecs-hw', // or 'webcodecs-sw'
372
+ });
373
+ ```
374
+
375
+ ## Demo
376
+
377
+ Run the demo application:
378
+
379
+ ```bash
380
+ npm install
381
+ npm run dev
382
+ ```
383
+
384
+ Open http://localhost:3001 to see the demo.
385
+
386
+ ## Building
387
+
388
+ Build the library:
389
+
390
+ ```bash
391
+ npm run build
392
+ ```
393
+
394
+ Build the demo:
395
+
396
+ ```bash
397
+ npm run build:demo
398
+ ```
399
+
400
+ ## License
401
+
402
+ MIT