@stinkycomputing/web-live-player 0.1.0 → 0.1.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,323 +1,324 @@
1
- # Web Live Player
2
-
3
- A framework-agnostic video streaming library with support for MoQ (Media over QUIC) and WebSocket backends.
4
-
5
- ## Features
6
-
7
- - **WebCodecs-based decoding** - Hardware-accelerated video decoding
8
- - **MoQ support** - Native Media over QUIC protocol support via `stinky-moq-js`
9
- - **Pluggable stream sources** - Use dependency injection to provide video data from any transport
10
- - **Frame scheduling** - Automatic buffering and drift correction for smooth playback
11
- - **No framework dependencies** - Works with vanilla JS, React, Three.js, or any other framework
12
-
13
- ## Installation
14
-
15
- ```bash
16
- npm install @stinkycomputing/web-live-player
17
- ```
18
-
19
- For MoQ support, also install:
20
-
21
- ```bash
22
- npm install stinky-moq-js
23
- ```
24
-
25
- ## Quick Start
26
-
27
- ### Using with MoQ (Standalone)
28
-
29
- ```typescript
30
- import { createPlayer, createStandaloneMoQSource } from '@stinkycomputing/web-live-player';
31
-
32
- // Create player
33
- const player = createPlayer({
34
- preferredDecoder: 'webcodecs-hw',
35
- bufferSizeFrames: 3,
36
- });
37
-
38
- // Create MoQ source
39
- const moqSource = createStandaloneMoQSource({
40
- relayUrl: 'https://moq-relay.example.com',
41
- namespace: 'live/stream',
42
- subscriptions: [
43
- { trackName: 'video', streamType: 'video' },
44
- ],
45
- });
46
-
47
- // Connect and play
48
- await moqSource.connect();
49
- player.setStreamSource(moqSource);
50
- player.setTrackFilter('video');
51
- player.play();
52
-
53
- // Render loop
54
- function render(timestamp) {
55
- const frame = player.getVideoFrame(timestamp);
56
- if (frame) {
57
- ctx.drawImage(frame, 0, 0);
58
- }
59
- requestAnimationFrame(render);
60
- }
61
- requestAnimationFrame(render);
62
- ```
63
-
64
- ### Using with Elmo's MoQSession
65
-
66
- ```typescript
67
- import { createPlayer } from '@stinkycomputing/web-live-player';
68
- import { MoQDiscoveryUtils } from '@elmo/core';
69
-
70
- // Find session from Elmo's node tree
71
- // MoQSessionNode implements IStreamSource directly
72
- const session = MoQDiscoveryUtils.findMoQSession(currentNode, 'my-session');
73
-
74
- // Create and configure player - session can be used directly as stream source
75
- const player = createPlayer();
76
- player.setStreamSource(session);
77
- player.setTrackFilter('video-track');
78
- player.play();
79
- ```
80
-
81
- ### Custom Stream Source
82
-
83
- ```typescript
84
- import { createPlayer, IStreamSource, BaseStreamSource } from '@stinkycomputing/web-live-player';
85
-
86
- class MyCustomSource extends BaseStreamSource {
87
- async connect() {
88
- // Your connection logic
89
- this._connected = true;
90
- this.emit('connected');
91
- }
92
-
93
- // Call this when you receive video data
94
- handleVideoData(trackName: string, data: ParsedData) {
95
- this.emit('data', {
96
- trackName,
97
- streamType: 'video',
98
- data,
99
- });
100
- }
101
- }
102
-
103
- const source = new MyCustomSource();
104
- await source.connect();
105
-
106
- const player = createPlayer();
107
- player.setStreamSource(source);
108
- player.play();
109
- ```
110
-
111
- ## API Reference
112
-
113
- ### `createPlayer(config?)`
114
-
115
- Creates a new player instance.
116
-
117
- **Config options:**
118
- - `preferredDecoder`: `'webcodecs-hw'` | `'webcodecs-sw'` | `'wasm'` - Decoder preference
119
- - `bufferSizeFrames`: `number` - Target buffer size (default: 3)
120
- - `debugLogging`: `boolean` - Enable debug logging
121
-
122
- ### `LiveVideoPlayer`
123
-
124
- Main player class.
125
-
126
- **Methods:**
127
- - `setStreamSource(source: IStreamSource)` - Set the stream data source
128
- - `setTrackFilter(trackName: string)` - Filter for specific track
129
- - `play()` - Start playback
130
- - `pause()` - Pause playback
131
- - `getVideoFrame(timestampMs: number)` - Get frame for current render timestamp
132
- - `getStats()` - Get playback statistics
133
- - `dispose()` - Clean up resources
134
-
135
- **Events:**
136
- - `frame` - Emitted when a frame is decoded
137
- - `metadata` - Emitted when stream metadata is received
138
- - `statechange` - Emitted when player state changes
139
- - `error` - Emitted on errors
140
-
141
- ### `IStreamSource`
142
-
143
- Interface for stream data sources.
144
-
145
- **Events to emit:**
146
- - `data` - Stream data event with `{ trackName, streamType, data }`
147
- - `connected` - When connected
148
- - `disconnected` - When disconnected
149
- - `error` - On errors
150
-
151
- ## Rendering Frames to Canvas
152
-
153
- The player returns `VideoFrame` objects that can be rendered in multiple ways:
154
-
155
- ### Basic Canvas Rendering
156
-
157
- ```typescript
158
- const canvas = document.getElementById('video-canvas') as HTMLCanvasElement;
159
- const ctx = canvas.getContext('2d')!;
160
-
161
- function render(timestamp: number) {
162
- const frame = player.getVideoFrame(timestamp);
163
- if (frame) {
164
- // Resize canvas to match video dimensions
165
- if (canvas.width !== frame.displayWidth || canvas.height !== frame.displayHeight) {
166
- canvas.width = frame.displayWidth;
167
- canvas.height = frame.displayHeight;
168
- }
169
-
170
- // Draw the frame
171
- ctx.drawImage(frame, 0, 0);
172
-
173
- // IMPORTANT: Close the frame when done to release memory
174
- frame.close();
175
- }
176
- requestAnimationFrame(render);
177
- }
178
- requestAnimationFrame(render);
179
- ```
180
-
181
- ### WebGL / Three.js Rendering
182
-
183
- For GPU-accelerated rendering (e.g., in Three.js):
184
-
185
- ```typescript
186
- // Create a texture
187
- const texture = new THREE.Texture();
188
- texture.minFilter = THREE.LinearFilter;
189
- texture.magFilter = THREE.LinearFilter;
190
- texture.colorSpace = THREE.SRGBColorSpace;
191
-
192
- // In your render loop
193
- function render(timestamp: number) {
194
- const frame = player.getVideoFrame(timestamp);
195
- if (frame) {
196
- // Update texture with the VideoFrame
197
- texture.image = frame;
198
- texture.needsUpdate = true;
199
-
200
- // Close previous frame if stored
201
- if (lastFrame) lastFrame.close();
202
- lastFrame = frame;
203
- }
204
-
205
- renderer.render(scene, camera);
206
- requestAnimationFrame(render);
207
- }
208
- ```
209
-
210
- ### Handling YUV Frames (WASM Decoder)
211
-
212
- When using the WASM decoder, the library automatically converts YUV frames to `VideoFrame` objects using the browser's native I420 support. The GPU handles YUV→RGB conversion, so you can use the same rendering code regardless of decoder:
213
-
214
- ```typescript
215
- // The player always returns VideoFrame, even with WASM decoder
216
- const frame = player.getVideoFrame(timestamp);
217
- if (frame) {
218
- ctx.drawImage(frame, 0, 0);
219
- frame.close();
220
- }
221
- ```
222
-
223
- If you need raw YUV data for custom processing, you can access the `WasmDecoder` directly:
224
-
225
- ```typescript
226
- import { WasmDecoder } from '@stinkycomputing/web-live-player';
227
-
228
- const decoder = new WasmDecoder({
229
- onFrameDecoded: (yuvFrame) => {
230
- // yuvFrame has: { y, u, v, width, height, stride, chromaStride, chromaHeight, timestamp }
231
- // Process raw YUV data here
232
- },
233
- });
234
- ```
235
-
236
- ### Best Practices
237
-
238
- 1. **Always close VideoFrames** - Call `frame.close()` when done to prevent memory leaks
239
- 2. **Check for null frames** - `getVideoFrame()` returns null when no frame is ready
240
- 3. **Use performance.now()** - Pass accurate timestamps for proper frame scheduling
241
- 4. **Handle resize** - Update canvas dimensions when video dimensions change
242
-
243
-
244
- ## Bundler Configuration
245
-
246
- ### WASM Decoder (tinyh264)
247
-
248
- The WASM decoder uses `tinyh264` which requires special bundler configuration for its Web Worker and WASM assets.
249
-
250
- #### Vite
251
-
252
- Add the following to your `vite.config.ts`:
253
-
254
- ```typescript
255
- import { defineConfig } from 'vite';
256
-
257
- export default defineConfig({
258
- // Handle tinyh264's .asset files as URLs
259
- assetsInclude: ['**/*.asset'],
260
-
261
- // Ensure worker files are bundled correctly
262
- worker: {
263
- format: 'es',
264
- },
265
- });
266
- ```
267
-
268
- #### Webpack
269
-
270
- For Webpack, you may need to configure asset handling:
271
-
272
- ```javascript
273
- module.exports = {
274
- module: {
275
- rules: [
276
- {
277
- test: /\.asset$/,
278
- type: 'asset/resource',
279
- },
280
- ],
281
- },
282
- };
283
- ```
284
-
285
- ### WebCodecs Decoder (Recommended)
286
-
287
- If you only need WebCodecs-based decoding (hardware or software), no special bundler configuration is required. Simply use:
288
-
289
- ```typescript
290
- const player = createPlayer({
291
- preferredDecoder: 'webcodecs-hw', // or 'webcodecs-sw'
292
- });
293
- ```
294
-
295
- ## Demo
296
-
297
- Run the demo application:
298
-
299
- ```bash
300
- cd video-player
301
- npm install
302
- npm run dev
303
- ```
304
-
305
- Open http://localhost:3001 to see the demo.
306
-
307
- ## Building
308
-
309
- Build the library:
310
-
311
- ```bash
312
- npm run build
313
- ```
314
-
315
- Build the demo:
316
-
317
- ```bash
318
- npm run build:demo
319
- ```
320
-
321
- ## License
322
-
323
- MIT
1
+ # Web Live Player
2
+
3
+ A framework-agnostic video streaming library for playing back **Sesame** video streams. Sesame is a video engine that delivers low-latency video over MoQ (Media over QUIC) and WebSocket transports.
4
+
5
+ ## Features
6
+
7
+ - **Sesame stream playback** - Native support for Sesame video engine streams
8
+ - **WebCodecs-based decoding** - Hardware-accelerated video decoding
9
+ - **MoQ support** - Native Media over QUIC protocol support via `stinky-moq-js`
10
+ - **Pluggable stream sources** - Use dependency injection to provide video data from any transport
11
+ - **Frame scheduling** - Automatic buffering and drift correction for smooth playback
12
+ - **No framework dependencies** - Works with vanilla JS, React, Three.js, or any other framework
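The frame scheduling mentioned above can be sketched in isolation: given a buffer of decoded frames ordered by timestamp, pick the newest frame that is due at the render time and drop anything older. This is a simplified illustration of the idea, not the library's actual scheduler:

```typescript
interface BufferedFrame {
  timestampMs: number; // presentation time of the decoded frame
}

// Return the newest frame due at renderTimeMs, removing it and any
// older frames from the buffer (dropping late frames corrects drift).
function pickDueFrame(buffer: BufferedFrame[], renderTimeMs: number): BufferedFrame | null {
  let chosen: BufferedFrame | null = null;
  while (buffer.length > 0 && buffer[0].timestampMs <= renderTimeMs) {
    chosen = buffer.shift()!;
  }
  return chosen;
}
```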
13
+
14
+ ## Installation
15
+
16
+ ```bash
17
+ npm install @stinkycomputing/web-live-player
18
+ ```
19
+
20
+ For MoQ support, also install:
21
+
22
+ ```bash
23
+ npm install stinky-moq-js
24
+ ```
25
+
26
+ ## Quick Start
27
+
28
+ ### Using with MoQ (Standalone)
29
+
30
+ ```typescript
31
+ import { createPlayer, createStandaloneMoQSource } from '@stinkycomputing/web-live-player';
32
+
33
+ // Create player
34
+ const player = createPlayer({
35
+ preferredDecoder: 'webcodecs-hw',
36
+ bufferSizeFrames: 3,
37
+ });
38
+
39
+ // Create MoQ source
40
+ const moqSource = createStandaloneMoQSource({
41
+ relayUrl: 'https://moq-relay.example.com',
42
+ namespace: 'live/stream',
43
+ subscriptions: [
44
+ { trackName: 'video', streamType: 'video' },
45
+ ],
46
+ });
47
+
48
+ // Connect and play
49
+ await moqSource.connect();
50
+ player.setStreamSource(moqSource);
51
+ player.setTrackFilter('video');
52
+ player.play();
53
+
54
+ // Render loop (assumes a 2D canvas context `ctx` is in scope)
55
+ function render(timestamp: number) {
56
+ const frame = player.getVideoFrame(timestamp);
57
+ if (frame) {
58
+ ctx.drawImage(frame, 0, 0);
+ frame.close(); // release the frame once drawn
59
+ }
60
+ requestAnimationFrame(render);
61
+ }
62
+ requestAnimationFrame(render);
63
+ ```
64
+
65
+ ### Using with Elmo's MoQSession
66
+
67
+ ```typescript
68
+ import { createPlayer } from '@stinkycomputing/web-live-player';
69
+ import { MoQDiscoveryUtils } from '@elmo/core';
70
+
71
+ // Find session from Elmo's node tree
72
+ // MoQSessionNode implements IStreamSource directly
73
+ const session = MoQDiscoveryUtils.findMoQSession(currentNode, 'my-session');
74
+
75
+ // Create and configure player - session can be used directly as stream source
76
+ const player = createPlayer();
77
+ player.setStreamSource(session);
78
+ player.setTrackFilter('video-track');
79
+ player.play();
80
+ ```
81
+
82
+ ### Custom Stream Source
83
+
84
+ ```typescript
85
+ import { createPlayer, IStreamSource, BaseStreamSource } from '@stinkycomputing/web-live-player';
86
+
87
+ class MyCustomSource extends BaseStreamSource {
88
+ async connect() {
89
+ // Your connection logic
90
+ this._connected = true;
91
+ this.emit('connected');
92
+ }
93
+
94
+ // Call this when you receive video data
95
+ handleVideoData(trackName: string, data: ParsedData) {
96
+ this.emit('data', {
97
+ trackName,
98
+ streamType: 'video',
99
+ data,
100
+ });
101
+ }
102
+ }
103
+
104
+ const source = new MyCustomSource();
105
+ await source.connect();
106
+
107
+ const player = createPlayer();
108
+ player.setStreamSource(source);
109
+ player.play();
110
+ ```
111
+
112
+ ## API Reference
113
+
114
+ ### `createPlayer(config?)`
115
+
116
+ Creates a new player instance.
117
+
118
+ **Config options:**
119
+ - `preferredDecoder`: `'webcodecs-hw'` | `'webcodecs-sw'` | `'wasm'` - Decoder preference
120
+ - `bufferSizeFrames`: `number` - Target buffer size (default: 3)
121
+ - `debugLogging`: `boolean` - Enable debug logging
122
+
123
+ ### `LiveVideoPlayer`
124
+
125
+ Main player class.
126
+
127
+ **Methods:**
128
+ - `setStreamSource(source: IStreamSource)` - Set the stream data source
129
+ - `setTrackFilter(trackName: string)` - Filter for specific track
130
+ - `play()` - Start playback
131
+ - `pause()` - Pause playback
132
+ - `getVideoFrame(timestampMs: number)` - Get frame for current render timestamp
133
+ - `getStats()` - Get playback statistics
134
+ - `dispose()` - Clean up resources
135
+
136
+ **Events:**
137
+ - `frame` - Emitted when a frame is decoded
138
+ - `metadata` - Emitted when stream metadata is received
139
+ - `statechange` - Emitted when player state changes
140
+ - `error` - Emitted on errors
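In TypeScript, the event surface above can be captured as a type map for handler signatures. The event names come from the list above; the payload shapes here are assumptions for illustration, not the library's declared types:

```typescript
// Hypothetical event payload typing (names from the docs, shapes assumed).
interface PlayerEventMap {
  frame: { timestampMs: number };    // payload shape assumed
  metadata: Record<string, unknown>; // payload shape assumed
  statechange: string;               // state value assumed to be a string
  error: Error;
}

type PlayerEventHandler<K extends keyof PlayerEventMap> =
  (payload: PlayerEventMap[K]) => void;
```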
141
+
142
+ ### `IStreamSource`
143
+
144
+ Interface for stream data sources.
145
+
146
+ **Events to emit:**
147
+ - `data` - Stream data event with `{ trackName, streamType, data }`
148
+ - `connected` - When connected
149
+ - `disconnected` - When disconnected
150
+ - `error` - On errors
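A source carrying several tracks emits one `data` event per chunk, and the player's `setTrackFilter` then reduces to a simple predicate over those events. A minimal sketch of that idea (not the library's internal code):

```typescript
interface StreamDataEvent {
  trackName: string;
  streamType: 'video' | 'audio';
  data: unknown;
}

// Conceptually what setTrackFilter(trackName) does: only events for the
// selected track are forwarded to the decoder.
function makeTrackFilter(trackName: string): (evt: StreamDataEvent) => boolean {
  return (evt) => evt.trackName === trackName;
}
```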
151
+
152
+ ## Rendering Frames to Canvas
153
+
154
+ The player returns `VideoFrame` objects that can be rendered in multiple ways:
155
+
156
+ ### Basic Canvas Rendering
157
+
158
+ ```typescript
159
+ const canvas = document.getElementById('video-canvas') as HTMLCanvasElement;
160
+ const ctx = canvas.getContext('2d')!;
161
+
162
+ function render(timestamp: number) {
163
+ const frame = player.getVideoFrame(timestamp);
164
+ if (frame) {
165
+ // Resize canvas to match video dimensions
166
+ if (canvas.width !== frame.displayWidth || canvas.height !== frame.displayHeight) {
167
+ canvas.width = frame.displayWidth;
168
+ canvas.height = frame.displayHeight;
169
+ }
170
+
171
+ // Draw the frame
172
+ ctx.drawImage(frame, 0, 0);
173
+
174
+ // IMPORTANT: Close the frame when done to release memory
175
+ frame.close();
176
+ }
177
+ requestAnimationFrame(render);
178
+ }
179
+ requestAnimationFrame(render);
180
+ ```
181
+
182
+ ### WebGL / Three.js Rendering
183
+
184
+ For GPU-accelerated rendering (e.g., in Three.js):
185
+
186
+ ```typescript
187
+ import * as THREE from 'three';
+
+ // Create a texture
188
+ const texture = new THREE.Texture();
189
+ texture.minFilter = THREE.LinearFilter;
190
+ texture.magFilter = THREE.LinearFilter;
191
+ texture.colorSpace = THREE.SRGBColorSpace;
192
+
193
+ // Track the previously displayed frame so it can be closed
+ let lastFrame: VideoFrame | null = null;
+
+ // In your render loop
194
+ function render(timestamp: number) {
195
+ const frame = player.getVideoFrame(timestamp);
196
+ if (frame) {
197
+ // Update texture with the VideoFrame
198
+ texture.image = frame;
199
+ texture.needsUpdate = true;
200
+
201
+ // Close previous frame if stored
202
+ if (lastFrame) lastFrame.close();
203
+ lastFrame = frame;
204
+ }
205
+
206
+ renderer.render(scene, camera);
207
+ requestAnimationFrame(render);
208
+ }
209
+ ```
210
+
211
+ ### Handling YUV Frames (WASM Decoder)
212
+
213
+ When using the WASM decoder, the library automatically converts YUV frames to `VideoFrame` objects using the browser's native I420 support. The GPU handles YUV→RGB conversion, so you can use the same rendering code regardless of decoder:
214
+
215
+ ```typescript
216
+ // The player always returns VideoFrame, even with WASM decoder
217
+ const frame = player.getVideoFrame(timestamp);
218
+ if (frame) {
219
+ ctx.drawImage(frame, 0, 0);
220
+ frame.close();
221
+ }
222
+ ```
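For reference, an I420 frame's plane sizes follow directly from its dimensions. This sketch shows the layout the `yuvFrame` fields describe (illustrative, assuming no extra row padding in the strides):

```typescript
// I420: a full-resolution Y plane plus quarter-resolution U and V planes.
function i420PlaneSizes(width: number, height: number) {
  const ySize = width * height;
  const chromaWidth = Math.ceil(width / 2);
  const chromaHeight = Math.ceil(height / 2);
  const chromaSize = chromaWidth * chromaHeight;
  return { ySize, uSize: chromaSize, vSize: chromaSize, total: ySize + 2 * chromaSize };
}
```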
223
+
224
+ If you need raw YUV data for custom processing, you can access the `WasmDecoder` directly:
225
+
226
+ ```typescript
227
+ import { WasmDecoder } from '@stinkycomputing/web-live-player';
228
+
229
+ const decoder = new WasmDecoder({
230
+ onFrameDecoded: (yuvFrame) => {
231
+ // yuvFrame has: { y, u, v, width, height, stride, chromaStride, chromaHeight, timestamp }
232
+ // Process raw YUV data here
233
+ },
234
+ });
235
+ ```
236
+
237
+ ### Best Practices
238
+
239
+ 1. **Always close VideoFrames** - Call `frame.close()` when done to prevent memory leaks
240
+ 2. **Check for null frames** - `getVideoFrame()` returns null when no frame is ready
241
+ 3. **Use performance.now()** - Pass accurate timestamps for proper frame scheduling
242
+ 4. **Handle resize** - Update canvas dimensions when video dimensions change
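The resize check from point 4 is cheap enough to run every frame and can be factored into a small guard (a sketch, not part of the library's API):

```typescript
// Apply the frame's display dimensions to the canvas only when they changed.
// Returns true if a resize was performed.
function syncCanvasSize(
  canvas: { width: number; height: number },
  displayWidth: number,
  displayHeight: number,
): boolean {
  if (canvas.width === displayWidth && canvas.height === displayHeight) return false;
  canvas.width = displayWidth;
  canvas.height = displayHeight;
  return true;
}
```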
243
+
244
+
245
+ ## Bundler Configuration
246
+
247
+ ### WASM Decoder (tinyh264)
248
+
249
+ The WASM decoder uses `tinyh264`, which requires special bundler configuration for its Web Worker and WASM assets.
250
+
251
+ #### Vite
252
+
253
+ Add the following to your `vite.config.ts`:
254
+
255
+ ```typescript
256
+ import { defineConfig } from 'vite';
257
+
258
+ export default defineConfig({
259
+ // Handle tinyh264's .asset files as URLs
260
+ assetsInclude: ['**/*.asset'],
261
+
262
+ // Ensure worker files are bundled correctly
263
+ worker: {
264
+ format: 'es',
265
+ },
266
+ });
267
+ ```
268
+
269
+ #### Webpack
270
+
271
+ For Webpack, you may need to configure asset handling:
272
+
273
+ ```javascript
274
+ module.exports = {
275
+ module: {
276
+ rules: [
277
+ {
278
+ test: /\.asset$/,
279
+ type: 'asset/resource',
280
+ },
281
+ ],
282
+ },
283
+ };
284
+ ```
285
+
286
+ ### WebCodecs Decoder (Recommended)
287
+
288
+ If you only need WebCodecs-based decoding (hardware or software), no special bundler configuration is required. Simply use:
289
+
290
+ ```typescript
291
+ const player = createPlayer({
292
+ preferredDecoder: 'webcodecs-hw', // or 'webcodecs-sw'
293
+ });
294
+ ```
295
+
296
+ ## Demo
297
+
298
+ Run the demo application:
299
+
300
+ ```bash
301
+ cd video-player
302
+ npm install
303
+ npm run dev
304
+ ```
305
+
306
+ Open http://localhost:3001 to see the demo.
307
+
308
+ ## Building
309
+
310
+ Build the library:
311
+
312
+ ```bash
313
+ npm run build
314
+ ```
315
+
316
+ Build the demo:
317
+
318
+ ```bash
319
+ npm run build:demo
320
+ ```
321
+
322
+ ## License
323
+
324
+ MIT
@@ -14,7 +14,6 @@ export declare class WasmDecoder implements IVideoDecoder {
14
14
  private _queueSize;
15
15
  private pendingTimestamps;
16
16
  private pendingFrames;
17
- private maxQueueSize;
18
17
  private onFrameDecoded?;
19
18
  private onError?;
20
19
  private onQueueOverflow?;