node-web-audio-api 0.21.0 → 0.21.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/CHANGELOG.md CHANGED
@@ -1,6 +1,20 @@
+ # CHANGELOG
+
+ ## v0.21.2 (20/09/2024)
+
+ - Update upstream crate to [v1.0.1](https://github.com/orottier/web-audio-api-rs/blob/main/CHANGELOG.md#version-101-2024-09-18)
+ - Fix: Make sure `AudioBuffer` returned by `OfflineContext` is valid
+ - Fix: Allow contexts to be properly garbage collected
+
+ ## v0.21.1 (10/06/2024)
+
+ - Feat: Buffer pool for AudioWorkletProcessor
+ - Fix: Propagate `addModule` errors to main thread
+ - Fix: Memory leak due to `onended` events
+
  ## v0.21.0 (17/05/2024)

- - Feat: implement AudioWorkletNode
+ - Feat: Implement AudioWorkletNode

  ## v0.20.0 (29/04/2024)

@@ -86,7 +100,7 @@
  ## v0.5.0 (19/12/2022)

  - Implement AudioParam#setValueCurveAtTime
- - Offline context constructor
+ - Offline context constructor

  ## v0.4.0 (07/11/2022)

package/README.md CHANGED
@@ -65,7 +65,7 @@ node examples/granular-scrub.mjs

  ## Caveats

- - `Streams`: only a minimial audio input stream and the `MediaStreamSourceNode` are provided. All other `MediaStream` features are left on the side for now as they principaly concern a different API specification, which is not a trivial problem.
+ - `Streams`: only a minimal audio input stream and the `MediaStreamSourceNode` are provided. All other `MediaStream` features are left on the side for now as they principally concern a different API specification, which is not a trivial problem.

  ## Supported Platforms

@@ -79,24 +79,7 @@ node examples/granular-scrub.mjs
  | Linux arm gnueabihf (RPi) | ✓ | ✓ |
  | Linux arm64 gnu (RPi) | ✓ | ✓ |

-
- ## Notes for Linux users
-
- Using the library on Linux with the ALSA backend might lead to unexpected cranky sound with the default render size (i.e. 128 frames). In such cases, a simple workaround is to pass the `playback` latency hint when creating the audio context, which will increase the render size to 1024 frames:
-
- ```js
- const audioContext = new AudioContext({ latencyHint: 'playback' });
- ```
-
- For real-time and interactive applications where low latency is crucial, you should instead rely on the JACK backend provided by `cpal`. By default the audio context will use that backend if a running JACK server is found.
-
- If you don't have JACK installed, you can still pass the `WEB_AUDIO_LATENCY=playback` env variable to all examples to create the audio context with the playback latency hint, e.g.:
-
- ```sh
- WEB_AUDIO_LATENCY=playback node examples/amplitude-modulation.mjs
- ```
-
- ### Manual Build
+ ## Manual Build

  If prebuilt binaries are not shipped for your platform, you will need to:

@@ -119,15 +102,51 @@ The package will be built on your machine, which might take some time.

  Be aware that the package won't be listed in your `package.json` file, and that it won't be re-installed if running `npm install` again. A possible workaround would be to include the above in a postinstall script.

+ ## Notes for Linux users
+
+ ### Build
+
+ To build the library, you will need to manually install the `libasound2-dev` package:
+
+ ```sh
+ sudo apt install libasound2-dev
+ ```
+
+ Optionally, if you use the JACK audio backend, also install the `libjack-jackd2-dev` package:
+
+ ```sh
+ sudo apt install libjack-jackd2-dev
+ ```
+
+ In that case, you can use the `npm run build:jack` script to enable the JACK feature.
+
+ ### Audio backend and latency
+
+ Using the library on Linux with the ALSA backend might lead to unexpected cranky sound with the default render size (i.e. 128 frames). In such cases, a simple workaround is to pass the `playback` latency hint when creating the audio context, which will increase the render size to 1024 frames:
+
+ ```js
+ const audioContext = new AudioContext({ latencyHint: 'playback' });
+ ```
+
+ For real-time and interactive applications where low latency is crucial, you should instead rely on the JACK backend provided by `cpal`. By default the audio context will use that backend if a running JACK server is found.
+
+ If you don't have JACK installed, you can still pass the `WEB_AUDIO_LATENCY=playback` environment variable to all examples to create the audio context with the playback latency hint, e.g.:
+
+ ```sh
+ WEB_AUDIO_LATENCY=playback node examples/amplitude-modulation.mjs
+ ```
+
  ## Development notes

+ ### Synchronize versioning
+
  The npm `postversion` script relies on [`cargo-bump`](https://crates.io/crates/cargo-bump) to maintain versions synced between the `package.json` and the `Cargo.toml` files. Therefore, you will need to install `cargo-bump` on your machine:

  ```
  cargo install cargo-bump
  ```

- ## Running the web-platform-test suite
+ ### Running the web-platform-test suite

  Follow the steps for 'Manual Build' first. Then check out the web-platform-tests submodule with:

package/all-checks.sh ADDED
@@ -0,0 +1,18 @@
+ #!/bin/bash
+
+
+ echo "-----------------------------------------------"
+ echo "> cargo fmt -- --check --color always"
+ echo "-----------------------------------------------"
+ cargo fmt -- --check --color always
+
+ echo "-----------------------------------------------"
+ echo "> cargo clippy --all-targets --features cpal -- -D warnings"
+ echo "-----------------------------------------------"
+ cargo clippy --all-targets --features cpal -- -D warnings
+
+ echo "-----------------------------------------------"
+ echo "> Run js tests"
+ echo "-----------------------------------------------"
+ npm run test
+
@@ -89,30 +89,22 @@ module.exports = function(jsExport, nativeBinding) {
  });

  // Add function to Napi object to bridge from Rust events to JS EventTarget
- this[kNapiObj][kOnStateChange] = (err, rawEvent) => {
- if (typeof rawEvent !== 'object' && !('type' in rawEvent)) {
- throw new TypeError('Invalid [kOnStateChange] Invocation: rawEvent should have a type property');
- }
-
+ this[kNapiObj][kOnStateChange] = (function(err, rawEvent) {
  const event = new Event(rawEvent.type);
  propagateEvent(this, event);
- };
-
- this[kNapiObj][kOnSinkChange] = (err, rawEvent) => {
- if (typeof rawEvent !== 'object' && !('type' in rawEvent)) {
- throw new TypeError('Invalid [kOnSinkChange] Invocation: rawEvent should have a type property');
- }
+ }).bind(this);

+ this[kNapiObj][kOnSinkChange] = (function(err, rawEvent) {
  const event = new Event(rawEvent.type);
  propagateEvent(this, event);
- };
+ }).bind(this);

  // Workaround to bind the `sinkchange` and `statechange` events to EventTarget.
  // This must be called from the JS facade ctor as the JS handlers are added to the Napi
  // object after its instantiation, and we don't have any initial `resume` call.
  this[kNapiObj].listen_to_events();

- // @todo - check if this is still required
+ // @todo - This is probably not required anymore as the event listeners
  // prevent garbage collection and process exit
  const id = contextId++;
  // store in process to prevent garbage collection
@@ -125,7 +117,6 @@ module.exports = function(jsExport, nativeBinding) {
  });
  // keep process awake until context is closed
  const keepAwakeId = setInterval(() => {}, 10 * 1000);
-
  // clear on close
  this.addEventListener('statechange', () => {
  if (this.state === 'closed') {
@@ -134,6 +125,11 @@ module.exports = function(jsExport, nativeBinding) {
  clearTimeout(keepAwakeId);
  }
  });
+
+ // for wpt tests, see ./.scripts/wpt_harness.mjs for more information
+ if (process.WPT_TEST_RUNNER) {
+ process.WPT_TEST_RUNNER.once('cleanup', () => this.close());
+ }
  }

  get baseLatency() {
@@ -31,10 +31,10 @@ class AudioRenderCapacity extends EventTarget {

  this[kNapiObj] = options[kNapiObj];

- this[kNapiObj][kOnUpdate] = (err, rawEvent) => {
+ this[kNapiObj][kOnUpdate] = (function(err, rawEvent) {
  const event = new AudioRenderCapacityEvent('update', rawEvent);
  propagateEvent(this, event);
- };
+ }).bind(this);

  this[kNapiObj].listen_to_events();
  }
@@ -33,14 +33,15 @@ class AudioScheduledSourceNode extends AudioNode {

  // Add function to Napi object to bridge from Rust events to JS EventTarget
  // It will be effectively registered on rust side when `start` is called
- this[kNapiObj][kOnEnded] = (err, rawEvent) => {
- if (typeof rawEvent !== 'object' && !('type' in rawEvent)) {
- throw new TypeError('Invalid [kOnEnded] Invocation: rawEvent should have a type property');
- }
-
+ //
+ // Note 2024-06-05 - We use bind instead of an arrow function because arrow functions
+ // prevent the node from being collected by the Scavenge step of the GC, which can lead to
+ // oversized graphs and performance issues.
+ // cf. https://github.com/ircam-ismm/node-web-audio-api/tree/fix/118
+ this[kNapiObj][kOnEnded] = (function(_err, rawEvent) {
  const event = new Event(rawEvent.type);
  propagateEvent(this, event);
- };
+ }).bind(this);
  }

  get onended() {
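The note above documents why this release switches the Rust-to-JS event bridges from arrow functions to plain functions bound with `.bind(this)`. The snippet below is a minimal, self-contained sketch of that pattern only; `FakeNode` and the bare `napiObj` object are hypothetical stand-ins, and the GC rationale is the one stated in the note (cf. the fix/118 branch), not something the sketch itself demonstrates.

```js
// Hypothetical stand-ins: `FakeNode` and `napiObj` are illustration only,
// not part of the package's API.
class FakeNode {
  constructor(napiObj) {
    // Before: an arrow function handler, capturing `this` lexically.
    // napiObj.onended = (_err, rawEvent) => this.dispatch(rawEvent.type);

    // After: a plain function explicitly bound to `this`, as in the diff.
    napiObj.onended = (function(_err, rawEvent) {
      this.dispatch(rawEvent.type);
    }).bind(this);
  }

  dispatch(type) {
    console.log(`dispatching "${type}" event`);
  }
}

const napiObj = {};
new FakeNode(napiObj);
// the native side would invoke the stored callback roughly like this:
napiObj.onended(null, { type: 'ended' }); // -> dispatching "ended" event
```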
@@ -64,7 +64,7 @@ const resolveModule = async (moduleUrl) => {
  // get caller site from error stack trace
  const callerSite = caller(2);

- if (callerSite.startsWith('http')) {
+ if (callerSite.startsWith('http')) { // this branch exists for wpt, where the caller site is a URL
  let url;
  // handle origin relative and caller path relative URLs
  if (moduleUrl.startsWith('/')) {
@@ -134,6 +134,14 @@ class AudioWorklet {
  resolve();
  break;
  }
+ case 'node-web-audio-api:worklet:add-module-failed': {
+ const { promiseId, ctor, name, message } = event;
+ const { reject } = this.#idPromiseMap.get(promiseId);
+ this.#idPromiseMap.delete(promiseId);
+ const err = new globalThis[ctor](message, name);
+ reject(err);
+ break;
+ }
  case 'node-web-audio-api:worlet:processor-registered': {
  const { name, parameterDescriptors } = event;
  this.#workletParamDescriptorsMap.set(name, parameterDescriptors);
@@ -188,7 +196,6 @@ class AudioWorklet {
  // For OfflineAudioContext only, check that all processors have been properly
  // created before actual `startRendering`
  async [kCheckProcessorsCreated]() {
- // console.log(this.#pendingCreateProcessors);
  return new Promise(async resolve => {
  while (this.#pendingCreateProcessors.size !== 0) {
  // we need a microtask to ensure message can be received
@@ -1,10 +1,12 @@
  const {
  parentPort,
  workerData,
+ markAsUntransferable,
  } = require('node:worker_threads');

  const conversions = require('webidl-conversions');

+ // these are defined on the rust side
  const {
  exit_audio_worklet_global_scope,
  run_audio_worklet_global_scope,
@@ -21,14 +23,72 @@ const kWorkletInputs = Symbol.for('node-web-audio-api:worklet-inputs');
  const kWorkletOutputs = Symbol.for('node-web-audio-api:worklet-outputs');
  const kWorkletParams = Symbol.for('node-web-audio-api:worklet-params');
  const kWorkletParamsCache = Symbol.for('node-web-audio-api:worklet-params-cache');
+ const kWorkletGetBuffer = Symbol.for('node-web-audio-api:worklet-get-buffer');
+ const kWorkletRecycleBuffer = Symbol.for('node-web-audio-api:worklet-recycle-buffer');
+ const kWorkletRecycleBuffer1 = Symbol.for('node-web-audio-api:worklet-recycle-buffer-1');
+ const kWorkletMarkAsUntransferable = Symbol.for('node-web-audio-api:worklet-mark-as-untransferable');
  // const kWorkletOrderedParamNames = Symbol.for('node-web-audio-api:worklet-ordered-param-names');

+
  const nameProcessorCtorMap = new Map();
  const processors = {};
  let pendingProcessorConstructionData = null;
  let loopStarted = false;
  let runLoopImmediateId = null;

+ class BufferPool {
+ #bufferSize;
+ #pool;
+
+ constructor(bufferSize, initialPoolSize) {
+ this.#bufferSize = bufferSize;
+ this.#pool = new Array(initialPoolSize);
+
+ for (let i = 0; i < this.#pool.length; i++) {
+ this.#pool[i] = this.#allocate();
+ }
+ }
+
+ #allocate() {
+ const float32 = new Float32Array(this.#bufferSize);
+ markAsUntransferable(float32);
+ // Mark the underlying buffer as untransferable too; this will fail one of
+ // the tasks in `audioworkletprocessor-process-frozen-array.https.html`
+ // but prevents a segmentation fault
+ markAsUntransferable(float32.buffer);
+
+ return float32;
+ }
+
+ get() {
+ if (this.#pool.length === 0) {
+ return this.#allocate();
+ }
+
+ return this.#pool.pop();
+ }
+
+ recycle(buffer) {
+ // make sure we cannot pollute our pool
+ if (buffer.length === this.#bufferSize) {
+ this.#pool.push(buffer);
+ }
+ }
+ }
+
+ const renderQuantumSize = 128;
+
+ const pool128 = new BufferPool(renderQuantumSize, 256);
+ const pool1 = new BufferPool(1, 64);
+ // allow rust to access some methods required when the io layout changes
+ globalThis[kWorkletGetBuffer] = () => pool128.get();
+ globalThis[kWorkletRecycleBuffer] = buffer => pool128.recycle(buffer);
+ globalThis[kWorkletRecycleBuffer1] = buffer => pool1.recycle(buffer);
+ globalThis[kWorkletMarkAsUntransferable] = obj => {
+ markAsUntransferable(obj);
+ return obj;
+ }
+
  function isIterable(obj) {
  // checks for null and undefined
  if (obj === null || obj === undefined) {
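The hunk above wires the pool into `globalThis` hooks that the rust side calls when the I/O layout changes. The short sketch below only illustrates the `get()`/`recycle()` contract, assuming the `BufferPool` class from this hunk is in scope; `renderOneQuantum` is a hypothetical placeholder, not a function of the package.

```js
// Usage sketch only; assumes the BufferPool class defined above is in scope.
const pool = new BufferPool(128, 4); // small initial pool for the example

function renderOneQuantum() {
  const output = pool.get();  // borrow a Float32Array(128), allocating only if the pool is empty
  output.fill(0);             // ...write samples...
  return output;
}

const buffer = renderOneQuantum();
// hand the buffer back once the consumer is done with it, instead of letting it become garbage
pool.recycle(buffer);
// buffers of the wrong size are silently dropped, so the pool cannot be polluted
pool.recycle(new Float32Array(64));
```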
@@ -54,12 +114,11 @@ function runLoop() {
  runLoopImmediateId = setImmediate(runLoop);
  }

- // s
  globalThis.currentTime = 0
  globalThis.currentFrame = 0;
  globalThis.sampleRate = sampleRate;
  // @todo - implement in upstream crate
- // globalThis.renderQuantumSize = 128;
+ globalThis.renderQuantumSize = renderQuantumSize;

  globalThis.AudioWorkletProcessor = class AudioWorkletProcessor {
  static get parameterDescriptors() {
@@ -76,13 +135,14 @@ globalThis.AudioWorkletProcessor = class AudioWorkletProcessor {
  parameterDescriptors,
  } = pendingProcessorConstructionData;

- // @todo - Mark [[callable process]] as true, set to false in render quantum if
+ // Mark [[callable process]] as true, set to false in render quantum if
  // either "process" does not exist or it throws an error
  this[kWorkletCallableProcess] = true;
- // @todo - reuse Float32Arrays between calls + freeze arrays
+
+ // Populate with dummy values which will be replaced in first render call
  this[kWorkletInputs] = new Array(numberOfInputs).fill([]);
- // @todo - use `outputChannelCount`
  this[kWorkletOutputs] = new Array(numberOfOutputs).fill([]);
+
  // Object to be reused as `process` parameters argument
  this[kWorkletParams] = {};
  // Cache of 2 Float32Array (of length 128 and 1) for each param, to be reused on
@@ -91,8 +151,8 @@ globalThis.AudioWorkletProcessor = class AudioWorkletProcessor {

  parameterDescriptors.forEach(desc => {
  this[kWorkletParamsCache][desc.name] = [
- new Float32Array(128), // should be globalThis.renderQuantumSize
- new Float32Array(1),
+ pool128.get(), // should be globalThis.renderQuantumSize
+ pool1.get(),
  ]
  });

@@ -234,13 +294,11 @@ globalThis.registerProcessor = function registerProcessor(name, processorCtor) {
  // NOTE: Authors that register an event listener on the "message" event of this
  // port should call close on either end of the MessageChannel (either in the
  // AudioWorklet or the AudioWorkletGlobalScope side) to allow for resources to be collected.
- parentPort.on('exit', () => {
- process.stdout.write('closing worklet');
- });
+ // parentPort.on('exit', () => {
+ // process.stdout.write('closing worklet');
+ // });

  parentPort.on('message', event => {
- console.log(event.cmd + '\n');
-
  switch (event.cmd) {
  case 'node-web-audio-api:worklet:init': {
  const { workletId, processors, promiseId } = event;
@@ -257,13 +315,23 @@ parentPort.on('message', event => {
  case 'node-web-audio-api:worklet:add-module': {
  const { code, promiseId } = event;
  const func = new Function('AudioWorkletProcessor', 'registerProcessor', code);
- func(AudioWorkletProcessor, registerProcessor);

- // send registered param descriptors on main thread and resolve Promise
- parentPort.postMessage({
- cmd: 'node-web-audio-api:worklet:module-added',
- promiseId,
- });
+ try {
+ func(AudioWorkletProcessor, registerProcessor);
+ // send registered param descriptors on main thread and resolve Promise
+ parentPort.postMessage({
+ cmd: 'node-web-audio-api:worklet:module-added',
+ promiseId,
+ });
+ } catch (err) {
+ parentPort.postMessage({
+ cmd: 'node-web-audio-api:worklet:add-module-failed',
+ promiseId,
+ ctor: err.constructor.name,
+ name: err.name,
+ message: err.message,
+ });
+ }
  break;
  }
  case 'node-web-audio-api:worklet:create-processor': {
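Together with the `add-module-failed` case added to the main-thread `AudioWorklet` earlier in this diff, the `try/catch` above implements the changelog entry "Fix: Propagate `addModule` errors to main thread": the error is flattened to plain data before being posted, and an equivalent error is rebuilt on the receiving side to reject the pending promise. The sketch below replays that round trip in isolation; `serializeError` and `deserializeError` are hypothetical helpers, not part of the package.

```js
// Hypothetical helpers illustrating the round trip; not part of the package.
function serializeError(err) {
  // keep only the pieces needed to rebuild an equivalent error on the other side
  return { ctor: err.constructor.name, name: err.name, message: err.message };
}

function deserializeError({ ctor, name, message }) {
  // e.g. 'SyntaxError'  -> new SyntaxError(message)
  //      'DOMException' -> new DOMException(message, name)
  const Ctor = globalThis[ctor] ?? Error;
  return new Ctor(message, name);
}

// worklet thread: evaluating a broken module throws
let payload;
try {
  new Function('registerProcessor', 'this is not valid javascript')();
} catch (err) {
  payload = serializeError(err); // would be sent via parentPort.postMessage(...)
}

// main thread: reject the pending addModule() promise with the rebuilt error
const rebuilt = deserializeError(payload);
console.log(rebuilt instanceof SyntaxError, rebuilt.message);
```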
@@ -98,14 +98,16 @@ module.exports = (jsExport, nativeBinding) => {
  throw new DOMException(`Failed to construct 'AudioWorkletNode': Invalid 'outputChannelCount' property from AudioWorkletNodeOptions: 'outputChannelCount' length (${parsedOptions.outputChannelCount.length}) does not equal 'numberOfOutputs' (${parsedOptions.numberOfOutputs})`, 'IndexSizeError');
  }
  } else {
- // If outputChannelCount does not exists,
  // - If both numberOfInputs and numberOfOutputs are 1, set the initial channel count of the node output to 1 and return.
  // NOTE: For this case, the output channel count will change to computedNumberOfChannels dynamically based on the input and the channelCountMode at runtime.
- // - Otherwise set the channel count of each output of the node to 1 and return.
-
- // @note - not sure what this means, let's go simple
- parsedOptions.outputChannelCount = new Uint32Array(parsedOptions.numberOfOutputs);
- parsedOptions.outputChannelCount.fill(1);
+ if (parsedOptions.numberOfInputs === 1 && parsedOptions.numberOfOutputs === 1) {
+ // rust waits for an empty Vec as the special case value
+ parsedOptions.outputChannelCount = new Uint32Array(0);
+ } else {
+ // - Otherwise set the channel count of each output of the node to 1 and return.
+ parsedOptions.outputChannelCount = new Uint32Array(parsedOptions.numberOfOutputs);
+ parsedOptions.outputChannelCount.fill(1);
+ }
  }

  // @todo
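The new branch above fills in the spec's default handling of `outputChannelCount` when the option is omitted, with an empty `Uint32Array` as the special value the rust side expands dynamically for the 1-input/1-output case. The sketch below works through the three cases with a hypothetical `defaultOutputChannelCount` helper that mirrors the hunk's logic; it is not a function exposed by the package.

```js
// Hypothetical helper mirroring the hunk above; illustration only.
function defaultOutputChannelCount({ numberOfInputs = 1, numberOfOutputs = 1, outputChannelCount } = {}) {
  if (outputChannelCount !== undefined) {
    return Uint32Array.from(outputChannelCount); // length validation omitted here
  } else if (numberOfInputs === 1 && numberOfOutputs === 1) {
    return new Uint32Array(0); // special value: channel count follows the input at runtime
  } else {
    return new Uint32Array(numberOfOutputs).fill(1); // one channel per output
  }
}

console.log(defaultOutputChannelCount({ numberOfInputs: 1, numberOfOutputs: 1 }));
// -> Uint32Array(0) []
console.log(defaultOutputChannelCount({ numberOfInputs: 2, numberOfOutputs: 3 }));
// -> Uint32Array(3) [ 1, 1, 1 ]
console.log(defaultOutputChannelCount({ numberOfOutputs: 2, outputChannelCount: [2, 4] }));
// -> Uint32Array(2) [ 2, 4 ]
```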
@@ -82,33 +82,26 @@ module.exports = function patchOfflineAudioContext(jsExport, nativeBinding) {

  // Add function to Napi object to bridge from Rust events to JS EventTarget
  // They will be effectively registered on rust side when `startRendering` is called
- this[kNapiObj][kOnStateChange] = (err, rawEvent) => {
- if (typeof rawEvent !== 'object' && !('type' in rawEvent)) {
- throw new TypeError('Invalid [kOnStateChange] Invocation: rawEvent should have a type property');
- }
-
+ this[kNapiObj][kOnStateChange] = (function(_err, rawEvent) {
  const event = new Event(rawEvent.type);
  propagateEvent(this, event);
- };
+ }).bind(this);

  // This event is, per spec, the last triggered one
- this[kNapiObj][kOnComplete] = (err, rawEvent) => {
- if (typeof rawEvent !== 'object' && !('type' in rawEvent)) {
- throw new TypeError('Invalid [kOnComplete] Invocation: rawEvent should have a type property');
- }
-
- // @fixme: workaround the fact that this event seems to be triggered before
+ this[kNapiObj][kOnComplete] = (function(err, rawEvent) {
+ // workaround the fact that the oncomplete event is triggered before
  // startRendering fulfills and that we want to return the exact same instance
- if (this.#renderedBuffer === null) {
- this.#renderedBuffer = new jsExport.AudioBuffer({ [kNapiObj]: rawEvent.renderedBuffer });
- }
+ this.#renderedBuffer = new jsExport.AudioBuffer({ [kNapiObj]: rawEvent.renderedBuffer });

  const event = new jsExport.OfflineAudioCompletionEvent(rawEvent.type, {
  renderedBuffer: this.#renderedBuffer,
  });

- propagateEvent(this, event);
- };
+ // delay event propagation to the next tick so that it is executed after startRendering fulfills
+ setImmediate(() => {
+ propagateEvent(this, event);
+ }, 0);
+ }).bind(this);
  }

  get length() {
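The `setImmediate` added above defers the `complete` event so that it is observed only after the `startRendering()` promise has fulfilled, which is what keeps the `AudioBuffer` returned by `OfflineAudioContext` consistent with the event payload (see the v0.21.2 changelog entry). The sketch below reproduces that ordering with hypothetical names; it is not the package's internal code.

```js
// Hypothetical illustration of the ordering; not the package's internals.
function fakeStartRendering(onComplete) {
  return new Promise(resolve => {
    const renderedBuffer = { length: 128 }; // stand-in for the native AudioBuffer
    // defer the 'complete' event: promise microtasks (the .then below) run
    // before the setImmediate callback, so startRendering fulfills first
    setImmediate(() => onComplete(renderedBuffer));
    resolve(renderedBuffer);
  });
}

fakeStartRendering(buffer => console.log('2. complete event, length', buffer.length))
  .then(buffer => console.log('1. startRendering fulfilled, length', buffer.length));
// -> 1. startRendering fulfilled, length 128
// -> 2. complete event, length 128
```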
@@ -153,15 +146,9 @@ module.exports = function patchOfflineAudioContext(jsExport, nativeBinding) {
  throwSanitizedError(err);
  }

- // release audio worklet, if any
+ // release audio worklets
  await this.audioWorklet[kWorkletRelease]();

- // workaround the fact that this event seems to be triggered before
- // startRendering fulfills and that we want to return the exact same instance
- if (this.#renderedBuffer === null) {
- this.#renderedBuffer = new jsExport.AudioBuffer({ [kNapiObj]: nativeAudioBuffer });
- }
-
  return this.#renderedBuffer;
  }

@@ -107,11 +107,7 @@ module.exports = (jsExport, nativeBinding) => {
  [kNapiObj]: napiObj,
  });

- this[kNapiObj][kOnAudioProcess] = (err, rawEvent) => {
- if (typeof rawEvent !== 'object' && !('type' in rawEvent)) {
- throw new TypeError('Invalid [kOnStateChange] Invocation: rawEvent should have a type property');
- }
-
+ this[kNapiObj][kOnAudioProcess] = (function(err, rawEvent) {
  const audioProcessingEventInit = {
  playbackTime: rawEvent.playbackTime,
  inputBuffer: new jsExport.AudioBuffer({ [kNapiObj]: rawEvent.inputBuffer }),
@@ -120,7 +116,7 @@ module.exports = (jsExport, nativeBinding) => {

  const event = new jsExport.AudioProcessingEvent('audioprocess', audioProcessingEventInit);
  propagateEvent(this, event);
- };
+ }).bind(this);

  this[kNapiObj].listen_to_events();
  }
Binary file
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "node-web-audio-api",
- "version": "0.21.0",
+ "version": "0.21.2",
  "author": "Benjamin Matuszewski",
  "description": "Node.js bindings for web-audio-api-rs using napi-rs",
  "exports": {
@@ -44,13 +44,15 @@
  "lint": "npx eslint index.cjs index.mjs && npx eslint js/*.js && npx eslint examples/*.mjs",
  "preversion": "yarn install && npm run generate",
  "postversion": "cargo bump $npm_package_version && git commit -am \"v$npm_package_version\" && node .scripts/check-changelog.mjs",
- "test": "mocha tests",
+ "test": "mocha tests/*.spec.mjs",
+ "test:only": "mocha",
  "wpt": "npm run build && node ./.scripts/wpt-harness.mjs",
  "wpt:only": "node ./.scripts/wpt-harness.mjs"
  },
  "devDependencies": {
  "@ircam/eslint-config": "^1.3.0",
  "@ircam/sc-gettime": "^1.0.0",
+ "@ircam/sc-scheduling": "^0.1.7",
  "@ircam/sc-utils": "^1.3.3",
  "@sindresorhus/slugify": "^2.1.1",
  "camelcase": "^7.0.1",
@@ -66,7 +68,6 @@
  "octokit": "^2.0.11",
  "ping": "^0.4.2",
  "template-literal": "^1.0.4",
- "waves-masters": "^2.3.1",
  "webidl2": "^24.2.0",
  "wpt-runner": "^5.0.0"
  },