node-web-audio-api 0.20.0 → 0.21.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/CHANGELOG.md CHANGED
@@ -1,3 +1,13 @@
+ ## v0.21.1 (10/06/2024)
+
+ - Feat: Buffer pool for AudioWorkletProcessor
+ - Fix: Propagate `addModule` errors to main thread
+ - Fix: Memory leak due to `onended` events
+
+ ## v0.21.0 (17/05/2024)
+
+ - Feat: Implement AudioWorkletNode
+
  ## v0.20.0 (29/04/2024)
 
  - Update upstream crate to [v0.44.0](https://github.com/orottier/web-audio-api-rs/blob/main/CHANGELOG.md#version-0440-2024-04-22)
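The "buffer pool" entry above refers to reusing audio blocks rather than allocating a fresh `Float32Array` on every render quantum, which avoids garbage-collector pressure on the audio path. A minimal sketch of the idea in plain JavaScript (the `BufferPool` class is hypothetical and purely illustrative; the package's actual pool lives on the native side):

```javascript
// Reuse fixed-size Float32Array blocks across render quanta instead of
// allocating a new one per call. Released blocks are zeroed and kept on
// a free list for the next acquire().
class BufferPool {
  #free = [];
  constructor(length) {
    this.length = length;
  }
  acquire() {
    // Recycle a released block if one is available, else allocate a new one.
    return this.#free.pop() || new Float32Array(this.length);
  }
  release(buf) {
    buf.fill(0); // hand back a clean block
    this.#free.push(buf);
  }
}

const pool = new BufferPool(128); // one render quantum
const a = pool.acquire();
pool.release(a);
const b = pool.acquire(); // same block, recycled
console.log(a === b); // true
```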
package/README.md CHANGED
@@ -65,9 +65,7 @@ node examples/granular-scrub.mjs
 
  ## Caveats
 
- - Missing nodes: `ScriptProcessorNode`, `AudioWorkletNode`.
- - Streams: only a minimial audio input stream and the `MediaStreamSourceNode` are provided. All other `MediaStream` features are left on the side for now as they principaly concern a different API specification, which is not a trivial problem.
- - Some async methods (e.g. `decodeAudioData`, `setSinkId`) are not trully async yet. This will evolve with the implemetation of the async version in the upstream crate.
+ - `Streams`: only a minimal audio input stream and the `MediaStreamSourceNode` are provided. All other `MediaStream` features are left on the side for now as they principally concern a different API specification, which is not a trivial problem.
 
  ## Supported Platforms
 
@@ -81,24 +79,7 @@ node examples/granular-scrub.mjs
  | Linux arm gnueabihf (RPi) | ✓ | ✓ |
  | Linux arm64 gnu (RPi) | ✓ | ✓ |
 
-
- ## Notes for Linux users
-
- Using the library on Linux with the ALSA backend might lead to unexpected cranky sound with the default render size (i.e. 128 frames). In such cases, a simple workaround is to pass the `playback` latency hint when creating the audio context, which will increase the render size to 1024 frames:
-
- ```js
- const audioContext = new AudioContext({ latencyHint: 'playback' });
- ```
-
- For real-time and interactive applications where low latency is crucial, you should instead rely on the JACK backend provided by `cpal`. By default the audio context will use that backend if a running JACK server is found.
-
- If you don't have JACK installed, you can still pass the `WEB_AUDIO_LATENCY=playback` env variable to all examples to create the audio context with the playback latency hint, e.g.:
-
- ```sh
- WEB_AUDIO_LATENCY=playback node examples/amplitude-modulation.mjs
- ```
-
- ### Manual Build
+ ## Manual Build
 
  If prebuilt binaries are not shipped for your platform, you will need to:
 
@@ -121,15 +102,51 @@ The package will be built on your machine, which might take some time.
 
  Be aware that the package won't be listed on your `package.json` file, and that it won't be re-installed if running `npm install` again. A possible workaround would be to include the above in a postinstall script.
 
+ ## Notes for Linux users
+
+ ### Build
+
+ To build the library, you will need to manually install the `libasound2-dev` package:
+
+ ```sh
+ sudo apt install libasound2-dev
+ ```
+
+ Optionally, if you use the Jack Audio Backend, the `libjack-jackd2-dev` package:
+
+ ```sh
+ sudo apt install libjack-jackd2-dev
+ ```
+
+ In that case, you can use the `npm run build:jack` script to enable the Jack feature.
+
+ ### Audio backend and latency
+
+ Using the library on Linux with the ALSA backend might lead to unexpected cranky sound with the default render size (i.e. 128 frames). In such cases, a simple workaround is to pass the `playback` latency hint when creating the audio context, which will increase the render size to 1024 frames:
+
+ ```js
+ const audioContext = new AudioContext({ latencyHint: 'playback' });
+ ```
+
+ For real-time and interactive applications where low latency is crucial, you should instead rely on the JACK backend provided by `cpal`. By default the audio context will use that backend if a running JACK server is found.
+
+ If you don't have JACK installed, you can still pass the `WEB_AUDIO_LATENCY=playback` environment variable to all examples to create the audio context with the playback latency hint, e.g.:
+
+ ```sh
+ WEB_AUDIO_LATENCY=playback node examples/amplitude-modulation.mjs
+ ```
+
  ## Development notes
 
+ ### Synchronize versioning
+
  The npm `postversion` script relies on [`cargo-bump`](https://crates.io/crates/cargo-bump) to maintain versions synced between the `package.json` and the `Cargo.toml` files. Therefore, you will need to install `cargo-bump` on your machine:
 
  ```
  cargo install cargo-bump
  ```
 
- ## Running the web-platform-test suite
+ ### Running the web-platform-test suite
 
  Follow the steps for 'Manual Build' first. Then check out the web-platform-tests submodule with:
 
package/index.cjs CHANGED
@@ -1,90 +1,88 @@
- const { platform, arch } = process;
+ // -------------------------------------------------------------------------- //
+ // -------------------------------------------------------------------------- //
+ // //
+ // //
+ // //
+ // ██╗ ██╗ █████╗ ██████╗ ███╗ ██╗██╗███╗ ██╗ ██████╗ //
+ // ██║ ██║██╔══██╗██╔══██╗████╗ ██║██║████╗ ██║██╔════╝ //
+ // ██║ █╗ ██║███████║██████╔╝██╔██╗ ██║██║██╔██╗ ██║██║ ███╗ //
+ // ██║███╗██║██╔══██║██╔══██╗██║╚██╗██║██║██║╚██╗██║██║ ██║ //
+ // ╚███╔███╔╝██║ ██║██║ ██║██║ ╚████║██║██║ ╚████║╚██████╔╝ //
+ // ╚══╝╚══╝ ╚═╝ ╚═╝╚═╝ ╚═╝╚═╝ ╚═══╝╚═╝╚═╝ ╚═══╝ ╚═════╝ //
+ // //
+ // //
+ // - This file has been generated --------------------------- //
+ // //
+ // //
+ // -------------------------------------------------------------------------- //
+ // -------------------------------------------------------------------------- //
 
- let nativeBinding = null;
- let loadError = null;
+ const nativeBinding = require('./load-native.cjs');
+ const jsExport = {};
 
- switch (platform) {
-   case 'win32':
-     switch (arch) {
-       case 'x64':
-         try {
-           nativeBinding = require('./node-web-audio-api.win32-x64-msvc.node');
-         } catch (e) {
-           loadError = e;
-         }
-         break;
-       case 'arm64':
-         try {
-           nativeBinding = require('./node-web-audio-api.win32-arm64-msvc.node');
-         } catch (e) {
-           loadError = e;
-         }
-         break;
-       default:
-         throw new Error(`Unsupported architecture on Windows: ${arch}`);
-     }
-     break;
-   case 'darwin':
-     switch (arch) {
-       case 'x64':
-         try {
-           nativeBinding = require('./node-web-audio-api.darwin-x64.node');
-         } catch (e) {
-           loadError = e;
-         }
-         break;
-       case 'arm64':
-         try {
-           nativeBinding = require('./node-web-audio-api.darwin-arm64.node');
-         } catch (e) {
-           loadError = e;
-         }
-         break;
-       default:
-         throw new Error(`Unsupported architecture on macOS: ${arch}`);
-     }
-     break;
-   case 'linux':
-     switch (arch) {
-       case 'x64':
-         try {
-           nativeBinding = require('./node-web-audio-api.linux-x64-gnu.node');
-         } catch (e) {
-           loadError = e;
-         }
-         break;
-       case 'arm64':
-         try {
-           nativeBinding = require('./node-web-audio-api.linux-arm64-gnu.node');
-         } catch (e) {
-           loadError = e;
-         }
-         break;
-       case 'arm':
-         try {
-           nativeBinding = require('./node-web-audio-api.linux-arm-gnueabihf.node');
-         } catch (e) {
-           loadError = e;
-         }
-         break;
-       default:
-         throw new Error(`Unsupported architecture on Linux: ${arch}`);
-     }
-     break;
-   default:
-     throw new Error(`Unsupported OS: ${platform}, architecture: ${arch}`);
- }
+ // --------------------------------------------------------------------------
+ // Events
+ // --------------------------------------------------------------------------
+ jsExport.OfflineAudioCompletionEvent = require('./js/Events').OfflineAudioCompletionEvent;
+ jsExport.AudioProcessingEvent = require('./js/Events').AudioProcessingEvent;
+ jsExport.AudioRenderCapacityEvent = require('./js/Events').AudioRenderCapacityEvent;
+ // --------------------------------------------------------------------------
+ // Create Web Audio API facade
+ // --------------------------------------------------------------------------
+ jsExport.BaseAudioContext = require('./js/BaseAudioContext.js')(jsExport, nativeBinding);
+ jsExport.AudioContext = require('./js/AudioContext.js')(jsExport, nativeBinding);
+ jsExport.OfflineAudioContext = require('./js/OfflineAudioContext.js')(jsExport, nativeBinding);
 
- if (!nativeBinding) {
-   if (loadError) {
-     throw loadError;
-   }
+ jsExport.ScriptProcessorNode = require('./js/ScriptProcessorNode.js')(jsExport, nativeBinding);
+ jsExport.AudioWorkletNode = require('./js/AudioWorkletNode.js')(jsExport, nativeBinding);
+ jsExport.AnalyserNode = require('./js/AnalyserNode.js')(jsExport, nativeBinding);
+ jsExport.AudioBufferSourceNode = require('./js/AudioBufferSourceNode.js')(jsExport, nativeBinding);
+ jsExport.BiquadFilterNode = require('./js/BiquadFilterNode.js')(jsExport, nativeBinding);
+ jsExport.ChannelMergerNode = require('./js/ChannelMergerNode.js')(jsExport, nativeBinding);
+ jsExport.ChannelSplitterNode = require('./js/ChannelSplitterNode.js')(jsExport, nativeBinding);
+ jsExport.ConstantSourceNode = require('./js/ConstantSourceNode.js')(jsExport, nativeBinding);
+ jsExport.ConvolverNode = require('./js/ConvolverNode.js')(jsExport, nativeBinding);
+ jsExport.DelayNode = require('./js/DelayNode.js')(jsExport, nativeBinding);
+ jsExport.DynamicsCompressorNode = require('./js/DynamicsCompressorNode.js')(jsExport, nativeBinding);
+ jsExport.GainNode = require('./js/GainNode.js')(jsExport, nativeBinding);
+ jsExport.IIRFilterNode = require('./js/IIRFilterNode.js')(jsExport, nativeBinding);
+ jsExport.MediaStreamAudioSourceNode = require('./js/MediaStreamAudioSourceNode.js')(jsExport, nativeBinding);
+ jsExport.OscillatorNode = require('./js/OscillatorNode.js')(jsExport, nativeBinding);
+ jsExport.PannerNode = require('./js/PannerNode.js')(jsExport, nativeBinding);
+ jsExport.StereoPannerNode = require('./js/StereoPannerNode.js')(jsExport, nativeBinding);
+ jsExport.WaveShaperNode = require('./js/WaveShaperNode.js')(jsExport, nativeBinding);
+
+ jsExport.AudioNode = require('./js/AudioNode.js');
+ jsExport.AudioScheduledSourceNode = require('./js/AudioScheduledSourceNode.js');
+ jsExport.AudioParam = require('./js/AudioParam.js');
+ jsExport.AudioDestinationNode = require('./js/AudioDestinationNode.js');
+ jsExport.AudioListener = require('./js/AudioListener.js');
+ jsExport.AudioWorklet = require('./js/AudioWorklet.js');
+ jsExport.AudioParamMap = require('./js/AudioParamMap.js');
+ jsExport.AudioRenderCapacity = require('./js/AudioRenderCapacity.js');
+
+ jsExport.PeriodicWave = require('./js/PeriodicWave.js')(jsExport, nativeBinding);
+ jsExport.AudioBuffer = require('./js/AudioBuffer.js')(jsExport, nativeBinding);
 
-   throw new Error(`Failed to load native binding for OS: ${platform}, architecture: ${arch}`);
- }
+ // --------------------------------------------------------------------------
+ // Promisify MediaDevices API
+ // --------------------------------------------------------------------------
+ jsExport.mediaDevices = {};
 
- const monkeyPatch = require('./js/monkey-patch.js');
- nativeBinding = monkeyPatch(nativeBinding);
+ const enumerateDevicesSync = nativeBinding.mediaDevices.enumerateDevices;
+ jsExport.mediaDevices.enumerateDevices = async function enumerateDevices() {
+   const list = enumerateDevicesSync();
+   return Promise.resolve(list);
+ };
+
+ const getUserMediaSync = nativeBinding.mediaDevices.getUserMedia;
+ jsExport.mediaDevices.getUserMedia = async function getUserMedia(options) {
+   if (options === undefined) {
+     throw new TypeError('Failed to execute "getUserMedia" on "MediaDevices": audio must be requested');
+   }
 
- module.exports = nativeBinding;
+   const stream = getUserMediaSync(options);
+   return Promise.resolve(stream);
+ };
 
+ module.exports = jsExport;
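The `mediaDevices` section above wraps synchronous native calls in `async` functions so the facade matches the Promise-based browser API. The wrapper pattern in isolation, with a stand-in for the native sync call (the device data here is hypothetical, for illustration only):

```javascript
// Stand-in for a synchronous native binding call (hypothetical devices).
function enumerateDevicesSync() {
  return [{ kind: 'audioinput', deviceId: 'default' }];
}

// Same shape as the wrapper in index.cjs: callers get a Promise and can
// `await` the result, even though the underlying call is synchronous.
async function enumerateDevices() {
  const list = enumerateDevicesSync();
  return Promise.resolve(list);
}

enumerateDevices().then((list) => {
  console.log(list[0].kind); // 'audioinput'
});
```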
package/index.mjs CHANGED
@@ -30,6 +30,7 @@ export const {
   // events
   OfflineAudioCompletionEvent,
   AudioProcessingEvent,
+  AudioRenderCapacityEvent,
 
   // manually written nodes
   BaseAudioContext,
@@ -41,11 +42,15 @@ export const {
   AudioParam,
   AudioDestinationNode,
   AudioListener,
+  AudioWorklet,
+  AudioParamMap,
+  AudioRenderCapacity,
 
   PeriodicWave,
   AudioBuffer,
   // generated nodes
   ScriptProcessorNode,
+  AudioWorkletNode,
   AnalyserNode,
   AudioBufferSourceNode,
   BiquadFilterNode,
@@ -11,6 +11,7 @@ const {
    kNapiObj,
    kOnStateChange,
    kOnSinkChange,
+   kWorkletRelease,
  } = require('./lib/symbols.js');
  const {
    propagateEvent,
@@ -22,6 +23,8 @@ module.exports = function(jsExport, nativeBinding) {
 
    class AudioContext extends jsExport.BaseAudioContext {
      #sinkId = '';
+     #renderCapacity = null;
+     #onsinkchange = null;
 
      constructor(options = {}) {
        if (typeof options !== 'object') {
@@ -51,18 +54,16 @@ module.exports = function(jsExport, nativeBinding) {
        }
 
        if (options.sinkId !== undefined) {
-         const sinkId = options.sinkId;
-
          if (typeof options.sinkId === 'object') {
            // https://webaudio.github.io/web-audio-api/#enumdef-audiosinktype
            if (!('type' in options.sinkId) || options.sinkId.type !== 'none') {
-             throw TypeError(`Failed to construct 'AudioContext': Failed to read the 'sinkId' property from AudioNodeOptions: Failed to read the 'type' property from 'AudioSinkOptions': The provided value (${sinkId.type}) is not a valid enum value of type AudioSinkType.`);
+             throw TypeError(`Failed to construct 'AudioContext': Failed to read the 'sinkId' property from AudioNodeOptions: Failed to read the 'type' property from 'AudioSinkOptions': The provided value (${options.sinkId.type}) is not a valid enum value of type AudioSinkType.`);
            }
 
            targetOptions.sinkId = 'none';
          } else {
-           targetOptions.sinkId = conversions['DOMString'](sinkId, {
-             context: `Failed to construct 'AudioContext': Failed to read the 'sinkId' property from AudioNodeOptions: Failed to read the 'type' property from 'AudioSinkOptions': The provided value (${sinkId})`,
+           targetOptions.sinkId = conversions['DOMString'](options.sinkId, {
+             context: `Failed to construct 'AudioContext': Failed to read the 'sinkId' property from AudioNodeOptions: Failed to read the 'type' property from 'AudioSinkOptions': The provided value (${options.sinkId})`,
            });
          }
        } else {
@@ -83,24 +84,20 @@ module.exports = function(jsExport, nativeBinding) {
          this.#sinkId = options.sinkId;
        }
 
-       // Add function to Napi object to bridge from Rust events to JS EventTarget
-       this[kNapiObj][kOnStateChange] = (err, rawEvent) => {
-         if (typeof rawEvent !== 'object' && !('type' in rawEvent)) {
-           throw new TypeError('Invalid [kOnStateChange] Invocation: rawEvent should have a type property');
-         }
+       this.#renderCapacity = new jsExport.AudioRenderCapacity({
+         [kNapiObj]: this[kNapiObj].renderCapacity,
+       });
 
+       // Add function to Napi object to bridge from Rust events to JS EventTarget
+       this[kNapiObj][kOnStateChange] = (function(err, rawEvent) {
          const event = new Event(rawEvent.type);
          propagateEvent(this, event);
-       };
-
-       this[kNapiObj][kOnSinkChange] = (err, rawEvent) => {
-         if (typeof rawEvent !== 'object' && !('type' in rawEvent)) {
-           throw new TypeError('Invalid [kOnSinkChange] Invocation: rawEvent should have a type property');
-         }
+       }).bind(this);
 
+       this[kNapiObj][kOnSinkChange] = (function(err, rawEvent) {
          const event = new Event(rawEvent.type);
          propagateEvent(this, event);
-       };
+       }).bind(this);
 
        // Workaround to bind the `sinkchange` and `statechange` events to EventTarget.
        // This must be called from the JS facade ctor as the JS handlers are added to the Napi
@@ -120,7 +117,6 @@ module.exports = function(jsExport, nativeBinding) {
        });
        // keep process awake until context is closed
        const keepAwakeId = setInterval(() => {}, 10 * 1000);
-
        // clear on close
        this.addEventListener('statechange', () => {
          if (this.state === 'closed') {
@@ -129,6 +125,11 @@ module.exports = function(jsExport, nativeBinding) {
            clearTimeout(keepAwakeId);
          }
        });
+
+       // for wpt tests, see ./.scripts/wpt_harness.mjs for information
+       if (process.WPT_TEST_RUNNER) {
+         process.WPT_TEST_RUNNER.once('cleanup', () => this.close());
+       }
      }
 
      get baseLatency() {
@@ -160,7 +161,7 @@ module.exports = function(jsExport, nativeBinding) {
        throw new TypeError('Invalid Invocation: Value of \'this\' must be of type \'AudioContext\'');
      }
 
-     throw new Error(`AudioContext::renderCapacity is not yet implemented`);
+     return this.#renderCapacity;
    }
 
    get onsinkchange() {
@@ -168,7 +169,7 @@ module.exports = function(jsExport, nativeBinding) {
        throw new TypeError('Invalid Invocation: Value of \'this\' must be of type \'AudioContext\'');
      }
 
-     return this._sinkchange || null;
+     return this.#onsinkchange;
    }
 
    set onsinkchange(value) {
@@ -177,7 +178,7 @@ module.exports = function(jsExport, nativeBinding) {
      }
 
      if (isFunction(value) || value === null) {
-       this._sinkchange = value;
+       this.#onsinkchange = value;
      }
    }
 
@@ -210,6 +211,9 @@ module.exports = function(jsExport, nativeBinding) {
        throw new TypeError('Invalid Invocation: Value of \'this\' must be of type \'AudioContext\'');
      }
 
+     // Close the audioWorklet first so that `run_audio_worklet_global_scope` exits first.
+     // The other way around works too because of `recv_timeout`, but it is cleaner this way.
+     await this.audioWorklet[kWorkletRelease]();
      await this[kNapiObj].close();
    }
 
@@ -0,0 +1,88 @@
+ const {
+   kPrivateConstructor,
+ } = require('./lib/symbols.js');
+ const {
+   kEnumerableProperty,
+ } = require('./lib/utils.js');
+
+ class AudioParamMap {
+   #parameters = null;
+
+   constructor(options) {
+     if (
+       (typeof options !== 'object') ||
+       options[kPrivateConstructor] !== true
+     ) {
+       throw new TypeError('Illegal constructor');
+     }
+
+     this.#parameters = options.parameters;
+   }
+
+   get size() {
+     return this.#parameters.size;
+   }
+
+   entries() {
+     return this.#parameters.entries();
+   }
+
+   keys() {
+     return this.#parameters.keys();
+   }
+
+   values() {
+     return this.#parameters.values();
+   }
+
+   forEach(func) {
+     return this.#parameters.forEach(func);
+   }
+
+   get(name) {
+     return this.#parameters.get(name);
+   }
+
+   has(name) {
+     return this.#parameters.has(name);
+   }
+ }
+
+ Object.defineProperties(AudioParamMap, {
+   length: {
+     __proto__: null,
+     writable: false,
+     enumerable: false,
+     configurable: true,
+     value: 0,
+   },
+ });
+
+ Object.defineProperties(AudioParamMap.prototype, {
+   [Symbol.toStringTag]: {
+     __proto__: null,
+     writable: false,
+     enumerable: false,
+     configurable: true,
+     value: 'AudioParamMap',
+   },
+   [Symbol.iterator]: {
+     value: AudioParamMap.prototype.entries,
+     enumerable: false,
+     configurable: true,
+     writable: true,
+   },
+   size: {
+     __proto__: null,
+     enumerable: true,
+     configurable: true,
+   },
+   entries: kEnumerableProperty,
+   keys: kEnumerableProperty,
+   values: kEnumerableProperty,
+   forEach: kEnumerableProperty,
+   get: kEnumerableProperty,
+   has: kEnumerableProperty,
+ });
+
+ module.exports = AudioParamMap;
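The `AudioParamMap` added above is a read-only facade over an internal `Map`: only the reading methods are forwarded, so consumers can iterate but never mutate. The same pattern in generic form (the `ReadOnlyMap` name and sample entries are illustrative stand-ins, not part of the package):

```javascript
// Read-only facade over a Map: forwards size/get/has/iteration but
// exposes no set() or delete(), mirroring the AudioParamMap shape.
class ReadOnlyMap {
  #inner;
  constructor(inner) {
    this.#inner = inner;
  }
  get size() { return this.#inner.size; }
  get(key) { return this.#inner.get(key); }
  has(key) { return this.#inner.has(key); }
  entries() { return this.#inner.entries(); }
  keys() { return this.#inner.keys(); }
  values() { return this.#inner.values(); }
  forEach(fn) { this.#inner.forEach(fn); }
  [Symbol.iterator]() { return this.#inner.entries(); }
}

const params = new ReadOnlyMap(new Map([['frequency', 440]]));
console.log(params.get('frequency')); // 440
console.log(params.has('gain'));      // false
```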
@@ -0,0 +1,117 @@
+ const conversions = require('webidl-conversions');
+
+ const {
+   kNapiObj,
+   kOnUpdate,
+ } = require('./lib/symbols.js');
+ const {
+   kEnumerableProperty,
+   isFunction,
+ } = require('./lib/utils.js');
+ const {
+   propagateEvent,
+ } = require('./lib/events.js');
+ const {
+   AudioRenderCapacityEvent,
+ } = require('./Events.js');
+
+ class AudioRenderCapacity extends EventTarget {
+   #onupdate = null;
+
+   constructor(options) {
+     // Make constructor "private"
+     if (
+       (typeof options !== 'object')
+       || !(kNapiObj in options)
+       || options[kNapiObj]['Symbol.toStringTag'] !== 'AudioRenderCapacity'
+     ) {
+       throw new TypeError('Illegal constructor');
+     }
+
+     super();
+
+     this[kNapiObj] = options[kNapiObj];
+
+     this[kNapiObj][kOnUpdate] = (function(err, rawEvent) {
+       const event = new AudioRenderCapacityEvent('update', rawEvent);
+       propagateEvent(this, event);
+     }).bind(this);
+
+     this[kNapiObj].listen_to_events();
+   }
+
+   get onupdate() {
+     if (!(this instanceof AudioRenderCapacity)) {
+       throw new TypeError('Invalid Invocation: Value of \'this\' must be of type \'AudioRenderCapacity\'');
+     }
+
+     return this.#onupdate;
+   }
+
+   set onupdate(value) {
+     if (!(this instanceof AudioRenderCapacity)) {
+       throw new TypeError('Invalid Invocation: Value of \'this\' must be of type \'AudioRenderCapacity\'');
+     }
+
+     if (isFunction(value) || value === null) {
+       this.#onupdate = value;
+     }
+   }
+
+   start(options = null) {
+     if (!(this instanceof AudioRenderCapacity)) {
+       throw new TypeError(`Invalid Invocation: Value of 'this' must be of type 'AudioRenderCapacity'`);
+     }
+
+     let targetOptions = {};
+
+     if (typeof options === 'object' && options !== null) {
+       if (!('updateInterval' in options)) {
+         throw new TypeError(`Failed to execute 'start' on 'AudioRenderCapacity': Failed to read the 'updateInterval' property on 'AudioRenderCapacityOptions'`);
+       }
+
+       targetOptions.updateInterval = conversions['double'](options.updateInterval, {
+         context: `Failed to execute 'start' on 'AudioRenderCapacity': Failed to read the 'updateInterval' property on 'AudioRenderCapacityOptions': The provided value ()`,
+       });
+     } else {
+       targetOptions.updateInterval = 1;
+     }
+
+     return this[kNapiObj].start(targetOptions);
+   }
+
+   stop() {
+     if (!(this instanceof AudioRenderCapacity)) {
+       throw new TypeError(`Invalid Invocation: Value of 'this' must be of type 'AudioRenderCapacity'`);
+     }
+
+     return this[kNapiObj].stop();
+   }
+ }
+
+ Object.defineProperties(AudioRenderCapacity, {
+   length: {
+     __proto__: null,
+     writable: false,
+     enumerable: false,
+     configurable: true,
+     value: 0,
+   },
+ });
+
+ Object.defineProperties(AudioRenderCapacity.prototype, {
+   [Symbol.toStringTag]: {
+     __proto__: null,
+     writable: false,
+     enumerable: false,
+     configurable: true,
+     value: 'AudioRenderCapacity',
+   },
+
+   onupdate: kEnumerableProperty,
+   start: kEnumerableProperty,
+   stop: kEnumerableProperty,
+ });
+
+ module.exports = AudioRenderCapacity;
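Like the other facade objects, the `AudioRenderCapacity` class above extends `EventTarget` and mirrors an `onupdate` property alongside `addEventListener`. A self-contained sketch of that dual dispatch (the `CapacityLike` class and its `_emitUpdate` trigger are hypothetical; in the package, updates come from the native side after `start()`):

```javascript
// Sketch of the EventTarget pattern: events coming from "below" are
// re-dispatched on the JS object, and an `onupdate` property handler
// fires alongside listeners registered via addEventListener.
class CapacityLike extends EventTarget {
  #onupdate = null;
  get onupdate() { return this.#onupdate; }
  set onupdate(value) {
    if (typeof value === 'function' || value === null) {
      this.#onupdate = value;
    }
  }
  // In the real class this is driven by the native side after start().
  _emitUpdate(detail) {
    const event = new Event('update');
    event.detail = detail; // expando carrying the payload
    if (this.#onupdate) this.#onupdate(event);
    this.dispatchEvent(event);
  }
}

const cap = new CapacityLike();
let seen = 0;
cap.onupdate = () => { seen += 1; };
cap.addEventListener('update', () => { seen += 1; });
cap._emitUpdate({ averageLoad: 0.1 });
console.log(seen); // 2
```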
@@ -33,14 +33,15 @@ class AudioScheduledSourceNode extends AudioNode {
 
    // Add function to Napi object to bridge from Rust events to JS EventTarget
    // It will be effectively registered on rust side when `start` is called
-   this[kNapiObj][kOnEnded] = (err, rawEvent) => {
-     if (typeof rawEvent !== 'object' && !('type' in rawEvent)) {
-       throw new TypeError('Invalid [kOnEnded] Invocation: rawEvent should have a type property');
-     }
-
+   //
+   // Note 2024-06-05 - We use bind instead of an arrow function because an arrow
+   // function prevents the node from being collected by the Scavenge step of the GC,
+   // which can lead to oversized graphs and performance issues.
+   // cf. https://github.com/ircam-ismm/node-web-audio-api/tree/fix/118
+   this[kNapiObj][kOnEnded] = (function(_err, rawEvent) {
      const event = new Event(rawEvent.type);
      propagateEvent(this, event);
-   };
+   }).bind(this);
  }
 
  get onended() {
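The bind-instead-of-arrow note in the hunk above can be illustrated with a stand-in napi object (the `Source` class and its `onended` property are hypothetical, only to show how `Function.prototype.bind` fixes `this` for a handler handed to another object):

```javascript
// A handler registered on an external object, with `this` fixed to the
// owning instance via bind rather than an arrow-function closure.
class Source {
  constructor(napiObj) {
    this.ended = false;
    napiObj.onended = (function(_err, rawEvent) {
      // `this` is the Source instance thanks to bind
      this.ended = rawEvent.type === 'ended';
    }).bind(this);
  }
}

const napiObj = {}; // stand-in for the native object
const src = new Source(napiObj);
napiObj.onended(null, { type: 'ended' });
console.log(src.ended); // true
```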