agora-electron-sdk 4.2.2-dev.6 → 4.2.2-dev.7

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/CHANGELOG.md CHANGED
@@ -1,19 +1,19 @@
 
 
- ## [4.2.2-dev.6](https://github.com/AgoraIO-Extensions/Electron-SDK/compare/v4.2.1...v4.2.2-dev.6) (2023-07-31)
+ ## [4.2.2-dev.7](https://github.com/AgoraIO-Extensions/Electron-SDK/compare/v4.2.2...v4.2.2-dev.7) (2023-08-04)
+
+ ## [4.2.2](https://github.com/AgoraIO-Extensions/Electron-SDK/compare/v4.2.1...v4.2.2) (2023-08-01)
 
 
 ### Bug Fixes
 
- * CSD-57577 ex callback issue ([#1032](https://github.com/AgoraIO-Extensions/Electron-SDK/issues/1032)) ([82b53ac](https://github.com/AgoraIO-Extensions/Electron-SDK/commit/82b53ac89ee8d4c18ce2fe3608d176d1b4140103))
 * CSD-57615 render mode not working on software renderer ([1097ebc](https://github.com/AgoraIO-Extensions/Electron-SDK/commit/1097ebcdf21cb6f5c66821174e8f46ec470976db))
 * CSD-57699 removeEventListener `webglcontextlost` while unbind ([2a280c8](https://github.com/AgoraIO-Extensions/Electron-SDK/commit/2a280c87242fe3abecb61bc23ffb0996029d6fec))
- * CSD-58183 getScreenCaptureSources memory leak ([0bc9ee6](https://github.com/AgoraIO-Extensions/Electron-SDK/commit/0bc9ee6700a52979fa2c1d72cb3b738d5786f8a8))
- * getApiTypeFromPreload ([d79444d](https://github.com/AgoraIO-Extensions/Electron-SDK/commit/d79444d169abaa08b5361cad382fa8e03e7fdf5b))
- * getApiTypeFromPreloadChannelWithUserAccount ([426f1d8](https://github.com/AgoraIO-Extensions/Electron-SDK/commit/426f1d8073647321b7290b5495c7cacc65ae77a7))
- * jira NMS-13855 ([04baa1e](https://github.com/AgoraIO-Extensions/Electron-SDK/commit/04baa1edf4acc900a7c2e0e9798b613c367ed9e5))
- * NMS-14097 app css issue ([8e1a2cc](https://github.com/AgoraIO-Extensions/Electron-SDK/commit/8e1a2cccf93ff5a908f437dfc8e8c582051968d7))
- * some error ([4e74203](https://github.com/AgoraIO-Extensions/Electron-SDK/commit/4e7420350c516e7b940c9e7a9ca3b9571c587239))
+
+
+ ### Features
+
+ * support native 4.2.2 ([#1033](https://github.com/AgoraIO-Extensions/Electron-SDK/issues/1033)) ([f151e69](https://github.com/AgoraIO-Extensions/Electron-SDK/commit/f151e69abf22286eb5fbec3bcee8f703188c7515))
 
 
 ## [4.2.1](https://github.com/AgoraIO-Extensions/Electron-SDK/compare/v4.2.0...v4.2.1) (2023-06-30)
 
@@ -1737,6 +1737,18 @@ var LocalVideoStreamError;
 * @ignore
 */
 LocalVideoStreamError[LocalVideoStreamError["LocalVideoStreamErrorScreenCaptureNoPermission"] = 22] = "LocalVideoStreamErrorScreenCaptureNoPermission";
+ /**
+ * @ignore
+ */
+ LocalVideoStreamError[LocalVideoStreamError["LocalVideoStreamErrorScreenCapturePaused"] = 23] = "LocalVideoStreamErrorScreenCapturePaused";
+ /**
+ * @ignore
+ */
+ LocalVideoStreamError[LocalVideoStreamError["LocalVideoStreamErrorScreenCaptureResumed"] = 24] = "LocalVideoStreamErrorScreenCaptureResumed";
+ /**
+ * @ignore
+ */
+ LocalVideoStreamError[LocalVideoStreamError["LocalVideoStreamErrorScreenCaptureWindowRecoverFromMinimized"] = 25] = "LocalVideoStreamErrorScreenCaptureWindowRecoverFromMinimized";
 })(LocalVideoStreamError = exports.LocalVideoStreamError || (exports.LocalVideoStreamError = {}));
 /**
 * Remote audio states.
@@ -2447,6 +2459,10 @@ var ConnectionChangedReasonType;
 * @ignore
 */
 ConnectionChangedReasonType[ConnectionChangedReasonType["ConnectionChangedLicenseValidationFailure"] = 21] = "ConnectionChangedLicenseValidationFailure";
+ /**
+ * @ignore
+ */
+ ConnectionChangedReasonType[ConnectionChangedReasonType["ConnectionChangedCertificationVeryfyFailure"] = 22] = "ConnectionChangedCertificationVeryfyFailure";
 })(ConnectionChangedReasonType = exports.ConnectionChangedReasonType || (exports.ConnectionChangedReasonType = {}));
 /**
 * The reason for a user role switch failure.
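The compiled output above uses the pattern TypeScript emits for numeric enums: each statement writes both a forward (name → number) and a reverse (number → name) mapping. A standalone sketch (not the SDK's actual module; only the two entries from this hunk are reproduced) showing how the reverse mapping recovers a readable name from a numeric reason code:

```typescript
// Standalone re-creation of the compiled-enum pattern from the hunk above
// (two entries only; the real enum has many more members).
const ConnectionChangedReasonType: Record<string | number, string | number> = {};
ConnectionChangedReasonType[ConnectionChangedReasonType["ConnectionChangedLicenseValidationFailure"] = 21] = "ConnectionChangedLicenseValidationFailure";
ConnectionChangedReasonType[ConnectionChangedReasonType["ConnectionChangedCertificationVeryfyFailure"] = 22] = "ConnectionChangedCertificationVeryfyFailure";

// Forward mapping gives the numeric code; the reverse mapping recovers the
// name, which is handy when logging the numeric `reason` a callback delivers.
const reasonCode = ConnectionChangedReasonType["ConnectionChangedCertificationVeryfyFailure"]; // 22
const reasonName = ConnectionChangedReasonType[22]; // "ConnectionChangedCertificationVeryfyFailure"
```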
@@ -2776,11 +2792,11 @@ var AudioTrackType;
 */
 AudioTrackType[AudioTrackType["AudioTrackInvalid"] = -1] = "AudioTrackInvalid";
 /**
- * 0: Mixable audio tracks. You can publish multiple mixable audio tracks in one channel, and SDK will automatically mix these tracks into one. The latency of mixable audio tracks is higher than that of direct audio tracks.
+ * 0: Mixable audio tracks. This type of audio track supports mixing with other audio streams (such as audio streams captured by microphone) and playing locally or publishing to channels after mixing. The latency of mixable audio tracks is higher than that of direct audio tracks.
 */
 AudioTrackType[AudioTrackType["AudioTrackMixable"] = 0] = "AudioTrackMixable";
 /**
- * 1: Direct audio tracks. When creating multiple audio tracks of this type, each direct audio track can only be published in one channel and cannot be mixed with others. The latency of direct audio tracks is lower than that of mixable audio tracks.
+ * 1: Direct audio tracks. This type of audio track will replace the audio streams captured by the microphone and does not support mixing with other audio streams. The latency of direct audio tracks is lower than that of mixable audio tracks. If AudioTrackDirect is specified for this parameter, you must set publishMicrophoneTrack to false in ChannelMediaOptions when calling joinChannel to join the channel; otherwise, joining the channel fails and returns the error code -2.
 */
 AudioTrackType[AudioTrackType["AudioTrackDirect"] = 1] = "AudioTrackDirect";
 })(AudioTrackType = exports.AudioTrackType || (exports.AudioTrackType = {}));
@@ -381,6 +381,10 @@ var VideoPixelFormat;
 * 16: The format is I422.
 */
 VideoPixelFormat[VideoPixelFormat["VideoPixelI422"] = 16] = "VideoPixelI422";
+ /**
+ * @ignore
+ */
+ VideoPixelFormat[VideoPixelFormat["VideoTextureId3d11texture2d"] = 17] = "VideoTextureId3d11texture2d";
 })(VideoPixelFormat = exports.VideoPixelFormat || (exports.VideoPixelFormat = {}));
 /**
 * Video display modes.
@@ -451,7 +451,7 @@ exports.ImageTrackOptions = ImageTrackOptions;
 /**
 * The channel media options.
 *
- * Agora supports publishing multiple audio streams and one video stream at the same time and in the same RtcConnection. For example, publishMicrophoneTrack, publishAudioTrack, publishCustomAudioTrack, and publishMediaPlayerAudioTrack can be set as true at the same time, but only one of publishCameraTrack, publishScreenTrack, publishCustomVideoTrack, or publishEncodedVideoTrack can be set as true. Agora recommends that you set member parameter values yourself according to your business scenario, otherwise the SDK will automatically assign values to member parameters.
+ * Agora supports publishing multiple audio streams and one video stream at the same time and in the same RtcConnection. For example, publishMicrophoneTrack, publishCustomAudioTrack, and publishMediaPlayerAudioTrack can be set as true at the same time, but only one of publishCameraTrack, publishScreenTrack, publishCustomVideoTrack, or publishEncodedVideoTrack can be set as true. Agora recommends that you set member parameter values yourself according to your business scenario, otherwise the SDK will automatically assign values to member parameters.
 */
 var ChannelMediaOptions = /** @class */ (function () {
 function ChannelMediaOptions() {
@@ -235,6 +235,15 @@ exports.EVENT_PROCESSORS = {
 handlers: function () { return MusicContentCenterInternal_1.MusicContentCenterInternal._event_handlers; },
 },
 };
+ // some events are not needed, so ignore them
+ function isIgnoredEvent(event, data) {
+ if (event === 'onLocalVideoStats' && 'connection' in data) {
+ return true;
+ }
+ else {
+ return false;
+ }
+ }
 function handleEvent() {
 var _a, _b;
 var _c = [];
@@ -268,6 +277,9 @@ function handleEvent() {
 if (_event.endsWith('Ex')) {
 _event = _event.replace(/Ex$/g, '');
 }
+ if (isIgnoredEvent(_event, _data)) {
+ return false;
+ }
 var _buffers = buffers;
 if (processor.preprocess) {
 processor.preprocess(_event, _data, _buffers);
@@ -282,6 +294,7 @@ function handleEvent() {
 });
 }
 emitEvent(_event, processor, _data);
+ return true;
 }
 /**
 * @internal
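The filter added above is easy to exercise in isolation. A standalone sketch that reproduces the same predicate together with the `Ex`-suffix stripping `handleEvent` performs before consulting it; the `shouldDispatch` wrapper is illustrative, not part of the SDK:

```typescript
// Same predicate the diff adds: drop onLocalVideoStats events that carry a
// connection field (i.e. those delivered via the Ex callback variant).
function isIgnoredEvent(event: string, data: Record<string, unknown>): boolean {
  return event === 'onLocalVideoStats' && 'connection' in data;
}

// handleEvent first normalizes the name by stripping a trailing "Ex",
// then consults the filter before dispatching.
function shouldDispatch(rawEvent: string, data: Record<string, unknown>): boolean {
  const event = rawEvent.replace(/Ex$/g, '');
  return !isIgnoredEvent(event, data);
}

shouldDispatch('onLocalVideoStatsEx', { connection: {}, stats: {} }); // false: filtered out
shouldDispatch('onLocalVideoStats', { stats: {} });                   // true: dispatched
```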
package/package.json CHANGED
@@ -1,6 +1,6 @@
 {
 "name": "agora-electron-sdk",
- "version": "4.2.2-dev.6",
+ "version": "4.2.2-dev.7",
 "description": "agora-electron-sdk",
 "main": "js/AgoraSdk",
 "types": "types/AgoraSdk.d.ts",
@@ -134,7 +134,7 @@
 "yuv-canvas": "1.2.6"
 },
 "agora_electron": {
- "iris_sdk_win": "https://download.agora.io/sdk/release/iris_4.2.2-dev.6_DCG_Windows_Video_20230704_0337.zip",
- "iris_sdk_mac": "https://download.agora.io/sdk/release/iris_4.2.2-dev.6_DCG_Mac_Video_20230704_0337.zip"
+ "iris_sdk_win": "https://download.agora.io/sdk/release/iris_4.2.2-custom.3_DCG_Windows_Video_20230804_0633.zip",
+ "iris_sdk_mac": "https://download.agora.io/sdk/release/iris_4.2.2-build.1_DCG_Mac_Video_20230727_1159.zip"
 }
 }
@@ -2081,6 +2081,18 @@ export enum LocalVideoStreamError {
 * @ignore
 */
 LocalVideoStreamErrorScreenCaptureNoPermission = 22,
+ /**
+ * @ignore
+ */
+ LocalVideoStreamErrorScreenCapturePaused = 23,
+ /**
+ * @ignore
+ */
+ LocalVideoStreamErrorScreenCaptureResumed = 24,
+ /**
+ * @ignore
+ */
+ LocalVideoStreamErrorScreenCaptureWindowRecoverFromMinimized = 25,
 }
 
 /**
@@ -3126,6 +3138,10 @@ export enum ConnectionChangedReasonType {
 * @ignore
 */
 ConnectionChangedLicenseValidationFailure = 21,
+ /**
+ * @ignore
+ */
+ ConnectionChangedCertificationVeryfyFailure = 22,
 }
 
 /**
@@ -3558,11 +3574,11 @@ export enum AudioTrackType {
 */
 AudioTrackInvalid = -1,
 /**
- * 0: Mixable audio tracks. You can publish multiple mixable audio tracks in one channel, and SDK will automatically mix these tracks into one. The latency of mixable audio tracks is higher than that of direct audio tracks.
+ * 0: Mixable audio tracks. This type of audio track supports mixing with other audio streams (such as audio streams captured by microphone) and playing locally or publishing to channels after mixing. The latency of mixable audio tracks is higher than that of direct audio tracks.
 */
 AudioTrackMixable = 0,
 /**
- * 1: Direct audio tracks. When creating multiple audio tracks of this type, each direct audio track can only be published in one channel and cannot be mixed with others. The latency of direct audio tracks is lower than that of mixable audio tracks.
+ * 1: Direct audio tracks. This type of audio track will replace the audio streams captured by the microphone and does not support mixing with other audio streams. The latency of direct audio tracks is lower than that of mixable audio tracks. If AudioTrackDirect is specified for this parameter, you must set publishMicrophoneTrack to false in ChannelMediaOptions when calling joinChannel to join the channel; otherwise, joining the channel fails and returns the error code -2.
 */
 AudioTrackDirect = 1,
 }
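The constraint documented for `AudioTrackDirect` can be sketched as a pre-flight check. Everything below except the enum values (copied from this hunk) is hypothetical: `checkJoinOptions`, `OptionsSketch`, and `ERR_INVALID_ARGUMENT` are illustrative names, not SDK APIs.

```typescript
// Enum values copied from the diff above.
enum AudioTrackType { AudioTrackInvalid = -1, AudioTrackMixable = 0, AudioTrackDirect = 1 }

// The doc comment says joining fails with error code -2 in this case.
const ERR_INVALID_ARGUMENT = -2;

// Hypothetical slice of ChannelMediaOptions, just enough for the check.
interface OptionsSketch { publishMicrophoneTrack?: boolean; }

// Pre-flight check (not an SDK API): a direct audio track cannot coexist
// with the microphone track, so publishMicrophoneTrack must be false.
function checkJoinOptions(trackType: AudioTrackType, options: OptionsSketch): number {
  if (trackType === AudioTrackType.AudioTrackDirect && options.publishMicrophoneTrack !== false) {
    return ERR_INVALID_ARGUMENT; // joining the channel would fail with -2
  }
  return 0;
}
```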
@@ -3824,9 +3840,9 @@ export enum HeadphoneEqualizerPreset {
 */
 export class ScreenCaptureParameters {
 /**
- * The video encoding resolution of the shared screen stream. See VideoDimensions. The default value is 1920 × 1080, that is, 2,073,600 pixels. Agora uses the value of this parameter to calculate the charges. If the screen dimensions are different from the value of this parameter, Agora applies the following strategies for encoding. Suppose
+ * The video encoding resolution of the shared screen stream. See VideoDimensions. The default value is 1920 × 1080, that is, 2,073,600 pixels. Agora uses the value of this parameter to calculate the charges. If the screen dimensions are different from the value of this parameter, Agora applies the following strategies for encoding. Suppose is set to 1920 × 1080:
 * If the value of the screen dimensions is lower than that of dimensions, for example, 1000 × 1000 pixels, the SDK uses the screen dimensions, that is, 1000 × 1000 pixels, for encoding.
- * If the value of the screen dimensions is higher than that of dimensions, for example, 2000 × 1500, the SDK uses the maximum value under
+ * If the value of the screen dimensions is higher than that of dimensions, for example, 2000 × 1500, the SDK uses the maximum value under with the aspect ratio of the screen dimension (4:3) for encoding, that is, 1440 × 1080.
 */
 dimensions?: VideoDimensions;
 /**
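The encoding strategy described in the `dimensions` comment can be expressed as a small calculation. This is an illustrative sketch of the documented behavior, not SDK code; `encodedDimensions` is a hypothetical helper.

```typescript
interface Dimensions { width: number; height: number; }

// Sketch of the documented strategy: if the captured screen fits inside
// `dimensions`, the screen size is used as-is; otherwise the screen is
// scaled down, keeping its own aspect ratio, to the largest size that fits.
function encodedDimensions(screen: Dimensions, dimensions: Dimensions): Dimensions {
  if (screen.width <= dimensions.width && screen.height <= dimensions.height) {
    return { ...screen };
  }
  const scale = Math.min(dimensions.width / screen.width, dimensions.height / screen.height);
  return {
    width: Math.round(screen.width * scale),
    height: Math.round(screen.height * scale),
  };
}

// With the default 1920 × 1080, matching the doc's two examples:
encodedDimensions({ width: 1000, height: 1000 }, { width: 1920, height: 1080 }); // 1000 × 1000
encodedDimensions({ width: 2000, height: 1500 }, { width: 1920, height: 1080 }); // 1440 × 1080
```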
@@ -433,6 +433,10 @@ export enum VideoPixelFormat {
 * 16: The format is I422.
 */
 VideoPixelI422 = 16,
+ /**
+ * @ignore
+ */
+ VideoTextureId3d11texture2d = 17,
 }
 
 /**
@@ -575,6 +579,14 @@ export class ExternalVideoFrame {
 * @ignore
 */
 alphaBuffer?: Uint8Array;
+ /**
+ * @ignore
+ */
+ d3d11_texture_2d?: any;
+ /**
+ * @ignore
+ */
+ texture_slice_index?: number;
 }
 
 /**
@@ -643,6 +655,10 @@ export class VideoFrame {
 * This parameter only applies to video data in Texture format. Texture ID.
 */
 textureId?: number;
+ /**
+ * @ignore
+ */
+ d3d11Texture2d?: any;
 /**
 * This parameter only applies to video data in Texture format. Incoming 4 × 4 transformational matrix. The typical value is a unit matrix.
 */
@@ -151,12 +151,11 @@ export abstract class IMediaEngine {
 /**
 * Sets the external audio source parameters.
 *
- * Call this method before joining a channel.
+ * Deprecated: This method is deprecated, use createCustomAudioTrack instead. Call this method before joining a channel.
 *
 * @param enabled Whether to enable the external audio source: true : Enable the external audio source. false : (Default) Disable the external audio source.
 * @param sampleRate The sample rate (Hz) of the external audio source which can be set as 8000, 16000, 32000, 44100, or 48000.
 * @param channels The number of channels of the external audio source, which can be set as 1 (Mono) or 2 (Stereo).
- * @param sourceNumber The number of external audio sources. The value of this parameter should be larger than 0. The SDK creates a corresponding number of custom audio tracks based on this parameter value and names the audio tracks starting from 0. In ChannelMediaOptions, you can set publishCustomAudioSourceId to the audio track ID you want to publish.
 * @param localPlayback Whether to play the external audio source: true : Play the external audio source. false : (Default) Do not play the external source.
 * @param publish Whether to publish audio to the remote users: true : (Default) Publish audio to the remote users. false : Do not publish audio to the remote users.
 *
@@ -173,14 +172,14 @@ export abstract class IMediaEngine {
 ): number;
 
 /**
- * Creates a customized audio track.
+ * Creates a custom audio track.
 *
- * When you need to publish multiple custom captured audios in the channel, you can refer to the following steps:
+ * To publish a custom audio source to multiple channels, see the following steps:
 * Call this method to create a custom audio track and get the audio track ID.
 * In ChannelMediaOptions of each channel, set publishCustomAduioTrackId to the audio track ID that you want to publish, and set publishCustomAudioTrack to true.
- * If you call pushAudioFrame trackId as the audio track ID set in step 2, you can publish the corresponding custom audio source in multiple channels.
+ * If you call pushAudioFrame, and specify trackId as the audio track ID set in step 2, you can publish the corresponding custom audio source in multiple channels.
 *
- * @param trackType The type of the custom audio track. See AudioTrackType.
+ * @param trackType The type of the custom audio track. See AudioTrackType. If AudioTrackDirect is specified for this parameter, you must set publishMicrophoneTrack to false in ChannelMediaOptions when calling joinChannel to join the channel; otherwise, joining the channel fails and returns the error code -2.
 * @param config The configuration of the custom audio track. See AudioTrackConfig.
 *
 * @returns
@@ -888,7 +888,7 @@ export class ScreenCaptureConfiguration {
 */
 params?: ScreenCaptureParameters;
 /**
- * Rectangle. If you do not set this parameter, the SDK shares the whole screen. If the region you set exceeds the boundary of the screen, only the region within in the screen is shared. If you set width or height in Rectangle as 0, the whole screen is shared.
+ * The relative position of the shared region to the whole screen. See Rectangle. If you do not set this parameter, the SDK shares the whole screen. If the region you set exceeds the boundary of the screen, only the region within in the screen is shared. If you set width or height in Rectangle as 0, the whole screen is shared.
 */
 regionRect?: Rectangle;
 }
@@ -970,11 +970,11 @@ export class ScreenCaptureSourceInfo {
 */
 sourceName?: string;
 /**
- * The image content of the thumbnail. See ThumbImageBuffer
+ * The image content of the thumbnail. See ThumbImageBuffer.
 */
 thumbImage?: ThumbImageBuffer;
 /**
- * The image content of the icon. See ThumbImageBuffer
+ * The image content of the icon. See ThumbImageBuffer.
 */
 iconImage?: ThumbImageBuffer;
 /**
@@ -1038,7 +1038,7 @@ export class ImageTrackOptions {
 /**
 * The channel media options.
 *
- * Agora supports publishing multiple audio streams and one video stream at the same time and in the same RtcConnection. For example, publishMicrophoneTrack, publishAudioTrack, publishCustomAudioTrack, and publishMediaPlayerAudioTrack can be set as true at the same time, but only one of publishCameraTrack, publishScreenTrack, publishCustomVideoTrack, or publishEncodedVideoTrack can be set as true. Agora recommends that you set member parameter values yourself according to your business scenario, otherwise the SDK will automatically assign values to member parameters.
+ * Agora supports publishing multiple audio streams and one video stream at the same time and in the same RtcConnection. For example, publishMicrophoneTrack, publishCustomAudioTrack, and publishMediaPlayerAudioTrack can be set as true at the same time, but only one of publishCameraTrack, publishScreenTrack, publishCustomVideoTrack, or publishEncodedVideoTrack can be set as true. Agora recommends that you set member parameter values yourself according to your business scenario, otherwise the SDK will automatically assign values to member parameters.
 */
 export class ChannelMediaOptions {
 /**
@@ -1154,7 +1154,7 @@ export class ChannelMediaOptions {
 */
 mediaPlayerAudioDelayMs?: number;
 /**
- * (Optional) The token generated on your server for authentication. See
+ * (Optional) The token generated on your server for authentication.
 * This parameter takes effect only when calling updateChannelMediaOptions or updateChannelMediaOptionsEx.
 * Ensure that the App ID, channel name, and user name used for creating the token are the same as those used by the initialize method for initializing the RTC engine, and those used by the joinChannel and joinChannelEx methods for joining the channel.
 */
@@ -1592,7 +1592,7 @@ export interface IRtcEngineEventHandler {
 ): void;
 
 /**
- * Occurs when a remote user (in the communication profile)/ host (in the live streaming profile) leaves the channel.
+ * Occurs when a remote user (in the communication profile)/ host (in the live streaming profile) joins the channel.
 *
 * In a communication channel, this callback indicates that a remote user joins the channel. The SDK also triggers this callback to report the existing users in the channel when a user joins the channel.
 * In a live-broadcast channel, this callback indicates that a host joins the channel. The SDK also triggers this callback to report the existing hosts in the channel when a host joins the channel. Agora recommends limiting the number of hosts to 17. The SDK triggers this callback under one of the following circumstances:
@@ -1878,7 +1878,7 @@ export interface IRtcEngineEventHandler {
 *
 * When the token expires during a call, the SDK triggers this callback to remind the app to renew the token. When receiving this callback, you need to generate a new token on your token server and you can renew your token through one of the following ways:
 * Call renewToken to pass in the new token.
- * Call to leave the current channel and then pass in the new token when you call joinChannel to join a channel.
+ * Call leaveChannel to leave the current channel and then pass in the new token when you call joinChannel to join a channel.
 *
 * @param connection The connection information. See RtcConnection.
 */
@@ -2051,7 +2051,7 @@ export interface IRtcEngineEventHandler {
 /**
 * Occurs when the user role switching fails in the interactive live streaming.
 *
- * In the live broadcasting channel profile, when the local user calls to switch the user role after joining the channel but the switch fails, the SDK triggers this callback to report the reason for the failure and the current user role.
+ * In the live broadcasting channel profile, when the local user calls setClientRole to switch the user role after joining the channel but the switch fails, the SDK triggers this callback to report the reason for the failure and the current user role.
 *
 * @param connection The connection information. See RtcConnection.
 * @param reason The reason for a user role switch failure. See ClientRoleChangeFailedReason.
@@ -2888,7 +2888,7 @@ export abstract class IRtcEngine {
 abstract queryCodecCapability(): { codecInfo: CodecCapInfo[]; size: number };
 
 /**
- * Preloads a channel with token, channelId uid
+ * Preloads a channel with token, channelId, and uid.
 *
 * When audience members need to switch between different channels frequently, calling the method can help shortening the time of joining a channel, thus reducing the time it takes for audience members to hear and see the host. As it may take a while for the SDK to preload a channel, Agora recommends that you call this method as soon as possible after obtaining the channel name and user ID to join a channel.
 * When calling this method, ensure you set the user role as audience and do not set the audio scenario as AudioScenarioChorus, otherwise, this method does not take effect.
@@ -2922,7 +2922,7 @@ export abstract class IRtcEngine {
 ): number;
 
 /**
- * Preloads a channel with token, channelId userAccount.
+ * Preloads a channel with token, channelId, and userAccount.
 *
 * When audience members need to switch between different channels frequently, calling the method can help shortening the time of joining a channel, thus reducing the time it takes for audience members to hear and see the host. As it may take a while for the SDK to preload a channel, Agora recommends that you call this method as soon as possible after obtaining the channel name and user ID to join a channel. If you join a preloaded channel, leave it and want to rejoin the same channel, you do not need to call this method unless the token for preloading the channel expires.
 * Failing to preload a channel does not mean that you can't join a channel, nor will it increase the time of joining a channel.
@@ -3130,13 +3130,13 @@ export abstract class IRtcEngine {
 *
 * In scenarios where there are existing cameras to capture video, Agora recommends that you use the following steps to capture and publish video with multiple cameras:
 * Call this method to enable multi-channel camera capture.
- * Call to start the local video preview.
+ * Call startPreview to start the local video preview.
 * Call startCameraCapture, and set sourceType to start video capture with the second camera.
 * Call joinChannelEx, and set publishSecondaryCameraTrack to true to publish the video stream captured by the second camera in the channel. If you want to disable multi-channel camera capture, use the following steps:
 * Call stopCameraCapture.
- * Call this method with enabled set to false. You can call this method before and after to enable multi-camera capture:
- * If it is enabled before, the local video preview shows the image captured by the two cameras at the same time.
- * If it is enabled after, the SDK stops the current camera capture first, and then enables the primary camera and the second camera. The local video preview appears black for a short time, and then automatically returns to normal. When using this function, ensure that the system version is 13.0 or later. The minimum iOS device types that support multi-camera capture are as follows:
+ * Call this method with enabled set to false. You can call this method before and after startPreview to enable multi-camera capture:
+ * If it is enabled before startPreview, the local video preview shows the image captured by the two cameras at the same time.
+ * If it is enabled after startPreview, the SDK stops the current camera capture first, and then enables the primary camera and the second camera. The local video preview appears black for a short time, and then automatically returns to normal. When using this function, ensure that the system version is 13.0 or later. The minimum iOS device types that support multi-camera capture are as follows:
 * iPhone XR
 * iPhone XS
 * iPhone XS Max
@@ -3257,12 +3257,14 @@ export abstract class IRtcEngine {
3257
3257
  * Sets the image enhancement options.
3258
3258
  *
3259
3259
  * Enables or disables image enhancement, and sets the options.
3260
- * Call this method before calling enableVideo or.
3260
+ * Call this method before calling enableVideo or startPreview.
3261
3261
  * This method relies on the video enhancement dynamic library libagora_clear_vision_extension.dll. If the dynamic library is deleted, the function cannot be enabled normally.
3262
3262
  *
3263
3263
  * @param enabled Whether to enable the image enhancement function: true : Enable the image enhancement function. false : (Default) Disable the image enhancement function.
3264
3264
  * @param options The image enhancement options. See BeautyOptions.
3265
- * @param type The type of the video source, see MediaSourceType.
3265
+ * @param type Type of media source. See MediaSourceType. In this method, this parameter supports only the following two settings:
3266
+ * The default value is UnknownMediaSource.
3267
+ * If you want to use the second camera to capture video, set this parameter to SecondaryCameraSource.
3266
3268
  *
3267
3269
  * @returns
3268
3270
  * 0: Success.
@@ -3352,7 +3354,7 @@ export abstract class IRtcEngine {
3352
3354
  /**
3353
3355
  * Enables/Disables the virtual background.
3354
3356
  *
3355
- * The virtual background feature enables the local user to replace their original background with a static image, dynamic video, blurred background, or portrait-background segmentation to achieve picture-in-picture effect. Once the virtual background feature is enabled, all users in the channel can see the custom background. Call this method before calling enableVideo or.
3357
+ * The virtual background feature enables the local user to replace their original background with a static image, dynamic video, blurred background, or portrait-background segmentation to achieve picture-in-picture effect. Once the virtual background feature is enabled, all users in the channel can see the custom background. Call this method before calling enableVideo or startPreview.
3356
3358
  * This feature requires high performance devices. Agora recommends that you implement it on devices equipped with the following chips:
3357
3359
  * Devices with an i5 CPU and better
3358
3360
  * Agora recommends that you use this feature in scenarios that meet the following conditions:
@@ -3621,7 +3623,7 @@ export abstract class IRtcEngine {
3621
3623
  /**
3622
3624
  * Sets the stream type of the remote video.
3623
3625
  *
3624
- * Under limited network conditions, if the publisher has not disabled the dual-stream mode using enableDualStreamMode (false), the receiver can choose to receive either the high-quality video stream or the low-quality video stream. The high-quality video stream has a higher resolution and bitrate, and the low-quality video stream has a lower resolution and bitrate. By default, users receive the high-quality video stream. Call this method if you want to switch to the low-quality video stream. This method allows the app to adjust the corresponding video stream type based on the size of the video window to reduce the bandwidth and resources. The aspect ratio of the low-quality video stream is the same as the high-quality video stream. Once the resolution of the high-quality video stream is set, the system automatically sets the resolution, frame rate, and bitrate of the low-quality video stream. The SDK enables the low-quality video stream auto mode on the sender by default (not actively sending low-quality video streams). The host at the receiving end can call this method to initiate a low-quality video stream stream request on the receiving end, and the sender automatically switches to the low-quality video stream mode after receiving the request. You can call this method either before or after joining a channel. If you call both setRemoteVideoStreamType and setRemoteDefaultVideoStreamType, the setting of setRemoteVideoStreamType takes effect.
+ * Under limited network conditions, if the publisher has not disabled the dual-stream mode using enableDualStreamMode (false), the receiver can choose to receive either the high-quality video stream or the low-quality video stream. The high-quality video stream has a higher resolution and bitrate, and the low-quality video stream has a lower resolution and bitrate. By default, users receive the high-quality video stream. Call this method if you want to switch to the low-quality video stream. This method allows the app to adjust the corresponding video stream type based on the size of the video window to reduce the bandwidth and resources. The aspect ratio of the low-quality video stream is the same as that of the high-quality video stream. Once the resolution of the high-quality video stream is set, the system automatically sets the resolution, frame rate, and bitrate of the low-quality video stream. By default, the SDK enables the low-quality video stream auto mode on the sender (it does not actively send the low-quality video stream). A receiver in the host role can request the low-quality video stream by calling this method (calls by receivers in the audience role do not take effect); after receiving the request, the sender automatically switches to sending the low-quality video stream. You can call this method either before or after joining a channel. If you call both setRemoteVideoStreamType and setRemoteDefaultVideoStreamType, the setting of setRemoteVideoStreamType takes effect.
  *
  * @param uid The user ID.
  * @param streamType The video stream type: VideoStreamType.
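The paragraph above says the app can pick the stream type from the size of the video window. A minimal TypeScript sketch of that decision (the enum is redefined locally and its member names mirror the SDK's VideoStreamType; the numeric values and pixel threshold are illustrative assumptions, not SDK constants):

```typescript
// Local stand-in for the SDK's VideoStreamType enum (member names mirror
// agora-electron-sdk; numeric values here are assumptions for illustration).
enum VideoStreamType {
  VideoStreamHigh = 0,
  VideoStreamLow = 1,
}

// Pick a subscription type from the rendered window size: a small window
// gains nothing from the high-quality stream's extra resolution and bitrate.
function pickStreamType(windowWidth: number, windowHeight: number): VideoStreamType {
  const SMALL_WINDOW_PIXELS = 320 * 240; // illustrative threshold, not an SDK constant
  return windowWidth * windowHeight <= SMALL_WINDOW_PIXELS
    ? VideoStreamType.VideoStreamLow
    : VideoStreamType.VideoStreamHigh;
}

// Hypothetical usage against an engine instance:
// engine.setRemoteVideoStreamType(remoteUid, pickStreamType(width, height));
```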
@@ -3662,7 +3664,7 @@ export abstract class IRtcEngine {
  /**
  * Sets the default stream type of subscription for remote video streams.
  *
- * The SDK enables the low-quality video stream auto mode on the sender by default (not actively sending low-quality video streams). The host at the receiving end can call this method to initiate a low-quality video stream stream request on the receiving end, and the sender automatically switches to the low-quality video stream mode after receiving the request. Under limited network conditions, if the publisher has not disabled the dual-stream mode using enableDualStreamMode (false), the receiver can choose to receive either the high-quality video stream or the low-quality video stream. The high-quality video stream has a higher resolution and bitrate, and the low-quality video stream has a lower resolution and bitrate. By default, users receive the high-quality video stream. Call this method if you want to switch to the low-quality video stream. This method allows the app to adjust the corresponding video stream type based on the size of the video window to reduce the bandwidth and resources. The aspect ratio of the low-quality video stream is the same as the high-quality video stream. Once the resolution of the high-quality video stream is set, the system automatically sets the resolution, frame rate, and bitrate of the low-quality video stream.
+ * By default, the SDK enables the low-quality video stream auto mode on the sender (it does not actively send the low-quality video stream). A receiver in the host role can request the low-quality video stream by calling this method (calls by receivers in the audience role do not take effect); after receiving the request, the sender automatically switches to sending the low-quality video stream. Under limited network conditions, if the publisher has not disabled the dual-stream mode using enableDualStreamMode (false), the receiver can choose to receive either the high-quality video stream or the low-quality video stream. The high-quality video stream has a higher resolution and bitrate, and the low-quality video stream has a lower resolution and bitrate. By default, users receive the high-quality video stream. Call this method if you want to switch to the low-quality video stream. This method allows the app to adjust the corresponding video stream type based on the size of the video window to reduce the bandwidth and resources. The aspect ratio of the low-quality video stream is the same as that of the high-quality video stream. Once the resolution of the high-quality video stream is set, the system automatically sets the resolution, frame rate, and bitrate of the low-quality video stream.
  * Call this method before joining a channel. The SDK does not support changing the default subscribed video stream type after joining a channel.
  * If you call both this method and setRemoteVideoStreamType, the SDK applies the settings in the setRemoteVideoStreamType method.
  *
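The doc above says the default subscription type can only be chosen before joining a channel. A hedged sketch of that ordering rule, using a local stand-in class (not SDK code; the -1 error value is an illustrative assumption):

```typescript
// Sketch of the call-ordering rule: the default subscribed stream type can
// only change before joining. Local stand-in, not the SDK's implementation.
class SubscriptionDefaults {
  private joined = false;
  private lowQualityByDefault = false;

  setDefaultLowQuality(low: boolean): number {
    if (this.joined) return -1; // mirrors "no changes after joining a channel"
    this.lowQualityByDefault = low;
    return 0;
  }

  join(): void {
    this.joined = true;
  }

  get isLowQualityByDefault(): boolean {
    return this.lowQualityByDefault;
  }
}
```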
@@ -4740,10 +4742,12 @@ export abstract class IRtcEngine {
  /**
  * Sets dual-stream mode configuration on the sender, and sets the low-quality video stream.
  *
- * The difference and connection between this method and is as follows:
- * When calling this method and setting mode to DisableSimulcastStream, it has the same effect as (false).
- * When calling this method and setting mode to EnableSimulcastStream, it has the same effect as (true).
- * Both methods can be called before and after joining a channel. If both methods are used, the settings in the method called later takes precedence. The SDK enables the low-quality video stream auto mode on the sender by default, which is equivalent to calling this method and setting the mode to AutoSimulcastStream. If you want to modify this behavior, you can call this method and modify the mode to DisableSimulcastStream (never send low-quality video streams) or EnableSimulcastStream (always send low-quality video streams).
+ * By default, the SDK enables the low-quality video stream auto mode on the sender (it does not actively send the low-quality video stream). A receiver in the host role can request the low-quality video stream by calling setRemoteVideoStreamType; after receiving the request, the sender automatically switches to sending the low-quality video stream.
+ * If you want to modify this behavior, you can call this method and set mode to DisableSimulcastStream (never send low-quality video streams) or EnableSimulcastStream (always send low-quality video streams).
+ * If you want to restore the default behavior after making changes, you can call this method again with mode set to AutoSimulcastStream. The difference and connection between this method and is as follows:
+ * When calling this method and setting mode to DisableSimulcastStream, it has the same effect as calling and setting enabled to false.
+ * When calling this method and setting mode to EnableSimulcastStream, it has the same effect as calling and setting enabled to true.
+ * Both methods can be called before and after joining a channel. If both methods are used, the settings in the method called later take precedence.
  *
  * @param mode The mode in which the video stream is sent. See SimulcastStreamMode.
  * @param streamConfig The configuration of the low-quality video stream. See SimulcastStreamConfig. When setting mode to DisableSimulcastStream, setting streamConfig will not take effect.
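The equivalences listed above (DisableSimulcastStream behaves like enabled: false, EnableSimulcastStream like enabled: true) can be sketched as a pure mapping. The enum is redefined locally; its numeric values are an assumption and may differ from the SDK's:

```typescript
// Local stand-in for SimulcastStreamMode (member names mirror the SDK;
// numeric values are assumptions for illustration).
enum SimulcastStreamMode {
  AutoSimulcastStream = -1,
  DisableSimulcastStream = 0,
  EnableSimulcastStream = 1,
}

// Maps the two explicit modes to the boolean "enabled" flag the doc equates
// them with; auto mode has no boolean equivalent, so null is returned.
function toDualStreamEnabled(mode: SimulcastStreamMode): boolean | null {
  switch (mode) {
    case SimulcastStreamMode.DisableSimulcastStream:
      return false;
    case SimulcastStreamMode.EnableSimulcastStream:
      return true;
    default:
      return null; // AutoSimulcastStream: sender reacts to receiver requests
  }
}
```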
@@ -5168,7 +5172,9 @@ export abstract class IRtcEngine {
  * @param extension The name of the extension.
  * @param key The key of the extension.
  * @param value The value of the extension key.
- * @param type The type of the video source, see MediaSourceType.
+ * @param type Type of media source. See MediaSourceType. In this method, this parameter supports only the following two settings:
+ * The default value is UnknownMediaSource.
+ * If you want to use the second camera to capture video, set this parameter to SecondaryCameraSource.
  *
  * @returns
  * 0: Success.
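Since this parameter accepts only two of the MediaSourceType values, a small guard can reject the rest before the value reaches the SDK call. The local enum values below are assumptions for illustration:

```typescript
// Local stand-in for the only two MediaSourceType values this parameter
// accepts (numeric values are assumptions, not the SDK's definitions).
enum MediaSourceType {
  SecondaryCameraSource = 3,
  UnknownMediaSource = 100,
}

// Rejects unsupported source types before calling the extension-property API.
function isValidExtensionSourceType(type: number): boolean {
  return (
    type === MediaSourceType.UnknownMediaSource ||
    type === MediaSourceType.SecondaryCameraSource
  );
}
```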
@@ -5419,7 +5425,7 @@ export abstract class IRtcEngine {
  * Call this method after joining a channel, and then call updateChannelMediaOptions and set publishScreenTrack or publishSecondaryScreenTrack to true to start screen sharing. Deprecated: This method is deprecated. Use startScreenCaptureByDisplayId instead. Agora strongly recommends using startScreenCaptureByDisplayId if you need to start screen sharing on a device connected to another display. This method shares a screen or part of the screen. You need to specify the area of the screen to be shared. This method applies to Windows only.
  *
  * @param screenRect Sets the relative location of the screen to the virtual screen.
- * @param regionRect Rectangle. If the specified region overruns the screen, the SDK shares only the region within it; if you set width or height as 0, the SDK shares the whole screen.
+ * @param regionRect Sets the relative location of the region to the screen. If you do not set this parameter, the SDK shares the whole screen. See Rectangle. If the specified region overruns the screen, the SDK shares only the region within it; if you set width or height as 0, the SDK shares the whole screen.
  * @param captureParams The screen sharing encoding parameters. The default video resolution is 1920 × 1080, that is, 2,073,600 pixels. Agora uses the value of this parameter to calculate the charges. See ScreenCaptureParameters.
  *
  * @returns
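The regionRect semantics described above (zero width or height selects the whole screen; an overrunning region is clipped to the screen) can be sketched as a pure function. This mirrors the documented behavior, not the SDK's internal implementation:

```typescript
interface Rectangle {
  x: number;
  y: number;
  width: number;
  height: number;
}

// Computes the region the SDK would actually share, per the documented rules:
// zero width/height means the whole screen; otherwise the region is clipped.
function effectiveRegion(screen: Rectangle, region: Rectangle): Rectangle {
  if (region.width === 0 || region.height === 0) return { ...screen };
  const x = Math.max(region.x, screen.x);
  const y = Math.max(region.y, screen.y);
  const right = Math.min(region.x + region.width, screen.x + screen.width);
  const bottom = Math.min(region.y + region.height, screen.y + screen.height);
  return { x, y, width: Math.max(0, right - x), height: Math.max(0, bottom - y) };
}
```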
@@ -5506,7 +5512,7 @@ export abstract class IRtcEngine {
  *
  * Call this method after starting screen sharing or window sharing.
  *
- * @param captureParams The screen sharing encoding parameters. The default video resolution is 1920 × 1080, that is, 2,073,600 pixels. Agora uses the value of this parameter to calculate the charges. See ScreenCaptureParameters
+ * @param captureParams The screen sharing encoding parameters. The default video resolution is 1920 × 1080, that is, 2,073,600 pixels. Agora uses the value of this parameter to calculate the charges. See ScreenCaptureParameters.
  *
  * @returns
  * 0: Success.
@@ -5710,7 +5716,7 @@ export abstract class IRtcEngine {
  * If you need to mix locally captured video streams, the SDK supports the following capture combinations:
  * On the Windows platform, it supports up to 4 video streams captured by cameras + 4 screen sharing streams.
  * On the macOS platform, it supports up to 4 video streams captured by cameras + 1 screen sharing stream.
- * If you need to mix the locally collected video streams, you need to call this method after startCameraCapture or startScreenCaptureBySourceType
+ * If you need to mix the locally collected video streams, you need to call this method after startCameraCapture or startScreenCaptureBySourceType.
  * If you want to publish the mixed video stream to the channel, you need to set publishTranscodedVideoTrack in ChannelMediaOptions to true when calling joinChannel or updateChannelMediaOptions.
  *
  * @param config Configuration of the local video mixing, see LocalTranscoderConfiguration.
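The per-platform capture limits listed above can be checked before calling startLocalVideoTranscoder. A minimal sketch (the function name and shape are illustrative, not part of the SDK):

```typescript
// Checks the documented capture limits for local video mixing:
// Windows ("win32"): up to 4 camera streams + 4 screen-sharing streams;
// macOS ("darwin"): up to 4 camera streams + 1 screen-sharing stream.
function canMixLocally(
  platform: "win32" | "darwin",
  cameraStreams: number,
  screenStreams: number
): boolean {
  const maxScreens = platform === "win32" ? 4 : 1;
  return cameraStreams <= 4 && screenStreams <= maxScreens;
}
```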
@@ -5728,7 +5734,7 @@ export abstract class IRtcEngine {
  /**
  * Updates the local video mixing configuration.
  *
- * After calling startLocalVideoTranscoder, call this method if you want to update the local video mixing configuration. If you want to update the video source type used for local video mixing, such as adding a second camera or screen to capture video, you need to call this method after startCameraCapture or startScreenCaptureBySourceType
+ * After calling startLocalVideoTranscoder, call this method if you want to update the local video mixing configuration. If you want to update the video source type used for local video mixing, such as adding a second camera or screen to capture video, you need to call this method after startCameraCapture or startScreenCaptureBySourceType.
  *
  * @param config Configuration of the local video mixing, see LocalTranscoderConfiguration.
  *
@@ -5778,6 +5784,7 @@ export abstract class IRtcEngine {
  * Sets the rotation angle of the captured video.
  *
  * This method applies to Windows only.
+ * You must call this method after enableVideo. The setting result will take effect after the camera is successfully turned on, that is, after the SDK triggers the onLocalVideoStateChanged callback and returns the local video state as LocalVideoStreamStateCapturing (1).
  * When the video capture device does not have the gravity sensing function, you can call this method to manually adjust the rotation angle of the captured video.
  *
  * @param type The video source type. See VideoSourceType.
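Because a device without gravity sensing reports no usable orientation, the app ends up supplying the rotation itself. A helper can snap an arbitrary angle to a right angle first, assuming the supported capture rotations are 0/90/180/270 (as in the SDK's VideoOrientation values). Illustrative sketch, not SDK code:

```typescript
// Snaps an arbitrary rotation reading to the nearest right angle.
// Assumption: the SDK accepts only 0, 90, 180, or 270 degrees here.
function snapRotation(degrees: number): 0 | 90 | 180 | 270 {
  const normalized = ((degrees % 360) + 360) % 360; // fold into [0, 360)
  return ((Math.round(normalized / 90) * 90) % 360) as 0 | 90 | 180 | 270;
}
```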
@@ -6067,7 +6074,7 @@ export abstract class IRtcEngine {
  * You can call this method to enable AI noise suppression function. Once enabled, the SDK automatically detects and reduces stationary and non-stationary noise from your audio on the premise of ensuring the quality of human voice. Stationary noise refers to noise signal with constant average statistical properties and negligibly small fluctuations of level within the period of observation. Common sources of stationary noises are:
  * Television;
  * Air conditioner;
- * Machinery, etc. Non-stationary noise refers to noise signal with huge fluctuations of level within the period of observation. Common sources of non-stationary noises are:
+ * Machinery, etc. Non-stationary noise refers to noise signal with huge fluctuations of level within the period of observation; common sources of non-stationary noises are:
  * Thunder;
  * Explosion;
  * Cracking, etc.
@@ -6389,7 +6396,7 @@ export abstract class IRtcEngine {
  *
  * This method takes a snapshot of a video stream from the specified user, generates a JPG image, and saves it to the specified path. The method is asynchronous, and the SDK has not taken the snapshot when the method call returns. After a successful method call, the SDK triggers the onSnapshotTaken callback to report whether the snapshot is successfully taken, as well as the details for that snapshot.
  * Call this method after joining a channel.
- * This method takes a snapshot of the published video stream specified in ChannelMediaOptions.
+ * When used for local video snapshots, this method takes a snapshot for the video streams specified in ChannelMediaOptions.
  * If the user's video has been preprocessed, for example, watermarked or beautified, the resulting snapshot includes the pre-processing effect.
  *
  * @param uid The user ID. Set uid as 0 if you want to take a snapshot of the local user's video.
@@ -6409,7 +6416,7 @@ export abstract class IRtcEngine {
  * When video screenshot and upload function is enabled, the SDK takes screenshots and upload videos sent by local users based on the type and frequency of the module you set in ContentInspectConfig. After video screenshot and upload, the Agora server sends the callback notification to your app server in HTTPS requests and sends all screenshots to the third-party cloud storage service. Before calling this method, ensure that the video screenshot upload service has been activated. Before calling this method, ensure that Video content moderation service has been activated.
  * This method relies on the video screenshot and upload dynamic library libagora_content_inspect_extension.dll. If the dynamic library is deleted, the function cannot be enabled normally.
  *
- * @param enabled Whether to enable video screenshot and upload true : Enables video screenshot and upload. false : Disables video screenshot and upload.
+ * @param enabled Whether to enable video screenshot and upload: true: Enables video screenshot and upload. false: Disables video screenshot and upload.
  * @param config Configuration of video screenshot and upload. See ContentInspectConfig.
  *
  * @returns
@@ -6422,9 +6429,9 @@ export abstract class IRtcEngine {
  ): number;

  /**
- * Adjusts the volume of the custom external audio source when it is published in the channel.
+ * Adjusts the volume of the custom audio track played remotely.
  *
- * Ensure you have called the createCustomAudioTrack method to create an external audio track before calling this method. If you want to change the volume of the audio to be published, you need to call this method again.
+ * Ensure you have called the createCustomAudioTrack method to create a custom audio track before calling this method. If you want to change the volume of the audio to be published, you need to call this method again.
  *
  * @param trackId The audio track ID. Set this parameter to the custom audio track ID returned in createCustomAudioTrack.
  * @param volume The volume of the audio source. The value can range from 0 to 100. 0 means mute; 100 means the original volume.
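The volume parameter is documented as 0 to 100, so clamping before the call keeps out-of-range requests from reaching the SDK. A trivial sketch (the helper name is an assumption, not an SDK function):

```typescript
// Clamps a requested publish volume into the documented 0-100 range before it
// is passed to adjustCustomAudioPublishVolume (sketch, not SDK logic).
function clampPublishVolume(volume: number): number {
  return Math.min(100, Math.max(0, Math.round(volume)));
}
```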
@@ -78,7 +78,7 @@ export abstract class IRtcEngineEx extends IRtcEngine {
  *
  * This method lets the user leave the channel, for example, by hanging up or exiting the call. After calling joinChannelEx to join the channel, this method must be called to end the call before starting the next call. This method can be called whether or not a call is currently in progress. This method releases all resources related to the session. This method call is asynchronous. When this method returns, it does not necessarily mean that the user has left the channel. After you leave the channel, the SDK triggers the onLeaveChannel callback. After actually leaving the channel, the local user triggers the onLeaveChannel callback; after the user in the communication scenario and the host in the live streaming scenario leave the channel, the remote user triggers the onUserOffline callback.
  * If you call release immediately after calling this method, the SDK does not trigger the onLeaveChannel callback.
- * Calling leaveChannel will leave the channels when calling joinChannel and joinChannelEx at the same time.
+ * Calling leaveChannel will leave the channels joined by calling joinChannel and joinChannelEx.
  *
  * @param connection The connection information. See RtcConnection.
  * @param options The options for leaving the channel. See LeaveChannelOptions. This parameter only supports the stopMicrophoneRecording member in the LeaveChannelOptions settings; setting other members does not take effect.
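Because this method honors only the stopMicrophoneRecording member of LeaveChannelOptions, a caller can strip the other members defensively before the call. A sketch using a locally declared, partial options shape (field names mirror the SDK; this is not the SDK's own declaration):

```typescript
// Partial, locally declared stand-in for LeaveChannelOptions.
interface LeaveChannelOptionsSketch {
  stopAudioMixing?: boolean;
  stopAllEffect?: boolean;
  stopMicrophoneRecording?: boolean;
}

// Per the doc, leaveChannelEx honors only stopMicrophoneRecording, so other
// members are dropped to make the effective settings explicit.
function sanitizeLeaveOptionsEx(
  options: LeaveChannelOptionsSketch
): LeaveChannelOptionsSketch {
  return options.stopMicrophoneRecording === undefined
    ? {}
    : { stopMicrophoneRecording: options.stopMicrophoneRecording };
}
```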
@@ -522,7 +522,7 @@ export abstract class IRtcEngineEx extends IRtcEngine {
  /**
  * Creates a data stream.
  *
- * Creates a data stream. Each user can create up to five data streams in a single channel. Compared with createDataStreamEx, this method does not support data reliability. If a data packet is not received five seconds after it was sent, the SDK directly discards the data.
+ * Creates a data stream. Each user can create up to five data streams in a single channel.
  *
  * @param config The configurations for the data stream. See DataStreamConfig.
  * @param connection The connection information. See RtcConnection.
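The five-streams-per-user limit above can also be enforced client-side before calling the SDK. A minimal sketch (the returned ids and the -1 error value are illustrative assumptions, not SDK behavior):

```typescript
// Client-side guard for the documented limit of five data streams per user
// in a single channel.
class DataStreamBudget {
  private created = 0;

  tryCreate(): number {
    if (this.created >= 5) return -1; // limit reached
    this.created += 1;
    return this.created; // hypothetical stream id for illustration
  }
}
```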
@@ -899,7 +899,7 @@ export abstract class IRtcEngineEx extends IRtcEngine {
  *
  * The method is asynchronous, and the SDK has not taken the snapshot when the method call returns. After a successful method call, the SDK triggers the onSnapshotTaken callback to report whether the snapshot is successfully taken, as well as the details for that snapshot. This method takes a snapshot of a video stream from the specified user, generates a JPG image, and saves it to the specified path.
  * Call this method after the joinChannelEx method.
- * This method takes a snapshot of the published video stream specified in ChannelMediaOptions.
+ * When used for local video snapshots, this method takes a snapshot for the video streams specified in ChannelMediaOptions.
  * If the user's video has been preprocessed, for example, watermarked or beautified, the resulting snapshot includes the pre-processing effect.
  *
  * @param connection The connection information. See RtcConnection.
@@ -207,7 +207,7 @@ export abstract class IBaseSpatialAudioEngine {
  * If the user or media player is in the same sound insulation area, it is not affected by SpatialAudioZone, and the sound attenuation effect is determined by the attenuation parameter in setPlayerAttenuation or setRemoteAudioAttenuation. If you do not call setPlayerAttenuation or setRemoteAudioAttenuation, the default sound attenuation coefficient of the SDK is 0.5, which simulates the attenuation of the sound in the real environment.
  * If the sound source and the receiver belong to two sound insulation areas, the receiver cannot hear the sound source. If this method is called multiple times, the last sound insulation area set takes effect.
  *
- * @param zones Sound insulation area settings. See SpatialAudioZone.
+ * @param zones Sound insulation area settings. See SpatialAudioZone. On the Windows platform, it is necessary to ensure that the number of members in the zones array is equal to the value of zoneCount; otherwise, it may cause a crash.
  * @param zoneCount The number of sound insulation areas.
  *
  * @returns
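Given the Windows caveat above (the zones array length must equal zoneCount or the call may crash), a guard before calling the SDK is cheap insurance. A sketch with a minimal local zone shape (the real SpatialAudioZone has more fields):

```typescript
// Minimal local stand-in for a zone; only the shape matters for this check.
interface ZoneSketch {
  zoneSetId: number;
}

// Guards the Windows caveat: zones.length must equal zoneCount, otherwise
// the native layer may crash.
function zonesMatchCount(zones: ZoneSketch[], zoneCount: number): boolean {
  return zones.length === zoneCount;
}
```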