@webex/web-client-media-engine 3.5.1 → 3.6.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +29 -14
- package/dist/cjs/index.js +164 -1
- package/dist/cjs/index.js.map +1 -1
- package/dist/esm/index.js +164 -1
- package/dist/esm/index.js.map +1 -1
- package/dist/types/index.d.ts +7 -1
- package/package.json +2 -2
package/README.md
CHANGED
@@ -3,6 +3,7 @@
 Web Client Media Engine is common web code for interacting with the multistream media server.
 
 
+_UML diagram generated with [tplant](https://github.com/bafolts/tplant)._
 
 ## Setup
 
@@ -25,11 +26,13 @@ const offer = await multistreamConnection.createOffer();
 // after sending offer to server and receiving answer
 await multistreamConnection.setAnswer(answer);
 
-const
-
+const audioSendSlot = multistreamConnection.createSendSlot(MediaType.AudioMain);
+const localAudioStream = createMicrophoneStream(LocalMicrophoneStream);
+audioSendSlot.publishStream(localAudioStream);
 
-const
-
+const videoSendSlot = multistreamConnection.createSendSlot(MediaType.VideoMain);
+const localVideoStream = createCameraStream(LocalCameraStream);
+videoSendSlot.publishStream(localVideoStream);
 ```
 
 ## SDP Management
@@ -102,21 +105,29 @@ In the above SDP, only mlines 0, 1, 2, 3 and 10 will be sent to Homer. The rest
 
 Mlines 4, 5, 6, 7, 8, and 9 serve only as ways for generating tracks associated with a MID on the client. To request media on those tracks, the client sends a `MediaRequest` (described below) with the appropriate MID.
 
-`MultistreamConnection` handles all the SDP manipulation required to accomplish the above for both the offer and the answer. It also performs additional preprocessing on the offer such as injecting content types and JMP attributes.
+`MultistreamConnection` handles all the SDP manipulation required to accomplish the above for both the offer and the answer. It also performs additional preprocessing on the offer, such as injecting content types and JMP attributes. All of this is handled by the appropriate SDP "mungers" (`IngressSdpMunger` and `EgressSdpMunger`), which are called before setting the local offer, sending the offer to the remote server, and setting the remote answer.
 
-## Transceivers
+## Transceivers and Slots
 
 In WebRTC, RTCRtpTransceivers represent a pairing of send and receive SRTP streams between the client and server. WCME defines two classes of transceivers: `SendOnlyTransceiver` and `RecvOnlyTransceiver`. Each `MediaType` (`VideoMain`, `AudioMain`, `VideoSlides`, or `AudioSlides`) can only have one `SendOnlyTransceiver` but may have multiple `RecvOnlyTransceiver`s. `MultistreamConnection` maintains a list of all transceivers in a connection per `MediaType`.
 
-Although `SendOnlyTransceiver`s are only used for sending, its underlying RTCRtpTransceiver direction is set to "sendrecv" in order to make it compatible with Homer. Each `SendOnlyTransceiver` maintains the state of the sending
+Although `SendOnlyTransceiver`s are only used for sending, their underlying RTCRtpTransceiver direction is set to "sendrecv" in order to make them compatible with Homer. Each `SendOnlyTransceiver` maintains the state of the sending stream -- whether or not it is published, and whether or not it has been requested by a remote peer -- and also handles replacing the existing stream with a new one (e.g. when switching camera devices).
 
-`RecvOnlyTransceiver`s maintain the state of receiving
+Likewise, `RecvOnlyTransceiver`s maintain the state of receiving streams.
 
-
+Clients should never have to interact with transceivers directly. Instead, all interactions with transceivers are handled via "slots". Each `SendOnlyTransceiver` is associated with a single `SendSlot`, while each `RecvOnlyTransceiver` is associated with a single `ReceiveSlot`.
 
-
+### Receive Slots
 
-
+WebRTC clients need to be able to receive remote media, but creating a stream for every remote participant's media would result in a huge SDP, and would require that the SDP be updated every time a client joined or left the session. So instead of allocating an RTCRtpReceiver for every remote participant, the receivers are treated as "slots" on which a remote participant (CSI) can be requested. `ReceiveSlot`s have an ID (which is the MID of the corresponding mline), and media is requested via JMP using these IDs. For example, if a client wants to receive the 3 most active speakers' audio, then it needs to create only 3 main audio receive slots, even if there are 25 participants in the meeting.
+
+The media received on a `ReceiveSlot` can change over time, depending on the policy requested on that slot and/or future media requests on that slot. `sourceAdvertisement` messages are used by the server to notify the client which CSI is being received on a `ReceiveSlot` at a given time.
+
+### Send Slots
+
+Similarly, send slots are used to handle sending local media.
+
+By default, all `SendOnlyTransceiver`s are inactive (that is, the direction in the SDP offer is set to "inactive") until a `SendSlot` is created, at which point the transceiver can be activated (the direction set to "sendrecv"). Only one `SendSlot` per media type should be created at a time, and `SendSlot`s can be activated or deactivated on the fly.
 
 ## Requesting Media
 
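The receive-slot model described in the README text above can be sketched as a small pool of reusable slots. Everything in this sketch (`ReceiveSlotPool`, `advertise`, `activeSources`, the CSI values) is hypothetical illustration of the concept, not WCME API:

```javascript
// Hypothetical sketch of the receive-slot idea: a fixed set of slots is
// reused for whichever sources are currently active, so the SDP never grows
// with the number of participants. None of these names are WCME APIs.
class ReceiveSlotPool {
  constructor(count) {
    // Each slot has a stable ID (in WCME this would be the mline's MID).
    this.slots = Array.from({ length: count }, (_, i) => ({ id: i, csi: null }));
  }

  // Models a server-side sourceAdvertisement: tells a slot which CSI it carries.
  advertise(slotId, csi) {
    this.slots[slotId].csi = csi;
  }

  activeSources() {
    return this.slots.filter((s) => s.csi !== null).map((s) => s.csi);
  }
}

// 3 slots suffice for the 3 most active speakers, even with 25 participants.
const pool = new ReceiveSlotPool(3);
pool.advertise(0, 'csi-101');
pool.advertise(1, 'csi-207');
pool.advertise(2, 'csi-315');
// A new active speaker simply replaces a slot's CSI; the SDP is untouched.
pool.advertise(1, 'csi-999');
console.log(pool.activeSources()); // [ 'csi-101', 'csi-999', 'csi-315' ]
```

The point of the design is that the number of slots tracks how much media the client wants to render, not how many participants exist.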
@@ -126,13 +137,13 @@ The `requestMedia` API is used to request media from the media server. It takes
 requestMedia(mediaType: MediaType, mediaRequests: MediaRequest[]): void
 ```
 
-The `MediaRequest` object is an abstraction on the JMP
+The `MediaRequest` object is an abstraction over the JMP media request object, and allows crafting different combinations of policies and `ReceiveSlot`s to achieve different behaviors. A `MediaRequest` consists of:
 
 - Policy
 - PolicySpecificInfo
 - ReceiveSlot[]
 
-Details for these fields are the same as they are for the JMP
+Details for these fields are the same as they are for the JMP media request. Information can be found [here](https://confluence-eng-gpk2.cisco.com/conf/pages/viewpage.action?spaceKey=WMT&title=JMP+-+Json+Multistream+Protocol).
 
 ### Examples
 
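The three fields listed above map onto a plain object. The sketch below is an illustrative approximation of a `MediaRequest`; the field names, the `'active-speaker'` policy string, and the slot stand-ins are assumptions for the sketch, not the exact WCME/JMP definitions:

```javascript
// Illustrative shape of a MediaRequest built from the three fields described
// above (Policy, PolicySpecificInfo, ReceiveSlot[]). Names are assumptions.
const buildMediaRequest = (policy, policySpecificInfo, receiveSlots) => ({
  policy,
  policySpecificInfo,
  receiveSlots,
});

// e.g. request active speakers on two previously created receive slots
// (the slot objects here are stand-ins for real ReceiveSlot instances).
const request = buildMediaRequest(
  'active-speaker',
  { priority: 255 },
  [{ id: 4 }, { id: 5 }]
);
console.log(request.receiveSlots.length); // 2
```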
@@ -162,13 +173,15 @@ requestMedia(MediaType.VideoMain, [
 ]);
 ```
 
-## Utilities
+## Utilities and Other Exports
 
 WCME also defines several utility files with exported functions that may be useful in both WCME and in other code:
 
 - `sdp-utils`: Contains functions that munge and manipulate SDP offer/answer descriptions.
 - `ua-utils`: Contains functions to help identify the current user agent (i.e. browser name and version).
 
+In addition to WCME's own exports, all necessary exports from `webrtc-core` and `json-multistream` are re-exported for convenience, so they can be used in other code without importing from those repositories separately.
+
 ## Logging
 
 Logging is done through the `js-logger` library and the `Logger` class is exported to be used with other repositories. This can be done by importing and setting a handler for `Logger`.
@@ -181,3 +194,5 @@ Logger.setHandler((msgs, context) => {
   console.log(context.name, msgs);
 });
 ```
+
+Loggers from `webrtc-core` and `json-multistream` are also re-exported if you want to handle logs from those repositories separately.
package/dist/cjs/index.js
CHANGED
@@ -6942,6 +6942,17 @@ var map2obj = function (report) {
     });
     return o;
 };
+var dumpStream = function (stream) { return ({
+    id: stream.id,
+    tracks: stream.getTracks().map(function (track) { return ({
+        id: track.id,
+        kind: track.kind,
+        label: track.label,
+        enabled: track.enabled,
+        muted: track.muted,
+        readyState: track.readyState
+    }); })
+}); };
 var persistedKeys = ['type', 'id', 'timestamp'];
 /**
  * Check to see if the report consists of more than just the persisted metadata.
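The new `dumpStream` helper flattens a MediaStream into a plain, serializable summary. Since `MediaStream` exists only in browsers, the sketch below exercises the same function body against a mock stream object (the mock's values are invented for illustration):

```javascript
// Same shape as the dumpStream helper added in the hunk above; a mock object
// stands in for a real browser MediaStream. Mock values are illustrative.
var dumpStream = function (stream) { return ({
    id: stream.id,
    tracks: stream.getTracks().map(function (track) { return ({
        id: track.id,
        kind: track.kind,
        label: track.label,
        enabled: track.enabled,
        muted: track.muted,
        readyState: track.readyState
    }); })
}); };

var mockStream = {
    id: 's1',
    getTracks: function () {
        return [{ id: 't1', kind: 'audio', label: 'mic', enabled: true, muted: false, readyState: 'live' }];
    }
};
console.log(JSON.stringify(dumpStream(mockStream)));
// {"id":"s1","tracks":[{"id":"t1","kind":"audio","label":"mic","enabled":true,"muted":false,"readyState":"live"}]}
```

The summary is what gets logged on `getUserMedia`/`getDisplayMedia` success further down in this diff.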
@@ -7036,6 +7047,7 @@ var rtcStats = function (pc, logger, intervalTime, statsPreProcessor) {
     var trace = function (name, payload, timestamp) {
         logger({ timestamp: timestamp ? Math.round(timestamp) : Date.now(), name: name, payload: payload });
     };
+    var origPeerConnection = window.RTCPeerConnection;
     pc.addEventListener('icecandidate', function (e) {
         if (e.candidate) {
             trace('onicecandidate', makeEvent(JSON.stringify(e.candidate)));
@@ -7068,6 +7080,139 @@ var rtcStats = function (pc, logger, intervalTime, statsPreProcessor) {
     pc.addEventListener('datachannel', function (event) {
         trace('ondatachannel', makeEvent("".concat(event.channel.id, ": ").concat(event.channel.label)));
     });
+    ['createDataChannel', 'close'].forEach(function (method) {
+        var nativeMethod = origPeerConnection.prototype[method];
+        if (nativeMethod) {
+            origPeerConnection.prototype[method] = function () {
+                trace(method, makeEvent(method));
+                return nativeMethod.apply(this, arguments);
+            };
+        }
+    });
+    ['addStream', 'removeStream'].forEach(function (method) {
+        var nativeMethod = origPeerConnection.prototype[method];
+        if (nativeMethod) {
+            origPeerConnection.prototype[method] = function () {
+                var stream = arguments[0];
+                var streamInfo = stream
+                    .getTracks()
+                    .map(function (t) { return "".concat(t.kind, ":").concat(t.id); })
+                    .join(',');
+                trace(method, makeEvent("".concat(stream.id, " ").concat(streamInfo)));
+                return nativeMethod.apply(this, arguments);
+            };
+        }
+    });
+    ['addTrack'].forEach(function (method) {
+        var nativeMethod = origPeerConnection.prototype[method];
+        if (nativeMethod) {
+            origPeerConnection.prototype[method] = function () {
+                var track = arguments[0];
+                var streams = [].slice.call(arguments, 1);
+                trace(method, makeEvent("".concat(track.kind, ":").concat(track.id, " ").concat(streams.map(function (s) { return "stream:".concat(s.id); }).join(';') || '-')));
+                return nativeMethod.apply(this, arguments);
+            };
+        }
+    });
+    ['removeTrack'].forEach(function (method) {
+        var nativeMethod = origPeerConnection.prototype[method];
+        if (nativeMethod) {
+            origPeerConnection.prototype[method] = function () {
+                var track = arguments[0].track;
+                trace(method, makeEvent(track ? "".concat(track.kind, ":").concat(track.id) : 'null'));
+                return nativeMethod.apply(this, arguments);
+            };
+        }
+    });
+    ['createOffer', 'createAnswer'].forEach(function (method) {
+        var nativeMethod = origPeerConnection.prototype[method];
+        if (nativeMethod) {
+            origPeerConnection.prototype[method] = function () {
+                var opts;
+                var args = arguments;
+                if (arguments.length === 1 && typeof arguments[0] === 'object') {
+                    // eslint-disable-next-line prefer-destructuring
+                    opts = arguments[0];
+                }
+                else if (arguments.length === 3 && typeof arguments[2] === 'object') {
+                    // eslint-disable-next-line prefer-destructuring
+                    opts = arguments[2];
+                }
+                trace(method, makeEvent(opts || ''));
+                return nativeMethod.apply(this, opts ? [opts] : undefined).then(function (description) {
+                    trace("".concat(method, "OnSuccess"), makeEvent(description.sdp));
+                    if (args.length > 0 && typeof args[0] === 'function') {
+                        args[0].apply(null, [description]);
+                        return undefined;
+                    }
+                    return description;
+                }, function (err) {
+                    trace("".concat(method, "OnFailure"), makeEvent(err.toString()));
+                    if (args.length > 1 && typeof args[1] === 'function') {
+                        args[1].apply(null, [err]);
+                        return;
+                    }
+                    throw err;
+                });
+            };
+        }
+    });
+    ['setLocalDescription', 'setRemoteDescription', 'addIceCandidate'].forEach(function (method) {
+        var nativeMethod = origPeerConnection.prototype[method];
+        if (nativeMethod) {
+            origPeerConnection.prototype[method] = function () {
+                var args = arguments;
+                trace(method, makeEvent(method === 'addIceCandidate' ? arguments[0] : arguments[0].sdp));
+                return nativeMethod.apply(this, [arguments[0]]).then(function () {
+                    trace("".concat(method, "OnSuccess"), makeEvent('success'));
+                    if (args.length >= 2 && typeof args[1] === 'function') {
+                        args[1].apply(null, []);
+                        return undefined;
+                    }
+                    return undefined;
+                }, function (err) {
+                    trace("".concat(method, "OnFailure"), makeEvent(err.toString()));
+                    if (args.length >= 3 && typeof args[2] === 'function') {
+                        args[2].apply(null, [err]);
+                        return undefined;
+                    }
+                    throw err;
+                });
+            };
+        }
+    });
+    if (navigator.mediaDevices && navigator.mediaDevices.getUserMedia) {
+        var origGetUserMedia_1 = navigator.mediaDevices.getUserMedia.bind(navigator.mediaDevices);
+        var gum = function () {
+            trace('navigator.mediaDevices.getUserMedia', makeEvent(JSON.stringify(arguments[0])));
+            return origGetUserMedia_1
+                .apply(navigator.mediaDevices, arguments)
+                .then(function (stream) {
+                    trace('navigator.mediaDevices.getUserMediaOnSuccess', makeEvent(JSON.stringify(dumpStream(stream))));
+                    return stream;
+                }, function (err) {
+                    trace('navigator.mediaDevices.getUserMediaOnFailure', makeEvent(err.name));
+                    return Promise.reject(err);
+                });
+        };
+        navigator.mediaDevices.getUserMedia = gum.bind(navigator.mediaDevices);
+    }
+    if (navigator.mediaDevices && navigator.mediaDevices.getDisplayMedia) {
+        var origGetDisplayMedia_1 = navigator.mediaDevices.getDisplayMedia.bind(navigator.mediaDevices);
+        var gdm = function () {
+            trace('navigator.mediaDevices.getDisplayMedia', makeEvent(JSON.stringify(arguments[0])));
+            return origGetDisplayMedia_1
+                .apply(navigator.mediaDevices, arguments)
+                .then(function (stream) {
+                    trace('navigator.mediaDevices.getDisplayMediaOnSuccess', makeEvent(JSON.stringify(dumpStream(stream))));
+                    return stream;
+                }, function (err) {
+                    trace('navigator.mediaDevices.getDisplayMediaOnFailure', makeEvent(err.name));
+                    return Promise.reject(err);
+                });
+        };
+        navigator.mediaDevices.getDisplayMedia = gdm.bind(navigator.mediaDevices);
+    }
     var interval = window.setInterval(function () {
         if (pc.signalingState === 'closed') {
             window.clearInterval(interval);
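All of the additions in this hunk follow one wrap-and-delegate pattern: save the native prototype method, replace it with a wrapper that records the call via `trace`, then delegate to the original with `apply` so behavior is unchanged. A minimal runnable sketch of that pattern, using a stand-in `Conn` class instead of `window.RTCPeerConnection` and a simplified `trace` (both are assumptions for the sketch):

```javascript
// Minimal sketch of the method-wrapping pattern used in the hunk above.
const calls = [];
const trace = (name, payload) => calls.push({ name, payload });

// Stand-in for window.RTCPeerConnection.
class Conn {
  createDataChannel(label) {
    return { label };
  }
}

['createDataChannel'].forEach((method) => {
  // Capture the native method before replacing it on the prototype.
  const nativeMethod = Conn.prototype[method];
  if (nativeMethod) {
    Conn.prototype[method] = function () {
      // Record the call, then delegate so behavior is unchanged.
      trace(method, arguments[0]);
      return nativeMethod.apply(this, arguments);
    };
  }
});

const conn = new Conn();
const channel = conn.createDataChannel('stats');
console.log(channel.label, calls[0].name); // stats createDataChannel
```

The `if (nativeMethod)` guard matters because some of the wrapped methods (e.g. `addStream`) do not exist in every browser.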
@@ -10277,6 +10422,7 @@ class SendOnlyTransceiver extends Transceiver {
         this.streamMuteStateChange = new TypedEvent();
         this.streamPublishStateChange = new TypedEvent();
         this.negotiationNeeded = new TypedEvent();
+        this.namedMediaGroupsChange = new TypedEvent();
         this.requested = false;
         this.requestedIdEncodingParamsMap = new Map();
         this.csi = csi;
@@ -10351,6 +10497,13 @@ class SendOnlyTransceiver extends Transceiver {
             }
         });
     }
+    setNamedMediaGroups(namedMediaGroups) {
+        if (this.mediaType !== exports.MediaType.AudioMain) {
+            throw new Error(`This interface does not allow media types other than AudioMain: ${this.mediaType}`);
+        }
+        this.namedMediaGroups = namedMediaGroups;
+        this.namedMediaGroupsChange.emit();
+    }
     publishStream(stream) {
         return this.replacePublishedStream(stream);
     }
@@ -10454,6 +10607,9 @@ class SendSlot {
             return this.sendTransceiver.unpublishStream();
         });
     }
+    setNamedMediaGroups(namedMediaGroups) {
+        this.sendTransceiver.setNamedMediaGroups(namedMediaGroups);
+    }
     get active() {
         return this.sendTransceiver.active;
     }
@@ -13893,6 +14049,9 @@ class MultistreamConnection extends EventEmitter$2 {
                 this.queueLocalOfferAnswer();
             }
         });
+        transceiver.namedMediaGroupsChange.on(() => {
+            this.sendSourceAdvertisement(mediaType);
+        });
         this.sendTransceivers.set(mediaType, transceiver);
         this.createJmpSession(mediaType);
     }
@@ -14082,7 +14241,11 @@ class MultistreamConnection extends EventEmitter$2 {
             };
         }
         else {
-            task = () => {
+            task = () => {
+                var _a;
+                return (_a = this.jmpSessions
+                    .get(mediaType)) === null || _a === void 0 ? void 0 : _a.sendSourceAdvertisement(1, numLiveSources, mediaType === exports.MediaType.AudioMain ? transceiver.namedMediaGroups : []);
+            };
         }
         if (((_b = this.dataChannel) === null || _b === void 0 ? void 0 : _b.readyState) === 'open') {
             task();