@webex/web-client-media-engine 3.5.0 → 3.5.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -3,6 +3,7 @@
  Web Client Media Engine is common web code for interacting with the multistream media server.

  ![WCME class diagram](uml.svg)
+ _UML diagram generated with [tplant](https://github.com/bafolts/tplant)._

  ## Setup

@@ -25,11 +26,13 @@ const offer = await multistreamConnection.createOffer();
  // after sending offer to server and receiving answer
  await multistreamConnection.setAnswer(answer);

- const localAudioTrack = createMicrophoneTrack();
- multistreamConnection.publishTrack(localAudioTrack);
+ const audioSendSlot = multistreamConnection.createSendSlot(MediaType.AudioMain);
+ const localAudioStream = createMicrophoneStream(LocalMicrophoneStream);
+ audioSendSlot.publishStream(localAudioStream);

- const localVideoTrack = createCameraTrack();
- multistreamConnection.publishTrack(localVideoTrack);
+ const videoSendSlot = multistreamConnection.createSendSlot(MediaType.VideoMain);
+ const localVideoStream = createCameraStream(LocalCameraStream);
+ videoSendSlot.publishStream(localVideoStream);

  ```

  ## SDP Management
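The new publish flow in the hunk above pairs each media type with a single send slot. As a rough sketch of that pattern (the `SendSlot` state machine and `MediaType` values here are simplified stand-ins, not the real classes from this package):

```javascript
// Minimal sketch of the slot-based publish flow shown in the updated README.
// MediaType values and the stream object are illustrative stand-ins; the real
// classes live in @webex/web-client-media-engine and @webex/webrtc-core.
const MediaType = { AudioMain: 'AUDIO-MAIN', VideoMain: 'VIDEO-MAIN' };

class SendSlot {
  constructor(mediaType) {
    this.mediaType = mediaType;
    this.stream = null; // nothing published yet
  }

  publishStream(stream) {
    this.stream = stream; // replaces any previously published stream
  }

  unpublishStream() {
    this.stream = null;
  }

  get published() {
    return this.stream !== null;
  }
}

// One slot per media type, mirroring createSendSlot() usage above.
const audioSendSlot = new SendSlot(MediaType.AudioMain);
audioSendSlot.publishStream({ id: 'mic-stream' }); // mock stream object
console.log(audioSendSlot.published); // true
```

The key difference from the 3.5.0 API is that the stream is published to a slot, not directly to the connection, so the slot can later swap or unpublish the stream without touching the connection.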
@@ -102,21 +105,29 @@ In the above SDP, only mlines 0, 1, 2, 3 and 10 will be sent to Homer. The rest

  Mlines 4, 5, 6, 7, 8, and 9 serve only as ways for generating tracks associated with a MID on the client. To request media on those tracks, the client sends a `MediaRequest` (described below) with the appropriate MID.

- `MultistreamConnection` handles all the SDP manipulation required to accomplish the above for both the offer and the answer. It also performs additional preprocessing on the offer such as injecting content types and JMP attributes.
+ `MultistreamConnection` handles all the SDP manipulation required to accomplish the above for both the offer and the answer. It also performs additional preprocessing on the offer, such as injecting content types and JMP attributes. All of this is handled by the appropriate SDP "mungers" (`IngressSdpMunger` and `EgressSdpMunger`), which are called before setting the local offer, sending the offer to the remote server, and setting the remote answer.

- ## Transceivers
+ ## Transceivers and Slots

  In WebRTC, RTCRtpTransceivers represent a pairing of send and receive SRTP streams between the client and server. WCME defines two classes of transceivers: `SendOnlyTransceiver` and `RecvOnlyTransceiver`. Each `MediaType` (`VideoMain`, `AudioMain`, `VideoSlides`, or `AudioSlides`) can only have one `SendOnlyTransceiver` but may have multiple `RecvOnlyTransceiver`s. `MultistreamConnection` maintains a list of all transceivers in a connection per `MediaType`.

- Although `SendOnlyTransceiver`s are only used for sending, its underlying RTCRtpTransceiver direction is set to "sendrecv" in order to make it compatible with Homer. Each `SendOnlyTransceiver` maintains the state of the sending track -- whether or not it is published, whether or not it has been requested by a remote peer -- as well as handles replacing the existing track with a new one (e.g. when switching camera devices).
+ Although `SendOnlyTransceiver`s are used only for sending, their underlying RTCRtpTransceiver direction is set to "sendrecv" in order to make them compatible with Homer. Each `SendOnlyTransceiver` maintains the state of the sending stream -- whether or not it is published, and whether or not it has been requested by a remote peer -- and handles replacing the existing stream with a new one (e.g. when switching camera devices).

- `RecvOnlyTransceiver`s maintain the state of receiving tracks. Each `RecvOnlyTransceiver` is associated with a single `ReceiveSlot` (described below).
+ Likewise, `RecvOnlyTransceiver`s maintain the state of receiving streams.

- ## ReceiveSlot
+ Clients should never have to interact with transceivers directly. Instead, all interactions with transceivers are handled via "slots". Each `SendOnlyTransceiver` is associated with a single `SendSlot`, while each `RecvOnlyTransceiver` is associated with a single `ReceiveSlot`.

- WebRTC clients need tracks to be able to receive remote media, but creating a track for every remote participant's media would result in a huge SDP, and would require that the SDP was updated every time a client joined or left the session. So instead of allocating an RTCRtpReceiver for every remote participant, the receivers are treated as "slots" on which a remote participant (CSI) can be requested. `ReceiveSlot`s have an ID (which is the MID of the corresponding mline), and media is requested via JMP using these IDs. For example, if a client wants to receive the 3 most active speakers' audio, then it needs to create only 3 main audio receive slots, even if there are 25 participants in the meeting.
+ ### Receive Slots

- The media received on a `ReceiveSlot` can change over time, depending on the policy requested on that slot and/or future media requests on that slot. `Source Indication` messages are used by the server to notify the client which CSI is being received on a `ReceiveSlot` at a given time.
+ WebRTC clients need to be able to receive remote media, but creating a stream for every remote participant's media would result in a huge SDP and would require the SDP to be updated every time a client joined or left the session. So instead of allocating an RTCRtpReceiver for every remote participant, the receivers are treated as "slots" on which a remote participant (CSI) can be requested. `ReceiveSlot`s have an ID (the MID of the corresponding mline), and media is requested via JMP using these IDs. For example, if a client wants to receive the 3 most active speakers' audio, it needs to create only 3 main audio receive slots, even if there are 25 participants in the meeting.
+
+ The media received on a `ReceiveSlot` can change over time, depending on the policy requested on that slot and/or future media requests on that slot. `sourceAdvertisement` messages are used by the server to notify the client which CSI is being received on a `ReceiveSlot` at a given time.
+
+ ### Send Slots
+
+ Similarly, send slots are used to handle sending local media.
+
+ By default, all `SendOnlyTransceiver`s are inactive (that is, the direction in the SDP offer is set to "inactive") until a `SendSlot` is created, at which point the transceiver can be activated (the direction set to "sendrecv"). Only one `SendSlot` per media type should be created at a time, and `SendSlot`s can be activated or deactivated on the fly.

  ## Requesting Media

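The receive-slot idea in the hunk above can be modeled with a small sketch: a fixed pool of slots (identified by MID) that the server re-maps to changing sources. The class, MID strings, and CSI values below are illustrative stand-ins, not the package's real API:

```javascript
// Illustrative model of receive slots: a fixed pool of slots (MIDs) that the
// server maps to changing remote sources (CSIs).
class ReceiveSlot {
  constructor(id) {
    this.id = id;    // MID of the corresponding mline
    this.csi = null; // which remote source is currently routed here
  }
}

// 3 main-audio slots are enough for a "3 most active speakers" request,
// regardless of how many participants are in the meeting.
const slots = [new ReceiveSlot('mid-4'), new ReceiveSlot('mid-5'), new ReceiveSlot('mid-6')];

// A sourceAdvertisement-style update from the server re-maps a slot to a new CSI.
function onSourceAdvertisement(slotId, csi) {
  const slot = slots.find((s) => s.id === slotId);
  if (slot) slot.csi = csi;
}

onSourceAdvertisement('mid-4', 1001); // speaker with CSI 1001 now on mid-4
onSourceAdvertisement('mid-4', 2042); // later, a new active speaker takes over the slot
console.log(slots[0].csi); // 2042
```

This is why the slot count scales with how much media the client wants, not with meeting size: the SDP stays fixed while only the slot-to-CSI mapping changes.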
@@ -126,13 +137,13 @@ The `requestMedia` API is used to request media from the media server. It takes
  requestMedia(mediaType: MediaType, mediaRequests: MediaRequest[]): void
  ```

- The `MediaRequest` object is an abstraction on the JMP SCR object, and allows crafting different combinations of policies and `ReceiveSlot`s to achieve different behaviors. A `MediaRequest` consists of:
+ The `MediaRequest` object is an abstraction over the JMP media request object, and allows crafting different combinations of policies and `ReceiveSlot`s to achieve different behaviors. A `MediaRequest` consists of:

  - Policy
  - PolicySpecificInfo
  - ReceiveSlot[]

- Details for these fields are the same as they are for the JMP SCR. Information can be found [here](https://confluence-eng-gpk2.cisco.com/conf/pages/viewpage.action?spaceKey=WMT&title=JMP+-+Json+Multistream+Protocol).
+ Details for these fields are the same as for the JMP media request. More information can be found [here](https://confluence-eng-gpk2.cisco.com/conf/pages/viewpage.action?spaceKey=WMT&title=JMP+-+Json+Multistream+Protocol).

  ### Examples

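To make the three-field structure above concrete, here is a hedged sketch of a `MediaRequest`-shaped plain object. The field names, policy string, and `priority` value are illustrative assumptions; the real types and allowed values are defined by json-multistream and the JMP documentation linked above:

```javascript
// Hedged sketch of the MediaRequest shape described above
// (Policy, PolicySpecificInfo, ReceiveSlot[]). All concrete values here
// are hypothetical placeholders, not real JMP policy names.
function buildMediaRequest(policy, policySpecificInfo, receiveSlots) {
  return { policy, policySpecificInfo, receiveSlots };
}

// e.g. ask for the 3 most active speakers on 3 main-audio receive slots
const request = buildMediaRequest(
  'active-speaker',            // Policy (illustrative value)
  { priority: 255 },           // PolicySpecificInfo (illustrative value)
  ['mid-4', 'mid-5', 'mid-6']  // ReceiveSlot IDs (MIDs)
);
console.log(request.receiveSlots.length); // 3
```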
@@ -162,13 +173,15 @@ requestMedia(MediaType.VideoMain, [
  ]);
  ```

- ## Utilities
+ ## Utilities and Other Exports

  WCME also defines several utility files with exported functions that may be useful in both WCME and in other code:

  - `sdp-utils`: Contains functions that munge and manipulate SDP offer/answer descriptions.
  - `ua-utils`: Contains functions to help identify the current user agent (i.e. browser name and version).

+ In addition to WCME exports, all necessary exports from `webrtc-core` and `json-multistream` are also re-exported for convenience, so they can be used in other code without importing from those repositories separately.
+
  ## Logging

  Logging is done through the `js-logger` library, and the `Logger` class is exported to be used with other repositories. This can be done by importing `Logger` and setting a handler on it.
@@ -181,3 +194,5 @@ Logger.setHandler((msgs, context) => {
  console.log(context.name, msgs);
  });
  ```
+
+ Loggers from `webrtc-core` and `json-multistream` are also re-exported if you want to handle logs from those repositories separately.
package/dist/cjs/index.js CHANGED
@@ -6556,15 +6556,15 @@ class JmpSession extends events$3.EventEmitter {
  }
  receive(jmpMsg) {
  if (jmpMsg.mediaContent !== this.mediaContent || jmpMsg.mediaFamily !== this.mediaFamily) {
- this.logger.error(`JmpMsg ${jmpMsg} sent to incorrect JmpSession`);
+ this.logger.error(`JmpMsg ${JSON.stringify(jmpMsg)} sent to incorrect JmpSession`);
  return;
  }
- this.logger.debug(`Received JmpMsg`, jmpMsg);
+ this.logger.debug(`Received JmpMsg`, JSON.stringify(jmpMsg));
  const { payload } = jmpMsg;
  if (payload.msgType === JmpMsgType.MediaRequest) {
  const mediaRequestMsg = payload.payload;
  if (!isValidMediaRequestMsg(mediaRequestMsg)) {
- this.logger.error(`Received invalid MediaRequest:`, mediaRequestMsg);
+ this.logger.error(`Received invalid MediaRequest:`, JSON.stringify(mediaRequestMsg));
  return;
  }
  this.handleIncomingMediaRequest(mediaRequestMsg);
@@ -6572,7 +6572,7 @@ class JmpSession extends events$3.EventEmitter {
  else if (payload.msgType === JmpMsgType.MediaRequestAck) {
  const mediaRequestAckMsg = payload.payload;
  if (!isValidMediaRequestAckMsg(mediaRequestAckMsg)) {
- this.logger.error(`Received invalid MediaRequest ACK:`, mediaRequestAckMsg);
+ this.logger.error(`Received invalid MediaRequest ACK:`, JSON.stringify(mediaRequestAckMsg));
  return;
  }
  this.handleIncomingMediaRequestAck(mediaRequestAckMsg);
@@ -6580,7 +6580,7 @@ class JmpSession extends events$3.EventEmitter {
  else if (payload.msgType === JmpMsgType.ActiveSpeakerNotification) {
  const activeSpeakerNotification = payload.payload;
  if (!isValidActiveSpeakerNotificationMsg(activeSpeakerNotification)) {
- this.logger.info(`Received invalid Active Speaker Notification:`, activeSpeakerNotification);
+ this.logger.info(`Received invalid Active Speaker Notification:`, JSON.stringify(activeSpeakerNotification));
  return;
  }
  this.handleIncomingActiveSpeakerNotification(activeSpeakerNotification);
@@ -6588,7 +6588,7 @@ class JmpSession extends events$3.EventEmitter {
  else if (payload.msgType === JmpMsgType.SourceAdvertisement) {
  const sourceAdvertisement = payload.payload;
  if (!isValidSourceAdvertisementMsg(sourceAdvertisement)) {
- this.logger.error(`Received invalid SourceAdvertisementMsg: `, sourceAdvertisement);
+ this.logger.error(`Received invalid SourceAdvertisementMsg:`, JSON.stringify(sourceAdvertisement));
  return;
  }
  this.handleIncomingSourceAdvertisement(sourceAdvertisement);
@@ -6596,7 +6596,7 @@ class JmpSession extends events$3.EventEmitter {
  else if (payload.msgType === JmpMsgType.SourceAdvertisementAck) {
  const sourceAdvertisementAck = payload.payload;
  if (!isValidSourceAdvertisementAckMsg(sourceAdvertisementAck)) {
- this.logger.error(`Received invalid SourceAdvertisementAckMsg: `, sourceAdvertisementAck);
+ this.logger.error(`Received invalid SourceAdvertisementAckMsg:`, JSON.stringify(sourceAdvertisementAck));
  return;
  }
  this.handleIncomingSourceAdvertisementAck(sourceAdvertisementAck);
@@ -6604,7 +6604,7 @@ class JmpSession extends events$3.EventEmitter {
  else if (payload.msgType === JmpMsgType.MediaRequestStatus) {
  const mediaRequestStatus = payload.payload;
  if (!isValidMediaRequestStatusMsg(mediaRequestStatus)) {
- this.logger.error(`Received invalid MediaRequestStatusMsg: `, mediaRequestStatus);
+ this.logger.error(`Received invalid MediaRequestStatusMsg:`, JSON.stringify(mediaRequestStatus));
  return;
  }
  this.handleIncomingMediaRequestStatus(mediaRequestStatus);
@@ -6612,7 +6612,7 @@ class JmpSession extends events$3.EventEmitter {
  else if (payload.msgType === JmpMsgType.MediaRequestStatusAck) {
  const mediaRequestStatusAck = payload.payload;
  if (!isValidMediaRequestStatusAckMsg(mediaRequestStatusAck)) {
- this.logger.error(`Received invalid MediaRequestStatusAckMsg: `, mediaRequestStatusAck);
+ this.logger.error(`Received invalid MediaRequestStatusAckMsg:`, JSON.stringify(mediaRequestStatusAck));
  return;
  }
  this.handleIncomingMediaRequestStatusAck(mediaRequestStatusAck);
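The common thread in the hunks above is wrapping logged objects in `JSON.stringify`. The reason: interpolating a plain object into a template literal invokes its default `toString()`, which produces an unhelpful `[object Object]`. A small demonstration (the `jmpMsg` object here is a mock, not a real JMP message):

```javascript
// Why these log lines switched to JSON.stringify: interpolating an object
// into a template literal calls its toString(), yielding "[object Object]".
const jmpMsg = { mediaContent: 'main', mediaFamily: 'audio' }; // mock message

const before = `JmpMsg ${jmpMsg} sent to incorrect JmpSession`;
const after = `JmpMsg ${JSON.stringify(jmpMsg)} sent to incorrect JmpSession`;

console.log(before); // JmpMsg [object Object] sent to incorrect JmpSession
console.log(after);  // JmpMsg {"mediaContent":"main","mediaFamily":"audio"} sent to incorrect JmpSession
```

Stringifying also matters for log handlers that persist or ship messages as text, where a live object reference would otherwise be lost.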
@@ -6942,6 +6942,17 @@ var map2obj = function (report) {
  });
  return o;
  };
+ var dumpStream = function (stream) { return ({
+ id: stream.id,
+ tracks: stream.getTracks().map(function (track) { return ({
+ id: track.id,
+ kind: track.kind,
+ label: track.label,
+ enabled: track.enabled,
+ muted: track.muted,
+ readyState: track.readyState
+ }); })
+ }); };
  var persistedKeys = ['type', 'id', 'timestamp'];
  /**
  * Check to see if the report consists of more than just the persisted metadata.
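The `dumpStream` helper added above reduces a stream to a loggable summary. Exercised here against mock objects (real `MediaStream`/`MediaStreamTrack` instances only exist in a browser):

```javascript
// The dumpStream helper from the hunk above, run against a mock stream so it
// can execute outside a browser.
var dumpStream = function (stream) { return ({
  id: stream.id,
  tracks: stream.getTracks().map(function (track) { return ({
    id: track.id,
    kind: track.kind,
    label: track.label,
    enabled: track.enabled,
    muted: track.muted,
    readyState: track.readyState
  }); })
}); };

// Mock objects standing in for browser MediaStream/MediaStreamTrack.
var mockTrack = { id: 't1', kind: 'audio', label: 'mock mic', enabled: true, muted: false, readyState: 'live' };
var mockStream = { id: 's1', getTracks: function () { return [mockTrack]; } };

console.log(JSON.stringify(dumpStream(mockStream)));
// {"id":"s1","tracks":[{"id":"t1","kind":"audio","label":"mock mic","enabled":true,"muted":false,"readyState":"live"}]}
```

Keeping only these scalar fields makes getUserMedia/getDisplayMedia trace events compact and serializable.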
@@ -7036,6 +7047,7 @@ var rtcStats = function (pc, logger, intervalTime, statsPreProcessor) {
  var trace = function (name, payload, timestamp) {
  logger({ timestamp: timestamp ? Math.round(timestamp) : Date.now(), name: name, payload: payload });
  };
+ var origPeerConnection = window.RTCPeerConnection;
  pc.addEventListener('icecandidate', function (e) {
  if (e.candidate) {
  trace('onicecandidate', makeEvent(JSON.stringify(e.candidate)));
@@ -7068,6 +7080,139 @@ var rtcStats = function (pc, logger, intervalTime, statsPreProcessor) {
  pc.addEventListener('datachannel', function (event) {
  trace('ondatachannel', makeEvent("".concat(event.channel.id, ": ").concat(event.channel.label)));
  });
+ ['createDataChannel', 'close'].forEach(function (method) {
+ var nativeMethod = origPeerConnection.prototype[method];
+ if (nativeMethod) {
+ origPeerConnection.prototype[method] = function () {
+ trace(method, makeEvent(method));
+ return nativeMethod.apply(this, arguments);
+ };
+ }
+ });
+ ['addStream', 'removeStream'].forEach(function (method) {
+ var nativeMethod = origPeerConnection.prototype[method];
+ if (nativeMethod) {
+ origPeerConnection.prototype[method] = function () {
+ var stream = arguments[0];
+ var streamInfo = stream
+ .getTracks()
+ .map(function (t) { return "".concat(t.kind, ":").concat(t.id); })
+ .join(',');
+ trace(method, makeEvent("".concat(stream.id, " ").concat(streamInfo)));
+ return nativeMethod.apply(this, arguments);
+ };
+ }
+ });
+ ['addTrack'].forEach(function (method) {
+ var nativeMethod = origPeerConnection.prototype[method];
+ if (nativeMethod) {
+ origPeerConnection.prototype[method] = function () {
+ var track = arguments[0];
+ var streams = [].slice.call(arguments, 1);
+ trace(method, makeEvent("".concat(track.kind, ":").concat(track.id, " ").concat(streams.map(function (s) { return "stream:".concat(s.id); }).join(';') || '-')));
+ return nativeMethod.apply(this, arguments);
+ };
+ }
+ });
+ ['removeTrack'].forEach(function (method) {
+ var nativeMethod = origPeerConnection.prototype[method];
+ if (nativeMethod) {
+ origPeerConnection.prototype[method] = function () {
+ var track = arguments[0].track;
+ trace(method, makeEvent(track ? "".concat(track.kind, ":").concat(track.id) : 'null'));
+ return nativeMethod.apply(this, arguments);
+ };
+ }
+ });
+ ['createOffer', 'createAnswer'].forEach(function (method) {
+ var nativeMethod = origPeerConnection.prototype[method];
+ if (nativeMethod) {
+ origPeerConnection.prototype[method] = function () {
+ var opts;
+ var args = arguments;
+ if (arguments.length === 1 && typeof arguments[0] === 'object') {
+ // eslint-disable-next-line prefer-destructuring
+ opts = arguments[0];
+ }
+ else if (arguments.length === 3 && typeof arguments[2] === 'object') {
+ // eslint-disable-next-line prefer-destructuring
+ opts = arguments[2];
+ }
+ trace(method, makeEvent(opts || ''));
+ return nativeMethod.apply(this, opts ? [opts] : undefined).then(function (description) {
+ trace("".concat(method, "OnSuccess"), makeEvent(description.sdp));
+ if (args.length > 0 && typeof args[0] === 'function') {
+ args[0].apply(null, [description]);
+ return undefined;
+ }
+ return description;
+ }, function (err) {
+ trace("".concat(method, "OnFailure"), makeEvent(err.toString()));
+ if (args.length > 1 && typeof args[1] === 'function') {
+ args[1].apply(null, [err]);
+ return;
+ }
+ throw err;
+ });
+ };
+ }
+ });
+ ['setLocalDescription', 'setRemoteDescription', 'addIceCandidate'].forEach(function (method) {
+ var nativeMethod = origPeerConnection.prototype[method];
+ if (nativeMethod) {
+ origPeerConnection.prototype[method] = function () {
+ var args = arguments;
+ trace(method, makeEvent(method === 'addIceCandidate' ? arguments[0] : arguments[0].sdp));
+ return nativeMethod.apply(this, [arguments[0]]).then(function () {
+ trace("".concat(method, "OnSuccess"), makeEvent('success'));
+ if (args.length >= 2 && typeof args[1] === 'function') {
+ args[1].apply(null, []);
+ return undefined;
+ }
+ return undefined;
+ }, function (err) {
+ trace("".concat(method, "OnFailure"), makeEvent(err.toString()));
+ if (args.length >= 3 && typeof args[2] === 'function') {
+ args[2].apply(null, [err]);
+ return undefined;
+ }
+ throw err;
+ });
+ };
+ }
+ });
+ if (navigator.mediaDevices && navigator.mediaDevices.getUserMedia) {
+ var origGetUserMedia_1 = navigator.mediaDevices.getUserMedia.bind(navigator.mediaDevices);
+ var gum = function () {
+ trace('navigator.mediaDevices.getUserMedia', makeEvent(JSON.stringify(arguments[0])));
+ return origGetUserMedia_1
+ .apply(navigator.mediaDevices, arguments)
+ .then(function (stream) {
+ trace('navigator.mediaDevices.getUserMediaOnSuccess', makeEvent(JSON.stringify(dumpStream(stream))));
+ return stream;
+ }, function (err) {
+ trace('navigator.mediaDevices.getUserMediaOnFailure', makeEvent(err.name));
+ return Promise.reject(err);
+ });
+ };
+ navigator.mediaDevices.getUserMedia = gum.bind(navigator.mediaDevices);
+ }
+ if (navigator.mediaDevices && navigator.mediaDevices.getDisplayMedia) {
+ var origGetDisplayMedia_1 = navigator.mediaDevices.getDisplayMedia.bind(navigator.mediaDevices);
+ var gdm = function () {
+ trace('navigator.mediaDevices.getDisplayMedia', makeEvent(JSON.stringify(arguments[0])));
+ return origGetDisplayMedia_1
+ .apply(navigator.mediaDevices, arguments)
+ .then(function (stream) {
+ trace('navigator.mediaDevices.getDisplayMediaOnSuccess', makeEvent(JSON.stringify(dumpStream(stream))));
+ return stream;
+ }, function (err) {
+ trace('navigator.mediaDevices.getDisplayMediaOnFailure', makeEvent(err.name));
+ return Promise.reject(err);
+ });
+ };
+ navigator.mediaDevices.getDisplayMedia = gdm.bind(navigator.mediaDevices);
+ }
  var interval = window.setInterval(function () {
  if (pc.signalingState === 'closed') {
  window.clearInterval(interval);
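The instrumentation added above works by replacing prototype methods with wrappers that record a trace event and then delegate to the captured native implementation. The same pattern, reduced to a runnable sketch with a mock class (since `RTCPeerConnection` is browser-only; `MockPeerConnection` and the `trace` sink here are hypothetical):

```javascript
// Sketch of the prototype-wrapping trace pattern used above, applied to a
// mock class instead of the browser's RTCPeerConnection.
const events = [];
const trace = (name, payload) => events.push({ name, payload });

class MockPeerConnection {
  createDataChannel(label) {
    return { label }; // stand-in for a real RTCDataChannel
  }
}

['createDataChannel'].forEach(function (method) {
  const nativeMethod = MockPeerConnection.prototype[method];
  if (nativeMethod) {
    MockPeerConnection.prototype[method] = function () {
      trace(method, method); // record the call before delegating
      return nativeMethod.apply(this, arguments); // preserve behavior
    };
  }
});

const pc = new MockPeerConnection();
const channel = pc.createDataChannel('stats');
console.log(channel.label, events[0].name); // stats createDataChannel
```

Note that the real code patches `window.RTCPeerConnection.prototype`, so every connection on the page is traced, not just the `pc` passed to `rtcStats`.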
@@ -14010,7 +14155,7 @@ class MultistreamConnection extends EventEmitter$2 {
  logger.error(`Error parsing datachannel JSON: ${err}`);
  return;
  }
- logger.debug('DataChannel got msg: ', parsed);
+ logger.debug('DataChannel got msg:', e.data);
  const homerMsg = HomerMsg.fromJson(parsed);
  if (!homerMsg) {
  logger.error(`Received invalid datachannel message: ${e}`);
@@ -14018,7 +14163,7 @@ class MultistreamConnection extends EventEmitter$2 {
  }
  const jmpMsg = homerMsg.payload;
  if (!isValidJmpMsg(jmpMsg)) {
- logger.error(`Received invalid JMP msg: ${jmpMsg}`);
+ logger.error(`Received invalid JMP msg: ${JSON.stringify(jmpMsg)}`);
  return;
  }
  const mediaType = getMediaType(jmpMsg.mediaFamily, jmpMsg.mediaContent);