com.adrenak.univoice 4.7.0 → 4.8.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (24)
  1. package/README.md +37 -25
  2. package/Runtime/Adrenak.UniVoice.Runtime.asmdef +2 -1
  3. package/Runtime/Impl/Networks/FishNet/FishNetBroadcast.cs +22 -0
  4. package/Runtime/Impl/Networks/FishNet/FishNetBroadcast.cs.meta +2 -0
  5. package/Runtime/Impl/Networks/FishNet/FishNetBroadcastTags.cs +15 -0
  6. package/Runtime/Impl/Networks/FishNet/FishNetBroadcastTags.cs.meta +2 -0
  7. package/Runtime/Impl/Networks/FishNet/FishNetClient.cs +197 -0
  8. package/Runtime/Impl/Networks/FishNet/FishNetClient.cs.meta +2 -0
  9. package/Runtime/Impl/Networks/FishNet/FishNetServer.cs +236 -0
  10. package/Runtime/Impl/Networks/FishNet/FishNetServer.cs.meta +2 -0
  11. package/Runtime/Impl/Networks/FishNet.meta +3 -0
  12. package/Runtime/Impl/Networks/Mirror/MirrorClient.cs +1 -1
  13. package/Runtime/Impl/Networks/Mirror/MirrorMessage.cs +1 -1
  14. package/Runtime/Impl/Networks/Mirror/MirrorMessageTags.cs +1 -1
  15. package/Runtime/Impl/Networks/Mirror/MirrorModeObserver.cs +1 -1
  16. package/Runtime/Impl/Networks/Mirror/MirrorServer.cs +1 -1
  17. package/Samples~/Basic Setup Scripts/FishNet-SinglePrefabObjects.asset +15 -0
  18. package/Samples~/Basic Setup Scripts/FishNet-SinglePrefabObjects.asset.meta +8 -0
  19. package/Samples~/Basic Setup Scripts/UniVoiceFishNetSetupSample.cs +199 -0
  20. package/Samples~/Basic Setup Scripts/UniVoiceFishNetSetupSample.cs.meta +11 -0
  21. package/Samples~/Basic Setup Scripts/UniVoiceFishNetSetupSample.unity +335 -0
  22. package/Samples~/Basic Setup Scripts/UniVoiceFishNetSetupSample.unity.meta +7 -0
  23. package/Samples~/Basic Setup Scripts/UniVoiceMirrorSetupSample.cs +2 -2
  24. package/package.json +1 -1
package/README.md CHANGED
@@ -1,25 +1,13 @@
1
1
  # UniVoice
2
2
  UniVoice is a voice chat/VoIP solution for Unity.
3
3
 
4
- Some features of UniVoice:
5
- - 👥 Group voice chat. Multiple peers can join a chatroom and exchange audio.
6
-
7
- - ⚙ Fine control over audio data flow.
8
- * Don't want to listen to a peer? Mute them. Don't want someone listening to you? Deafen them.
9
- * Group players using tags and control audio flow between them. For example:
10
- - "red", "blue" and "spectator" tags for two teams playing against each other.
11
- - Red and Blue teams can only hear each other
12
- - Spectators can hear everyone
13
- - clients with "contestant", "judge" and "audience" tags for a virtual talent show.
14
- - Contestant can be heard by everyone, but don't hear anyone else (for focus)
15
- - Judges can talk to and hear each other for discussions. They can hear the contestant. But not the audience (for less noise)
16
- - Audience can hear and talk to each other. They can hear the performer. But they cannot hear the judges.
17
-
4
+ Some features of UniVoice:
18
5
  - 🎨 Customize your audio input, output and networking layers.
19
6
  * 🌐 __Configurable Network__:
20
7
  - UniVoice is networking agnostic. Implement the `IAudioClient` and `IAudioServer` interfaces using the networking plugin of your choice to have it send audio data over any networking solution.
21
8
  - Built-in support for:
22
- - Mirror networking
9
+ - [Mirror networking](https://mirror-networking.com/)
10
+ - [Fish Networking](https://fish-networking.gitbook.io/docs)
23
11
 
24
12
  * 🎤 __Configurable Audio Input__:
25
13
  - UniVoice is audio input agnostic. You can change the source of outgoing audio by implementing the `IAudioInput` interface.
@@ -37,6 +25,22 @@ Some features of UniVoice:
37
25
  - Opus (Concentus) encoding & decoding.
38
26
  - RNNoise based noise removal.
39
27
  - Gaussian blurring for minor denoising.
28
+
29
+ - 👥 Easy integration with your existing networking solution
30
+ - Whether you're using Mirror or FishNet, UniVoice runs in the background, in sync with your networking lifecycle.
31
+ - A basic integration involves just initializing it on start.
32
+ - For advanced usage such as teams, chatrooms, and lobbies, you can use the UniVoice API to control behaviour at runtime.
33
+
34
+ - ⚙ Fine control over audio data flow.
35
+ * Don't want to listen to a peer? Mute them. Don't want someone listening to you? Deafen them.
36
+ * Group players using tags and control audio flow between them (a short code sketch follows this list). For example:
37
+ - "red", "blue" and "spectator" tags for two teams playing against each other.
38
+ - Red and Blue teams can only hear each other
39
+ - Spectators can hear everyone
40
+ - clients with "contestant", "judge" and "audience" tags for a virtual talent show.
41
+ - Contestants can be heard by everyone but don't hear anyone else (for focus)
42
+ - Judges can talk to and hear each other for discussions. They can hear the contestants, but not the audience (for less noise)
43
+ - Audience members can hear and talk to each other. They can hear the contestants, but they cannot hear the judges.
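As a rough illustration of the tag-based flow control described in this list, the sketch below shows how a client might tag itself and mute a tag using the `VoiceSettings` members (`myTags`, `mutedTags`) and the `SubmitVoiceSettings()` call serialized by the FishNet client added in this release (see `FishNetClient.cs` later in this diff). The helper class name is illustrative and the sketch assumes these members are plain mutable lists, as the serialization code suggests; it is not part of the package documentation.

```csharp
#if FISHNET
using Adrenak.UniVoice.Networks;

// Hypothetical helper, not shipped with UniVoice.
public static class TeamVoiceSetup {
    // Tag the local client as "red" and stop listening to spectators,
    // matching the red/blue/spectator example above.
    public static void ConfigureRedTeam(FishNetClient client) {
        var settings = client.YourVoiceSettings;

        settings.myTags.Add("red");          // advertise this client's group
        settings.mutedTags.Add("spectator"); // don't hear anyone tagged "spectator"

        // Push the updated settings to the server so it can filter audio routing.
        client.SubmitVoiceSettings();
    }
}
#endif
```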
40
44
 
41
45
  ## Installation
42
46
  ⚠️ [OpenUPM](https://openupm.com/packages/com.adrenak.univoice/?subPage=versions) may not have up to date releases. Install using NPM registry instead 👇
@@ -62,19 +66,18 @@ Then add `com.adrenak.univoice:x.y.z` to the `dependencies` in your `manifest.js
62
66
  ## Useful links
63
67
  * API reference is available here: http://www.vatsalambastha.com/univoice
64
68
  * UniVoice blog: https://blog.vatsalambastha.com/search/label/univoice
65
- * Discord server: https://discord.gg/Un6Y2sQqqe
69
+ * Discord server: https://discord.gg/NGvkEVbdjQ
66
70
 
67
71
  ## Integration
68
72
  UniVoice isn't currently very drag-and-drop/low-code. The best way to integrate is to have some code perform a one-time setup when your app starts and provide access to relevant objects that you can use throughout the rest of the app's runtime.
69
73
 
70
- An example of this is the `UniVoiceMirrorSetupSample.cs` file that gives you access to an AudioServer that you can use in your server code and a ClientSession that you can use in your client code. For more see the "Samples" section below.
71
-
72
74
  ## Samples
73
75
  This repository contains two samples:
74
- * `UniVoiceMirrorSetupSample.cs` is a drag and drop component, a simple integration sample script. You can add it to your Mirror NetworkManager to get voice chat to work. No code required, it's as simple as that! It'll work as long as you have setup your project properly. For more instructions see the top of the `UniVoiceMirrorSetupSample.cs` file.
75
- * A sample scene that shows the other clients in a UI as well as allows you to mute yourself/them. This sample is also Mirror based.
76
+ * `UniVoiceMirrorSetupSample.cs` is a drag-and-drop component and a simple integration sample script. You can add it to your Mirror NetworkManager to get voice chat to work. No code required, it's as simple as that! It'll work as long as you have set up your project properly. For more instructions see the top of the `UniVoiceMirrorSetupSample.cs` file.
77
+ * `UniVoiceFishNetSetupSample.cs` works the same way. Just drag and drop it and it should work!
78
+ * A sample scene that shows the other clients in a UI and allows you to mute yourself or them. This sample is Mirror-based.
76
79
 
77
- > UniVoice currently only supports Mirror out of the box. All the samples rely on Mirror networking to work. Follow the instructions in the "Activating non-packaged dependencies" section below for enabling Mirror in your project before trying it out.
80
+ > UniVoice currently only supports Mirror and Fish-Networking out of the box. Follow the instructions in the "Activating non-packaged dependencies" section below before trying out the samples.
78
81
 
79
82
  ## Dependencies
80
83
  [com.adrenak.brw](https://www.github.com/adrenak/brw) for reading and writing messages for communication. See `MirrorServer.cs` and `MirrorClient.cs` where they're used.
@@ -90,19 +93,28 @@ UniVoice includes and installs the dependencies mentioned above along with itsel
90
93
  * Mic audio capture input (via UniMic)
91
94
  * AudioSource based playback output (via UniMic)
92
95
 
93
- But the following implementations are based on dependencies that you have to install and enable via compilation symbols as they are _not_ UniVoice dependencies and _don't_ get installed along with UniVoice. This is because they are either third party modules or based on native libraries (not plain C#) that can pose build issues.
94
- * Mirror network:
95
- * To enable, ensure the Mirror package is in your project and add `UNIVOICE_NETWORK_MIRROR` to activate it
96
+ UniVoice also contains code that relies on dependencies you have to install yourself and, in some cases, enable via compilation symbols. These are _not_ UniVoice dependencies and _don't_ get installed along with UniVoice, because they are either third-party modules or based on native libraries (not plain C#) that can pose build issues.
96
97
  * RNNoise Noise removal filter:
97
98
  * To enable, ensure the [RNNoise4Unity](https://github.com/adrenak/RNNoise4Unity) package is in your project and add `UNIVOICE_FILTER_RNNOISE4UNITY` to activate it
99
+ * Mirror network:
100
+ * Just add the Mirror package to your project. UniVoice will detect it.
101
+ * Fish Networking:
102
+ * Just install the FishNet package in your project. UniVoice will detect it.
98
103
 
99
104
  ## License and Support
100
105
  This project is under the [MIT license](https://github.com/adrenak/univoice/blob/master/LICENSE).
101
106
 
102
107
  Community contributions are welcome.
108
+
109
+ Commercial engagements with the author can be arranged, subject to schedule and availability.
110
+
111
+ ## Acknowledgements and contributors
112
+ * [@metater](https://github.com/Metater/) for helping make improvements to audio streaming quality. [A related blog post](https://blog.vatsalambastha.com/2025/07/unimic-330-many-streamedaudiosource.html)
113
+ * [@FrantisekHolubec](https://github.com/FrantisekHolubec) for [FishNet support code](https://github.com/adrenak/univoice/commit/fdc3424180d8991c92b3e092b3edb50b6110c863). Here's a [related blog post](https://blog.vatsalambastha.com/2025/09/univoice-480-fishnet-support.html)
114
+ * [Masaryk University](https://www.muni.cz/en) for using UniVoice in their projects and providing helpful feedback
103
115
 
104
116
  ## Contact
105
- The developer can be reached at the following links:
117
+ The author can be reached at the following links:
106
118
 
107
119
  [Website](http://www.vatsalambastha.com)
108
120
  [LinkedIn](https://www.linkedin.com/in/vatsalAmbastha)
@@ -6,7 +6,8 @@
6
6
  "GUID:f87ecb857e752164ab814a3de8eb0262",
7
7
  "GUID:b118fd5a40c85ad4e9b38e8c4a42bbb1",
8
8
  "GUID:4653938bfdb5cf8409322ce17219d5f7",
9
- "GUID:30817c1a0e6d646d99c048fc403f5979"
9
+ "GUID:30817c1a0e6d646d99c048fc403f5979",
10
+ "GUID:7c88a4a7926ee5145ad2dfa06f454c67"
10
11
  ],
11
12
  "includePlatforms": [],
12
13
  "excludePlatforms": [],
@@ -0,0 +1,22 @@
1
+ #if FISHNET
2
+ using System;
3
+ using FishNet.Broadcast;
4
+
5
+ namespace Adrenak.UniVoice.Networks
6
+ {
7
+ /// <summary>
8
+ /// The messages exchanged between the server and client.
9
+ /// To see how the FishNet implementation of UniVoice uses this struct
10
+ /// find the references to the <see cref="data"/> object in the project.
11
+ /// The gist is, it uses BRW (https://www.github.com/adrenak/brw) to
12
+ /// write and read data. The data always starts with a tag. All the tags
13
+ /// used for this UniVoice FishNet implementation are available in
14
+ /// <see cref="FishNetBroadcastTags"/>
15
+ /// </summary>
16
+ [Serializable]
17
+ public struct FishNetBroadcast : IBroadcast
18
+ {
19
+ public byte[] data;
20
+ }
21
+ }
22
+ #endif
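To make the tag-prefixed layout described in the comment above concrete, here is a hedged sketch of a round trip with BRW's `BytesWriter`/`BytesReader`, using the same calls that `FishNetClient` and `FishNetServer` use later in this diff. The payload is deliberately simplified; the real `AUDIO_FRAME` message also carries a timestamp, frequency, and channel count. The class and method names are illustrative, not part of the package.

```csharp
#if FISHNET
using Adrenak.BRW;
using Adrenak.UniVoice.Networks;
using UnityEngine;

// Illustrative only, not part of the package.
public static class BroadcastRoundTrip {
    public static FishNetBroadcast Pack(int senderId, byte[] samples) {
        var writer = new BytesWriter();
        writer.WriteString(FishNetBroadcastTags.AUDIO_FRAME); // payload always starts with a tag
        writer.WriteInt(senderId);
        writer.WriteByteArray(samples);
        return new FishNetBroadcast { data = writer.Bytes };
    }

    public static void Unpack(FishNetBroadcast msg) {
        var reader = new BytesReader(msg.data);
        var tag = reader.ReadString();            // read the tag first...
        if (tag == FishNetBroadcastTags.AUDIO_FRAME) {
            var senderId = reader.ReadInt();      // ...then the fields, in write order
            var samples = reader.ReadByteArray();
            Debug.Log($"Got {samples.Length} bytes from {senderId}");
        }
    }
}
#endif
```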
@@ -0,0 +1,2 @@
1
+ fileFormatVersion: 2
2
+ guid: f0e5aa417fb8e6249bb05a011a5d7edf
@@ -0,0 +1,15 @@
1
+ #if FISHNET
2
+ namespace Adrenak.UniVoice.Networks
3
+ {
4
+ /// <summary>
5
+ /// The different types of messages we send over FishNet
6
+ /// to implement the <see cref="IAudioClient{T}"/> and <see cref="IAudioServer{T}"/>
7
+ /// interfaces for FishNet
8
+ /// </summary>
9
+ public class FishNetBroadcastTags
10
+ {
11
+ public const string AUDIO_FRAME = "AUDIO_FRAME";
12
+ public const string VOICE_SETTINGS = "VOICE_SETTINGS";
13
+ }
14
+ }
15
+ #endif
@@ -0,0 +1,2 @@
1
+ fileFormatVersion: 2
2
+ guid: fa1db30850448a14b840e4974d687a3b
@@ -0,0 +1,197 @@
1
+ #if FISHNET
2
+ using System;
3
+ using System.Collections.Generic;
4
+ using System.Linq;
5
+ using Adrenak.BRW;
6
+ using FishNet;
7
+ using FishNet.Managing;
8
+ using FishNet.Transporting;
9
+ using UnityEngine;
10
+
11
+ namespace Adrenak.UniVoice.Networks
12
+ {
13
+ /// <summary>
14
+ /// This is the implementation of <see cref="IAudioClient{T}"/> interface for FishNet.
15
+ /// It uses FishNet to send and receive UniVoice data to and from the server.
16
+ /// </summary>
17
+ public class FishNetClient : IAudioClient<int>
18
+ {
19
+ private const string TAG = "[FishNetClient]";
20
+ public int ID { get; private set; } = -1;
21
+
22
+ public List<int> PeerIDs { get; private set; }
23
+ public VoiceSettings YourVoiceSettings { get; private set; }
24
+
25
+ public event Action<int, List<int>> OnJoined;
26
+ public event Action OnLeft;
27
+ public event Action<int> OnPeerJoined;
28
+ public event Action<int> OnPeerLeft;
29
+ public event Action<int, AudioFrame> OnReceivedPeerAudioFrame;
30
+
31
+ private NetworkManager _networkManager;
32
+
33
+ public FishNetClient()
34
+ {
35
+ PeerIDs = new List<int>();
36
+ YourVoiceSettings = new VoiceSettings();
37
+
38
+ _networkManager = InstanceFinder.NetworkManager;
39
+ _networkManager.ClientManager.OnClientConnectionState += OnClientConnectionStateChanged;
40
+ _networkManager.ClientManager.OnAuthenticated += OnClientAuthenticated;
41
+ _networkManager.ClientManager.OnRemoteConnectionState += OnRemoteConnectionStateChanged;
42
+ _networkManager.ClientManager.RegisterBroadcast<FishNetBroadcast>(OnReceivedMessage);
43
+ }
44
+
45
+ public void Dispose()
46
+ {
47
+ if (_networkManager)
48
+ {
49
+ _networkManager.ClientManager.OnClientConnectionState -= OnClientConnectionStateChanged;
50
+ _networkManager.ClientManager.OnAuthenticated -= OnClientAuthenticated;
51
+ _networkManager.ClientManager.OnRemoteConnectionState -= OnRemoteConnectionStateChanged;
52
+ _networkManager.ClientManager.UnregisterBroadcast<FishNetBroadcast>(OnReceivedMessage);
53
+ }
54
+ PeerIDs.Clear();
55
+ }
56
+
57
+ private void OnRemoteConnectionStateChanged(RemoteConnectionStateArgs args)
58
+ {
59
+ // Don't process connection state changes before the client is authenticated
60
+ if (_networkManager.ClientManager.Connection.ClientId < 0)
61
+ return;
62
+
63
+ if (args.ConnectionState == RemoteConnectionState.Started)
64
+ {
65
+ var newPeerID = args.ConnectionId;
66
+ if (!PeerIDs.Contains(newPeerID))
67
+ {
68
+ PeerIDs.Add(newPeerID);
69
+ Debug.unityLogger.Log(LogType.Log, TAG,
70
+ $"Peer {newPeerID} joined. Peer list is now {string.Join(", ", PeerIDs)}");
71
+ OnPeerJoined?.Invoke(newPeerID);
72
+ }
73
+ }
74
+ else if (args.ConnectionState == RemoteConnectionState.Stopped)
75
+ {
76
+ var leftPeerID = args.ConnectionId;
77
+ if (PeerIDs.Contains(leftPeerID))
78
+ {
79
+ PeerIDs.Remove(leftPeerID);
80
+ var log2 = $"Peer {leftPeerID} left. ";
81
+ if (PeerIDs.Count == 0)
82
+ log2 += "There are no peers anymore.";
83
+ else
84
+ log2 += $"Peer list is now {string.Join(", ", PeerIDs)}";
85
+
86
+ Debug.unityLogger.Log(LogType.Log, TAG, log2);
87
+ OnPeerLeft?.Invoke(leftPeerID);
88
+ }
89
+ }
90
+ }
91
+
92
+ private void OnClientAuthenticated()
93
+ {
94
+ // We need to use OnClientAuthenticated to ensure the client does have ClientId set
95
+ ID = _networkManager.ClientManager.Connection.ClientId;
96
+ PeerIDs = _networkManager.ClientManager.Clients.Keys.Where(x => x != ID).ToList();
97
+
98
+ var log = $"Initialized with ID {ID}. ";
99
+ if (PeerIDs.Count > 0)
100
+ log += $"Peer list: {string.Join(", ", PeerIDs)}";
101
+ else
102
+ log += "There are currently no peers.";
103
+ Debug.unityLogger.Log(LogType.Log, TAG, log);
104
+
105
+ OnJoined?.Invoke(ID, PeerIDs);
106
+ foreach (var peerId in PeerIDs)
107
+ OnPeerJoined?.Invoke(peerId);
108
+ }
109
+
110
+ private void OnClientConnectionStateChanged(ClientConnectionStateArgs args)
111
+ {
112
+ // We check only for the stopped state here, as the started state is handled in OnClientAuthenticated
113
+ if (args.ConnectionState == LocalConnectionState.Stopped)
114
+ {
115
+ YourVoiceSettings = new VoiceSettings();
116
+ var oldPeerIds = PeerIDs.ToList();
117
+ PeerIDs.Clear();
118
+ ID = -1;
119
+ foreach (var peerId in oldPeerIds)
120
+ OnPeerLeft?.Invoke(peerId);
121
+ OnLeft?.Invoke();
122
+ }
123
+ }
124
+
125
+ private void OnReceivedMessage(FishNetBroadcast msg, Channel channel)
126
+ {
127
+ var reader = new BytesReader(msg.data);
128
+ var tag = reader.ReadString();
129
+ switch (tag)
130
+ {
131
+ // When the server sends audio from a peer meant for this client
132
+ case FishNetBroadcastTags.AUDIO_FRAME:
133
+ var sender = reader.ReadInt();
134
+ if (sender == ID || !PeerIDs.Contains(sender))
135
+ return;
136
+ var frame = new AudioFrame
137
+ {
138
+ timestamp = reader.ReadLong(),
139
+ frequency = reader.ReadInt(),
140
+ channelCount = reader.ReadInt(),
141
+ samples = reader.ReadByteArray()
142
+ };
143
+ OnReceivedPeerAudioFrame?.Invoke(sender, frame);
144
+ break;
145
+ }
146
+ }
147
+
148
+ /// <summary>
149
+ /// Sends an audio frame captured on this client to the server
150
+ /// </summary>
151
+ /// <param name="frame"></param>
152
+ public void SendAudioFrame(AudioFrame frame)
153
+ {
154
+ if (ID == -1)
155
+ return;
156
+ var writer = new BytesWriter();
157
+ writer.WriteString(FishNetBroadcastTags.AUDIO_FRAME);
158
+ writer.WriteInt(ID);
159
+ writer.WriteLong(frame.timestamp);
160
+ writer.WriteInt(frame.frequency);
161
+ writer.WriteInt(frame.channelCount);
162
+ writer.WriteByteArray(frame.samples);
163
+
164
+ var message = new FishNetBroadcast
165
+ {
166
+ data = writer.Bytes
167
+ };
168
+
169
+ if (_networkManager.ClientManager.Started)
170
+ _networkManager.ClientManager.Broadcast(message, Channel.Unreliable);
171
+ }
172
+
173
+ /// <summary>
174
+ /// Updates the server with the voice settings of this client
175
+ /// </summary>
176
+ public void SubmitVoiceSettings()
177
+ {
178
+ if (ID == -1)
179
+ return;
180
+ var writer = new BytesWriter();
181
+ writer.WriteString(FishNetBroadcastTags.VOICE_SETTINGS);
182
+ writer.WriteInt(YourVoiceSettings.muteAll ? 1 : 0);
183
+ writer.WriteIntArray(YourVoiceSettings.mutedPeers.ToArray());
184
+ writer.WriteInt(YourVoiceSettings.deafenAll ? 1 : 0);
185
+ writer.WriteIntArray(YourVoiceSettings.deafenedPeers.ToArray());
186
+ writer.WriteString(string.Join(",", YourVoiceSettings.myTags));
187
+ writer.WriteString(string.Join(",", YourVoiceSettings.mutedTags));
188
+ writer.WriteString(string.Join(",", YourVoiceSettings.deafenedTags));
189
+
190
+ var message = new FishNetBroadcast() {
191
+ data = writer.Bytes
192
+ };
193
+ _networkManager.ClientManager.Broadcast(message);
194
+ }
195
+ }
196
+ }
197
+ #endif
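A minimal sketch of the per-peer mute/deafen controls this client exposes: update `YourVoiceSettings` and call `SubmitVoiceSettings()` so the server (see `FishNetServer.cs` below) applies the new routing. The helper class is hypothetical, and the sketch assumes the peer lists are mutable, as the serialization code above suggests.

```csharp
#if FISHNET
using Adrenak.UniVoice.Networks;

// Illustrative helper, not part of the package.
public static class PeerVoiceControls {
    // Stop hearing a specific peer.
    public static void MutePeer(FishNetClient client, int peerId) {
        client.YourVoiceSettings.mutedPeers.Add(peerId);
        client.SubmitVoiceSettings();
    }

    // Stop a specific peer from hearing us.
    public static void DeafenPeer(FishNetClient client, int peerId) {
        client.YourVoiceSettings.deafenedPeers.Add(peerId);
        client.SubmitVoiceSettings();
    }
}
#endif
```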
@@ -0,0 +1,2 @@
1
+ fileFormatVersion: 2
2
+ guid: e49db0da30bc5fc479ca244d58f82481
@@ -0,0 +1,236 @@
1
+ #if FISHNET
2
+ using System;
3
+ using System.Linq;
4
+ using System.Collections.Generic;
5
+ using UnityEngine;
6
+ using Adrenak.BRW;
7
+ using FishNet;
8
+ using FishNet.Connection;
9
+ using FishNet.Managing;
10
+ using FishNet.Transporting;
11
+
12
+ namespace Adrenak.UniVoice.Networks
13
+ {
14
+ /// <summary>
15
+ /// This is an implementation of the <see cref="IAudioServer{T}"/> interface for FishNet.
16
+ /// It uses FishNet to send and receive UniVoice audio data to and from clients.
17
+ /// </summary>
18
+ public class FishNetServer : IAudioServer<int>
19
+ {
20
+ private const string TAG = "[FishNetServer]";
21
+
22
+ public event Action OnServerStart;
23
+ public event Action OnServerStop;
24
+ public event Action OnClientVoiceSettingsUpdated;
25
+
26
+ public List<int> ClientIDs { get; private set; }
27
+ public Dictionary<int, VoiceSettings> ClientVoiceSettings { get; private set; }
28
+
29
+ private NetworkManager _networkManager;
30
+ private List<int> _startedTransports = new();
31
+
32
+ public FishNetServer()
33
+ {
34
+ ClientIDs = new List<int>();
35
+ ClientVoiceSettings = new Dictionary<int, VoiceSettings>();
36
+
37
+ _networkManager = InstanceFinder.NetworkManager;
38
+ _networkManager.ServerManager.OnServerConnectionState += OnServerConnectionStateChanged;
39
+ _networkManager.ServerManager.OnRemoteConnectionState += OnServerRemoteConnectionStateChanged;
40
+ _networkManager.ClientManager.OnClientConnectionState += OnClientConnectionStateChanged;
41
+ _networkManager.ServerManager.RegisterBroadcast<FishNetBroadcast>(OnReceivedMessage, false);
42
+ }
43
+
44
+ public void Dispose()
45
+ {
46
+ if (_networkManager)
47
+ {
48
+ _networkManager.ServerManager.OnServerConnectionState -= OnServerConnectionStateChanged;
49
+ _networkManager.ServerManager.OnRemoteConnectionState -= OnServerRemoteConnectionStateChanged;
50
+ _networkManager.ClientManager.OnClientConnectionState -= OnClientConnectionStateChanged;
51
+ _networkManager.ServerManager.UnregisterBroadcast<FishNetBroadcast>(OnReceivedMessage);
52
+ }
53
+ OnServerShutdown();
54
+ }
55
+
56
+ private void OnServerStarted()
57
+ {
58
+ OnServerStart?.Invoke();
59
+ }
60
+
61
+ private void OnServerShutdown()
62
+ {
63
+ ClientIDs.Clear();
64
+ ClientVoiceSettings.Clear();
65
+ OnServerStop?.Invoke();
66
+ }
67
+
68
+ private void OnServerRemoteConnectionStateChanged(NetworkConnection connection, RemoteConnectionStateArgs args)
69
+ {
70
+ if (args.ConnectionState == RemoteConnectionState.Started)
71
+ {
72
+ OnServerConnected(connection.ClientId);
73
+ }
74
+ else if (args.ConnectionState == RemoteConnectionState.Stopped)
75
+ {
76
+ OnServerDisconnected(connection.ClientId);
77
+ }
78
+ }
79
+
80
+ private void OnServerConnectionStateChanged(ServerConnectionStateArgs args)
81
+ {
82
+ // Connection can change for each transport, so we need to track them
83
+ if (args.ConnectionState == LocalConnectionState.Started)
84
+ {
85
+ var wasStarted = _startedTransports.Count != 0;
86
+ _startedTransports.Add(args.TransportIndex);
87
+ if (!wasStarted)
88
+ OnServerStarted();
89
+ }
90
+ else if (args.ConnectionState == LocalConnectionState.Stopped)
91
+ {
92
+ _startedTransports.Remove(args.TransportIndex);
93
+ if(_startedTransports.Count == 0)
94
+ OnServerShutdown();
95
+ }
96
+ }
97
+
98
+ private void OnClientConnectionStateChanged(ClientConnectionStateArgs args)
99
+ {
100
+ // TODO - do we need to check if host or is this enough?
101
+ if (args.ConnectionState == LocalConnectionState.Started)
102
+ {
103
+ OnServerConnected(0);
104
+ }
105
+ else if (args.ConnectionState == LocalConnectionState.Stopped)
106
+ {
107
+ OnServerDisconnected(0);
108
+ }
109
+ }
110
+
111
+ private void OnReceivedMessage(NetworkConnection connection, FishNetBroadcast message, Channel channel)
112
+ {
113
+ var clientId = connection.ClientId;
114
+ var reader = new BytesReader(message.data);
115
+ var tag = reader.ReadString();
116
+
117
+ if (tag.Equals(FishNetBroadcastTags.AUDIO_FRAME))
118
+ {
119
+ // We start with all the peers except the one that's
120
+ // sent the audio
121
+ var peersToForwardAudioTo = ClientIDs
122
+ .Where(x => x != clientId);
123
+
124
+ // Check the voice settings of the sender and eliminate any peers the sender
125
+ // may have deafened
126
+ if (ClientVoiceSettings.TryGetValue(clientId, out var senderSettings))
127
+ {
128
+ // If the client sending the audio has deafened everyone,
129
+ // we simply return. Sender's audio should not be forwarded to anyone.
130
+ if (senderSettings.deafenAll)
131
+ return;
132
+
133
+ // Filter the recipient list by removing all peers that the sender has
134
+ // deafened using ID
135
+ peersToForwardAudioTo = peersToForwardAudioTo
136
+ .Where(x => !senderSettings.deafenedPeers.Contains(x));
137
+
138
+ // Further filter the recipient list by removing peers that the sender has
139
+ // deafened using tags
140
+ peersToForwardAudioTo = peersToForwardAudioTo.Where(peer =>
141
+ {
142
+ // Get the voice settings of the peer
143
+ if (ClientVoiceSettings.TryGetValue(peer, out var peerVoiceSettings))
144
+ {
145
+ // Check if sender has not deafened peer using tag
146
+ var hasDeafenedPeer = senderSettings.deafenedTags.Intersect(peerVoiceSettings.myTags).Any();
147
+ return !hasDeafenedPeer;
148
+ }
149
+ // If peer doesn't have voice settings, we can keep the peer in the list
150
+ return true;
151
+ });
152
+ }
153
+
154
+ // We iterate through each recipient peer that the sender wants to send audio to, checking if
155
+ // they have muted the sender, before forwarding the audio to them.
156
+ foreach (var recipient in peersToForwardAudioTo) {
157
+ // Get the settings of a potential recipient
158
+ if (ClientVoiceSettings.TryGetValue(recipient, out var recipientSettings)) {
159
+ // If a peer has muted everyone, don't send audio
160
+ if (recipientSettings.muteAll)
161
+ continue;
162
+
163
+ // If the peers has muted the sender using ID, skip sending audio
164
+ if (recipientSettings.mutedPeers.Contains(clientId))
165
+ continue;
166
+
167
+ // If the peer has muted the sender using tag, skip sending audio
168
+ if (recipientSettings.mutedTags.Intersect(senderSettings.myTags).Any())
169
+ continue;
170
+ }
171
+ SendToClient(recipient, message.data, Channel.Unreliable);
172
+ }
173
+ }
174
+ else if (tag.Equals(FishNetBroadcastTags.VOICE_SETTINGS)) {
175
+ //Debug.unityLogger.Log(LogType.Log, TAG, "FishNet server stopped");
176
+ // We create the VoiceSettings object by reading from the reader
177
+ // and update the peer voice settings map
178
+ var muteAll = reader.ReadInt() == 1;
179
+ var mutedPeers = reader.ReadIntArray().ToList();
180
+ var deafenAll = reader.ReadInt() == 1;
181
+ var deafenedPeers = reader.ReadIntArray().ToList();
182
+ var myTags = reader.ReadString().Split(",").ToList();
183
+ var mutedTags = reader.ReadString().Split(",").ToList();
184
+ var deafenedTags = reader.ReadString().Split(",").ToList();
185
+
186
+ var voiceSettings = new VoiceSettings {
187
+ muteAll = muteAll,
188
+ mutedPeers = mutedPeers,
189
+ deafenAll = deafenAll,
190
+ deafenedPeers = deafenedPeers,
191
+ myTags = myTags,
192
+ mutedTags = mutedTags,
193
+ deafenedTags = deafenedTags
194
+ };
195
+ ClientVoiceSettings[clientId] = voiceSettings;
196
+ OnClientVoiceSettingsUpdated?.Invoke();
197
+ }
198
+ }
199
+
200
+ private void OnServerConnected(int connId)
201
+ {
202
+ Debug.unityLogger.Log(LogType.Log, TAG, $"Client {connId} connected");
203
+ ClientIDs.Add(connId);
204
+ }
205
+
206
+ private void OnServerDisconnected(int connId)
207
+ {
208
+ ClientIDs.Remove(connId);
209
+ Debug.unityLogger.Log(LogType.Log, TAG, $"Client {connId} disconnected");
210
+ }
211
+
212
+ private void SendToClient(int clientConnId, byte[] bytes, Channel channel)
213
+ {
214
+ if (!TryGetConnectionToClient(clientConnId, out var connection))
215
+ return;
216
+
217
+ var message = new FishNetBroadcast {data = bytes};
218
+ _networkManager.ServerManager.Broadcast(connection, message, false, channel);
219
+ }
220
+
221
+ private bool TryGetConnectionToClient(int desiredClientID, out NetworkConnection resultConnection)
222
+ {
223
+ resultConnection = null;
224
+ foreach (var (clientID, conn) in _networkManager.ServerManager.Clients)
225
+ {
226
+ if (clientID == desiredClientID)
227
+ {
228
+ resultConnection = conn;
229
+ return true;
230
+ }
231
+ }
232
+ return false;
233
+ }
234
+ }
235
+ }
236
+ #endif
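For server-side code that wants to react to the voice settings clients submit, here is a hedged sketch using the members `FishNetServer` exposes above (`ClientIDs`, `ClientVoiceSettings`, `OnClientVoiceSettingsUpdated`, `OnServerStart`). The component name is illustrative; in a real project you would reuse the `AudioServer` instance created during setup (see the sample below) rather than constructing a second server.

```csharp
#if FISHNET
using UnityEngine;
using Adrenak.UniVoice.Networks;

// Illustrative monitor component, not part of the package.
public class VoiceServerMonitor : MonoBehaviour {
    FishNetServer server;

    void Start() {
        server = new FishNetServer();
        server.OnServerStart += () => Debug.Log("Voice server started");
        server.OnClientVoiceSettingsUpdated += LogTags;
    }

    void LogTags() {
        // Inspect the settings the server uses when routing audio between clients.
        foreach (var id in server.ClientIDs) {
            if (server.ClientVoiceSettings.TryGetValue(id, out var settings))
                Debug.Log($"Client {id} tags: {string.Join(", ", settings.myTags)}");
        }
    }

    void OnDestroy() => server?.Dispose();
}
#endif
```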
@@ -0,0 +1,2 @@
1
+ fileFormatVersion: 2
2
+ guid: 7a700664d8ca951499b4d5a0fdd8a5f5
@@ -0,0 +1,3 @@
1
+ fileFormatVersion: 2
2
+ guid: 40d4cba94d8644408e9ebc8f61a470c3
3
+ timeCreated: 1755189627
@@ -1,4 +1,4 @@
1
- #if UNIVOICE_MIRROR_NETWORK || UNIVOICE_NETWORK_MIRROR
1
+ #if MIRROR
2
2
  using System;
3
3
  using System.Linq;
4
4
  using System.Collections.Generic;
@@ -1,4 +1,4 @@
1
- #if UNIVOICE_MIRROR_NETWORK || UNIVOICE_NETWORK_MIRROR
1
+ #if MIRROR
2
2
  using System;
3
3
 
4
4
  using Mirror;
@@ -1,4 +1,4 @@
1
- #if UNIVOICE_MIRROR_NETWORK || UNIVOICE_NETWORK_MIRROR
1
+ #if MIRROR
2
2
  namespace Adrenak.UniVoice.Networks {
3
3
  /// <summary>
4
4
  /// The different types of messages we send over Mirror
@@ -1,4 +1,4 @@
1
- #if UNIVOICE_MIRROR_NETWORK || UNIVOICE_NETWORK_MIRROR
1
+ #if MIRROR
2
2
  using Mirror;
3
3
 
4
4
  using System;
@@ -3,7 +3,7 @@
3
3
  // https://github.com/MirrorNetworking/Mirror/releases/tag/v89.11.0
4
4
  // OnServerConnected no longer seems to work?
5
5
 
6
- #if UNIVOICE_MIRROR_NETWORK || UNIVOICE_NETWORK_MIRROR
6
+ #if MIRROR
7
7
  using System;
8
8
  using System.Linq;
9
9
  using System.Threading.Tasks;
@@ -0,0 +1,15 @@
1
+ %YAML 1.1
2
+ %TAG !u! tag:unity3d.com,2011:
3
+ --- !u!114 &11400000
4
+ MonoBehaviour:
5
+ m_ObjectHideFlags: 0
6
+ m_CorrespondingSourceObject: {fileID: 0}
7
+ m_PrefabInstance: {fileID: 0}
8
+ m_PrefabAsset: {fileID: 0}
9
+ m_GameObject: {fileID: 0}
10
+ m_Enabled: 1
11
+ m_EditorHideFlags: 0
12
+ m_Script: {fileID: 11500000, guid: 4489d77032a81ef42b0067acf2737d4d, type: 3}
13
+ m_Name: FishNet-SinglePrefabObjects
14
+ m_EditorClassIdentifier:
15
+ _prefabs: []
@@ -0,0 +1,8 @@
1
+ fileFormatVersion: 2
2
+ guid: b3f3638a0e223704fb042d01a269adcb
3
+ NativeFormatImporter:
4
+ externalObjects: {}
5
+ mainObjectFileID: 11400000
6
+ userData:
7
+ assetBundleName:
8
+ assetBundleVariant:
@@ -0,0 +1,199 @@
1
+ using UnityEngine;
2
+
3
+ using Adrenak.UniMic;
4
+ using Adrenak.UniVoice.Networks;
5
+ using Adrenak.UniVoice.Outputs;
6
+ using Adrenak.UniVoice.Inputs;
7
+ using Adrenak.UniVoice.Filters;
8
+
9
+ namespace Adrenak.UniVoice.Samples {
10
+ /// <summary>
11
+ /// To get this setup sample to work, ensure that you have done the following:
12
+ /// - Import FishNet (Fish-Networking) into your project. UniVoice will detect it automatically.
13
+ /// - If you want to use RNNoise filter, import RNNoise4Unity into your project and add UNIVOICE_FILTER_RNNOISE4UNITY
14
+ /// - Add this component to the first scene of your Unity project
15
+ ///
16
+ /// *** More info on adding and activating non packaged dependencies is here: https://github.com/adrenak/univoice?tab=readme-ov-file#activating-non-packaged-dependencies ***
17
+ ///
18
+ /// This is a basic integration script that uses the following to set up UniVoice:
19
+ /// - <see cref="FishNetServer"/>, an implementation of <see cref="IAudioServer{T}"/>
20
+ /// - <see cref="FishNetClient"/>, an implementation of <see cref="IAudioClient{T}"/>
21
+ /// - <see cref="UniMicInput"/>, an implementation of <see cref="IAudioInput"/> that captures audio from a mic
22
+ /// - <see cref="EmptyAudioInput"/>, an implementation of <see cref="IAudioInput"/> that is basically
23
+ /// an idle audio input used when there is no input device
24
+ /// - <see cref="RNNoiseFilter"/>, an implementation of <see cref="IAudioFilter"/> that removes noise from
25
+ /// captured audio.
26
+ /// - <see cref="ConcentusEncodeFilter"/>, an implementation of <see cref="IAudioFilter"/> that encodes captured audio
27
+ /// using Concentus (C# Opus) to reduce the size of audio frames
28
+ /// - <see cref="ConcentusDecodeFilter"/>, an implementation of <see cref="IAudioFilter"/> that decodes incoming audio
29
+ /// using Concentus to decode and make the audio frame playable.
30
+ /// </summary>
31
+ public class UniVoiceFishNetSetupSample : MonoBehaviour {
32
+ const string TAG = "[UniVoiceFishNetSetupSample]";
33
+
34
+ /// <summary>
35
+ /// Whether UniVoice has been set up successfully.
36
+ /// The setup runs on both server and client.
37
+ /// </summary>
38
+ public static bool HasSetUp { get; private set; }
39
+
40
+ /// <summary>
41
+ /// The server object.
42
+ /// </summary>
43
+ public static IAudioServer<int> AudioServer { get; private set; }
44
+
45
+ /// <summary>
46
+ /// The client session.
47
+ /// </summary>
48
+ public static ClientSession<int> ClientSession { get; private set; }
49
+
50
+ [SerializeField] bool useRNNoise4UnityIfAvailable = true;
51
+
52
+ [SerializeField] bool useConcentusEncodeAndDecode = true;
53
+
54
+ void Start() {
55
+ if (HasSetUp) {
56
+ Debug.unityLogger.Log(LogType.Log, TAG, "UniVoice is already set up. Ignoring...");
57
+ return;
58
+ }
59
+ HasSetUp = Setup();
60
+ }
61
+
62
+ bool Setup() {
63
+ Debug.unityLogger.Log(LogType.Log, TAG, "Trying to setup UniVoice");
64
+
65
+ bool failed = false;
66
+
67
+ // We set up the AudioServer and ClientSession on ALL builds. This means that you'd
68
+ // have a ClientSession on a dedicated server, even though there's not much you can do with it.
69
+ // Similarly, a client would also have an AudioServer object. But it would just be inactive.
70
+ // This sample is for ease of use and to get something working quickly, so we don't bother
71
+ // with these minor details. Note that doing so does not have any performance implications,
72
+ // so you could keep this approach without any tradeoffs.
73
+ var createdAudioServer = SetupAudioServer();
74
+ if (!createdAudioServer) {
75
+ Debug.unityLogger.Log(LogType.Error, TAG, "Could not setup UniVoice server.");
76
+ failed = true;
77
+ }
78
+
79
+ var setupAudioClient = SetupClientSession();
80
+ if (!setupAudioClient) {
81
+ Debug.unityLogger.Log(LogType.Error, TAG, "Could not setup UniVoice client.");
82
+ failed = true;
83
+ }
84
+
85
+ if (!failed)
86
+ Debug.unityLogger.Log(LogType.Log, TAG, "UniVoice successfully setup!");
87
+ else
88
+ Debug.unityLogger.Log(LogType.Error, TAG, $"Refer to the notes on top of {typeof(UniVoiceFishNetSetupSample).Name}.cs for setup instructions.");
89
+ return !failed;
90
+ }
91
+
92
+ bool SetupAudioServer() {
93
+ #if FISHNET
94
+ // ---- CREATE AUDIO SERVER AND SUBSCRIBE TO EVENTS TO PRINT LOGS ----
95
+ // We create a server. If this code runs in server mode, FishNetServer will take care
96
+ // of automatically handling all incoming messages. On a device connecting as a client,
97
+ // this code doesn't do anything.
98
+ AudioServer = new FishNetServer();
99
+ Debug.unityLogger.Log(LogType.Log, TAG, "Created FishNetServer object");
100
+
101
+ AudioServer.OnServerStart += () => {
102
+ Debug.unityLogger.Log(LogType.Log, TAG, "Server started");
103
+ };
104
+
105
+ AudioServer.OnServerStop += () => {
106
+ Debug.unityLogger.Log(LogType.Log, TAG, "Server stopped");
107
+ };
108
+ return true;
109
+ #else
110
+ Debug.unityLogger.Log(LogType.Error, TAG, "FishNetServer implementation not found!");
111
+ return false;
112
+ #endif
113
+ }
114
+
115
+ bool SetupClientSession() {
116
+ #if FISHNET
117
+ // ---- CREATE AUDIO CLIENT AND SUBSCRIBE TO EVENTS ----
118
+ IAudioClient<int> client = new FishNetClient();
119
+ client.OnJoined += (id, peerIds) => {
120
+ Debug.unityLogger.Log(LogType.Log, TAG, $"You are Peer ID {id}");
121
+ };
122
+
123
+ client.OnLeft += () => {
124
+ Debug.unityLogger.Log(LogType.Log, TAG, "You left the chatroom");
125
+ };
126
+
127
+ // When a peer joins, we instantiate a new peer view
128
+ client.OnPeerJoined += id => {
129
+ Debug.unityLogger.Log(LogType.Log, TAG, $"Peer {id} joined");
130
+ };
131
+
132
+ // When a peer leaves, destroy the UI representing them
133
+ client.OnPeerLeft += id => {
134
+ Debug.unityLogger.Log(LogType.Log, TAG, $"Peer {id} left");
135
+ };
136
+
137
+ Debug.unityLogger.Log(LogType.Log, TAG, "Created FishNetClient object");
138
+
139
+ // ---- CREATE AUDIO INPUT ----
140
+ IAudioInput input;
141
+ // Since in this sample we use microphone input via UniMic, we first check if there
142
+ // are any mic devices available.
143
+ Mic.Init(); // Must do this to use the Mic class
144
+ if (Mic.AvailableDevices.Count == 0) {
145
+ Debug.unityLogger.Log(LogType.Log, TAG, "Device has no microphones. " +
146
+ "Will only be able to hear other clients, cannot send any audio.");
147
+ input = new EmptyAudioInput();
148
+ Debug.unityLogger.Log(LogType.Log, TAG, "Created EmptyAudioInput");
149
+ }
150
+ else {
151
+ // Get the first recording device that we have available and start it.
152
+ // Then we create a UniMicInput instance that requires the mic object
153
+ // For more info on UniMic refer to https://www.github.com/adrenak/unimic
154
+ var mic = Mic.AvailableDevices[0];
155
+ mic.StartRecording(60);
156
+ Debug.unityLogger.Log(LogType.Log, TAG, "Started recording with Mic device named " +
157
+ mic.Name + $" at frequency {mic.SamplingFrequency} with frame duration {mic.FrameDurationMS} ms.");
158
+ input = new UniMicInput(mic);
159
+ Debug.unityLogger.Log(LogType.Log, TAG, "Created UniMicInput");
160
+ }
161
+
162
+ // ---- CREATE AUDIO OUTPUT FACTORY ----
163
+ IAudioOutputFactory outputFactory;
164
+ // We want the incoming audio from peers to be played via the StreamedAudioSourceOutput
165
+ // implementation of IAudioSource interface. So we get the factory for it.
166
+ outputFactory = new StreamedAudioSourceOutput.Factory();
167
+ Debug.unityLogger.Log(LogType.Log, TAG, "Using StreamedAudioSourceOutput.Factory as output factory");
168
+
169
+ // ---- CREATE CLIENT SESSION AND ADD FILTERS TO IT ----
170
+ // With the client, input and output factory ready, we create the client session
171
+ ClientSession = new ClientSession<int>(client, input, outputFactory);
172
+ Debug.unityLogger.Log(LogType.Log, TAG, "Created session");
173
+
174
+ #if UNIVOICE_FILTER_RNNOISE4UNITY
175
+ if(useRNNoise4UnityIfAvailable) {
176
+ // RNNoiseFilter to remove noise from captured audio
177
+ session.InputFilters.Add(new RNNoiseFilter());
178
+ Debug.unityLogger.Log(LogType.Log, TAG, "Registered RNNoiseFilter as an input filter");
179
+ }
180
+ #endif
181
+
182
+ if (useConcentusEncodeAndDecode) {
183
+ // ConcentusEncodeFilter to encode captured audio, which reduces the audio frame size
184
+ ClientSession.InputFilters.Add(new ConcentusEncodeFilter());
185
+ Debug.unityLogger.Log(LogType.Log, TAG, "Registered ConcentusEncodeFilter as an input filter");
186
+
187
+ // For incoming audio register the ConcentusDecodeFilter to decode the encoded audio received from other clients
188
+ ClientSession.AddOutputFilter<ConcentusDecodeFilter>(() => new ConcentusDecodeFilter());
189
+ Debug.unityLogger.Log(LogType.Log, TAG, "Registered ConcentusDecodeFilter as an output filter");
190
+ }
191
+
192
+ return true;
193
+ #else
194
+ Debug.unityLogger.Log(LogType.Error, TAG, "FishNetClient implementation not found!");
195
+ return false;
196
+ #endif
197
+ }
198
+ }
199
+ }
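As a sketch of the consumption pattern this sample enables, the hypothetical component below waits for the one-time setup to finish and then hooks the static `AudioServer` events the sample exposes. Only the static members declared in the sample above (`HasSetUp`, `AudioServer`) are used; the component itself is not shipped with UniVoice.

```csharp
using UnityEngine;
using Adrenak.UniVoice.Samples;

// Illustrative consumer of the sample's static members, not part of the package.
public class VoiceChatStatus : MonoBehaviour {
    bool subscribed;

    void Update() {
        // Wait until UniVoiceFishNetSetupSample has completed its setup.
        if (subscribed || !UniVoiceFishNetSetupSample.HasSetUp)
            return;
        subscribed = true;

        UniVoiceFishNetSetupSample.AudioServer.OnServerStart +=
            () => Debug.Log("Voice server is up");
        UniVoiceFishNetSetupSample.AudioServer.OnServerStop +=
            () => Debug.Log("Voice server stopped");
    }
}
```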
@@ -0,0 +1,11 @@
1
+ fileFormatVersion: 2
2
+ guid: 812e71335abfc894ea5415f23adf5ea6
3
+ MonoImporter:
4
+ externalObjects: {}
5
+ serializedVersion: 2
6
+ defaultReferences: []
7
+ executionOrder: 0
8
+ icon: {instanceID: 0}
9
+ userData:
10
+ assetBundleName:
11
+ assetBundleVariant:
@@ -0,0 +1,335 @@
1
+ %YAML 1.1
2
+ %TAG !u! tag:unity3d.com,2011:
3
+ --- !u!29 &1
4
+ OcclusionCullingSettings:
5
+ m_ObjectHideFlags: 0
6
+ serializedVersion: 2
7
+ m_OcclusionBakeSettings:
8
+ smallestOccluder: 5
9
+ smallestHole: 0.25
10
+ backfaceThreshold: 100
11
+ m_SceneGUID: 00000000000000000000000000000000
12
+ m_OcclusionCullingData: {fileID: 0}
13
+ --- !u!104 &2
14
+ RenderSettings:
15
+ m_ObjectHideFlags: 0
16
+ serializedVersion: 9
17
+ m_Fog: 0
18
+ m_FogColor: {r: 0.5, g: 0.5, b: 0.5, a: 1}
19
+ m_FogMode: 3
20
+ m_FogDensity: 0.01
21
+ m_LinearFogStart: 0
22
+ m_LinearFogEnd: 300
23
+ m_AmbientSkyColor: {r: 0.212, g: 0.227, b: 0.259, a: 1}
24
+ m_AmbientEquatorColor: {r: 0.114, g: 0.125, b: 0.133, a: 1}
25
+ m_AmbientGroundColor: {r: 0.047, g: 0.043, b: 0.035, a: 1}
26
+ m_AmbientIntensity: 1
27
+ m_AmbientMode: 0
28
+ m_SubtractiveShadowColor: {r: 0.42, g: 0.478, b: 0.627, a: 1}
29
+ m_SkyboxMaterial: {fileID: 10304, guid: 0000000000000000f000000000000000, type: 0}
30
+ m_HaloStrength: 0.5
31
+ m_FlareStrength: 1
32
+ m_FlareFadeSpeed: 3
33
+ m_HaloTexture: {fileID: 0}
34
+ m_SpotCookie: {fileID: 10001, guid: 0000000000000000e000000000000000, type: 0}
35
+ m_DefaultReflectionMode: 0
36
+ m_DefaultReflectionResolution: 128
37
+ m_ReflectionBounces: 1
38
+ m_ReflectionIntensity: 1
39
+ m_CustomReflection: {fileID: 0}
40
+ m_Sun: {fileID: 0}
41
+ m_IndirectSpecularColor: {r: 0.37311926, g: 0.38073996, b: 0.35872698, a: 1}
42
+ m_UseRadianceAmbientProbe: 0
43
+ --- !u!157 &3
44
+ LightmapSettings:
45
+ m_ObjectHideFlags: 0
46
+ serializedVersion: 12
47
+ m_GIWorkflowMode: 1
48
+ m_GISettings:
49
+ serializedVersion: 2
50
+ m_BounceScale: 1
51
+ m_IndirectOutputScale: 1
52
+ m_AlbedoBoost: 1
53
+ m_EnvironmentLightingMode: 0
54
+ m_EnableBakedLightmaps: 1
55
+ m_EnableRealtimeLightmaps: 0
56
+ m_LightmapEditorSettings:
57
+ serializedVersion: 12
58
+ m_Resolution: 2
59
+ m_BakeResolution: 40
60
+ m_AtlasSize: 1024
61
+ m_AO: 0
62
+ m_AOMaxDistance: 1
63
+ m_CompAOExponent: 1
64
+ m_CompAOExponentDirect: 0
65
+ m_ExtractAmbientOcclusion: 0
66
+ m_Padding: 2
67
+ m_LightmapParameters: {fileID: 0}
68
+ m_LightmapsBakeMode: 1
69
+ m_TextureCompression: 1
70
+ m_FinalGather: 0
71
+ m_FinalGatherFiltering: 1
72
+ m_FinalGatherRayCount: 256
73
+ m_ReflectionCompression: 2
74
+ m_MixedBakeMode: 2
75
+ m_BakeBackend: 1
76
+ m_PVRSampling: 1
77
+ m_PVRDirectSampleCount: 32
78
+ m_PVRSampleCount: 512
79
+ m_PVRBounces: 2
80
+ m_PVREnvironmentSampleCount: 256
81
+ m_PVREnvironmentReferencePointCount: 2048
82
+ m_PVRFilteringMode: 1
83
+ m_PVRDenoiserTypeDirect: 1
84
+ m_PVRDenoiserTypeIndirect: 1
85
+ m_PVRDenoiserTypeAO: 1
86
+ m_PVRFilterTypeDirect: 0
87
+ m_PVRFilterTypeIndirect: 0
88
+ m_PVRFilterTypeAO: 0
89
+ m_PVREnvironmentMIS: 1
90
+ m_PVRCulling: 1
91
+ m_PVRFilteringGaussRadiusDirect: 1
92
+ m_PVRFilteringGaussRadiusIndirect: 5
93
+ m_PVRFilteringGaussRadiusAO: 2
94
+ m_PVRFilteringAtrousPositionSigmaDirect: 0.5
95
+ m_PVRFilteringAtrousPositionSigmaIndirect: 2
96
+ m_PVRFilteringAtrousPositionSigmaAO: 1
97
+ m_ExportTrainingData: 0
98
+ m_TrainingDataDestination: TrainingData
99
+ m_LightProbeSampleCountMultiplier: 4
100
+ m_LightingDataAsset: {fileID: 0}
101
+ m_LightingSettings: {fileID: 0}
102
+ --- !u!196 &4
103
+ NavMeshSettings:
104
+ serializedVersion: 2
105
+ m_ObjectHideFlags: 0
106
+ m_BuildSettings:
107
+ serializedVersion: 2
108
+ agentTypeID: 0
109
+ agentRadius: 0.5
110
+ agentHeight: 2
111
+ agentSlope: 45
112
+ agentClimb: 0.4
113
+ ledgeDropHeight: 0
114
+ maxJumpAcrossDistance: 0
115
+ minRegionArea: 2
116
+ manualCellSize: 0
117
+ cellSize: 0.16666667
118
+ manualTileSize: 0
119
+ tileSize: 256
120
+ accuratePlacement: 0
121
+ maxJobWorkers: 0
122
+ preserveTilesOutsideBounds: 0
123
+ debug:
124
+ m_Flags: 0
125
+ m_NavMeshData: {fileID: 0}
126
+ --- !u!1 &810914621
127
+ GameObject:
128
+ m_ObjectHideFlags: 0
129
+ m_CorrespondingSourceObject: {fileID: 0}
130
+ m_PrefabInstance: {fileID: 0}
131
+ m_PrefabAsset: {fileID: 0}
132
+ serializedVersion: 6
133
+ m_Component:
134
+ - component: {fileID: 810914623}
135
+ - component: {fileID: 810914622}
136
+ m_Layer: 0
137
+ m_Name: Sample
138
+ m_TagString: Untagged
139
+ m_Icon: {fileID: 0}
140
+ m_NavMeshLayer: 0
141
+ m_StaticEditorFlags: 0
142
+ m_IsActive: 1
143
+ --- !u!114 &810914622
144
+ MonoBehaviour:
145
+ m_ObjectHideFlags: 0
146
+ m_CorrespondingSourceObject: {fileID: 0}
147
+ m_PrefabInstance: {fileID: 0}
148
+ m_PrefabAsset: {fileID: 0}
149
+ m_GameObject: {fileID: 810914621}
150
+ m_Enabled: 1
151
+ m_EditorHideFlags: 0
152
+ m_Script: {fileID: 11500000, guid: 812e71335abfc894ea5415f23adf5ea6, type: 3}
153
+ m_Name:
154
+ m_EditorClassIdentifier:
155
+ useRNNoise4UnityIfAvailable: 1
156
+ useConcentusEncodeAndDecode: 1
157
+ --- !u!4 &810914623
158
+ Transform:
159
+ m_ObjectHideFlags: 0
160
+ m_CorrespondingSourceObject: {fileID: 0}
161
+ m_PrefabInstance: {fileID: 0}
162
+ m_PrefabAsset: {fileID: 0}
163
+ m_GameObject: {fileID: 810914621}
164
+ m_LocalRotation: {x: 0, y: 0, z: 0, w: 1}
165
+ m_LocalPosition: {x: 0, y: 0, z: 0}
166
+ m_LocalScale: {x: 1, y: 1, z: 1}
167
+ m_ConstrainProportionsScale: 0
168
+ m_Children: []
169
+ m_Father: {fileID: 0}
170
+ m_RootOrder: 2
171
+ m_LocalEulerAnglesHint: {x: 0, y: 0, z: 0}
172
+ --- !u!1001 &1564609055
173
+ PrefabInstance:
174
+ m_ObjectHideFlags: 0
175
+ serializedVersion: 2
176
+ m_Modification:
177
+ m_TransformParent: {fileID: 0}
178
+ m_Modifications:
179
+ - target: {fileID: 7443408887813606049, guid: 0b650fca685f2eb41a86538aa883e4c1,
180
+ type: 3}
181
+ propertyPath: m_RootOrder
182
+ value: 0
183
+ objectReference: {fileID: 0}
184
+ - target: {fileID: 7443408887813606049, guid: 0b650fca685f2eb41a86538aa883e4c1,
185
+ type: 3}
186
+ propertyPath: m_LocalPosition.x
187
+ value: 0
188
+ objectReference: {fileID: 0}
189
+ - target: {fileID: 7443408887813606049, guid: 0b650fca685f2eb41a86538aa883e4c1,
190
+ type: 3}
191
+ propertyPath: m_LocalPosition.y
192
+ value: 0
193
+ objectReference: {fileID: 0}
194
+ - target: {fileID: 7443408887813606049, guid: 0b650fca685f2eb41a86538aa883e4c1,
195
+ type: 3}
196
+ propertyPath: m_LocalPosition.z
197
+ value: 0
198
+ objectReference: {fileID: 0}
199
+ - target: {fileID: 7443408887813606049, guid: 0b650fca685f2eb41a86538aa883e4c1,
200
+ type: 3}
201
+ propertyPath: m_LocalRotation.w
202
+ value: 1
203
+ objectReference: {fileID: 0}
204
+ - target: {fileID: 7443408887813606049, guid: 0b650fca685f2eb41a86538aa883e4c1,
205
+ type: 3}
206
+ propertyPath: m_LocalRotation.x
207
+ value: 0
208
+ objectReference: {fileID: 0}
209
+ - target: {fileID: 7443408887813606049, guid: 0b650fca685f2eb41a86538aa883e4c1,
210
+ type: 3}
211
+ propertyPath: m_LocalRotation.y
212
+ value: 0
213
+ objectReference: {fileID: 0}
214
+ - target: {fileID: 7443408887813606049, guid: 0b650fca685f2eb41a86538aa883e4c1,
215
+ type: 3}
216
+ propertyPath: m_LocalRotation.z
217
+ value: 0
218
+ objectReference: {fileID: 0}
219
+ - target: {fileID: 7443408887813606049, guid: 0b650fca685f2eb41a86538aa883e4c1,
220
+ type: 3}
221
+ propertyPath: m_LocalEulerAnglesHint.x
222
+ value: 0
223
+ objectReference: {fileID: 0}
224
+ - target: {fileID: 7443408887813606049, guid: 0b650fca685f2eb41a86538aa883e4c1,
225
+ type: 3}
226
+ propertyPath: m_LocalEulerAnglesHint.y
227
+ value: 0
228
+ objectReference: {fileID: 0}
229
+ - target: {fileID: 7443408887813606049, guid: 0b650fca685f2eb41a86538aa883e4c1,
230
+ type: 3}
231
+ propertyPath: m_LocalEulerAnglesHint.z
232
+ value: 0
233
+ objectReference: {fileID: 0}
234
+ - target: {fileID: 7443408887813606050, guid: 0b650fca685f2eb41a86538aa883e4c1,
235
+ type: 3}
236
+ propertyPath: _spawnablePrefabs
237
+ value:
238
+ objectReference: {fileID: 11400000, guid: b3f3638a0e223704fb042d01a269adcb,
239
+ type: 2}
240
+ - target: {fileID: 7443408887813606051, guid: 0b650fca685f2eb41a86538aa883e4c1,
241
+ type: 3}
242
+ propertyPath: m_Name
243
+ value: NetworkManager
244
+ objectReference: {fileID: 0}
245
+ - target: {fileID: 7443408887813606060, guid: 0b650fca685f2eb41a86538aa883e4c1,
246
+ type: 3}
247
+ propertyPath: _addToDefaultScene
248
+ value: 0
249
+ objectReference: {fileID: 0}
250
+ m_RemovedComponents: []
251
+ m_SourcePrefab: {fileID: 100100000, guid: 0b650fca685f2eb41a86538aa883e4c1, type: 3}
252
+ --- !u!1 &1792070878
253
+ GameObject:
254
+ m_ObjectHideFlags: 0
255
+ m_CorrespondingSourceObject: {fileID: 0}
256
+ m_PrefabInstance: {fileID: 0}
257
+ m_PrefabAsset: {fileID: 0}
258
+ serializedVersion: 6
259
+ m_Component:
260
+ - component: {fileID: 1792070881}
261
+ - component: {fileID: 1792070880}
262
+ - component: {fileID: 1792070879}
263
+ m_Layer: 0
264
+ m_Name: Camera
265
+ m_TagString: Untagged
266
+ m_Icon: {fileID: 0}
267
+ m_NavMeshLayer: 0
268
+ m_StaticEditorFlags: 0
269
+ m_IsActive: 1
270
+ --- !u!81 &1792070879
271
+ AudioListener:
272
+ m_ObjectHideFlags: 0
273
+ m_CorrespondingSourceObject: {fileID: 0}
274
+ m_PrefabInstance: {fileID: 0}
275
+ m_PrefabAsset: {fileID: 0}
276
+ m_GameObject: {fileID: 1792070878}
277
+ m_Enabled: 1
278
+ --- !u!20 &1792070880
279
+ Camera:
280
+ m_ObjectHideFlags: 0
281
+ m_CorrespondingSourceObject: {fileID: 0}
282
+ m_PrefabInstance: {fileID: 0}
283
+ m_PrefabAsset: {fileID: 0}
284
+ m_GameObject: {fileID: 1792070878}
285
+ m_Enabled: 1
286
+ serializedVersion: 2
287
+ m_ClearFlags: 1
288
+ m_BackGroundColor: {r: 0.19215687, g: 0.3019608, b: 0.4745098, a: 0}
289
+ m_projectionMatrixMode: 1
290
+ m_GateFitMode: 2
291
+ m_FOVAxisMode: 0
292
+ m_SensorSize: {x: 36, y: 24}
293
+ m_LensShift: {x: 0, y: 0}
294
+ m_FocalLength: 50
295
+ m_NormalizedViewPortRect:
296
+ serializedVersion: 2
297
+ x: 0
298
+ y: 0
299
+ width: 1
300
+ height: 1
301
+ near clip plane: 0.3
302
+ far clip plane: 1000
303
+ field of view: 60
304
+ orthographic: 0
305
+ orthographic size: 5
306
+ m_Depth: 0
307
+ m_CullingMask:
308
+ serializedVersion: 2
309
+ m_Bits: 4294967295
310
+ m_RenderingPath: -1
311
+ m_TargetTexture: {fileID: 0}
312
+ m_TargetDisplay: 0
313
+ m_TargetEye: 3
314
+ m_HDR: 1
315
+ m_AllowMSAA: 1
316
+ m_AllowDynamicResolution: 0
317
+ m_ForceIntoRT: 0
318
+ m_OcclusionCulling: 1
319
+ m_StereoConvergence: 10
320
+ m_StereoSeparation: 0.022
321
+ --- !u!4 &1792070881
322
+ Transform:
323
+ m_ObjectHideFlags: 0
324
+ m_CorrespondingSourceObject: {fileID: 0}
325
+ m_PrefabInstance: {fileID: 0}
326
+ m_PrefabAsset: {fileID: 0}
327
+ m_GameObject: {fileID: 1792070878}
328
+ m_LocalRotation: {x: 0, y: 0, z: 0, w: 1}
329
+ m_LocalPosition: {x: 0, y: 0, z: 0}
330
+ m_LocalScale: {x: 1, y: 1, z: 1}
331
+ m_ConstrainProportionsScale: 0
332
+ m_Children: []
333
+ m_Father: {fileID: 0}
334
+ m_RootOrder: 1
335
+ m_LocalEulerAnglesHint: {x: 0, y: 0, z: 0}
@@ -0,0 +1,7 @@
1
+ fileFormatVersion: 2
2
+ guid: fcc19499518b4204d8fdeb4ddd323bde
3
+ DefaultImporter:
4
+ externalObjects: {}
5
+ userData:
6
+ assetBundleName:
7
+ assetBundleVariant:
@@ -90,7 +90,7 @@ namespace Adrenak.UniVoice.Samples {
90
90
  }
91
91
 
92
92
  bool SetupAudioServer() {
93
- #if UNIVOICE_NETWORK_MIRROR
93
+ #if MIRROR
94
94
  // ---- CREATE AUDIO SERVER AND SUBSCRIBE TO EVENTS TO PRINT LOGS ----
95
95
  // We create a server. If this code runs in server mode, MirrorServer will take care
96
96
  // or automatically handling all incoming messages. On a device connecting as a client,
@@ -113,7 +113,7 @@ namespace Adrenak.UniVoice.Samples {
113
113
  }
114
114
 
115
115
  bool SetupClientSession() {
116
- #if UNIVOICE_NETWORK_MIRROR
116
+ #if MIRROR
117
117
  // ---- CREATE AUDIO CLIENT AND SUBSCRIBE TO EVENTS ----
118
118
  IAudioClient<int> client = new MirrorClient();
119
119
  client.OnJoined += (id, peerIds) => {
package/package.json CHANGED
@@ -1,6 +1,6 @@
1
1
  {
2
2
  "name": "com.adrenak.univoice",
3
- "version": "4.7.0",
3
+ "version": "4.8.0",
4
4
  "displayName": "Adrenak.UniVoice",
5
5
  "description": "Voice chat/VoIP framework for Unity.",
6
6
  "unity": "2021.2",