@elevenlabs/react 0.1.5
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/LICENSE +21 -0
- package/README.md +370 -0
- package/dist/index.d.ts +29 -0
- package/dist/lib.cjs +2 -0
- package/dist/lib.cjs.map +1 -0
- package/dist/lib.modern.js +2 -0
- package/dist/lib.modern.js.map +1 -0
- package/dist/lib.module.js +2 -0
- package/dist/lib.module.js.map +1 -0
- package/dist/lib.umd.js +2 -0
- package/dist/lib.umd.js.map +1 -0
- package/package.json +50 -0
package/LICENSE
ADDED
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2024 ElevenLabs

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
package/README.md
ADDED
|
@@ -0,0 +1,370 @@
# ElevenLabs React Library

An SDK library for using ElevenLabs in React-based applications. If you're looking for a Node.js library, please refer to the [ElevenLabs Node.js Library](https://www.npmjs.com/package/elevenlabs).

> Note that this library is launching primarily to support Conversational AI. Support for speech synthesis and other, more generic use cases is planned for the future.

![LOGO](https://github.com/elevenlabs/elevenlabs-python/assets/12028621/21267d89-5e82-4e7e-9c81-caf30b237683)
[![Discord](https://badgen.net/badge/black/ElevenLabs/icon?icon=discord&label)](https://discord.gg/elevenlabs)
[![Twitter](https://badgen.net/badge/black/elevenlabsio/icon?icon=twitter&label)](https://twitter.com/elevenlabsio)

## Installation

Install the package in your project through your package manager.

```shell
npm install @elevenlabs/react
# or
yarn add @elevenlabs/react
# or
pnpm install @elevenlabs/react
```

## Usage

### useConversation

A React hook for managing the WebSocket connection and audio usage for ElevenLabs Conversational AI.

#### Initialize conversation

First, initialize the Conversation instance.

```tsx
const conversation = useConversation();
```

Note that Conversational AI requires microphone access.
Consider explaining and requesting microphone access in your app's UI before the conversation kicks off. The microphone may also be blocked for the current page by default, in which case the permission prompt will not show up at all. You should handle such cases in your application and display an appropriate message to the user:

```js
// call after explaining to the user why microphone access is needed
try {
  await navigator.mediaDevices.getUserMedia({ audio: true });
} catch {
  // handle the error and show an appropriate message to the user
}
```

#### Options

The Conversation can be initialized with certain options. All of them are optional.

```tsx
const conversation = useConversation({
  /* options object */
});
```

- **clientTools** - object defining client tools that can be invoked by the agent. [See below](#client-tools) for details.
- **overrides** - object defining conversation settings overrides. [See below](#conversation-overrides) for details.
- **textOnly** - whether the conversation should run in text-only mode. [See below](#text-only) for details.
- **onConnect** - handler called when the conversation WebSocket connection is established.
- **onDisconnect** - handler called when the conversation WebSocket connection is ended.
- **onMessage** - handler called when a new message is received. These can be tentative or final transcriptions of the user's voice, replies produced by the LLM, or debug messages when a debug option is enabled.
- **onError** - handler called when an error is encountered.
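
As a sketch, the handler options above can be collected in one place and passed to the hook together. The log messages here are illustrative app-level choices, not something the SDK prescribes:

```js
// Illustrative handlers for the callback options listed above.
const callbacks = {
  onConnect: () => console.log("Conversation connected"),
  onDisconnect: () => console.log("Conversation ended"),
  onMessage: (message) => console.log("Received:", message),
  onError: (error) => console.error("Conversation error:", error),
};

// In a component: const conversation = useConversation(callbacks);
```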

##### Client Tools

Client tools are a way to enable the agent to invoke client-side functionality. This can be used to trigger actions in the client, such as opening a modal or making an API call on behalf of the user.

The client tools definition is an object of functions, and it needs to match your configuration within the [ElevenLabs UI](https://elevenlabs.io/app/conversational-ai), where you can name and describe the different tools, as well as set up the parameters passed by the agent.

```ts
const conversation = useConversation({
  clientTools: {
    displayMessage: (parameters: { text: string }) => {
      alert(parameters.text);

      return "Message displayed";
    },
  },
});
```

If the function returns a value, it will be passed back to the agent as a response.
Note that the tool needs to be explicitly marked as blocking in the ElevenLabs UI for the agent to await and react to the response; otherwise the agent assumes success and continues the conversation.
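
A hedged sketch of a tool that does asynchronous work before replying, assuming promise-returning handlers are awaited (which the blocking-tool note above suggests). The `getOrderStatus` tool, its `orderId` parameter, and the `lookupOrder` function are all hypothetical; the tool name and parameters must match what you configured in the ElevenLabs UI:

```js
// Hypothetical app function standing in for a real data source.
const lookupOrder = async (orderId) => "shipped";

// The returned string is sent back to the agent as the tool response.
const clientTools = {
  getOrderStatus: async (parameters) => {
    const status = await lookupOrder(parameters.orderId);
    return `Order ${parameters.orderId} is ${status}`;
  },
};

// In a component: const conversation = useConversation({ clientTools });
```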

#### Conversation overrides

You may choose to override various settings of the conversation and set them dynamically based on other user interactions.
These settings are optional and can be used to customize the conversation experience.
The following settings are available:

```ts
const conversation = useConversation({
  overrides: {
    agent: {
      prompt: {
        prompt: "My custom prompt",
      },
      firstMessage: "My custom first message",
      language: "en",
    },
    tts: {
      voiceId: "custom voice id",
    },
    conversation: {
      textOnly: true,
    },
  },
});
```

#### Text only

If your agent is configured to run in text-only mode, i.e. it does not send or receive audio messages,
you can use this flag to use a lighter version of the conversation. In that case, the
user will not be asked for microphone permissions and no audio context will be created.

```ts
const conversation = useConversation({
  textOnly: true,
});
```

#### Prefer Headphones for iOS Devices

While this SDK leaves the choice of audio input/output device to the browser/system, iOS Safari seems to prefer the built-in speaker over headphones, even when a Bluetooth device is in use. If you want to "force" the use of headphones on iOS devices when available, you can use the following option. Keep in mind that this is not guaranteed, since this functionality is not provided by the browser. System audio should be the default choice.

```ts
const conversation = useConversation({
  preferHeadphonesForIosDevices: true,
});
```

#### Connection delay

You can configure an additional delay between when the microphone is activated and when the connection is established.
On Android, the delay is set to 3 seconds by default to make sure the device has time to switch to the correct audio mode.
Without it, you may experience issues with the beginning of the first message being cut off.

```ts
const conversation = useConversation({
  connectionDelay: {
    android: 3_000,
    ios: 0,
    default: 0,
  },
});
```
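
To illustrate how such a setting resolves, a per-platform lookup might look like the sketch below. This is only an illustration of the idea, not the SDK's actual implementation, which does its own platform detection:

```js
// Sketch only: pick the delay (in ms) for the current platform,
// falling back to `default` when a platform key is absent.
function resolveConnectionDelay(userAgent, delays) {
  if (/android/i.test(userAgent)) return delays.android ?? delays.default ?? 0;
  if (/iphone|ipad|ipod/i.test(userAgent)) return delays.ios ?? delays.default ?? 0;
  return delays.default ?? 0;
}
```

With the configuration above, an Android user agent resolves to 3000 ms and everything else to 0.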

#### Acquiring a Wake Lock

By default, the conversation will attempt to acquire a [wake lock](https://developer.mozilla.org/en-US/docs/Web/API/Screen_Wake_Lock_API) to prevent the device from going to sleep during the conversation.
This can be disabled by setting the `useWakeLock` option to `false`:

```ts
const conversation = useConversation({
  useWakeLock: false,
});
```

#### Methods

##### startSession

The `startSession` method kicks off the WebSocket connection and starts using the microphone to communicate with the ElevenLabs Conversational AI agent.
The method accepts an options object, with the `url` or `agentId` option being required.

An agent ID can be acquired through the [ElevenLabs UI](https://elevenlabs.io/app/conversational-ai) and is always necessary - either passed directly or used to generate a signed URL.

```js
const conversation = useConversation();
const conversationId = await conversation.startSession({ url });
```

For public agents, define `agentId` - no signed link generation is necessary.

In case the conversation requires authorization, use the REST API to generate signed links. Use the signed link as the `url` parameter.

`startSession` returns a promise resolving to `conversationId`. The value is a globally unique conversation ID you can use to identify separate conversations.

```js
// your server
const requestHeaders = new Headers();
requestHeaders.set("xi-api-key", process.env.XI_API_KEY); // use your ElevenLabs API key

const response = await fetch(
  "https://api.elevenlabs.io/v1/convai/conversation/get_signed_url?agent_id={{agent id created through ElevenLabs UI}}",
  {
    method: "GET",
    headers: requestHeaders,
  }
);

if (!response.ok) {
  return Response.error();
}

const body = await response.json();
const url = body.signed_url; // use this URL for the startSession method
```
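
On the client, the flow typically mirrors the server snippet above: fetch the signed URL from your own endpoint, then hand it to `startSession`. A sketch, where `/api/signed-url` is a hypothetical route on your server that returns `{ signed_url }` as JSON:

```js
// "/api/signed-url" is a hypothetical route running the server-side
// snippet above; it must never expose your ElevenLabs API key.
async function startAuthorizedSession(conversation) {
  const response = await fetch("/api/signed-url");
  if (!response.ok) throw new Error("Failed to get signed URL");
  const { signed_url } = await response.json();
  return conversation.startSession({ url: signed_url });
}
```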

##### endSession

A method to manually end the conversation. It ends the conversation and disconnects from the WebSocket.

```js
await conversation.endSession();
```

##### sendFeedback

A method for sending binary feedback to the agent.
The method accepts a boolean value, where `true` represents positive feedback and `false` negative feedback.
Feedback is always correlated to the most recent agent response and can only be sent once per response.
Check the `canSendFeedback` state to see whether feedback can be sent at the given moment.

```js
const { sendFeedback } = useConversation();

sendFeedback(true); // positive feedback
sendFeedback(false); // negative feedback
```

##### sendContextualUpdate

A method to send contextual updates to the agent.
This can be used to inform the agent about user actions that are not directly related to the conversation, but may influence the agent's responses.

```js
const { sendContextualUpdate } = useConversation();

sendContextualUpdate(
  "User navigated to another page. Consider it for next response, but don't react to this contextual update."
);
```

##### sendUserMessage

Sends a text message to the agent.

Can be used to let the user type in a message instead of using the microphone.
Unlike `sendContextualUpdate`, this will be treated as a user message and will prompt the agent to take its turn in the conversation.

```js
const { sendUserMessage, sendUserActivity } = useConversation();
const [value, setValue] = useState("");

return (
  <>
    <input
      value={value}
      onChange={e => {
        setValue(e.target.value);
        sendUserActivity();
      }}
    />
    <button
      onClick={() => {
        sendUserMessage(value);
        setValue("");
      }}
    >
      SEND
    </button>
  </>
);
```

##### sendUserActivity

Notifies the agent about user activity.

The agent will not attempt to speak for at least 2 seconds after user activity is detected.
This can be used to prevent the agent from interrupting the user while they are typing.

```js
const { sendUserMessage, sendUserActivity } = useConversation();
const [value, setValue] = useState("");

return (
  <>
    <input
      value={value}
      onChange={e => {
        setValue(e.target.value);
        sendUserActivity();
      }}
    />
    <button
      onClick={() => {
        sendUserMessage(value);
        setValue("");
      }}
    >
      SEND
    </button>
  </>
);
```

##### setVolume

A method to set the output volume of the conversation. Accepts an object with a `volume` field between 0 and 1.

```js
const [volume, setVolume] = useState(0.5);
const conversation = useConversation({ volume });

// Set the volume
setVolume(0.5);

// Or imperatively, through the returned method
conversation.setVolume({ volume: 0.5 });
```

##### muteMic

Mute/unmute the microphone. The microphone state is controlled by passing the `micMuted` flag to the hook:

```js
const [micMuted, setMicMuted] = useState(false);
const conversation = useConversation({ micMuted });

// Mute the microphone
setMicMuted(true);

// Unmute the microphone
setMicMuted(false);
```

##### status

A React state containing the current status of the conversation.

```js
const { status } = useConversation();
console.log(status); // "connected" or "disconnected"
```

##### isSpeaking

A React state indicating whether the agent is currently speaking.
This is helpful for indicating the mode in your UI.

```js
const { isSpeaking } = useConversation();
console.log(isSpeaking); // boolean
```
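
As a small sketch of how these states combine in a UI, you can derive a user-facing label from `status` and `isSpeaking`. The label wording here is an app-level choice, not part of the SDK:

```js
// Map the hook's `status` and `isSpeaking` states to a display label.
function conversationLabel(status, isSpeaking) {
  if (status !== "connected") return "Disconnected";
  return isSpeaking ? "Agent is speaking" : "Listening";
}
```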

##### canSendFeedback

A React state representing whether the user can send feedback to the agent.
When false, calls to `sendFeedback` will be ignored.
This is helpful for conditionally showing the feedback button in your UI.

```js
const { canSendFeedback } = useConversation();
console.log(canSendFeedback); // boolean
```

## Development

Please refer to the README.md file in the root of this repository.

## Contributing

Please create an issue first to discuss the proposed changes. Any contributions are welcome!

Remember, if merged, your code will be used as part of an MIT-licensed project. By submitting a pull request, you are giving your consent for your code to be integrated into this library.
package/dist/index.d.ts
ADDED
@@ -0,0 +1,29 @@
import { SessionConfig, Callbacks, Status, ClientToolsConfig, InputConfig } from "@elevenlabs/client";
export type { Role, Mode, Status, SessionConfig, DisconnectionDetails, Language, } from "@elevenlabs/client";
export { postOverallFeedback } from "@elevenlabs/client";
export type HookOptions = Partial<SessionConfig & HookCallbacks & ClientToolsConfig & InputConfig>;
export type ControlledState = {
    micMuted?: boolean;
    volume?: number;
};
export type HookCallbacks = Pick<Callbacks, "onConnect" | "onDisconnect" | "onError" | "onMessage" | "onAudio" | "onDebug" | "onUnhandledClientToolCall">;
export declare function useConversation<T extends HookOptions & ControlledState>(props?: T): {
    startSession: T extends SessionConfig ? (options?: HookOptions) => Promise<string> : (options: SessionConfig & HookOptions) => Promise<string>;
    endSession: () => Promise<void>;
    setVolume: ({ volume }: {
        volume: number;
    }) => void;
    getInputByteFrequencyData: () => Uint8Array | undefined;
    getOutputByteFrequencyData: () => Uint8Array | undefined;
    getInputVolume: () => number;
    getOutputVolume: () => number;
    sendFeedback: (like: boolean) => void;
    getId: () => string | undefined;
    sendContextualUpdate: (text: string) => void;
    sendUserMessage: (text: string) => void;
    sendUserActivity: () => void;
    status: Status;
    canSendFeedback: boolean;
    micMuted: boolean | undefined;
    isSpeaking: boolean;
};
package/dist/lib.cjs
ADDED
@@ -0,0 +1,2 @@
var e=require("react"),n=require("@elevenlabs/client");function t(){return t=Object.assign?Object.assign.bind():function(e){for(var n=1;n<arguments.length;n++){var t=arguments[n];for(var r in t)({}).hasOwnProperty.call(t,r)&&(e[r]=t[r])}return e},t.apply(null,arguments)}var r=["micMuted","volume"];Object.defineProperty(exports,"postOverallFeedback",{enumerable:!0,get:function(){return n.postOverallFeedback}}),exports.useConversation=function(u){void 0===u&&(u={});var c=u.micMuted,o=u.volume,l=function(e,n){if(null==e)return{};var t={};for(var r in e)if({}.hasOwnProperty.call(e,r)){if(n.includes(r))continue;t[r]=e[r]}return t}(u,r),i=e.useRef(null),a=e.useRef(null),s=e.useState("disconnected"),v=s[0],d=s[1],f=e.useState(!1),m=f[0],g=f[1],p=e.useState("listening"),h=p[0],y=p[1];return e.useEffect(function(){var e;void 0!==c&&(null==i||null==(e=i.current)||e.setMicMuted(c))},[c]),e.useEffect(function(){var e;void 0!==o&&(null==i||null==(e=i.current)||e.setVolume({volume:o}))},[o]),e.useEffect(function(){return function(){var e;null==(e=i.current)||e.endSession()}},[]),{startSession:function(e){try{var r,u,s=function(r){return u?r:function(r,u){try{var s=(a.current=n.Conversation.startSession(t({},null!=l?l:{},null!=e?e:{},{onModeChange:function(e){y(e.mode)},onStatusChange:function(e){d(e.status)},onCanSendFeedbackChange:function(e){g(e.canSendFeedback)}})),Promise.resolve(a.current).then(function(e){return i.current=e,void 0!==c&&i.current.setMicMuted(c),void 0!==o&&i.current.setVolume({volume:o}),i.current.getId()}))}catch(e){return u(!0,e)}return s&&s.then?s.then(u.bind(null,!1),u.bind(null,!0)):u(!1,s)}(0,function(e,n){if(a.current=null,e)throw n;return n})};if(null!=(r=i.current)&&r.isOpen())return Promise.resolve(i.current.getId());var v=function(){if(a.current)return Promise.resolve(a.current).then(function(e){var n=e.getId();return u=1,n})}();return Promise.resolve(v&&v.then?v.then(s):s(v))}catch(e){return Promise.reject(e)}},endSession:function(){try{var 
e=i.current;return i.current=null,Promise.resolve(null==e?void 0:e.endSession()).then(function(){})}catch(e){return Promise.reject(e)}},setVolume:function(e){var n;null==(n=i.current)||n.setVolume({volume:e.volume})},getInputByteFrequencyData:function(){var e;return null==(e=i.current)?void 0:e.getInputByteFrequencyData()},getOutputByteFrequencyData:function(){var e;return null==(e=i.current)?void 0:e.getOutputByteFrequencyData()},getInputVolume:function(){var e,n;return null!=(e=null==(n=i.current)?void 0:n.getInputVolume())?e:0},getOutputVolume:function(){var e,n;return null!=(e=null==(n=i.current)?void 0:n.getOutputVolume())?e:0},sendFeedback:function(e){var n;null==(n=i.current)||n.sendFeedback(e)},getId:function(){var e;return null==(e=i.current)?void 0:e.getId()},sendContextualUpdate:function(e){var n;null==(n=i.current)||n.sendContextualUpdate(e)},sendUserMessage:function(e){var n;null==(n=i.current)||n.sendUserMessage(e)},sendUserActivity:function(){var e;null==(e=i.current)||e.sendUserActivity()},status:v,canSendFeedback:m,micMuted:c,isSpeaking:"speaking"===h}};
//# sourceMappingURL=lib.cjs.map
package/dist/lib.cjs.map
ADDED
@@ -0,0 +1 @@
{"version":3,"file":"lib.cjs","sources":["../src/index.ts"],"sourcesContent":["import { useEffect, useRef, useState } from \"react\";\nimport {\n Conversation,\n Mode,\n SessionConfig,\n Callbacks,\n Options,\n Status,\n ClientToolsConfig,\n InputConfig,\n} from \"@elevenlabs/client\";\n\nexport type {\n Role,\n Mode,\n Status,\n SessionConfig,\n DisconnectionDetails,\n Language,\n} from \"@elevenlabs/client\";\nexport { postOverallFeedback } from \"@elevenlabs/client\";\n\nexport type HookOptions = Partial<\n SessionConfig & HookCallbacks & ClientToolsConfig & InputConfig\n>;\nexport type ControlledState = {\n micMuted?: boolean;\n volume?: number;\n};\nexport type HookCallbacks = Pick<\n Callbacks,\n | \"onConnect\"\n | \"onDisconnect\"\n | \"onError\"\n | \"onMessage\"\n | \"onAudio\"\n | \"onDebug\"\n | \"onUnhandledClientToolCall\"\n>;\n\nexport function useConversation<T extends HookOptions & ControlledState>(\n props: T = {} as T\n) {\n const { micMuted, volume, ...defaultOptions } = props;\n const conversationRef = useRef<Conversation | null>(null);\n const lockRef = useRef<Promise<Conversation> | null>(null);\n const [status, setStatus] = useState<Status>(\"disconnected\");\n const [canSendFeedback, setCanSendFeedback] = useState(false);\n const [mode, setMode] = useState<Mode>(\"listening\");\n\n useEffect(() => {\n if (micMuted !== undefined) {\n conversationRef?.current?.setMicMuted(micMuted);\n }\n }, [micMuted]);\n\n useEffect(() => {\n if (volume !== undefined) {\n conversationRef?.current?.setVolume({ volume });\n }\n }, [volume]);\n\n useEffect(() => {\n return () => {\n conversationRef.current?.endSession();\n };\n }, []);\n\n return {\n startSession: (async (options?: HookOptions) => {\n if (conversationRef.current?.isOpen()) {\n return conversationRef.current.getId();\n }\n\n if (lockRef.current) {\n const conversation = await lockRef.current;\n return conversation.getId();\n }\n\n try {\n lockRef.current = Conversation.startSession({\n 
...(defaultOptions ?? {}),\n ...(options ?? {}),\n onModeChange: ({ mode }) => {\n setMode(mode);\n },\n onStatusChange: ({ status }) => {\n setStatus(status);\n },\n onCanSendFeedbackChange: ({ canSendFeedback }) => {\n setCanSendFeedback(canSendFeedback);\n },\n } as Options);\n\n conversationRef.current = await lockRef.current;\n // Persist controlled state between sessions\n if (micMuted !== undefined) {\n conversationRef.current.setMicMuted(micMuted);\n }\n if (volume !== undefined) {\n conversationRef.current.setVolume({ volume });\n }\n\n return conversationRef.current.getId();\n } finally {\n lockRef.current = null;\n }\n }) as T extends SessionConfig\n ? (options?: HookOptions) => Promise<string>\n : (options: SessionConfig & HookOptions) => Promise<string>,\n endSession: async () => {\n const conversation = conversationRef.current;\n conversationRef.current = null;\n await conversation?.endSession();\n },\n setVolume: ({ volume }: { volume: number }) => {\n conversationRef.current?.setVolume({ volume });\n },\n getInputByteFrequencyData: () => {\n return conversationRef.current?.getInputByteFrequencyData();\n },\n getOutputByteFrequencyData: () => {\n return conversationRef.current?.getOutputByteFrequencyData();\n },\n getInputVolume: () => {\n return conversationRef.current?.getInputVolume() ?? 0;\n },\n getOutputVolume: () => {\n return conversationRef.current?.getOutputVolume() ?? 
0;\n },\n sendFeedback: (like: boolean) => {\n conversationRef.current?.sendFeedback(like);\n },\n getId: () => {\n return conversationRef.current?.getId();\n },\n sendContextualUpdate: (text: string) => {\n conversationRef.current?.sendContextualUpdate(text);\n },\n sendUserMessage: (text: string) => {\n conversationRef.current?.sendUserMessage(text);\n },\n sendUserActivity: () => {\n conversationRef.current?.sendUserActivity();\n },\n status,\n canSendFeedback,\n micMuted,\n isSpeaking: mode === \"speaking\",\n };\n}\n\n// const con = useConversation({agentId: \"\"})\n"],"names":["props","micMuted","volume","defaultOptions","_objectWithoutPropertiesLoose","_excluded","conversationRef","useRef","lockRef","_useState","useState","status","setStatus","_useState2","canSendFeedback","setCanSendFeedback","_useState3","mode","setMode","useEffect","_conversationRef$curr","undefined","current","setMicMuted","_conversationRef$curr2","setVolume","_conversationRef$curr3","endSession","startSession","options","_conversationRef$curr4","_exit","_temp2","_result","Conversation","_extends","onModeChange","_ref","onStatusChange","_ref2","onCanSendFeedbackChange","_ref3","Promise","resolve","then","_lockRef$current","getId","_finallyRethrows","_wasThrown","_result2","isOpen","_temp","conversation","_conversation$getId","e","reject","_ref4","_conversationRef$curr5","getInputByteFrequencyData","_conversationRef$curr6","getOutputByteFrequencyData","_conversationRef$curr7","getInputVolume","_conversationRef$curr8","_conversationRef$curr9","getOutputVolume","_conversationRef$curr10","_conversationRef$curr11","sendFeedback","like","_conversationRef$curr12","_conversationRef$curr13","sendContextualUpdate","text","_conversationRef$curr14","sendUserMessage","_conversationRef$curr15","sendUserActivity","_conversationRef$curr16","isSpeaking"],"mappings":"qbAwCgB,SACdA,QAAAA,IAAAA,IAAAA,EAAW,CAAA,GAEX,IAAQC,EAAwCD,EAAxCC,SAAUC,EAA8BF,EAA9BE,OAAWC,yIAAcC,CAAKJ,EAALK,GACrCC,EAAkBC,EAAMA,OAAsB,MAC
9CC,EAAUD,EAAAA,OAAqC,MACrDE,EAA4BC,EAAAA,SAAiB,gBAAtCC,EAAMF,EAAEG,GAAAA,EAASH,EACxB,GAAAI,EAA8CH,EAAAA,UAAS,GAAhDI,EAAeD,EAAA,GAAEE,EAAkBF,EAC1C,GAAAG,EAAwBN,EAAAA,SAAe,aAAhCO,EAAID,EAAA,GAAEE,EAAOF,EAAA,GAoBpB,OAlBAG,EAASA,UAAC,eACoBC,OAAXC,IAAbpB,IACa,MAAfK,GAAAc,OAAeA,EAAfd,EAAiBgB,UAAjBF,EAA0BG,YAAYtB,GAE1C,EAAG,CAACA,IAEJkB,EAASA,UAAC,WACkBK,IAAAA,OAAXH,IAAXnB,IACa,MAAfI,GAAAkB,OAAeA,EAAflB,EAAiBgB,UAAjBE,EAA0BC,UAAU,CAAEvB,OAAAA,IAE1C,EAAG,CAACA,IAEJiB,EAAAA,UAAU,WACR,OAAO,WAAKO,IAAAA,EACa,OAAvBA,EAAApB,EAAgBgB,UAAhBI,EAAyBC,YAC3B,CACF,EAAG,IAEI,CACLC,sBAAsBC,GAAyB,IAAA,IAAAC,EAoCpBC,EApCoBC,EAAA,SAAAC,GAAAF,OAAAA,EAAAE,2BAW3CzB,EAAQc,QAAUY,EAAAA,aAAaN,aAAYO,EAAA,CAAA,EACrChC,MAAAA,EAAAA,EAAkB,CAAA,EACX,MAAP0B,EAAAA,EAAW,CAAA,EACfO,CAAAA,aAAc,SAAFC,GACVnB,EADmBmB,EAAJpB,KAEjB,EACAqB,eAAgB,SAAFC,GACZ3B,EADuB2B,EAAN5B,OAEnB,EACA6B,wBAAyB,SAAFC,GACrB1B,EADyC0B,EAAf3B,gBAE5B,KACY4B,QAAAC,QAEkBnC,EAAQc,SAAOsB,KAAAC,SAAAA,GAS/C,OATAvC,EAAgBgB,QAAOuB,OAENxB,IAAbpB,GACFK,EAAgBgB,QAAQC,YAAYtB,QAEvBoB,IAAXnB,GACFI,EAAgBgB,QAAQG,UAAU,CAAEvB,OAAAA,IAG/BI,EAAgBgB,QAAQwB,OAAQ,6FAlCIC,CAAA,EAmC5CC,SAAAA,EAAAC,GACwB,GAAvBzC,EAAQc,QAAU,KAAK0B,QAAAC,EAAA,OAAAA,CAAA,EAAA,EAnCzB,GAA2B,OAA3BnB,EAAIxB,EAAgBgB,UAAhBQ,EAAyBoB,SAC3B,OAAAR,QAAAC,QAAOrC,EAAgBgB,QAAQwB,SAChC,IAAAK,EAEG3C,WAAAA,GAAAA,EAAQc,QAAOoB,OAAAA,QAAAC,QACUnC,EAAQc,SAAOsB,KAAA,SAApCQ,GAAY,IAAAC,EACXD,EAAaN,QAAOO,OAAAtB,EAAAsB,EAAAA,CAAA,GAFzB7C,UAEyBkC,QAAAC,QAAAQ,GAAAA,EAAAP,KAAAO,EAAAP,KAAAZ,GAAAA,EAAAmB,GA+B/B,CAAC,MAAAG,GAAA,OAAAZ,QAAAa,OAAAD,EAE4D,CAAA,EAC7D3B,WAAU,WAAA,IACR,IAAMyB,EAAe9C,EAAgBgB,QACN,OAA/BhB,EAAgBgB,QAAU,KAAKoB,QAAAC,QACb,MAAZS,OAAY,EAAZA,EAAczB,cAAYiB,kBAClC,CAAC,MAAAU,GAAA,OAAAZ,QAAAa,OAAAD,EAAA,CAAA,EACD7B,UAAW,SAAF+B,GAAqC,IAAAC,EACrB,OAAvBA,EAAAnD,EAAgBgB,UAAhBmC,EAAyBhC,UAAU,CAAEvB,OADnBsD,EAANtD,QAEd,EACAwD,0BAA2B,WAAK,IAAAC,EAC9B,OAA8B,OAA9BA,EAAOrD,EAAgBgB,cAAO,EAAvBqC,EAAyBD,2BAClC,EACAE,2BAA4B,WAAK,IAAAC,EAC/B,OAAOA,OAAPA,EAAOvD,EAAgBgB,cAAhBuC,EAAAA,EAAyBD,4BAClC,EACAE,eAAgB,WAAK
C,IAAAA,EAAAC,EACnB,OAAgD,OAAhDD,SAAAC,EAAO1D,EAAgBgB,gBAAhB0C,EAAyBF,kBAAgBC,EAAI,CACtD,EACAE,gBAAiB,WAAK,IAAAC,EAAAC,EACpB,OAAiDD,OAAjDA,EAAOC,OAAPA,EAAO7D,EAAgBgB,cAAhB6C,EAAAA,EAAyBF,mBAAiBC,EAAI,CACvD,EACAE,aAAc,SAACC,GAAiBC,IAAAA,EAC9BA,OAAAA,EAAAhE,EAAgBgB,UAAhBgD,EAAyBF,aAAaC,EACxC,EACAvB,MAAO,WAAK,IAAAyB,EACV,OAA8B,OAA9BA,EAAOjE,EAAgBgB,cAAO,EAAvBiD,EAAyBzB,OAClC,EACA0B,qBAAsB,SAACC,OAAgBC,EACd,OAAvBA,EAAApE,EAAgBgB,UAAhBoD,EAAyBF,qBAAqBC,EAChD,EACAE,gBAAiB,SAACF,GAAgB,IAAAG,EACT,OAAvBA,EAAAtE,EAAgBgB,UAAhBsD,EAAyBD,gBAAgBF,EAC3C,EACAI,iBAAkB,WAAK,IAAAC,EACE,OAAvBA,EAAAxE,EAAgBgB,UAAhBwD,EAAyBD,kBAC3B,EACAlE,OAAAA,EACAG,gBAAAA,EACAb,SAAAA,EACA8E,WAAqB,aAAT9D,EAEhB"}
|
package/dist/lib.modern.js
ADDED
@@ -0,0 +1,2 @@
import{useRef as e,useState as n,useEffect as t}from"react";import{Conversation as r}from"@elevenlabs/client";export{postOverallFeedback}from"@elevenlabs/client";function u(){return u=Object.assign?Object.assign.bind():function(e){for(var n=1;n<arguments.length;n++){var t=arguments[n];for(var r in t)({}).hasOwnProperty.call(t,r)&&(e[r]=t[r])}return e},u.apply(null,arguments)}const l=["micMuted","volume"];function a(a={}){const{micMuted:c,volume:s}=a,o=function(e,n){if(null==e)return{};var t={};for(var r in e)if({}.hasOwnProperty.call(e,r)){if(n.includes(r))continue;t[r]=e[r]}return t}(a,l),i=e(null),d=e(null),[v,g]=n("disconnected"),[m,p]=n(!1),[y,f]=n("listening");return t(()=>{var e;void 0!==c&&(null==i||null==(e=i.current)||e.setMicMuted(c))},[c]),t(()=>{var e;void 0!==s&&(null==i||null==(e=i.current)||e.setVolume({volume:s}))},[s]),t(()=>()=>{var e;null==(e=i.current)||e.endSession()},[]),{startSession:async e=>{var n;if(null!=(n=i.current)&&n.isOpen())return i.current.getId();if(d.current)return(await d.current).getId();try{return d.current=r.startSession(u({},null!=o?o:{},null!=e?e:{},{onModeChange:({mode:e})=>{f(e)},onStatusChange:({status:e})=>{g(e)},onCanSendFeedbackChange:({canSendFeedback:e})=>{p(e)}})),i.current=await d.current,void 0!==c&&i.current.setMicMuted(c),void 0!==s&&i.current.setVolume({volume:s}),i.current.getId()}finally{d.current=null}},endSession:async()=>{const e=i.current;i.current=null,await(null==e?void 0:e.endSession())},setVolume:({volume:e})=>{var n;null==(n=i.current)||n.setVolume({volume:e})},getInputByteFrequencyData:()=>{var e;return null==(e=i.current)?void 0:e.getInputByteFrequencyData()},getOutputByteFrequencyData:()=>{var e;return null==(e=i.current)?void 0:e.getOutputByteFrequencyData()},getInputVolume:()=>{var e,n;return null!=(e=null==(n=i.current)?void 0:n.getInputVolume())?e:0},getOutputVolume:()=>{var e,n;return null!=(e=null==(n=i.current)?void 0:n.getOutputVolume())?e:0},sendFeedback:e=>{var 
n;null==(n=i.current)||n.sendFeedback(e)},getId:()=>{var e;return null==(e=i.current)?void 0:e.getId()},sendContextualUpdate:e=>{var n;null==(n=i.current)||n.sendContextualUpdate(e)},sendUserMessage:e=>{var n;null==(n=i.current)||n.sendUserMessage(e)},sendUserActivity:()=>{var e;null==(e=i.current)||e.sendUserActivity()},status:v,canSendFeedback:m,micMuted:c,isSpeaking:"speaking"===y}}export{a as useConversation};
//# sourceMappingURL=lib.modern.js.map
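The bundle above inlines two Babel helpers: an `Object.assign` shim and `_objectWithoutPropertiesLoose`, the downleveled form of rest destructuring (`const { micMuted, volume, ...defaultOptions } = props`). A readable sketch of the rest helper, under a hypothetical name `omit` (not part of the package's API):

```typescript
// Equivalent of the inlined _objectWithoutPropertiesLoose helper:
// copy every own enumerable key of `source` except those in `excluded`.
function omit<T extends object>(
  source: T | null | undefined,
  excluded: string[]
): Partial<T> {
  if (source == null) return {};
  const target: Partial<T> = {};
  for (const key in source) {
    // skip inherited properties, matching the helper's hasOwnProperty check
    if (!Object.prototype.hasOwnProperty.call(source, key)) continue;
    if (excluded.includes(key)) continue;
    target[key] = source[key];
  }
  return target;
}
```

This is how the hook separates the controlled props (`micMuted`, `volume`) from the options forwarded to `Conversation.startSession`.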
package/dist/lib.modern.js.map
ADDED
@@ -0,0 +1 @@
{"version":3,"file":"lib.modern.js","sources":["../src/index.ts"],"sourcesContent":["import { useEffect, useRef, useState } from \"react\";\nimport {\n Conversation,\n Mode,\n SessionConfig,\n Callbacks,\n Options,\n Status,\n ClientToolsConfig,\n InputConfig,\n} from \"@elevenlabs/client\";\n\nexport type {\n Role,\n Mode,\n Status,\n SessionConfig,\n DisconnectionDetails,\n Language,\n} from \"@elevenlabs/client\";\nexport { postOverallFeedback } from \"@elevenlabs/client\";\n\nexport type HookOptions = Partial<\n SessionConfig & HookCallbacks & ClientToolsConfig & InputConfig\n>;\nexport type ControlledState = {\n micMuted?: boolean;\n volume?: number;\n};\nexport type HookCallbacks = Pick<\n Callbacks,\n | \"onConnect\"\n | \"onDisconnect\"\n | \"onError\"\n | \"onMessage\"\n | \"onAudio\"\n | \"onDebug\"\n | \"onUnhandledClientToolCall\"\n>;\n\nexport function useConversation<T extends HookOptions & ControlledState>(\n props: T = {} as T\n) {\n const { micMuted, volume, ...defaultOptions } = props;\n const conversationRef = useRef<Conversation | null>(null);\n const lockRef = useRef<Promise<Conversation> | null>(null);\n const [status, setStatus] = useState<Status>(\"disconnected\");\n const [canSendFeedback, setCanSendFeedback] = useState(false);\n const [mode, setMode] = useState<Mode>(\"listening\");\n\n useEffect(() => {\n if (micMuted !== undefined) {\n conversationRef?.current?.setMicMuted(micMuted);\n }\n }, [micMuted]);\n\n useEffect(() => {\n if (volume !== undefined) {\n conversationRef?.current?.setVolume({ volume });\n }\n }, [volume]);\n\n useEffect(() => {\n return () => {\n conversationRef.current?.endSession();\n };\n }, []);\n\n return {\n startSession: (async (options?: HookOptions) => {\n if (conversationRef.current?.isOpen()) {\n return conversationRef.current.getId();\n }\n\n if (lockRef.current) {\n const conversation = await lockRef.current;\n return conversation.getId();\n }\n\n try {\n lockRef.current = Conversation.startSession({\n 
...(defaultOptions ?? {}),\n ...(options ?? {}),\n onModeChange: ({ mode }) => {\n setMode(mode);\n },\n onStatusChange: ({ status }) => {\n setStatus(status);\n },\n onCanSendFeedbackChange: ({ canSendFeedback }) => {\n setCanSendFeedback(canSendFeedback);\n },\n } as Options);\n\n conversationRef.current = await lockRef.current;\n // Persist controlled state between sessions\n if (micMuted !== undefined) {\n conversationRef.current.setMicMuted(micMuted);\n }\n if (volume !== undefined) {\n conversationRef.current.setVolume({ volume });\n }\n\n return conversationRef.current.getId();\n } finally {\n lockRef.current = null;\n }\n }) as T extends SessionConfig\n ? (options?: HookOptions) => Promise<string>\n : (options: SessionConfig & HookOptions) => Promise<string>,\n endSession: async () => {\n const conversation = conversationRef.current;\n conversationRef.current = null;\n await conversation?.endSession();\n },\n setVolume: ({ volume }: { volume: number }) => {\n conversationRef.current?.setVolume({ volume });\n },\n getInputByteFrequencyData: () => {\n return conversationRef.current?.getInputByteFrequencyData();\n },\n getOutputByteFrequencyData: () => {\n return conversationRef.current?.getOutputByteFrequencyData();\n },\n getInputVolume: () => {\n return conversationRef.current?.getInputVolume() ?? 0;\n },\n getOutputVolume: () => {\n return conversationRef.current?.getOutputVolume() ?? 
0;\n },\n sendFeedback: (like: boolean) => {\n conversationRef.current?.sendFeedback(like);\n },\n getId: () => {\n return conversationRef.current?.getId();\n },\n sendContextualUpdate: (text: string) => {\n conversationRef.current?.sendContextualUpdate(text);\n },\n sendUserMessage: (text: string) => {\n conversationRef.current?.sendUserMessage(text);\n },\n sendUserActivity: () => {\n conversationRef.current?.sendUserActivity();\n },\n status,\n canSendFeedback,\n micMuted,\n isSpeaking: mode === \"speaking\",\n };\n}\n\n// const con = useConversation({agentId: \"\"})\n"],"names":["_excluded","useConversation","props","micMuted","volume","defaultOptions","_objectWithoutPropertiesLoose","conversationRef","useRef","lockRef","status","setStatus","useState","canSendFeedback","setCanSendFeedback","mode","setMode","useEffect","_conversationRef$curr","undefined","current","setMicMuted","_conversationRef$curr2","setVolume","_conversationRef$curr3","endSession","startSession","async","_conversationRef$curr4","isOpen","getId","Conversation","_extends","options","onModeChange","onStatusChange","onCanSendFeedbackChange","conversation","_conversationRef$curr5","getInputByteFrequencyData","_conversationRef$curr6","getOutputByteFrequencyData","_conversationRef$curr7","getInputVolume","_conversationRef$curr8","_conversationRef$curr9","getOutputVolume","_conversationRef$curr10","_conversationRef$curr11","sendFeedback","like","_conversationRef$curr12","_conversationRef$curr13","sendContextualUpdate","text","_conversationRef$curr14","sendUserMessage","_conversationRef$curr15","sendUserActivity","_conversationRef$curr16","isSpeaking"],"mappings":"0XAAA,MAAAA,EAAA,CAAA,WAAA,UAwCgB,SAAAC,EACdC,EAAW,CAAA,GAEX,MAAMC,SAAEA,EAAQC,OAAEA,GAA8BF,EAAnBG,yIAAcC,CAAKJ,EAAKF,GAC/CO,EAAkBC,EAA4B,MAC9CC,EAAUD,EAAqC,OAC9CE,EAAQC,GAAaC,EAAiB,iBACtCC,EAAiBC,GAAsBF,GAAS,IAChDG,EAAMC,GAAWJ,EAAe,aAoBvC,OAlBAK,EAAU,SACoBC,OAAXC,IAAbhB,IACa,MAAfI,GAAwB,OAATW,EAAfX,EAAiBa,UAAjBF,EAA0BG,YAAYlB,GACxC,EACC,CAA
CA,IAEJc,EAAU,KACkB,IAAAK,OAAXH,IAAXf,IACakB,MAAff,GAAwB,OAATe,EAAff,EAAiBa,UAAjBE,EAA0BC,UAAU,CAAEnB,WACxC,EACC,CAACA,IAEJa,EAAU,IACD,KAAK,IAAAO,EACVA,OAAAA,EAAAjB,EAAgBa,UAAhBI,EAAyBC,YAC3B,EACC,IAEI,CACLC,aAAeC,UAAgC,IAAAC,EAC7C,GAAIA,OAAJA,EAAIrB,EAAgBa,UAAhBQ,EAAyBC,SAC3B,OAAOtB,EAAgBa,QAAQU,QAGjC,GAAIrB,EAAQW,QAEV,aAD2BX,EAAQW,SACfU,QAGtB,IAwBE,OAvBArB,EAAQW,QAAUW,EAAaL,aAAYM,EACrC3B,CAAAA,EAAAA,MAAAA,EAAAA,EAAkB,CAAA,QAClB4B,EAAAA,EAAW,CAAA,GACfC,aAAcA,EAAGnB,WACfC,EAAQD,EAAI,EAEdoB,eAAgBA,EAAGzB,aACjBC,EAAUD,EAAM,EAElB0B,wBAAyBA,EAAGvB,sBAC1BC,EAAmBD,OAIvBN,EAAgBa,cAAgBX,EAAQW,aAEvBD,IAAbhB,GACFI,EAAgBa,QAAQC,YAAYlB,QAEvBgB,IAAXf,GACFG,EAAgBa,QAAQG,UAAU,CAAEnB,WAG/BG,EAAgBa,QAAQU,OACjC,CAAC,QACCrB,EAAQW,QAAU,IACpB,GAIFK,WAAYE,UACV,MAAMU,EAAe9B,EAAgBa,QACrCb,EAAgBa,QAAU,WACpBiB,MAAAA,OAAAA,EAAAA,EAAcZ,aAAY,EAElCF,UAAWA,EAAGnB,aAAgC,IAAAkC,EACrB,OAAvBA,EAAA/B,EAAgBa,UAAhBkB,EAAyBf,UAAU,CAAEnB,UACvC,EACAmC,0BAA2BA,KAAK,IAAAC,EAC9B,cAAAA,EAAOjC,EAAgBa,gBAAhBoB,EAAyBD,2BAAyB,EAE3DE,2BAA4BA,KAAK,IAAAC,EAC/B,cAAAA,EAAOnC,EAAgBa,gBAAhBsB,EAAyBD,4BAClC,EACAE,eAAgBA,KAAK,IAAAC,EAAAC,EACnB,cAAAD,EAA8B,OAA9BC,EAAOtC,EAAgBa,cAAO,EAAvByB,EAAyBF,kBAAgBC,EAAI,GAEtDE,gBAAiBA,KAAK,IAAAC,EAAAC,EACpB,OAAiD,OAAjDD,EAA8B,OAA9BC,EAAOzC,EAAgBa,cAAO,EAAvB4B,EAAyBF,mBAAiBC,EAAI,GAEvDE,aAAeC,IAAiB,IAAAC,SAC9BA,EAAA5C,EAAgBa,UAAhB+B,EAAyBF,aAAaC,EAAI,EAE5CpB,MAAOA,KAAKsB,IAAAA,EACV,OAAOA,OAAPA,EAAO7C,EAAgBa,cAAhBgC,EAAAA,EAAyBtB,SAElCuB,qBAAuBC,IAAgBC,IAAAA,EACrCA,OAAAA,EAAAhD,EAAgBa,UAAhBmC,EAAyBF,qBAAqBC,IAEhDE,gBAAkBF,IAAgB,IAAAG,EACT,OAAvBA,EAAAlD,EAAgBa,UAAhBqC,EAAyBD,gBAAgBF,EAAI,EAE/CI,iBAAkBA,SAAKC,EACrBA,OAAAA,EAAApD,EAAgBa,UAAhBuC,EAAyBD,kBAAgB,EAE3ChD,SACAG,kBACAV,WACAyD,WAAqB,aAAT7C,EAEhB"}
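The hook source embedded in this source map shows `startSession` storing its in-flight promise in `lockRef`, so concurrent callers await the same session instead of opening two. A standalone sketch of that promise-lock pattern (names like `Session` and `makeStarter` are illustrative; the real hook keeps both values in React refs and additionally checks `isOpen()`):

```typescript
interface Session {
  id: string;
}

function makeStarter(createSession: () => Promise<Session>) {
  let current: Session | null = null;       // resolved session, if any
  let lock: Promise<Session> | null = null; // in-flight start: the "lock"

  return async function startSession(): Promise<string> {
    if (current) return current.id;   // already open: reuse it
    if (lock) return (await lock).id; // mid-start: share the same promise

    try {
      lock = createSession(); // publish the promise BEFORE awaiting it
      current = await lock;
      return current.id;
    } finally {
      lock = null; // release the lock whether the start succeeded or failed
    }
  };
}
```

Assigning the promise synchronously, before the first `await`, is what makes the deduplication work: a second call arriving during connection setup sees `lock` set and joins it.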
package/dist/lib.module.js
ADDED
@@ -0,0 +1,2 @@
import{useRef as n,useState as e,useEffect as t}from"react";import{Conversation as r}from"@elevenlabs/client";export{postOverallFeedback}from"@elevenlabs/client";function u(){return u=Object.assign?Object.assign.bind():function(n){for(var e=1;e<arguments.length;e++){var t=arguments[e];for(var r in t)({}).hasOwnProperty.call(t,r)&&(n[r]=t[r])}return n},u.apply(null,arguments)}var l=["micMuted","volume"];function o(o){void 0===o&&(o={});var c=o.micMuted,i=o.volume,a=function(n,e){if(null==n)return{};var t={};for(var r in n)if({}.hasOwnProperty.call(n,r)){if(e.includes(r))continue;t[r]=n[r]}return t}(o,l),s=n(null),v=n(null),d=e("disconnected"),f=d[0],m=d[1],g=e(!1),p=g[0],h=g[1],y=e("listening"),b=y[0],F=y[1];return t(function(){var n;void 0!==c&&(null==s||null==(n=s.current)||n.setMicMuted(c))},[c]),t(function(){var n;void 0!==i&&(null==s||null==(n=s.current)||n.setVolume({volume:i}))},[i]),t(function(){return function(){var n;null==(n=s.current)||n.endSession()}},[]),{startSession:function(n){try{var e,t,l=function(e){return t?e:function(e,t){try{var l=(v.current=r.startSession(u({},null!=a?a:{},null!=n?n:{},{onModeChange:function(n){F(n.mode)},onStatusChange:function(n){m(n.status)},onCanSendFeedbackChange:function(n){h(n.canSendFeedback)}})),Promise.resolve(v.current).then(function(n){return s.current=n,void 0!==c&&s.current.setMicMuted(c),void 0!==i&&s.current.setVolume({volume:i}),s.current.getId()}))}catch(n){return t(!0,n)}return l&&l.then?l.then(t.bind(null,!1),t.bind(null,!0)):t(!1,l)}(0,function(n,e){if(v.current=null,n)throw e;return e})};if(null!=(e=s.current)&&e.isOpen())return Promise.resolve(s.current.getId());var o=function(){if(v.current)return Promise.resolve(v.current).then(function(n){var e=n.getId();return t=1,e})}();return Promise.resolve(o&&o.then?o.then(l):l(o))}catch(n){return Promise.reject(n)}},endSession:function(){try{var n=s.current;return s.current=null,Promise.resolve(null==n?void 0:n.endSession()).then(function(){})}catch(n){return 
Promise.reject(n)}},setVolume:function(n){var e;null==(e=s.current)||e.setVolume({volume:n.volume})},getInputByteFrequencyData:function(){var n;return null==(n=s.current)?void 0:n.getInputByteFrequencyData()},getOutputByteFrequencyData:function(){var n;return null==(n=s.current)?void 0:n.getOutputByteFrequencyData()},getInputVolume:function(){var n,e;return null!=(n=null==(e=s.current)?void 0:e.getInputVolume())?n:0},getOutputVolume:function(){var n,e;return null!=(n=null==(e=s.current)?void 0:e.getOutputVolume())?n:0},sendFeedback:function(n){var e;null==(e=s.current)||e.sendFeedback(n)},getId:function(){var n;return null==(n=s.current)?void 0:n.getId()},sendContextualUpdate:function(n){var e;null==(e=s.current)||e.sendContextualUpdate(n)},sendUserMessage:function(n){var e;null==(e=s.current)||e.sendUserMessage(n)},sendUserActivity:function(){var n;null==(n=s.current)||n.sendUserActivity()},status:f,canSendFeedback:p,micMuted:c,isSpeaking:"speaking"===b}}export{o as useConversation};
//# sourceMappingURL=lib.module.js.map
package/dist/lib.module.js.map
ADDED
@@ -0,0 +1 @@
{"version":3,"file":"lib.module.js","sources":["../src/index.ts"],"sourcesContent":["import { useEffect, useRef, useState } from \"react\";\nimport {\n Conversation,\n Mode,\n SessionConfig,\n Callbacks,\n Options,\n Status,\n ClientToolsConfig,\n InputConfig,\n} from \"@elevenlabs/client\";\n\nexport type {\n Role,\n Mode,\n Status,\n SessionConfig,\n DisconnectionDetails,\n Language,\n} from \"@elevenlabs/client\";\nexport { postOverallFeedback } from \"@elevenlabs/client\";\n\nexport type HookOptions = Partial<\n SessionConfig & HookCallbacks & ClientToolsConfig & InputConfig\n>;\nexport type ControlledState = {\n micMuted?: boolean;\n volume?: number;\n};\nexport type HookCallbacks = Pick<\n Callbacks,\n | \"onConnect\"\n | \"onDisconnect\"\n | \"onError\"\n | \"onMessage\"\n | \"onAudio\"\n | \"onDebug\"\n | \"onUnhandledClientToolCall\"\n>;\n\nexport function useConversation<T extends HookOptions & ControlledState>(\n props: T = {} as T\n) {\n const { micMuted, volume, ...defaultOptions } = props;\n const conversationRef = useRef<Conversation | null>(null);\n const lockRef = useRef<Promise<Conversation> | null>(null);\n const [status, setStatus] = useState<Status>(\"disconnected\");\n const [canSendFeedback, setCanSendFeedback] = useState(false);\n const [mode, setMode] = useState<Mode>(\"listening\");\n\n useEffect(() => {\n if (micMuted !== undefined) {\n conversationRef?.current?.setMicMuted(micMuted);\n }\n }, [micMuted]);\n\n useEffect(() => {\n if (volume !== undefined) {\n conversationRef?.current?.setVolume({ volume });\n }\n }, [volume]);\n\n useEffect(() => {\n return () => {\n conversationRef.current?.endSession();\n };\n }, []);\n\n return {\n startSession: (async (options?: HookOptions) => {\n if (conversationRef.current?.isOpen()) {\n return conversationRef.current.getId();\n }\n\n if (lockRef.current) {\n const conversation = await lockRef.current;\n return conversation.getId();\n }\n\n try {\n lockRef.current = Conversation.startSession({\n 
...(defaultOptions ?? {}),\n ...(options ?? {}),\n onModeChange: ({ mode }) => {\n setMode(mode);\n },\n onStatusChange: ({ status }) => {\n setStatus(status);\n },\n onCanSendFeedbackChange: ({ canSendFeedback }) => {\n setCanSendFeedback(canSendFeedback);\n },\n } as Options);\n\n conversationRef.current = await lockRef.current;\n // Persist controlled state between sessions\n if (micMuted !== undefined) {\n conversationRef.current.setMicMuted(micMuted);\n }\n if (volume !== undefined) {\n conversationRef.current.setVolume({ volume });\n }\n\n return conversationRef.current.getId();\n } finally {\n lockRef.current = null;\n }\n }) as T extends SessionConfig\n ? (options?: HookOptions) => Promise<string>\n : (options: SessionConfig & HookOptions) => Promise<string>,\n endSession: async () => {\n const conversation = conversationRef.current;\n conversationRef.current = null;\n await conversation?.endSession();\n },\n setVolume: ({ volume }: { volume: number }) => {\n conversationRef.current?.setVolume({ volume });\n },\n getInputByteFrequencyData: () => {\n return conversationRef.current?.getInputByteFrequencyData();\n },\n getOutputByteFrequencyData: () => {\n return conversationRef.current?.getOutputByteFrequencyData();\n },\n getInputVolume: () => {\n return conversationRef.current?.getInputVolume() ?? 0;\n },\n getOutputVolume: () => {\n return conversationRef.current?.getOutputVolume() ?? 
0;\n },\n sendFeedback: (like: boolean) => {\n conversationRef.current?.sendFeedback(like);\n },\n getId: () => {\n return conversationRef.current?.getId();\n },\n sendContextualUpdate: (text: string) => {\n conversationRef.current?.sendContextualUpdate(text);\n },\n sendUserMessage: (text: string) => {\n conversationRef.current?.sendUserMessage(text);\n },\n sendUserActivity: () => {\n conversationRef.current?.sendUserActivity();\n },\n status,\n canSendFeedback,\n micMuted,\n isSpeaking: mode === \"speaking\",\n };\n}\n\n// const con = useConversation({agentId: \"\"})\n"],"names":["useConversation","props","micMuted","volume","defaultOptions","_objectWithoutPropertiesLoose","_excluded","conversationRef","useRef","lockRef","_useState","useState","status","setStatus","_useState2","canSendFeedback","setCanSendFeedback","_useState3","mode","setMode","useEffect","_conversationRef$curr","undefined","current","setMicMuted","_conversationRef$curr2","setVolume","_conversationRef$curr3","endSession","startSession","options","_conversationRef$curr4","_exit","_temp2","_result","Conversation","_extends","onModeChange","_ref","onStatusChange","_ref2","onCanSendFeedbackChange","_ref3","Promise","resolve","then","_lockRef$current","getId","_finallyRethrows","_wasThrown","_result2","isOpen","_temp","conversation","_conversation$getId","e","reject","_ref4","_conversationRef$curr5","getInputByteFrequencyData","_conversationRef$curr6","getOutputByteFrequencyData","_conversationRef$curr7","getInputVolume","_conversationRef$curr8","_conversationRef$curr9","getOutputVolume","_conversationRef$curr10","_conversationRef$curr11","sendFeedback","like","_conversationRef$curr12","_conversationRef$curr13","sendContextualUpdate","text","_conversationRef$curr14","sendUserMessage","_conversationRef$curr15","sendUserActivity","_conversationRef$curr16","isSpeaking"],"mappings":"sZAwCgB,SAAAA,EACdC,QAAAA,IAAAA,IAAAA,EAAW,CAAA,GAEX,IAAQC,EAAwCD,EAAxCC,SAAUC,EAA8BF,EAA9BE,OAAWC,yIAAcC,CAAKJ,EAALK,GACrC
C,EAAkBC,EAA4B,MAC9CC,EAAUD,EAAqC,MACrDE,EAA4BC,EAAiB,gBAAtCC,EAAMF,EAAEG,GAAAA,EAASH,EACxB,GAAAI,EAA8CH,GAAS,GAAhDI,EAAeD,EAAA,GAAEE,EAAkBF,EAC1C,GAAAG,EAAwBN,EAAe,aAAhCO,EAAID,EAAA,GAAEE,EAAOF,EAAA,GAoBpB,OAlBAG,EAAU,eACoBC,OAAXC,IAAbpB,IACa,MAAfK,GAAAc,OAAeA,EAAfd,EAAiBgB,UAAjBF,EAA0BG,YAAYtB,GAE1C,EAAG,CAACA,IAEJkB,EAAU,WACkBK,IAAAA,OAAXH,IAAXnB,IACa,MAAfI,GAAAkB,OAAeA,EAAflB,EAAiBgB,UAAjBE,EAA0BC,UAAU,CAAEvB,OAAAA,IAE1C,EAAG,CAACA,IAEJiB,EAAU,WACR,OAAO,WAAKO,IAAAA,EACa,OAAvBA,EAAApB,EAAgBgB,UAAhBI,EAAyBC,YAC3B,CACF,EAAG,IAEI,CACLC,sBAAsBC,GAAyB,IAAA,IAAAC,EAoCpBC,EApCoBC,EAAA,SAAAC,GAAAF,OAAAA,EAAAE,2BAW3CzB,EAAQc,QAAUY,EAAaN,aAAYO,EAAA,CAAA,EACrChC,MAAAA,EAAAA,EAAkB,CAAA,EACX,MAAP0B,EAAAA,EAAW,CAAA,EACfO,CAAAA,aAAc,SAAFC,GACVnB,EADmBmB,EAAJpB,KAEjB,EACAqB,eAAgB,SAAFC,GACZ3B,EADuB2B,EAAN5B,OAEnB,EACA6B,wBAAyB,SAAFC,GACrB1B,EADyC0B,EAAf3B,gBAE5B,KACY4B,QAAAC,QAEkBnC,EAAQc,SAAOsB,KAAAC,SAAAA,GAS/C,OATAvC,EAAgBgB,QAAOuB,OAENxB,IAAbpB,GACFK,EAAgBgB,QAAQC,YAAYtB,QAEvBoB,IAAXnB,GACFI,EAAgBgB,QAAQG,UAAU,CAAEvB,OAAAA,IAG/BI,EAAgBgB,QAAQwB,OAAQ,6FAlCIC,CAAA,EAmC5CC,SAAAA,EAAAC,GACwB,GAAvBzC,EAAQc,QAAU,KAAK0B,QAAAC,EAAA,OAAAA,CAAA,EAAA,EAnCzB,GAA2B,OAA3BnB,EAAIxB,EAAgBgB,UAAhBQ,EAAyBoB,SAC3B,OAAAR,QAAAC,QAAOrC,EAAgBgB,QAAQwB,SAChC,IAAAK,EAEG3C,WAAAA,GAAAA,EAAQc,QAAOoB,OAAAA,QAAAC,QACUnC,EAAQc,SAAOsB,KAAA,SAApCQ,GAAY,IAAAC,EACXD,EAAaN,QAAOO,OAAAtB,EAAAsB,EAAAA,CAAA,GAFzB7C,UAEyBkC,QAAAC,QAAAQ,GAAAA,EAAAP,KAAAO,EAAAP,KAAAZ,GAAAA,EAAAmB,GA+B/B,CAAC,MAAAG,GAAA,OAAAZ,QAAAa,OAAAD,EAE4D,CAAA,EAC7D3B,WAAU,WAAA,IACR,IAAMyB,EAAe9C,EAAgBgB,QACN,OAA/BhB,EAAgBgB,QAAU,KAAKoB,QAAAC,QACb,MAAZS,OAAY,EAAZA,EAAczB,cAAYiB,kBAClC,CAAC,MAAAU,GAAA,OAAAZ,QAAAa,OAAAD,EAAA,CAAA,EACD7B,UAAW,SAAF+B,GAAqC,IAAAC,EACrB,OAAvBA,EAAAnD,EAAgBgB,UAAhBmC,EAAyBhC,UAAU,CAAEvB,OADnBsD,EAANtD,QAEd,EACAwD,0BAA2B,WAAK,IAAAC,EAC9B,OAA8B,OAA9BA,EAAOrD,EAAgBgB,cAAO,EAAvBqC,EAAyBD,2BAClC,EACAE,2BAA4B,WAAK,IAAAC,EAC/B,OAAOA,OAAPA,EAAOvD,EAAgBgB,cAAhBuC,EAAAA,EAAyBD,4BAClC,EACAE,eAAgB,WAAKC,IAAAA,EAAAC,EACnB,OAAgD,OAAh
DD,SAAAC,EAAO1D,EAAgBgB,gBAAhB0C,EAAyBF,kBAAgBC,EAAI,CACtD,EACAE,gBAAiB,WAAK,IAAAC,EAAAC,EACpB,OAAiDD,OAAjDA,EAAOC,OAAPA,EAAO7D,EAAgBgB,cAAhB6C,EAAAA,EAAyBF,mBAAiBC,EAAI,CACvD,EACAE,aAAc,SAACC,GAAiBC,IAAAA,EAC9BA,OAAAA,EAAAhE,EAAgBgB,UAAhBgD,EAAyBF,aAAaC,EACxC,EACAvB,MAAO,WAAK,IAAAyB,EACV,OAA8B,OAA9BA,EAAOjE,EAAgBgB,cAAO,EAAvBiD,EAAyBzB,OAClC,EACA0B,qBAAsB,SAACC,OAAgBC,EACd,OAAvBA,EAAApE,EAAgBgB,UAAhBoD,EAAyBF,qBAAqBC,EAChD,EACAE,gBAAiB,SAACF,GAAgB,IAAAG,EACT,OAAvBA,EAAAtE,EAAgBgB,UAAhBsD,EAAyBD,gBAAgBF,EAC3C,EACAI,iBAAkB,WAAK,IAAAC,EACE,OAAvBA,EAAAxE,EAAgBgB,UAAhBwD,EAAyBD,kBAC3B,EACAlE,OAAAA,EACAG,gBAAAA,EACAb,SAAAA,EACA8E,WAAqB,aAAT9D,EAEhB"}
package/dist/lib.umd.js
ADDED
@@ -0,0 +1,2 @@
!function(e,n){"object"==typeof exports&&"undefined"!=typeof module?n(exports,require("react"),require("@elevenlabs/client")):"function"==typeof define&&define.amd?define(["exports","react","@elevenlabs/client"],n):n((e||self).react={},e.react,e.client)}(this,function(e,n,t){function r(){return r=Object.assign?Object.assign.bind():function(e){for(var n=1;n<arguments.length;n++){var t=arguments[n];for(var r in t)({}).hasOwnProperty.call(t,r)&&(e[r]=t[r])}return e},r.apply(null,arguments)}var u=["micMuted","volume"];Object.defineProperty(e,"postOverallFeedback",{enumerable:!0,get:function(){return t.postOverallFeedback}}),e.useConversation=function(e){void 0===e&&(e={});var o=e.micMuted,c=e.volume,l=function(e,n){if(null==e)return{};var t={};for(var r in e)if({}.hasOwnProperty.call(e,r)){if(n.includes(r))continue;t[r]=e[r]}return t}(e,u),i=n.useRef(null),a=n.useRef(null),s=n.useState("disconnected"),d=s[0],f=s[1],v=n.useState(!1),m=v[0],g=v[1],p=n.useState("listening"),y=p[0],h=p[1];return n.useEffect(function(){var e;void 0!==o&&(null==i||null==(e=i.current)||e.setMicMuted(o))},[o]),n.useEffect(function(){var e;void 0!==c&&(null==i||null==(e=i.current)||e.setVolume({volume:c}))},[c]),n.useEffect(function(){return function(){var e;null==(e=i.current)||e.endSession()}},[]),{startSession:function(e){try{var n,u,s=function(n){return u?n:function(n,u){try{var s=(a.current=t.Conversation.startSession(r({},null!=l?l:{},null!=e?e:{},{onModeChange:function(e){h(e.mode)},onStatusChange:function(e){f(e.status)},onCanSendFeedbackChange:function(e){g(e.canSendFeedback)}})),Promise.resolve(a.current).then(function(e){return i.current=e,void 0!==o&&i.current.setMicMuted(o),void 0!==c&&i.current.setVolume({volume:c}),i.current.getId()}))}catch(e){return u(!0,e)}return s&&s.then?s.then(u.bind(null,!1),u.bind(null,!0)):u(!1,s)}(0,function(e,n){if(a.current=null,e)throw n;return n})};if(null!=(n=i.current)&&n.isOpen())return Promise.resolve(i.current.getId());var 
d=function(){if(a.current)return Promise.resolve(a.current).then(function(e){var n=e.getId();return u=1,n})}();return Promise.resolve(d&&d.then?d.then(s):s(d))}catch(e){return Promise.reject(e)}},endSession:function(){try{var e=i.current;return i.current=null,Promise.resolve(null==e?void 0:e.endSession()).then(function(){})}catch(e){return Promise.reject(e)}},setVolume:function(e){var n;null==(n=i.current)||n.setVolume({volume:e.volume})},getInputByteFrequencyData:function(){var e;return null==(e=i.current)?void 0:e.getInputByteFrequencyData()},getOutputByteFrequencyData:function(){var e;return null==(e=i.current)?void 0:e.getOutputByteFrequencyData()},getInputVolume:function(){var e,n;return null!=(e=null==(n=i.current)?void 0:n.getInputVolume())?e:0},getOutputVolume:function(){var e,n;return null!=(e=null==(n=i.current)?void 0:n.getOutputVolume())?e:0},sendFeedback:function(e){var n;null==(n=i.current)||n.sendFeedback(e)},getId:function(){var e;return null==(e=i.current)?void 0:e.getId()},sendContextualUpdate:function(e){var n;null==(n=i.current)||n.sendContextualUpdate(e)},sendUserMessage:function(e){var n;null==(n=i.current)||n.sendUserMessage(e)},sendUserActivity:function(){var e;null==(e=i.current)||e.sendUserActivity()},status:d,canSendFeedback:m,micMuted:o,isSpeaking:"speaking"===y}}});
//# sourceMappingURL=lib.umd.js.map
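The UMD wrapper above probes the environment in a fixed order: CommonJS first, then AMD, then a browser global. A pure-function sketch of that branch logic (the `env` snapshot is an illustration; the real wrapper reads the `exports`, `module`, and `define` globals directly and attaches the global build under the `react` key):

```typescript
type ModuleEnv = {
  exports?: unknown;
  module?: unknown;
  define?: { (...args: unknown[]): void; amd?: object };
};

function chooseModuleSystem(env: ModuleEnv): "commonjs" | "amd" | "global" {
  if (typeof env.exports === "object" && typeof env.module !== "undefined") {
    return "commonjs"; // Node and CommonJS bundlers: fill in module.exports
  }
  if (typeof env.define === "function" && env.define.amd) {
    return "amd"; // RequireJS-style loaders: define(["exports", ...], factory)
  }
  return "global"; // plain <script>: attach to (root || self)
}
```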
package/dist/lib.umd.js.map
ADDED
@@ -0,0 +1 @@
{"version":3,"file":"lib.umd.js","sources":["../src/index.ts"],"sourcesContent":["import { useEffect, useRef, useState } from \"react\";\nimport {\n Conversation,\n Mode,\n SessionConfig,\n Callbacks,\n Options,\n Status,\n ClientToolsConfig,\n InputConfig,\n} from \"@elevenlabs/client\";\n\nexport type {\n Role,\n Mode,\n Status,\n SessionConfig,\n DisconnectionDetails,\n Language,\n} from \"@elevenlabs/client\";\nexport { postOverallFeedback } from \"@elevenlabs/client\";\n\nexport type HookOptions = Partial<\n SessionConfig & HookCallbacks & ClientToolsConfig & InputConfig\n>;\nexport type ControlledState = {\n micMuted?: boolean;\n volume?: number;\n};\nexport type HookCallbacks = Pick<\n Callbacks,\n | \"onConnect\"\n | \"onDisconnect\"\n | \"onError\"\n | \"onMessage\"\n | \"onAudio\"\n | \"onDebug\"\n | \"onUnhandledClientToolCall\"\n>;\n\nexport function useConversation<T extends HookOptions & ControlledState>(\n props: T = {} as T\n) {\n const { micMuted, volume, ...defaultOptions } = props;\n const conversationRef = useRef<Conversation | null>(null);\n const lockRef = useRef<Promise<Conversation> | null>(null);\n const [status, setStatus] = useState<Status>(\"disconnected\");\n const [canSendFeedback, setCanSendFeedback] = useState(false);\n const [mode, setMode] = useState<Mode>(\"listening\");\n\n useEffect(() => {\n if (micMuted !== undefined) {\n conversationRef?.current?.setMicMuted(micMuted);\n }\n }, [micMuted]);\n\n useEffect(() => {\n if (volume !== undefined) {\n conversationRef?.current?.setVolume({ volume });\n }\n }, [volume]);\n\n useEffect(() => {\n return () => {\n conversationRef.current?.endSession();\n };\n }, []);\n\n return {\n startSession: (async (options?: HookOptions) => {\n if (conversationRef.current?.isOpen()) {\n return conversationRef.current.getId();\n }\n\n if (lockRef.current) {\n const conversation = await lockRef.current;\n return conversation.getId();\n }\n\n try {\n lockRef.current = Conversation.startSession({\n 
...(defaultOptions ?? {}),\n ...(options ?? {}),\n onModeChange: ({ mode }) => {\n setMode(mode);\n },\n onStatusChange: ({ status }) => {\n setStatus(status);\n },\n onCanSendFeedbackChange: ({ canSendFeedback }) => {\n setCanSendFeedback(canSendFeedback);\n },\n } as Options);\n\n conversationRef.current = await lockRef.current;\n // Persist controlled state between sessions\n if (micMuted !== undefined) {\n conversationRef.current.setMicMuted(micMuted);\n }\n if (volume !== undefined) {\n conversationRef.current.setVolume({ volume });\n }\n\n return conversationRef.current.getId();\n } finally {\n lockRef.current = null;\n }\n }) as T extends SessionConfig\n ? (options?: HookOptions) => Promise<string>\n : (options: SessionConfig & HookOptions) => Promise<string>,\n endSession: async () => {\n const conversation = conversationRef.current;\n conversationRef.current = null;\n await conversation?.endSession();\n },\n setVolume: ({ volume }: { volume: number }) => {\n conversationRef.current?.setVolume({ volume });\n },\n getInputByteFrequencyData: () => {\n return conversationRef.current?.getInputByteFrequencyData();\n },\n getOutputByteFrequencyData: () => {\n return conversationRef.current?.getOutputByteFrequencyData();\n },\n getInputVolume: () => {\n return conversationRef.current?.getInputVolume() ?? 0;\n },\n getOutputVolume: () => {\n return conversationRef.current?.getOutputVolume() ?? 
0;\n },\n sendFeedback: (like: boolean) => {\n conversationRef.current?.sendFeedback(like);\n },\n getId: () => {\n return conversationRef.current?.getId();\n },\n sendContextualUpdate: (text: string) => {\n conversationRef.current?.sendContextualUpdate(text);\n },\n sendUserMessage: (text: string) => {\n conversationRef.current?.sendUserMessage(text);\n },\n sendUserActivity: () => {\n conversationRef.current?.sendUserActivity();\n },\n status,\n canSendFeedback,\n micMuted,\n isSpeaking: mode === \"speaking\",\n };\n}\n\n// const con = useConversation({agentId: \"\"})\n"],"names":["props","micMuted","volume","defaultOptions","_objectWithoutPropertiesLoose","_excluded","conversationRef","useRef","lockRef","_useState","useState","status","setStatus","_useState2","canSendFeedback","setCanSendFeedback","_useState3","mode","setMode","useEffect","_conversationRef$curr","undefined","current","setMicMuted","_conversationRef$curr2","setVolume","_conversationRef$curr3","endSession","startSession","options","_conversationRef$curr4","_exit","_temp2","_result","Conversation","_extends","onModeChange","_ref","onStatusChange","_ref2","onCanSendFeedbackChange","_ref3","Promise","resolve","then","_lockRef$current","getId","_finallyRethrows","_wasThrown","_result2","isOpen","_temp","conversation","_conversation$getId","e","reject","_ref4","_conversationRef$curr5","getInputByteFrequencyData","_conversationRef$curr6","getOutputByteFrequencyData","_conversationRef$curr7","getInputVolume","_conversationRef$curr8","_conversationRef$curr9","getOutputVolume","_conversationRef$curr10","_conversationRef$curr11","sendFeedback","like","_conversationRef$curr12","_conversationRef$curr13","sendContextualUpdate","text","_conversationRef$curr14","sendUserMessage","_conversationRef$curr15","sendUserActivity","_conversationRef$curr16","isSpeaking"],"mappings":"krBAwCgB,SACdA,QAAAA,IAAAA,IAAAA,EAAW,CAAA,GAEX,IAAQC,EAAwCD,EAAxCC,SAAUC,EAA8BF,EAA9BE,OAAWC,yIAAcC,CAAKJ,EAALK,GACrCC,EAAkBC,EAAMA,OAAsB,MA
C9CC,EAAUD,EAAAA,OAAqC,MACrDE,EAA4BC,EAAAA,SAAiB,gBAAtCC,EAAMF,EAAEG,GAAAA,EAASH,EACxB,GAAAI,EAA8CH,EAAAA,UAAS,GAAhDI,EAAeD,EAAA,GAAEE,EAAkBF,EAC1C,GAAAG,EAAwBN,EAAAA,SAAe,aAAhCO,EAAID,EAAA,GAAEE,EAAOF,EAAA,GAoBpB,OAlBAG,EAASA,UAAC,eACoBC,OAAXC,IAAbpB,IACa,MAAfK,GAAAc,OAAeA,EAAfd,EAAiBgB,UAAjBF,EAA0BG,YAAYtB,GAE1C,EAAG,CAACA,IAEJkB,EAASA,UAAC,WACkBK,IAAAA,OAAXH,IAAXnB,IACa,MAAfI,GAAAkB,OAAeA,EAAflB,EAAiBgB,UAAjBE,EAA0BC,UAAU,CAAEvB,OAAAA,IAE1C,EAAG,CAACA,IAEJiB,EAAAA,UAAU,WACR,OAAO,WAAKO,IAAAA,EACa,OAAvBA,EAAApB,EAAgBgB,UAAhBI,EAAyBC,YAC3B,CACF,EAAG,IAEI,CACLC,sBAAsBC,GAAyB,IAAA,IAAAC,EAoCpBC,EApCoBC,EAAA,SAAAC,GAAAF,OAAAA,EAAAE,2BAW3CzB,EAAQc,QAAUY,EAAAA,aAAaN,aAAYO,EAAA,CAAA,EACrChC,MAAAA,EAAAA,EAAkB,CAAA,EACX,MAAP0B,EAAAA,EAAW,CAAA,EACfO,CAAAA,aAAc,SAAFC,GACVnB,EADmBmB,EAAJpB,KAEjB,EACAqB,eAAgB,SAAFC,GACZ3B,EADuB2B,EAAN5B,OAEnB,EACA6B,wBAAyB,SAAFC,GACrB1B,EADyC0B,EAAf3B,gBAE5B,KACY4B,QAAAC,QAEkBnC,EAAQc,SAAOsB,KAAAC,SAAAA,GAS/C,OATAvC,EAAgBgB,QAAOuB,OAENxB,IAAbpB,GACFK,EAAgBgB,QAAQC,YAAYtB,QAEvBoB,IAAXnB,GACFI,EAAgBgB,QAAQG,UAAU,CAAEvB,OAAAA,IAG/BI,EAAgBgB,QAAQwB,OAAQ,6FAlCIC,CAAA,EAmC5CC,SAAAA,EAAAC,GACwB,GAAvBzC,EAAQc,QAAU,KAAK0B,QAAAC,EAAA,OAAAA,CAAA,EAAA,EAnCzB,GAA2B,OAA3BnB,EAAIxB,EAAgBgB,UAAhBQ,EAAyBoB,SAC3B,OAAAR,QAAAC,QAAOrC,EAAgBgB,QAAQwB,SAChC,IAAAK,EAEG3C,WAAAA,GAAAA,EAAQc,QAAOoB,OAAAA,QAAAC,QACUnC,EAAQc,SAAOsB,KAAA,SAApCQ,GAAY,IAAAC,EACXD,EAAaN,QAAOO,OAAAtB,EAAAsB,EAAAA,CAAA,GAFzB7C,UAEyBkC,QAAAC,QAAAQ,GAAAA,EAAAP,KAAAO,EAAAP,KAAAZ,GAAAA,EAAAmB,GA+B/B,CAAC,MAAAG,GAAA,OAAAZ,QAAAa,OAAAD,EAE4D,CAAA,EAC7D3B,WAAU,WAAA,IACR,IAAMyB,EAAe9C,EAAgBgB,QACN,OAA/BhB,EAAgBgB,QAAU,KAAKoB,QAAAC,QACb,MAAZS,OAAY,EAAZA,EAAczB,cAAYiB,kBAClC,CAAC,MAAAU,GAAA,OAAAZ,QAAAa,OAAAD,EAAA,CAAA,EACD7B,UAAW,SAAF+B,GAAqC,IAAAC,EACrB,OAAvBA,EAAAnD,EAAgBgB,UAAhBmC,EAAyBhC,UAAU,CAAEvB,OADnBsD,EAANtD,QAEd,EACAwD,0BAA2B,WAAK,IAAAC,EAC9B,OAA8B,OAA9BA,EAAOrD,EAAgBgB,cAAO,EAAvBqC,EAAyBD,2BAClC,EACAE,2BAA4B,WAAK,IAAAC,EAC/B,OAAOA,OAAPA,EAAOvD,EAAgBgB,cAAhBuC,EAAAA,EAAyBD,4BAClC,EACAE,eAAgB,WAA
KC,IAAAA,EAAAC,EACnB,OAAgD,OAAhDD,SAAAC,EAAO1D,EAAgBgB,gBAAhB0C,EAAyBF,kBAAgBC,EAAI,CACtD,EACAE,gBAAiB,WAAK,IAAAC,EAAAC,EACpB,OAAiDD,OAAjDA,EAAOC,OAAPA,EAAO7D,EAAgBgB,cAAhB6C,EAAAA,EAAyBF,mBAAiBC,EAAI,CACvD,EACAE,aAAc,SAACC,GAAiBC,IAAAA,EAC9BA,OAAAA,EAAAhE,EAAgBgB,UAAhBgD,EAAyBF,aAAaC,EACxC,EACAvB,MAAO,WAAK,IAAAyB,EACV,OAA8B,OAA9BA,EAAOjE,EAAgBgB,cAAO,EAAvBiD,EAAyBzB,OAClC,EACA0B,qBAAsB,SAACC,OAAgBC,EACd,OAAvBA,EAAApE,EAAgBgB,UAAhBoD,EAAyBF,qBAAqBC,EAChD,EACAE,gBAAiB,SAACF,GAAgB,IAAAG,EACT,OAAvBA,EAAAtE,EAAgBgB,UAAhBsD,EAAyBD,gBAAgBF,EAC3C,EACAI,iBAAkB,WAAK,IAAAC,EACE,OAAvBA,EAAAxE,EAAgBgB,UAAhBwD,EAAyBD,kBAC3B,EACAlE,OAAAA,EACAG,gBAAAA,EACAb,SAAAA,EACA8E,WAAqB,aAAT9D,EAEhB"}
package/package.json
ADDED
@@ -0,0 +1,50 @@
{
  "name": "@elevenlabs/react",
  "version": "0.1.5",
  "description": "ElevenLabs React Library",
  "main": "./dist/lib.umd.js",
  "module": "./dist/lib.module.js",
  "source": "src/index.ts",
  "type": "module",
  "unpkg": "./dist/lib.umd.js",
  "types": "./dist/index.d.ts",
  "exports": {
    ".": {
      "types": "./dist/index.d.ts",
      "import": "./dist/lib.modern.js",
      "require": "./dist/lib.cjs"
    }
  },
  "keywords": [],
  "author": "ElevenLabs",
  "license": "MIT",
  "dependencies": {
    "@elevenlabs/client": "0.1.5"
  },
  "peerDependencies": {
    "react": ">=16.8.0"
  },
  "devDependencies": {
    "@types/jest": "^29.5.12",
    "@types/react": "^18.3.3",
    "eslint": "^9.8.0",
    "jest": "^29.7.0",
    "microbundle": "^0.15.1",
    "typescript": "^5.5.4"
  },
  "repository": {
    "type": "git",
    "url": "git+https://github.com/elevenlabs/packages.git",
    "directory": "packages/react"
  },
  "scripts": {
    "build": "BROWSERSLIST_ENV=modern microbundle --jsx React.createElement --jsxFragment React.Fragment --jsxImportSource react src/index.ts",
    "clean": "rm -rf ./dist",
    "dev": "npm run clean && BROWSERSLIST_ENV=development microbundle --jsx React.createElement --jsxFragment React.Fragment --jsxImportSource react src/index.ts -w -f modern",
    "lint": "npm run lint:ts && npm run lint:es && npm run lint:prettier",
    "lint:ts": "tsc --noEmit --skipLibCheck",
    "lint:es": "npx eslint .",
    "lint:prettier": "prettier 'src/**/*.ts' --check",
    "test": "jest"
  }
}