@revrag-ai/embed-react-native 1.0.11 → 1.0.12

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (28)
  1. package/README.md +707 -253
  2. package/dist/commonjs/components/Embed/EmbedButton.js +2 -0
  3. package/dist/commonjs/components/Embed/EmbedButton.js.map +1 -1
  4. package/dist/commonjs/hooks/initialize.js +3 -24
  5. package/dist/commonjs/hooks/initialize.js.map +1 -1
  6. package/dist/commonjs/store/store.key.js +0 -3
  7. package/dist/commonjs/store/store.key.js.map +1 -1
  8. package/dist/commonjs/utils/permision.js +29 -0
  9. package/dist/commonjs/utils/permision.js.map +1 -0
  10. package/dist/commonjs/utils/reanimated.helper.js +0 -1
  11. package/dist/commonjs/utils/reanimated.helper.js.map +1 -1
  12. package/dist/module/components/Embed/EmbedButton.js +2 -0
  13. package/dist/module/components/Embed/EmbedButton.js.map +1 -1
  14. package/dist/module/hooks/initialize.js +2 -23
  15. package/dist/module/hooks/initialize.js.map +1 -1
  16. package/dist/module/store/store.key.js +0 -3
  17. package/dist/module/store/store.key.js.map +1 -1
  18. package/dist/module/utils/permision.js +24 -0
  19. package/dist/module/utils/permision.js.map +1 -0
  20. package/dist/module/utils/reanimated.helper.js +0 -1
  21. package/dist/module/utils/reanimated.helper.js.map +1 -1
  22. package/dist/typescript/src/components/Embed/EmbedButton.d.ts.map +1 -1
  23. package/dist/typescript/src/hooks/initialize.d.ts.map +1 -1
  24. package/dist/typescript/src/store/store.key.d.ts.map +1 -1
  25. package/dist/typescript/src/utils/permision.d.ts +2 -0
  26. package/dist/typescript/src/utils/permision.d.ts.map +1 -0
  27. package/dist/typescript/src/utils/reanimated.helper.d.ts.map +1 -1
  28. package/package.json +1 -1
package/README.md CHANGED
@@ -1,381 +1,835 @@
- # @revrag-ai/embed-react-native
+ # Embed React Native SDK Integration Guide
 
- A powerful React Native library for integrating AI-powered voice agents into your mobile applications. This SDK provides real-time voice communication capabilities with intelligent speech processing, perfect for building conversational AI experiences.
+ ## Overview
 
- ## 🚀 Features
+ The Embed React Native SDK is a powerful voice-enabled AI agent library that provides real-time communication capabilities. This guide walks you through the complete integration process, from installation to deployment.
 
- - **AI Voice Agent Integration**: Seamlessly integrate intelligent voice agents into your React Native app
- - **Real-time Voice Communication**: High-quality, low-latency voice chat powered by LiveKit
- - **Voice Activity Detection**: Automatic speech detection and processing
- - **Customizable UI Components**: Pre-built, customizable voice interface components
- - **Cross-platform**: Works on both iOS and Android
- - **TypeScript Support**: Full TypeScript definitions included
- - **Event System**: Comprehensive event handling for voice interactions
- - **Audio Visualization**: Built-in waveform visualization for voice activity
+ ## Table of Contents
 
- ## 📦 Installation
+ 1. [Installation](#installation)
+ 2. [Peer Dependencies](#peer-dependencies)
+ 3. [Android Configuration](#android-configuration)
+ 4. [iOS Configuration](#ios-configuration)
+ 5. [Babel Configuration](#babel-configuration)
+ 6. [SDK Initialization](#sdk-initialization)
+ 7. [App Setup](#app-setup)
+ 8. [Event System](#event-system)
+ 9. [Usage Examples](#usage-examples)
+ 10. [FAQ & Troubleshooting](#faq--troubleshooting)
 
- ```sh
+ ## Installation
+
+ Install the Embed React Native SDK using your preferred package manager:
+
+ ```bash
+ # Using npm
  npm install @revrag-ai/embed-react-native
+
+ # Using yarn
+ yarn add @revrag-ai/embed-react-native
+
+ # Using pnpm
+ pnpm add @revrag-ai/embed-react-native
+
  ```
 
- ### Peer Dependencies
+ ## Peer Dependencies
+
+ The SDK requires several peer dependencies to be installed in your project. Install all required dependencies:
+
+ ```bash
+ # Install peer dependencies
+ npm install @livekit/react-native @livekit/react-native-webrtc
+ npm install @react-native-async-storage/async-storage
+ npm install react-native-gesture-handler react-native-reanimated
+ npm install react-native-linear-gradient lottie-react-native
+ npm install react-native-safe-area-context
 
- Make sure to install the required peer dependencies:
+ # For iOS, run pod install
+ cd ios && pod install && cd ..
 
- ```sh
- npm install @livekit/react-native @livekit/react-native-webrtc @react-native-async-storage/async-storage lottie-react-native react-native-gesture-handler react-native-linear-gradient react-native-reanimated react-native-safe-area-context
  ```
 
- ### iOS Setup
+ ### Alternative installation commands
 
- Add the following to your `ios/YourApp/Info.plist`:
+ **Using Yarn:**
+
+ ```bash
+ yarn add @livekit/react-native @livekit/react-native-webrtc @react-native-async-storage/async-storage react-native-gesture-handler react-native-reanimated react-native-linear-gradient lottie-react-native react-native-safe-area-context
+
+ ```
+
+ **Using pnpm:**
+
+ ```bash
+ pnpm add @livekit/react-native @livekit/react-native-webrtc @react-native-async-storage/async-storage react-native-gesture-handler react-native-reanimated react-native-linear-gradient lottie-react-native react-native-safe-area-context
 
- ```xml
- <key>NSMicrophoneUsageDescription</key>
- <string>This app needs microphone access for voice communication</string>
  ```
 
- ### Android Setup
+ ## Android Configuration
+
+ ### 1. Android Manifest Permissions
 
  Add the following permissions to your `android/app/src/main/AndroidManifest.xml`:
 
  ```xml
- <uses-permission android:name="android.permission.RECORD_AUDIO" />
- <uses-permission android:name="android.permission.INTERNET" />
- <uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
+ <manifest xmlns:android="http://schemas.android.com/apk/res/android">
+
+   <!-- Required permissions for Embed SDK -->
+   <uses-permission android:name="android.permission.INTERNET" />
+   <uses-permission android:name="android.permission.RECORD_AUDIO" />
+   <uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
+   <uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
+   <uses-permission android:name="android.permission.WAKE_LOCK" />
+
+   <application
+     android:name=".MainApplication"
+     android:label="@string/app_name"
+     android:icon="@mipmap/ic_launcher"
+     android:roundIcon="@mipmap/ic_launcher_round"
+     android:allowBackup="false"
+     android:theme="@style/AppTheme"
+     android:supportsRtl="true"
+     android:usesCleartextTraffic="true"
+     android:hardwareAccelerated="true">
+
+     <!-- Your activities and other components -->
+
+   </application>
+ </manifest>
+
+ ```
+
+ ### 2. Build.gradle Configuration
+
+ Add the Lottie dependency to your `android/app/build.gradle`:
+
+ ```gradle
+ dependencies {
+   implementation 'com.airbnb.android:lottie:6.0.1'
+   // ... other dependencies
+ }
+
+ ```
+
+ ### 3. ProGuard Configuration (if using ProGuard)
+
+ Add to your `android/app/proguard-rules.pro`:
+
+ ```
+ # Embed SDK
+ -keep class com.revrag.embed.** { *; }
+ -keep class org.webrtc.** { *; }
+ -dontwarn org.webrtc.**
+
+ # Lottie
+ -keep class com.airbnb.lottie.** { *; }
+
+ ```
+
+ ## iOS Configuration
+
+ ### 1. iOS Permissions
+
+ **🚨 CRITICAL:** Add the following permissions to your `ios/YourAppName/Info.plist`. **Missing `NSMicrophoneUsageDescription` will cause the app to crash when accessing the microphone:**
+
+ ```xml
+ <key>NSMicrophoneUsageDescription</key>
+ <string>This app needs access to the microphone for voice communication with the AI agent</string>
+
+ <key>NSAppTransportSecurity</key>
+ <dict>
+   <key>NSAllowsArbitraryLoads</key>
+   <false/>
+   <key>NSAllowsLocalNetworking</key>
+   <true/>
+ </dict>
+
  ```
 
- ## 🎯 Quick Start
+ **⚠️ App Crash Fix:** If your app crashes with "attempted to access privacy-sensitive data without a usage description", ensure the `NSMicrophoneUsageDescription` key is present in your `Info.plist`.
 
- ### 1. Initialize the SDK
+ ### 2. Pod Installation
 
- Wrap your app with the initialization hook:
+ After installing peer dependencies, run:
+
+ ```bash
+ cd ios
+ pod install
+ cd ..
+
+ ```
+
+ ### 3. iOS Build Settings (if needed)
+
+ If you encounter build issues, add these to your iOS project settings:
+
+ - Enable Bitcode: `NO`
+ - Build Active Architecture Only: `YES` (for Debug)
+
+ ## Babel Configuration
+
+ **⚠️ CRITICAL: React Native Reanimated Configuration**
+
+ Add the React Native Reanimated plugin to your `babel.config.js`. **This must be the last plugin in the plugins array:**
+
+ ```jsx
+ module.exports = {
+   presets: ['module:@react-native/babel-preset'],
+   plugins: [
+     // ... other plugins
+     'react-native-reanimated/plugin', // ← This MUST be the last plugin
+   ],
+ };
+
+ ```
 
- ```tsx
+ **❌ Common Mistake:**
+
+ ```jsx
+ // DON'T DO THIS - other plugins after the reanimated plugin will cause issues
+ module.exports = {
+   presets: ['module:@react-native/babel-preset'],
+   plugins: [
+     'react-native-reanimated/plugin',
+     'some-other-plugin', // ← This will break reanimated
+   ],
+ };
+
+ ```
+
+ **✅ Correct Configuration:**
+
+ ```jsx
+ // DO THIS - reanimated plugin as the last plugin
+ module.exports = {
+   presets: ['module:@react-native/babel-preset'],
+   plugins: [
+     'some-other-plugin',
+     'another-plugin',
+     'react-native-reanimated/plugin', // ← Last plugin
+   ],
+ };
+
+ ```
+
+ After updating `babel.config.js`, clean your project:
+
+ ```bash
+ # For React Native
+ npx react-native start --reset-cache
+
+ # For Expo (if applicable)
+ expo start --clear
+
+ ```
+
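The plugin-order requirement is easy to regress once more Babel plugins are added over time. The following sanity check is a hypothetical helper (not part of the SDK or Babel) that flags a misplaced reanimated plugin; it accounts for Babel plugin entries being either plain strings or `[name, options]` tuples:

```javascript
// Sanity check (hypothetical helper, not part of the SDK or Babel):
// verify that 'react-native-reanimated/plugin' is the LAST entry in a
// Babel plugins array. Entries may be strings or [name, options] tuples.
function reanimatedIsLastPlugin(plugins) {
  if (!Array.isArray(plugins) || plugins.length === 0) return false;
  const last = plugins[plugins.length - 1];
  const name = Array.isArray(last) ? last[0] : last;
  return name === 'react-native-reanimated/plugin';
}

// Example: check your own babel.config.js before a release build.
const config = {
  presets: ['module:@react-native/babel-preset'],
  plugins: ['some-other-plugin', 'react-native-reanimated/plugin'],
};
console.log(reanimatedIsLastPlugin(config.plugins)); // → true
```

Such a check could run in CI or a pre-commit hook so a reordered plugins array fails fast instead of producing confusing runtime animation errors.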
+ ## SDK Initialization
+
+ ### 1. useInitialize Hook
+
+ Initialize the SDK at the root level of your application using the `useInitialize` hook:
+
+ ```jsx
+ import { useInitialize } from '@revrag-ai/embed-react-native';
+
+ function App() {
+   const { isInitialized, error } = useInitialize({
+     apiKey: 'YOUR_API_KEY',
+     embedUrl: 'YOUR_EMBED_SERVER_URL',
+   });
+
+   if (error) {
+     console.error('SDK initialization failed:', error);
+   }
+
+   if (!isInitialized) {
+     // Show loading screen while initializing
+     return <LoadingScreen />;
+   }
+
+   // Your app components
+   return <YourApp />;
+ }
+
+ ```
+
+ ### 2. Configuration Options
+
+ | Property | Type | Required | Description |
+ | --- | --- | --- | --- |
+ | `apiKey` | string | ✅ | Your Embed API key |
+ | `embedUrl` | string | ✅ | Your Embed server URL |
+
+ ## App Setup
+
+ ### 1. Wrap App with GestureHandlerRootView
+
+ **⚠️ IMPORTANT:** You must wrap your entire app with `GestureHandlerRootView` for the SDK to work properly:
+
+ ```jsx
  import React from 'react';
+ import { GestureHandlerRootView } from 'react-native-gesture-handler';
  import { useInitialize } from '@revrag-ai/embed-react-native';
 
  export default function App() {
- useInitialize({
- apiKey: 'your-api-key-here',
- embedUrl: 'https://your-voice-agent-server.com',
+   const { isInitialized, error } = useInitialize({
+     apiKey: 'your_api_key_here',
+     embedUrl: 'https://your-embed-server.com',
  });
 
- return <YourAppContent />;
+   return (
+     <GestureHandlerRootView style={{ flex: 1 }}>
+       {/* Your app components */}
+     </GestureHandlerRootView>
+   );
  }
+
  ```
 
- ### 2. Add the Voice Button
+ ### 2. Add EmbedButton Component
 
- Add the voice interface component to your screen:
+ Add the floating voice agent button to your screen:
 
- ```tsx
- import React from 'react';
- import { View } from 'react-native';
+ ```jsx
  import { EmbedButton } from '@revrag-ai/embed-react-native';
 
- export default function HomeScreen() {
+ function MyScreen() {
    return (
      <View style={{ flex: 1 }}>
- {/* Your app content */}
-
- {/* Voice agent button - floats over content */}
+       {/* Your screen content */}
        <EmbedButton />
      </View>
    );
  }
+
  ```
 
309
+ ## Event System
91
310
 
92
- Listen to voice agent events for custom handling:
311
+ The SDK provides a powerful event system for sending user context and application state to the AI agent.
93
312
 
94
- ```tsx
95
- import React, { useEffect } from 'react';
96
- import { Embed, EmbedEventKeys } from '@revrag-ai/embed-react-native';
313
+ ### Available Events
97
314
 
98
- export default function VoiceEnabledScreen() {
99
- useEffect(() => {
100
- // Listen for user data events
101
- Embed.on(EmbedEventKeys.USER_DATA, (data) => {
102
- console.log('User data received:', data);
103
- });
315
+ The SDK exports the following event types:
104
316
 
105
- // Listen for screen state events
106
- Embed.on(EmbedEventKeys.SCREEN_STATE, (state) => {
107
- console.log('Screen state changed:', state);
108
- });
109
- }, []);
317
+ ```jsx
318
+ import { EventKeys } from '@revrag-ai/embed-react-native';
319
+
320
+ // Available event keys:
321
+ EventKeys.USER_DATA // 'user_data' - User identity and profile
322
+ EventKeys.SCREEN_STATE // 'state_data' - Application state and context
110
323
 
111
- return (
112
- // Your component JSX
113
- );
114
- }
115
324
  ```
116
325
 
117
- ## 📖 API Reference
326
+ ### Event Usage Rules
327
+
328
+ **🚨 CRITICAL REQUIREMENT:**
118
329
 
119
- ### `useInitialize(props: UseInitializeProps)`
330
+ 1. **USER_DATA event MUST be sent first** before any other events
331
+ 2. **USER_DATA must include `app_user_id`** for user identification
332
+ 3. **EmbedButton should only be rendered AFTER** USER_DATA event is sent
333
+ 4. **SCREEN_STATE events** can only be sent after USER_DATA is established
120
334
 
121
- Initializes the voice agent SDK with your configuration.
335
+ ### Event Methods
122
336
 
123
- #### Parameters
337
+ ```jsx
338
+ import { embed, EventKeys } from '@revrag-ai/embed-react-native';
124
339
 
125
- | Parameter | Type | Required | Description |
126
- |-----------|------|----------|-------------|
127
- | `apiKey` | `string` | ✅ | Your unique API key from RevRag AI |
128
- | `embedUrl` | `string` | ✅ | The voice agent server URL |
340
+ // Send events
341
+ await embed.Event(eventKey, data);
129
342
 
130
- #### Example
343
+ // Listen to events (optional)
344
+ embed.on(eventKey, callback);
131
345
 
132
- ```tsx
133
- useInitialize({
134
- apiKey: 'key_abc123',
135
- embedUrl: 'https://api.revrag.ai',
136
- });
137
346
  ```
138
347
 
139
- ### `<EmbedButton />`
348
+ ### How Events are Triggered
140
349
 
141
- A floating action button that provides the voice interface.
350
+ Events are triggered using the `embed.Event()` method and automatically:
142
351
 
143
- #### Features
352
+ 1. **Validate the event key** against allowed EventKeys
353
+ 2. **Store user identity** from USER_DATA events
354
+ 3. **Auto-append app_user_id** to subsequent events
355
+ 4. **Send data to your server** via the configured API
356
+ 5. **Trigger local event listeners**
144
357
 
145
- - **Draggable**: Users can drag the button around the screen
146
- - **Expandable**: Expands to show voice controls when active
147
- - **Auto-opening**: Automatically prompts users after 15 seconds
148
- - **Visual Feedback**: Shows connection status and audio visualization
149
- - **Customizable**: Supports custom styling
358
+ ```jsx
359
+ // Example event flow
360
+ try {
361
+ // Step 1: Send user data first (required)
362
+ await embed.Event(EventKeys.USER_DATA, {
363
+ app_user_id: 'user123',
364
+ name: 'John Doe',
365
+ email: 'john@example.com',
366
+ });
150
367
 
151
- #### Example
368
+ // Step 2: Send context data (app_user_id auto-added)
369
+ await embed.Event(EventKeys.SCREEN_STATE, {
370
+ screen: 'profile',
371
+ data: { plan: 'premium' },
372
+ });
373
+ } catch (error) {
374
+ console.error('Event error:', error);
375
+ }
152
376
 
153
- ```tsx
154
- import { EmbedButton } from '@revrag-ai/embed-react-native';
377
+ ```
378
+
379
+ ## Usage Examples
380
+
381
+ ### Complete Integration Example
382
+
383
+ ```jsx
384
+ import React, { useEffect, useState } from 'react';
385
+ import { View, StyleSheet, Alert } from 'react-native';
386
+ import { GestureHandlerRootView } from 'react-native-gesture-handler';
387
+ import {
388
+ useInitialize,
389
+ EmbedButton,
390
+ embed,
391
+ EventKeys
392
+ } from '@revrag-ai/embed-react-native';
393
+
394
+ export default function App() {
395
+ const [userDataSent, setUserDataSent] = useState(false);
396
+
397
+ const { isInitialized, error } = useInitialize({
398
+ apiKey: 'your_api_key_here',
399
+ embedUrl: '<https://your-embed-server.com>',
400
+ });
401
+
402
+ // Initialize user data when SDK is ready
403
+ useEffect(() => {
404
+ if (isInitialized && !userDataSent) {
405
+ initializeUserData();
406
+ }
407
+ }, [isInitialized, userDataSent]);
408
+
409
+ const initializeUserData = async () => {
410
+ try {
411
+ // STEP 1: Send user data first (REQUIRED)
412
+ await embed.Event(EventKeys.USER_DATA, {
413
+ app_user_id: 'user123', // Required field
414
+ name: 'John Doe',
415
+ email: 'john@example.com',
416
+ subscription: 'premium',
417
+ joinedDate: '2024-01-15',
418
+ });
419
+
420
+ setUserDataSent(true);
421
+
422
+ // STEP 2: Send initial screen state
423
+ await embed.Event(EventKeys.SCREEN_STATE, {
424
+ screen: 'home',
425
+ timestamp: new Date().toISOString(),
426
+ userActions: [],
427
+ });
428
+
429
+ console.log('User data and initial state sent successfully');
430
+ } catch (error) {
431
+ console.error('Failed to initialize user data:', error);
432
+ Alert.alert('Error', 'Failed to initialize voice agent');
433
+ }
434
+ };
435
+
436
+ // Send screen state updates
437
+ const updateScreenState = async (screenName, data = {}) => {
438
+ if (!userDataSent) {
439
+ console.warn('Cannot send screen state before user data');
440
+ return;
441
+ }
155
442
 
156
- // Basic usage
157
- <EmbedButton />
158
-
159
- // The button automatically handles:
160
- // - Voice agent connection
161
- // - Microphone permissions
162
- // - Audio recording and playback
163
- // - Call management
164
- ```
165
-
166
- ### `useVoiceAgent()`
167
-
168
- Hook for advanced voice agent control and state management.
169
-
170
- #### Returns
171
-
172
- | Property | Type | Description |
173
- |----------|------|-------------|
174
- | `initializeVoiceAgent` | `() => Promise<void>` | Manually initialize voice connection |
175
- | `isLoading` | `boolean` | Connection loading state |
176
- | `error` | `string \| null` | Current error message |
177
- | `tokenDetails` | `TokenDetails` | Authentication token information |
178
- | `endCall` | `() => Promise<void>` | End the current voice session |
179
- | `room` | `Room` | LiveKit room instance |
180
- | `roomRef` | `RefObject<Room>` | Room reference for advanced usage |
181
- | `isMicMuted` | `boolean` | Microphone mute state |
182
- | `muteMic` | `() => void` | Mute the microphone |
183
- | `unmuteMic` | `() => void` | Unmute the microphone |
184
- | `connectionState` | `ConnectionState` | Current connection status |
185
- | `cleanup` | `() => void` | Clean up resources |
186
-
187
- #### Example
188
-
189
- ```tsx
190
- import { useVoiceAgent } from '@revrag-ai/embed-react-native';
191
-
192
- function CustomVoiceInterface() {
193
- const {
194
- initializeVoiceAgent,
195
- endCall,
196
- isMicMuted,
197
- muteMic,
198
- unmuteMic,
199
- connectionState,
200
- isLoading
201
- } = useVoiceAgent();
202
-
203
- const handleStartCall = async () => {
204
443
  try {
205
- await initializeVoiceAgent();
444
+ await embed.Event(EventKeys.SCREEN_STATE, {
445
+ screen: screenName,
446
+ timestamp: new Date().toISOString(),
447
+ ...data,
448
+ });
206
449
  } catch (error) {
207
- console.error('Failed to start call:', error);
450
+ console.error('Failed to update screen state:', error);
208
451
  }
209
452
  };
210
453
 
454
+ // Handle initialization errors
455
+ if (error) {
456
+ console.error('SDK initialization failed:', error);
457
+ return (
458
+ <View style={styles.errorContainer}>
459
+ <Text>Failed to initialize voice agent</Text>
460
+ </View>
461
+ );
462
+ }
463
+
464
+ // Show loading while initializing
465
+ if (!isInitialized) {
466
+ return (
467
+ <View style={styles.loadingContainer}>
468
+ <Text>Initializing voice agent...</Text>
469
+ </View>
470
+ );
471
+ }
472
+
211
473
  return (
212
- <View>
213
- <Button
214
- title={connectionState === 'connected' ? 'End Call' : 'Start Call'}
215
- onPress={connectionState === 'connected' ? endCall : handleStartCall}
216
- disabled={isLoading}
217
- />
218
-
219
- {connectionState === 'connected' && (
220
- <Button
221
- title={isMicMuted ? 'Unmute' : 'Mute'}
222
- onPress={isMicMuted ? unmuteMic : muteMic}
223
- />
224
- )}
225
-
226
- <Text>Status: {connectionState}</Text>
227
- </View>
474
+ <GestureHandlerRootView style={styles.container}>
475
+ <View style={styles.content}>
476
+ {/* Your app content */}
477
+ <YourAppComponents onScreenChange={updateScreenState} />
478
+ </View>
479
+
480
+ {/* Only render EmbedButton after user data is sent */}
481
+ {userDataSent && <EmbedButton />}
482
+ </GestureHandlerRootView>
228
483
  );
229
484
  }
485
+
486
+ const styles = StyleSheet.create({
487
+ container: {
488
+ flex: 1,
489
+ },
490
+ content: {
491
+ flex: 1,
492
+ },
493
+ loadingContainer: {
494
+ flex: 1,
495
+ justifyContent: 'center',
496
+ alignItems: 'center',
497
+ },
498
+ errorContainer: {
499
+ flex: 1,
500
+ justifyContent: 'center',
501
+ alignItems: 'center',
502
+ },
503
+ });
504
+
230
505
  ```
231
506
 
232
- ### Event System
+ ### Event Listening Example
 
- The library provides an event system for handling voice agent interactions.
+ ```jsx
+ import { embed, EventKeys } from '@revrag-ai/embed-react-native';
 
- #### `Embed.on(eventKey: EmbedEventKeys, callback: Function)`
+ // Listen for events (optional)
+ useEffect(() => {
+   // Listen for user data events
+   embed.on(EventKeys.USER_DATA, (data) => {
+     console.log('User data updated:', data);
+   });
 
- Listen to voice agent events.
+   // Listen for screen state events
+   embed.on(EventKeys.SCREEN_STATE, (data) => {
+     console.log('Screen state changed:', data);
+   });
 
- #### `Embed.Event(eventKey: string, data: any)`
+   // Cleanup listeners
+   return () => {
+     // Note: the current version doesn't provide an unsubscribe method,
+     // so there is nothing to clean up here yet
+   };
+ }, []);
+
+ ```
 
- Send custom events to the voice agent.
+ ### Advanced Usage with Navigation
 
- #### Event Types
+ ```jsx
+ import { useEffect } from 'react';
+ import { useNavigation } from '@react-navigation/native';
+ import { embed, EventKeys } from '@revrag-ai/embed-react-native';
 
- ```tsx
- enum EmbedEventKeys {
- USER_DATA = 'user_data',
- SCREEN_STATE = 'state_data'
+ function NavigationListener() {
+   const navigation = useNavigation();
+
+   useEffect(() => {
+     const unsubscribe = navigation.addListener('state', (e) => {
+       const routeName = e.data.state.routes[e.data.state.index].name;
+
+       // Send screen state when navigation changes
+       embed.Event(EventKeys.SCREEN_STATE, {
+         screen: routeName,
+         timestamp: new Date().toISOString(),
+         navigationStack: e.data.state.routes.map(route => route.name),
+       }).catch(console.error);
+     });
+
+     return unsubscribe;
+   }, [navigation]);
+
+   return null;
  }
+
  ```
 
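Navigation `state` events can fire in rapid bursts (for example during multi-step transitions). If that becomes noisy, screen-state updates can be coalesced with a small debounce. This is a sketch under the assumption that you inject the sender yourself; it is not an SDK feature:

```javascript
// Debounce sketch: collapse rapid screen changes so only the most recent
// one is sent. `sendFn` stands in for embed.Event; this helper is an
// illustration, not part of the SDK.
function debounceScreenState(sendFn, delayMs) {
  let timer = null;
  return function report(payload) {
    if (timer !== null) clearTimeout(timer);
    timer = setTimeout(() => {
      timer = null;
      sendFn('state_data', payload); // 'state_data' is the SCREEN_STATE key
    }, delayMs);
  };
}
```

With this, the navigation listener would call `report({ screen: routeName })` instead of sending immediately, and only the final screen of a burst reaches the agent.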
- #### Example
+ ## FAQ & Troubleshooting
 
- ```tsx
- import { Embed, EmbedEventKeys } from '@revrag-ai/embed-react-native';
+ ### 🔴 Critical Issues
 
- // Listen for events
- Embed.on(EmbedEventKeys.USER_DATA, (userData) => {
- console.log('Received user data:', userData);
- // Handle user data updates
- });
+ ### Q: "react-native-reanimated not working" or animation issues
+
+ **A:** This is the most common issue. Ensure:
+
+ 1. ✅ The React Native Reanimated plugin is **the last plugin** in `babel.config.js`
+ 2. ✅ Clear the cache after babel config changes: `npx react-native start --reset-cache`
+ 3. ✅ Restart the Metro bundler completely
+ 4. ✅ For iOS: `cd ios && pod install`
+
+ ```jsx
+ // ✅ Correct babel.config.js
+ module.exports = {
+   presets: ['module:@react-native/babel-preset'],
+   plugins: [
+     'react-native-reanimated/plugin', // ← MUST be last
+   ],
+ };
 
- // Send events
- await Embed.Event('custom_event', {
- action: 'page_view',
- page: 'profile',
- timestamp: Date.now()
- });
  ```
 
- ## 🎨 Customization
+ ### Q: "User identity not found" error
 
- ### Styling the Voice Button
+ **A:** This error occurs when you try to send events before USER_DATA:
 
- The `EmbedButton` component comes with built-in styling but can be customized through the configuration metadata:
+ 1. ✅ Send the `USER_DATA` event first with `app_user_id`
+ 2. ✅ Wait for the event to complete before sending other events
+ 3. ✅ Only render `EmbedButton` after USER_DATA is sent
+
+ ```jsx
+ // ❌ Wrong order
+ await embed.Event(EventKeys.SCREEN_STATE, { screen: 'home' }); // Error!
+ await embed.Event(EventKeys.USER_DATA, { app_user_id: 'user123' });
+
+ // ✅ Correct order
+ await embed.Event(EventKeys.USER_DATA, { app_user_id: 'user123' });
+ await embed.Event(EventKeys.SCREEN_STATE, { screen: 'home' }); // Works!
 
- ```tsx
- useInitialize({
- apiKey: 'your-api-key',
- embedUrl: 'your-server-url',
- });
  ```
 
- ## 🔧 Advanced Usage
+ ### Q: Microphone permission denied
 
- ### Context Data Management
+ **A:** Ensure permissions are configured:
 
- Provide rich context to your voice agent for more intelligent conversations:
+ **Android:**
 
- ```tsx
- const contextData = {
- user: {
- id: 'U123456',
- name: 'John Doe',
- accountDetails: {
- balance: 1500.50,
- lastTransaction: '2024-01-15',
- accountType: 'premium'
- }
- },
- currentScreen: 'dashboard',
- availableActions: ['transfer', 'balance_inquiry', 'transaction_history']
- };
+ - ✅ `RECORD_AUDIO` permission in `AndroidManifest.xml`
+ - ✅ Request the permission at runtime on Android 6+
+
+ **iOS:**
+
+ - ✅ `NSMicrophoneUsageDescription` in `Info.plist`
+ - ✅ Provide a user-friendly description
+
+ ### Q: iOS App Crashes - "attempted to access privacy-sensitive data without a usage description"
+
+ **A:** This crash occurs when the app tries to access the microphone without a usage description:
+
+ **Quick Fix:**
+
+ 1. ✅ Open `ios/YourAppName/Info.plist`
+ 2. ✅ Add the microphone usage description:
+
+ ```xml
+ <key>NSMicrophoneUsageDescription</key>
+ <string>This app needs access to the microphone for voice communication with the AI agent</string>
 
- useInitialize({
- apiKey: 'your-api-key',
- embedUrl: 'your-server-url',
- });
  ```
 
- ### Error Handling
+ 3. Clean and rebuild: `cd ios && pod install && cd .. && npx react-native run-ios`
 
- ```tsx
- import { useVoiceAgent } from '@revrag-ai/embed-react-native';
+ **Why this happens:** iOS requires apps to declare why they need access to privacy-sensitive data such as the microphone, camera, and location.
 
- function VoiceComponent() {
- const { error, initializeVoiceAgent } = useVoiceAgent();
+ ### 🟡 Common Issues
 
- useEffect(() => {
- if (error) {
- console.error('Voice agent error:', error);
- // Handle error - show user message, retry logic, etc.
- }
- }, [error]);
+ ### Q: EmbedButton not appearing
+
+ **A:** Check these requirements:
+
+ 1. ✅ App wrapped with `GestureHandlerRootView`
+ 2. ✅ SDK initialized successfully (`isInitialized` is true)
+ 3. ✅ USER_DATA event sent first
+ 4. ✅ EmbedButton rendered after USER_DATA
+
+ ### Q: Network/API connection issues
+
+ **A:** Verify configuration:
+
+ 1. ✅ Valid `apiKey` and `embedUrl`
+ 2. ✅ Network connectivity
+ 3. ✅ Server is accessible from the device
+ 4. ✅ `usesCleartextTraffic="true"` for HTTP endpoints (Android)
+
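Connection problems are often transient (flaky mobile networks, server restarts), so event sends are natural candidates for a retry. The helper below is a sketch of exponential backoff and is ours, not part of the SDK; `sendFn` stands in for `embed.Event`:

```javascript
// Retry sketch for transient event failures. `sendFn` stands in for
// embed.Event; the backoff helper is an illustration, not an SDK API.
async function sendWithRetry(sendFn, eventKey, data, retries = 3, baseDelayMs = 200) {
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      return await sendFn(eventKey, data);
    } catch (err) {
      if (attempt === retries) throw err; // out of retries, surface the error
      const delay = baseDelayMs * 2 ** attempt; // 200ms, 400ms, 800ms, ...
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
}
```

Reserve this for genuinely transient failures; a misconfigured `embedUrl` or invalid `apiKey` will fail on every attempt and should be fixed, not retried.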
+ ### Q: iOS Network Request Failures - "The resource could not be loaded"
660
+
661
+ **A:** This is caused by iOS App Transport Security (ATS). Solutions:
662
+
663
+ **For HTTP APIs (Development/Testing):**
664
+ Add domain exceptions to `ios/YourApp/Info.plist`:
665
+
666
+ ```xml
667
+ <key>NSAppTransportSecurity</key>
668
+ <dict>
669
+ <key>NSAllowsArbitraryLoads</key>
670
+ <false/>
671
+ <key>NSAllowsLocalNetworking</key>
672
+ <true/>
673
+ <key>NSExceptionDomains</key>
674
+ <dict>
675
+ <!-- Replace with your API domain -->
676
+ <key>your-api-domain.com</key>
677
+ <dict>
678
+ <key>NSExceptionAllowsInsecureHTTPLoads</key>
679
+ <true/>
680
+ <key>NSExceptionMinimumTLSVersion</key>
681
+ <string>TLSv1.0</string>
682
+ <key>NSExceptionRequiresForwardSecrecy</key>
683
+ <false/>
684
+ </dict>
685
+ <!-- For localhost development -->
686
+ <key>localhost</key>
687
+ <dict>
688
+ <key>NSExceptionAllowsInsecureHTTPLoads</key>
689
+ <true/>
690
+ </dict>
691
+ </dict>
692
+ </dict>

- // Rest of component...
- }
 ```
 
- ## 🛠 Troubleshooting
+ **For Production (Recommended):**
 
- ### Common Issues
+ 1. ✅ Use HTTPS endpoints instead of HTTP
+ 2. ✅ Get proper SSL certificates
+ 3. ✅ Update `embedUrl` to use `https://`
 
- 1. **Microphone Permission Denied**
- - Ensure you've added the microphone permission to your platform-specific config files
- - Check that users have granted microphone permission
+ **⚠️ Never use `NSAllowsArbitraryLoads: true` in production apps**
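+ A way to catch an insecure `embedUrl` before it ships is a small runtime guard. Illustrative only — `assertSecureEmbedUrl` is not an SDK API:

```javascript
// Hypothetical startup check: allow http:// only in development builds,
// fail fast if a production build is configured with a cleartext URL.
function assertSecureEmbedUrl(embedUrl, isDev) {
  const isHttps = new URL(embedUrl).protocol === 'https:';
  if (!isHttps && !isDev) {
    throw new Error(`embedUrl must use https:// in production, got: ${embedUrl}`);
  }
  return isHttps;
}
```

+ In React Native, `isDev` could come from the global `__DEV__` flag.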
 
- 2. **Connection Issues**
- - Verify your API key is correct
- - Ensure the `embedUrl` is accessible from your app
- - Check network connectivity
+ ### Q: Network debugging on iOS
 
- 3. **Audio Not Working**
- - Test on a physical device (audio may not work in simulators)
- - Verify audio output settings
- - Check that other apps can use the microphone
+ **A:** Enable network debugging:
 
- ### Debug Mode
+ 1. **Add logging to API calls:**
 
- Enable detailed logging for debugging:
+ ```jsx
+ // Add this to your API initialization
+ console.log('API URL:', embedUrl);
+ console.log('Making request to:', `${embedUrl}/embedded-agent/initialize`);
 
- ```tsx
- // Add this before initializing
- console.log('Voice Agent Debug Mode Enabled');
+ ```

- useInitialize({
- apiKey: 'your-api-key',
- embedUrl: 'your-server-url',
+ 2. **Check iOS Console logs:**
+    - Open Xcode → Window → Devices and Simulators
+    - Select your device → Open Console
+    - Look for network-related errors
+ 3. **Test network connectivity:**
+
+ ```bash
+ # Test if your API is reachable
+ curl -I http://your-api-domain.com/embedded-agent/initialize
+
+ ```
+
+ ### 🟢 Best Practices
+
+ ### Q: How to handle SDK initialization in different app states?
+
+ **A:** Best practices for initialization:
+
+ ```jsx
+ const [initState, setInitState] = useState('loading');
+
+ const { isInitialized, error } = useInitialize({
+   apiKey: process.env.ONWID_API_KEY,
+   embedUrl: process.env.ONWID_URL,
 });
+
+ useEffect(() => {
+   if (error) {
+     setInitState('error');
+   } else if (isInitialized) {
+     setInitState('ready');
+   }
+ }, [isInitialized, error]);
+
+ // Render based on state
+ switch (initState) {
+   case 'loading': return <LoadingScreen />;
+   case 'error': return <ErrorScreen error={error} />;
+   case 'ready': return <AppWithOnwid />;
+ }
+
  ```
 
- ## 📱 Platform Support
+ ### Q: How to optimize event sending?
+
+ **A:** Event optimization strategies:
+
+ ```jsx
+ // ✅ Debounce frequent events (e.g. `debounce` from lodash)
+ const debouncedStateUpdate = useCallback(
+   debounce((data) => {
+     embed.Event(EventKeys.SCREEN_STATE, data);
+   }, 500),
+   []
+ );
+
+ // ✅ Batch related data
+ const sendUserProfile = async (userData) => {
+   await embed.Event(EventKeys.USER_DATA, {
+     app_user_id: userData.id,
+     ...userData.profile,
+     ...userData.preferences,
+     lastLogin: new Date().toISOString(),
+   });
+ };
 
- - **iOS**: 12.0+
- - **Android**: API level 21+ (Android 5.0)
- - **React Native**: 0.70+
+ ```
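+ If you prefer not to add a dependency, a debounce like the one used above can be sketched in a few lines. This is a minimal stand-in, not lodash's implementation; the `flush()` helper is an extra for forcing a pending call (e.g. on unmount):

```javascript
// Minimal trailing-edge debounce: only the last call within `wait` ms fires.
function debounce(fn, wait) {
  let timer = null;
  let lastArgs = null;
  const debounced = (...args) => {
    lastArgs = args;                 // remember the most recent arguments
    if (timer) clearTimeout(timer);  // reset the timer on every call
    timer = setTimeout(() => {
      timer = null;
      fn(...lastArgs);
    }, wait);
  };
  // Force a pending invocation immediately (no-op if nothing is pending).
  debounced.flush = () => {
    if (timer) {
      clearTimeout(timer);
      timer = null;
      fn(...lastArgs);
    }
  };
  return debounced;
}
```

+ Trailing-edge behavior matches the SCREEN_STATE use case: rapid state changes collapse into one event carrying the latest data.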
 
- ## 🤝 Contributing
+ ### Q: How to handle offline scenarios?
 
- See the [contributing guide](CONTRIBUTING.md) to learn how to contribute to the repository and the development workflow.
+ **A:** Offline handling approach:
 
- ## 📄 License
+ ```jsx
+ import NetInfo from '@react-native-community/netinfo';
 
- MIT
+ const [isOnline, setIsOnline] = useState(true);
 
- ---
+ useEffect(() => {
+   const unsubscribe = NetInfo.addEventListener(state => {
+     setIsOnline(state.isConnected);
+   });
+   return () => unsubscribe();
+ }, []);
+
+ // Queue events when offline
+ const sendEventWithRetry = async (eventKey, data) => {
+   if (!isOnline) {
+     // Store in AsyncStorage for later retry
+     await storeEventForRetry(eventKey, data);
+     return;
+   }
+
+   try {
+     await embed.Event(eventKey, data);
+   } catch (error) {
+     // Retry logic or store for later
+     await storeEventForRetry(eventKey, data);
+   }
+ };
+
+ ```
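+ `storeEventForRetry` in the snippet above is left to the app. A minimal sketch of the queue it implies follows — in-memory here so it stays self-contained; a real app would persist the queue to AsyncStorage, and `flushStoredEvents` would be called with `embed.Event` when connectivity returns:

```javascript
// Hypothetical retry queue for events that could not be sent.
const pendingEvents = [];

// Enqueue an event for later delivery (a real app would also write
// the queue to AsyncStorage so it survives restarts).
async function storeEventForRetry(eventKey, data) {
  pendingEvents.push({ eventKey, data, queuedAt: Date.now() });
}

// Drain the queue in FIFO order; `send` stands in for embed.Event.
async function flushStoredEvents(send) {
  while (pendingEvents.length > 0) {
    const { eventKey, data } = pendingEvents.shift();
    await send(eventKey, data);
  }
}
```

+ Pairing this with the NetInfo listener above, `flushStoredEvents(embed.Event)` would run whenever `isConnected` flips back to true.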
 
- **Made with ❤️ by [RevRag AI](https://www.revrag.ai)**
+ ### 📞 Support
+
+ For additional help:
+
+ - **Documentation:** [https://docs.revrag.ai](https://docs.revrag.ai/)
+ - **Email Support:** [contact@revrag.ai](mailto:contact@revrag.ai)
+ - **GitHub Issues:** https://github.com/RevRag-ai/embed-react-native/issues
+
+ ### 🔄 Migration Guide
+
+ If upgrading from a previous version, check the [CHANGELOG.md](CHANGELOG.md) for breaking changes and migration steps.
+
+ ---
 
- For support, visit our [documentation](https://docs.revrag.ai) or contact us at [contact@revrag.ai](mailto:contact@revrag.ai).
+ **Last Updated:** January 2024
+ **SDK Version:** Latest
+ **React Native Compatibility:** 0.70+