react-native-executorch 0.5.9 → 0.5.11

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -9,6 +9,15 @@
 
 ![Software Mansion banner](https://github.com/user-attachments/assets/fa2c4735-e75c-4cc1-970d-88905d95e3a4)
 
+ <p align="center">
+ <a href="https://github.com/software-mansion/react-native-executorch/blob/release/0.5/README.md">English</a>
+ <a href="https://github.com/software-mansion/react-native-executorch/blob/release/0.5/readmes/README_es.md">Español</a>
+ <a href="https://github.com/software-mansion/react-native-executorch/blob/release/0.5/readmes/README_fr.md">Français</a>
+ <a href="https://github.com/software-mansion/react-native-executorch/blob/release/0.5/readmes/README_cn.md">简体中文</a>
+ <a href="https://github.com/software-mansion/react-native-executorch/blob/release/0.5/readmes/README_pt.md">Português</a>
+ <a href="https://github.com/software-mansion/react-native-executorch/blob/release/0.5/readmes/README_in.md">हिंदी</a>
+ </p>
+
 **React Native ExecuTorch** provides a declarative way to run AI models on-device using React Native, powered by **ExecuTorch** :rocket:. It offers out-of-the-box support for a wide range of LLMs, computer vision models, and more. Visit our [HuggingFace](https://huggingface.co/software-mansion) page to explore these models.
 
 **ExecuTorch**, developed by Meta, is a novel framework allowing AI model execution on devices like mobile phones or microcontrollers.
@@ -21,10 +30,11 @@ React Native ExecuTorch bridges the gap between React Native and native platform
 **Table of contents:**
 
 - [:yin_yang: Supported versions](#yin_yang-supported-versions)
- - [:robot: Ready-made models](#robot-ready-made-models)
 - [:books: Documentation](#books-documentation)
+ - [:earth_africa: Real-World Example](#earth_africa-real-world-example)
 - [:llama: Quickstart - Running Llama](#llama-quickstart---running-llama)
- - [:calling: Examples](#calling-examples)
+ - [:calling: Demo apps](#calling-demo-apps)
+ - [:robot: Ready-made models](#robot-ready-made-models)
 - [:balance_scale: License](#balance_scale-license)
 - [:soon: What's next?](#soon-whats-next)
 
@@ -38,15 +48,17 @@ The minimal supported version are:
 > [!IMPORTANT]
 > React Native Executorch supports only the [New React Native architecture](https://reactnative.dev/architecture/landing-page).
 
- ## :robot: Ready-made models
-
- Our library has a number of ready-to-use AI models; a complete list is available in the documentation. If you're interested in running your own AI model, you need to first export it to the `.pte` format. Instructions on how to do this are available in the [Python API](https://pypi.org/project/executorch/).
-
 ## :books: Documentation
 
 Check out how our library can help you build your React Native AI features by visiting our docs:
 https://docs.swmansion.com/react-native-executorch
 
+ ## :earth_africa: Real-World Example
+
+ React Native ExecuTorch is powering [Private Mind](https://github.com/software-mansion-labs/private-mind), a privacy-first mobile AI app available on [App Store](https://apps.apple.com/gb/app/private-mind/id6746713439) and [Google Play](https://play.google.com/store/apps/details?id=com.swmansion.privatemind).
+
+ <img width="2720" height="1085" alt="Private Mind promo" src="https://github.com/user-attachments/assets/b12296fe-19ac-48fc-9726-da9242700346" />
+
 ## :llama: **Quickstart - Running Llama**
 
 **Get started with AI-powered text generation in 3 easy steps!**
@@ -93,12 +105,12 @@ const handleGenerate = async () => {
 };
 ```
 
- ## :calling: Examples
+ ## :calling: Demo apps
 
 We currently host a few example [apps](https://github.com/software-mansion/react-native-executorch/tree/main/apps) demonstrating use cases of our library:
 
 - `llm` - Chat application showcasing use of LLMs
- - `speech-to-text` - Whisper and Moonshine models ready for transcription tasks
+ - `speech-to-text` - Whisper model ready for transcription tasks
 - `computer-vision` - Computer vision related tasks
 - `text-embeddings` - Computing text representations for semantic search
 
@@ -117,6 +129,10 @@ yarn expo run:< ios | android >
 > [!WARNING]
 > Running LLMs requires a significant amount of RAM. If you are encountering unexpected app crashes, try to increase the amount of RAM allocated to the emulator.
 
+ ## :robot: Ready-made models
+
+ Our library has a number of ready-to-use AI models; a complete list is available in the documentation. If you're interested in running your own AI model, you need to first export it to the `.pte` format. Instructions on how to do this are available in the [Python API](https://docs.pytorch.org/executorch/stable/using-executorch-export.html) and [optimum-executorch README](https://github.com/huggingface/optimum-executorch?tab=readme-ov-file#option-2-export-and-load-separately).
+
 ## :balance_scale: License
 
 This library is licensed under [The MIT License](./LICENSE).
@@ -43,7 +43,7 @@ export const useSpeechToText = ({
 setIsGenerating(false);
 }
 }, [isReady, isGenerating, modelInstance]);
- const stream = useCallback(async () => {
+ const stream = useCallback(async options => {
 if (!isReady) throw new Error(getError(ETError.ModuleNotLoaded));
 if (isGenerating) throw new Error(getError(ETError.ModelGenerating));
 setIsGenerating(true);
@@ -54,7 +54,7 @@ export const useSpeechToText = ({
 for await (const {
 committed,
 nonCommitted
- } of modelInstance.stream()) {
+ } of modelInstance.stream(options)) {
 setCommittedTranscription(prev => prev + committed);
 setNonCommittedTranscription(nonCommitted);
 transcription += committed;
@@ -1 +1 @@
- {"version":3,"names":["useEffect","useCallback","useState","ETError","getError","SpeechToTextModule","useSpeechToText","model","preventLoad","error","setError","isReady","setIsReady","isGenerating","setIsGenerating","downloadProgress","setDownloadProgress","modelInstance","committedTranscription","setCommittedTranscription","nonCommittedTranscription","setNonCommittedTranscription","load","isMultilingual","encoderSource","decoderSource","tokenizerSource","err","message","stateWrapper","fn","args","Error","ModuleNotLoaded","ModelGenerating","apply","stream","transcription","committed","nonCommitted","prev","wrapper","encode","prototype","decode","transcribe","streamStop","streamInsert"],"sourceRoot":"../../../../src","sources":["hooks/natural_language_processing/useSpeechToText.ts"],"mappings":";;AAAA,SAASA,SAAS,EAAEC,WAAW,EAAEC,QAAQ,QAAQ,OAAO;AACxD,SAASC,OAAO,EAAEC,QAAQ,QAAQ,aAAa;AAC/C,SAASC,kBAAkB,QAAQ,8DAA8D;AAGjG,OAAO,MAAMC,eAAe,GAAGA,CAAC;EAC9BC,KAAK;EACLC,WAAW,GAAG;AAIhB,CAAC,KAAK;EACJ,MAAM,CAACC,KAAK,EAAEC,QAAQ,CAAC,GAAGR,QAAQ,CAAgB,IAAI,CAAC;EACvD,MAAM,CAACS,OAAO,EAAEC,UAAU,CAAC,GAAGV,QAAQ,CAAC,KAAK,CAAC;EAC7C,MAAM,CAACW,YAAY,EAAEC,eAAe,CAAC,GAAGZ,QAAQ,CAAC,KAAK,CAAC;EACvD,MAAM,CAACa,gBAAgB,EAAEC,mBAAmB,CAAC,GAAGd,QAAQ,CAAC,CAAC,CAAC;EAE3D,MAAM,CAACe,aAAa,CAAC,GAAGf,QAAQ,CAAC,MAAM,IAAIG,kBAAkB,CAAC,CAAC,CAAC;EAChE,MAAM,CAACa,sBAAsB,EAAEC,yBAAyB,CAAC,GAAGjB,QAAQ,CAAC,EAAE,CAAC;EACxE,MAAM,CAACkB,yBAAyB,EAAEC,4BAA4B,CAAC,GAC7DnB,QAAQ,CAAC,EAAE,CAAC;EAEdF,SAAS,CAAC,MAAM;IACd,IAAIQ,WAAW,EAAE;IACjB,CAAC,YAAY;MACXQ,mBAAmB,CAAC,CAAC,CAAC;MACtBN,QAAQ,CAAC,IAAI,CAAC;MACd,IAAI;QACFE,UAAU,CAAC,KAAK,CAAC;QACjB,MAAMK,aAAa,CAACK,IAAI,CACtB;UACEC,cAAc,EAAEhB,KAAK,CAACgB,cAAc;UACpCC,aAAa,EAAEjB,KAAK,CAACiB,aAAa;UAClCC,aAAa,EAAElB,KAAK,CAACkB,aAAa;UAClCC,eAAe,EAAEnB,KAAK,CAACmB;QACzB,CAAC,EACDV,mBACF,CAAC;QACDJ,UAAU,CAAC,IAAI,CAAC;MAClB,CAAC,CAAC,OAAOe,GAAG,EAAE;QACZjB,QAAQ,CAAEiB,GAAG,CAAWC,OAAO,CAAC;MAClC;IACF,CAAC,EAAE,CAAC;EACN,CAAC,EAAE,CACDX,aAAa,EACbV,KAAK,CAACgB,cAAc,EACpBhB,KAAK,CAACiB,aAAa,EACnBjB,KAAK,CAACkB,aAAa,EACnBlB,KAAK,CAACmB,eAAe,EACrBlB,WAAW,CACZ,CAAC;EAEF,MAAMqB,YAAY,GAAG5B,WAAW,CACe6B,EAAK,IAChD,OAAO,GAAGC,IAAmB,KAAsC;IACjE,IAAI,CAACpB,OAAO,EAAE,MAAM,IAAIqB,KAAK,CAAC5B,QAAQ,CAACD,OAAO,CAAC8B,eAAe,CAAC,CAAC;IAChE,IAAIpB,YAAY,EAAE,MAAM,IAAImB,KAAK,CAAC5B,QAAQ,CAACD,OAAO,CAAC+B,eAAe,CAAC,CAAC;IACpEpB,eAAe,CAAC,IAAI,CAAC;IACrB,IAAI;MACF,OAAO,MAAMgB,EAAE,CAACK,KAAK,CAAClB,aAAa,EAAEc,IAAI,CAAC;IAC5C,CAAC,SAAS;MACRjB,eAAe,CAAC,KAAK,CAAC;IACxB;EACF,CAAC,EACH,CAACH,OAAO,EAAEE,YAAY,EAAEI,aAAa,CACvC,CAAC;EAED,MAAMmB,MAAM,GAAGnC,WAAW,CAAC,YAAY;IACrC,IAAI,CAACU,OAAO,EAAE,MAAM,IAAIqB,KAAK,CAAC5B,QAAQ,CAACD,OAAO,CAAC8B,eAAe,CAAC,CAAC;IAChE,IAAIpB,YAAY,EAAE,MAAM,IAAImB,KAAK,CAAC5B,QAAQ,CAACD,OAAO,CAAC+B,eAAe,CAAC,CAAC;IACpEpB,eAAe,CAAC,IAAI,CAAC;IACrBK,yBAAyB,CAAC,EAAE,CAAC;IAC7BE,4BAA4B,CAAC,EAAE,CAAC;IAChC,IAAIgB,aAAa,GAAG,EAAE;IACtB,IAAI;MACF,WAAW,MAAM;QAAEC,SAAS;QAAEC;MAAa,CAAC,IAAItB,aAAa,CAACmB,MAAM,CAAC,CAAC,EAAE;QACtEjB,yBAAyB,CAAEqB,IAAI,IAAKA,IAAI,GAAGF,SAAS,CAAC;QACrDjB,4BAA4B,CAACkB,YAAY,CAAC;QAC1CF,aAAa,IAAIC,SAAS;MAC5B;IACF,CAAC,SAAS;MACRxB,eAAe,CAAC,KAAK,CAAC;IACxB;IACA,OAAOuB,aAAa;EACtB,CAAC,EAAE,CAAC1B,OAAO,EAAEE,YAAY,EAAEI,aAAa,CAAC,CAAC;EAE1C,MAAMwB,OAAO,GAAGxC,WAAW,CACW6B,EAAK,IAAK;IAC5C,OAAO,CAAC,GAAGC,IAAmB,KAAoB;MAChD,IAAI,CAACpB,OAAO,EAAE,MAAM,IAAIqB,KAAK,CAAC5B,QAAQ,CAACD,OAAO,CAAC8B,eAAe,CAAC,CAAC;MAChE,OAAOH,EAAE,CAACK,KAAK,CAAClB,aAAa,EAAEc,IAAI,CAAC;IACtC,CAAC;EACH,CAAC,EACD,CAACpB,OAAO,EAAEM,aAAa,CACzB,CAAC;EAED,OAAO;IACLR,KAAK;IACLE,OAAO;IACPE,YAAY;IACZE,gBAAgB;I
AChBG,sBAAsB;IACtBE,yBAAyB;IACzBsB,MAAM,EAAEb,YAAY,CAACxB,kBAAkB,CAACsC,SAAS,CAACD,MAAM,CAAC;IACzDE,MAAM,EAAEf,YAAY,CAACxB,kBAAkB,CAACsC,SAAS,CAACC,MAAM,CAAC;IACzDC,UAAU,EAAEhB,YAAY,CAACxB,kBAAkB,CAACsC,SAAS,CAACE,UAAU,CAAC;IACjET,MAAM;IACNU,UAAU,EAAEL,OAAO,CAACpC,kBAAkB,CAACsC,SAAS,CAACG,UAAU,CAAC;IAC5DC,YAAY,EAAEN,OAAO,CAACpC,kBAAkB,CAACsC,SAAS,CAACI,YAAY;EACjE,CAAC;AACH,CAAC","ignoreList":[]}
+ {"version":3,"names":["useEffect","useCallback","useState","ETError","getError","SpeechToTextModule","useSpeechToText","model","preventLoad","error","setError","isReady","setIsReady","isGenerating","setIsGenerating","downloadProgress","setDownloadProgress","modelInstance","committedTranscription","setCommittedTranscription","nonCommittedTranscription","setNonCommittedTranscription","load","isMultilingual","encoderSource","decoderSource","tokenizerSource","err","message","stateWrapper","fn","args","Error","ModuleNotLoaded","ModelGenerating","apply","stream","options","transcription","committed","nonCommitted","prev","wrapper","encode","prototype","decode","transcribe","streamStop","streamInsert"],"sourceRoot":"../../../../src","sources":["hooks/natural_language_processing/useSpeechToText.ts"],"mappings":";;AAAA,SAASA,SAAS,EAAEC,WAAW,EAAEC,QAAQ,QAAQ,OAAO;AACxD,SAASC,OAAO,EAAEC,QAAQ,QAAQ,aAAa;AAC/C,SAASC,kBAAkB,QAAQ,8DAA8D;AAGjG,OAAO,MAAMC,eAAe,GAAGA,CAAC;EAC9BC,KAAK;EACLC,WAAW,GAAG;AAIhB,CAAC,KAAK;EACJ,MAAM,CAACC,KAAK,EAAEC,QAAQ,CAAC,GAAGR,QAAQ,CAAgB,IAAI,CAAC;EACvD,MAAM,CAACS,OAAO,EAAEC,UAAU,CAAC,GAAGV,QAAQ,CAAC,KAAK,CAAC;EAC7C,MAAM,CAACW,YAAY,EAAEC,eAAe,CAAC,GAAGZ,QAAQ,CAAC,KAAK,CAAC;EACvD,MAAM,CAACa,gBAAgB,EAAEC,mBAAmB,CAAC,GAAGd,QAAQ,CAAC,CAAC,CAAC;EAE3D,MAAM,CAACe,aAAa,CAAC,GAAGf,QAAQ,CAAC,MAAM,IAAIG,kBAAkB,CAAC,CAAC,CAAC;EAChE,MAAM,CAACa,sBAAsB,EAAEC,yBAAyB,CAAC,GAAGjB,QAAQ,CAAC,EAAE,CAAC;EACxE,MAAM,CAACkB,yBAAyB,EAAEC,4BAA4B,CAAC,GAC7DnB,QAAQ,CAAC,EAAE,CAAC;EAEdF,SAAS,CAAC,MAAM;IACd,IAAIQ,WAAW,EAAE;IACjB,CAAC,YAAY;MACXQ,mBAAmB,CAAC,CAAC,CAAC;MACtBN,QAAQ,CAAC,IAAI,CAAC;MACd,IAAI;QACFE,UAAU,CAAC,KAAK,CAAC;QACjB,MAAMK,aAAa,CAACK,IAAI,CACtB;UACEC,cAAc,EAAEhB,KAAK,CAACgB,cAAc;UACpCC,aAAa,EAAEjB,KAAK,CAACiB,aAAa;UAClCC,aAAa,EAAElB,KAAK,CAACkB,aAAa;UAClCC,eAAe,EAAEnB,KAAK,CAACmB;QACzB,CAAC,EACDV,mBACF,CAAC;QACDJ,UAAU,CAAC,IAAI,CAAC;MAClB,CAAC,CAAC,OAAOe,GAAG,EAAE;QACZjB,QAAQ,CAAEiB,GAAG,CAAWC,OAAO,CAAC;MAClC;IACF,CAAC,EAAE,CAAC;EACN,CAAC,EAAE,CACDX,aAAa,EACbV,KAAK,CAACgB,cAAc,EACpBhB,KAAK,CAACiB,aAAa,EACnBjB,KAAK,CAACkB,aAAa,EACnBlB,KAAK,CAACmB,eAAe,EACrBlB,WAAW,CACZ,CAAC;EAEF,MAAMqB,YAAY,GAAG5B,WAAW,CACe6B,EAAK,IAChD,OAAO,GAAGC,IAAmB,KAAsC;IACjE,IAAI,CAACpB,OAAO,EAAE,MAAM,IAAIqB,KAAK,CAAC5B,QAAQ,CAACD,OAAO,CAAC8B,eAAe,CAAC,CAAC;IAChE,IAAIpB,YAAY,EAAE,MAAM,IAAImB,KAAK,CAAC5B,QAAQ,CAACD,OAAO,CAAC+B,eAAe,CAAC,CAAC;IACpEpB,eAAe,CAAC,IAAI,CAAC;IACrB,IAAI;MACF,OAAO,MAAMgB,EAAE,CAACK,KAAK,CAAClB,aAAa,EAAEc,IAAI,CAAC;IAC5C,CAAC,SAAS;MACRjB,eAAe,CAAC,KAAK,CAAC;IACxB;EACF,CAAC,EACH,CAACH,OAAO,EAAEE,YAAY,EAAEI,aAAa,CACvC,CAAC;EAED,MAAMmB,MAAM,GAAGnC,WAAW,CACxB,MAAOoC,OAAyB,IAAK;IACnC,IAAI,CAAC1B,OAAO,EAAE,MAAM,IAAIqB,KAAK,CAAC5B,QAAQ,CAACD,OAAO,CAAC8B,eAAe,CAAC,CAAC;IAChE,IAAIpB,YAAY,EAAE,MAAM,IAAImB,KAAK,CAAC5B,QAAQ,CAACD,OAAO,CAAC+B,eAAe,CAAC,CAAC;IACpEpB,eAAe,CAAC,IAAI,CAAC;IACrBK,yBAAyB,CAAC,EAAE,CAAC;IAC7BE,4BAA4B,CAAC,EAAE,CAAC;IAChC,IAAIiB,aAAa,GAAG,EAAE;IACtB,IAAI;MACF,WAAW,MAAM;QAAEC,SAAS;QAAEC;MAAa,CAAC,IAAIvB,aAAa,CAACmB,MAAM,CAClEC,OACF,CAAC,EAAE;QACDlB,yBAAyB,CAAEsB,IAAI,IAAKA,IAAI,GAAGF,SAAS,CAAC;QACrDlB,4BAA4B,CAACmB,YAAY,CAAC;QAC1CF,aAAa,IAAIC,SAAS;MAC5B;IACF,CAAC,SAAS;MACRzB,eAAe,CAAC,KAAK,CAAC;IACxB;IACA,OAAOwB,aAAa;EACtB,CAAC,EACD,CAAC3B,OAAO,EAAEE,YAAY,EAAEI,aAAa,CACvC,CAAC;EAED,MAAMyB,OAAO,GAAGzC,WAAW,CACW6B,EAAK,IAAK;IAC5C,OAAO,CAAC,GAAGC,IAAmB,KAAoB;MAChD,IAAI,CAACpB,OAAO,EAAE,MAAM,IAAIqB,KAAK,CAAC5B,QAAQ,CAACD,OAAO,CAAC8B,eAAe,CAAC,CAAC;MAChE,OAAOH,EAAE,CAACK,KAAK,CAAClB,aAAa,EAAEc,IAAI,CAAC;IACtC,CAAC;EACH,CAAC,EACD,CAACpB,OAAO,EAAEM,aAAa,CACzB,CAAC;EAED,OAAO;IACLR,KAAK;IACLE
,OAAO;IACPE,YAAY;IACZE,gBAAgB;IAChBG,sBAAsB;IACtBE,yBAAyB;IACzBuB,MAAM,EAAEd,YAAY,CAACxB,kBAAkB,CAACuC,SAAS,CAACD,MAAM,CAAC;IACzDE,MAAM,EAAEhB,YAAY,CAACxB,kBAAkB,CAACuC,SAAS,CAACC,MAAM,CAAC;IACzDC,UAAU,EAAEjB,YAAY,CAACxB,kBAAkB,CAACuC,SAAS,CAACE,UAAU,CAAC;IACjEV,MAAM;IACNW,UAAU,EAAEL,OAAO,CAACrC,kBAAkB,CAACuC,SAAS,CAACG,UAAU,CAAC;IAC5DC,YAAY,EAAEN,OAAO,CAACrC,kBAAkB,CAACuC,SAAS,CAACI,YAAY;EACjE,CAAC;AACH,CAAC","ignoreList":[]}
@@ -1,4 +1,4 @@
- import { SpeechToTextModelConfig } from '../../types/stt';
+ import { DecodingOptions, SpeechToTextModelConfig } from '../../types/stt';
 export declare const useSpeechToText: ({ model, preventLoad, }: {
 model: SpeechToTextModelConfig;
 preventLoad?: boolean;
@@ -11,8 +11,8 @@ export declare const useSpeechToText: ({ model, preventLoad, }: {
 nonCommittedTranscription: string;
 encode: (waveform: number[] | Float32Array<ArrayBufferLike>) => Promise<Float32Array<ArrayBufferLike>>;
 decode: (tokens: number[] | Int32Array<ArrayBufferLike>, encoderOutput: number[] | Float32Array<ArrayBufferLike>) => Promise<Float32Array<ArrayBufferLike>>;
- transcribe: (waveform: number[] | Float32Array<ArrayBufferLike>, options?: import("../../types/stt").DecodingOptions | undefined) => Promise<string>;
- stream: () => Promise<string>;
+ transcribe: (waveform: number[] | Float32Array<ArrayBufferLike>, options?: DecodingOptions | undefined) => Promise<string>;
+ stream: (options?: DecodingOptions) => Promise<string>;
 streamStop: () => Promise<void>;
 streamInsert: (waveform: number[] | Float32Array<ArrayBufferLike>) => Promise<void>;
 };
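
This declaration change is the user-facing API update in this release: `stream` now accepts the same optional `DecodingOptions` argument that `transcribe` already took. A minimal consumer-side sketch follows; the model URIs and the `{ language: 'es' }` option value are placeholders for illustration and are not taken from the package, so check the actual `SpeechToTextModelConfig` and `DecodingOptions` types before copying.

```tsx
import { useEffect } from 'react';
import { useSpeechToText } from 'react-native-executorch';

// Placeholder model config: the field names come from the hunks above,
// but the URIs do not point at real assets.
const MODEL_CONFIG = {
  isMultilingual: true,
  encoderSource: 'https://example.com/whisper_encoder.pte',
  decoderSource: 'https://example.com/whisper_decoder.pte',
  tokenizerSource: 'https://example.com/tokenizer.json',
};

export function TranscriptionScreen() {
  const stt = useSpeechToText({ model: MODEL_CONFIG });

  useEffect(() => {
    if (!stt.isReady) return;
    // New in this release: stream() accepts optional DecodingOptions.
    // { language: 'es' } is an assumed option shape, not confirmed by the diff.
    stt
      .stream({ language: 'es' })
      .then((finalText) => console.log('final transcription:', finalText))
      .catch(console.error);
    return () => {
      void stt.streamStop();
    };
    // eslint-disable-next-line react-hooks/exhaustive-deps
  }, [stt.isReady]);

  // Audio chunks would be pushed in with stt.streamInsert(waveform) by the
  // app's recorder; partial results are read from stt.committedTranscription
  // and stt.nonCommittedTranscription.
  return null;
}
```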
@@ -1 +1 @@
- {"version":3,"file":"useSpeechToText.d.ts","sourceRoot":"","sources":["../../../../src/hooks/natural_language_processing/useSpeechToText.ts"],"names":[],"mappings":"AAGA,OAAO,EAAE,uBAAuB,EAAE,MAAM,iBAAiB,CAAC;AAE1D,eAAO,MAAM,eAAe,GAAI,yBAG7B;IACD,KAAK,EAAE,uBAAuB,CAAC;IAC/B,WAAW,CAAC,EAAE,OAAO,CAAC;CACvB;;;;;;;;;;;;;CAmGA,CAAC"}
+ {"version":3,"file":"useSpeechToText.d.ts","sourceRoot":"","sources":["../../../../src/hooks/natural_language_processing/useSpeechToText.ts"],"names":[],"mappings":"AAGA,OAAO,EAAE,eAAe,EAAE,uBAAuB,EAAE,MAAM,iBAAiB,CAAC;AAE3E,eAAO,MAAM,eAAe,GAAI,yBAG7B;IACD,KAAK,EAAE,uBAAuB,CAAC;IAC/B,WAAW,CAAC,EAAE,OAAO,CAAC;CACvB;;;;;;;;;;uBAyDoB,eAAe;;;CA+CnC,CAAC"}
package/package.json CHANGED
@@ -1,6 +1,6 @@
 {
 "name": "react-native-executorch",
- "version": "0.5.9",
+ "version": "0.5.11",
 "description": "An easy way to run AI models in React Native with ExecuTorch",
 "source": "./src/index.ts",
 "main": "./lib/module/index.js",
@@ -1,7 +1,7 @@
 import { useEffect, useCallback, useState } from 'react';
 import { ETError, getError } from '../../Error';
 import { SpeechToTextModule } from '../../modules/natural_language_processing/SpeechToTextModule';
- import { SpeechToTextModelConfig } from '../../types/stt';
+ import { DecodingOptions, SpeechToTextModelConfig } from '../../types/stt';
 
 export const useSpeechToText = ({
 model,
@@ -65,24 +65,29 @@ export const useSpeechToText = ({
 [isReady, isGenerating, modelInstance]
 );
 
- const stream = useCallback(async () => {
- if (!isReady) throw new Error(getError(ETError.ModuleNotLoaded));
- if (isGenerating) throw new Error(getError(ETError.ModelGenerating));
- setIsGenerating(true);
- setCommittedTranscription('');
- setNonCommittedTranscription('');
- let transcription = '';
- try {
- for await (const { committed, nonCommitted } of modelInstance.stream()) {
- setCommittedTranscription((prev) => prev + committed);
- setNonCommittedTranscription(nonCommitted);
- transcription += committed;
+ const stream = useCallback(
+ async (options?: DecodingOptions) => {
+ if (!isReady) throw new Error(getError(ETError.ModuleNotLoaded));
+ if (isGenerating) throw new Error(getError(ETError.ModelGenerating));
+ setIsGenerating(true);
+ setCommittedTranscription('');
+ setNonCommittedTranscription('');
+ let transcription = '';
+ try {
+ for await (const { committed, nonCommitted } of modelInstance.stream(
+ options
+ )) {
+ setCommittedTranscription((prev) => prev + committed);
+ setNonCommittedTranscription(nonCommitted);
+ transcription += committed;
+ }
+ } finally {
+ setIsGenerating(false);
 }
- } finally {
- setIsGenerating(false);
- }
- return transcription;
- }, [isReady, isGenerating, modelInstance]);
+ return transcription;
+ },
+ [isReady, isGenerating, modelInstance]
+ );
 
 const wrapper = useCallback(
 <T extends (...args: any[]) => any>(fn: T) => {
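
At the module level, the hunk above shows `modelInstance.stream(options)` being consumed as an async generator that yields `{ committed, nonCommitted }` chunks while audio is fed in separately via `streamInsert` and ended with `streamStop`. The sketch below illustrates that protocol on `SpeechToTextModule` directly; it assumes the class is importable from the package root, uses placeholder model URIs, and glosses over how audio capture and the exact insert/consume ordering are handled in a real app.

```ts
import { SpeechToTextModule } from 'react-native-executorch';

// Rough sketch of the streaming protocol the hook wraps (assumptions noted above).
async function transcribeLive(chunks: AsyncIterable<Float32Array>): Promise<string> {
  const stt = new SpeechToTextModule();

  // load() takes the model config plus a download-progress callback,
  // mirroring the load() call visible in the earlier hunks.
  await stt.load(
    {
      isMultilingual: false,
      encoderSource: 'https://example.com/whisper_encoder.pte', // placeholder
      decoderSource: 'https://example.com/whisper_decoder.pte', // placeholder
      tokenizerSource: 'https://example.com/tokenizer.json', // placeholder
    },
    (progress: number) => console.log('download progress:', progress)
  );

  // Push audio into the stream as it arrives, then close the stream.
  const feeder = (async () => {
    for await (const chunk of chunks) {
      await stt.streamInsert(chunk);
    }
    await stt.streamStop();
  })();

  // stream() is an async generator yielding { committed, nonCommitted };
  // the optional DecodingOptions argument is what this release adds.
  let transcription = '';
  for await (const { committed, nonCommitted } of stt.stream()) {
    transcription += committed;
    console.log('partial:', transcription + nonCommitted);
  }

  await feeder;
  return transcription;
}
```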