whisper.rn 0.3.0-rc.6 → 0.3.0

package/README.md CHANGED
@@ -13,7 +13,7 @@ React Native binding of [whisper.cpp](https://github.com/ggerganov/whisper.cpp).
  | <img src="https://github.com/mybigday/whisper.rn/assets/3001525/2fea7b2d-c911-44fb-9afc-8efc7b594446" width="300" /> | <img src="https://github.com/mybigday/whisper.rn/assets/3001525/a5005a6c-44f7-4db9-95e8-0fd951a2e147" width="300" /> |
  | :------------------------------------------: | :------------------------------------------: |
  | iOS: Tested on iPhone 13 Pro Max | Android: Tested on Pixel 6 |
- | (tiny.en, Core ML enabled) | (tiny.en, armv8.2-a+fp16) |
+ | (tiny.en, Core ML enabled, release mode + archive) | (tiny.en, armv8.2-a+fp16, release mode) |
 
  ## Installation
 
@@ -21,7 +21,9 @@ React Native binding of [whisper.cpp](https://github.com/ggerganov/whisper.cpp).
  npm install whisper.rn
  ```
 
- Then re-run `npx pod-install` again for iOS.
+ For iOS, please re-run `npx pod-install`.
+
+ For Android, on Apple Silicon Macs it's recommended to set `ndkVersion = "24.0.8215888"` (or above) in your root project build configuration; otherwise, please follow this [troubleshooting issue](./TROUBLESHOOTING.md#android-got-build-error-unknown-host-cpu-architecture-arm64-on-apple-silicon-macs).
 
  For Expo, you will need to prebuild the project before using it. See [Expo guide](https://docs.expo.io/guides/using-libraries/#using-a-library-in-a-expo-project) for more details.
 
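The `ndkVersion` setting added above lives in the root Android build configuration. A minimal sketch, assuming the standard React Native template layout (`android/build.gradle`); adjust to your project's actual structure:

```groovy
// android/build.gradle (root project) — sketch only, not part of whisper.rn.
buildscript {
    ext {
        // NDK r24+ ships an arm64 host toolchain, which avoids the
        // "Unknown host CPU architecture: arm64" error on Apple Silicon Macs.
        ndkVersion = "24.0.8215888"
    }
}
```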
@@ -129,9 +131,11 @@ To use Core ML on iOS, you will need to have the Core ML model files.
 
  The `.mlmodelc` model file is loaded based on the ggml model file path. For example, if your ggml model path is `ggml-tiny.en.bin`, the Core ML model path will be `ggml-tiny.en-encoder.mlmodelc`. Please note that the ggml model is still needed as a decoder or encoder fallback.
 
- Currently there is no official way to get the Core ML models by URL, you will need to convert Core ML models by yourself. Please see [Core ML Support](https://github.com/ggerganov/whisper.cpp#core-ml-support) of whisper.cpp for more details.
+ The Core ML models are hosted here: https://huggingface.co/ggerganov/whisper.cpp/tree/main
+
+ If you want to download the model at runtime, note that the hosted file is an archive, so you will need to unzip it to get the `.mlmodelc` directory. You can use a library like [react-native-zip-archive](https://github.com/mockingbot/react-native-zip-archive), or host the individual files and download them yourself.
 
- During the `.mlmodelc` is a directory, you will need to download 5 files (3 required):
+ The `.mlmodelc` is a directory; it usually includes 5 files (3 required):
 
  ```json5
  [
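The runtime-download flow described in the hunk above can be sketched as follows. The helper names and the `download`/`unzip` callbacks are hypothetical placeholders for whatever HTTP and zip libraries you use (e.g. react-native-fs and react-native-zip-archive); none of this is whisper.rn's own API:

```typescript
// Sketch only: function names and callbacks are illustrative, not library API.

// Naming rule from the README: ggml-tiny.en.bin -> ggml-tiny.en-encoder.mlmodelc
function coreMLPathFor(ggmlPath: string): string {
  return ggmlPath.replace(/\.bin$/, '-encoder.mlmodelc');
}

// Download the zipped Core ML model, then extract it so the .mlmodelc
// directory sits next to the ggml model file.
async function fetchCoreMLModel(
  url: string,
  destDir: string,
  download: (url: string, to: string) => Promise<void>,
  unzip: (archive: string, to: string) => Promise<void>,
): Promise<void> {
  const archive = `${destDir}/encoder.mlmodelc.zip`;
  await download(url, archive); // e.g. react-native-fs downloadFile
  await unzip(archive, destDir); // e.g. react-native-zip-archive unzip
}
```

Keeping the archive path and destination directory together this way makes it easy to skip the download when the `.mlmodelc` directory already exists on disk.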
@@ -168,13 +172,9 @@ In real world, we recommended to split the asset imports into another platform s
 
  The example app provides a simple UI for testing the functions.
 
- Used Whisper model: `tiny.en` in https://huggingface.co/datasets/ggerganov/whisper.cpp
+ Used Whisper model: `tiny.en` in https://huggingface.co/ggerganov/whisper.cpp
  Sample file: `jfk.wav` in https://github.com/ggerganov/whisper.cpp/tree/master/samples
 
- For test better performance on transcribe, you can run the app in Release mode.
- - iOS: `yarn example ios --configuration Release`
- - Android: `yarn example android --mode release`
-
  Please follow the [Development Workflow section of contributing guide](./CONTRIBUTING.md#development-workflow) to run the example app.
 
  ## Mock `whisper.rn`
@@ -1,15 +1,18 @@
  WHISPER_LIB_DIR := $(LOCAL_PATH)/../../../../../cpp
  LOCAL_LDLIBS := -landroid -llog
 
+ # NOTE: If you want to debug the native code, you can uncomment the ifneq and endif lines
+ # ifneq ($(APP_OPTIM),debug)
+
  # Make the final output library smaller by only keeping the symbols referenced from the app.
- ifneq ($(APP_OPTIM),debug)
- LOCAL_CFLAGS += -O3 -DNDEBUG
- LOCAL_CFLAGS += -fvisibility=hidden -fvisibility-inlines-hidden
- LOCAL_CFLAGS += -ffunction-sections -fdata-sections
- LOCAL_LDFLAGS += -Wl,--gc-sections
- LOCAL_LDFLAGS += -Wl,--exclude-libs,ALL
- LOCAL_LDFLAGS += -flto
- endif
+ LOCAL_CFLAGS += -O3 -DNDEBUG
+ LOCAL_CFLAGS += -fvisibility=hidden -fvisibility-inlines-hidden
+ LOCAL_CFLAGS += -ffunction-sections -fdata-sections
+ LOCAL_LDFLAGS += -Wl,--gc-sections
+ LOCAL_LDFLAGS += -Wl,--exclude-libs,ALL
+ LOCAL_LDFLAGS += -flto
+
+ # endif
 
  LOCAL_CFLAGS += -DSTDC_HEADERS -std=c11 -I $(WHISPER_LIB_DIR)
  LOCAL_CPPFLAGS += -std=c++11 -I $(WHISPER_LIB_DIR)