whisper-coreml 0.2.0 → 1.0.1

package/README.md CHANGED
@@ -5,106 +5,96 @@
5
5
  </p>
6
6
 
7
7
  <p align="center">
8
- <strong>OpenAI Whisper ASR for Node.js with CoreML/ANE acceleration on Apple Silicon</strong>
8
+ <strong>Best-in-class speech recognition for Node.js on Apple Silicon</strong>
9
9
  </p>
10
10
 
11
11
  <p align="center">
12
12
  <a href="https://github.com/sebastian-software/whisper-coreml/actions/workflows/ci.yml"><img src="https://github.com/sebastian-software/whisper-coreml/actions/workflows/ci.yml/badge.svg" alt="CI"></a>
13
13
  <a href="https://www.npmjs.com/package/whisper-coreml"><img src="https://img.shields.io/npm/v/whisper-coreml.svg" alt="npm version"></a>
14
14
  <a href="https://www.npmjs.com/package/whisper-coreml"><img src="https://img.shields.io/npm/dm/whisper-coreml.svg" alt="npm downloads"></a>
15
+ <a href="https://codecov.io/gh/sebastian-software/whisper-coreml"><img src="https://codecov.io/gh/sebastian-software/whisper-coreml/branch/main/graph/badge.svg" alt="codecov"></a>
15
16
  <br>
16
17
  <a href="https://www.typescriptlang.org/"><img src="https://img.shields.io/badge/TypeScript-5.x-blue.svg" alt="TypeScript"></a>
17
18
  <a href="https://nodejs.org/"><img src="https://img.shields.io/badge/Node.js-20+-green.svg" alt="Node.js"></a>
18
19
  <a href="https://opensource.org/licenses/MIT"><img src="https://img.shields.io/badge/License-MIT-yellow.svg" alt="License: MIT"></a>
19
20
  </p>
20
21
 
21
- Powered by [whisper.cpp](https://github.com/ggerganov/whisper.cpp) running on Apple's Neural Engine
22
- via CoreML.
22
+ **Transcribe audio in 99 languages. Run 100% offline on your Mac.**
23
23
 
24
- ## Why whisper-coreml?
24
+ OpenAI's Whisper is the gold standard for speech recognition accuracy. This package brings it to
25
+ Node.js – powered by Apple's Neural Engine for fast, private, local transcription.
25
26
 
26
- When you need **higher transcription quality** than
27
- [parakeet-coreml](https://github.com/sebastian-software/parakeet-coreml), Whisper's large-v3-turbo
28
- model delivers. It offers:
27
+ ## The Pitch
29
28
 
30
- - **99 language support** vs Parakeet's 25 European languages
31
- - **Better accuracy** on challenging audio (accents, background noise)
32
- - **Translation capability** (any language → English)
33
- - **Word-level confidence scores**
29
+ 🎯 **Accuracy first.** Whisper large-v3-turbo delivers state-of-the-art transcription quality –
30
+ better than any cloud API, right on your Mac.
34
31
 
35
- ### When to Use Which
32
+ 🌍 **99 languages.** From Afrikaans to Zulu. Handles accents, dialects, and background noise.
36
33
 
37
- | Use Case | Recommended |
38
- | ----------------------------------- | ------------------------------------------------------------------------ |
39
- | Fast transcription, major languages | [parakeet-coreml](https://github.com/sebastian-software/parakeet-coreml) |
40
- | Maximum accuracy, any language | **whisper-coreml** |
41
- | Translation to English | **whisper-coreml** |
42
- | Edge cases (accents, noise) | **whisper-coreml** |
34
+ 🔒 **100% private.** Your audio never leaves your device. No API keys. No cloud. No subscription.
43
35
 
44
- ## Features
45
-
46
- - 🎯 **99 Languages** – Full Whisper multilingual support
47
- - 🚀 **14x real-time** – Transcribe 1 hour of audio in ~4.5 minutes (M1 Ultra, measured)
48
- - 🍎 **Neural Engine Acceleration** – Runs on Apple's dedicated ML silicon via CoreML
49
- - 🔒 **Fully Offline** – All processing happens locally
50
- - 📦 **Zero Runtime Dependencies** – No Python, no subprocess
51
- - 📝 **Timestamps** – Segment-level timing for subtitles
52
- - 🔄 **Translation** – Translate any language to English
53
- - ⬇️ **Easy Setup** – Single CLI command to download the model
54
-
55
- ## Performance
56
-
57
- The CoreML encoder runs on Apple's Neural Engine for accelerated inference:
58
-
59
- **Measured: M1 Ultra**
36
+ **Fast enough.** 14x real-time on M1 Ultra – transcribe 1 hour of audio in under 5 minutes.
60
37
 
61
- ```
62
- 5 minutes of audio → 22.5 seconds
63
- Speed: 14x real-time
64
- 1 hour of audio in ~4.5 minutes
65
- ```
38
+ ## Why CoreML?
66
39
 
67
- Run your own benchmark:
40
+ Running Whisper without hardware acceleration is **painfully slow**. Here's how the alternatives
41
+ compare:
68
42
 
69
- ```bash
70
- git clone https://github.com/sebastian-software/whisper-coreml
71
- cd whisper-coreml && npm install && npm run benchmark
72
- ```
43
+ | Approach | Speed | Drawbacks |
44
+ | ----------------------- | ----------------- | --------------------------- |
45
+ | OpenAI Whisper (Python) | ~2x real-time | Slow, needs Python |
46
+ | whisper.cpp (CPU) | ~4x real-time | No acceleration |
47
+ | faster-whisper | ~6x real-time | Needs NVIDIA GPU |
48
+ | Cloud APIs | ~1x + latency | Costs $$$, privacy concerns |
49
+ | **whisper-coreml** | **14x real-time** | macOS only ✓ |
73
50
 
74
- ### Comparison with parakeet-coreml
51
+ The Neural Engine in every Apple Silicon Mac is a **dedicated ML accelerator** that usually sits
52
+ idle. This package puts it to work.
75
53
 
76
- | Metric | whisper-coreml | parakeet-coreml |
77
- | ---------------- | -------------- | --------------- |
78
- | Speed (M1 Ultra) | 14x real-time | 40x real-time |
79
- | Languages | 99 | 25 European |
80
- | Translation | ✅ Yes | ❌ No |
81
- | Accuracy (WER) | Lower (better) | Higher |
82
- | Model Size | ~3 GB | ~1.5 GB |
54
+ ### vs. parakeet-coreml
83
55
 
84
- **When to choose whisper-coreml:** Maximum accuracy, rare languages, translation, challenging audio.
56
+ Need even more speed? Our sister project
57
+ [parakeet-coreml](https://github.com/sebastian-software/parakeet-coreml) trades language coverage
58
+ for **40x real-time** performance.
85
59
 
86
- **When to choose parakeet-coreml:** Maximum speed, major languages only.
60
+ | | whisper-coreml | parakeet-coreml |
61
+ | ------------- | ------------------------ | --------------- |
62
+ | **Best for** | Accuracy, rare languages | Maximum speed |
63
+ | **Speed** | 14x real-time | 40x real-time |
64
+ | **Languages** | 99 | 25 European |
87
65
 
88
- ## Requirements
66
+ ## Features
89
67
 
90
- - macOS 14.0+ (Sonoma or later)
91
- - Apple Silicon (M1, M2, M3, M4 any variant)
92
- - Node.js 20+
68
+ - 🎯 **99 Languages** – Full OpenAI Whisper multilingual support
69
+ - 🚀 **14x real-time** – 1 hour of audio in ~4.5 minutes (M1 Ultra)
70
+ - 🍎 **Neural Engine** – Runs on Apple's dedicated ML chip via CoreML
71
+ - 🔒 **Fully Offline** – No internet required after setup
72
+ - 📦 **Zero Dependencies** – No Python, no subprocess, no hassle
73
+ - 📝 **Timestamps** – Segment-level timing for subtitles
74
+ - ⬇️ **One Command Setup** – `npx whisper-coreml download`
93
75
 
94
- ## Installation
76
+ ## Get Started
95
77
 
96
78
  ```bash
79
+ # Install
97
80
  npm install whisper-coreml
81
+
82
+ # Download the model (~1.5GB, one-time)
83
+ npx whisper-coreml download
98
84
  ```
99
85
 
100
- ### Download the Model
86
+ **Requirements:** macOS 14+ (Sonoma), Apple Silicon (M1/M2/M3/M4), Node.js 20+
101
87
 
102
- ```bash
103
- npx whisper-coreml download
88
+ ## Performance
89
+
90
+ Measured on M1 Ultra:
91
+
92
+ ```
93
+ 5 min audio → 22 seconds → 14x real-time
94
+ 1 hour audio → 4.5 minutes
104
95
  ```
105
96
 
106
- This downloads the **large-v3-turbo** model (~1.5GB) the only model we support, as it offers the
107
- best speed/quality ratio.
97
+ Run `npx whisper-coreml benchmark` to test on your machine.
108
98
 
109
99
  ## Quick Start
110
100
 
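The Quick Start section itself is unchanged by this release. For orientation, here is a minimal sketch of the 1.x API as documented in the source bundled further down in this diff; the ffmpeg-based decoding helper mirrors the loader used by the package's own benchmark CLI (ffmpeg must be installed), and `speech.ogg` is a placeholder filename.

```typescript
import { execSync } from "node:child_process"
import { WhisperAsrEngine, getModelPath, isModelDownloaded } from "whisper-coreml"

// Decode an audio file to the mono 16 kHz Float32Array samples the engine
// expects (same approach as the package's benchmark CLI; requires ffmpeg).
function loadAudio(path: string): Float32Array {
  const pcm = execSync(`ffmpeg -i "${path}" -ar 16000 -ac 1 -f s16le -acodec pcm_s16le -`, {
    maxBuffer: 50 * 1024 * 1024
  })
  const pcm16 = new Int16Array(pcm.buffer, pcm.byteOffset, pcm.length / 2)
  const samples = new Float32Array(pcm16.length)
  for (let i = 0; i < pcm16.length; i++) {
    samples[i] = (pcm16[i] ?? 0) / 32768
  }
  return samples
}

if (!isModelDownloaded()) {
  throw new Error("Model missing - run: npx whisper-coreml download")
}

const engine = new WhisperAsrEngine({ modelPath: getModelPath() })
await engine.initialize()

const result = await engine.transcribe(loadAudio("speech.ogg"), 16000)
console.log(result.language, result.text)
for (const segment of result.segments) {
  console.log(`[${segment.startMs}-${segment.endMs} ms] ${segment.text}`)
}

engine.cleanup()
```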
@@ -188,12 +178,11 @@ new WhisperAsrEngine(options: WhisperAsrOptions)
188
178
 
189
179
  #### Options
190
180
 
191
- | Option | Type | Default | Description |
192
- | ----------- | --------- | -------- | --------------------------------- |
193
- | `modelPath` | `string` | required | Path to ggml model file |
194
- | `language` | `string` | `"auto"` | Language code or "auto" to detect |
195
- | `translate` | `boolean` | `false` | Translate to English |
196
- | `threads` | `number` | `0` | CPU threads (0 = auto) |
181
+ | Option | Type | Default | Description |
182
+ | ----------- | -------- | -------- | --------------------------------- |
183
+ | `modelPath` | `string` | required | Path to ggml model file |
184
+ | `language` | `string` | `"auto"` | Language code or "auto" to detect |
185
+ | `threads` | `number` | `0` | CPU threads (0 = auto) |
197
186
 
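To make the new options table concrete, a small illustrative sketch (the chosen values are examples, not defaults):

```typescript
import { WhisperAsrEngine, getModelPath } from "whisper-coreml"

// Only modelPath is required; language and threads fall back to the
// defaults listed above ("auto" detection, 0 = automatic thread count).
const engine = new WhisperAsrEngine({
  modelPath: getModelPath(), // ~/.cache/whisper-coreml/models/ggml-large-v3-turbo.bin
  language: "de",            // skip auto-detection when the language is known
  threads: 4                 // or 0 to let the engine decide
})
```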
198
187
  #### Methods
199
188
 
@@ -233,18 +222,6 @@ interface TranscriptionSegment {
233
222
  | `isModelDownloaded()` | Check if model is downloaded |
234
223
  | `downloadModel()` | Download the model |
235
224
 
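These helpers can also be called programmatically instead of through the CLI. A sketch based on the `DownloadOptions` interface shipped in this version (the progress fields come straight from the bundled source):

```typescript
import { downloadModel, isModelDownloaded, getModelPath } from "whisper-coreml"

// Fetch ggml-large-v3-turbo from Hugging Face into the default cache
// directory unless it is already present; report progress while downloading.
if (!isModelDownloaded()) {
  await downloadModel({
    onProgress: ({ downloadedBytes, totalBytes, percent }) => {
      console.log(`${percent}% (${downloadedBytes} of ${totalBytes} bytes)`)
    }
  })
}

console.log(`Model ready at ${getModelPath()}`)
```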
236
- ## Translation
237
-
238
- Translate any language to English:
239
-
240
- ```typescript
241
- const engine = new WhisperAsrEngine({
242
- modelPath: getModelPath(),
243
- language: "de", // German input
244
- translate: true // Output in English
245
- })
246
- ```
247
-
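Worth flagging for anyone upgrading from 0.2.0: the `translate` option is removed from the public API in 1.x (the dist hunks below drop it from the native `initialize()` call as well), so code that passed `translate: true` needs to drop that field; Whisper's translate-to-English mode is no longer exposed by this package. A minimal adjustment sketch:

```typescript
import { WhisperAsrEngine, getModelPath } from "whisper-coreml"

// 0.2.0 accepted: { modelPath, language: "de", translate: true }
// 1.0.1 removes translate from WhisperAsrOptions, so construct the engine without it:
const engine = new WhisperAsrEngine({ modelPath: getModelPath(), language: "de" })
```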
248
225
  ## Architecture
249
226
 
250
227
  ```
@@ -266,11 +243,10 @@ const engine = new WhisperAsrEngine({
266
243
 
267
244
  ## Use Cases
268
245
 
269
- - **Maximum accuracy** – When Parakeet's quality isn't sufficient
270
- - **Rare languages** – Languages not supported by Parakeet
271
- - **Translation** – Convert foreign speech to English text
272
- - **Accented speech** – Whisper handles accents better
273
- - **Noisy audio** – More robust to background noise
246
+ - **Maximum accuracy** – When other solutions aren't good enough
247
+ - **Rare languages** – 99 languages, far beyond English/European
248
+ - **Accented speech** – Whisper handles accents and dialects well
249
+ - **Noisy audio** – Robust to background noise and music
274
250
 
275
251
  ## Contributing
276
252
 
Binary file
@@ -140,7 +140,6 @@ var WhisperAsrEngine = class {
140
140
  const success = nativeAddon.initialize({
141
141
  modelPath: this.options.modelPath,
142
142
  language: this.options.language ?? "auto",
143
- translate: this.options.translate ?? false,
144
143
  threads: this.options.threads ?? 0
145
144
  });
146
145
  if (!success) {
@@ -213,4 +212,4 @@ export {
213
212
  getLoadError,
214
213
  WhisperAsrEngine
215
214
  };
216
- //# sourceMappingURL=chunk-MOQMN4DX.js.map
215
+ //# sourceMappingURL=chunk-V34ZDICO.js.map
@@ -0,0 +1 @@
1
+ {"version":3,"sources":["../src/download.ts","../src/index.ts"],"sourcesContent":["/**\n * Model download functionality for whisper-coreml\n *\n * Note: We only support large-v3-turbo as it's the only Whisper model\n * that offers better quality than Parakeet while maintaining reasonable speed.\n */\n\nimport { existsSync, mkdirSync, writeFileSync, rmSync } from \"node:fs\"\nimport { homedir } from \"node:os\"\nimport { join, dirname } from \"node:path\"\n\n/**\n * Whisper large-v3-turbo model info\n * This is the only model we support as it offers the best speed/quality ratio\n * and is the main reason to choose Whisper over Parakeet.\n */\nexport const WHISPER_MODEL = {\n name: \"large-v3-turbo\",\n size: \"1.5 GB\",\n languages: \"99 languages\",\n url: \"https://huggingface.co/ggerganov/whisper.cpp/resolve/main/ggml-large-v3-turbo.bin\"\n} as const\n\n/**\n * Default model directory in user's cache\n */\nexport function getDefaultModelDir(): string {\n return join(homedir(), \".cache\", \"whisper-coreml\", \"models\")\n}\n\n/**\n * Get the path to the model\n */\nexport function getModelPath(modelDir?: string): string {\n const dir = modelDir ?? getDefaultModelDir()\n return join(dir, `ggml-${WHISPER_MODEL.name}.bin`)\n}\n\n/**\n * Check if the model is downloaded\n */\nexport function isModelDownloaded(modelDir?: string): boolean {\n const modelPath = getModelPath(modelDir)\n return existsSync(modelPath)\n}\n\ninterface DownloadProgress {\n downloadedBytes: number\n totalBytes: number\n percent: number\n}\n\nexport interface DownloadOptions {\n /** Target directory for model (default: ~/.cache/whisper-coreml/models) */\n modelDir?: string\n\n /** Progress callback */\n onProgress?: (progress: DownloadProgress) => void\n\n /** Force re-download even if model exists */\n force?: boolean\n}\n\n/* v8 ignore start - network I/O */\n\n/**\n * Download the Whisper large-v3-turbo model from Hugging Face\n */\nexport async function downloadModel(options: DownloadOptions = {}): Promise<string> {\n const modelDir = options.modelDir ?? getDefaultModelDir()\n const modelPath = getModelPath(modelDir)\n\n if (!options.force && existsSync(modelPath)) {\n return modelPath\n }\n\n // Clean up partial downloads\n if (existsSync(modelPath)) {\n rmSync(modelPath)\n }\n\n mkdirSync(dirname(modelPath), { recursive: true })\n\n console.log(`Downloading Whisper ${WHISPER_MODEL.name} (${WHISPER_MODEL.size})...`)\n console.log(`Source: ${WHISPER_MODEL.url}`)\n console.log(`Target: ${modelPath}`)\n\n const response = await fetch(WHISPER_MODEL.url)\n if (!response.ok) {\n throw new Error(`Failed to download model: ${response.statusText}`)\n }\n\n const contentLength = response.headers.get(\"content-length\")\n const totalBytes = contentLength ? parseInt(contentLength, 10) : 0\n\n const reader = response.body?.getReader()\n if (!reader) {\n throw new Error(\"Failed to get response body reader\")\n }\n\n const chunks: Uint8Array[] = []\n let downloadedBytes = 0\n\n // eslint-disable-next-line @typescript-eslint/no-unnecessary-condition\n while (true) {\n const result = await reader.read()\n if (result.done) {\n break\n }\n\n const chunk = result.value as Uint8Array\n chunks.push(chunk)\n downloadedBytes += chunk.length\n\n const percent = totalBytes > 0 ? 
Math.round((downloadedBytes / totalBytes) * 100) : 0\n\n if (options.onProgress) {\n options.onProgress({\n downloadedBytes,\n totalBytes,\n percent\n })\n }\n\n // Progress indicator\n process.stdout.write(\n `\\rProgress: ${String(percent)}% (${formatBytes(downloadedBytes)}/${formatBytes(totalBytes)})`\n )\n }\n\n // Combine chunks and write to file\n const buffer = Buffer.concat(chunks)\n writeFileSync(modelPath, buffer)\n\n console.log(\"\\n✓ Model downloaded successfully!\")\n return modelPath\n}\n\n/* v8 ignore stop */\n\n/**\n * Format bytes to human readable string\n * @internal Exported for testing\n */\nexport function formatBytes(bytes: number): string {\n if (bytes < 1024) {\n return `${String(bytes)} B`\n }\n if (bytes < 1024 * 1024) {\n return `${(bytes / 1024).toFixed(1)} KB`\n }\n if (bytes < 1024 * 1024 * 1024) {\n return `${(bytes / 1024 / 1024).toFixed(1)} MB`\n }\n return `${(bytes / 1024 / 1024 / 1024).toFixed(2)} GB`\n}\n","/**\n * whisper-coreml\n *\n * OpenAI Whisper ASR for Node.js with CoreML/ANE acceleration on Apple Silicon.\n * Based on whisper.cpp with Apple Neural Engine support.\n *\n * Uses the large-v3-turbo model exclusively, as it offers the best speed/quality\n * ratio and is the main reason to choose Whisper over Parakeet.\n */\n\n// Dynamic require for loading native addon (works in both ESM and CJS)\n// eslint-disable-next-line @typescript-eslint/no-require-imports\nconst bindingsModule = require(\"bindings\") as (name: string) => unknown\n\n/**\n * Native addon interface\n */\ninterface NativeAddon {\n initialize(options: { modelPath: string; language?: string; threads?: number }): boolean\n isInitialized(): boolean\n transcribe(samples: Float32Array, sampleRate: number): NativeTranscriptionResult\n cleanup(): void\n getVersion(): { addon: string; whisper: string; coreml: string }\n}\n\ninterface NativeTranscriptionResult {\n text: string\n language: string\n durationMs: number\n segments: {\n startMs: number\n endMs: number\n text: string\n confidence: number\n }[]\n}\n\n/* v8 ignore start - platform checks and native addon loading */\n\n/**\n * Load the native addon\n */\nfunction loadAddon(): NativeAddon {\n if (process.platform !== \"darwin\") {\n throw new Error(\"whisper-coreml is only supported on macOS\")\n }\n\n try {\n return bindingsModule(\"whisper_asr\") as NativeAddon\n } catch (error) {\n const message = error instanceof Error ? error.message : String(error)\n throw new Error(`Failed to load Whisper ASR native addon: ${message}`)\n }\n}\n\n/* v8 ignore stop */\n\nlet addon: NativeAddon | null = null\nlet loadError: Error | null = null\n\nfunction getAddon(): NativeAddon {\n if (!addon) {\n try {\n addon = loadAddon()\n } catch (error) {\n // v8 ignore - error path only reached with corrupted installation\n loadError = error instanceof Error ? 
error : new Error(String(error))\n throw error\n }\n }\n return addon\n}\n\n/**\n * Check if Whisper ASR is available on this platform\n */\nexport function isAvailable(): boolean {\n return process.platform === \"darwin\" && process.arch === \"arm64\"\n}\n\n/**\n * Get the load error if the addon failed to load\n */\nexport function getLoadError(): Error | null {\n return loadError\n}\n\n/**\n * Transcription segment with timestamps\n */\nexport interface TranscriptionSegment {\n /** Start time in milliseconds */\n startMs: number\n /** End time in milliseconds */\n endMs: number\n /** Transcribed text for this segment */\n text: string\n /** Confidence score (0-1) */\n confidence: number\n}\n\n/**\n * Transcription result\n */\nexport interface TranscriptionResult {\n /** Full transcribed text */\n text: string\n /** Detected or specified language (ISO code) */\n language: string\n /** Processing time in milliseconds */\n durationMs: number\n /** Individual segments with timestamps */\n segments: TranscriptionSegment[]\n}\n\n/**\n * Whisper ASR engine options\n */\nexport interface WhisperAsrOptions {\n /** Path to the Whisper model file (ggml format) */\n modelPath: string\n /** Language code (e.g., \"en\", \"de\", \"fr\") or \"auto\" for auto-detection */\n language?: string\n /** Number of threads (0 = auto) */\n threads?: number\n}\n\n/**\n * Whisper ASR Engine with CoreML acceleration\n *\n * Uses the large-v3-turbo model for best speed/quality balance.\n *\n * @example\n * ```typescript\n * import { WhisperAsrEngine, getModelPath } from \"whisper-coreml\"\n *\n * const engine = new WhisperAsrEngine({\n * modelPath: getModelPath()\n * })\n *\n * await engine.initialize()\n * const result = await engine.transcribe(audioSamples, 16000)\n * console.log(result.text)\n * ```\n */\nexport class WhisperAsrEngine {\n private options: WhisperAsrOptions\n private initialized = false\n\n constructor(options: WhisperAsrOptions) {\n this.options = options\n }\n\n /* v8 ignore start - native addon calls, tested via E2E */\n\n /**\n * Initialize the Whisper engine\n * This loads the model into memory - may take a few seconds.\n */\n initialize(): Promise<void> {\n if (this.initialized) {\n return Promise.resolve()\n }\n\n const nativeAddon = getAddon()\n const success = nativeAddon.initialize({\n modelPath: this.options.modelPath,\n language: this.options.language ?? \"auto\",\n threads: this.options.threads ?? 0\n })\n\n if (!success) {\n return Promise.reject(new Error(\"Failed to initialize Whisper engine\"))\n }\n\n this.initialized = true\n return Promise.resolve()\n }\n\n /**\n * Check if the engine is ready for transcription\n */\n isReady(): boolean {\n if (!this.initialized) {\n return false\n }\n try {\n return getAddon().isInitialized()\n } catch {\n return false\n }\n }\n\n /**\n * Transcribe audio samples\n *\n * @param samples - Float32Array of audio samples (mono, 16kHz)\n * @param sampleRate - Sample rate in Hz (default: 16000)\n * @returns Transcription result with text and segments\n */\n transcribe(samples: Float32Array, sampleRate = 16000): Promise<TranscriptionResult> {\n if (!this.initialized) {\n return Promise.reject(new Error(\"Whisper engine not initialized. 
Call initialize() first.\"))\n }\n\n const result = getAddon().transcribe(samples, sampleRate)\n\n return Promise.resolve({\n text: result.text,\n language: result.language,\n durationMs: result.durationMs,\n segments: result.segments\n })\n }\n\n /**\n * Clean up resources and unload the model\n */\n cleanup(): void {\n if (this.initialized) {\n try {\n getAddon().cleanup()\n } catch {\n // Ignore cleanup errors\n }\n this.initialized = false\n }\n }\n\n /**\n * Get version information\n */\n getVersion(): { addon: string; whisper: string; coreml: string } {\n return getAddon().getVersion()\n }\n\n /* v8 ignore stop */\n}\n\n// Re-export download utilities\nexport {\n downloadModel,\n formatBytes,\n getDefaultModelDir,\n getModelPath,\n isModelDownloaded,\n WHISPER_MODEL,\n type DownloadOptions\n} from \"./download.js\"\n"],"mappings":";;;;;;;;AAOA,SAAS,YAAY,WAAW,eAAe,cAAc;AAC7D,SAAS,eAAe;AACxB,SAAS,MAAM,eAAe;AAOvB,IAAM,gBAAgB;AAAA,EAC3B,MAAM;AAAA,EACN,MAAM;AAAA,EACN,WAAW;AAAA,EACX,KAAK;AACP;AAKO,SAAS,qBAA6B;AAC3C,SAAO,KAAK,QAAQ,GAAG,UAAU,kBAAkB,QAAQ;AAC7D;AAKO,SAAS,aAAa,UAA2B;AACtD,QAAM,MAAM,YAAY,mBAAmB;AAC3C,SAAO,KAAK,KAAK,QAAQ,cAAc,IAAI,MAAM;AACnD;AAKO,SAAS,kBAAkB,UAA4B;AAC5D,QAAM,YAAY,aAAa,QAAQ;AACvC,SAAO,WAAW,SAAS;AAC7B;AAwBA,eAAsB,cAAc,UAA2B,CAAC,GAAoB;AAClF,QAAM,WAAW,QAAQ,YAAY,mBAAmB;AACxD,QAAM,YAAY,aAAa,QAAQ;AAEvC,MAAI,CAAC,QAAQ,SAAS,WAAW,SAAS,GAAG;AAC3C,WAAO;AAAA,EACT;AAGA,MAAI,WAAW,SAAS,GAAG;AACzB,WAAO,SAAS;AAAA,EAClB;AAEA,YAAU,QAAQ,SAAS,GAAG,EAAE,WAAW,KAAK,CAAC;AAEjD,UAAQ,IAAI,uBAAuB,cAAc,IAAI,KAAK,cAAc,IAAI,MAAM;AAClF,UAAQ,IAAI,WAAW,cAAc,GAAG,EAAE;AAC1C,UAAQ,IAAI,WAAW,SAAS,EAAE;AAElC,QAAM,WAAW,MAAM,MAAM,cAAc,GAAG;AAC9C,MAAI,CAAC,SAAS,IAAI;AAChB,UAAM,IAAI,MAAM,6BAA6B,SAAS,UAAU,EAAE;AAAA,EACpE;AAEA,QAAM,gBAAgB,SAAS,QAAQ,IAAI,gBAAgB;AAC3D,QAAM,aAAa,gBAAgB,SAAS,eAAe,EAAE,IAAI;AAEjE,QAAM,SAAS,SAAS,MAAM,UAAU;AACxC,MAAI,CAAC,QAAQ;AACX,UAAM,IAAI,MAAM,oCAAoC;AAAA,EACtD;AAEA,QAAM,SAAuB,CAAC;AAC9B,MAAI,kBAAkB;AAGtB,SAAO,MAAM;AACX,UAAM,SAAS,MAAM,OAAO,KAAK;AACjC,QAAI,OAAO,MAAM;AACf;AAAA,IACF;AAEA,UAAM,QAAQ,OAAO;AACrB,WAAO,KAAK,KAAK;AACjB,uBAAmB,MAAM;AAEzB,UAAM,UAAU,aAAa,IAAI,KAAK,MAAO,kBAAkB,aAAc,GAAG,IAAI;AAEpF,QAAI,QAAQ,YAAY;AACtB,cAAQ,WAAW;AAAA,QACjB;AAAA,QACA;AAAA,QACA;AAAA,MACF,CAAC;AAAA,IACH;AAGA,YAAQ,OAAO;AAAA,MACb,eAAe,OAAO,OAAO,CAAC,MAAM,YAAY,eAAe,CAAC,IAAI,YAAY,UAAU,CAAC;AAAA,IAC7F;AAAA,EACF;AAGA,QAAM,SAAS,OAAO,OAAO,MAAM;AACnC,gBAAc,WAAW,MAAM;AAE/B,UAAQ,IAAI,yCAAoC;AAChD,SAAO;AACT;AAQO,SAAS,YAAY,OAAuB;AACjD,MAAI,QAAQ,MAAM;AAChB,WAAO,GAAG,OAAO,KAAK,CAAC;AAAA,EACzB;AACA,MAAI,QAAQ,OAAO,MAAM;AACvB,WAAO,IAAI,QAAQ,MAAM,QAAQ,CAAC,CAAC;AAAA,EACrC;AACA,MAAI,QAAQ,OAAO,OAAO,MAAM;AAC9B,WAAO,IAAI,QAAQ,OAAO,MAAM,QAAQ,CAAC,CAAC;AAAA,EAC5C;AACA,SAAO,IAAI,QAAQ,OAAO,OAAO,MAAM,QAAQ,CAAC,CAAC;AACnD;;;AC/IA,IAAM,iBAAiB,UAAQ,UAAU;AA8BzC,SAAS,YAAyB;AAChC,MAAI,QAAQ,aAAa,UAAU;AACjC,UAAM,IAAI,MAAM,2CAA2C;AAAA,EAC7D;AAEA,MAAI;AACF,WAAO,eAAe,aAAa;AAAA,EACrC,SAAS,OAAO;AACd,UAAM,UAAU,iBAAiB,QAAQ,MAAM,UAAU,OAAO,KAAK;AACrE,UAAM,IAAI,MAAM,4CAA4C,OAAO,EAAE;AAAA,EACvE;AACF;AAIA,IAAI,QAA4B;AAChC,IAAI,YAA0B;AAE9B,SAAS,WAAwB;AAC/B,MAAI,CAAC,OAAO;AACV,QAAI;AACF,cAAQ,UAAU;AAAA,IACpB,SAAS,OAAO;AAEd,kBAAY,iBAAiB,QAAQ,QAAQ,IAAI,MAAM,OAAO,KAAK,CAAC;AACpE,YAAM;AAAA,IACR;AAAA,EACF;AACA,SAAO;AACT;AAKO,SAAS,cAAuB;AACrC,SAAO,QAAQ,aAAa,YAAY,QAAQ,SAAS;AAC3D;AAKO,SAAS,eAA6B;AAC3C,SAAO;AACT;AA4DO,IAAM,mBAAN,MAAuB;AAAA,EACpB;AAAA,EACA,cAAc;AAAA,EAEtB,YAAY,SAA4B;AACtC,SAAK,UAAU;AAAA,EACjB;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,EAQA,aAA4B;AAC1B,QAAI,KAAK,aAAa;AACpB,aAAO,QAAQ,QAAQ;AAAA,IACzB;AAEA,UAAM,cAAc,SAAS;AAC7B,UAAM,UAAU,YAAY,WAAW;AAAA,MAC
rC,WAAW,KAAK,QAAQ;AAAA,MACxB,UAAU,KAAK,QAAQ,YAAY;AAAA,MACnC,SAAS,KAAK,QAAQ,WAAW;AAAA,IACnC,CAAC;AAED,QAAI,CAAC,SAAS;AACZ,aAAO,QAAQ,OAAO,IAAI,MAAM,qCAAqC,CAAC;AAAA,IACxE;AAEA,SAAK,cAAc;AACnB,WAAO,QAAQ,QAAQ;AAAA,EACzB;AAAA;AAAA;AAAA;AAAA,EAKA,UAAmB;AACjB,QAAI,CAAC,KAAK,aAAa;AACrB,aAAO;AAAA,IACT;AACA,QAAI;AACF,aAAO,SAAS,EAAE,cAAc;AAAA,IAClC,QAAQ;AACN,aAAO;AAAA,IACT;AAAA,EACF;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,EASA,WAAW,SAAuB,aAAa,MAAqC;AAClF,QAAI,CAAC,KAAK,aAAa;AACrB,aAAO,QAAQ,OAAO,IAAI,MAAM,0DAA0D,CAAC;AAAA,IAC7F;AAEA,UAAM,SAAS,SAAS,EAAE,WAAW,SAAS,UAAU;AAExD,WAAO,QAAQ,QAAQ;AAAA,MACrB,MAAM,OAAO;AAAA,MACb,UAAU,OAAO;AAAA,MACjB,YAAY,OAAO;AAAA,MACnB,UAAU,OAAO;AAAA,IACnB,CAAC;AAAA,EACH;AAAA;AAAA;AAAA;AAAA,EAKA,UAAgB;AACd,QAAI,KAAK,aAAa;AACpB,UAAI;AACF,iBAAS,EAAE,QAAQ;AAAA,MACrB,QAAQ;AAAA,MAER;AACA,WAAK,cAAc;AAAA,IACrB;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKA,aAAiE;AAC/D,WAAO,SAAS,EAAE,WAAW;AAAA,EAC/B;AAAA;AAGF;","names":[]}
package/dist/cli.cjs CHANGED
@@ -143,7 +143,6 @@ var WhisperAsrEngine = class {
143
143
  const success = nativeAddon.initialize({
144
144
  modelPath: this.options.modelPath,
145
145
  language: this.options.language ?? "auto",
146
- translate: this.options.translate ?? false,
147
146
  threads: this.options.threads ?? 0
148
147
  });
149
148
  if (!success) {
package/dist/cli.cjs.map CHANGED
@@ -1 +1 @@
1
- {"version":3,"sources":["../node_modules/.pnpm/tsup@8.5.1_jiti@2.6.1_postcss@8.5.6_typescript@5.9.3_yaml@2.8.2/node_modules/tsup/assets/cjs_shims.js","../src/cli.ts","../src/download.ts","../src/index.ts"],"sourcesContent":["// Shim globals in cjs bundle\n// There's a weird bug that esbuild will always inject importMetaUrl\n// if we export it as `const importMetaUrl = ... __filename ...`\n// But using a function will not cause this issue\n\nconst getImportMetaUrl = () => \n typeof document === \"undefined\" \n ? new URL(`file:${__filename}`).href \n : (document.currentScript && document.currentScript.tagName.toUpperCase() === 'SCRIPT') \n ? document.currentScript.src \n : new URL(\"main.js\", document.baseURI).href;\n\nexport const importMetaUrl = /* @__PURE__ */ getImportMetaUrl()\n","#!/usr/bin/env node\n/**\n * CLI for whisper-coreml\n */\n\nimport { existsSync } from \"node:fs\"\nimport { join, dirname } from \"node:path\"\nimport { fileURLToPath } from \"node:url\"\nimport { execSync } from \"node:child_process\"\n\nimport {\n downloadModel,\n getDefaultModelDir,\n getModelPath,\n isModelDownloaded,\n WHISPER_MODEL\n} from \"./download.js\"\n\nimport { WhisperAsrEngine, isAvailable } from \"./index.js\"\n\nconst args = process.argv.slice(2)\nconst command = args[0]\n\nfunction printHelp(): void {\n console.log(`\nwhisper-coreml CLI\n\nCommands:\n download [--force] Download the Whisper model (~1.5GB)\n benchmark Run performance benchmark\n status Check if model is downloaded\n path Print model directory path\n\nThe large-v3-turbo model offers the best speed/quality ratio\nand is the reason to choose Whisper over Parakeet.\n\nOptions:\n --force Force re-download even if model exists\n --help, -h Show this help message\n`)\n}\n\n/**\n * Load audio file as Float32Array using ffmpeg\n */\nfunction loadAudio(path: string): Float32Array {\n try {\n // Convert to raw PCM using ffmpeg\n const pcmBuffer = execSync(`ffmpeg -i \"${path}\" -ar 16000 -ac 1 -f s16le -acodec pcm_s16le -`, {\n encoding: \"buffer\",\n stdio: [\"pipe\", \"pipe\", \"pipe\"],\n maxBuffer: 50 * 1024 * 1024\n })\n const pcm16 = new Int16Array(pcmBuffer.buffer, pcmBuffer.byteOffset, pcmBuffer.length / 2)\n const samples = new Float32Array(pcm16.length)\n for (let i = 0; i < pcm16.length; i++) {\n samples[i] = (pcm16[i] ?? 0) / 32768.0\n }\n return samples\n } catch {\n console.error(\"ffmpeg is required for benchmark. Install with: brew install ffmpeg\")\n process.exit(1)\n }\n}\n\n/**\n * Get chip name from system\n */\nfunction getChipName(): string {\n try {\n const output = execSync(\"sysctl -n machdep.cpu.brand_string\", { encoding: \"utf-8\" })\n return output.trim()\n } catch {\n return \"Unknown\"\n }\n}\n\nasync function runBenchmark(): Promise<void> {\n if (!isAvailable()) {\n console.error(\"Benchmark requires macOS with Apple Silicon\")\n process.exit(1)\n }\n\n if (!isModelDownloaded()) {\n console.error(\"Model not downloaded. Run: npx whisper-coreml download\")\n process.exit(1)\n }\n\n // Find the benchmark audio file\n const __dirname = dirname(fileURLToPath(import.meta.url))\n const possiblePaths = [\n join(__dirname, \"../test/fixtures/brian.ogg\"),\n join(__dirname, \"../../test/fixtures/brian.ogg\"),\n join(process.cwd(), \"test/fixtures/brian.ogg\")\n ]\n\n const audioPath = possiblePaths.find((p) => existsSync(p))\n\n if (!audioPath) {\n console.error(\"Benchmark audio not found. 
Clone the repository to run benchmarks:\")\n console.error(\" git clone https://github.com/sebastian-software/whisper-coreml\")\n console.error(\" cd whisper-coreml && pnpm install && pnpm benchmark\")\n process.exit(1)\n }\n\n console.log(\"Whisper CoreML Benchmark\")\n console.log(\"========================\\n\")\n\n const chip = getChipName()\n console.log(`Chip: ${chip}`)\n console.log(`Model: ${WHISPER_MODEL.name}`)\n console.log(`Node: ${process.version}\\n`)\n\n // Load audio\n console.log(\"Loading audio...\")\n const samples = loadAudio(audioPath)\n const audioDuration = samples.length / 16000\n\n console.log(`Audio: ${audioDuration.toFixed(1)}s (${samples.length.toLocaleString()} samples)\\n`)\n\n // Initialize engine\n console.log(\"Initializing engine...\")\n const modelPath = getModelPath()\n const engine = new WhisperAsrEngine({ modelPath })\n\n const initStart = performance.now()\n await engine.initialize()\n const initTime = performance.now() - initStart\n\n console.log(`Init time: ${(initTime / 1000).toFixed(2)}s\\n`)\n\n // Warm-up run\n console.log(\"Warm-up run...\")\n await engine.transcribe(samples.slice(0, 16000 * 5), 16000) // 5 seconds\n\n // Benchmark runs\n const runs = 3\n console.log(`\\nBenchmark (${String(runs)} runs)...\\n`)\n\n const times: number[] = []\n\n for (let i = 0; i < runs; i++) {\n const result = await engine.transcribe(samples, 16000)\n times.push(result.durationMs)\n console.log(` Run ${String(i + 1)}: ${(result.durationMs / 1000).toFixed(3)}s`)\n }\n\n engine.cleanup()\n\n // Calculate stats\n const avgTime = times.reduce((a, b) => a + b, 0) / times.length\n const rtf = avgTime / 1000 / audioDuration\n const speedup = 1 / rtf\n\n console.log(\"\\n─────────────────────────────\")\n console.log(\"Results\")\n console.log(\"─────────────────────────────\")\n console.log(`Audio duration: ${audioDuration.toFixed(1)}s`)\n console.log(`Avg process time: ${(avgTime / 1000).toFixed(3)}s`)\n console.log(`Real-time factor: ${rtf.toFixed(4)}x`)\n console.log(`Speed: ${speedup.toFixed(0)}x real-time`)\n console.log(\"\")\n console.log(`→ 1 hour of audio in ~${(3600 / speedup).toFixed(0)} seconds`)\n console.log(\"─────────────────────────────\\n\")\n}\n\nasync function main(): Promise<void> {\n if (!command || command === \"--help\" || command === \"-h\") {\n printHelp()\n process.exit(0)\n }\n\n switch (command) {\n case \"download\": {\n const force = args.includes(\"--force\")\n console.log(\"Whisper CoreML Model Downloader\")\n console.log(\"===============================\\n\")\n\n try {\n await downloadModel({ force })\n } catch (error) {\n console.error(\"\\n✗ Download failed:\", error instanceof Error ? 
error.message : error)\n process.exit(1)\n }\n break\n }\n\n case \"benchmark\": {\n await runBenchmark()\n break\n }\n\n case \"status\": {\n const downloaded = isModelDownloaded()\n\n console.log(\"Whisper CoreML Status\")\n console.log(\"=====================\")\n console.log(`Model directory: ${getDefaultModelDir()}`)\n console.log(\"\")\n\n if (downloaded) {\n console.log(`✓ ${WHISPER_MODEL.name} (${WHISPER_MODEL.size}) - Ready`)\n } else {\n console.log(\"✗ Model not downloaded\")\n console.log(\"Run: npx whisper-coreml download\")\n }\n break\n }\n\n case \"path\": {\n console.log(getDefaultModelDir())\n break\n }\n\n default:\n console.error(`Unknown command: ${command}`)\n printHelp()\n process.exit(1)\n }\n}\n\nmain().catch((error: unknown) => {\n console.error(\"Fatal error:\", error)\n process.exit(1)\n})\n","/**\n * Model download functionality for whisper-coreml\n *\n * Note: We only support large-v3-turbo as it's the only Whisper model\n * that offers better quality than Parakeet while maintaining reasonable speed.\n */\n\nimport { existsSync, mkdirSync, writeFileSync, rmSync } from \"node:fs\"\nimport { homedir } from \"node:os\"\nimport { join, dirname } from \"node:path\"\n\n/**\n * Whisper large-v3-turbo model info\n * This is the only model we support as it offers the best speed/quality ratio\n * and is the main reason to choose Whisper over Parakeet.\n */\nexport const WHISPER_MODEL = {\n name: \"large-v3-turbo\",\n size: \"1.5 GB\",\n languages: \"99 languages\",\n url: \"https://huggingface.co/ggerganov/whisper.cpp/resolve/main/ggml-large-v3-turbo.bin\"\n} as const\n\n/**\n * Default model directory in user's cache\n */\nexport function getDefaultModelDir(): string {\n return join(homedir(), \".cache\", \"whisper-coreml\", \"models\")\n}\n\n/**\n * Get the path to the model\n */\nexport function getModelPath(modelDir?: string): string {\n const dir = modelDir ?? getDefaultModelDir()\n return join(dir, `ggml-${WHISPER_MODEL.name}.bin`)\n}\n\n/**\n * Check if the model is downloaded\n */\nexport function isModelDownloaded(modelDir?: string): boolean {\n const modelPath = getModelPath(modelDir)\n return existsSync(modelPath)\n}\n\ninterface DownloadProgress {\n downloadedBytes: number\n totalBytes: number\n percent: number\n}\n\nexport interface DownloadOptions {\n /** Target directory for model (default: ~/.cache/whisper-coreml/models) */\n modelDir?: string\n\n /** Progress callback */\n onProgress?: (progress: DownloadProgress) => void\n\n /** Force re-download even if model exists */\n force?: boolean\n}\n\n/* v8 ignore start - network I/O */\n\n/**\n * Download the Whisper large-v3-turbo model from Hugging Face\n */\nexport async function downloadModel(options: DownloadOptions = {}): Promise<string> {\n const modelDir = options.modelDir ?? getDefaultModelDir()\n const modelPath = getModelPath(modelDir)\n\n if (!options.force && existsSync(modelPath)) {\n return modelPath\n }\n\n // Clean up partial downloads\n if (existsSync(modelPath)) {\n rmSync(modelPath)\n }\n\n mkdirSync(dirname(modelPath), { recursive: true })\n\n console.log(`Downloading Whisper ${WHISPER_MODEL.name} (${WHISPER_MODEL.size})...`)\n console.log(`Source: ${WHISPER_MODEL.url}`)\n console.log(`Target: ${modelPath}`)\n\n const response = await fetch(WHISPER_MODEL.url)\n if (!response.ok) {\n throw new Error(`Failed to download model: ${response.statusText}`)\n }\n\n const contentLength = response.headers.get(\"content-length\")\n const totalBytes = contentLength ? 
parseInt(contentLength, 10) : 0\n\n const reader = response.body?.getReader()\n if (!reader) {\n throw new Error(\"Failed to get response body reader\")\n }\n\n const chunks: Uint8Array[] = []\n let downloadedBytes = 0\n\n // eslint-disable-next-line @typescript-eslint/no-unnecessary-condition\n while (true) {\n const result = await reader.read()\n if (result.done) {\n break\n }\n\n const chunk = result.value as Uint8Array\n chunks.push(chunk)\n downloadedBytes += chunk.length\n\n const percent = totalBytes > 0 ? Math.round((downloadedBytes / totalBytes) * 100) : 0\n\n if (options.onProgress) {\n options.onProgress({\n downloadedBytes,\n totalBytes,\n percent\n })\n }\n\n // Progress indicator\n process.stdout.write(\n `\\rProgress: ${String(percent)}% (${formatBytes(downloadedBytes)}/${formatBytes(totalBytes)})`\n )\n }\n\n // Combine chunks and write to file\n const buffer = Buffer.concat(chunks)\n writeFileSync(modelPath, buffer)\n\n console.log(\"\\n✓ Model downloaded successfully!\")\n return modelPath\n}\n\n/* v8 ignore stop */\n\n/**\n * Format bytes to human readable string\n * @internal Exported for testing\n */\nexport function formatBytes(bytes: number): string {\n if (bytes < 1024) {\n return `${String(bytes)} B`\n }\n if (bytes < 1024 * 1024) {\n return `${(bytes / 1024).toFixed(1)} KB`\n }\n if (bytes < 1024 * 1024 * 1024) {\n return `${(bytes / 1024 / 1024).toFixed(1)} MB`\n }\n return `${(bytes / 1024 / 1024 / 1024).toFixed(2)} GB`\n}\n","/**\n * whisper-coreml\n *\n * OpenAI Whisper ASR for Node.js with CoreML/ANE acceleration on Apple Silicon.\n * Based on whisper.cpp with Apple Neural Engine support.\n *\n * Uses the large-v3-turbo model exclusively, as it offers the best speed/quality\n * ratio and is the main reason to choose Whisper over Parakeet.\n */\n\n// Dynamic require for loading native addon (works in both ESM and CJS)\n// eslint-disable-next-line @typescript-eslint/no-require-imports\nconst bindingsModule = require(\"bindings\") as (name: string) => unknown\n\n/**\n * Native addon interface\n */\ninterface NativeAddon {\n initialize(options: {\n modelPath: string\n language?: string\n translate?: boolean\n threads?: number\n }): boolean\n isInitialized(): boolean\n transcribe(samples: Float32Array, sampleRate: number): NativeTranscriptionResult\n cleanup(): void\n getVersion(): { addon: string; whisper: string; coreml: string }\n}\n\ninterface NativeTranscriptionResult {\n text: string\n language: string\n durationMs: number\n segments: {\n startMs: number\n endMs: number\n text: string\n confidence: number\n }[]\n}\n\n/* v8 ignore start - platform checks and native addon loading */\n\n/**\n * Load the native addon\n */\nfunction loadAddon(): NativeAddon {\n if (process.platform !== \"darwin\") {\n throw new Error(\"whisper-coreml is only supported on macOS\")\n }\n\n try {\n return bindingsModule(\"whisper_asr\") as NativeAddon\n } catch (error) {\n const message = error instanceof Error ? error.message : String(error)\n throw new Error(`Failed to load Whisper ASR native addon: ${message}`)\n }\n}\n\n/* v8 ignore stop */\n\nlet addon: NativeAddon | null = null\nlet loadError: Error | null = null\n\nfunction getAddon(): NativeAddon {\n if (!addon) {\n try {\n addon = loadAddon()\n } catch (error) {\n loadError = error instanceof Error ? 
error : new Error(String(error))\n throw error\n }\n }\n return addon\n}\n\n/**\n * Check if Whisper ASR is available on this platform\n */\nexport function isAvailable(): boolean {\n return process.platform === \"darwin\" && process.arch === \"arm64\"\n}\n\n/**\n * Get the load error if the addon failed to load\n */\nexport function getLoadError(): Error | null {\n return loadError\n}\n\n/**\n * Transcription segment with timestamps\n */\nexport interface TranscriptionSegment {\n /** Start time in milliseconds */\n startMs: number\n /** End time in milliseconds */\n endMs: number\n /** Transcribed text for this segment */\n text: string\n /** Confidence score (0-1) */\n confidence: number\n}\n\n/**\n * Transcription result\n */\nexport interface TranscriptionResult {\n /** Full transcribed text */\n text: string\n /** Detected or specified language (ISO code) */\n language: string\n /** Processing time in milliseconds */\n durationMs: number\n /** Individual segments with timestamps */\n segments: TranscriptionSegment[]\n}\n\n/**\n * Whisper ASR engine options\n */\nexport interface WhisperAsrOptions {\n /** Path to the Whisper model file (ggml format) */\n modelPath: string\n /** Language code (e.g., \"en\", \"de\", \"fr\") or \"auto\" for auto-detection */\n language?: string\n /** Translate to English (default: false) */\n translate?: boolean\n /** Number of threads (0 = auto) */\n threads?: number\n}\n\n/**\n * Whisper ASR Engine with CoreML acceleration\n *\n * Uses the large-v3-turbo model for best speed/quality balance.\n *\n * @example\n * ```typescript\n * import { WhisperAsrEngine, getModelPath } from \"whisper-coreml\"\n *\n * const engine = new WhisperAsrEngine({\n * modelPath: getModelPath()\n * })\n *\n * await engine.initialize()\n * const result = await engine.transcribe(audioSamples, 16000)\n * console.log(result.text)\n * ```\n */\nexport class WhisperAsrEngine {\n private options: WhisperAsrOptions\n private initialized = false\n\n constructor(options: WhisperAsrOptions) {\n this.options = options\n }\n\n /* v8 ignore start - native addon calls, tested via E2E */\n\n /**\n * Initialize the Whisper engine\n * This loads the model into memory - may take a few seconds.\n */\n initialize(): Promise<void> {\n if (this.initialized) {\n return Promise.resolve()\n }\n\n const nativeAddon = getAddon()\n const success = nativeAddon.initialize({\n modelPath: this.options.modelPath,\n language: this.options.language ?? \"auto\",\n translate: this.options.translate ?? false,\n threads: this.options.threads ?? 0\n })\n\n if (!success) {\n return Promise.reject(new Error(\"Failed to initialize Whisper engine\"))\n }\n\n this.initialized = true\n return Promise.resolve()\n }\n\n /**\n * Check if the engine is ready for transcription\n */\n isReady(): boolean {\n if (!this.initialized) {\n return false\n }\n try {\n return getAddon().isInitialized()\n } catch {\n return false\n }\n }\n\n /**\n * Transcribe audio samples\n *\n * @param samples - Float32Array of audio samples (mono, 16kHz)\n * @param sampleRate - Sample rate in Hz (default: 16000)\n * @returns Transcription result with text and segments\n */\n transcribe(samples: Float32Array, sampleRate = 16000): Promise<TranscriptionResult> {\n if (!this.initialized) {\n return Promise.reject(new Error(\"Whisper engine not initialized. 
Call initialize() first.\"))\n }\n\n const result = getAddon().transcribe(samples, sampleRate)\n\n return Promise.resolve({\n text: result.text,\n language: result.language,\n durationMs: result.durationMs,\n segments: result.segments\n })\n }\n\n /**\n * Clean up resources and unload the model\n */\n cleanup(): void {\n if (this.initialized) {\n try {\n getAddon().cleanup()\n } catch {\n // Ignore cleanup errors\n }\n this.initialized = false\n }\n }\n\n /**\n * Get version information\n */\n getVersion(): { addon: string; whisper: string; coreml: string } {\n return getAddon().getVersion()\n }\n\n /* v8 ignore stop */\n}\n\n// Re-export download utilities\nexport {\n downloadModel,\n formatBytes,\n getDefaultModelDir,\n getModelPath,\n isModelDownloaded,\n WHISPER_MODEL,\n type DownloadOptions\n} from \"./download.js\"\n"],"mappings":";;;;AAKA,IAAM,mBAAmB,MACvB,OAAO,aAAa,cAChB,IAAI,IAAI,QAAQ,UAAU,EAAE,EAAE,OAC7B,SAAS,iBAAiB,SAAS,cAAc,QAAQ,YAAY,MAAM,WAC1E,SAAS,cAAc,MACvB,IAAI,IAAI,WAAW,SAAS,OAAO,EAAE;AAEtC,IAAM,gBAAgC,iCAAiB;;;ACP9D,IAAAA,kBAA2B;AAC3B,IAAAC,oBAA8B;AAC9B,sBAA8B;AAC9B,gCAAyB;;;ACDzB,qBAA6D;AAC7D,qBAAwB;AACxB,uBAA8B;AAOvB,IAAM,gBAAgB;AAAA,EAC3B,MAAM;AAAA,EACN,MAAM;AAAA,EACN,WAAW;AAAA,EACX,KAAK;AACP;AAKO,SAAS,qBAA6B;AAC3C,aAAO,2BAAK,wBAAQ,GAAG,UAAU,kBAAkB,QAAQ;AAC7D;AAKO,SAAS,aAAa,UAA2B;AACtD,QAAM,MAAM,YAAY,mBAAmB;AAC3C,aAAO,uBAAK,KAAK,QAAQ,cAAc,IAAI,MAAM;AACnD;AAKO,SAAS,kBAAkB,UAA4B;AAC5D,QAAM,YAAY,aAAa,QAAQ;AACvC,aAAO,2BAAW,SAAS;AAC7B;AAwBA,eAAsB,cAAc,UAA2B,CAAC,GAAoB;AAClF,QAAM,WAAW,QAAQ,YAAY,mBAAmB;AACxD,QAAM,YAAY,aAAa,QAAQ;AAEvC,MAAI,CAAC,QAAQ,aAAS,2BAAW,SAAS,GAAG;AAC3C,WAAO;AAAA,EACT;AAGA,UAAI,2BAAW,SAAS,GAAG;AACzB,+BAAO,SAAS;AAAA,EAClB;AAEA,oCAAU,0BAAQ,SAAS,GAAG,EAAE,WAAW,KAAK,CAAC;AAEjD,UAAQ,IAAI,uBAAuB,cAAc,IAAI,KAAK,cAAc,IAAI,MAAM;AAClF,UAAQ,IAAI,WAAW,cAAc,GAAG,EAAE;AAC1C,UAAQ,IAAI,WAAW,SAAS,EAAE;AAElC,QAAM,WAAW,MAAM,MAAM,cAAc,GAAG;AAC9C,MAAI,CAAC,SAAS,IAAI;AAChB,UAAM,IAAI,MAAM,6BAA6B,SAAS,UAAU,EAAE;AAAA,EACpE;AAEA,QAAM,gBAAgB,SAAS,QAAQ,IAAI,gBAAgB;AAC3D,QAAM,aAAa,gBAAgB,SAAS,eAAe,EAAE,IAAI;AAEjE,QAAM,SAAS,SAAS,MAAM,UAAU;AACxC,MAAI,CAAC,QAAQ;AACX,UAAM,IAAI,MAAM,oCAAoC;AAAA,EACtD;AAEA,QAAM,SAAuB,CAAC;AAC9B,MAAI,kBAAkB;AAGtB,SAAO,MAAM;AACX,UAAM,SAAS,MAAM,OAAO,KAAK;AACjC,QAAI,OAAO,MAAM;AACf;AAAA,IACF;AAEA,UAAM,QAAQ,OAAO;AACrB,WAAO,KAAK,KAAK;AACjB,uBAAmB,MAAM;AAEzB,UAAM,UAAU,aAAa,IAAI,KAAK,MAAO,kBAAkB,aAAc,GAAG,IAAI;AAEpF,QAAI,QAAQ,YAAY;AACtB,cAAQ,WAAW;AAAA,QACjB;AAAA,QACA;AAAA,QACA;AAAA,MACF,CAAC;AAAA,IACH;AAGA,YAAQ,OAAO;AAAA,MACb,eAAe,OAAO,OAAO,CAAC,MAAM,YAAY,eAAe,CAAC,IAAI,YAAY,UAAU,CAAC;AAAA,IAC7F;AAAA,EACF;AAGA,QAAM,SAAS,OAAO,OAAO,MAAM;AACnC,oCAAc,WAAW,MAAM;AAE/B,UAAQ,IAAI,yCAAoC;AAChD,SAAO;AACT;AAQO,SAAS,YAAY,OAAuB;AACjD,MAAI,QAAQ,MAAM;AAChB,WAAO,GAAG,OAAO,KAAK,CAAC;AAAA,EACzB;AACA,MAAI,QAAQ,OAAO,MAAM;AACvB,WAAO,IAAI,QAAQ,MAAM,QAAQ,CAAC,CAAC;AAAA,EACrC;AACA,MAAI,QAAQ,OAAO,OAAO,MAAM;AAC9B,WAAO,IAAI,QAAQ,OAAO,MAAM,QAAQ,CAAC,CAAC;AAAA,EAC5C;AACA,SAAO,IAAI,QAAQ,OAAO,OAAO,MAAM,QAAQ,CAAC,CAAC;AACnD;;;AC/IA,IAAM,iBAAiB,QAAQ,UAAU;AAmCzC,SAAS,YAAyB;AAChC,MAAI,QAAQ,aAAa,UAAU;AACjC,UAAM,IAAI,MAAM,2CAA2C;AAAA,EAC7D;AAEA,MAAI;AACF,WAAO,eAAe,aAAa;AAAA,EACrC,SAAS,OAAO;AACd,UAAM,UAAU,iBAAiB,QAAQ,MAAM,UAAU,OAAO,KAAK;AACrE,UAAM,IAAI,MAAM,4CAA4C,OAAO,EAAE;AAAA,EACvE;AACF;AAIA,IAAI,QAA4B;AAChC,IAAI,YAA0B;AAE9B,SAAS,WAAwB;AAC/B,MAAI,CAAC,OAAO;AACV,QAAI;AACF,cAAQ,UAAU;AAAA,IACpB,SAAS,OAAO;AACd,kBAAY,iBAAiB,QAAQ,QAAQ,IAAI,MAAM,OAAO,KAAK,CAAC;AACpE,YAAM;AAAA,IACR;AAAA,EACF;AACA,SAAO;AACT;AAKO,SAAS,cAAuB;AACrC,SAAO,QAAQ,aAAa,YAAY,QAAQ,SAAS;AAC3D;AAqEO,IAAM,mBAAN,MAAuB;AAAA,EACpB;AAAA,EACA,
cAAc;AAAA,EAEtB,YAAY,SAA4B;AACtC,SAAK,UAAU;AAAA,EACjB;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,EAQA,aAA4B;AAC1B,QAAI,KAAK,aAAa;AACpB,aAAO,QAAQ,QAAQ;AAAA,IACzB;AAEA,UAAM,cAAc,SAAS;AAC7B,UAAM,UAAU,YAAY,WAAW;AAAA,MACrC,WAAW,KAAK,QAAQ;AAAA,MACxB,UAAU,KAAK,QAAQ,YAAY;AAAA,MACnC,WAAW,KAAK,QAAQ,aAAa;AAAA,MACrC,SAAS,KAAK,QAAQ,WAAW;AAAA,IACnC,CAAC;AAED,QAAI,CAAC,SAAS;AACZ,aAAO,QAAQ,OAAO,IAAI,MAAM,qCAAqC,CAAC;AAAA,IACxE;AAEA,SAAK,cAAc;AACnB,WAAO,QAAQ,QAAQ;AAAA,EACzB;AAAA;AAAA;AAAA;AAAA,EAKA,UAAmB;AACjB,QAAI,CAAC,KAAK,aAAa;AACrB,aAAO;AAAA,IACT;AACA,QAAI;AACF,aAAO,SAAS,EAAE,cAAc;AAAA,IAClC,QAAQ;AACN,aAAO;AAAA,IACT;AAAA,EACF;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,EASA,WAAW,SAAuB,aAAa,MAAqC;AAClF,QAAI,CAAC,KAAK,aAAa;AACrB,aAAO,QAAQ,OAAO,IAAI,MAAM,0DAA0D,CAAC;AAAA,IAC7F;AAEA,UAAM,SAAS,SAAS,EAAE,WAAW,SAAS,UAAU;AAExD,WAAO,QAAQ,QAAQ;AAAA,MACrB,MAAM,OAAO;AAAA,MACb,UAAU,OAAO;AAAA,MACjB,YAAY,OAAO;AAAA,MACnB,UAAU,OAAO;AAAA,IACnB,CAAC;AAAA,EACH;AAAA;AAAA;AAAA;AAAA,EAKA,UAAgB;AACd,QAAI,KAAK,aAAa;AACpB,UAAI;AACF,iBAAS,EAAE,QAAQ;AAAA,MACrB,QAAQ;AAAA,MAER;AACA,WAAK,cAAc;AAAA,IACrB;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKA,aAAiE;AAC/D,WAAO,SAAS,EAAE,WAAW;AAAA,EAC/B;AAAA;AAGF;;;AFhOA,IAAM,OAAO,QAAQ,KAAK,MAAM,CAAC;AACjC,IAAM,UAAU,KAAK,CAAC;AAEtB,SAAS,YAAkB;AACzB,UAAQ,IAAI;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,CAeb;AACD;AAKA,SAAS,UAAU,MAA4B;AAC7C,MAAI;AAEF,UAAM,gBAAY,oCAAS,cAAc,IAAI,kDAAkD;AAAA,MAC7F,UAAU;AAAA,MACV,OAAO,CAAC,QAAQ,QAAQ,MAAM;AAAA,MAC9B,WAAW,KAAK,OAAO;AAAA,IACzB,CAAC;AACD,UAAM,QAAQ,IAAI,WAAW,UAAU,QAAQ,UAAU,YAAY,UAAU,SAAS,CAAC;AACzF,UAAM,UAAU,IAAI,aAAa,MAAM,MAAM;AAC7C,aAAS,IAAI,GAAG,IAAI,MAAM,QAAQ,KAAK;AACrC,cAAQ,CAAC,KAAK,MAAM,CAAC,KAAK,KAAK;AAAA,IACjC;AACA,WAAO;AAAA,EACT,QAAQ;AACN,YAAQ,MAAM,qEAAqE;AACnF,YAAQ,KAAK,CAAC;AAAA,EAChB;AACF;AAKA,SAAS,cAAsB;AAC7B,MAAI;AACF,UAAM,aAAS,oCAAS,sCAAsC,EAAE,UAAU,QAAQ,CAAC;AACnF,WAAO,OAAO,KAAK;AAAA,EACrB,QAAQ;AACN,WAAO;AAAA,EACT;AACF;AAEA,eAAe,eAA8B;AAC3C,MAAI,CAAC,YAAY,GAAG;AAClB,YAAQ,MAAM,6CAA6C;AAC3D,YAAQ,KAAK,CAAC;AAAA,EAChB;AAEA,MAAI,CAAC,kBAAkB,GAAG;AACxB,YAAQ,MAAM,wDAAwD;AACtE,YAAQ,KAAK,CAAC;AAAA,EAChB;AAGA,QAAM,gBAAY,+BAAQ,+BAAc,aAAe,CAAC;AACxD,QAAM,gBAAgB;AAAA,QACpB,wBAAK,WAAW,4BAA4B;AAAA,QAC5C,wBAAK,WAAW,+BAA+B;AAAA,QAC/C,wBAAK,QAAQ,IAAI,GAAG,yBAAyB;AAAA,EAC/C;AAEA,QAAM,YAAY,cAAc,KAAK,CAAC,UAAM,4BAAW,CAAC,CAAC;AAEzD,MAAI,CAAC,WAAW;AACd,YAAQ,MAAM,oEAAoE;AAClF,YAAQ,MAAM,kEAAkE;AAChF,YAAQ,MAAM,uDAAuD;AACrE,YAAQ,KAAK,CAAC;AAAA,EAChB;AAEA,UAAQ,IAAI,0BAA0B;AACtC,UAAQ,IAAI,4BAA4B;AAExC,QAAM,OAAO,YAAY;AACzB,UAAQ,IAAI,SAAS,IAAI,EAAE;AAC3B,UAAQ,IAAI,UAAU,cAAc,IAAI,EAAE;AAC1C,UAAQ,IAAI,SAAS,QAAQ,OAAO;AAAA,CAAI;AAGxC,UAAQ,IAAI,kBAAkB;AAC9B,QAAM,UAAU,UAAU,SAAS;AACnC,QAAM,gBAAgB,QAAQ,SAAS;AAEvC,UAAQ,IAAI,UAAU,cAAc,QAAQ,CAAC,CAAC,MAAM,QAAQ,OAAO,eAAe,CAAC;AAAA,CAAa;AAGhG,UAAQ,IAAI,wBAAwB;AACpC,QAAM,YAAY,aAAa;AAC/B,QAAM,SAAS,IAAI,iBAAiB,EAAE,UAAU,CAAC;AAEjD,QAAM,YAAY,YAAY,IAAI;AAClC,QAAM,OAAO,WAAW;AACxB,QAAM,WAAW,YAAY,IAAI,IAAI;AAErC,UAAQ,IAAI,eAAe,WAAW,KAAM,QAAQ,CAAC,CAAC;AAAA,CAAK;AAG3D,UAAQ,IAAI,gBAAgB;AAC5B,QAAM,OAAO,WAAW,QAAQ,MAAM,GAAG,OAAQ,CAAC,GAAG,IAAK;AAG1D,QAAM,OAAO;AACb,UAAQ,IAAI;AAAA,aAAgB,OAAO,IAAI,CAAC;AAAA,CAAa;AAErD,QAAM,QAAkB,CAAC;AAEzB,WAAS,IAAI,GAAG,IAAI,MAAM,KAAK;AAC7B,UAAM,SAAS,MAAM,OAAO,WAAW,SAAS,IAAK;AACrD,UAAM,KAAK,OAAO,UAAU;AAC5B,YAAQ,IAAI,SAAS,OAAO,IAAI,CAAC,CAAC,MAAM,OAAO,aAAa,KAAM,QAAQ,CAAC,CAAC,GAAG;AAAA,EACjF;AAEA,SAAO,QAAQ;AAGf,QAAM,UAAU,MAAM,OAAO,CAAC,GAAG,MAAM,IAAI,GAAG,CAAC,IAAI,MAAM;AACzD,QAAM,MAAM,UAAU,MAAO;AAC7B,QAAM,UAAU,IAAI;AAEpB,UAAQ,IAAI,kLAAiC;AAC7C,UAAQ,IAAI,SAAS;AACrB,UAAQ,IAAI,gLAA+B;AAC3C,UAAQ,IAAI,sBAAsB,cAAc,QAA
Q,CAAC,CAAC,GAAG;AAC7D,UAAQ,IAAI,uBAAuB,UAAU,KAAM,QAAQ,CAAC,CAAC,GAAG;AAChE,UAAQ,IAAI,sBAAsB,IAAI,QAAQ,CAAC,CAAC,GAAG;AACnD,UAAQ,IAAI,sBAAsB,QAAQ,QAAQ,CAAC,CAAC,aAAa;AACjE,UAAQ,IAAI,EAAE;AACd,UAAQ,IAAI,+BAA0B,OAAO,SAAS,QAAQ,CAAC,CAAC,UAAU;AAC1E,UAAQ,IAAI,kLAAiC;AAC/C;AAEA,eAAe,OAAsB;AACnC,MAAI,CAAC,WAAW,YAAY,YAAY,YAAY,MAAM;AACxD,cAAU;AACV,YAAQ,KAAK,CAAC;AAAA,EAChB;AAEA,UAAQ,SAAS;AAAA,IACf,KAAK,YAAY;AACf,YAAM,QAAQ,KAAK,SAAS,SAAS;AACrC,cAAQ,IAAI,iCAAiC;AAC7C,cAAQ,IAAI,mCAAmC;AAE/C,UAAI;AACF,cAAM,cAAc,EAAE,MAAM,CAAC;AAAA,MAC/B,SAAS,OAAO;AACd,gBAAQ,MAAM,6BAAwB,iBAAiB,QAAQ,MAAM,UAAU,KAAK;AACpF,gBAAQ,KAAK,CAAC;AAAA,MAChB;AACA;AAAA,IACF;AAAA,IAEA,KAAK,aAAa;AAChB,YAAM,aAAa;AACnB;AAAA,IACF;AAAA,IAEA,KAAK,UAAU;AACb,YAAM,aAAa,kBAAkB;AAErC,cAAQ,IAAI,uBAAuB;AACnC,cAAQ,IAAI,uBAAuB;AACnC,cAAQ,IAAI,oBAAoB,mBAAmB,CAAC,EAAE;AACtD,cAAQ,IAAI,EAAE;AAEd,UAAI,YAAY;AACd,gBAAQ,IAAI,UAAK,cAAc,IAAI,KAAK,cAAc,IAAI,WAAW;AAAA,MACvE,OAAO;AACL,gBAAQ,IAAI,6BAAwB;AACpC,gBAAQ,IAAI,kCAAkC;AAAA,MAChD;AACA;AAAA,IACF;AAAA,IAEA,KAAK,QAAQ;AACX,cAAQ,IAAI,mBAAmB,CAAC;AAChC;AAAA,IACF;AAAA,IAEA;AACE,cAAQ,MAAM,oBAAoB,OAAO,EAAE;AAC3C,gBAAU;AACV,cAAQ,KAAK,CAAC;AAAA,EAClB;AACF;AAEA,KAAK,EAAE,MAAM,CAAC,UAAmB;AAC/B,UAAQ,MAAM,gBAAgB,KAAK;AACnC,UAAQ,KAAK,CAAC;AAChB,CAAC;","names":["import_node_fs","import_node_path"]}
1
+ {"version":3,"sources":["../node_modules/.pnpm/tsup@8.5.1_jiti@2.6.1_postcss@8.5.6_typescript@5.9.3_yaml@2.8.2/node_modules/tsup/assets/cjs_shims.js","../src/cli.ts","../src/download.ts","../src/index.ts"],"sourcesContent":["// Shim globals in cjs bundle\n// There's a weird bug that esbuild will always inject importMetaUrl\n// if we export it as `const importMetaUrl = ... __filename ...`\n// But using a function will not cause this issue\n\nconst getImportMetaUrl = () => \n typeof document === \"undefined\" \n ? new URL(`file:${__filename}`).href \n : (document.currentScript && document.currentScript.tagName.toUpperCase() === 'SCRIPT') \n ? document.currentScript.src \n : new URL(\"main.js\", document.baseURI).href;\n\nexport const importMetaUrl = /* @__PURE__ */ getImportMetaUrl()\n","#!/usr/bin/env node\n/**\n * CLI for whisper-coreml\n */\n\nimport { existsSync } from \"node:fs\"\nimport { join, dirname } from \"node:path\"\nimport { fileURLToPath } from \"node:url\"\nimport { execSync } from \"node:child_process\"\n\nimport {\n downloadModel,\n getDefaultModelDir,\n getModelPath,\n isModelDownloaded,\n WHISPER_MODEL\n} from \"./download.js\"\n\nimport { WhisperAsrEngine, isAvailable } from \"./index.js\"\n\nconst args = process.argv.slice(2)\nconst command = args[0]\n\nfunction printHelp(): void {\n console.log(`\nwhisper-coreml CLI\n\nCommands:\n download [--force] Download the Whisper model (~1.5GB)\n benchmark Run performance benchmark\n status Check if model is downloaded\n path Print model directory path\n\nThe large-v3-turbo model offers the best speed/quality ratio\nand is the reason to choose Whisper over Parakeet.\n\nOptions:\n --force Force re-download even if model exists\n --help, -h Show this help message\n`)\n}\n\n/**\n * Load audio file as Float32Array using ffmpeg\n */\nfunction loadAudio(path: string): Float32Array {\n try {\n // Convert to raw PCM using ffmpeg\n const pcmBuffer = execSync(`ffmpeg -i \"${path}\" -ar 16000 -ac 1 -f s16le -acodec pcm_s16le -`, {\n encoding: \"buffer\",\n stdio: [\"pipe\", \"pipe\", \"pipe\"],\n maxBuffer: 50 * 1024 * 1024\n })\n const pcm16 = new Int16Array(pcmBuffer.buffer, pcmBuffer.byteOffset, pcmBuffer.length / 2)\n const samples = new Float32Array(pcm16.length)\n for (let i = 0; i < pcm16.length; i++) {\n samples[i] = (pcm16[i] ?? 0) / 32768.0\n }\n return samples\n } catch {\n console.error(\"ffmpeg is required for benchmark. Install with: brew install ffmpeg\")\n process.exit(1)\n }\n}\n\n/**\n * Get chip name from system\n */\nfunction getChipName(): string {\n try {\n const output = execSync(\"sysctl -n machdep.cpu.brand_string\", { encoding: \"utf-8\" })\n return output.trim()\n } catch {\n return \"Unknown\"\n }\n}\n\nasync function runBenchmark(): Promise<void> {\n if (!isAvailable()) {\n console.error(\"Benchmark requires macOS with Apple Silicon\")\n process.exit(1)\n }\n\n if (!isModelDownloaded()) {\n console.error(\"Model not downloaded. Run: npx whisper-coreml download\")\n process.exit(1)\n }\n\n // Find the benchmark audio file\n const __dirname = dirname(fileURLToPath(import.meta.url))\n const possiblePaths = [\n join(__dirname, \"../test/fixtures/brian.ogg\"),\n join(__dirname, \"../../test/fixtures/brian.ogg\"),\n join(process.cwd(), \"test/fixtures/brian.ogg\")\n ]\n\n const audioPath = possiblePaths.find((p) => existsSync(p))\n\n if (!audioPath) {\n console.error(\"Benchmark audio not found. 
Clone the repository to run benchmarks:\")\n console.error(\" git clone https://github.com/sebastian-software/whisper-coreml\")\n console.error(\" cd whisper-coreml && pnpm install && pnpm benchmark\")\n process.exit(1)\n }\n\n console.log(\"Whisper CoreML Benchmark\")\n console.log(\"========================\\n\")\n\n const chip = getChipName()\n console.log(`Chip: ${chip}`)\n console.log(`Model: ${WHISPER_MODEL.name}`)\n console.log(`Node: ${process.version}\\n`)\n\n // Load audio\n console.log(\"Loading audio...\")\n const samples = loadAudio(audioPath)\n const audioDuration = samples.length / 16000\n\n console.log(`Audio: ${audioDuration.toFixed(1)}s (${samples.length.toLocaleString()} samples)\\n`)\n\n // Initialize engine\n console.log(\"Initializing engine...\")\n const modelPath = getModelPath()\n const engine = new WhisperAsrEngine({ modelPath })\n\n const initStart = performance.now()\n await engine.initialize()\n const initTime = performance.now() - initStart\n\n console.log(`Init time: ${(initTime / 1000).toFixed(2)}s\\n`)\n\n // Warm-up run\n console.log(\"Warm-up run...\")\n await engine.transcribe(samples.slice(0, 16000 * 5), 16000) // 5 seconds\n\n // Benchmark runs\n const runs = 3\n console.log(`\\nBenchmark (${String(runs)} runs)...\\n`)\n\n const times: number[] = []\n\n for (let i = 0; i < runs; i++) {\n const result = await engine.transcribe(samples, 16000)\n times.push(result.durationMs)\n console.log(` Run ${String(i + 1)}: ${(result.durationMs / 1000).toFixed(3)}s`)\n }\n\n engine.cleanup()\n\n // Calculate stats\n const avgTime = times.reduce((a, b) => a + b, 0) / times.length\n const rtf = avgTime / 1000 / audioDuration\n const speedup = 1 / rtf\n\n console.log(\"\\n─────────────────────────────\")\n console.log(\"Results\")\n console.log(\"─────────────────────────────\")\n console.log(`Audio duration: ${audioDuration.toFixed(1)}s`)\n console.log(`Avg process time: ${(avgTime / 1000).toFixed(3)}s`)\n console.log(`Real-time factor: ${rtf.toFixed(4)}x`)\n console.log(`Speed: ${speedup.toFixed(0)}x real-time`)\n console.log(\"\")\n console.log(`→ 1 hour of audio in ~${(3600 / speedup).toFixed(0)} seconds`)\n console.log(\"─────────────────────────────\\n\")\n}\n\nasync function main(): Promise<void> {\n if (!command || command === \"--help\" || command === \"-h\") {\n printHelp()\n process.exit(0)\n }\n\n switch (command) {\n case \"download\": {\n const force = args.includes(\"--force\")\n console.log(\"Whisper CoreML Model Downloader\")\n console.log(\"===============================\\n\")\n\n try {\n await downloadModel({ force })\n } catch (error) {\n console.error(\"\\n✗ Download failed:\", error instanceof Error ? 
error.message : error)\n process.exit(1)\n }\n break\n }\n\n case \"benchmark\": {\n await runBenchmark()\n break\n }\n\n case \"status\": {\n const downloaded = isModelDownloaded()\n\n console.log(\"Whisper CoreML Status\")\n console.log(\"=====================\")\n console.log(`Model directory: ${getDefaultModelDir()}`)\n console.log(\"\")\n\n if (downloaded) {\n console.log(`✓ ${WHISPER_MODEL.name} (${WHISPER_MODEL.size}) - Ready`)\n } else {\n console.log(\"✗ Model not downloaded\")\n console.log(\"Run: npx whisper-coreml download\")\n }\n break\n }\n\n case \"path\": {\n console.log(getDefaultModelDir())\n break\n }\n\n default:\n console.error(`Unknown command: ${command}`)\n printHelp()\n process.exit(1)\n }\n}\n\nmain().catch((error: unknown) => {\n console.error(\"Fatal error:\", error)\n process.exit(1)\n})\n","/**\n * Model download functionality for whisper-coreml\n *\n * Note: We only support large-v3-turbo as it's the only Whisper model\n * that offers better quality than Parakeet while maintaining reasonable speed.\n */\n\nimport { existsSync, mkdirSync, writeFileSync, rmSync } from \"node:fs\"\nimport { homedir } from \"node:os\"\nimport { join, dirname } from \"node:path\"\n\n/**\n * Whisper large-v3-turbo model info\n * This is the only model we support as it offers the best speed/quality ratio\n * and is the main reason to choose Whisper over Parakeet.\n */\nexport const WHISPER_MODEL = {\n name: \"large-v3-turbo\",\n size: \"1.5 GB\",\n languages: \"99 languages\",\n url: \"https://huggingface.co/ggerganov/whisper.cpp/resolve/main/ggml-large-v3-turbo.bin\"\n} as const\n\n/**\n * Default model directory in user's cache\n */\nexport function getDefaultModelDir(): string {\n return join(homedir(), \".cache\", \"whisper-coreml\", \"models\")\n}\n\n/**\n * Get the path to the model\n */\nexport function getModelPath(modelDir?: string): string {\n const dir = modelDir ?? getDefaultModelDir()\n return join(dir, `ggml-${WHISPER_MODEL.name}.bin`)\n}\n\n/**\n * Check if the model is downloaded\n */\nexport function isModelDownloaded(modelDir?: string): boolean {\n const modelPath = getModelPath(modelDir)\n return existsSync(modelPath)\n}\n\ninterface DownloadProgress {\n downloadedBytes: number\n totalBytes: number\n percent: number\n}\n\nexport interface DownloadOptions {\n /** Target directory for model (default: ~/.cache/whisper-coreml/models) */\n modelDir?: string\n\n /** Progress callback */\n onProgress?: (progress: DownloadProgress) => void\n\n /** Force re-download even if model exists */\n force?: boolean\n}\n\n/* v8 ignore start - network I/O */\n\n/**\n * Download the Whisper large-v3-turbo model from Hugging Face\n */\nexport async function downloadModel(options: DownloadOptions = {}): Promise<string> {\n const modelDir = options.modelDir ?? getDefaultModelDir()\n const modelPath = getModelPath(modelDir)\n\n if (!options.force && existsSync(modelPath)) {\n return modelPath\n }\n\n // Clean up partial downloads\n if (existsSync(modelPath)) {\n rmSync(modelPath)\n }\n\n mkdirSync(dirname(modelPath), { recursive: true })\n\n console.log(`Downloading Whisper ${WHISPER_MODEL.name} (${WHISPER_MODEL.size})...`)\n console.log(`Source: ${WHISPER_MODEL.url}`)\n console.log(`Target: ${modelPath}`)\n\n const response = await fetch(WHISPER_MODEL.url)\n if (!response.ok) {\n throw new Error(`Failed to download model: ${response.statusText}`)\n }\n\n const contentLength = response.headers.get(\"content-length\")\n const totalBytes = contentLength ? 
parseInt(contentLength, 10) : 0\n\n const reader = response.body?.getReader()\n if (!reader) {\n throw new Error(\"Failed to get response body reader\")\n }\n\n const chunks: Uint8Array[] = []\n let downloadedBytes = 0\n\n // eslint-disable-next-line @typescript-eslint/no-unnecessary-condition\n while (true) {\n const result = await reader.read()\n if (result.done) {\n break\n }\n\n const chunk = result.value as Uint8Array\n chunks.push(chunk)\n downloadedBytes += chunk.length\n\n const percent = totalBytes > 0 ? Math.round((downloadedBytes / totalBytes) * 100) : 0\n\n if (options.onProgress) {\n options.onProgress({\n downloadedBytes,\n totalBytes,\n percent\n })\n }\n\n // Progress indicator\n process.stdout.write(\n `\\rProgress: ${String(percent)}% (${formatBytes(downloadedBytes)}/${formatBytes(totalBytes)})`\n )\n }\n\n // Combine chunks and write to file\n const buffer = Buffer.concat(chunks)\n writeFileSync(modelPath, buffer)\n\n console.log(\"\\n✓ Model downloaded successfully!\")\n return modelPath\n}\n\n/* v8 ignore stop */\n\n/**\n * Format bytes to human readable string\n * @internal Exported for testing\n */\nexport function formatBytes(bytes: number): string {\n if (bytes < 1024) {\n return `${String(bytes)} B`\n }\n if (bytes < 1024 * 1024) {\n return `${(bytes / 1024).toFixed(1)} KB`\n }\n if (bytes < 1024 * 1024 * 1024) {\n return `${(bytes / 1024 / 1024).toFixed(1)} MB`\n }\n return `${(bytes / 1024 / 1024 / 1024).toFixed(2)} GB`\n}\n","/**\n * whisper-coreml\n *\n * OpenAI Whisper ASR for Node.js with CoreML/ANE acceleration on Apple Silicon.\n * Based on whisper.cpp with Apple Neural Engine support.\n *\n * Uses the large-v3-turbo model exclusively, as it offers the best speed/quality\n * ratio and is the main reason to choose Whisper over Parakeet.\n */\n\n// Dynamic require for loading native addon (works in both ESM and CJS)\n// eslint-disable-next-line @typescript-eslint/no-require-imports\nconst bindingsModule = require(\"bindings\") as (name: string) => unknown\n\n/**\n * Native addon interface\n */\ninterface NativeAddon {\n initialize(options: { modelPath: string; language?: string; threads?: number }): boolean\n isInitialized(): boolean\n transcribe(samples: Float32Array, sampleRate: number): NativeTranscriptionResult\n cleanup(): void\n getVersion(): { addon: string; whisper: string; coreml: string }\n}\n\ninterface NativeTranscriptionResult {\n text: string\n language: string\n durationMs: number\n segments: {\n startMs: number\n endMs: number\n text: string\n confidence: number\n }[]\n}\n\n/* v8 ignore start - platform checks and native addon loading */\n\n/**\n * Load the native addon\n */\nfunction loadAddon(): NativeAddon {\n if (process.platform !== \"darwin\") {\n throw new Error(\"whisper-coreml is only supported on macOS\")\n }\n\n try {\n return bindingsModule(\"whisper_asr\") as NativeAddon\n } catch (error) {\n const message = error instanceof Error ? error.message : String(error)\n throw new Error(`Failed to load Whisper ASR native addon: ${message}`)\n }\n}\n\n/* v8 ignore stop */\n\nlet addon: NativeAddon | null = null\nlet loadError: Error | null = null\n\nfunction getAddon(): NativeAddon {\n if (!addon) {\n try {\n addon = loadAddon()\n } catch (error) {\n // v8 ignore - error path only reached with corrupted installation\n loadError = error instanceof Error ? 
error : new Error(String(error))\n throw error\n }\n }\n return addon\n}\n\n/**\n * Check if Whisper ASR is available on this platform\n */\nexport function isAvailable(): boolean {\n return process.platform === \"darwin\" && process.arch === \"arm64\"\n}\n\n/**\n * Get the load error if the addon failed to load\n */\nexport function getLoadError(): Error | null {\n return loadError\n}\n\n/**\n * Transcription segment with timestamps\n */\nexport interface TranscriptionSegment {\n /** Start time in milliseconds */\n startMs: number\n /** End time in milliseconds */\n endMs: number\n /** Transcribed text for this segment */\n text: string\n /** Confidence score (0-1) */\n confidence: number\n}\n\n/**\n * Transcription result\n */\nexport interface TranscriptionResult {\n /** Full transcribed text */\n text: string\n /** Detected or specified language (ISO code) */\n language: string\n /** Processing time in milliseconds */\n durationMs: number\n /** Individual segments with timestamps */\n segments: TranscriptionSegment[]\n}\n\n/**\n * Whisper ASR engine options\n */\nexport interface WhisperAsrOptions {\n /** Path to the Whisper model file (ggml format) */\n modelPath: string\n /** Language code (e.g., \"en\", \"de\", \"fr\") or \"auto\" for auto-detection */\n language?: string\n /** Number of threads (0 = auto) */\n threads?: number\n}\n\n/**\n * Whisper ASR Engine with CoreML acceleration\n *\n * Uses the large-v3-turbo model for best speed/quality balance.\n *\n * @example\n * ```typescript\n * import { WhisperAsrEngine, getModelPath } from \"whisper-coreml\"\n *\n * const engine = new WhisperAsrEngine({\n * modelPath: getModelPath()\n * })\n *\n * await engine.initialize()\n * const result = await engine.transcribe(audioSamples, 16000)\n * console.log(result.text)\n * ```\n */\nexport class WhisperAsrEngine {\n private options: WhisperAsrOptions\n private initialized = false\n\n constructor(options: WhisperAsrOptions) {\n this.options = options\n }\n\n /* v8 ignore start - native addon calls, tested via E2E */\n\n /**\n * Initialize the Whisper engine\n * This loads the model into memory - may take a few seconds.\n */\n initialize(): Promise<void> {\n if (this.initialized) {\n return Promise.resolve()\n }\n\n const nativeAddon = getAddon()\n const success = nativeAddon.initialize({\n modelPath: this.options.modelPath,\n language: this.options.language ?? \"auto\",\n threads: this.options.threads ?? 0\n })\n\n if (!success) {\n return Promise.reject(new Error(\"Failed to initialize Whisper engine\"))\n }\n\n this.initialized = true\n return Promise.resolve()\n }\n\n /**\n * Check if the engine is ready for transcription\n */\n isReady(): boolean {\n if (!this.initialized) {\n return false\n }\n try {\n return getAddon().isInitialized()\n } catch {\n return false\n }\n }\n\n /**\n * Transcribe audio samples\n *\n * @param samples - Float32Array of audio samples (mono, 16kHz)\n * @param sampleRate - Sample rate in Hz (default: 16000)\n * @returns Transcription result with text and segments\n */\n transcribe(samples: Float32Array, sampleRate = 16000): Promise<TranscriptionResult> {\n if (!this.initialized) {\n return Promise.reject(new Error(\"Whisper engine not initialized. 
Call initialize() first.\"))\n }\n\n const result = getAddon().transcribe(samples, sampleRate)\n\n return Promise.resolve({\n text: result.text,\n language: result.language,\n durationMs: result.durationMs,\n segments: result.segments\n })\n }\n\n /**\n * Clean up resources and unload the model\n */\n cleanup(): void {\n if (this.initialized) {\n try {\n getAddon().cleanup()\n } catch {\n // Ignore cleanup errors\n }\n this.initialized = false\n }\n }\n\n /**\n * Get version information\n */\n getVersion(): { addon: string; whisper: string; coreml: string } {\n return getAddon().getVersion()\n }\n\n /* v8 ignore stop */\n}\n\n// Re-export download utilities\nexport {\n downloadModel,\n formatBytes,\n getDefaultModelDir,\n getModelPath,\n isModelDownloaded,\n WHISPER_MODEL,\n type DownloadOptions\n} from \"./download.js\"\n"],"mappings":";;;;AAKA,IAAM,mBAAmB,MACvB,OAAO,aAAa,cAChB,IAAI,IAAI,QAAQ,UAAU,EAAE,EAAE,OAC7B,SAAS,iBAAiB,SAAS,cAAc,QAAQ,YAAY,MAAM,WAC1E,SAAS,cAAc,MACvB,IAAI,IAAI,WAAW,SAAS,OAAO,EAAE;AAEtC,IAAM,gBAAgC,iCAAiB;;;ACP9D,IAAAA,kBAA2B;AAC3B,IAAAC,oBAA8B;AAC9B,sBAA8B;AAC9B,gCAAyB;;;ACDzB,qBAA6D;AAC7D,qBAAwB;AACxB,uBAA8B;AAOvB,IAAM,gBAAgB;AAAA,EAC3B,MAAM;AAAA,EACN,MAAM;AAAA,EACN,WAAW;AAAA,EACX,KAAK;AACP;AAKO,SAAS,qBAA6B;AAC3C,aAAO,2BAAK,wBAAQ,GAAG,UAAU,kBAAkB,QAAQ;AAC7D;AAKO,SAAS,aAAa,UAA2B;AACtD,QAAM,MAAM,YAAY,mBAAmB;AAC3C,aAAO,uBAAK,KAAK,QAAQ,cAAc,IAAI,MAAM;AACnD;AAKO,SAAS,kBAAkB,UAA4B;AAC5D,QAAM,YAAY,aAAa,QAAQ;AACvC,aAAO,2BAAW,SAAS;AAC7B;AAwBA,eAAsB,cAAc,UAA2B,CAAC,GAAoB;AAClF,QAAM,WAAW,QAAQ,YAAY,mBAAmB;AACxD,QAAM,YAAY,aAAa,QAAQ;AAEvC,MAAI,CAAC,QAAQ,aAAS,2BAAW,SAAS,GAAG;AAC3C,WAAO;AAAA,EACT;AAGA,UAAI,2BAAW,SAAS,GAAG;AACzB,+BAAO,SAAS;AAAA,EAClB;AAEA,oCAAU,0BAAQ,SAAS,GAAG,EAAE,WAAW,KAAK,CAAC;AAEjD,UAAQ,IAAI,uBAAuB,cAAc,IAAI,KAAK,cAAc,IAAI,MAAM;AAClF,UAAQ,IAAI,WAAW,cAAc,GAAG,EAAE;AAC1C,UAAQ,IAAI,WAAW,SAAS,EAAE;AAElC,QAAM,WAAW,MAAM,MAAM,cAAc,GAAG;AAC9C,MAAI,CAAC,SAAS,IAAI;AAChB,UAAM,IAAI,MAAM,6BAA6B,SAAS,UAAU,EAAE;AAAA,EACpE;AAEA,QAAM,gBAAgB,SAAS,QAAQ,IAAI,gBAAgB;AAC3D,QAAM,aAAa,gBAAgB,SAAS,eAAe,EAAE,IAAI;AAEjE,QAAM,SAAS,SAAS,MAAM,UAAU;AACxC,MAAI,CAAC,QAAQ;AACX,UAAM,IAAI,MAAM,oCAAoC;AAAA,EACtD;AAEA,QAAM,SAAuB,CAAC;AAC9B,MAAI,kBAAkB;AAGtB,SAAO,MAAM;AACX,UAAM,SAAS,MAAM,OAAO,KAAK;AACjC,QAAI,OAAO,MAAM;AACf;AAAA,IACF;AAEA,UAAM,QAAQ,OAAO;AACrB,WAAO,KAAK,KAAK;AACjB,uBAAmB,MAAM;AAEzB,UAAM,UAAU,aAAa,IAAI,KAAK,MAAO,kBAAkB,aAAc,GAAG,IAAI;AAEpF,QAAI,QAAQ,YAAY;AACtB,cAAQ,WAAW;AAAA,QACjB;AAAA,QACA;AAAA,QACA;AAAA,MACF,CAAC;AAAA,IACH;AAGA,YAAQ,OAAO;AAAA,MACb,eAAe,OAAO,OAAO,CAAC,MAAM,YAAY,eAAe,CAAC,IAAI,YAAY,UAAU,CAAC;AAAA,IAC7F;AAAA,EACF;AAGA,QAAM,SAAS,OAAO,OAAO,MAAM;AACnC,oCAAc,WAAW,MAAM;AAE/B,UAAQ,IAAI,yCAAoC;AAChD,SAAO;AACT;AAQO,SAAS,YAAY,OAAuB;AACjD,MAAI,QAAQ,MAAM;AAChB,WAAO,GAAG,OAAO,KAAK,CAAC;AAAA,EACzB;AACA,MAAI,QAAQ,OAAO,MAAM;AACvB,WAAO,IAAI,QAAQ,MAAM,QAAQ,CAAC,CAAC;AAAA,EACrC;AACA,MAAI,QAAQ,OAAO,OAAO,MAAM;AAC9B,WAAO,IAAI,QAAQ,OAAO,MAAM,QAAQ,CAAC,CAAC;AAAA,EAC5C;AACA,SAAO,IAAI,QAAQ,OAAO,OAAO,MAAM,QAAQ,CAAC,CAAC;AACnD;;;AC/IA,IAAM,iBAAiB,QAAQ,UAAU;AA8BzC,SAAS,YAAyB;AAChC,MAAI,QAAQ,aAAa,UAAU;AACjC,UAAM,IAAI,MAAM,2CAA2C;AAAA,EAC7D;AAEA,MAAI;AACF,WAAO,eAAe,aAAa;AAAA,EACrC,SAAS,OAAO;AACd,UAAM,UAAU,iBAAiB,QAAQ,MAAM,UAAU,OAAO,KAAK;AACrE,UAAM,IAAI,MAAM,4CAA4C,OAAO,EAAE;AAAA,EACvE;AACF;AAIA,IAAI,QAA4B;AAChC,IAAI,YAA0B;AAE9B,SAAS,WAAwB;AAC/B,MAAI,CAAC,OAAO;AACV,QAAI;AACF,cAAQ,UAAU;AAAA,IACpB,SAAS,OAAO;AAEd,kBAAY,iBAAiB,QAAQ,QAAQ,IAAI,MAAM,OAAO,KAAK,CAAC;AACpE,YAAM;AAAA,IACR;AAAA,EACF;AACA,SAAO;AACT;AAKO,SAAS,cAAuB;AACrC,SAAO,QAAQ,aAAa,YAAY,QAAQ,SAAS;AAC3D;AAmEO,IAAM,mBAAN,MAAuB;AAAA,EACpB;AAAA,EACA,
cAAc;AAAA,EAEtB,YAAY,SAA4B;AACtC,SAAK,UAAU;AAAA,EACjB;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,EAQA,aAA4B;AAC1B,QAAI,KAAK,aAAa;AACpB,aAAO,QAAQ,QAAQ;AAAA,IACzB;AAEA,UAAM,cAAc,SAAS;AAC7B,UAAM,UAAU,YAAY,WAAW;AAAA,MACrC,WAAW,KAAK,QAAQ;AAAA,MACxB,UAAU,KAAK,QAAQ,YAAY;AAAA,MACnC,SAAS,KAAK,QAAQ,WAAW;AAAA,IACnC,CAAC;AAED,QAAI,CAAC,SAAS;AACZ,aAAO,QAAQ,OAAO,IAAI,MAAM,qCAAqC,CAAC;AAAA,IACxE;AAEA,SAAK,cAAc;AACnB,WAAO,QAAQ,QAAQ;AAAA,EACzB;AAAA;AAAA;AAAA;AAAA,EAKA,UAAmB;AACjB,QAAI,CAAC,KAAK,aAAa;AACrB,aAAO;AAAA,IACT;AACA,QAAI;AACF,aAAO,SAAS,EAAE,cAAc;AAAA,IAClC,QAAQ;AACN,aAAO;AAAA,IACT;AAAA,EACF;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,EASA,WAAW,SAAuB,aAAa,MAAqC;AAClF,QAAI,CAAC,KAAK,aAAa;AACrB,aAAO,QAAQ,OAAO,IAAI,MAAM,0DAA0D,CAAC;AAAA,IAC7F;AAEA,UAAM,SAAS,SAAS,EAAE,WAAW,SAAS,UAAU;AAExD,WAAO,QAAQ,QAAQ;AAAA,MACrB,MAAM,OAAO;AAAA,MACb,UAAU,OAAO;AAAA,MACjB,YAAY,OAAO;AAAA,MACnB,UAAU,OAAO;AAAA,IACnB,CAAC;AAAA,EACH;AAAA;AAAA;AAAA;AAAA,EAKA,UAAgB;AACd,QAAI,KAAK,aAAa;AACpB,UAAI;AACF,iBAAS,EAAE,QAAQ;AAAA,MACrB,QAAQ;AAAA,MAER;AACA,WAAK,cAAc;AAAA,IACrB;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKA,aAAiE;AAC/D,WAAO,SAAS,EAAE,WAAW;AAAA,EAC/B;AAAA;AAGF;;;AFzNA,IAAM,OAAO,QAAQ,KAAK,MAAM,CAAC;AACjC,IAAM,UAAU,KAAK,CAAC;AAEtB,SAAS,YAAkB;AACzB,UAAQ,IAAI;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,CAeb;AACD;AAKA,SAAS,UAAU,MAA4B;AAC7C,MAAI;AAEF,UAAM,gBAAY,oCAAS,cAAc,IAAI,kDAAkD;AAAA,MAC7F,UAAU;AAAA,MACV,OAAO,CAAC,QAAQ,QAAQ,MAAM;AAAA,MAC9B,WAAW,KAAK,OAAO;AAAA,IACzB,CAAC;AACD,UAAM,QAAQ,IAAI,WAAW,UAAU,QAAQ,UAAU,YAAY,UAAU,SAAS,CAAC;AACzF,UAAM,UAAU,IAAI,aAAa,MAAM,MAAM;AAC7C,aAAS,IAAI,GAAG,IAAI,MAAM,QAAQ,KAAK;AACrC,cAAQ,CAAC,KAAK,MAAM,CAAC,KAAK,KAAK;AAAA,IACjC;AACA,WAAO;AAAA,EACT,QAAQ;AACN,YAAQ,MAAM,qEAAqE;AACnF,YAAQ,KAAK,CAAC;AAAA,EAChB;AACF;AAKA,SAAS,cAAsB;AAC7B,MAAI;AACF,UAAM,aAAS,oCAAS,sCAAsC,EAAE,UAAU,QAAQ,CAAC;AACnF,WAAO,OAAO,KAAK;AAAA,EACrB,QAAQ;AACN,WAAO;AAAA,EACT;AACF;AAEA,eAAe,eAA8B;AAC3C,MAAI,CAAC,YAAY,GAAG;AAClB,YAAQ,MAAM,6CAA6C;AAC3D,YAAQ,KAAK,CAAC;AAAA,EAChB;AAEA,MAAI,CAAC,kBAAkB,GAAG;AACxB,YAAQ,MAAM,wDAAwD;AACtE,YAAQ,KAAK,CAAC;AAAA,EAChB;AAGA,QAAM,gBAAY,+BAAQ,+BAAc,aAAe,CAAC;AACxD,QAAM,gBAAgB;AAAA,QACpB,wBAAK,WAAW,4BAA4B;AAAA,QAC5C,wBAAK,WAAW,+BAA+B;AAAA,QAC/C,wBAAK,QAAQ,IAAI,GAAG,yBAAyB;AAAA,EAC/C;AAEA,QAAM,YAAY,cAAc,KAAK,CAAC,UAAM,4BAAW,CAAC,CAAC;AAEzD,MAAI,CAAC,WAAW;AACd,YAAQ,MAAM,oEAAoE;AAClF,YAAQ,MAAM,kEAAkE;AAChF,YAAQ,MAAM,uDAAuD;AACrE,YAAQ,KAAK,CAAC;AAAA,EAChB;AAEA,UAAQ,IAAI,0BAA0B;AACtC,UAAQ,IAAI,4BAA4B;AAExC,QAAM,OAAO,YAAY;AACzB,UAAQ,IAAI,SAAS,IAAI,EAAE;AAC3B,UAAQ,IAAI,UAAU,cAAc,IAAI,EAAE;AAC1C,UAAQ,IAAI,SAAS,QAAQ,OAAO;AAAA,CAAI;AAGxC,UAAQ,IAAI,kBAAkB;AAC9B,QAAM,UAAU,UAAU,SAAS;AACnC,QAAM,gBAAgB,QAAQ,SAAS;AAEvC,UAAQ,IAAI,UAAU,cAAc,QAAQ,CAAC,CAAC,MAAM,QAAQ,OAAO,eAAe,CAAC;AAAA,CAAa;AAGhG,UAAQ,IAAI,wBAAwB;AACpC,QAAM,YAAY,aAAa;AAC/B,QAAM,SAAS,IAAI,iBAAiB,EAAE,UAAU,CAAC;AAEjD,QAAM,YAAY,YAAY,IAAI;AAClC,QAAM,OAAO,WAAW;AACxB,QAAM,WAAW,YAAY,IAAI,IAAI;AAErC,UAAQ,IAAI,eAAe,WAAW,KAAM,QAAQ,CAAC,CAAC;AAAA,CAAK;AAG3D,UAAQ,IAAI,gBAAgB;AAC5B,QAAM,OAAO,WAAW,QAAQ,MAAM,GAAG,OAAQ,CAAC,GAAG,IAAK;AAG1D,QAAM,OAAO;AACb,UAAQ,IAAI;AAAA,aAAgB,OAAO,IAAI,CAAC;AAAA,CAAa;AAErD,QAAM,QAAkB,CAAC;AAEzB,WAAS,IAAI,GAAG,IAAI,MAAM,KAAK;AAC7B,UAAM,SAAS,MAAM,OAAO,WAAW,SAAS,IAAK;AACrD,UAAM,KAAK,OAAO,UAAU;AAC5B,YAAQ,IAAI,SAAS,OAAO,IAAI,CAAC,CAAC,MAAM,OAAO,aAAa,KAAM,QAAQ,CAAC,CAAC,GAAG;AAAA,EACjF;AAEA,SAAO,QAAQ;AAGf,QAAM,UAAU,MAAM,OAAO,CAAC,GAAG,MAAM,IAAI,GAAG,CAAC,IAAI,MAAM;AACzD,QAAM,MAAM,UAAU,MAAO;AAC7B,QAAM,UAAU,IAAI;AAEpB,UAAQ,IAAI,kLAAiC;AAC7C,UAAQ,IAAI,SAAS;AACrB,UAAQ,IAAI,gLAA+B;AAC3C,UAAQ,IAAI,sBAAsB,cAAc,QAAQ,CAAC,CAAC,GAAG;AAC7D,UAAQ,IAA
I,uBAAuB,UAAU,KAAM,QAAQ,CAAC,CAAC,GAAG;AAChE,UAAQ,IAAI,sBAAsB,IAAI,QAAQ,CAAC,CAAC,GAAG;AACnD,UAAQ,IAAI,sBAAsB,QAAQ,QAAQ,CAAC,CAAC,aAAa;AACjE,UAAQ,IAAI,EAAE;AACd,UAAQ,IAAI,+BAA0B,OAAO,SAAS,QAAQ,CAAC,CAAC,UAAU;AAC1E,UAAQ,IAAI,kLAAiC;AAC/C;AAEA,eAAe,OAAsB;AACnC,MAAI,CAAC,WAAW,YAAY,YAAY,YAAY,MAAM;AACxD,cAAU;AACV,YAAQ,KAAK,CAAC;AAAA,EAChB;AAEA,UAAQ,SAAS;AAAA,IACf,KAAK,YAAY;AACf,YAAM,QAAQ,KAAK,SAAS,SAAS;AACrC,cAAQ,IAAI,iCAAiC;AAC7C,cAAQ,IAAI,mCAAmC;AAE/C,UAAI;AACF,cAAM,cAAc,EAAE,MAAM,CAAC;AAAA,MAC/B,SAAS,OAAO;AACd,gBAAQ,MAAM,6BAAwB,iBAAiB,QAAQ,MAAM,UAAU,KAAK;AACpF,gBAAQ,KAAK,CAAC;AAAA,MAChB;AACA;AAAA,IACF;AAAA,IAEA,KAAK,aAAa;AAChB,YAAM,aAAa;AACnB;AAAA,IACF;AAAA,IAEA,KAAK,UAAU;AACb,YAAM,aAAa,kBAAkB;AAErC,cAAQ,IAAI,uBAAuB;AACnC,cAAQ,IAAI,uBAAuB;AACnC,cAAQ,IAAI,oBAAoB,mBAAmB,CAAC,EAAE;AACtD,cAAQ,IAAI,EAAE;AAEd,UAAI,YAAY;AACd,gBAAQ,IAAI,UAAK,cAAc,IAAI,KAAK,cAAc,IAAI,WAAW;AAAA,MACvE,OAAO;AACL,gBAAQ,IAAI,6BAAwB;AACpC,gBAAQ,IAAI,kCAAkC;AAAA,MAChD;AACA;AAAA,IACF;AAAA,IAEA,KAAK,QAAQ;AACX,cAAQ,IAAI,mBAAmB,CAAC;AAChC;AAAA,IACF;AAAA,IAEA;AACE,cAAQ,MAAM,oBAAoB,OAAO,EAAE;AAC3C,gBAAU;AACV,cAAQ,KAAK,CAAC;AAAA,EAClB;AACF;AAEA,KAAK,EAAE,MAAM,CAAC,UAAmB;AAC/B,UAAQ,MAAM,gBAAgB,KAAK;AACnC,UAAQ,KAAK,CAAC;AAChB,CAAC;","names":["import_node_fs","import_node_path"]}
package/dist/cli.js CHANGED
@@ -7,7 +7,7 @@ import {
   getModelPath,
   isAvailable,
   isModelDownloaded
-} from "./chunk-MOQMN4DX.js";
+} from "./chunk-V34ZDICO.js";
 
 // src/cli.ts
 import { existsSync } from "fs";
package/dist/index.cjs CHANGED
@@ -167,7 +167,6 @@ var WhisperAsrEngine = class {
     const success = nativeAddon.initialize({
       modelPath: this.options.modelPath,
       language: this.options.language ?? "auto",
-      translate: this.options.translate ?? false,
       threads: this.options.threads ?? 0
     });
     if (!success) {
@@ -1 +1 @@
- {"version":3,"sources":["../src/index.ts","../src/download.ts"],"sourcesContent":["/**\n * whisper-coreml\n *\n * OpenAI Whisper ASR for Node.js with CoreML/ANE acceleration on Apple Silicon.\n * Based on whisper.cpp with Apple Neural Engine support.\n *\n * Uses the large-v3-turbo model exclusively, as it offers the best speed/quality\n * ratio and is the main reason to choose Whisper over Parakeet.\n */\n\n// Dynamic require for loading native addon (works in both ESM and CJS)\n// eslint-disable-next-line @typescript-eslint/no-require-imports\nconst bindingsModule = require(\"bindings\") as (name: string) => unknown\n\n/**\n * Native addon interface\n */\ninterface NativeAddon {\n initialize(options: {\n modelPath: string\n language?: string\n translate?: boolean\n threads?: number\n }): boolean\n isInitialized(): boolean\n transcribe(samples: Float32Array, sampleRate: number): NativeTranscriptionResult\n cleanup(): void\n getVersion(): { addon: string; whisper: string; coreml: string }\n}\n\ninterface NativeTranscriptionResult {\n text: string\n language: string\n durationMs: number\n segments: {\n startMs: number\n endMs: number\n text: string\n confidence: number\n }[]\n}\n\n/* v8 ignore start - platform checks and native addon loading */\n\n/**\n * Load the native addon\n */\nfunction loadAddon(): NativeAddon {\n if (process.platform !== \"darwin\") {\n throw new Error(\"whisper-coreml is only supported on macOS\")\n }\n\n try {\n return bindingsModule(\"whisper_asr\") as NativeAddon\n } catch (error) {\n const message = error instanceof Error ? error.message : String(error)\n throw new Error(`Failed to load Whisper ASR native addon: ${message}`)\n }\n}\n\n/* v8 ignore stop */\n\nlet addon: NativeAddon | null = null\nlet loadError: Error | null = null\n\nfunction getAddon(): NativeAddon {\n if (!addon) {\n try {\n addon = loadAddon()\n } catch (error) {\n loadError = error instanceof Error ? 
error : new Error(String(error))\n throw error\n }\n }\n return addon\n}\n\n/**\n * Check if Whisper ASR is available on this platform\n */\nexport function isAvailable(): boolean {\n return process.platform === \"darwin\" && process.arch === \"arm64\"\n}\n\n/**\n * Get the load error if the addon failed to load\n */\nexport function getLoadError(): Error | null {\n return loadError\n}\n\n/**\n * Transcription segment with timestamps\n */\nexport interface TranscriptionSegment {\n /** Start time in milliseconds */\n startMs: number\n /** End time in milliseconds */\n endMs: number\n /** Transcribed text for this segment */\n text: string\n /** Confidence score (0-1) */\n confidence: number\n}\n\n/**\n * Transcription result\n */\nexport interface TranscriptionResult {\n /** Full transcribed text */\n text: string\n /** Detected or specified language (ISO code) */\n language: string\n /** Processing time in milliseconds */\n durationMs: number\n /** Individual segments with timestamps */\n segments: TranscriptionSegment[]\n}\n\n/**\n * Whisper ASR engine options\n */\nexport interface WhisperAsrOptions {\n /** Path to the Whisper model file (ggml format) */\n modelPath: string\n /** Language code (e.g., \"en\", \"de\", \"fr\") or \"auto\" for auto-detection */\n language?: string\n /** Translate to English (default: false) */\n translate?: boolean\n /** Number of threads (0 = auto) */\n threads?: number\n}\n\n/**\n * Whisper ASR Engine with CoreML acceleration\n *\n * Uses the large-v3-turbo model for best speed/quality balance.\n *\n * @example\n * ```typescript\n * import { WhisperAsrEngine, getModelPath } from \"whisper-coreml\"\n *\n * const engine = new WhisperAsrEngine({\n * modelPath: getModelPath()\n * })\n *\n * await engine.initialize()\n * const result = await engine.transcribe(audioSamples, 16000)\n * console.log(result.text)\n * ```\n */\nexport class WhisperAsrEngine {\n private options: WhisperAsrOptions\n private initialized = false\n\n constructor(options: WhisperAsrOptions) {\n this.options = options\n }\n\n /* v8 ignore start - native addon calls, tested via E2E */\n\n /**\n * Initialize the Whisper engine\n * This loads the model into memory - may take a few seconds.\n */\n initialize(): Promise<void> {\n if (this.initialized) {\n return Promise.resolve()\n }\n\n const nativeAddon = getAddon()\n const success = nativeAddon.initialize({\n modelPath: this.options.modelPath,\n language: this.options.language ?? \"auto\",\n translate: this.options.translate ?? false,\n threads: this.options.threads ?? 0\n })\n\n if (!success) {\n return Promise.reject(new Error(\"Failed to initialize Whisper engine\"))\n }\n\n this.initialized = true\n return Promise.resolve()\n }\n\n /**\n * Check if the engine is ready for transcription\n */\n isReady(): boolean {\n if (!this.initialized) {\n return false\n }\n try {\n return getAddon().isInitialized()\n } catch {\n return false\n }\n }\n\n /**\n * Transcribe audio samples\n *\n * @param samples - Float32Array of audio samples (mono, 16kHz)\n * @param sampleRate - Sample rate in Hz (default: 16000)\n * @returns Transcription result with text and segments\n */\n transcribe(samples: Float32Array, sampleRate = 16000): Promise<TranscriptionResult> {\n if (!this.initialized) {\n return Promise.reject(new Error(\"Whisper engine not initialized. 
Call initialize() first.\"))\n }\n\n const result = getAddon().transcribe(samples, sampleRate)\n\n return Promise.resolve({\n text: result.text,\n language: result.language,\n durationMs: result.durationMs,\n segments: result.segments\n })\n }\n\n /**\n * Clean up resources and unload the model\n */\n cleanup(): void {\n if (this.initialized) {\n try {\n getAddon().cleanup()\n } catch {\n // Ignore cleanup errors\n }\n this.initialized = false\n }\n }\n\n /**\n * Get version information\n */\n getVersion(): { addon: string; whisper: string; coreml: string } {\n return getAddon().getVersion()\n }\n\n /* v8 ignore stop */\n}\n\n// Re-export download utilities\nexport {\n downloadModel,\n formatBytes,\n getDefaultModelDir,\n getModelPath,\n isModelDownloaded,\n WHISPER_MODEL,\n type DownloadOptions\n} from \"./download.js\"\n","/**\n * Model download functionality for whisper-coreml\n *\n * Note: We only support large-v3-turbo as it's the only Whisper model\n * that offers better quality than Parakeet while maintaining reasonable speed.\n */\n\nimport { existsSync, mkdirSync, writeFileSync, rmSync } from \"node:fs\"\nimport { homedir } from \"node:os\"\nimport { join, dirname } from \"node:path\"\n\n/**\n * Whisper large-v3-turbo model info\n * This is the only model we support as it offers the best speed/quality ratio\n * and is the main reason to choose Whisper over Parakeet.\n */\nexport const WHISPER_MODEL = {\n name: \"large-v3-turbo\",\n size: \"1.5 GB\",\n languages: \"99 languages\",\n url: \"https://huggingface.co/ggerganov/whisper.cpp/resolve/main/ggml-large-v3-turbo.bin\"\n} as const\n\n/**\n * Default model directory in user's cache\n */\nexport function getDefaultModelDir(): string {\n return join(homedir(), \".cache\", \"whisper-coreml\", \"models\")\n}\n\n/**\n * Get the path to the model\n */\nexport function getModelPath(modelDir?: string): string {\n const dir = modelDir ?? getDefaultModelDir()\n return join(dir, `ggml-${WHISPER_MODEL.name}.bin`)\n}\n\n/**\n * Check if the model is downloaded\n */\nexport function isModelDownloaded(modelDir?: string): boolean {\n const modelPath = getModelPath(modelDir)\n return existsSync(modelPath)\n}\n\ninterface DownloadProgress {\n downloadedBytes: number\n totalBytes: number\n percent: number\n}\n\nexport interface DownloadOptions {\n /** Target directory for model (default: ~/.cache/whisper-coreml/models) */\n modelDir?: string\n\n /** Progress callback */\n onProgress?: (progress: DownloadProgress) => void\n\n /** Force re-download even if model exists */\n force?: boolean\n}\n\n/* v8 ignore start - network I/O */\n\n/**\n * Download the Whisper large-v3-turbo model from Hugging Face\n */\nexport async function downloadModel(options: DownloadOptions = {}): Promise<string> {\n const modelDir = options.modelDir ?? getDefaultModelDir()\n const modelPath = getModelPath(modelDir)\n\n if (!options.force && existsSync(modelPath)) {\n return modelPath\n }\n\n // Clean up partial downloads\n if (existsSync(modelPath)) {\n rmSync(modelPath)\n }\n\n mkdirSync(dirname(modelPath), { recursive: true })\n\n console.log(`Downloading Whisper ${WHISPER_MODEL.name} (${WHISPER_MODEL.size})...`)\n console.log(`Source: ${WHISPER_MODEL.url}`)\n console.log(`Target: ${modelPath}`)\n\n const response = await fetch(WHISPER_MODEL.url)\n if (!response.ok) {\n throw new Error(`Failed to download model: ${response.statusText}`)\n }\n\n const contentLength = response.headers.get(\"content-length\")\n const totalBytes = contentLength ? 
parseInt(contentLength, 10) : 0\n\n const reader = response.body?.getReader()\n if (!reader) {\n throw new Error(\"Failed to get response body reader\")\n }\n\n const chunks: Uint8Array[] = []\n let downloadedBytes = 0\n\n // eslint-disable-next-line @typescript-eslint/no-unnecessary-condition\n while (true) {\n const result = await reader.read()\n if (result.done) {\n break\n }\n\n const chunk = result.value as Uint8Array\n chunks.push(chunk)\n downloadedBytes += chunk.length\n\n const percent = totalBytes > 0 ? Math.round((downloadedBytes / totalBytes) * 100) : 0\n\n if (options.onProgress) {\n options.onProgress({\n downloadedBytes,\n totalBytes,\n percent\n })\n }\n\n // Progress indicator\n process.stdout.write(\n `\\rProgress: ${String(percent)}% (${formatBytes(downloadedBytes)}/${formatBytes(totalBytes)})`\n )\n }\n\n // Combine chunks and write to file\n const buffer = Buffer.concat(chunks)\n writeFileSync(modelPath, buffer)\n\n console.log(\"\\n✓ Model downloaded successfully!\")\n return modelPath\n}\n\n/* v8 ignore stop */\n\n/**\n * Format bytes to human readable string\n * @internal Exported for testing\n */\nexport function formatBytes(bytes: number): string {\n if (bytes < 1024) {\n return `${String(bytes)} B`\n }\n if (bytes < 1024 * 1024) {\n return `${(bytes / 1024).toFixed(1)} KB`\n }\n if (bytes < 1024 * 1024 * 1024) {\n return `${(bytes / 1024 / 1024).toFixed(1)} MB`\n }\n return `${(bytes / 1024 / 1024 / 1024).toFixed(2)} GB`\n}\n"],"mappings":";;;;;;;;;;;;;;;;;;;;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;;;ACOA,qBAA6D;AAC7D,qBAAwB;AACxB,uBAA8B;AAOvB,IAAM,gBAAgB;AAAA,EAC3B,MAAM;AAAA,EACN,MAAM;AAAA,EACN,WAAW;AAAA,EACX,KAAK;AACP;AAKO,SAAS,qBAA6B;AAC3C,aAAO,2BAAK,wBAAQ,GAAG,UAAU,kBAAkB,QAAQ;AAC7D;AAKO,SAAS,aAAa,UAA2B;AACtD,QAAM,MAAM,YAAY,mBAAmB;AAC3C,aAAO,uBAAK,KAAK,QAAQ,cAAc,IAAI,MAAM;AACnD;AAKO,SAAS,kBAAkB,UAA4B;AAC5D,QAAM,YAAY,aAAa,QAAQ;AACvC,aAAO,2BAAW,SAAS;AAC7B;AAwBA,eAAsB,cAAc,UAA2B,CAAC,GAAoB;AAClF,QAAM,WAAW,QAAQ,YAAY,mBAAmB;AACxD,QAAM,YAAY,aAAa,QAAQ;AAEvC,MAAI,CAAC,QAAQ,aAAS,2BAAW,SAAS,GAAG;AAC3C,WAAO;AAAA,EACT;AAGA,UAAI,2BAAW,SAAS,GAAG;AACzB,+BAAO,SAAS;AAAA,EAClB;AAEA,oCAAU,0BAAQ,SAAS,GAAG,EAAE,WAAW,KAAK,CAAC;AAEjD,UAAQ,IAAI,uBAAuB,cAAc,IAAI,KAAK,cAAc,IAAI,MAAM;AAClF,UAAQ,IAAI,WAAW,cAAc,GAAG,EAAE;AAC1C,UAAQ,IAAI,WAAW,SAAS,EAAE;AAElC,QAAM,WAAW,MAAM,MAAM,cAAc,GAAG;AAC9C,MAAI,CAAC,SAAS,IAAI;AAChB,UAAM,IAAI,MAAM,6BAA6B,SAAS,UAAU,EAAE;AAAA,EACpE;AAEA,QAAM,gBAAgB,SAAS,QAAQ,IAAI,gBAAgB;AAC3D,QAAM,aAAa,gBAAgB,SAAS,eAAe,EAAE,IAAI;AAEjE,QAAM,SAAS,SAAS,MAAM,UAAU;AACxC,MAAI,CAAC,QAAQ;AACX,UAAM,IAAI,MAAM,oCAAoC;AAAA,EACtD;AAEA,QAAM,SAAuB,CAAC;AAC9B,MAAI,kBAAkB;AAGtB,SAAO,MAAM;AACX,UAAM,SAAS,MAAM,OAAO,KAAK;AACjC,QAAI,OAAO,MAAM;AACf;AAAA,IACF;AAEA,UAAM,QAAQ,OAAO;AACrB,WAAO,KAAK,KAAK;AACjB,uBAAmB,MAAM;AAEzB,UAAM,UAAU,aAAa,IAAI,KAAK,MAAO,kBAAkB,aAAc,GAAG,IAAI;AAEpF,QAAI,QAAQ,YAAY;AACtB,cAAQ,WAAW;AAAA,QACjB;AAAA,QACA;AAAA,QACA;AAAA,MACF,CAAC;AAAA,IACH;AAGA,YAAQ,OAAO;AAAA,MACb,eAAe,OAAO,OAAO,CAAC,MAAM,YAAY,eAAe,CAAC,IAAI,YAAY,UAAU,CAAC;AAAA,IAC7F;AAAA,EACF;AAGA,QAAM,SAAS,OAAO,OAAO,MAAM;AACnC,oCAAc,WAAW,MAAM;AAE/B,UAAQ,IAAI,yCAAoC;AAChD,SAAO;AACT;AAQO,SAAS,YAAY,OAAuB;AACjD,MAAI,QAAQ,MAAM;AAChB,WAAO,GAAG,OAAO,KAAK,CAAC;AAAA,EACzB;AACA,MAAI,QAAQ,OAAO,MAAM;AACvB,WAAO,IAAI,QAAQ,MAAM,QAAQ,CAAC,CAAC;AAAA,EACrC;AACA,MAAI,QAAQ,OAAO,OAAO,MAAM;AAC9B,WAAO,IAAI,QAAQ,OAAO,MAAM,QAAQ,CAAC,CAAC;AAAA,EAC5C;AACA,SAAO,IAAI,QAAQ,OAAO,OAAO,MAAM,QAAQ,CAAC,CAAC;AACnD;;;AD/IA,IAAM,iBAAiB,QAAQ,UAAU;AAmCzC,SAAS,YAAyB;AAChC,MAAI,QAAQ,aAAa,UAAU;AACjC,UAAM,IAAI,MAAM,2CAA2C;AAAA,EAC7D;AAE
A,MAAI;AACF,WAAO,eAAe,aAAa;AAAA,EACrC,SAAS,OAAO;AACd,UAAM,UAAU,iBAAiB,QAAQ,MAAM,UAAU,OAAO,KAAK;AACrE,UAAM,IAAI,MAAM,4CAA4C,OAAO,EAAE;AAAA,EACvE;AACF;AAIA,IAAI,QAA4B;AAChC,IAAI,YAA0B;AAE9B,SAAS,WAAwB;AAC/B,MAAI,CAAC,OAAO;AACV,QAAI;AACF,cAAQ,UAAU;AAAA,IACpB,SAAS,OAAO;AACd,kBAAY,iBAAiB,QAAQ,QAAQ,IAAI,MAAM,OAAO,KAAK,CAAC;AACpE,YAAM;AAAA,IACR;AAAA,EACF;AACA,SAAO;AACT;AAKO,SAAS,cAAuB;AACrC,SAAO,QAAQ,aAAa,YAAY,QAAQ,SAAS;AAC3D;AAKO,SAAS,eAA6B;AAC3C,SAAO;AACT;AA8DO,IAAM,mBAAN,MAAuB;AAAA,EACpB;AAAA,EACA,cAAc;AAAA,EAEtB,YAAY,SAA4B;AACtC,SAAK,UAAU;AAAA,EACjB;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,EAQA,aAA4B;AAC1B,QAAI,KAAK,aAAa;AACpB,aAAO,QAAQ,QAAQ;AAAA,IACzB;AAEA,UAAM,cAAc,SAAS;AAC7B,UAAM,UAAU,YAAY,WAAW;AAAA,MACrC,WAAW,KAAK,QAAQ;AAAA,MACxB,UAAU,KAAK,QAAQ,YAAY;AAAA,MACnC,WAAW,KAAK,QAAQ,aAAa;AAAA,MACrC,SAAS,KAAK,QAAQ,WAAW;AAAA,IACnC,CAAC;AAED,QAAI,CAAC,SAAS;AACZ,aAAO,QAAQ,OAAO,IAAI,MAAM,qCAAqC,CAAC;AAAA,IACxE;AAEA,SAAK,cAAc;AACnB,WAAO,QAAQ,QAAQ;AAAA,EACzB;AAAA;AAAA;AAAA;AAAA,EAKA,UAAmB;AACjB,QAAI,CAAC,KAAK,aAAa;AACrB,aAAO;AAAA,IACT;AACA,QAAI;AACF,aAAO,SAAS,EAAE,cAAc;AAAA,IAClC,QAAQ;AACN,aAAO;AAAA,IACT;AAAA,EACF;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,EASA,WAAW,SAAuB,aAAa,MAAqC;AAClF,QAAI,CAAC,KAAK,aAAa;AACrB,aAAO,QAAQ,OAAO,IAAI,MAAM,0DAA0D,CAAC;AAAA,IAC7F;AAEA,UAAM,SAAS,SAAS,EAAE,WAAW,SAAS,UAAU;AAExD,WAAO,QAAQ,QAAQ;AAAA,MACrB,MAAM,OAAO;AAAA,MACb,UAAU,OAAO;AAAA,MACjB,YAAY,OAAO;AAAA,MACnB,UAAU,OAAO;AAAA,IACnB,CAAC;AAAA,EACH;AAAA;AAAA;AAAA;AAAA,EAKA,UAAgB;AACd,QAAI,KAAK,aAAa;AACpB,UAAI;AACF,iBAAS,EAAE,QAAQ;AAAA,MACrB,QAAQ;AAAA,MAER;AACA,WAAK,cAAc;AAAA,IACrB;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKA,aAAiE;AAC/D,WAAO,SAAS,EAAE,WAAW;AAAA,EAC/B;AAAA;AAGF;","names":[]}
+ {"version":3,"sources":["../src/index.ts","../src/download.ts"],"sourcesContent":["/**\n * whisper-coreml\n *\n * OpenAI Whisper ASR for Node.js with CoreML/ANE acceleration on Apple Silicon.\n * Based on whisper.cpp with Apple Neural Engine support.\n *\n * Uses the large-v3-turbo model exclusively, as it offers the best speed/quality\n * ratio and is the main reason to choose Whisper over Parakeet.\n */\n\n// Dynamic require for loading native addon (works in both ESM and CJS)\n// eslint-disable-next-line @typescript-eslint/no-require-imports\nconst bindingsModule = require(\"bindings\") as (name: string) => unknown\n\n/**\n * Native addon interface\n */\ninterface NativeAddon {\n initialize(options: { modelPath: string; language?: string; threads?: number }): boolean\n isInitialized(): boolean\n transcribe(samples: Float32Array, sampleRate: number): NativeTranscriptionResult\n cleanup(): void\n getVersion(): { addon: string; whisper: string; coreml: string }\n}\n\ninterface NativeTranscriptionResult {\n text: string\n language: string\n durationMs: number\n segments: {\n startMs: number\n endMs: number\n text: string\n confidence: number\n }[]\n}\n\n/* v8 ignore start - platform checks and native addon loading */\n\n/**\n * Load the native addon\n */\nfunction loadAddon(): NativeAddon {\n if (process.platform !== \"darwin\") {\n throw new Error(\"whisper-coreml is only supported on macOS\")\n }\n\n try {\n return bindingsModule(\"whisper_asr\") as NativeAddon\n } catch (error) {\n const message = error instanceof Error ? error.message : String(error)\n throw new Error(`Failed to load Whisper ASR native addon: ${message}`)\n }\n}\n\n/* v8 ignore stop */\n\nlet addon: NativeAddon | null = null\nlet loadError: Error | null = null\n\nfunction getAddon(): NativeAddon {\n if (!addon) {\n try {\n addon = loadAddon()\n } catch (error) {\n // v8 ignore - error path only reached with corrupted installation\n loadError = error instanceof Error ? 
error : new Error(String(error))\n throw error\n }\n }\n return addon\n}\n\n/**\n * Check if Whisper ASR is available on this platform\n */\nexport function isAvailable(): boolean {\n return process.platform === \"darwin\" && process.arch === \"arm64\"\n}\n\n/**\n * Get the load error if the addon failed to load\n */\nexport function getLoadError(): Error | null {\n return loadError\n}\n\n/**\n * Transcription segment with timestamps\n */\nexport interface TranscriptionSegment {\n /** Start time in milliseconds */\n startMs: number\n /** End time in milliseconds */\n endMs: number\n /** Transcribed text for this segment */\n text: string\n /** Confidence score (0-1) */\n confidence: number\n}\n\n/**\n * Transcription result\n */\nexport interface TranscriptionResult {\n /** Full transcribed text */\n text: string\n /** Detected or specified language (ISO code) */\n language: string\n /** Processing time in milliseconds */\n durationMs: number\n /** Individual segments with timestamps */\n segments: TranscriptionSegment[]\n}\n\n/**\n * Whisper ASR engine options\n */\nexport interface WhisperAsrOptions {\n /** Path to the Whisper model file (ggml format) */\n modelPath: string\n /** Language code (e.g., \"en\", \"de\", \"fr\") or \"auto\" for auto-detection */\n language?: string\n /** Number of threads (0 = auto) */\n threads?: number\n}\n\n/**\n * Whisper ASR Engine with CoreML acceleration\n *\n * Uses the large-v3-turbo model for best speed/quality balance.\n *\n * @example\n * ```typescript\n * import { WhisperAsrEngine, getModelPath } from \"whisper-coreml\"\n *\n * const engine = new WhisperAsrEngine({\n * modelPath: getModelPath()\n * })\n *\n * await engine.initialize()\n * const result = await engine.transcribe(audioSamples, 16000)\n * console.log(result.text)\n * ```\n */\nexport class WhisperAsrEngine {\n private options: WhisperAsrOptions\n private initialized = false\n\n constructor(options: WhisperAsrOptions) {\n this.options = options\n }\n\n /* v8 ignore start - native addon calls, tested via E2E */\n\n /**\n * Initialize the Whisper engine\n * This loads the model into memory - may take a few seconds.\n */\n initialize(): Promise<void> {\n if (this.initialized) {\n return Promise.resolve()\n }\n\n const nativeAddon = getAddon()\n const success = nativeAddon.initialize({\n modelPath: this.options.modelPath,\n language: this.options.language ?? \"auto\",\n threads: this.options.threads ?? 0\n })\n\n if (!success) {\n return Promise.reject(new Error(\"Failed to initialize Whisper engine\"))\n }\n\n this.initialized = true\n return Promise.resolve()\n }\n\n /**\n * Check if the engine is ready for transcription\n */\n isReady(): boolean {\n if (!this.initialized) {\n return false\n }\n try {\n return getAddon().isInitialized()\n } catch {\n return false\n }\n }\n\n /**\n * Transcribe audio samples\n *\n * @param samples - Float32Array of audio samples (mono, 16kHz)\n * @param sampleRate - Sample rate in Hz (default: 16000)\n * @returns Transcription result with text and segments\n */\n transcribe(samples: Float32Array, sampleRate = 16000): Promise<TranscriptionResult> {\n if (!this.initialized) {\n return Promise.reject(new Error(\"Whisper engine not initialized. 
Call initialize() first.\"))\n }\n\n const result = getAddon().transcribe(samples, sampleRate)\n\n return Promise.resolve({\n text: result.text,\n language: result.language,\n durationMs: result.durationMs,\n segments: result.segments\n })\n }\n\n /**\n * Clean up resources and unload the model\n */\n cleanup(): void {\n if (this.initialized) {\n try {\n getAddon().cleanup()\n } catch {\n // Ignore cleanup errors\n }\n this.initialized = false\n }\n }\n\n /**\n * Get version information\n */\n getVersion(): { addon: string; whisper: string; coreml: string } {\n return getAddon().getVersion()\n }\n\n /* v8 ignore stop */\n}\n\n// Re-export download utilities\nexport {\n downloadModel,\n formatBytes,\n getDefaultModelDir,\n getModelPath,\n isModelDownloaded,\n WHISPER_MODEL,\n type DownloadOptions\n} from \"./download.js\"\n","/**\n * Model download functionality for whisper-coreml\n *\n * Note: We only support large-v3-turbo as it's the only Whisper model\n * that offers better quality than Parakeet while maintaining reasonable speed.\n */\n\nimport { existsSync, mkdirSync, writeFileSync, rmSync } from \"node:fs\"\nimport { homedir } from \"node:os\"\nimport { join, dirname } from \"node:path\"\n\n/**\n * Whisper large-v3-turbo model info\n * This is the only model we support as it offers the best speed/quality ratio\n * and is the main reason to choose Whisper over Parakeet.\n */\nexport const WHISPER_MODEL = {\n name: \"large-v3-turbo\",\n size: \"1.5 GB\",\n languages: \"99 languages\",\n url: \"https://huggingface.co/ggerganov/whisper.cpp/resolve/main/ggml-large-v3-turbo.bin\"\n} as const\n\n/**\n * Default model directory in user's cache\n */\nexport function getDefaultModelDir(): string {\n return join(homedir(), \".cache\", \"whisper-coreml\", \"models\")\n}\n\n/**\n * Get the path to the model\n */\nexport function getModelPath(modelDir?: string): string {\n const dir = modelDir ?? getDefaultModelDir()\n return join(dir, `ggml-${WHISPER_MODEL.name}.bin`)\n}\n\n/**\n * Check if the model is downloaded\n */\nexport function isModelDownloaded(modelDir?: string): boolean {\n const modelPath = getModelPath(modelDir)\n return existsSync(modelPath)\n}\n\ninterface DownloadProgress {\n downloadedBytes: number\n totalBytes: number\n percent: number\n}\n\nexport interface DownloadOptions {\n /** Target directory for model (default: ~/.cache/whisper-coreml/models) */\n modelDir?: string\n\n /** Progress callback */\n onProgress?: (progress: DownloadProgress) => void\n\n /** Force re-download even if model exists */\n force?: boolean\n}\n\n/* v8 ignore start - network I/O */\n\n/**\n * Download the Whisper large-v3-turbo model from Hugging Face\n */\nexport async function downloadModel(options: DownloadOptions = {}): Promise<string> {\n const modelDir = options.modelDir ?? getDefaultModelDir()\n const modelPath = getModelPath(modelDir)\n\n if (!options.force && existsSync(modelPath)) {\n return modelPath\n }\n\n // Clean up partial downloads\n if (existsSync(modelPath)) {\n rmSync(modelPath)\n }\n\n mkdirSync(dirname(modelPath), { recursive: true })\n\n console.log(`Downloading Whisper ${WHISPER_MODEL.name} (${WHISPER_MODEL.size})...`)\n console.log(`Source: ${WHISPER_MODEL.url}`)\n console.log(`Target: ${modelPath}`)\n\n const response = await fetch(WHISPER_MODEL.url)\n if (!response.ok) {\n throw new Error(`Failed to download model: ${response.statusText}`)\n }\n\n const contentLength = response.headers.get(\"content-length\")\n const totalBytes = contentLength ? 
parseInt(contentLength, 10) : 0\n\n const reader = response.body?.getReader()\n if (!reader) {\n throw new Error(\"Failed to get response body reader\")\n }\n\n const chunks: Uint8Array[] = []\n let downloadedBytes = 0\n\n // eslint-disable-next-line @typescript-eslint/no-unnecessary-condition\n while (true) {\n const result = await reader.read()\n if (result.done) {\n break\n }\n\n const chunk = result.value as Uint8Array\n chunks.push(chunk)\n downloadedBytes += chunk.length\n\n const percent = totalBytes > 0 ? Math.round((downloadedBytes / totalBytes) * 100) : 0\n\n if (options.onProgress) {\n options.onProgress({\n downloadedBytes,\n totalBytes,\n percent\n })\n }\n\n // Progress indicator\n process.stdout.write(\n `\\rProgress: ${String(percent)}% (${formatBytes(downloadedBytes)}/${formatBytes(totalBytes)})`\n )\n }\n\n // Combine chunks and write to file\n const buffer = Buffer.concat(chunks)\n writeFileSync(modelPath, buffer)\n\n console.log(\"\\n✓ Model downloaded successfully!\")\n return modelPath\n}\n\n/* v8 ignore stop */\n\n/**\n * Format bytes to human readable string\n * @internal Exported for testing\n */\nexport function formatBytes(bytes: number): string {\n if (bytes < 1024) {\n return `${String(bytes)} B`\n }\n if (bytes < 1024 * 1024) {\n return `${(bytes / 1024).toFixed(1)} KB`\n }\n if (bytes < 1024 * 1024 * 1024) {\n return `${(bytes / 1024 / 1024).toFixed(1)} MB`\n }\n return `${(bytes / 1024 / 1024 / 1024).toFixed(2)} GB`\n}\n"],"mappings":";;;;;;;;;;;;;;;;;;;;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;;;ACOA,qBAA6D;AAC7D,qBAAwB;AACxB,uBAA8B;AAOvB,IAAM,gBAAgB;AAAA,EAC3B,MAAM;AAAA,EACN,MAAM;AAAA,EACN,WAAW;AAAA,EACX,KAAK;AACP;AAKO,SAAS,qBAA6B;AAC3C,aAAO,2BAAK,wBAAQ,GAAG,UAAU,kBAAkB,QAAQ;AAC7D;AAKO,SAAS,aAAa,UAA2B;AACtD,QAAM,MAAM,YAAY,mBAAmB;AAC3C,aAAO,uBAAK,KAAK,QAAQ,cAAc,IAAI,MAAM;AACnD;AAKO,SAAS,kBAAkB,UAA4B;AAC5D,QAAM,YAAY,aAAa,QAAQ;AACvC,aAAO,2BAAW,SAAS;AAC7B;AAwBA,eAAsB,cAAc,UAA2B,CAAC,GAAoB;AAClF,QAAM,WAAW,QAAQ,YAAY,mBAAmB;AACxD,QAAM,YAAY,aAAa,QAAQ;AAEvC,MAAI,CAAC,QAAQ,aAAS,2BAAW,SAAS,GAAG;AAC3C,WAAO;AAAA,EACT;AAGA,UAAI,2BAAW,SAAS,GAAG;AACzB,+BAAO,SAAS;AAAA,EAClB;AAEA,oCAAU,0BAAQ,SAAS,GAAG,EAAE,WAAW,KAAK,CAAC;AAEjD,UAAQ,IAAI,uBAAuB,cAAc,IAAI,KAAK,cAAc,IAAI,MAAM;AAClF,UAAQ,IAAI,WAAW,cAAc,GAAG,EAAE;AAC1C,UAAQ,IAAI,WAAW,SAAS,EAAE;AAElC,QAAM,WAAW,MAAM,MAAM,cAAc,GAAG;AAC9C,MAAI,CAAC,SAAS,IAAI;AAChB,UAAM,IAAI,MAAM,6BAA6B,SAAS,UAAU,EAAE;AAAA,EACpE;AAEA,QAAM,gBAAgB,SAAS,QAAQ,IAAI,gBAAgB;AAC3D,QAAM,aAAa,gBAAgB,SAAS,eAAe,EAAE,IAAI;AAEjE,QAAM,SAAS,SAAS,MAAM,UAAU;AACxC,MAAI,CAAC,QAAQ;AACX,UAAM,IAAI,MAAM,oCAAoC;AAAA,EACtD;AAEA,QAAM,SAAuB,CAAC;AAC9B,MAAI,kBAAkB;AAGtB,SAAO,MAAM;AACX,UAAM,SAAS,MAAM,OAAO,KAAK;AACjC,QAAI,OAAO,MAAM;AACf;AAAA,IACF;AAEA,UAAM,QAAQ,OAAO;AACrB,WAAO,KAAK,KAAK;AACjB,uBAAmB,MAAM;AAEzB,UAAM,UAAU,aAAa,IAAI,KAAK,MAAO,kBAAkB,aAAc,GAAG,IAAI;AAEpF,QAAI,QAAQ,YAAY;AACtB,cAAQ,WAAW;AAAA,QACjB;AAAA,QACA;AAAA,QACA;AAAA,MACF,CAAC;AAAA,IACH;AAGA,YAAQ,OAAO;AAAA,MACb,eAAe,OAAO,OAAO,CAAC,MAAM,YAAY,eAAe,CAAC,IAAI,YAAY,UAAU,CAAC;AAAA,IAC7F;AAAA,EACF;AAGA,QAAM,SAAS,OAAO,OAAO,MAAM;AACnC,oCAAc,WAAW,MAAM;AAE/B,UAAQ,IAAI,yCAAoC;AAChD,SAAO;AACT;AAQO,SAAS,YAAY,OAAuB;AACjD,MAAI,QAAQ,MAAM;AAChB,WAAO,GAAG,OAAO,KAAK,CAAC;AAAA,EACzB;AACA,MAAI,QAAQ,OAAO,MAAM;AACvB,WAAO,IAAI,QAAQ,MAAM,QAAQ,CAAC,CAAC;AAAA,EACrC;AACA,MAAI,QAAQ,OAAO,OAAO,MAAM;AAC9B,WAAO,IAAI,QAAQ,OAAO,MAAM,QAAQ,CAAC,CAAC;AAAA,EAC5C;AACA,SAAO,IAAI,QAAQ,OAAO,OAAO,MAAM,QAAQ,CAAC,CAAC;AACnD;;;AD/IA,IAAM,iBAAiB,QAAQ,UAAU;AA8BzC,SAAS,YAAyB;AAChC,MAAI,QAAQ,aAAa,UAAU;AACjC,UAAM,IAAI,MAAM,2CAA2C;AAAA,EAC7D;AAE
A,MAAI;AACF,WAAO,eAAe,aAAa;AAAA,EACrC,SAAS,OAAO;AACd,UAAM,UAAU,iBAAiB,QAAQ,MAAM,UAAU,OAAO,KAAK;AACrE,UAAM,IAAI,MAAM,4CAA4C,OAAO,EAAE;AAAA,EACvE;AACF;AAIA,IAAI,QAA4B;AAChC,IAAI,YAA0B;AAE9B,SAAS,WAAwB;AAC/B,MAAI,CAAC,OAAO;AACV,QAAI;AACF,cAAQ,UAAU;AAAA,IACpB,SAAS,OAAO;AAEd,kBAAY,iBAAiB,QAAQ,QAAQ,IAAI,MAAM,OAAO,KAAK,CAAC;AACpE,YAAM;AAAA,IACR;AAAA,EACF;AACA,SAAO;AACT;AAKO,SAAS,cAAuB;AACrC,SAAO,QAAQ,aAAa,YAAY,QAAQ,SAAS;AAC3D;AAKO,SAAS,eAA6B;AAC3C,SAAO;AACT;AA4DO,IAAM,mBAAN,MAAuB;AAAA,EACpB;AAAA,EACA,cAAc;AAAA,EAEtB,YAAY,SAA4B;AACtC,SAAK,UAAU;AAAA,EACjB;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,EAQA,aAA4B;AAC1B,QAAI,KAAK,aAAa;AACpB,aAAO,QAAQ,QAAQ;AAAA,IACzB;AAEA,UAAM,cAAc,SAAS;AAC7B,UAAM,UAAU,YAAY,WAAW;AAAA,MACrC,WAAW,KAAK,QAAQ;AAAA,MACxB,UAAU,KAAK,QAAQ,YAAY;AAAA,MACnC,SAAS,KAAK,QAAQ,WAAW;AAAA,IACnC,CAAC;AAED,QAAI,CAAC,SAAS;AACZ,aAAO,QAAQ,OAAO,IAAI,MAAM,qCAAqC,CAAC;AAAA,IACxE;AAEA,SAAK,cAAc;AACnB,WAAO,QAAQ,QAAQ;AAAA,EACzB;AAAA;AAAA;AAAA;AAAA,EAKA,UAAmB;AACjB,QAAI,CAAC,KAAK,aAAa;AACrB,aAAO;AAAA,IACT;AACA,QAAI;AACF,aAAO,SAAS,EAAE,cAAc;AAAA,IAClC,QAAQ;AACN,aAAO;AAAA,IACT;AAAA,EACF;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,EASA,WAAW,SAAuB,aAAa,MAAqC;AAClF,QAAI,CAAC,KAAK,aAAa;AACrB,aAAO,QAAQ,OAAO,IAAI,MAAM,0DAA0D,CAAC;AAAA,IAC7F;AAEA,UAAM,SAAS,SAAS,EAAE,WAAW,SAAS,UAAU;AAExD,WAAO,QAAQ,QAAQ;AAAA,MACrB,MAAM,OAAO;AAAA,MACb,UAAU,OAAO;AAAA,MACjB,YAAY,OAAO;AAAA,MACnB,UAAU,OAAO;AAAA,IACnB,CAAC;AAAA,EACH;AAAA;AAAA;AAAA;AAAA,EAKA,UAAgB;AACd,QAAI,KAAK,aAAa;AACpB,UAAI;AACF,iBAAS,EAAE,QAAQ;AAAA,MACrB,QAAQ;AAAA,MAER;AACA,WAAK,cAAc;AAAA,IACrB;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKA,aAAiE;AAC/D,WAAO,SAAS,EAAE,WAAW;AAAA,EAC/B;AAAA;AAGF;","names":[]}
package/dist/index.d.cts CHANGED
@@ -101,8 +101,6 @@ interface WhisperAsrOptions {
     modelPath: string;
     /** Language code (e.g., "en", "de", "fr") or "auto" for auto-detection */
     language?: string;
-    /** Translate to English (default: false) */
-    translate?: boolean;
     /** Number of threads (0 = auto) */
     threads?: number;
 }
package/dist/index.d.ts CHANGED
@@ -101,8 +101,6 @@ interface WhisperAsrOptions {
    modelPath: string;
    /** Language code (e.g., "en", "de", "fr") or "auto" for auto-detection */
    language?: string;
-   /** Translate to English (default: false) */
-   translate?: boolean;
    /** Number of threads (0 = auto) */
    threads?: number;
 }
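
The declaration diffs above capture the 1.x breaking change: `WhisperAsrOptions` no longer accepts a `translate` flag, and the compiled `initialize` call drops it as well. Below is a minimal sketch of driving the engine against the 1.x typings, using only exports visible in the bundled sources (`isAvailable`, `getModelPath`, `WhisperAsrEngine`); the one-second buffer of silence is a stand-in for real 16 kHz mono audio.

```typescript
import { isAvailable, getModelPath, WhisperAsrEngine } from "whisper-coreml"

async function run(): Promise<void> {
  // whisper-coreml only loads on Apple Silicon macOS
  if (!isAvailable()) {
    throw new Error("whisper-coreml requires macOS on arm64")
  }

  // 1.x options: modelPath plus optional language/threads – no `translate` field anymore
  const engine = new WhisperAsrEngine({
    modelPath: getModelPath(),
    language: "auto",
    threads: 0
  })

  await engine.initialize()

  // Placeholder input: one second of silence (16 kHz mono Float32 samples)
  const samples = new Float32Array(16000)
  const result = await engine.transcribe(samples, 16000)
  console.log(result.language, result.text)

  engine.cleanup()
}

run().catch(console.error)
```

Callers that previously passed `translate: true` will now get a type error; translation to English is no longer part of the public options surface in this release.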
package/dist/index.js CHANGED
@@ -8,7 +8,7 @@ import {
   getModelPath,
   isAvailable,
   isModelDownloaded
-} from "./chunk-MOQMN4DX.js";
+} from "./chunk-V34ZDICO.js";
 export {
   WHISPER_MODEL,
   WhisperAsrEngine,
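
The ESM entry continues to re-export the download utilities next to the engine (see the `export` block above and the embedded `download.ts` source). As a rough sketch of that API, fetching the ~1.5 GB `large-v3-turbo` weights with a progress callback could look like the following; `downloadModel`, `isModelDownloaded`, and the `onProgress` payload fields come from the typings shipped in this version, while the logging is purely illustrative.

```typescript
import { downloadModel, isModelDownloaded, WHISPER_MODEL } from "whisper-coreml"

async function ensureModel(): Promise<string> {
  if (isModelDownloaded()) {
    // downloadModel() resolves immediately with the existing path when the file is present
    return downloadModel()
  }

  // Downloads ggml-large-v3-turbo.bin (~1.5 GB) into ~/.cache/whisper-coreml/models by default
  return downloadModel({
    onProgress: ({ percent, downloadedBytes, totalBytes }) => {
      console.log(`${WHISPER_MODEL.name}: ${percent}% (${downloadedBytes}/${totalBytes} bytes)`)
    }
  })
}

ensureModel().catch(console.error)
```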
package/package.json CHANGED
@@ -1,6 +1,6 @@
 {
   "name": "whisper-coreml",
-  "version": "0.2.0",
+  "version": "1.0.1",
   "description": "OpenAI Whisper ASR for Node.js with CoreML/ANE acceleration on Apple Silicon",
   "type": "module",
   "main": "./dist/index.cjs",
@@ -1 +0,0 @@
- {"version":3,"sources":["../src/download.ts","../src/index.ts"],"sourcesContent":["/**\n * Model download functionality for whisper-coreml\n *\n * Note: We only support large-v3-turbo as it's the only Whisper model\n * that offers better quality than Parakeet while maintaining reasonable speed.\n */\n\nimport { existsSync, mkdirSync, writeFileSync, rmSync } from \"node:fs\"\nimport { homedir } from \"node:os\"\nimport { join, dirname } from \"node:path\"\n\n/**\n * Whisper large-v3-turbo model info\n * This is the only model we support as it offers the best speed/quality ratio\n * and is the main reason to choose Whisper over Parakeet.\n */\nexport const WHISPER_MODEL = {\n name: \"large-v3-turbo\",\n size: \"1.5 GB\",\n languages: \"99 languages\",\n url: \"https://huggingface.co/ggerganov/whisper.cpp/resolve/main/ggml-large-v3-turbo.bin\"\n} as const\n\n/**\n * Default model directory in user's cache\n */\nexport function getDefaultModelDir(): string {\n return join(homedir(), \".cache\", \"whisper-coreml\", \"models\")\n}\n\n/**\n * Get the path to the model\n */\nexport function getModelPath(modelDir?: string): string {\n const dir = modelDir ?? getDefaultModelDir()\n return join(dir, `ggml-${WHISPER_MODEL.name}.bin`)\n}\n\n/**\n * Check if the model is downloaded\n */\nexport function isModelDownloaded(modelDir?: string): boolean {\n const modelPath = getModelPath(modelDir)\n return existsSync(modelPath)\n}\n\ninterface DownloadProgress {\n downloadedBytes: number\n totalBytes: number\n percent: number\n}\n\nexport interface DownloadOptions {\n /** Target directory for model (default: ~/.cache/whisper-coreml/models) */\n modelDir?: string\n\n /** Progress callback */\n onProgress?: (progress: DownloadProgress) => void\n\n /** Force re-download even if model exists */\n force?: boolean\n}\n\n/* v8 ignore start - network I/O */\n\n/**\n * Download the Whisper large-v3-turbo model from Hugging Face\n */\nexport async function downloadModel(options: DownloadOptions = {}): Promise<string> {\n const modelDir = options.modelDir ?? getDefaultModelDir()\n const modelPath = getModelPath(modelDir)\n\n if (!options.force && existsSync(modelPath)) {\n return modelPath\n }\n\n // Clean up partial downloads\n if (existsSync(modelPath)) {\n rmSync(modelPath)\n }\n\n mkdirSync(dirname(modelPath), { recursive: true })\n\n console.log(`Downloading Whisper ${WHISPER_MODEL.name} (${WHISPER_MODEL.size})...`)\n console.log(`Source: ${WHISPER_MODEL.url}`)\n console.log(`Target: ${modelPath}`)\n\n const response = await fetch(WHISPER_MODEL.url)\n if (!response.ok) {\n throw new Error(`Failed to download model: ${response.statusText}`)\n }\n\n const contentLength = response.headers.get(\"content-length\")\n const totalBytes = contentLength ? parseInt(contentLength, 10) : 0\n\n const reader = response.body?.getReader()\n if (!reader) {\n throw new Error(\"Failed to get response body reader\")\n }\n\n const chunks: Uint8Array[] = []\n let downloadedBytes = 0\n\n // eslint-disable-next-line @typescript-eslint/no-unnecessary-condition\n while (true) {\n const result = await reader.read()\n if (result.done) {\n break\n }\n\n const chunk = result.value as Uint8Array\n chunks.push(chunk)\n downloadedBytes += chunk.length\n\n const percent = totalBytes > 0 ? 
Math.round((downloadedBytes / totalBytes) * 100) : 0\n\n if (options.onProgress) {\n options.onProgress({\n downloadedBytes,\n totalBytes,\n percent\n })\n }\n\n // Progress indicator\n process.stdout.write(\n `\\rProgress: ${String(percent)}% (${formatBytes(downloadedBytes)}/${formatBytes(totalBytes)})`\n )\n }\n\n // Combine chunks and write to file\n const buffer = Buffer.concat(chunks)\n writeFileSync(modelPath, buffer)\n\n console.log(\"\\n✓ Model downloaded successfully!\")\n return modelPath\n}\n\n/* v8 ignore stop */\n\n/**\n * Format bytes to human readable string\n * @internal Exported for testing\n */\nexport function formatBytes(bytes: number): string {\n if (bytes < 1024) {\n return `${String(bytes)} B`\n }\n if (bytes < 1024 * 1024) {\n return `${(bytes / 1024).toFixed(1)} KB`\n }\n if (bytes < 1024 * 1024 * 1024) {\n return `${(bytes / 1024 / 1024).toFixed(1)} MB`\n }\n return `${(bytes / 1024 / 1024 / 1024).toFixed(2)} GB`\n}\n","/**\n * whisper-coreml\n *\n * OpenAI Whisper ASR for Node.js with CoreML/ANE acceleration on Apple Silicon.\n * Based on whisper.cpp with Apple Neural Engine support.\n *\n * Uses the large-v3-turbo model exclusively, as it offers the best speed/quality\n * ratio and is the main reason to choose Whisper over Parakeet.\n */\n\n// Dynamic require for loading native addon (works in both ESM and CJS)\n// eslint-disable-next-line @typescript-eslint/no-require-imports\nconst bindingsModule = require(\"bindings\") as (name: string) => unknown\n\n/**\n * Native addon interface\n */\ninterface NativeAddon {\n initialize(options: {\n modelPath: string\n language?: string\n translate?: boolean\n threads?: number\n }): boolean\n isInitialized(): boolean\n transcribe(samples: Float32Array, sampleRate: number): NativeTranscriptionResult\n cleanup(): void\n getVersion(): { addon: string; whisper: string; coreml: string }\n}\n\ninterface NativeTranscriptionResult {\n text: string\n language: string\n durationMs: number\n segments: {\n startMs: number\n endMs: number\n text: string\n confidence: number\n }[]\n}\n\n/* v8 ignore start - platform checks and native addon loading */\n\n/**\n * Load the native addon\n */\nfunction loadAddon(): NativeAddon {\n if (process.platform !== \"darwin\") {\n throw new Error(\"whisper-coreml is only supported on macOS\")\n }\n\n try {\n return bindingsModule(\"whisper_asr\") as NativeAddon\n } catch (error) {\n const message = error instanceof Error ? error.message : String(error)\n throw new Error(`Failed to load Whisper ASR native addon: ${message}`)\n }\n}\n\n/* v8 ignore stop */\n\nlet addon: NativeAddon | null = null\nlet loadError: Error | null = null\n\nfunction getAddon(): NativeAddon {\n if (!addon) {\n try {\n addon = loadAddon()\n } catch (error) {\n loadError = error instanceof Error ? 
error : new Error(String(error))\n throw error\n }\n }\n return addon\n}\n\n/**\n * Check if Whisper ASR is available on this platform\n */\nexport function isAvailable(): boolean {\n return process.platform === \"darwin\" && process.arch === \"arm64\"\n}\n\n/**\n * Get the load error if the addon failed to load\n */\nexport function getLoadError(): Error | null {\n return loadError\n}\n\n/**\n * Transcription segment with timestamps\n */\nexport interface TranscriptionSegment {\n /** Start time in milliseconds */\n startMs: number\n /** End time in milliseconds */\n endMs: number\n /** Transcribed text for this segment */\n text: string\n /** Confidence score (0-1) */\n confidence: number\n}\n\n/**\n * Transcription result\n */\nexport interface TranscriptionResult {\n /** Full transcribed text */\n text: string\n /** Detected or specified language (ISO code) */\n language: string\n /** Processing time in milliseconds */\n durationMs: number\n /** Individual segments with timestamps */\n segments: TranscriptionSegment[]\n}\n\n/**\n * Whisper ASR engine options\n */\nexport interface WhisperAsrOptions {\n /** Path to the Whisper model file (ggml format) */\n modelPath: string\n /** Language code (e.g., \"en\", \"de\", \"fr\") or \"auto\" for auto-detection */\n language?: string\n /** Translate to English (default: false) */\n translate?: boolean\n /** Number of threads (0 = auto) */\n threads?: number\n}\n\n/**\n * Whisper ASR Engine with CoreML acceleration\n *\n * Uses the large-v3-turbo model for best speed/quality balance.\n *\n * @example\n * ```typescript\n * import { WhisperAsrEngine, getModelPath } from \"whisper-coreml\"\n *\n * const engine = new WhisperAsrEngine({\n * modelPath: getModelPath()\n * })\n *\n * await engine.initialize()\n * const result = await engine.transcribe(audioSamples, 16000)\n * console.log(result.text)\n * ```\n */\nexport class WhisperAsrEngine {\n private options: WhisperAsrOptions\n private initialized = false\n\n constructor(options: WhisperAsrOptions) {\n this.options = options\n }\n\n /* v8 ignore start - native addon calls, tested via E2E */\n\n /**\n * Initialize the Whisper engine\n * This loads the model into memory - may take a few seconds.\n */\n initialize(): Promise<void> {\n if (this.initialized) {\n return Promise.resolve()\n }\n\n const nativeAddon = getAddon()\n const success = nativeAddon.initialize({\n modelPath: this.options.modelPath,\n language: this.options.language ?? \"auto\",\n translate: this.options.translate ?? false,\n threads: this.options.threads ?? 0\n })\n\n if (!success) {\n return Promise.reject(new Error(\"Failed to initialize Whisper engine\"))\n }\n\n this.initialized = true\n return Promise.resolve()\n }\n\n /**\n * Check if the engine is ready for transcription\n */\n isReady(): boolean {\n if (!this.initialized) {\n return false\n }\n try {\n return getAddon().isInitialized()\n } catch {\n return false\n }\n }\n\n /**\n * Transcribe audio samples\n *\n * @param samples - Float32Array of audio samples (mono, 16kHz)\n * @param sampleRate - Sample rate in Hz (default: 16000)\n * @returns Transcription result with text and segments\n */\n transcribe(samples: Float32Array, sampleRate = 16000): Promise<TranscriptionResult> {\n if (!this.initialized) {\n return Promise.reject(new Error(\"Whisper engine not initialized. 
Call initialize() first.\"))\n }\n\n const result = getAddon().transcribe(samples, sampleRate)\n\n return Promise.resolve({\n text: result.text,\n language: result.language,\n durationMs: result.durationMs,\n segments: result.segments\n })\n }\n\n /**\n * Clean up resources and unload the model\n */\n cleanup(): void {\n if (this.initialized) {\n try {\n getAddon().cleanup()\n } catch {\n // Ignore cleanup errors\n }\n this.initialized = false\n }\n }\n\n /**\n * Get version information\n */\n getVersion(): { addon: string; whisper: string; coreml: string } {\n return getAddon().getVersion()\n }\n\n /* v8 ignore stop */\n}\n\n// Re-export download utilities\nexport {\n downloadModel,\n formatBytes,\n getDefaultModelDir,\n getModelPath,\n isModelDownloaded,\n WHISPER_MODEL,\n type DownloadOptions\n} from \"./download.js\"\n"],"mappings":";;;;;;;;AAOA,SAAS,YAAY,WAAW,eAAe,cAAc;AAC7D,SAAS,eAAe;AACxB,SAAS,MAAM,eAAe;AAOvB,IAAM,gBAAgB;AAAA,EAC3B,MAAM;AAAA,EACN,MAAM;AAAA,EACN,WAAW;AAAA,EACX,KAAK;AACP;AAKO,SAAS,qBAA6B;AAC3C,SAAO,KAAK,QAAQ,GAAG,UAAU,kBAAkB,QAAQ;AAC7D;AAKO,SAAS,aAAa,UAA2B;AACtD,QAAM,MAAM,YAAY,mBAAmB;AAC3C,SAAO,KAAK,KAAK,QAAQ,cAAc,IAAI,MAAM;AACnD;AAKO,SAAS,kBAAkB,UAA4B;AAC5D,QAAM,YAAY,aAAa,QAAQ;AACvC,SAAO,WAAW,SAAS;AAC7B;AAwBA,eAAsB,cAAc,UAA2B,CAAC,GAAoB;AAClF,QAAM,WAAW,QAAQ,YAAY,mBAAmB;AACxD,QAAM,YAAY,aAAa,QAAQ;AAEvC,MAAI,CAAC,QAAQ,SAAS,WAAW,SAAS,GAAG;AAC3C,WAAO;AAAA,EACT;AAGA,MAAI,WAAW,SAAS,GAAG;AACzB,WAAO,SAAS;AAAA,EAClB;AAEA,YAAU,QAAQ,SAAS,GAAG,EAAE,WAAW,KAAK,CAAC;AAEjD,UAAQ,IAAI,uBAAuB,cAAc,IAAI,KAAK,cAAc,IAAI,MAAM;AAClF,UAAQ,IAAI,WAAW,cAAc,GAAG,EAAE;AAC1C,UAAQ,IAAI,WAAW,SAAS,EAAE;AAElC,QAAM,WAAW,MAAM,MAAM,cAAc,GAAG;AAC9C,MAAI,CAAC,SAAS,IAAI;AAChB,UAAM,IAAI,MAAM,6BAA6B,SAAS,UAAU,EAAE;AAAA,EACpE;AAEA,QAAM,gBAAgB,SAAS,QAAQ,IAAI,gBAAgB;AAC3D,QAAM,aAAa,gBAAgB,SAAS,eAAe,EAAE,IAAI;AAEjE,QAAM,SAAS,SAAS,MAAM,UAAU;AACxC,MAAI,CAAC,QAAQ;AACX,UAAM,IAAI,MAAM,oCAAoC;AAAA,EACtD;AAEA,QAAM,SAAuB,CAAC;AAC9B,MAAI,kBAAkB;AAGtB,SAAO,MAAM;AACX,UAAM,SAAS,MAAM,OAAO,KAAK;AACjC,QAAI,OAAO,MAAM;AACf;AAAA,IACF;AAEA,UAAM,QAAQ,OAAO;AACrB,WAAO,KAAK,KAAK;AACjB,uBAAmB,MAAM;AAEzB,UAAM,UAAU,aAAa,IAAI,KAAK,MAAO,kBAAkB,aAAc,GAAG,IAAI;AAEpF,QAAI,QAAQ,YAAY;AACtB,cAAQ,WAAW;AAAA,QACjB;AAAA,QACA;AAAA,QACA;AAAA,MACF,CAAC;AAAA,IACH;AAGA,YAAQ,OAAO;AAAA,MACb,eAAe,OAAO,OAAO,CAAC,MAAM,YAAY,eAAe,CAAC,IAAI,YAAY,UAAU,CAAC;AAAA,IAC7F;AAAA,EACF;AAGA,QAAM,SAAS,OAAO,OAAO,MAAM;AACnC,gBAAc,WAAW,MAAM;AAE/B,UAAQ,IAAI,yCAAoC;AAChD,SAAO;AACT;AAQO,SAAS,YAAY,OAAuB;AACjD,MAAI,QAAQ,MAAM;AAChB,WAAO,GAAG,OAAO,KAAK,CAAC;AAAA,EACzB;AACA,MAAI,QAAQ,OAAO,MAAM;AACvB,WAAO,IAAI,QAAQ,MAAM,QAAQ,CAAC,CAAC;AAAA,EACrC;AACA,MAAI,QAAQ,OAAO,OAAO,MAAM;AAC9B,WAAO,IAAI,QAAQ,OAAO,MAAM,QAAQ,CAAC,CAAC;AAAA,EAC5C;AACA,SAAO,IAAI,QAAQ,OAAO,OAAO,MAAM,QAAQ,CAAC,CAAC;AACnD;;;AC/IA,IAAM,iBAAiB,UAAQ,UAAU;AAmCzC,SAAS,YAAyB;AAChC,MAAI,QAAQ,aAAa,UAAU;AACjC,UAAM,IAAI,MAAM,2CAA2C;AAAA,EAC7D;AAEA,MAAI;AACF,WAAO,eAAe,aAAa;AAAA,EACrC,SAAS,OAAO;AACd,UAAM,UAAU,iBAAiB,QAAQ,MAAM,UAAU,OAAO,KAAK;AACrE,UAAM,IAAI,MAAM,4CAA4C,OAAO,EAAE;AAAA,EACvE;AACF;AAIA,IAAI,QAA4B;AAChC,IAAI,YAA0B;AAE9B,SAAS,WAAwB;AAC/B,MAAI,CAAC,OAAO;AACV,QAAI;AACF,cAAQ,UAAU;AAAA,IACpB,SAAS,OAAO;AACd,kBAAY,iBAAiB,QAAQ,QAAQ,IAAI,MAAM,OAAO,KAAK,CAAC;AACpE,YAAM;AAAA,IACR;AAAA,EACF;AACA,SAAO;AACT;AAKO,SAAS,cAAuB;AACrC,SAAO,QAAQ,aAAa,YAAY,QAAQ,SAAS;AAC3D;AAKO,SAAS,eAA6B;AAC3C,SAAO;AACT;AA8DO,IAAM,mBAAN,MAAuB;AAAA,EACpB;AAAA,EACA,cAAc;AAAA,EAEtB,YAAY,SAA4B;AACtC,SAAK,UAAU;AAAA,EACjB;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,EAQA,aAA4B;AAC1B,QAAI,KAAK,aAAa;AACpB,aAAO,QAAQ,QAAQ;AAAA,IACzB;AAEA,UAAM,cAAc,SAAS;AAC7B,UAAM,UAAU,YAAY,WAAW;AAAA,MAC
rC,WAAW,KAAK,QAAQ;AAAA,MACxB,UAAU,KAAK,QAAQ,YAAY;AAAA,MACnC,WAAW,KAAK,QAAQ,aAAa;AAAA,MACrC,SAAS,KAAK,QAAQ,WAAW;AAAA,IACnC,CAAC;AAED,QAAI,CAAC,SAAS;AACZ,aAAO,QAAQ,OAAO,IAAI,MAAM,qCAAqC,CAAC;AAAA,IACxE;AAEA,SAAK,cAAc;AACnB,WAAO,QAAQ,QAAQ;AAAA,EACzB;AAAA;AAAA;AAAA;AAAA,EAKA,UAAmB;AACjB,QAAI,CAAC,KAAK,aAAa;AACrB,aAAO;AAAA,IACT;AACA,QAAI;AACF,aAAO,SAAS,EAAE,cAAc;AAAA,IAClC,QAAQ;AACN,aAAO;AAAA,IACT;AAAA,EACF;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA,EASA,WAAW,SAAuB,aAAa,MAAqC;AAClF,QAAI,CAAC,KAAK,aAAa;AACrB,aAAO,QAAQ,OAAO,IAAI,MAAM,0DAA0D,CAAC;AAAA,IAC7F;AAEA,UAAM,SAAS,SAAS,EAAE,WAAW,SAAS,UAAU;AAExD,WAAO,QAAQ,QAAQ;AAAA,MACrB,MAAM,OAAO;AAAA,MACb,UAAU,OAAO;AAAA,MACjB,YAAY,OAAO;AAAA,MACnB,UAAU,OAAO;AAAA,IACnB,CAAC;AAAA,EACH;AAAA;AAAA;AAAA;AAAA,EAKA,UAAgB;AACd,QAAI,KAAK,aAAa;AACpB,UAAI;AACF,iBAAS,EAAE,QAAQ;AAAA,MACrB,QAAQ;AAAA,MAER;AACA,WAAK,cAAc;AAAA,IACrB;AAAA,EACF;AAAA;AAAA;AAAA;AAAA,EAKA,aAAiE;AAC/D,WAAO,SAAS,EAAE,WAAW;AAAA,EAC/B;AAAA;AAGF;","names":[]}