prompt-api-polyfill 1.16.0 → 1.18.0
This diff shows the content of publicly available package versions as released to a supported public registry. It is provided for informational purposes only and reflects the changes between the two versions as published.
- package/README.md +44 -1
- package/dist/backends/firebase.js +31 -1342
- package/dist/backends/gemini.js +1 -1
- package/dist/backends/openai.js +1 -1
- package/dist/backends/transformers.js +4 -4
- package/dist/backends/webllm.js +107 -0
- package/dist/chunks/{defaults-B5W7MP9T.js → defaults-DMD-8IKq.js} +2 -1
- package/dist/prompt-api-polyfill.js +8 -1
- package/package.json +11 -10
package/README.md
CHANGED
@@ -8,6 +8,7 @@ supporting dynamic backends:
 - **Google Gemini API** (cloud)
 - **OpenAI API** (cloud)
 - **Transformers.js** (local after initial model download, **default backend**)
+- **WebLLM** (local after initial model download)
 
 When loaded in the browser, it defines a global:
 
@@ -57,6 +58,13 @@ configured, the polyfill will use Transformers.js with the default model.
 - **Model**: Uses default if not specified (see
   [`backends/defaults.js`](backends/defaults.js)).
 
+### WebLLM (local after initial model download)
+
+- **Uses**: `@mlc-ai/web-llm` SDK (MLC engine, runs via WebGPU).
+- **Select by setting**: `window.WEBLLM_CONFIG`.
+- **Model**: Uses default if not specified (see
+  [`backends/defaults.js`](backends/defaults.js)).
+
 ---
 
 ## Installation
@@ -166,6 +174,28 @@ npm install prompt-api-polyfill
 </script>
 ```
 
+### Backed by WebLLM (local after initial model download)
+
+1. **Only a dummy API Key required** (runs locally in the browser via WebGPU).
+2. **Provide configuration** on `window.WEBLLM_CONFIG`.
+3. **Import the polyfill**.
+
+```html
+<script type="module">
+  // Set WEBLLM_CONFIG to select the WebLLM backend
+  window.WEBLLM_CONFIG = {
+    apiKey: 'dummy', // Required for now by the loader
+    modelName: 'Llama-3.2-3B-Instruct-q4f32_1-MLC', // Optional: override default
+  };
+
+  if (!('LanguageModel' in window)) {
+    await import('prompt-api-polyfill');
+  }
+
+  const session = await LanguageModel.create();
+</script>
+```
+
 ---
 
 ## Configuration
@@ -322,13 +352,22 @@ Then open `.env.json` and fill in the values.
 }
 ```
 
+**For WebLLM:**
+
+```json
+{
+  "apiKey": "dummy",
+  "modelName": "Llama-3.2-3B-Instruct-q4f32_1-MLC"
+}
+```
+
 ### Field-by-field explanation
 
 - `apiKey`:
   - **Firebase AI Logic**: Your Firebase Web API key.
   - **Gemini**: Your Gemini API Key.
   - **OpenAI**: Your OpenAI API Key.
-  - **Transformers.js**: Use `"dummy"`.
+  - **Transformers.js** / **WebLLM**: Use `"dummy"`.
 - `projectId` / `appId`: **Firebase AI Logic only**.
 - `geminiApiProvider`: **Firebase AI Logic only**. Either `"developer"` (Gemini
   Developer API, default) or `"vertex"` (Vertex AI).
@@ -344,6 +383,10 @@ Then open `.env.json` and fill in the values.
 - `env` (optional): **Transformers.js only**. A flexible object to override
   [Transformers.js environment variables](https://huggingface.co/docs/transformers.js/api/env).
   This is useful for specifying local `wasmPaths` or proxy settings.
+- **WebLLM** only requires `apiKey: "dummy"` and an optional `modelName`. Model
+  IDs must be valid
+  [MLC model identifiers](https://github.com/mlc-ai/web-llm?tab=readme-ov-file#list-available-models)
+  (e.g., `"Llama-3.2-3B-Instruct-q4f32_1-MLC"`).
 
 - `modelName` (optional): The model ID to use. If not provided, the polyfill
   uses the defaults defined in [`backends/defaults.js`](backends/defaults.js).