@agentick/apple 0.8.0 → 0.9.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,13 +1,14 @@
  # @agentick/apple

- Apple Foundation Models adapter for Agentick — on-device inference with macOS 26+.
+ Apple on-device AI for Agentick — inference and embeddings via Foundation Models and NaturalLanguage, running entirely on your machine.

  ## Features

  - **On-device inference** — No API keys, no external requests, zero cost
+ - **On-device embeddings** — 512-dimensional vector embeddings via `NLContextualEmbedding`
  - **Privacy-first** — All processing happens locally with Apple Intelligence
- - **Structured output** — Native support for JSON schema-constrained generation via `DynamicGenerationSchema`
- - **Streaming support** — Real-time token-by-token responses
+ - **Structured output** — JSON schema-constrained generation via `DynamicGenerationSchema`
+ - **Streaming** — Real-time token-by-token responses
  - **Auto-compiled binary** — Swift bridge compiles automatically on install

  ## Requirements
@@ -24,11 +25,11 @@ npm install @agentick/apple
  pnpm add @agentick/apple
  ```

- The postinstall script automatically compiles the Swift bridge binary. If compilation fails (e.g., on non-macOS or without Xcode), the package still installs but won't be functional until the binary is available.
+ The postinstall script compiles the Swift bridge binary. If compilation fails (e.g., on non-macOS or without Xcode), the package still installs but won't be functional until the binary is available.

  ## Quick Start

- ### Simple Text Generation
+ ### Text Generation

  ```typescript
  import { apple } from '@agentick/apple';
@@ -46,6 +47,27 @@ const session = app.createSession();
  const result = await session.send({ messages: [{ role: 'user', content: 'Hello!' }] });
  ```

+ ### Embeddings
+
+ ```typescript
+ import { appleEmbedding } from "@agentick/apple";
+
+ const embed = appleEmbedding();
+
+ // Single text
+ const { embeddings, dimensions } = await embed("Hello world");
+ console.log(dimensions); // 512
+ console.log(embeddings[0].length); // 512
+
+ // Batch
+ const { embeddings } = await embed([
+   "machine learning and AI",
+   "deep neural networks",
+   "the cat sat on the mat",
+ ]);
+ // embeddings → number[3][512]
+ ```
+
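The vectors returned above are plain `number[]` arrays, so similarity search needs no extra dependencies. A minimal cosine-similarity sketch (a hypothetical helper: the package does not export one, and the 0.9.1 roadmap lists such utilities as future work):

```typescript
// Cosine similarity between two embedding vectors.
// Hypothetical helper — not exported by @agentick/apple.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Tiny self-contained demo with stand-in vectors:
const a = [0.1, 0.9, 0.4];
const b = [0.2, 0.8, 0.5];
console.log(cosineSimilarity(a, b)); // high similarity, close to 1
```

With the batch snippet above, one would expect the two machine-learning sentences to score higher against each other than against the unrelated third sentence.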
  ### Structured Output

  ```typescript
@@ -68,9 +90,7 @@ const result = await session.send({
  },
  });

- // Result is guaranteed valid JSON matching the schema
  const recipe = JSON.parse(result.message.content[0].text);
- console.log(recipe.title); // "Spaghetti Carbonara"
  ```

  ### JSX Component
@@ -87,197 +107,151 @@ const Agent = () => (
  );
  ```

- ### Direct Execution
-
- You can also use the model directly without creating a session:
-
- ```typescript
- import { apple } from "@agentick/apple";
-
- const model = apple();
-
- // Simple generation
- const result = await model.execute({
-   messages: [{ role: "user", content: "Hello!" }],
-   stream: false,
- });
- console.log(result.message.content[0].text);
-
- // Structured output
- const structuredResult = await model.execute({
-   messages: [{ role: "user", content: "Generate a person profile" }],
-   responseFormat: {
-     type: "json_schema",
-     schema: {
-       type: "object",
-       properties: {
-         name: { type: "string", description: "Full name" },
-         age: { type: "integer", description: "Age in years" },
-       },
-     },
-   },
-   stream: false,
- });
-
- const person = JSON.parse(structuredResult.message.content[0].text);
- console.log(person); // { name: "...", age: 30 }
- ```
-
  ## API

  ### `apple(config?)`

- Factory function that returns a `ModelClass` for use with `createApp`, as a JSX component, or for direct execution.
-
- **Config:**
+ Factory function returning a `ModelClass` for text generation.

- - `bridgePath?: string` Path to Swift bridge binary (defaults to auto-compiled binary)
- - `model?: string` Model identifier (defaults to `"apple-foundation-3b"`)
+ | Option       | Type     | Default                 | Description                 |
+ | ------------ | -------- | ----------------------- | --------------------------- |
+ | `bridgePath` | `string` | auto-detected           | Path to Swift bridge binary |
+ | `model`      | `string` | `"apple-foundation-3b"` | Model identifier            |

- **Returns:** `ModelClass` with:
+ Returns a `ModelClass` usable with `createApp`, as JSX, or for direct execution.

- - `.execute(input)` — Direct generation (returns `Promise<ModelOutput>`)
- - `.executeStream(input)` — Streaming generation (returns `AsyncIterable<AdapterDelta>`)
- - Use as JSX: `<model>...</model>`
-
- **Example:**
+ ### `AppleModel`

- ```typescript
- const model = apple();
+ JSX component wrapping `apple()` for declarative model configuration. Accepts the same props as `apple()`.

- // Direct execution
- await model.execute({ messages: [...], stream: false });
+ ### `appleEmbedding(config?)`

- // Use with createApp
- createApp(Agent, { model });
+ Factory function returning a callable embedding function.

- // Use as JSX component
- <model><Agent /></model>
- ```
+ | Option       | Type              | Default       | Description                                          |
+ | ------------ | ----------------- | ------------- | ---------------------------------------------------- |
+ | `bridgePath` | `string`          | auto-detected | Path to Swift bridge binary                          |
+ | `script`     | `EmbeddingScript` | `"latin"`     | Script model to load (see below)                     |
+ | `language`   | `string`          | —             | BCP-47 code (e.g. `"en"`, `"fr"`) for better results |

- ### `AppleModel`
+ Returns an `AppleEmbeddingFunction`:

- JSX component for declarative model configuration.
+ ```typescript
+ const embed = appleEmbedding({ script: "latin" });

- **Props:**
+ // Call with a single string or array
+ const result = await embed("Hello world");
+ const batch = await embed(["Hello", "World"]);

- - `bridgePath?: string`
- - `model?: string`
+ // Result shape
+ result.embeddings; // number[][] — one vector per input text
+ result.dimensions; // number — vector dimensionality (512)
+ result.model;      // "apple-contextual-embedding"
+ result.script;     // "latin"
+ ```

 
167
- ## Model Capabilities
153
+ #### Script Models
168
154
 
169
- | Feature | Supported |
170
- | --------------------------------- | ---------------------- |
171
- | Text generation | ✅ |
172
- | Streaming | ✅ |
173
- | Structured output (`json_schema`) | ✅ |
174
- | Tool calling | ❌ (compile-time only) |
175
- | Vision/multimodal input | ❌ |
176
- | Context window | 4096 tokens |
155
+ Each script model covers a group of languages. You pick the script, not individual languages:
177
156
 
178
- ### Structured Output Details
157
+ | Script | Languages |
158
+ | ------------------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
159
+ | `"latin"` (default) | English, French, German, Spanish, Portuguese, Italian, Dutch, Swedish, Danish, Norwegian, Finnish, Polish, Czech, Hungarian, Romanian, Slovak, Croatian, Indonesian, Turkish, Vietnamese |
160
+ | `"cyrillic"` | Russian, Ukrainian, Bulgarian, Kazakh |
161
+ | `"cjk"` | Chinese, Japanese, Korean |
162
+ | `"indic"` | Hindi, Marathi, Bangla, Urdu, Punjabi, Gujarati, Tamil, Telugu, Kannada, Malayalam |
163
+ | `"thai"` | Thai |
164
+ | `"arabic"` | Arabic |
179
165
 
180
- The adapter uses Apple's `DynamicGenerationSchema` API to enforce schema constraints at generation time. Unlike LLM providers that generate free text and then validate, Apple Foundation Models **guarantee** the output matches your schema.
166
+ The optional `language` parameter (BCP-47 code like `"en"`, `"ja"`, `"ru"`) refines results when you know the input language.
181
167
 
182
- **Supported types:**
168
+ ## Capabilities
183
169
 
184
- - Primitives: `string`, `integer`, `number`, `boolean`
185
- - Nested objects (unlimited depth)
186
- - Arrays: Not yet supported in this bridge implementation
170
+ | Feature | Supported |
171
+ | --------------------------------- | --------------------------------------- |
172
+ | Text generation | Yes |
173
+ | Streaming | Yes |
174
+ | Structured output (`json_schema`) | Yes |
175
+ | On-device embeddings | Yes — 512-dim via NLContextualEmbedding |
176
+ | Tool calling | Not yet — see [Roadmap](#roadmap) |
177
+ | Vision/multimodal | No |
178
+ | Context window | 4096 tokens |
187
179
 
188
- **Example nested schema:**
180
+ ### Structured Output
189
181
 
190
- ```typescript
191
- const schema = {
192
- type: "object",
193
- properties: {
194
- person: {
195
- type: "object",
196
- properties: {
197
- name: { type: "string", description: "Full name" },
198
- age: { type: "integer", description: "Age in years" },
199
- },
200
- },
201
- summary: { type: "string", description: "Brief bio" },
202
- },
203
- };
182
+ Uses Apple's `DynamicGenerationSchema` to enforce constraints at generation time — the model **cannot** produce invalid output.
204
183
 
205
- const result = await session.send({
206
- messages: [{ role: "user", content: "Generate a person profile" }],
207
- responseFormat: { type: "json_schema", schema },
208
- });
209
- ```
184
+ Supported types: `string`, `integer`, `number`, `boolean`, nested objects. Arrays not yet supported in bridge.
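Until array support lands, one common workaround (an illustrative sketch, not an API of this package) is to model a small fixed-size list as numbered object properties, which stays within the supported primitive and nested-object types:

```typescript
// Workaround sketch: emulate a small fixed-size list with numbered object
// properties, since the bridge rejects `type: "array"` schemas.
// Illustrative only — the schema shape follows the README's examples.
const topThreeSchema = {
  type: "object",
  properties: {
    item1: { type: "string", description: "First item" },
    item2: { type: "string", description: "Second item" },
    item3: { type: "string", description: "Third item" },
  },
};

// Usable anywhere a `json_schema` responseFormat is accepted, e.g.:
// responseFormat: { type: "json_schema", schema: topThreeSchema }
console.log(Object.keys(topThreeSchema.properties)); // ["item1", "item2", "item3"]
```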
  ## Architecture

  ```
- Node.js (agentick adapter)
-   ↓ JSON via stdin
- Swift Bridge (inference.swift)
-   ↓ LanguageModelSession
- Apple Foundation Models (on-device)
-   ↓ stdout JSON/NDJSON
- Node.js (agentick adapter)
+ Node.js (agentick)
+
+ ├── Text generation ──▶ stdin JSON ──▶ Swift Bridge ──▶ FoundationModels
+ │                                          │                │
+ │            ◀── stdout JSON/NDJSON ───────┘
+ │
+ └── Embeddings ──▶ stdin JSON ──▶ Swift Bridge ──▶ NLContextualEmbedding
+                                       │                │
+          ◀── stdout JSON ─────────────┘
  ```

- The adapter spawns the Swift bridge as a child process and communicates via JSON over stdin/stdout. This design avoids FFI complexity and provides a stable interface despite Swift's evolving runtime.
+ Single Swift binary (`apple-fm-bridge`) handles both operations, routed by the `operation` field:

- ### Manual Compilation
+ - `"generate"` (default) — text generation via `LanguageModelSession`
+ - `"embed"` — vector embeddings via `NLContextualEmbedding`
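Given the routing described above, bridge requests are JSON objects carrying an `operation` field. A sketch of plausible payload shapes (the `operation` values come from this README; the other field names are illustrative assumptions, not a documented wire format):

```typescript
// Illustrative request payloads for the bridge's two operations.
// Only the `operation` values are documented; `messages` and `texts`
// are assumed field names for the sake of the example.
const generateRequest = {
  operation: "generate", // default: text generation via LanguageModelSession
  messages: [{ role: "user", content: "Hello!" }],
};

const embedRequest = {
  operation: "embed", // vector embeddings via NLContextualEmbedding
  texts: ["Hello world"],
};

// Each request would be written to the bridge's stdin as one JSON line.
const line = JSON.stringify(generateRequest);
```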

- If you need to recompile the bridge manually:
+ ### Manual Compilation

  ```bash
  cd node_modules/@agentick/apple
- swiftc -parse-as-library -framework FoundationModels -O inference.swift -o bin/apple-fm-bridge
+ swiftc -parse-as-library -framework FoundationModels -framework NaturalLanguage -O inference.swift -o bin/apple-fm-bridge
  ```

+ ## Roadmap
+
+ ### Tool Calling
+
+ Apple Foundation Models support tool calling via the `Tool` protocol — the model can autonomously call Swift functions and use results in its response. Our adapter currently doesn't support this because Apple's tool loop runs internally within `session.respond()`.
+
+ The path forward is a **bidirectional bridge protocol**: proxy `Tool` structs in Swift that write `tool_call` messages to stdout and read `tool_result` responses from stdin, letting agentick's tool executors handle execution while Apple's framework manages the model loop.
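On the Node side, the proposed protocol might be handled along these lines (a sketch under stated assumptions: the `tool_call`/`tool_result` message types come from the paragraph above, while the concrete field names and function signature are hypothetical):

```typescript
// Sketch of the Node side of the proposed bidirectional bridge protocol.
// `tool_call`/`tool_result` follow the roadmap text; the field names
// (id, name, args) and the `delta` variant are hypothetical.
type BridgeMessage =
  | { type: "tool_call"; id: string; name: string; args: unknown }
  | { type: "delta"; text: string };

async function handleBridgeLine(
  line: string,
  executeTool: (name: string, args: unknown) => Promise<unknown>,
): Promise<string | null> {
  const msg = JSON.parse(line) as BridgeMessage;
  if (msg.type === "tool_call") {
    // Run the tool in Node, then answer the Swift proxy Tool over stdin.
    const result = await executeTool(msg.name, msg.args);
    return JSON.stringify({ type: "tool_result", id: msg.id, result });
  }
  return null; // ordinary streaming delta — nothing to write back
}
```

The appeal of this shape is that Apple's framework keeps driving the model loop while agentick's existing tool executors do the actual work.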
+
+ ### Embedding Improvements
+
+ - Cosine similarity utility functions
+ - Batch performance optimization (keep model loaded across calls)
+ - Configurable pooling strategies (mean, CLS, max)
+
  ## Limitations

  - **macOS 26+ only** — Foundation Models framework isn't available on earlier versions
  - **Apple Intelligence required** — Model must be downloaded and enabled in System Settings
- - **No tool calling** — While the framework supports tools via the `@Generable` macro, they must be compile-time Swift types. Dynamic tool schemas from Node.js would require bidirectional IPC.
- - **No vision input** — The public `LanguageModelSession` API is text-only. Underlying models may support images, but it's not exposed.
- - **Limited context** — 4096 token window (confirmed empirically)
- - **Array schemas unsupported** — Current bridge doesn't implement array handling in `DynamicGenerationSchema` conversion
+ - **Limited context** — 4096 token window
+ - **No vision input** — `LanguageModelSession` API is text-only
+ - **Array schemas unsupported** — `DynamicGenerationSchema` doesn't support dynamic array generation

  ## Troubleshooting

  ### "Model not available" error

- Apple Intelligence must be enabled and the model downloaded:
-
  1. Open **System Settings** > **Apple Intelligence & Siri**
  2. Enable Apple Intelligence
  3. Wait for model download (may take several minutes)

  ### Compilation fails on install

- Ensure Xcode is installed:
-
  ```bash
  xcode-select --install
  ```

- Or download from the App Store / developer.apple.com.
+ ### "Embedding model assets not downloaded"

- ### Sandbox access issues
-
- If running in a restricted environment (e.g., Cursor sandbox), the process may not have access to the model:
-
- ```typescript
- // Use outside sandbox or request appropriate permissions
- const app = createApp(Agent, { model: apple() });
- ```
+ The NLContextualEmbedding model assets may need to be downloaded. Ensure Apple Intelligence is enabled and the device has internet access for the initial download.

  ### Guardrail violations

- Apple's on-device models include safety guardrails. Requests for harmful, illegal, or repetitive content may be rejected with:
-
- ```
- Model not available: guardrailViolation(...)
- ```
-
- This is expected behavior and cannot be disabled.
+ Apple's on-device models include safety guardrails. Requests for harmful or repetitive content may be rejected — this is expected and cannot be disabled.

  ## License