@agtlantis/core 0.4.1 (package/README.md)
# @agtlantis/core

> Higher-level abstractions for the Vercel AI SDK

## Overview

The [Vercel AI SDK](https://sdk.vercel.ai/) provides excellent primitives for LLM applications. @agtlantis/core raises the abstraction level, making common concerns such as provider switching, streaming, validation, and cost tracking easier to address.

## Features

- **Unified Provider Interface** — The same API for Google and OpenAI
- **Streaming Patterns** — Progressive, validation-aware streaming with an event-driven architecture
- **Observability Helpers** — Structured logging, metrics, and cost tracking
- **Validation with Retry** — Zod-based output validation with automatic retries
- **Cost Calculation** — Token-based pricing with customizable provider rates
- **Prompt Management** — File-based prompt repository with Handlebars templating

## Installation

Not yet published to npm. Use it from the monorepo:

```bash
git clone <repo-url>
cd agtlantis
pnpm install
pnpm build
```

## Quick Start

```typescript
import { createGoogleProvider } from '@agtlantis/core';

const provider = createGoogleProvider({
  apiKey: process.env.GOOGLE_AI_API_KEY,
});

const execution = provider.simpleExecution(async (session) => {
  return session.generateText({ prompt: 'Say hello briefly' });
});

const result = await execution.toResult();
console.log(result.text);
```

### Using OpenAI

Switch providers while keeping the same execution pattern:

```typescript
import { createOpenAIProvider } from '@agtlantis/core';

const provider = createOpenAIProvider({
  apiKey: process.env.OPENAI_API_KEY,
});

const execution = provider.simpleExecution(async (session) => {
  return session.generateText({ prompt: 'Say hello briefly' });
});

const result = await execution.toResult();
console.log(result.text);
```

### Streaming with Progressive Pattern

For structured streaming with progress events:

```typescript
import { z } from 'zod';
import { createGoogleProvider, defineProgressivePattern } from '@agtlantis/core';

const progressSchema = z.object({ step: z.string() });
const resultSchema = z.object({ answer: z.string() });

const pattern = defineProgressivePattern({ progressSchema, resultSchema });

const provider = createGoogleProvider({
  apiKey: process.env.GOOGLE_AI_API_KEY,
});

const execution = provider.streamingExecution(async function* (session) {
  yield* pattern.runInSession(session, {
    system: 'Report your thinking steps, then provide the final answer.',
    messages: [{ role: 'user', content: 'What is 2 + 2?' }],
  });
});

for await (const event of execution) {
  if (event.type === 'progress') {
    console.log('Step:', event.data.step);
  } else if (event.type === 'complete') {
    console.log('Answer:', event.data.answer);
  }
}
```

## Architecture

```
Your Application
        ↓
@agtlantis/core (Providers, Sessions, Patterns)
        ↓
Vercel AI SDK (generateText, streamText, tools)
        ↓
LLM APIs (Google, OpenAI, Anthropic)
```

## Modules

| Module | Description | Key Exports |
|--------|-------------|-------------|
| **Provider** | Unified LLM interface with session management | `createGoogleProvider`, `createOpenAIProvider` |
| **File Management** | File upload with caching support | `FileManager`, `FileCache`, `InMemoryFileCache` |
| **Patterns** | Reusable execution patterns for streaming | `defineProgressivePattern` |
| **Observability** | Structured logging and metrics collection | `createLogger`, `EventMetrics` |
| **Validation** | Output validation with automatic retries | `withValidation`, `ValidationHistory` |
| **Pricing** | Token cost calculation and tracking | `configurePricing`, `calculateCost` |
| **Prompt** | Prompt templating and repository management | `createFilePromptRepository` |

### Provider Module

A unified interface for different LLM providers:

```typescript
import { createGoogleProvider, createOpenAIProvider } from '@agtlantis/core';

const google = createGoogleProvider({ apiKey: process.env.GOOGLE_AI_API_KEY });
const openai = createOpenAIProvider({ apiKey: process.env.OPENAI_API_KEY });

// Override the default model
const flashProvider = google.withDefaultModel('gemini-2.0-flash-exp');
```

### File Management Module

Upload files with automatic caching to avoid redundant uploads:

```typescript
import { createGoogleProvider, InMemoryFileCache } from '@agtlantis/core';

// Create a provider with file caching (30-minute TTL)
const provider = createGoogleProvider({
  apiKey: process.env.GOOGLE_AI_API_KEY,
})
  .withDefaultModel('gemini-2.5-flash')
  .withFileCache(new InMemoryFileCache({ defaultTTL: 30 * 60 * 1000 }));

const execution = provider.simpleExecution(async (session) => {
  // Files are cached by content hash, so identical files won't be re-uploaded
  const uploaded = await session.fileManager.upload([
    { source: 'path', path: './document.pdf' },
  ]);

  return session.generateText({
    prompt: [
      { type: 'text', text: 'Summarize this document:' },
      uploaded[0].part,
    ],
  });
});
```

### Validation Module

Validate and retry LLM outputs with custom logic:

```typescript
import { createGoogleProvider, withValidation } from '@agtlantis/core';

const provider = createGoogleProvider({ apiKey: process.env.GOOGLE_AI_API_KEY });

const execution = provider.simpleExecution(async (session) => {
  return withValidation(
    async (history) => {
      const prompt = history.isRetry
        ? `Generate a number between 1-10. Previous attempt failed: ${history.failureReasons.join(', ')}`
        : 'Generate a number between 1 and 10';

      const result = await session.generateText({ prompt });
      return result.text;
    },
    {
      validate: (text) => {
        const num = parseInt(text, 10);
        if (isNaN(num) || num < 1 || num > 10) {
          return { valid: false, reason: `"${text}" is not a number between 1-10` };
        }
        return { valid: true };
      },
      maxAttempts: 3,
    },
  );
});

const result = await execution.toResult();
console.log(result); // a number between 1 and 10
```

### Pricing Module

Track costs across LLM calls:

```typescript
import { configurePricing, calculateCost } from '@agtlantis/core';

configurePricing({
  providers: {
    google: {
      'gemini-2.5-flash': { inputPricePerMillion: 0.5, outputPricePerMillion: 3.0 },
    },
  },
});

const cost = calculateCost({
  inputTokens: 1000,
  outputTokens: 500,
  model: 'gemini-2.5-flash',
  provider: 'google',
});

console.log(cost); // { inputCost: 0.0005, outputCost: 0.0015, totalCost: 0.002 }
```
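
The numbers above follow the standard per-million-token formula. A minimal sketch of the arithmetic, using a hypothetical helper rather than the package API:

```typescript
// Token-based pricing: cost = (tokens / 1,000,000) * pricePerMillion
function tokenCost(tokens: number, pricePerMillion: number): number {
  return (tokens / 1_000_000) * pricePerMillion;
}

const inputCost = tokenCost(1000, 0.5); // 1000 input tokens at $0.50 per million
const outputCost = tokenCost(500, 3.0); // 500 output tokens at $3.00 per million
const totalCost = inputCost + outputCost;

console.log({ inputCost, outputCost, totalCost });
```

This is why output-heavy workloads dominate cost here: the output rate ($3.00/M) is six times the input rate ($0.50/M).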

### Prompt Module

Manage prompts with file-based storage and Handlebars templating:

```typescript
import { createFilePromptRepository } from '@agtlantis/core';

const repo = createFilePromptRepository({ directory: './prompts' });

interface GreetingInput {
  name: string;
}

const prompt = await repo.read<GreetingInput>('greeting');
const userPrompt = prompt.renderUserPrompt({ name: 'World' });

console.log(prompt.system);
console.log(userPrompt);
```
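
The core of Handlebars templating, `{{variable}}` substitution, is simple to illustrate. The sketch below is a toy stand-in, not the real Handlebars engine or the package's renderer:

```typescript
// Toy renderer: replaces {{name}}-style placeholders with input values.
// Real Handlebars additionally supports helpers, conditionals, and iteration.
function renderTemplate(template: string, input: Record<string, string>): string {
  return template.replace(/\{\{\s*(\w+)\s*\}\}/g, (_, key: string) => input[key] ?? '');
}

const greeting = renderTemplate('Hello, {{name}}!', { name: 'World' });
console.log(greeting); // prints: Hello, World!
```

Typed inputs like `GreetingInput` above pair naturally with this: the type documents which placeholders a template expects.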

## Documentation

For detailed guides and the API reference, see the [docs](./docs/) folder:

- [Getting Started](./docs/getting-started.md)
- [Provider Guide](./docs/guides/provider-guide.md)
- [Prompt Guide](./docs/guides/prompt-guide.md) - prompt management with templating
- [Validation Guide](./docs/guides/validation-guide.md)
- [Patterns Guide](./docs/guides/patterns-guide.md)
- [Cancellation Guide](./docs/guides/cancellation.md) - cancel in-progress LLM operations
- [API Reference](./docs/api/)

See [CHANGELOG.md](./CHANGELOG.md) for version history and breaking changes.

## Requirements

- Node.js 18+
- TypeScript 5.0+ (recommended)

## License

MIT