@krutai/ai-provider 0.1.0 → 0.2.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/AI_REFERENCE.md CHANGED
@@ -3,54 +3,65 @@
  ## Package Overview

  - **Name**: `@krutai/ai-provider`
- - **Version**: `0.1.0`
- - **Purpose**: AI provider for KrutAI — wraps `@openrouter/sdk` with a `krutAI()` factory, key validation, and a configurable default model
+ - **Version**: `0.2.2`
+ - **Purpose**: AI provider for KrutAI — fetch-based client for your deployed LangChain server with API key validation
  - **Entry**: `src/index.ts` → `dist/index.{js,mjs,d.ts}`
- - **Build**: `tsup` (CJS + ESM, all deps external)
+ - **Build**: `tsup` (CJS + ESM, no external SDK deps)

- ## Dependency Architecture
+ ## Architecture

  ```
- @krutai/ai-provider@0.1.0
- ├── dependency: @openrouter/sdk ← Official OpenRouter TypeScript SDK (external in tsup)
- └── peerDep: krutai ← Core utilities
+ @krutai/ai-provider@0.2.2
+ └── peerDep: krutai (core utilities)
+
+ AI Flow:
+ User App → krutAI() / KrutAIProvider
+ → POST {serverUrl}/validate (key validation)
+ → POST {serverUrl}/generate (single response)
+ → POST {serverUrl}/stream (SSE streaming)
+ → POST {serverUrl}/chat (multi-turn)
+ → Your deployed LangChain server
  ```

- > **Important for AI**: Do NOT bundle `@openrouter/sdk` inline. It must stay external in tsup.
-
  ## File Structure

  ```
  packages/ai-provider/
  ├── src/
  │ ├── index.ts # krutAI() factory + all exports
- │ ├── client.ts # KrutAIProvider class
+ │ ├── client.ts # KrutAIProvider class (fetch-based)
  │ ├── types.ts # KrutAIProviderConfig, GenerateOptions, ChatMessage, DEFAULT_MODEL
- │ └── validator.ts # OpenRouter key format + service validation
+ │ └── validator.ts # API key format check + server validation
  ├── package.json
  ├── tsconfig.json
  └── tsup.config.ts
  ```

- ## Default Model
+ ## Server Endpoints (Expected by This Package)

- ```
- qwen/qwen3-235b-a22b-thinking-2507
- ```
+ | Endpoint | Method | Body | Response |
+ |---|---|---|---|
+ | `/validate` | POST | `{ apiKey }` | `{ valid: true/false, message? }` |
+ | `/generate` | POST | `{ prompt, model, system?, maxTokens?, temperature? }` | `{ text/content/message: string }` |
+ | `/stream` | POST | `{ prompt, model, system?, maxTokens?, temperature? }` | SSE stream `data: <chunk>` |
+ | `/chat` | POST | `{ messages, model, maxTokens?, temperature? }` | `{ text/content/message: string }` |

- Exported as `DEFAULT_MODEL` constant. Users override via `config.model` or per-call `options.model`.
+ All AI endpoints receive `Authorization: Bearer <apiKey>` and `x-api-key: <apiKey>` headers.
 
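To make the endpoint table and header rules concrete, here is a minimal sketch of building a raw request against the `/generate` contract. This is illustrative only, not the package's internal code; the `serverUrl` and `apiKey` values are placeholders.

```typescript
// Sketch: a raw request against the /generate contract described above.
// Placeholder values — not real credentials or a real deployment.
const serverUrl = "http://localhost:8000";
const apiKey = "your-krutai-api-key";

// Per the docs, the key is sent in BOTH the Authorization and
// x-api-key headers; the body carries prompt and model.
function buildGenerateRequest(prompt: string, model = "default") {
  return {
    url: `${serverUrl}/generate`,
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${apiKey}`,
        "x-api-key": apiKey,
      },
      body: JSON.stringify({ prompt, model }),
    },
  };
}

const req = buildGenerateRequest("Hello!");
// fetch(req.url, req.init) would then return JSON with a
// text / content / message field, per the response column above.
```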
  ## Main Exports

- ### `krutAI(config?)` ← PRIMARY API
-
- Drop-in factory. Mirrors `krutAuth` from `@krutai/auth`.
+ ### `krutAI(config)` ← PRIMARY API

  ```typescript
  import { krutAI } from '@krutai/ai-provider';

- const ai = krutAI(); // OPENROUTER_API_KEY from env, default model
- await ai.initialize();
+ const ai = krutAI({
+ apiKey: process.env.KRUTAI_API_KEY!,
+ // uses http://localhost:8000 by default for local development
+ // serverUrl: 'https://ai.yourapp.com',
+ });
+
+ await ai.initialize(); // validates key with server

  const text = await ai.generate('Hello!');
  ```
@@ -62,51 +73,32 @@ import { KrutAIProvider } from '@krutai/ai-provider';

  const ai = new KrutAIProvider({
  apiKey: process.env.KRUTAI_API_KEY!,
- openRouterApiKey: process.env.OPENROUTER_API_KEY!,
- model: 'openai/gpt-4o', // optional
- validateOnInit: true, // default
- validationEndpoint: undefined, // TODO: wire up POST route
+ // serverUrl: 'https://ai.yourapp.com', // Optional: defaults to localhost:8000
+ model: 'gpt-4o', // optional, default: 'default'
+ validateOnInit: true, // default: true
  });

  await ai.initialize();
  ```

  **Methods:**
- - `initialize(): Promise<void>` — validates key + sets up OpenRouter client
+ - `initialize(): Promise<void>` — validates key against server, marks provider ready
  - `generate(prompt, opts?): Promise<string>` — single response (non-streaming)
- - `stream(prompt, opts?)` — async iterable of SSE chunks (`chunk.choices[0].delta.content`)
+ - `stream(prompt, opts?)` — `AsyncGenerator<string>` for SSE-based streaming
  - `chat(messages, opts?): Promise<string>` — multi-turn conversation
  - `getModel(): string` — active model name
- - `getClient(): OpenRouter` — raw `@openrouter/sdk` client (advanced)
  - `isInitialized(): boolean`
 
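The SSE framing behind `stream()` can be sketched with a small helper. This is a hypothetical illustration, not the package's actual implementation; it assumes the `data: <chunk>` framing with a `data: [DONE]` terminator described in this package's server contract.

```typescript
// Sketch: decode one SSE line into a text chunk, assuming the
// documented "data: <chunk>" framing with a "data: [DONE]" terminator.
// Illustrative helper only — not the package's real code.
function parseSSELine(line: string): string | null {
  if (!line.startsWith("data: ")) return null; // skip blanks/comments
  const payload = line.slice("data: ".length);
  return payload === "[DONE]" ? null : payload;
}

// A stream()-like AsyncGenerator<string> could then yield parsed chunks:
async function* sseChunks(lines: AsyncIterable<string>): AsyncGenerator<string> {
  for await (const line of lines) {
    const chunk = parseSSELine(line);
    if (chunk !== null) yield chunk;
  }
}
```

In a real client the line source would come from reading the fetch response body and splitting on newlines; the parsing step above is the part specific to this contract.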
- ## Underlying SDK Call
-
- The package calls `@openrouter/sdk` using the following structure:
-
- ```typescript
- // Non-streaming
- client.chat.send({
- chatGenerationParams: { model, messages, stream: false, maxTokens?, temperature? }
- });
-
- // Streaming
- client.chat.send({
- chatGenerationParams: { model, messages, stream: true, maxTokens?, temperature? }
- });
- ```
-
  ## Types

  ### `KrutAIProviderConfig`

  ```typescript
  interface KrutAIProviderConfig {
- apiKey: string; // KrutAI API key (required)
- openRouterApiKey?: string; // falls back to process.env.OPENROUTER_API_KEY
- model?: string; // default: DEFAULT_MODEL
- validateOnInit?: boolean; // default: true
- validationEndpoint?: string; // POST URL for key validation (future)
+ apiKey: string; // KrutAI API key — validated with server (required)
+ serverUrl?: string; // Base URL of deployed LangChain server (default: 'http://localhost:8000')
+ model?: string; // default: 'default'
+ validateOnInit?: boolean; // default: true
  }
  ```

@@ -132,29 +124,29 @@ interface GenerateOptions {

  ## Validator

- Defined in `src/validator.ts` (NOT imported from `krutai` — OpenRouter-specific).
+ Defined in `src/validator.ts`.

  ```typescript
- export { validateOpenRouterKeyFormat, validateOpenRouterKeyWithService, OpenRouterKeyValidationError };
+ export { validateApiKey, validateApiKeyFormat, KrutAIKeyValidationError };
  ```

  ### Validation Flow

- 1. **Format check** (sync, on construction): key must start with `sk-or-v1-` and be ≥ 20 chars
- 2. **Service check** (async, on `initialize()`): if `validationEndpoint` is set, sends `POST { apiKey }` and checks response; otherwise placeholder returns `true`
+ 1. **Format check** (sync, on construction): key must be a non-empty string
+ 2. **Server check** (async, on `initialize()`): sends `POST {serverUrl}/validate` with the key; expects `{ valid: true }`
 
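The two-step flow above could look roughly like this. It is a hypothetical sketch under the documented rules (non-empty string, then a POST to `/validate` expecting `{ valid: true }`); the package's real `validateApiKey` may differ in details.

```typescript
// Step 1 sketch: format check — per the docs, the key only needs
// to be a non-empty string.
function checkKeyFormat(apiKey: unknown): boolean {
  return typeof apiKey === "string" && apiKey.trim().length > 0;
}

// Step 2 sketch: server check — POST {serverUrl}/validate and
// accept only an explicit { valid: true }. Illustrative only.
async function checkKeyWithServer(serverUrl: string, apiKey: string): Promise<boolean> {
  const res = await fetch(`${serverUrl}/validate`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ apiKey }),
  });
  if (!res.ok) return false;
  const data = (await res.json()) as { valid?: boolean };
  return data.valid === true;
}
```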
  ## tsup Configuration Notes

- - `@openrouter/sdk` **external** (real dependency, NOT bundled)
- - `krutai` **external** (peer dep, NOT bundled)
+ - Only `krutai` is external (peer dep, NOT bundled)
+ - No third-party AI SDK — pure native `fetch`

  ## Important Notes

- 1. **`krutAI()` is the primary API** prefer it over `new KrutAIProvider()` for simple setups
- 2. **Default model is `qwen/qwen3-235b-a22b-thinking-2507`**override via `config.model` or `opts.model`
- 3. **OpenRouter key from env** — set `OPENROUTER_API_KEY` and omit `openRouterApiKey` in config
- 4. **Validation endpoint is a placeholder** — wire up the POST route when deployed
- 5. **Do NOT bundle `@openrouter/sdk`** must stay external in tsup
+ 1. **`serverUrl` defaults to `http://localhost:8000`** — make sure to override this with your deployed LangChain backend in production
+ 2. **`apiKey` is validated server-side** — the server controls which keys are valid
+ 3. **Streaming uses SSE** — server must respond with `Content-Type: text/event-stream`
+ 4. **No external SDK needed** — uses native `fetch` only (Node 18+, browser, edge runtimes)
+ 5. **Response field fallback** — tries `text` → `content` → `message` from the server JSON response

  ## Related Packages

@@ -164,7 +156,5 @@ export { validateOpenRouterKeyFormat, validateOpenRouterKeyWithService, OpenRout
164
156
 
165
157
  ## Links
166
158
 
167
- - OpenRouter SDK Docs: https://openrouter.ai/docs/sdks/typescript
168
- - OpenRouter Models: https://openrouter.ai/models
169
159
  - GitHub: https://github.com/AccountantAIOrg/krut_packages
170
160
  - npm: https://www.npmjs.com/package/@krutai/ai-provider
package/README.md CHANGED
@@ -1,16 +1,14 @@
  # @krutai/ai-provider

- AI Provider package for KrutAI — a thin wrapper around [`@openrouter/sdk`](https://www.npmjs.com/package/@openrouter/sdk), following the same patterns as `@krutai/auth`.
+ AI provider package for KrutAI — fetch-based client for your deployed LangChain server.

  ## Features

- - **Default model**: `qwen/qwen3-235b-a22b-thinking-2507` (change any time)
- - **OpenRouter key validation**: format check (`sk-or-v1-` prefix) on construction, optional service validation
- - **Pluggable validation endpoint**: set your own POST route once it's ready
- - **Text generation & streaming**: `generate()`, `stream()`, and `chat()` built on `@openrouter/sdk`
- - **Mirrors `@krutai/auth` patterns**: `krutAI()` factory + `KrutAIProvider` class
-
- ---
+ - 🔑 **API Key validation** — validates your key against the server before use
+ - 🚀 **Zero SDK dependencies** — uses native `fetch` only
+ - 📡 **Streaming** — SSE-based streaming via async generator
+ - 💬 **Multi-turn chat** — full conversation history support
+ - ⚙️ **Configurable** — pass any model name to the server

  ## Installation

@@ -18,146 +16,137 @@ AI Provider package for KrutAI — a thin wrapper around [`@openrouter/sdk`](htt
  npm install @krutai/ai-provider
  ```

- ---
-
  ## Quick Start

- ### Simplest usage (no setup needed)
-
- The OpenRouter API key is built-in by default — just install and call:
-
  ```typescript
  import { krutAI } from '@krutai/ai-provider';

- const ai = krutAI();
- await ai.initialize();
+ const ai = krutAI({
+ apiKey: 'your-krutai-api-key',
+ // Optional: omit to use the default local dev server ('http://localhost:8000')
+ // serverUrl: 'https://ai.yourapp.com',
+ });

- const text = await ai.generate('Write a haiku about coding');
+ await ai.initialize(); // validates key with your server
+
+ // Single response
+ const text = await ai.generate('Write a poem about TypeScript');
  console.log(text);
  ```

- > **Note:** The built-in key is temporary. It will be removed once the validation endpoint is live, at which point you will need to provide your own key via `OPENROUTER_API_KEY` env var or `openRouterApiKey` config.
+ ## Usage

- ### Change the model
+ ### Generate (single response)

  ```typescript
- const ai = krutAI({ model: 'openai/gpt-4o-mini' });
+ const ai = krutAI({
+ apiKey: process.env.KRUTAI_API_KEY!,
+ serverUrl: 'https://ai.yourapp.com', // Override default for production
+ model: 'gpt-4o', // optional — server's default is used if omitted
+ });
+
  await ai.initialize();

- const text = await ai.generate('Hello!');
+ const text = await ai.generate('Explain async/await in JavaScript', {
+ system: 'You are a helpful coding tutor.',
+ maxTokens: 500,
+ temperature: 0.7,
+ });
+
+ console.log(text);
  ```

  ### Streaming

  ```typescript
- const ai = krutAI();
+ const ai = krutAI({
+ apiKey: process.env.KRUTAI_API_KEY!,
+ // uses http://localhost:8000 by default
+ });
+
  await ai.initialize();

- const stream = await ai.stream('Tell me a story');
- for await (const chunk of stream) {
- process.stdout.write(chunk.choices?.[0]?.delta?.content ?? '');
+ // stream() is an async generator
+ for await (const chunk of ai.stream('Tell me a short story')) {
+ process.stdout.write(chunk);
  }
  ```

- ### Multi-turn conversation
+ ### Multi-turn Chat

  ```typescript
- const reply = await ai.chat([
- { role: 'user', content: 'Hi!' },
- { role: 'assistant', content: 'Hello! How can I help?' },
- { role: 'user', content: 'What is TypeScript?' },
- ]);
- ```
+ const ai = krutAI({
+ apiKey: process.env.KRUTAI_API_KEY!,
+ });

- ### Override model per call
+ await ai.initialize();

- ```typescript
- const text = await ai.generate('Summarise this', {
- model: 'anthropic/claude-3.5-sonnet',
- maxTokens: 512,
- temperature: 0.7,
- });
+ const response = await ai.chat([
+ { role: 'system', content: 'You are a helpful assistant.' },
+ { role: 'user', content: 'What is the capital of France?' },
+ { role: 'assistant', content: 'Paris.' },
+ { role: 'user', content: 'What is it famous for?' },
+ ]);
+
+ console.log(response);
  ```

- ### Using the class directly
+ ### Skip validation (useful for tests)

  ```typescript
- import { KrutAIProvider } from '@krutai/ai-provider';
-
- const ai = new KrutAIProvider({
- apiKey: process.env.KRUTAI_API_KEY!,
- openRouterApiKey: process.env.OPENROUTER_API_KEY!,
- model: 'google/gemini-flash-1.5', // optional override
- validateOnInit: true, // default
+ const ai = krutAI({
+ apiKey: 'test-key',
+ serverUrl: 'http://localhost:3000',
+ validateOnInit: false, // skips the /validate round-trip
  });

- await ai.initialize();
- const text = await ai.generate('Hi!');
+ // No need to call initialize() when validateOnInit is false
+ const text = await ai.generate('Hello!');
  ```

- ---
-
- ## API Reference
-
- ### `krutAI(config?)` — factory function
-
- | Field | Type | Default |
- |---|---|---|
- | `openRouterApiKey` | `string` | `process.env.OPENROUTER_API_KEY` |
- | `model` | `string` | `"qwen/qwen3-235b-a22b-thinking-2507"` |
- | `validateOnInit` | `boolean` | `true` |
- | `validationEndpoint` | `string` | `undefined` (placeholder) |
-
- ### `KrutAIProvider` class
+ ## Server API Contract

- | Method | Returns | Description |
- |---|---|---|
- | `initialize()` | `Promise<void>` | Validates key + sets up OpenRouter client |
- | `generate(prompt, opts?)` | `Promise<string>` | Single response |
- | `stream(prompt, opts?)` | async iterable | Streaming response |
- | `chat(messages, opts?)` | `Promise<string>` | Multi-turn conversation |
- | `getModel()` | `string` | Active model name |
- | `getClient()` | `OpenRouter` | Raw `@openrouter/sdk` client (advanced) |
- | `isInitialized()` | `boolean` | Ready check |
+ Your LangChain server must expose these endpoints:

- ---
+ | Endpoint | Method | Auth | Body |
+ |---|---|---|---|
+ | `/validate` | POST | `x-api-key` header | `{ "apiKey": "..." }` |
+ | `/generate` | POST | `Authorization: Bearer <key>` | `{ "prompt": "...", "model": "...", ... }` |
+ | `/stream` | POST | `Authorization: Bearer <key>` | `{ "prompt": "...", "model": "...", ... }` |
+ | `/chat` | POST | `Authorization: Bearer <key>` | `{ "messages": [...], "model": "...", ... }` |

- ## Default Model
+ **Validation response:** `{ "valid": true }` or `{ "valid": false, "message": "reason" }`

- ```
- qwen/qwen3-235b-a22b-thinking-2507
- ```
+ **AI response:** `{ "text": "..." }` or `{ "content": "..." }` or `{ "message": "..." }`

- Pass `model` in the constructor or per-call to override.
- Browse all available models at https://openrouter.ai/models.
+ **Stream:** `text/event-stream` with `data: <chunk>` lines, ending with `data: [DONE]`
 
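The `text` / `content` / `message` fallback in the AI response contract can be sketched as a small helper. This is an illustrative extractor, not the package's internal code; only the field names and their order come from the contract above.

```typescript
// Possible shape of a server JSON response, per the documented contract.
interface AIResponse {
  text?: string;
  content?: string;
  message?: string;
}

// Sketch: pick the first defined field in the documented order
// text → content → message.
function extractText(json: AIResponse): string {
  const value = json.text ?? json.content ?? json.message;
  if (value === undefined) {
    throw new Error("Server response had none of: text, content, message");
  }
  return value;
}
```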
- ---
+ ## API Reference

- ## Validation
+ ### `krutAI(config)`

- The OpenRouter key is validated for format (`sk-or-v1-` prefix) on construction.
+ Factory function — the preferred way to create a provider.

- When a `validationEndpoint` is provided, `initialize()` sends a `POST` request:
- ```json
- { "apiKey": "sk-or-v1-..." }
+ ```typescript
+ const ai = krutAI({
+ apiKey: string; // required — KrutAI API key
+ serverUrl?: string; // optional — defaults to 'http://localhost:8000'
+ model?: string; // optional — passed to server (default: 'default')
+ validateOnInit?: boolean; // optional — default: true
+ });
  ```
- Expected response: `{ "valid": true }` (or any `2xx` without `valid: false`).
-
- > The live endpoint will be wired in once you deploy the POST route.

- ---
+ ### `KrutAIProvider`

- ## Related Packages
+ Full class API with the same methods as above. Use when you need the class directly.

- - [`krutai`](https://www.npmjs.com/package/krutai) — Core utilities & API validation
- - [`@krutai/auth`](https://www.npmjs.com/package/@krutai/auth) — Authentication (wraps better-auth)
- - [`@krutai/rbac`](https://www.npmjs.com/package/@krutai/rbac) — Role-Based Access Control
+ ### Exports

- ---
+ ```typescript
+ export { krutAI, KrutAIProvider, KrutAIKeyValidationError, validateApiKey, validateApiKeyFormat, DEFAULT_MODEL };
+ export type { KrutAIProviderConfig, GenerateOptions, ChatMessage };
+ ```

- ## Links
+ ## License

- - OpenRouter SDK: https://openrouter.ai/docs/sdks/typescript
- - Available Models: https://openrouter.ai/models
- - GitHub: https://github.com/AccountantAIOrg/krut_packages
- - npm: https://www.npmjs.com/package/@krutai/ai-provider
+ MIT