@node-llm/orm 0.4.0 → 0.5.0

This diff shows the content of publicly released package versions as they appear in their respective public registries, and is provided for informational purposes only.
package/CHANGELOG.md CHANGED
@@ -2,6 +2,42 @@
 
 All notable changes to this project will be documented in this file.
 
+ ## [1.11.0] - 2026-02-07 (@node-llm/core)
+
+ ### Features
+
+ - **Agent Class**: Declarative DSL for defining agents with static properties (model, instructions, tools, temperature, thinking, schema).
+ - **ToolHalt Mechanism**: Early termination of agentic loops via `this.halt(result)` in Tool.execute().
+ - **defineAgent()**: Inline agent definition without creating a class.
+ - **Agent Inheritance**: Child agents inherit parent static properties.
+ - **Instance Overrides**: Override static properties at instantiation (temperature, maxTokens, etc.).
+
+ ### Documentation
+
+ - Comprehensive Agent documentation with examples for RAG, model routing, and multi-agent patterns.
+ - Added doc tests for Agent, Tool, and ToolHalt patterns.
+
+ ## [0.5.0] - 2026-02-07 (@node-llm/orm)
+
+ ### Features
+
+ - **AgentSession**: Persistent agent conversations following the "Code Wins" principle.
+   - `createAgentSession()`: Create a new session with metadata.
+   - `loadAgentSession()`: Resume a session by ID with class validation.
+   - The Agent class defines behavior (model, tools, instructions); the database provides history.
+ - **Idempotent Migrations**: Safe migration files for the LlmAgentSession table.
+ - **Custom Table Names**: Support for custom table name mapping.
+
+ ## [0.5.0] - 2026-02-07 (@node-llm/testing)
+
+ ### Features
+
+ - **Agent Testing Support**:
+   - `mocker.callsTools()`: Mock responses that trigger multiple tool calls.
+   - `mocker.sequence()`: Return different responses for multi-turn agent conversations.
+   - `mocker.times(n)`: Limit mock matches with fallthrough behavior.
+ - **Improved Matching**: First-match semantics enable `times()` fallthrough patterns.
+
 ## [1.10.0] - 2026-02-01 (@node-llm/core)
 
 ### Features
@@ -109,7 +145,7 @@ All notable changes to this project will be documented in this file.
 - Shifted from `db push` to professional **Prisma Migrate** workflow for safe, versioned schema updates.
 - Added repository-wide `prisma/migrations` folder for reproducible deployments.
 - New built-in scripts: `npm run db:migrate`, `npm run db:deploy`, and `npm run db:status`.
- - Comprehensive [Database Migration Guide](https://node-llm.eshaiju.com/orm/migrations) for application scaling.
+ - Comprehensive [Database Migration Guide](https://nodellm.dev/orm/migrations) for application scaling.
 
 ## [1.6.2] - 2026-01-21 (@node-llm/core)
 
@@ -211,7 +247,7 @@ All notable changes to this project will be documented in this file.
 - `onToolCallStart`: Triggered when a tool call begins
 - `onToolCallEnd`: Triggered when a tool call completes successfully
 - `onToolCallError`: Triggered when a tool call fails
- - **Security Documentation**: New comprehensive [Security & Compliance](https://node-llm.eshaiju.com/advanced/security.html) guide covering:
+ - **Security Documentation**: New comprehensive [Security & Compliance](https://nodellm.dev/advanced/security.html) guide covering:
 - Smart Context Isolation
 - Content Policy Hooks
 - Tool Execution Policies
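The ToolHalt bullet in the 1.11.0 entry above is terse; the control-flow idea can be pictured with a small self-contained simulation. Note this is an illustrative stand-in, not the @node-llm/core API: the real mechanism is calling `this.halt(result)` inside `Tool.execute()`, while the `Halt` sentinel and `runLoop` below are invented for the sketch.

```typescript
// Illustrative only: a sentinel-based early exit from an agentic loop,
// mimicking the ToolHalt idea. `Halt`, `ToolFn`, and `runLoop` are hypothetical.
class Halt {
  constructor(public readonly result: unknown) {}
}

type ToolFn = (input: string) => unknown;

// A tool ends the loop early by returning the sentinel
// (in @node-llm/core this would be `this.halt(result)`).
const lookupRefund: ToolFn = (orderId) => new Halt({ refunded: true, orderId });

function runLoop(tools: ToolFn[], input: string, maxTurns = 5): unknown {
  for (let turn = 0; turn < maxTurns; turn++) {
    for (const tool of tools) {
      const out = tool(input);
      if (out instanceof Halt) {
        return out.result; // early termination: skip all remaining turns
      }
    }
  }
  return null; // loop exhausted without a halt
}

console.log(runLoop([lookupRefund], "order-789")); // → { refunded: true, orderId: 'order-789' }
```

The real `Tool.execute()` is async and receives parsed arguments; the synchronous shape here only demonstrates the early-exit control flow.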
package/README.md CHANGED
@@ -7,7 +7,7 @@
 
 > Database persistence layer for NodeLLM. Automatically tracks chats, messages, tool calls, and API requests.
 
- **[Read the Full Documentation](https://node-llm.eshaiju.com/orm/prisma)** | **[View Example App](https://github.com/node-llm/node-llm/tree/main/examples/applications/hr-chatbot-rag)**
+ **[Read the Full Documentation](https://nodellm.dev/orm/prisma)** | **[View Example App](https://github.com/node-llm/node-llm/tree/main/examples/applications/hr-chatbot-rag)**
 
 ## Features
 
@@ -123,14 +123,15 @@ console.log(messages); // [{ role: 'user', content: '...' }, { role: 'assistant'
 
 ## Architecture
 
- The ORM tracks four core entities:
+ The ORM tracks five core entities:
 
- | Model           | Purpose              | Example                                    |
- | --------------- | -------------------- | ------------------------------------------ |
- | **LlmChat**     | Session container    | Holds model, provider, system instructions |
- | **LlmMessage**  | Conversation history | User queries and assistant responses       |
- | **LlmToolCall** | Tool executions      | Function calls made by the assistant       |
- | **LlmRequest**  | API metrics          | Token usage, latency, cost per API call    |
+ | Model               | Purpose              | Example                                    |
+ | ------------------- | -------------------- | ------------------------------------------ |
+ | **LlmAgentSession** | Agent persistence    | Links Agent class to Chat (v0.5.0+)        |
+ | **LlmChat**         | Session container    | Holds model, provider, system instructions |
+ | **LlmMessage**      | Conversation history | User queries and assistant responses       |
+ | **LlmToolCall**     | Tool executions      | Function calls made by the assistant       |
+ | **LlmRequest**      | API metrics          | Token usage, latency, cost per API call    |
 
 ### Data Flow
 
@@ -152,6 +153,91 @@ Chat.ask()
 Return Response
 ```
 
+ ## Agent Sessions (v0.5.0+)
+
+ For **stateful agents** with persistence, use `AgentSession`. This follows the "Code Wins" principle:
+
+ - **Model, Tools, Instructions** → from the Agent class (code)
+ - **Message History** → from the database
+
+ ### Define an Agent (in @node-llm/core)
+
+ ```typescript
+ import { Agent, Tool, z } from "@node-llm/core";
+
+ class LookupOrderTool extends Tool {
+   static definition = {
+     name: "lookup_order",
+     description: "Look up order status",
+     parameters: z.object({ orderId: z.string() })
+   };
+   async execute({ orderId }) {
+     return { status: "shipped", eta: "Tomorrow" };
+   }
+ }
+
+ class SupportAgent extends Agent {
+   static model = "gpt-4.1";
+   static instructions = "You are a helpful support agent.";
+   static tools = [LookupOrderTool];
+ }
+ ```
+
+ ### Create & Resume Sessions
+
+ ```typescript
+ import { createAgentSession, loadAgentSession } from "@node-llm/orm/prisma";
+
+ // Create a new persistent session
+ const session = await createAgentSession(prisma, llm, SupportAgent, {
+   metadata: { userId: "user_123", ticketId: "TKT-456" }
+ });
+
+ await session.ask("Where is my order #789?");
+ console.log(session.id); // "sess_abc123" - save this!
+
+ // Resume later (even after code upgrades)
+ const resumed = await loadAgentSession(prisma, llm, SupportAgent, "sess_abc123");
+ await resumed.ask("Can you cancel it?");
+ ```
+
+ ### Code Wins Principle
+
+ When you deploy a code change (new model, updated tools), resumed sessions use the **new configuration**:
+
+ | Aspect       | Source      | Why                             |
+ | ------------ | ----------- | ------------------------------- |
+ | Model        | Agent class | Immediate upgrades              |
+ | Tools        | Agent class | Only code can execute functions |
+ | Instructions | Agent class | Deploy prompt fixes immediately |
+ | History      | Database    | Sacred, never modified          |
+
+ ### Schema Addition
+
+ Add `LlmAgentSession` to your Prisma schema:
+
+ ```prisma
+ model LlmAgentSession {
+   id         String   @id @default(uuid())
+   agentClass String   // For validation (e.g., 'SupportAgent')
+   chatId     String   @unique
+   metadata   Json?    // Session context (userId, ticketId)
+   createdAt  DateTime @default(now())
+   updatedAt  DateTime @updatedAt
+
+   chat LlmChat @relation(fields: [chatId], references: [id], onDelete: Cascade)
+
+   @@index([agentClass])
+   @@index([createdAt])
+ }
+
+ // Add relation to LlmChat
+ model LlmChat {
+   // ... existing fields
+   agentSession LlmAgentSession?
+ }
+ ```
+
 ## Advanced Usage
 
 ### Streaming Responses
package/bin/cli.js CHANGED
@@ -63,21 +63,55 @@ function sync() {
     "thoughtSignature"
   ];
 
+   // Check for AgentSession model (v0.5.0+)
+   const hasAgentSession =
+     userSchema.includes("LlmAgentSession") || userSchema.includes("AgentSession");
+
   const missingFields = requiredFields.filter((field) => !userSchema.includes(field));
 
-   if (missingFields.length === 0) {
-     console.log("✓ Schema is already up to date with @node-llm/orm v0.2.0 features.");
+   if (missingFields.length === 0 && hasAgentSession) {
+     console.log("✓ Schema is already up to date with @node-llm/orm v0.5.0 features.");
     return;
   }
 
-   console.log("🛠 Syncing missing fields for Extended Thinking support...");
-   console.log(`\nDetected missing fields: ${missingFields.join(", ")}`);
+   if (missingFields.length > 0) {
+     console.log("🛠 Syncing missing fields for Extended Thinking support...");
+     console.log(`\nDetected missing fields: ${missingFields.join(", ")}`);
+   }
+
+   if (!hasAgentSession) {
+     console.log("\n🤖 Missing AgentSession support (v0.5.0+)");
+     console.log(
+       " AgentSession enables persistent agent conversations with the 'Code Wins' pattern."
+     );
+     console.log("\n To add AgentSession, copy the migration from:");
+     console.log(" node_modules/@node-llm/orm/migrations/add_agent_session.sql");
+     console.log("\n Or add this to your schema.prisma:");
+     console.log(`
+ model LlmAgentSession {
+   id         String   @id @default(uuid())
+   agentClass String
+   chatId     String   @unique
+   metadata   Json?
+   createdAt  DateTime @default(now())
+   updatedAt  DateTime @updatedAt
+
+   chat LlmChat @relation(fields: [chatId], references: [id], onDelete: Cascade)
+
+   @@index([agentClass])
+   @@index([createdAt])
+ }
+
+ // Also add to LlmChat:
+ //   agentSession LlmAgentSession?
+ `);
+   }
 
   console.log("\nTo update your schema safely, follow the Reference Schema at:");
   console.log("https://github.com/node-llm/node-llm/blob/main/packages/orm/schema.prisma");
 
   console.log("\nOr run this to generate a versioned migration:");
-   console.log(" npx prisma migrate dev --name add_reasoning_support");
+   console.log(" npx prisma migrate dev --name add_agent_session");
 }
 
 
@@ -103,7 +137,7 @@ function main() {
     console.log(" npx @node-llm/orm sync # Check for missing ORM fields");
     console.log(" npx @node-llm/orm migrate # Rails-style migration helper");
     console.log("\nFor enterprise patterns, see:");
-     console.log(" https://node-llm.eshaiju.com/orm/migrations");
+     console.log(" https://nodellm.dev/orm/migrations");
   }
 }
 
@@ -0,0 +1,140 @@
+ import { ChatOptions, AskOptions, NodeLLMCore, Agent, AgentConfig, Message, ChatChunk, Usage } from "@node-llm/core";
+ /**
+  * Record structure for the LLM Agent Session table.
+  */
+ export interface AgentSessionRecord {
+   id: string;
+   agentClass: string;
+   chatId: string;
+   metadata?: Record<string, unknown> | null;
+   createdAt: Date;
+   updatedAt: Date;
+ }
+ /**
+  * Record structure for the LLM Message table.
+  */
+ export interface MessageRecord {
+   id: string;
+   chatId: string;
+   role: string;
+   content: string | null;
+   contentRaw?: string | null;
+   thinkingText?: string | null;
+   thinkingSignature?: string | null;
+   thinkingTokens?: number | null;
+   inputTokens?: number | null;
+   outputTokens?: number | null;
+   modelId?: string | null;
+   provider?: string | null;
+   createdAt: Date;
+ }
+ /**
+  * Table name customization.
+  */
+ export interface TableNames {
+   agentSession?: string;
+   chat?: string;
+   message?: string;
+   toolCall?: string;
+   request?: string;
+ }
+ type AgentClass<T extends Agent = Agent> = (new (overrides?: Partial<AgentConfig & ChatOptions>) => T) & {
+   name: string;
+   model?: string;
+   instructions?: string;
+   tools?: unknown[];
+ };
+ /**
+  * AgentSession - Wraps an Agent instance with persistence capabilities.
+  *
+  * Follows "Code Wins" sovereignty:
+  * - Model, Tools, Instructions come from the Agent class (code)
+  * - Message history comes from the database
+  *
+  * @example
+  * ```typescript
+  * // Create a new session
+  * const session = await createAgentSession(prisma, llm, SupportAgent, {
+  *   metadata: { userId: "123" }
+  * });
+  *
+  * // Resume a session
+  * const resumed = await loadAgentSession(prisma, llm, SupportAgent, "sess_abc");
+  *
+  * // Agent behavior is always defined in code
+  * const result = await session.ask("Hello");
+  * ```
+  */
+ export declare class AgentSession<T extends Agent = Agent> {
+   private prisma;
+   private llm;
+   private AgentClass;
+   private record;
+   private agent;
+   private currentMessageId;
+   private tableNames;
+   private debug;
+   constructor(prisma: any, llm: NodeLLMCore, AgentClass: AgentClass<T>, record: AgentSessionRecord, tableNames?: TableNames, agent?: T, debug?: boolean);
+   private log;
+   /** Agent instance (for direct access if needed) */
+   get instance(): T;
+   /** Session ID for persistence */
+   get id(): string;
+   /** Underlying chat ID */
+   get chatId(): string;
+   /** Session metadata */
+   get metadata(): Record<string, unknown> | null | undefined;
+   /** Agent class name */
+   get agentClass(): string;
+   /** Model ID used by the agent */
+   get modelId(): string;
+   /** Cumulative usage for this session (from agent memory) */
+   get totalUsage(): Usage;
+   /** Current in-memory message history */
+   get history(): readonly Message[];
+   /**
+    * Helper to get a typed Prisma model by its dynamic name.
+    */
+   private getModel;
+   /**
+    * Send a message and persist the conversation.
+    */
+   ask(input: string, options?: AskOptions): Promise<MessageRecord>;
+   /**
+    * Stream a response and persist the conversation.
+    */
+   askStream(input: string, options?: AskOptions): AsyncGenerator<ChatChunk, MessageRecord, undefined>;
+   /**
+    * Returns the current full message history for this session.
+    */
+   messages(): Promise<MessageRecord[]>;
+   /**
+    * Delete the entire session and its history.
+    */
+   delete(): Promise<void>;
+ }
+ /**
+  * Options for creating a new agent session.
+  */
+ export interface CreateAgentSessionOptions {
+   metadata?: Record<string, unknown>;
+   tableNames?: TableNames;
+   debug?: boolean;
+ }
+ /**
+  * Creates a new agent session and its persistent chat record.
+  */
+ export declare function createAgentSession<T extends Agent>(prisma: any, llm: NodeLLMCore, AgentClass: AgentClass<T>, options?: CreateAgentSessionOptions): Promise<AgentSession<T>>;
+ /**
+  * Options for loading an existing agent session.
+  */
+ export interface LoadAgentSessionOptions {
+   tableNames?: TableNames;
+   debug?: boolean;
+ }
+ /**
+  * Loads an existing agent session and re-instantiates the agent with history.
+  */
+ export declare function loadAgentSession<T extends Agent>(prisma: any, llm: NodeLLMCore, AgentClass: AgentClass<T>, sessionId: string, options?: LoadAgentSessionOptions): Promise<AgentSession<T> | null>;
+ export {};
+ //# sourceMappingURL=AgentSession.d.ts.map
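Of the operations declared above, `loadAgentSession()` is the one that embodies "Code Wins": only the message rows are read back, and the implementation further below reduces each `MessageRecord` to a plain `Message` before injecting it into a fresh agent. That conversion can be isolated as a tiny function (the `*Lite` types below are local stand-ins for the relevant fields of the real types):

```typescript
// Local stand-ins for the relevant fields of MessageRecord and Message.
interface MessageRecordLite {
  role: string;
  content: string | null;
}
interface MessageLite {
  role: string;
  content: string;
}

// The same reduction loadAgentSession performs: DB rows become the agent's
// injected history, and null content collapses to an empty string.
function toHistory(records: MessageRecordLite[]): MessageLite[] {
  return records.map((m) => ({ role: m.role, content: m.content || "" }));
}

const history = toHistory([
  { role: "user", content: "Where is my order #789?" },
  { role: "assistant", content: null }, // e.g. an unfilled placeholder row
]);
console.log(history[1]); // → { role: 'assistant', content: '' }
```

Everything else (model, tools, instructions) is deliberately not read from the database, so a redeploy changes resumed sessions immediately.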
@@ -0,0 +1 @@
+ {"version":3,"file":"AgentSession.d.ts","sourceRoot":"","sources":["../../../src/adapters/prisma/AgentSession.ts"],"names":[],"mappings":"…"}
@@ -0,0 +1,284 @@
+ /**
+  * AgentSession - Wraps an Agent instance with persistence capabilities.
+  *
+  * Follows "Code Wins" sovereignty:
+  * - Model, Tools, Instructions come from the Agent class (code)
+  * - Message history comes from the database
+  *
+  * @example
+  * ```typescript
+  * // Create a new session
+  * const session = await createAgentSession(prisma, llm, SupportAgent, {
+  *   metadata: { userId: "123" }
+  * });
+  *
+  * // Resume a session
+  * const resumed = await loadAgentSession(prisma, llm, SupportAgent, "sess_abc");
+  *
+  * // Agent behavior is always defined in code
+  * const result = await session.ask("Hello");
+  * ```
+  */
+ export class AgentSession {
+   prisma;
+   llm;
+   AgentClass;
+   record;
+   agent;
+   currentMessageId = null;
+   tableNames;
+   debug;
+   constructor(prisma, llm, AgentClass, record, tableNames, agent = new AgentClass({
+     llm
+   }), debug = false) {
+     this.prisma = prisma;
+     this.llm = llm;
+     this.AgentClass = AgentClass;
+     this.record = record;
+     this.agent = agent;
+     this.debug = debug;
+     this.tableNames = {
+       agentSession: tableNames?.agentSession || "llmAgentSession",
+       chat: tableNames?.chat || "llmChat",
+       message: tableNames?.message || "llmMessage",
+       toolCall: tableNames?.toolCall || "llmToolCall",
+       request: tableNames?.request || "llmRequest"
+     };
+   }
+   log(...args) {
+     if (this.debug) {
+       console.log(`[@node-llm/orm]`, ...args);
+     }
+   }
+   /** Agent instance (for direct access if needed) */
+   get instance() {
+     return this.agent;
+   }
+   /** Session ID for persistence */
+   get id() {
+     return this.record.id;
+   }
+   /** Underlying chat ID */
+   get chatId() {
+     return this.record.chatId;
+   }
+   /** Session metadata */
+   get metadata() {
+     return this.record.metadata;
+   }
+   /** Agent class name */
+   get agentClass() {
+     return this.record.agentClass;
+   }
+   /** Model ID used by the agent */
+   get modelId() {
+     return this.agent.modelId;
+   }
+   /** Cumulative usage for this session (from agent memory) */
+   get totalUsage() {
+     return this.agent.totalUsage;
+   }
+   /** Current in-memory message history */
+   get history() {
+     return this.agent.history;
+   }
+   /**
+    * Helper to get a typed Prisma model by its dynamic name.
+    */
+   getModel(name) {
+     return getTable(this.prisma, name);
+   }
+   /**
+    * Send a message and persist the conversation.
+    */
+   async ask(input, options = {}) {
+     const model = this.getModel(this.tableNames.message);
+     // Persist user message
+     await model.create({
+       data: { chatId: this.chatId, role: "user", content: input }
+     });
+     // Create placeholder for assistant message
+     const assistantMessage = await model.create({
+       data: { chatId: this.chatId, role: "assistant", content: null }
+     });
+     this.currentMessageId = assistantMessage.id;
+     try {
+       // Get response from agent (uses code-defined config + injected history)
+       const response = await this.agent.ask(input, options);
+       // Update assistant message with response
+       return await model.update({
+         where: { id: assistantMessage.id },
+         data: {
+           content: response.content,
+           contentRaw: JSON.stringify(response.meta),
+           inputTokens: response.usage?.input_tokens || 0,
+           outputTokens: response.usage?.output_tokens || 0,
+           thinkingText: response.thinking?.text || null,
+           thinkingSignature: response.thinking?.signature || null,
+           thinkingTokens: response.thinking?.tokens || null,
+           modelId: response.model || null,
+           provider: response.provider || null
+         }
+       });
+     }
+     catch (error) {
+       // Clean up placeholder on error
+       await model.delete({ where: { id: assistantMessage.id } });
+       throw error;
+     }
+   }
+   /**
+    * Stream a response and persist the conversation.
+    */
+   async *askStream(input, options = {}) {
+     const model = this.getModel(this.tableNames.message);
+     // Persist user message
+     await model.create({
+       data: { chatId: this.chatId, role: "user", content: input }
+     });
+     // Create placeholder for assistant message
+     const assistantMessage = await model.create({
+       data: { chatId: this.chatId, role: "assistant", content: null }
+     });
+     this.currentMessageId = assistantMessage.id;
+     try {
+       const stream = this.agent.stream(input, options);
+       let fullContent = "";
+       let lastChunk = null;
+       for await (const chunk of stream) {
+         fullContent += chunk.content;
+         lastChunk = chunk;
+         yield chunk;
+       }
+       // Final update with accumulated result
+       return await model.update({
+         where: { id: assistantMessage.id },
+         data: {
+           content: fullContent,
+           inputTokens: lastChunk?.usage?.input_tokens || 0,
+           outputTokens: lastChunk?.usage?.output_tokens || 0,
+           thinkingText: lastChunk?.thinking?.text || null,
+           thinkingSignature: lastChunk?.thinking?.signature || null,
+           thinkingTokens: lastChunk?.thinking?.tokens || null,
+           modelId: lastChunk?.metadata?.model || null,
+           provider: lastChunk?.metadata?.provider || null
+         }
+       });
+     }
+     catch (error) {
+       await model.delete({ where: { id: assistantMessage.id } });
+       throw error;
+     }
+   }
+   /**
+    * Returns the current full message history for this session.
+    */
+   async messages() {
+     const model = this.getModel(this.tableNames.message);
+     return await model.findMany({
+       where: { chatId: this.chatId },
+       orderBy: { createdAt: "asc" }
+     });
+   }
+   /**
+    * Delete the entire session and its history.
+    */
+   async delete() {
+     const chatTable = this.getModel(this.tableNames.chat);
+     await chatTable.delete({ where: { id: this.chatId } });
+     // AgentSession record is deleted via Cascade from LlmChat
+   }
+ }
+ /**
+  * Creates a new agent session and its persistent chat record.
+  */
+ export async function createAgentSession(prisma, llm, AgentClass, options = {}) {
+   const tableNames = {
+     agentSession: options.tableNames?.agentSession || "llmAgentSession",
+     chat: options.tableNames?.chat || "llmChat",
+     message: options.tableNames?.message || "llmMessage"
+   };
+   if (options.debug) {
+     console.log(`[@node-llm/orm] createAgentSession: agentClass=${AgentClass.name}`);
+   }
+   // 1. Create underlying LlmChat record
+   const chatTable = getTable(prisma, tableNames.chat);
+   const chatRecord = (await chatTable.create({
+     data: {
+       model: AgentClass.model || null,
+       provider: null,
+       instructions: AgentClass.instructions || null,
+       metadata: null // Runtime metadata goes in Chat, session context in AgentSession
+     }
+   }));
+   // 2. Create AgentSession record
+   const sessionTable = getTable(prisma, tableNames.agentSession);
+   const sessionRecord = (await sessionTable.create({
+     data: {
+       agentClass: AgentClass.name,
+       chatId: chatRecord.id,
+       metadata: options.metadata || null
+     }
+   }));
+   return new AgentSession(prisma, llm, AgentClass, sessionRecord, options.tableNames, undefined, options.debug);
+ }
+ /**
+  * Loads an existing agent session and re-instantiates the agent with history.
+  */
+ export async function loadAgentSession(prisma, llm, AgentClass, sessionId, options = {}) {
+   const tableNames = {
+     agentSession: options.tableNames?.agentSession || "llmAgentSession",
+     chat: options.tableNames?.chat || "llmChat",
+     message: options.tableNames?.message || "llmMessage"
+   };
+   if (options.debug) {
+     console.log(`[@node-llm/orm] loadAgentSession: id=${sessionId}`);
+   }
+   // 1. Find session record
+   const sessionTable = getTable(prisma, tableNames.agentSession);
+   const sessionRecord = (await sessionTable.findUnique({
+     where: { id: sessionId }
+   }));
+   if (!sessionRecord) {
+     return null;
+   }
+   // 1.5. Validate Agent Class (Code Wins Sovereignty)
+   if (sessionRecord.agentClass !== AgentClass.name) {
+     throw new Error(`Agent class mismatch: Session "${sessionId}" was created for "${sessionRecord.agentClass}", but is being loaded with "${AgentClass.name}".`);
+   }
+   // 2. Load message history
+   const messageTable = getTable(prisma, tableNames.message);
+   const messages = (await messageTable.findMany({
+     where: { chatId: sessionRecord.chatId },
+     orderBy: { createdAt: "asc" }
+   }));
+   // 3. Convert DB messages to NodeLLM Message format
+   const history = messages.map((m) => ({
+     role: m.role,
+     content: m.content || ""
+   }));
+   // 4. Instantiate agent with injected history and LLM
+   //    "Code Wins" - model, tools, instructions come from AgentClass
+   const agent = new AgentClass({
+     llm,
+     messages: history
+   });
+   return new AgentSession(prisma, llm, AgentClass, sessionRecord, options.tableNames, agent, options.debug);
+ }
+ /**
+  * Dynamic helper to access Prisma models by name.
+  * Handles both case-sensitive and case-insensitive lookups for flexibility.
+  */
+ function getTable(prisma, tableName) {
+   const p = prisma;
+   // 1. Direct match
+   const table = p[tableName];
+   if (table)
+     return table;
+   // 2. Case-insensitive match
+   const keys = Object.keys(prisma).filter((k) => !k.startsWith("$") && !k.startsWith("_"));
+   const match = keys.find((k) => k.toLowerCase() === tableName.toLowerCase());
+   if (match && p[match])
+     return p[match];
+   throw new Error(`[@node-llm/orm] Prisma table "${tableName}" not found. Available tables: ${keys.join(", ")}`);
+ }
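The `getTable` helper at the bottom of AgentSession.js is what makes the custom table-name mapping tolerant of casing (`llmAgentSession` vs. `LlmAgentSession`). Exercising the same lookup logic against a stub object in place of a Prisma client shows both paths; the stub and its delegate values are made up for the demo:

```typescript
// The lookup logic from getTable above, exercised against a stub client.
function lookupTable(prisma: Record<string, unknown>, tableName: string): unknown {
  // 1. Direct match on the delegate name
  if (prisma[tableName]) return prisma[tableName];
  // 2. Case-insensitive match, skipping Prisma's $-/_-prefixed internals
  const keys = Object.keys(prisma).filter((k) => !k.startsWith("$") && !k.startsWith("_"));
  const match = keys.find((k) => k.toLowerCase() === tableName.toLowerCase());
  if (match && prisma[match]) return prisma[match];
  throw new Error(`Prisma table "${tableName}" not found. Available tables: ${keys.join(", ")}`);
}

// Stub standing in for a Prisma client (delegate values are placeholders).
const prismaStub = {
  LlmAgentSession: { delegate: "agent-session" },
  llmChat: { delegate: "chat" },
  $transaction: () => {},
};

console.log(lookupTable(prismaStub, "llmChat")); // direct hit
console.log(lookupTable(prismaStub, "llmAgentSession")); // case-insensitive hit
```

Because of the fallthrough, a `tableNames` override only has to match the client's model delegate name up to case.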