conversationalist 0.0.1 → 0.0.2

package/README.md CHANGED
@@ -1,47 +1,123 @@
  # Conversationalist

- A TypeScript library for managing LLM conversation state with immutable data structures and type-safe APIs.
+ A TypeScript-first library for managing LLM conversation state with immutable updates, type-safe APIs, and provider adapters.
+
+ The high-level value proposition:
+
+ - keep a single, model-agnostic conversation shape across UI, storage, and providers
+ - append messages without mutation (safe for React, concurrent updates, replay)
+ - handle tool calls, streaming responses, and hidden or internal messages
+ - trim context windows and estimate token budgets
+ - validate input with Zod schemas and typed errors

  ## Installation

  ```bash
- bun add conversationalist
- npm add conversationalist
+ bun add conversationalist zod
+ npm add conversationalist zod
+ pnpm add conversationalist zod
  ```

- ## Core Concepts
-
- ### Conversation
+ This package is ESM-only. Use `import` syntax.
+ Zod is a peer dependency and must be installed by your app.

- A conversation is an immutable data structure containing messages, metadata, and timestamps:
+ ## Quick Start

- ```typescript
+ ```ts
  import {
-   createConversation,
+   appendAssistantMessage,
    appendUserMessage,
+   createConversation,
+   deserializeConversation,
+   serializeConversation,
+ } from 'conversationalist';
+ import { toOpenAIMessages } from 'conversationalist/openai';
+
+ let conversation = createConversation({
+   title: 'Support chat',
+   tags: ['support'],
+ });
+
+ conversation = appendUserMessage(conversation, 'Where is my order?');
+ conversation = appendAssistantMessage(conversation, 'Let me check that for you.');
+
+ const messages = toOpenAIMessages(conversation);
+ // send messages to your provider...
+
+ const stored = serializeConversation(conversation);
+ const restored = deserializeConversation(stored);
+ ```
+
+ ## End-to-End Example (Store + Resume)
+
+ ```ts
+ import {
    appendAssistantMessage,
+   appendSystemMessage,
+   appendUserMessage,
+   createConversation,
+   deserializeConversation,
+   serializeConversation,
  } from 'conversationalist';
+ import { toOpenAIMessages } from 'conversationalist/openai';
+
+ let conversation = createConversation({ title: 'Order lookup' });
+ conversation = appendSystemMessage(conversation, 'You are a support agent.');
+ conversation = appendUserMessage(conversation, 'Where is order 123?');
+
+ const response = await openai.chat.completions.create({
+   model: 'gpt-4.1-mini',
+   messages: toOpenAIMessages(conversation),
+ });

- let conversation = createConversation({ title: 'My Chat' });
+ conversation = appendAssistantMessage(
+   conversation,
+   response.choices[0]?.message?.content ?? '',
+ );

- conversation = appendUserMessage(conversation, 'Hello!');
- conversation = appendAssistantMessage(conversation, 'Hi there!');
+ const stored = serializeConversation(conversation);
+ // await db.save(stored);
+
+ let restored = deserializeConversation(stored);
+ restored = appendUserMessage(restored, 'Can you email me the tracking link?');
  ```

- ### Messages
+ ## When to Use

- Messages have roles (`user`, `assistant`, `system`, `tool-use`, `tool-result`, `developer`, `snapshot`) and can contain text or multi-modal content:
+ - Build multi-provider chat apps with a consistent message model.
+ - Store and replay conversation history safely.
+ - Drive streaming UIs without ad-hoc state machines.
+ - Enforce tool call correctness and pair calls with results.
+ - Manage context windows and token limits in one place.

- ```typescript
- import { appendMessages } from 'conversationalist';
+ ## Core Concepts

- // Text message
- conversation = appendMessages(conversation, {
-   role: 'user',
-   content: 'What is in this image?',
+ ### Conversations
+
+ A conversation is an immutable record with metadata, tags, timestamps, and ordered messages.
+
+ ```ts
+ import { createConversation } from 'conversationalist';
+
+ const conversation = createConversation({
+   title: 'My Chat',
+   status: 'active',
+   metadata: { customerId: 'cus_123' },
+   tags: ['support', 'vip'],
  });
+ ```
+
+ ### Messages
+
+ Messages have roles and can contain text or multi-modal content. Optional fields include
+ `metadata`, `hidden`, `tokenUsage`, `toolCall`, `toolResult`, and `goalCompleted`.
+
+ Roles: `user`, `assistant`, `system`, `developer`, `tool-use`, `tool-result`, `snapshot`.
+ The `snapshot` role is for internal state and is skipped by adapters.
+
+ ```ts
+ import { appendMessages } from 'conversationalist';

- // Multi-modal message with image
  conversation = appendMessages(conversation, {
    role: 'user',
    content: [
@@ -51,100 +127,278 @@ conversation = appendMessages(conversation, {
  });
  ```

+ Hidden messages remain in history but are skipped by default when querying or adapting to
+ providers.
+
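The default visibility rule can be pictured with a small self-contained sketch (the type and function below are illustrative stand-ins, not the library's API):

```ts
// Illustrative model only: the real library stores richer Message objects.
type SketchMessage = { role: string; content: string; hidden?: boolean };

const history: SketchMessage[] = [
  { role: 'user', content: 'Where is my order?' },
  { role: 'snapshot', content: 'internal state', hidden: true },
  { role: 'assistant', content: 'Let me check.' },
];

// Mirrors the documented default: hidden messages stay in history,
// but are dropped unless includeHidden is requested.
function visibleMessages(
  messages: SketchMessage[],
  includeHidden = false,
): SketchMessage[] {
  return includeHidden ? [...messages] : messages.filter((m) => !m.hidden);
}
```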
  ### Tool Calls

- Tool use is supported with linked tool calls and results:
+ Tool calls are represented as paired `tool-use` and `tool-result` messages. Tool results
+ are validated to ensure the referenced call exists.

- ```typescript
+ ```ts
  conversation = appendMessages(
    conversation,
    {
      role: 'tool-use',
      content: '',
-     toolCall: { id: 'call_123', name: 'get_weather', arguments: '{"city":"NYC"}' },
+     toolCall: { id: 'call_123', name: 'getWeather', arguments: { city: 'NYC' } },
    },
    {
      role: 'tool-result',
      content: '',
-     toolResult: { callId: 'call_123', outcome: 'success', content: '72°F, sunny' },
+     toolResult: {
+       callId: 'call_123',
+       outcome: 'success',
+       content: { tempF: 72, condition: 'sunny' },
+     },
    },
  );
  ```

- ## API Reference
+ Use `pairToolCallsWithResults` to render tool calls alongside their results.
+
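The pairing idea can be sketched with plain objects (local types here are illustrative; `pairToolCallsWithResults` itself operates on the library's `Message` type):

```ts
// Illustrative types; the library's actual Message shape is richer.
type SketchCall = { id: string; name: string };
type SketchResult = { callId: string; content: unknown };
type SketchMessage =
  | { role: 'tool-use'; toolCall: SketchCall }
  | { role: 'tool-result'; toolResult: SketchResult };

// Pair each tool call with its result (if one arrived), preserving call order.
function pairCalls(
  messages: SketchMessage[],
): { call: SketchCall; result?: SketchResult }[] {
  const pairs: { call: SketchCall; result?: SketchResult }[] = [];
  for (const message of messages) {
    if (message.role === 'tool-use') {
      pairs.push({ call: message.toolCall });
    } else {
      const pair = pairs.find((p) => p.call.id === message.toolResult.callId);
      if (pair) pair.result = message.toolResult;
    }
  }
  return pairs;
}
```

Calls without a matching result simply come back with `result` undefined, which is handy for rendering in-flight tool use.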
+ ### Streaming
+
+ Streaming helpers let you append a placeholder, update it as chunks arrive, and finalize
+ when done.
+
+ ```ts
+ import {
+   appendStreamingMessage,
+   finalizeStreamingMessage,
+   updateStreamingMessage,
+ } from 'conversationalist';
+
+ const started = appendStreamingMessage(conversation, 'assistant');
+ conversation = started.conversation;
+ const { messageId } = started;
+ let content = '';

- ### Creating Conversations
+ for await (const chunk of stream) {
+   content += chunk;
+   conversation = updateStreamingMessage(conversation, messageId, content);
+ }

- ```typescript
- createConversation(options?: {
-   id?: string;
-   title?: string;
-   status?: 'active' | 'archived' | 'deleted';
-   metadata?: Record<string, unknown>;
-   tags?: string[];
- }): Conversation
+ conversation = finalizeStreamingMessage(conversation, messageId, {
+   tokenUsage: { prompt: 100, completion: 50, total: 150 },
+ });
  ```

- ### Appending Messages
+ ### Context Window
+
+ Trim history to fit token budgets or to keep only recent messages.

- ```typescript
- appendMessages(conversation: Conversation, ...inputs: MessageInput[]): Conversation
- appendUserMessage(conversation: Conversation, content: string | MultiModalContent[]): Conversation
- appendAssistantMessage(conversation: Conversation, content: string | MultiModalContent[]): Conversation
- appendSystemMessage(conversation: Conversation, content: string): Conversation
- prependSystemMessage(conversation: Conversation, content: string): Conversation
- replaceSystemMessage(conversation: Conversation, content: string): Conversation
+ ```ts
+ import { simpleTokenEstimator, truncateToTokenLimit } from 'conversationalist';
+
+ conversation = truncateToTokenLimit(conversation, 4000, simpleTokenEstimator, {
+   preserveSystemMessages: true,
+   preserveLastN: 2,
+ });
  ```
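To build intuition for what an estimator does, here is a self-contained character-based heuristic. This is an assumption for illustration only: the package's `simpleTokenEstimator` may use a different formula. A common rule of thumb is roughly 4 characters per token:

```ts
// Rough heuristic, ~4 characters per token. Purely illustrative, not the
// library's actual simpleTokenEstimator implementation.
function estimateTextTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

// Summing per-message estimates approximates a conversation's budget use,
// which is what a truncation pass compares against maxTokens.
function estimateTotalTokens(contents: string[]): number {
  return contents.reduce((sum, text) => sum + estimateTextTokens(text), 0);
}
```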
- ### Querying Messages
+ ### Provider Adapters
+
+ Convert the same conversation into provider-specific formats.

- ```typescript
- getConversationMessages(conversation: Conversation, options?: { includeHidden?: boolean }): Message[]
- getMessageAtPosition(conversation: Conversation, position: number): Message | undefined
- getMessageByIdentifier(conversation: Conversation, id: string): Message | undefined
- searchConversationMessages(conversation: Conversation, predicate: (m: Message) => boolean): Message[]
+ Adapters skip hidden and snapshot messages and map system or developer roles as needed.
+
+ ```ts
+ import { toOpenAIMessages, toOpenAIMessagesGrouped } from 'conversationalist/openai';
+ import { toAnthropicMessages } from 'conversationalist/anthropic';
+ import { toGeminiMessages } from 'conversationalist/gemini';
  ```

- ### System Messages
+ - OpenAI: `toOpenAIMessages` and `toOpenAIMessagesGrouped` (groups consecutive tool calls)
+ - Anthropic: `toAnthropicMessages`
+ - Gemini: `toGeminiMessages`
+
+ ### Tool Call Wiring Examples
+
+ #### OpenAI
+
+ ```ts
+ import { appendAssistantMessage, appendMessages } from 'conversationalist';
+ import { toOpenAIMessages } from 'conversationalist/openai';
+
+ const tools = [
+   {
+     type: 'function',
+     function: {
+       name: 'getWeather',
+       description: 'Get current weather by city.',
+       parameters: {
+         type: 'object',
+         properties: { city: { type: 'string' } },
+         required: ['city'],
+       },
+     },
+   },
+ ];
+
+ const response = await openai.chat.completions.create({
+   model: 'gpt-4.1-mini',
+   messages: toOpenAIMessages(conversation),
+   tools,
+ });

- ```typescript
- hasSystemMessage(conversation: Conversation): boolean
- getFirstSystemMessage(conversation: Conversation): Message | undefined
- getSystemMessages(conversation: Conversation): Message[]
- collapseSystemMessages(conversation: Conversation): Conversation
+ const toolCalls = response.choices[0]?.message?.tool_calls ?? [];
+ for (const call of toolCalls) {
+   // OpenAI returns arguments as a JSON string; parse once and reuse.
+   const args = JSON.parse(call.function.arguments);
+   conversation = appendMessages(conversation, {
+     role: 'tool-use',
+     content: '',
+     toolCall: { id: call.id, name: call.function.name, arguments: args },
+   });
+
+   const result = await getWeather(args);
+   conversation = appendMessages(conversation, {
+     role: 'tool-result',
+     content: '',
+     toolResult: { callId: call.id, outcome: 'success', content: result },
+   });
+ }
+
+ const followUp = await openai.chat.completions.create({
+   model: 'gpt-4.1-mini',
+   messages: toOpenAIMessages(conversation),
+   tools,
+ });
+
+ conversation = appendAssistantMessage(
+   conversation,
+   followUp.choices[0]?.message?.content ?? '',
+ );
  ```

- ### Utilities
+ #### Anthropic

- ```typescript
- computeConversationStatistics(conversation: Conversation): {
-   total: number;
-   byRole: Record<string, number>;
-   hidden: number;
-   withImages: number;
+ ```ts
+ import { appendAssistantMessage, appendMessages } from 'conversationalist';
+ import { toAnthropicMessages } from 'conversationalist/anthropic';
+
+ const tools = [
+   {
+     name: 'getWeather',
+     description: 'Get current weather by city.',
+     input_schema: {
+       type: 'object',
+       properties: { city: { type: 'string' } },
+       required: ['city'],
+     },
+   },
+ ];
+
+ const { system, messages } = toAnthropicMessages(conversation);
+ const response = await anthropic.messages.create({
+   model: 'claude-3-5-sonnet-20240620',
+   system,
+   messages,
+   tools,
+ });
+
+ for (const block of response.content) {
+   if (block.type !== 'tool_use') continue;
+   conversation = appendMessages(conversation, {
+     role: 'tool-use',
+     content: '',
+     toolCall: { id: block.id, name: block.name, arguments: block.input },
+   });
+
+   const result = await getWeather(block.input as { city: string });
+   conversation = appendMessages(conversation, {
+     role: 'tool-result',
+     content: '',
+     toolResult: { callId: block.id, outcome: 'success', content: result },
+   });
  }

- redactMessageAtPosition(conversation: Conversation, position: number, placeholder?: string): Conversation
+ const followUp = await anthropic.messages.create({
+   model: 'claude-3-5-sonnet-20240620',
+   ...toAnthropicMessages(conversation),
+   tools,
+ });
+
+ const assistantText = followUp.content
+   .filter((block) => block.type === 'text')
+   .map((block) => block.text)
+   .join('\n');
+
+ conversation = appendAssistantMessage(conversation, assistantText);
  ```

- ### Serialization
+ #### Gemini
+
+ Gemini does not include tool call IDs, so generate one to pair the tool result.
+
+ ```ts
+ import { appendMessages } from 'conversationalist';
+ import { toGeminiMessages } from 'conversationalist/gemini';
+
+ const tools = [
+   {
+     functionDeclarations: [
+       {
+         name: 'getWeather',
+         description: 'Get current weather by city.',
+         parameters: {
+           type: 'object',
+           properties: { city: { type: 'string' } },
+           required: ['city'],
+         },
+       },
+     ],
+   },
+ ];
+
+ const { systemInstruction, contents } = toGeminiMessages(conversation);
+ const response = await model.generateContent({
+   systemInstruction,
+   contents,
+   tools,
+ });
+
+ const parts = response.response.candidates?.[0]?.content?.parts ?? [];
+ for (const part of parts) {
+   if (!('functionCall' in part)) continue;
+   const callId = crypto.randomUUID();
+   const args = part.functionCall.args as { city: string };
+
+   conversation = appendMessages(conversation, {
+     role: 'tool-use',
+     content: '',
+     toolCall: { id: callId, name: part.functionCall.name, arguments: args },
+   });
+
+   const result = await getWeather(args);
+   conversation = appendMessages(conversation, {
+     role: 'tool-result',
+     content: '',
+     toolResult: { callId, outcome: 'success', content: result },
+   });
+ }

- ```typescript
- serializeConversation(conversation: Conversation): ConversationJSON
- deserializeConversation(json: ConversationJSON): Conversation
- toChatMessages(conversation: Conversation): ExternalMessage[]
+ const followUp = await model.generateContent({
+   ...toGeminiMessages(conversation),
+   tools,
+ });
  ```

- ### Builder Pattern
+ ## Builder Pattern

- For fluent API style:
+ Use the draft pattern for fluent, mutable-style updates that still return immutable
+ conversations.

- ```typescript
- import { createConversation } from 'conversationalist';
- import { withConversation, pipeConversation } from 'conversationalist';
- import { simpleTokenEstimator } from 'conversationalist';
+ ```ts
+ import {
+   appendSystemMessage,
+   appendUserMessage,
+   createConversation,
+   pipeConversation,
+   withConversation,
+ } from 'conversationalist';

- // Draft pattern with callback
  const conversation = withConversation(createConversation(), (draft) => {
    draft
      .appendSystemMessage('You are a helpful assistant.')
@@ -152,208 +406,167 @@ const conversation = withConversation(createConversation(), (draft) => {
      .appendAssistantMessage('Hi there!');
  });

- // Streaming with the draft pattern
- const streamedConversation = withConversation(createConversation(), (draft) => {
-   const { draft: d, messageId } = draft.appendStreamingMessage('assistant');
-   d.updateStreamingMessage(messageId, 'Partial response...').finalizeStreamingMessage(
-     messageId,
-     { tokenUsage: { prompt: 10, completion: 5, total: 15 } },
-   );
- });
-
- // Context window management with the draft pattern
- const truncatedConversation = withConversation(conversation, (draft) => {
-   draft.truncateToTokenLimit(4000, simpleTokenEstimator, { preserveLastN: 2 });
- });
-
- // Pipe pattern
- const conversation = pipeConversation(
+ const piped = pipeConversation(
    createConversation(),
    (c) => appendSystemMessage(c, 'You are helpful.'),
    (c) => appendUserMessage(c, 'Hello!'),
  );
  ```

- The `ConversationDraft` provides all conversation manipulation methods:
-
- - **Message appending**: `appendMessages`, `appendUserMessage`, `appendAssistantMessage`, `appendSystemMessage`
- - **System messages**: `prependSystemMessage`, `replaceSystemMessage`, `collapseSystemMessages`
- - **Modification**: `redactMessageAtPosition`
- - **Streaming**: `appendStreamingMessage`, `updateStreamingMessage`, `finalizeStreamingMessage`, `cancelStreamingMessage`
- - **Context window**: `truncateFromPosition`, `truncateToTokenLimit`
+ `ConversationDraft` includes appending, system message helpers, streaming, redaction, and
+ context window utilities.

- ### Tool Call Pairing
+ ## API Overview

- ```typescript
- import { pairToolCallsWithResults } from 'conversationalist';
+ ### Conversation Creation and Serialization

- const pairs = pairToolCallsWithResults(conversation.messages);
- // Returns: [{ call: ToolCall, result?: ToolResult }, ...]
+ ```ts
+ createConversation(options?, environment?)
+ serializeConversation(conversation)
+ deserializeConversation(json)
+ toChatMessages(conversation)
  ```

- ## Provider Adapters
-
- Convert conversations to provider-specific formats using subpath exports:
+ ### Message Appending

- ### OpenAI
+ ```ts
+ appendMessages(conversation, ...inputs, environment?)
+ appendUserMessage(conversation, content, metadata?, environment?)
+ appendAssistantMessage(conversation, content, metadata?, environment?)
+ appendSystemMessage(conversation, content, metadata?, environment?)
+ ```

- ```typescript
- import { toOpenAIMessages } from 'conversationalist/openai';
+ ### System Message Utilities

- const messages = toOpenAIMessages(conversation);
- const response = await openai.chat.completions.create({
-   model: 'gpt-4',
-   messages,
- });
+ ```ts
+ prependSystemMessage(conversation, content, metadata?, environment?)
+ replaceSystemMessage(conversation, content, metadata?, environment?)
+ collapseSystemMessages(conversation, environment?)
+ hasSystemMessage(conversation)
+ getFirstSystemMessage(conversation)
+ getSystemMessages(conversation)
  ```

- ### Anthropic
+ ### Querying and Stats

- ```typescript
- import { toAnthropicMessages } from 'conversationalist/anthropic';
-
- const { system, messages } = toAnthropicMessages(conversation);
- const response = await anthropic.messages.create({
-   model: 'claude-3-opus-20240229',
-   system,
-   messages,
- });
+ ```ts
+ getConversationMessages(conversation, { includeHidden? })
+ getMessageAtPosition(conversation, position)
+ getMessageByIdentifier(conversation, id)
+ searchConversationMessages(conversation, predicate)
+ computeConversationStatistics(conversation)
  ```
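The previous release documented the statistics shape as `total`, `byRole`, and `hidden` counts. That aggregation can be sketched self-contained (local types are illustrative, not the library's `Message`):

```ts
// Illustrative message shape; the real library tracks more fields.
type SketchMessage = { role: string; hidden?: boolean };

// One pass over history: overall total, per-role tallies, and hidden count,
// mirroring the documented return shape of computeConversationStatistics.
function computeStats(messages: SketchMessage[]): {
  total: number;
  byRole: Record<string, number>;
  hidden: number;
} {
  const byRole: Record<string, number> = {};
  let hidden = 0;
  for (const message of messages) {
    byRole[message.role] = (byRole[message.role] ?? 0) + 1;
    if (message.hidden) hidden += 1;
  }
  return { total: messages.length, byRole, hidden };
}
```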
- ### Google Gemini
+ ### Modification

- ```typescript
- import { toGeminiMessages } from 'conversationalist/gemini';
-
- const { systemInstruction, contents } = toGeminiMessages(conversation);
- const response = await model.generateContent({
-   systemInstruction,
-   contents,
- });
+ ```ts
+ redactMessageAtPosition(conversation, position, placeholder?)
  ```

- ## Streaming Support
+ ### Streaming

- Handle streaming responses with pending message utilities:
+ ```ts
+ appendStreamingMessage(conversation, role, metadata?, environment?)
+ updateStreamingMessage(conversation, messageId, content, environment?)
+ finalizeStreamingMessage(conversation, messageId, { tokenUsage?, metadata? }, environment?)
+ cancelStreamingMessage(conversation, messageId, environment?)
+ isStreamingMessage(message)
+ getStreamingMessage(conversation)
+ ```

- ```typescript
- import {
-   appendStreamingMessage,
-   updateStreamingMessage,
-   finalizeStreamingMessage,
-   cancelStreamingMessage,
-   isStreamingMessage,
-   getStreamingMessage,
- } from 'conversationalist';
+ ### Context Window

- // Start a streaming message
- let { conversation, messageId } = appendStreamingMessage(conversation, 'assistant');
+ ```ts
+ getRecentMessages(conversation, count, { includeHidden?, includeSystem? })
+ truncateFromPosition(conversation, position, { preserveSystemMessages? }, environment?)
+ truncateToTokenLimit(
+   conversation,
+   maxTokens,
+   estimateTokens,
+   { preserveSystemMessages?, preserveLastN? },
+   environment?,
+ )
+ estimateConversationTokens(conversation, estimateTokens)
+ simpleTokenEstimator(message)
+ ```

- // Update content as chunks arrive
- for await (const chunk of stream) {
-   accumulatedContent += chunk;
-   conversation = updateStreamingMessage(conversation, messageId, accumulatedContent);
- }
+ ### Utilities

- // Finalize when complete
- conversation = finalizeStreamingMessage(conversation, messageId, {
-   tokenUsage: { prompt: 100, completion: 50, total: 150 },
- });
+ ```ts
+ pairToolCallsWithResults(messages)
+ normalizeContent(content)
+ toMultiModalArray(content)
+ createMessage(messageJSON)
+ copyContent(content)
+ copyMultiModalContent(item)
+ ```

- // Or cancel if needed
- conversation = cancelStreamingMessage(conversation, messageId);
+ ### Schemas
+
+ ```ts
+ conversationSchema
+ conversationShape
+ messageInputSchema
+ messageJSONSchema
+ messageRoleSchema
+ multiModalContentSchema
+ tokenUsageSchema
+ toolCallSchema
+ toolResultSchema
  ```

- ## Context Window Utilities
+ ### Errors
+
+ ```ts
+ ConversationalistError
+ createDuplicateIdError
+ createInvalidInputError
+ createInvalidPositionError
+ createInvalidToolReferenceError
+ createLockedError
+ createNotFoundError
+ createSerializationError
+ createValidationError
+ ```
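The previous release's README showed the intended usage: catch, check `instanceof ConversationalistError`, then branch on `error.code` (for example `'INVALID_POSITION'`). Here is a self-contained sketch of that pattern; the local class below only stands in for the real exported one:

```ts
// Stand-in for ConversationalistError: an Error subclass carrying a stable code.
class SketchConversationError extends Error {
  constructor(message: string, readonly code: string) {
    super(message);
    this.name = 'SketchConversationError';
  }
}

// Hypothetical helper used to trigger the coded error for demonstration.
function getMessageAt(messages: string[], position: number): string {
  if (position < 0 || position >= messages.length) {
    throw new SketchConversationError('position out of range', 'INVALID_POSITION');
  }
  return messages[position];
}

let caughtCode = '';
try {
  getMessageAt(['hello'], 5);
} catch (error) {
  // The instanceof check narrows the type, so .code is safe to read.
  if (error instanceof SketchConversationError) {
    caughtCode = error.code;
  }
}
```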
- Manage token limits and message truncation:
+ ## Deterministic Environments (Testing)

- ```typescript
- import {
-   getRecentMessages,
-   truncateFromPosition,
-   truncateToTokenLimit,
-   estimateConversationTokens,
-   simpleTokenEstimator,
- } from 'conversationalist';
+ Pass a custom environment to control timestamps and IDs.

- // Get last N messages (excluding system by default)
- const recent = getRecentMessages(conversation, 10);
+ ```ts
+ import { appendUserMessage, createConversation } from 'conversationalist';

- // Truncate to messages from position onwards
- const truncated = truncateFromPosition(conversation, 5);
+ const env = {
+   now: () => '2024-01-01T00:00:00.000Z',
+   randomId: () => 'fixed-id',
+ };

- // Truncate to fit token limit
- const fitted = truncateToTokenLimit(conversation, 4000, simpleTokenEstimator, {
-   preserveSystemMessages: true,
-   preserveLastN: 2,
- });
-
- // Estimate total tokens
- const tokens = estimateConversationTokens(conversation, simpleTokenEstimator);
+ let conversation = createConversation({ title: 'Test' }, env);
+ conversation = appendUserMessage(conversation, 'Hello', undefined, env);
  ```

  ## Types

- ```typescript
+ ```ts
  import type {
    Conversation,
+   ConversationEnvironment,
    ConversationJSON,
    ConversationStatus,
+   ExternalMessage,
+   ImageContent,
    Message,
    MessageInput,
    MessageJSON,
    MessageRole,
+   MultiModalContent,
+   TextContent,
    TokenUsage,
    ToolCall,
    ToolResult,
-   MultiModalContent,
-   TextContent,
-   ImageContent,
  } from 'conversationalist';
  ```
- ## Validation Schemas
-
- Zod schemas are exported for runtime validation:
-
- ```typescript
- import {
-   conversationSchema,
-   messageInputSchema,
-   messageJSONSchema,
-   messageRoleSchema,
-   multiModalContentSchema,
-   tokenUsageSchema,
-   toolCallSchema,
-   toolResultSchema,
- } from 'conversationalist';
-
- const result = messageInputSchema.safeParse(data);
- ```
-
- ## Error Handling
-
- Custom error types with codes:
-
- ```typescript
- import {
-   ConversationalistError,
-   createInvalidInputError,
-   createInvalidPositionError,
-   createNotFoundError,
-   createValidationError,
- } from 'conversationalist';
-
- try {
-   // ...
- } catch (error) {
-   if (error instanceof ConversationalistError) {
-     console.log(error.code); // e.g., 'INVALID_POSITION'
-   }
- }
- ```
-
  ## Development

  ```bash