conversationalist 0.0.2 → 0.0.4

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,14 +1,52 @@
  # Conversationalist

- A TypeScript-first library for managing LLM conversation state with immutable updates, type-safe APIs, and provider adapters.
+ A TypeScript-first library for managing LLM conversation state with **immutable updates**, **type-safe APIs**, and **provider-agnostic adapters**.

- The high-level value proposition:
+ [![Tests](https://github.com/stevekinney/conversationalist/actions/workflows/test.yml/badge.svg)](https://github.com/stevekinney/conversationalist/actions/workflows/test.yml)
+ [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)

- - keep a single, model-agnostic conversation shape across UI, storage, and providers
- - append messages without mutation (safe for React, concurrent updates, replay)
- - handle tool calls, streaming responses, and hidden or internal messages
- - trim context windows and estimate token budgets
- - validate input with Zod schemas and typed errors
+ ## What is Conversationalist?
+
+ **Conversationalist** is a state engine for LLM-driven applications. While most libraries focus on making the API calls themselves, Conversationalist focuses on the **state** that lives between those calls. It provides a unified, model-agnostic representation of a conversation that can be easily stored, serialized, and adapted for any major LLM provider (OpenAI, Anthropic, Gemini).
+
+ In a modern AI application, a conversation is more than just a list of strings. It involves:
+
+ - **Tool Use**: Pairing function calls with their results and ensuring they stay in sync.
+ - **Hidden Logic**: Internal "thought" messages or snapshots that should be saved but never sent to the provider.
+ - **Metadata**: Tracking tags, custom IDs, and tokens across different steps.
+ - **Streaming**: Gracefully handling partial messages in a UI without messy state transitions.
+
+ Conversationalist handles these complexities through a robust, type-safe API that treats your conversation as the "Single Source of Truth."
+
+ ## Why Use It?
+
+ Managing LLM conversations manually often leads to "provider lock-in" or fragile glue code. Conversationalist solves this by:
+
+ - **Decoupling Logic from Providers**: Write your business logic once using Conversationalist's message model, and use adapters to talk to OpenAI, Anthropic, or Gemini.
+ - **Built-in Context Management**: Automatically handle context window limits by truncating history while preserving critical system instructions or recent messages.
+ - **Type Safety Out-of-the-Box**: Built with Zod and TypeScript, ensuring that your conversation data is valid at runtime and compile-time.
+ - **Unified Serialization**: One standard format (`ConversationJSON`) for your database, your frontend, and your backend.
+
+ ## The Immutable Advantage
+
+ At its core, Conversationalist is **strictly immutable**. Every change to a conversation—whether appending a message, updating a stream, or redacting sensitive data—returns a _new_ conversation object.
+
+ This approach offers several critical advantages for modern application development:
+
+ 1. **React/Redux Friendly**: Because updates return new references, they trigger re-renders naturally and work seamlessly with state management libraries.
+ 2. **Concurrency Safe**: You can safely pass a conversation to multiple functions or async tasks without worrying about one part of your app mutating it out from under another.
+ 3. **Easy Branching & Replay**: Want to let a user "undo" an AI response or branch a conversation into two different paths? Simply keep a reference to the previous immutable state. No complex cloning required.
+ 4. **Auditability**: Timestamps and message positions are automatically managed and preserved, making it easy to reconstruct the exact state of a chat at any point in time.
+
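The pattern behind these guarantees can be sketched in a few lines of TypeScript. This is a generic illustration of immutable updates, not the library's actual internals (the real `Conversation` type also carries IDs, timestamps, and metadata):

```typescript
// A simplified stand-in for the library's Conversation type.
type Message = { role: string; content: string };
type Conversation = { readonly messages: readonly Message[] };

// Appending copies the message list into a new object; nothing is mutated.
function appendMessage(conversation: Conversation, message: Message): Conversation {
  return { messages: [...conversation.messages, message] };
}

const before: Conversation = { messages: [] };
const after = appendMessage(before, { role: 'user', content: 'Hello!' });

console.log(before.messages.length); // 0 (the original is untouched)
console.log(after.messages.length); // 1
console.log(before !== after); // true (a new reference, so state libraries see the change)
```

Keeping a reference to `before` is all it takes to support undo, replay, or branching.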
+ ## Real-World Use Cases
+
+ - **Multi-Model Chatbots**: Build a UI where users can switch between GPT-4o and Claude 3.5 Sonnet mid-conversation without losing history.
+ - **Chain-of-Thought Workflows**: Use `hidden` messages to store internal reasoning or intermediate steps that the AI uses to reach a final answer, without cluttering the user's view.
+ - **Agentic Workflows**: Track complex tool-use loops where multiple functions are called in sequence, ensuring every result is correctly paired with its corresponding call ID.
+ - **Token Budgeting**: Automatically trim old messages when a conversation gets too long, ensuring your API costs stay predictable and you never hit provider limits.
+ - **Deterministic Testing**: Use the custom `environment` parameter to mock IDs and timestamps, allowing you to write 100% deterministic tests for your chat logic.
+
+ ---

  ## Installation

@@ -18,8 +56,7 @@ npm add conversationalist zod
  pnpm add conversationalist zod
  ```

- This package is ESM-only. Use `import` syntax.
- Zod is a peer dependency and must be installed by your app.
+ This package is ESM-only. Use `import` syntax. Zod is a peer dependency and must be installed by your application.

  ## Quick Start

@@ -28,68 +65,29 @@ import {
  appendAssistantMessage,
  appendUserMessage,
  createConversation,
- deserializeConversation,
  serializeConversation,
  } from 'conversationalist';
  import { toOpenAIMessages } from 'conversationalist/openai';

+ // 1. Create a conversation
  let conversation = createConversation({
- title: 'Support chat',
- tags: ['support'],
+ title: 'Order Support',
+ metadata: { userId: 'user_123' },
  });

+ // 2. Add messages (returns a new conversation object)
  conversation = appendUserMessage(conversation, 'Where is my order?');
  conversation = appendAssistantMessage(conversation, 'Let me check that for you.');

- const messages = toOpenAIMessages(conversation);
- // send messages to your provider...
-
- const stored = serializeConversation(conversation);
- const restored = deserializeConversation(stored);
- ```
-
- ## End-to-End Example (Store + Resume)
-
- ```ts
- import {
- appendAssistantMessage,
- appendSystemMessage,
- appendUserMessage,
- createConversation,
- deserializeConversation,
- serializeConversation,
- } from 'conversationalist';
- import { toOpenAIMessages } from 'conversationalist/openai';
-
- let conversation = createConversation({ title: 'Order lookup' });
- conversation = appendSystemMessage(conversation, 'You are a support agent.');
- conversation = appendUserMessage(conversation, 'Where is order 123?');
-
- const response = await openai.chat.completions.create({
- model: 'gpt-4.1-mini',
- messages: toOpenAIMessages(conversation),
- });
-
- conversation = appendAssistantMessage(
- conversation,
- response.choices[0]?.message?.content ?? '',
- );
+ // 3. Adapt for a provider
+ const openAIMessages = toOpenAIMessages(conversation);
+ // [{ role: 'user', content: 'Where is my order?' }, ...]

- const stored = serializeConversation(conversation);
- // await db.save(stored);
-
- let restored = deserializeConversation(stored);
- restored = appendUserMessage(restored, 'Can you email me the tracking link?');
+ // 4. Save to your database
+ const data = serializeConversation(conversation);
+ // db.save(data.id, JSON.stringify(data));
  ```

- ## When to Use
-
- - Build multi-provider chat apps with a consistent message model.
- - Store and replay conversation history safely.
- - Drive streaming UIs without ad-hoc state machines.
- - Enforce tool call correctness and pair calls with results.
- - Manage context windows and token limits in one place.
-
  ## Core Concepts

  ### Conversations
@@ -112,7 +110,7 @@ const conversation = createConversation({
  Messages have roles and can contain text or multi-modal content. Optional fields include
  `metadata`, `hidden`, `tokenUsage`, `toolCall`, `toolResult`, and `goalCompleted`.

- Roles: `user`, `assistant`, `system`, `developer`, `tool-use`, `tool-result`, `snapshot`.
+ **Roles**: `user`, `assistant`, `system`, `developer`, `tool-use`, `tool-result`, `snapshot`.
  The `snapshot` role is for internal state and is skipped by adapters.

  ```ts
@@ -127,8 +125,8 @@ conversation = appendMessages(conversation, {
  });
  ```

- Hidden messages remain in history but are skipped by default when querying or adapting to
- providers.
+ **Hidden messages** remain in history but are skipped by default when querying or adapting to
+ providers. This is perfect for internal logging or "thinking" steps.

  ### Tool Calls

@@ -155,8 +153,6 @@ conversation = appendMessages(
  );
  ```

- Use `pairToolCallsWithResults` to render tool calls alongside their results.
-
  ### Streaming

  Streaming helpers let you append a placeholder, update it as chunks arrive, and finalize
@@ -182,62 +178,218 @@ conversation = finalizeStreamingMessage(conversation, messageId, {
  });
  ```

- ### Context Window
+ ### Context Window Management

- Trim history to fit token budgets or to keep only recent messages.
+ Automatically trim history to fit token budgets or to keep only recent messages.

  ```ts
  import { simpleTokenEstimator, truncateToTokenLimit } from 'conversationalist';

- conversation = truncateToTokenLimit(conversation, 4000, simpleTokenEstimator, {
+ conversation = truncateToTokenLimit(conversation, 4000, {
  preserveSystemMessages: true,
  preserveLastN: 2,
  });
  ```

- ### Provider Adapters
+ #### Custom Token Counters

- Convert the same conversation into provider-specific formats.
+ You can provide a custom token estimator (e.g. using `tiktoken` or `anthropic-tokenizer`) by passing it in the options or by binding it to your environment.
+
+ ```ts
+ import { truncateToTokenLimit } from 'conversationalist';
+ // import { get_encoding } from 'tiktoken';

- Adapters skip hidden and snapshot messages and map system or developer roles as needed.
+ const tiktokenEstimator = (message) => {
+ // Your logic here...
+ return 100;
+ };
+
+ // 1. Pass directly in options
+ conversation = truncateToTokenLimit(conversation, 4000, {
+ estimateTokens: tiktokenEstimator,
+ });
+
+ // 2. Or bind to a history instance/environment
+ const history = new ConversationHistory(conversation, {
+ estimateTokens: tiktokenEstimator,
+ });
+
+ const boundTruncate = history.bind(truncateToTokenLimit);
+ boundTruncate(4000); // Uses tiktokenEstimator automatically
+ ```
+
+ ### Markdown Conversion
+
+ Convert conversations to human-readable Markdown format, or parse Markdown back into a conversation object.
+
+ #### Basic Usage (Clean Markdown)
+
+ By default, `toMarkdown` produces clean, readable Markdown without metadata:

  ```ts
- import { toOpenAIMessages, toOpenAIMessagesGrouped } from 'conversationalist/openai';
+ import {
+ toMarkdown,
+ fromMarkdown,
+ createConversation,
+ appendMessages,
+ } from 'conversationalist';
+
+ let conversation = createConversation({ id: 'conv-1' });
+ conversation = appendMessages(
+ conversation,
+ { role: 'user', content: 'What is 2 + 2?' },
+ { role: 'assistant', content: 'The answer is 4.' },
+ );
+
+ const markdown = toMarkdown(conversation);
+ // Output:
+ // ### User
+ //
+ // What is 2 + 2?
+ //
+ // ### Assistant
+ //
+ // The answer is 4.
+ ```
+
+ When parsing simple Markdown without metadata, `fromMarkdown` generates new IDs and uses sensible defaults:
+
+ ```ts
+ const parsed = fromMarkdown(markdown);
+ // parsed.id is a new generated ID
+ // parsed.status is 'active'
+ // Message IDs are generated, positions are assigned sequentially
+ ```
+
+ #### Lossless Round-Trip (with Metadata)
+
+ For archiving or backup scenarios where you need to preserve all data, use `{ includeMetadata: true }`:
+
+ ```ts
+ const markdown = toMarkdown(conversation, { includeMetadata: true });
+ // Output includes YAML frontmatter with all metadata keyed by message ID:
+ // ---
+ // id: conv-1
+ // status: active
+ // metadata: {}
+ // tags: []
+ // createdAt: '2024-01-15T10:00:00.000Z'
+ // updatedAt: '2024-01-15T10:01:00.000Z'
+ // messages:
+ // msg-1:
+ // position: 0
+ // createdAt: '2024-01-15T10:00:00.000Z'
+ // metadata: {}
+ // hidden: false
+ // msg-2:
+ // position: 1
+ // createdAt: '2024-01-15T10:01:00.000Z'
+ // metadata: {}
+ // hidden: false
+ // ---
+ // ### User (msg-1)
+ //
+ // What is 2 + 2?
+ //
+ // ### Assistant (msg-2)
+ //
+ // The answer is 4.
+
+ // Parse back with all metadata preserved
+ const restored = fromMarkdown(markdown);
+ // restored.id === 'conv-1'
+ // restored.messages[0].id === 'msg-1'
+ ```
+
+ #### Multi-Modal Content
+
+ Both functions handle multi-modal content. Images render as Markdown images, and with metadata enabled, additional properties like `mimeType` are preserved in the YAML frontmatter:
+
+ ```ts
+ conversation = appendMessages(conversation, {
+ role: 'user',
+ content: [
+ { type: 'text', text: 'Describe this:' },
+ { type: 'image', url: 'https://example.com/photo.png', mimeType: 'image/png' },
+ ],
+ });
+
+ const md = toMarkdown(conversation);
+ // Describe this:
+ //
+ // ![image](https://example.com/photo.png)
+ ```
+
+ ## Plugins
+
+ **Conversationalist** supports a plugin system that allows you to transform messages as they are appended to a conversation. Plugins are functions that take a `MessageInput` and return a modified `MessageInput`.
+
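To make the plugin contract concrete, here is a minimal custom plugin. The `MessageInput` type below is a simplified stand-in for the library's real type, and the plugin itself is a hypothetical example, not one shipped with the package:

```typescript
// A simplified stand-in for the library's MessageInput type.
type MessageInput = { role: string; content: string };

// A plugin is just a function from MessageInput to MessageInput.
const shoutPlugin = (input: MessageInput): MessageInput => ({
  ...input,
  content: input.content.toUpperCase(),
});

console.log(shoutPlugin({ role: 'user', content: 'hello' }).content); // "HELLO"
```

Registered in an environment's `plugins` array, a function like this would run on every appended message.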
+ ### PII Redaction Plugin
+
+ The library includes a built-in `piiRedactionPlugin` that can automatically redact emails, phone numbers, and common API key patterns.
+
+ ```ts
+ import {
+ appendUserMessage,
+ createConversation,
+ piiRedactionPlugin,
+ } from 'conversationalist';
+
+ // 1. Enable by adding to your environment
+ const env = {
+ plugins: [piiRedactionPlugin],
+ };
+
+ // 2. Use the environment when appending messages
+ let conversation = createConversation({}, env);
+ conversation = appendUserMessage(
+ conversation,
+ 'Contact me at test@example.com',
+ undefined,
+ env,
+ );
+
+ console.log(conversation.messages[0].content);
+ // "Contact me at [EMAIL_REDACTED]"
+ ```
+
+ When using `ConversationHistory`, you only need to provide the plugin once during initialization:
+
+ ```ts
+ const history = new ConversationHistory(createConversation(), {
+ plugins: [piiRedactionPlugin],
+ });
+
+ const appendUser = history.bind(appendUserMessage);
+ appendUser('My key is sk-12345...'); // Automatically redacted
+ ```
+
+ ## Provider Adapters
+
+ Convert the same conversation into provider-specific formats. Adapters automatically skip hidden/snapshot messages and map roles correctly.
+
+ ```ts
+ import { toOpenAIMessages } from 'conversationalist/openai';
  import { toAnthropicMessages } from 'conversationalist/anthropic';
  import { toGeminiMessages } from 'conversationalist/gemini';
  ```

- - OpenAI: `toOpenAIMessages` and `toOpenAIMessagesGrouped` (groups consecutive tool calls)
- - Anthropic: `toAnthropicMessages`
- - Gemini: `toGeminiMessages`
+ - **OpenAI**: Supports `toOpenAIMessages` and `toOpenAIMessagesGrouped` (which groups consecutive tool calls).
+ - **Anthropic**: Maps system messages and tool blocks to Anthropic's specific format.
+ - **Gemini**: Handles Gemini's unique content/part structure.

- ### Tool Call Wiring Examples
+ ### Provider-Specific Examples

- #### OpenAI
+ #### OpenAI (with Tool Calls)

  ```ts
  import { appendAssistantMessage, appendMessages } from 'conversationalist';
  import { toOpenAIMessages } from 'conversationalist/openai';

- const tools = [
- {
- type: 'function',
- function: {
- name: 'getWeather',
- description: 'Get current weather by city.',
- parameters: {
- type: 'object',
- properties: { city: { type: 'string' } },
- required: ['city'],
- },
- },
- },
- ];
-
  const response = await openai.chat.completions.create({
- model: 'gpt-4.1-mini',
+ model: 'gpt-4o',
  messages: toOpenAIMessages(conversation),
- tools,
+ tools: [{ type: 'function', function: { name: 'getWeather', ... } }],
  });

  const toolCalls = response.choices[0]?.message?.tool_calls ?? [];
@@ -245,11 +397,7 @@ for (const call of toolCalls) {
  conversation = appendMessages(conversation, {
  role: 'tool-use',
  content: '',
- toolCall: {
- id: call.id,
- name: call.function.name,
- arguments: call.function.arguments,
- },
+ toolCall: { id: call.id, name: call.function.name, arguments: call.function.arguments },
  });

  const result = await getWeather(JSON.parse(call.function.arguments));
@@ -259,43 +407,20 @@ for (const call of toolCalls) {
  toolResult: { callId: call.id, outcome: 'success', content: result },
  });
  }
-
- const followUp = await openai.chat.completions.create({
- model: 'gpt-4.1-mini',
- messages: toOpenAIMessages(conversation),
- tools,
- });
-
- conversation = appendAssistantMessage(
- conversation,
- followUp.choices[0]?.message?.content ?? '',
- );
  ```

- #### Anthropic
+ #### Anthropic (with Tool Calls)

  ```ts
  import { appendAssistantMessage, appendMessages } from 'conversationalist';
  import { toAnthropicMessages } from 'conversationalist/anthropic';

- const tools = [
- {
- name: 'getWeather',
- description: 'Get current weather by city.',
- input_schema: {
- type: 'object',
- properties: { city: { type: 'string' } },
- required: ['city'],
- },
- },
- ];
-
  const { system, messages } = toAnthropicMessages(conversation);
  const response = await anthropic.messages.create({
  model: 'claude-3-5-sonnet-20240620',
  system,
  messages,
- tools,
+ tools: [{ name: 'getWeather', ... }],
  });

  for (const block of response.content) {
@@ -306,64 +431,33 @@ for (const block of response.content) {
  toolCall: { id: block.id, name: block.name, arguments: block.input },
  });

- const result = await getWeather(block.input as { city: string });
+ const result = await getWeather(block.input);
  conversation = appendMessages(conversation, {
  role: 'tool-result',
  content: '',
  toolResult: { callId: block.id, outcome: 'success', content: result },
  });
  }
-
- const followUp = await anthropic.messages.create({
- model: 'claude-3-5-sonnet-20240620',
- ...toAnthropicMessages(conversation),
- tools,
- });
-
- const assistantText = followUp.content
- .filter((block) => block.type === 'text')
- .map((block) => block.text)
- .join('\n');
-
- conversation = appendAssistantMessage(conversation, assistantText);
  ```

- #### Gemini
-
- Gemini does not include tool call IDs, so generate one to pair the tool result.
+ #### Gemini (with Tool Calls)

  ```ts
  import { appendMessages } from 'conversationalist';
  import { toGeminiMessages } from 'conversationalist/gemini';

- const tools = [
- {
- functionDeclarations: [
- {
- name: 'getWeather',
- description: 'Get current weather by city.',
- parameters: {
- type: 'object',
- properties: { city: { type: 'string' } },
- required: ['city'],
- },
- },
- ],
- },
- ];
-
  const { systemInstruction, contents } = toGeminiMessages(conversation);
  const response = await model.generateContent({
  systemInstruction,
  contents,
- tools,
+ tools: [{ functionDeclarations: [{ name: 'getWeather', ... }] }],
  });

  const parts = response.response.candidates?.[0]?.content?.parts ?? [];
  for (const part of parts) {
  if (!('functionCall' in part)) continue;
- const callId = crypto.randomUUID();
- const args = part.functionCall.args as { city: string };
+ const callId = crypto.randomUUID(); // Gemini doesn't provide IDs, so we generate one
+ const args = part.functionCall.args;

  conversation = appendMessages(conversation, {
  role: 'tool-use',
@@ -378,26 +472,14 @@ for (const part of parts) {
  toolResult: { callId, outcome: 'success', content: result },
  });
  }
-
- const followUp = await model.generateContent({
- ...toGeminiMessages(conversation),
- tools,
- });
  ```

- ## Builder Pattern
+ ## Builder Pattern (Fluent API)

- Use the draft pattern for fluent, mutable-style updates that still return immutable
- conversations.
+ If you prefer a more fluent style, use `withConversation` or `pipeConversation`. These allow you to "mutate" a draft within a scope while still resulting in an immutable object.

  ```ts
- import {
- appendSystemMessage,
- appendUserMessage,
- createConversation,
- pipeConversation,
- withConversation,
- } from 'conversationalist';
+ import { withConversation, createConversation } from 'conversationalist';

  const conversation = withConversation(createConversation(), (draft) => {
  draft
@@ -405,166 +487,323 @@ const conversation = withConversation(createConversation(), (draft) => {
405
487
  .appendUserMessage('Hello!')
406
488
  .appendAssistantMessage('Hi there!');
407
489
  });
490
+ ```
408
491
 
409
- const piped = pipeConversation(
492
+ `pipeConversation` allows you to chain multiple transformation functions together:
493
+
494
+ ```ts
495
+ import {
496
+ createConversation,
497
+ pipeConversation,
498
+ appendSystemMessage,
499
+ appendUserMessage,
500
+ } from 'conversationalist';
501
+
502
+ const conversation = pipeConversation(
410
503
  createConversation(),
411
- (c) => appendSystemMessage(c, 'You are helpful.'),
504
+ (c) => appendSystemMessage(c, 'You are a helpful assistant.'),
412
505
  (c) => appendUserMessage(c, 'Hello!'),
506
+ (c) => appendAssistantMessage(c, 'Hi there!'),
413
507
  );
414
508
  ```
415
509
 
416
- `ConversationDraft` includes appending, system message helpers, streaming, redaction, and
417
- context window utilities.
510
+ ## Conversation History (Undo/Redo)
418
511
 
419
- ## API Overview
512
+ Use the `ConversationHistory` class to manage a stack of conversation states. Because every change returns a new immutable object, supporting undo/redo is built into the architecture.
513
+
514
+ ````ts
515
+ import {
516
+ ConversationHistory,
517
+ createConversation,
518
+ appendUserMessage,
519
+ } from 'conversationalist';
420
520
 
421
- ### Conversation Creation and Serialization
521
+ const history = new ConversationHistory(createConversation());
422
522
 
423
- ```ts
424
- createConversation(options?, environment?)
425
- serializeConversation(conversation)
426
- deserializeConversation(json)
427
- toChatMessages(conversation)
523
+ // You can use convenience methods that automatically track state
524
+ history.appendUserMessage('Hello!');
525
+ history.appendAssistantMessage('How are you?');
526
+
527
+ history.undo(); // State reverts to just "Hello!"
528
+ history.redo(); // State advances back to "How are you?"
529
+
530
+ // Convenience methods for all library utilities are built-in
531
+ history.appendUserMessage('Another message');
532
+ history.redactMessageAtPosition(0);
533
+ history.truncateToTokenLimit(4000);
534
+
535
+ // Query methods work on the current state
536
+ const messages = history.getMessages();
537
+ const stats = history.getStatistics();
538
+ const tokens = history.estimateTokens();
428
539
  ```
429
540
 
430
- ### Message Appending
541
+ ### Event Subscription
542
+
543
+ `ConversationHistory` implements `EventTarget` and follows the Svelte store contract. You can listen for changes using standard DOM events or the `subscribe` method.
544
+
545
+ #### Using DOM Events
431
546
 
432
547
  ```ts
433
- appendMessages(conversation, ...inputs, environment?)
434
- appendUserMessage(conversation, content, metadata?, environment?)
435
- appendAssistantMessage(conversation, content, metadata?, environment?)
436
- appendSystemMessage(conversation, content, metadata?, environment?)
548
+ const history = new ConversationHistory(createConversation());
549
+
550
+ // addEventListener returns a convenient unsubscribe function
551
+ const unsubscribe = history.addEventListener('change', (event) => {
552
+ const { type, conversation } = event.detail;
553
+ console.log(`History updated via ${type}`);
554
+ });
555
+
556
+ history.appendUserMessage('Hello!'); // Fires 'push' and 'change' events
557
+
558
+ unsubscribe(); // Clean up when done
437
559
  ```
438
560
 
439
- ### System Message Utilities
561
+ #### Using the Store Contract
440
562
 
441
563
  ```ts
442
- prependSystemMessage(conversation, content, metadata?, environment?)
443
- replaceSystemMessage(conversation, content, metadata?, environment?)
444
- collapseSystemMessages(conversation, environment?)
445
- hasSystemMessage(conversation)
446
- getFirstSystemMessage(conversation)
447
- getSystemMessages(conversation)
564
+ // Subscribe returns an unsubscribe function and calls the callback immediately
565
+ const unsubscribe = history.subscribe((conversation) => {
566
+ console.log('Current conversation state:', conversation);
567
+ });
448
568
  ```
449
569
 
450
- ### Querying and Stats
570
+ You can also use an `AbortSignal` for automatic cleanup:
451
571
 
452
572
  ```ts
453
- getConversationMessages(conversation, { includeHidden? })
454
- getMessageAtPosition(conversation, position)
455
- getMessageByIdentifier(conversation, id)
456
- searchConversationMessages(conversation, predicate)
457
- computeConversationStatistics(conversation)
573
+ const controller = new AbortController();
574
+ history.addEventListener('change', (e) => { ... }, { signal: controller.signal });
575
+
576
+ // Later...
577
+ controller.abort();
458
578
  ```
459
579
 
460
- ### Modification
580
+ ### Conversation Branching
581
+
582
+ The `ConversationHistory` class supports branching. When you undo to a previous state and push a new update, it creates an alternate path instead of deleting the old history.
461
583
 
462
584
  ```ts
463
- redactMessageAtPosition(conversation, position, placeholder?)
585
+ const history = new ConversationHistory(createConversation());
586
+
587
+ history.push(appendUserMessage(history.current, 'Path A'));
588
+ history.undo();
589
+
590
+ history.push(appendUserMessage(history.current, 'Path B'));
591
+
592
+ console.log(history.branchCount); // 2
593
+ console.log(history.current.messages[0].content); // "Path B"
594
+
595
+ history.switchToBranch(0);
596
+ console.log(history.current.messages[0].content); // "Path A"
464
597
  ```
465
598
 
466
- ### Streaming
599
+ ### Serialization
600
+
601
+ You can serialize the entire history tree (including all branches) to JSON and reconstruct it later.
467
602
 
468
603
  ```ts
469
- appendStreamingMessage(conversation, role, metadata?, environment?)
470
- updateStreamingMessage(conversation, messageId, content, environment?)
471
- finalizeStreamingMessage(conversation, messageId, { tokenUsage?, metadata? }, environment?)
472
- cancelStreamingMessage(conversation, messageId, environment?)
473
- isStreamingMessage(message)
474
- getStreamingMessage(conversation)
604
+ // 1. Save to JSON
605
+ const json = history.toJSON();
606
+ // localStorage.setItem('chat_history', JSON.stringify(json));
607
+
608
+ // 2. Restore from JSON
609
+ const restored = ConversationHistory.from(json);
610
+
611
+ // You can also provide a new environment (e.g. with fresh token counters)
612
+ const restoredWithEnv = ConversationHistory.from(json, {
613
+ estimateTokens: myNewEstimator,
614
+ });
475
615
  ```
476
616
 
477
- ### Context Window
617
+ ## Integration
478
618
 
479
- ```ts
480
- getRecentMessages(conversation, count, { includeHidden?, includeSystem? })
481
- truncateFromPosition(conversation, position, { preserveSystemMessages? }, environment?)
482
- truncateToTokenLimit(
483
- conversation,
484
- maxTokens,
485
- estimateTokens,
486
- { preserveSystemMessages?, preserveLastN? },
487
- environment?,
488
- )
489
- estimateConversationTokens(conversation, estimateTokens)
490
- simpleTokenEstimator(message)
619
+ ### Using with React
620
+
621
+ Because **Conversationalist** is immutable, it works perfectly with React's `useState` or `useReducer`. Every update returns a new reference, which automatically triggers a re-render.
622
+
623
+ ```tsx
624
+ import { useState } from 'react';
625
+ import { createConversation, appendUserMessage } from 'conversationalist';
626
+
627
+ export function ChatApp() {
628
+ const [conversation, setConversation] = useState(() => createConversation());
629
+
630
+ const handleSend = (text: string) => {
631
+ // The new conversation object is set into state
632
+ setConversation((prev) => appendUserMessage(prev, text));
633
+ };
634
+
635
+ return (
636
+ <div>
637
+ {conversation.messages.map((m) => (
638
+ <div key={m.id}>{String(m.content)}</div>
639
+ ))}
640
+ <button onClick={() => handleSend('Hello!')}>Send</button>
641
+ </div>
642
+ );
643
+ }
491
644
  ```
492
645
 
493
- ### Utilities
646
+ #### Custom React Hook Example
494
647
 
495
- ```ts
496
- pairToolCallsWithResults(messages);
497
- normalizeContent(content);
498
- toMultiModalArray(content);
499
- createMessage(messageJSON);
500
- copyContent(content);
501
- copyMultiModalContent(item);
648
+ For more complex applications, you can wrap the logic into a custom hook. This example uses `addEventListener` to sync the history with local React state and returns the unsubscribe function for easy cleanup in `useEffect`.
649
+
650
+ ```tsx
651
+ import { useState, useCallback, useEffect } from 'react';
652
+ import {
653
+ createConversation,
654
+ ConversationHistory,
655
+ toChatMessages,
656
+ } from 'conversationalist';
657
+
658
+ export function useChat(initialTitle?: string) {
659
+ // 1. Initialize history (this could also come from context or props)
660
+ const [history] = useState(
661
+ () => new ConversationHistory(createConversation({ title: initialTitle })),
662
+ );
663
+
664
+ // 2. Sync history with local state for reactivity
665
+ const [conversation, setConversation] = useState(history.current);
666
+ const [loading, setLoading] = useState(false);
667
+
668
+ useEffect(() => {
669
+ // addEventListener returns a cleanup function!
670
+ return history.addEventListener('change', (e) => {
671
+ setConversation(e.detail.conversation);
672
+ });
673
+ }, [history]);
674
+
675
+ const sendMessage = useCallback(
676
+ async (text: string) => {
677
+ history.appendUserMessage(text);
678
+ setLoading(true);
679
+
680
+ try {
681
+ const response = await fetch('/api/chat', {
682
+ method: 'POST',
683
+ body: JSON.stringify({
684
+ messages: history.toChatMessages(),
685
+ }),
686
+ });
687
+ const data = await response.json();
688
+ history.appendAssistantMessage(data.answer);
689
+ } finally {
690
+ setLoading(false);
691
+ }
692
+ },
693
+ [history],
694
+ );
695
+
696
+ return {
697
+ conversation,
698
+ messages: conversation.messages,
699
+ loading,
700
+ sendMessage,
701
+ undo: () => history.undo(),
702
+ redo: () => history.redo(),
703
+ };
704
+ }
502
705
  ```
503
706
 
504
- ### Schemas
707
+ > **Note**: `ConversationHistory.addEventListener()` returns an unsubscribe function, which is ideal for cleaning up effects in React (`useEffect`) or Svelte.
708
+
709
+ ### Using with Redux
710
+
711
+ Redux requires immutable state updates, making **Conversationalist** an ideal companion. You can store the conversation object directly in your store.
505
712
 
506
713
  ```ts
507
- conversationSchema;
508
- conversationShape;
509
- messageInputSchema;
510
- messageJSONSchema;
511
- messageRoleSchema;
512
- multiModalContentSchema;
513
- tokenUsageSchema;
514
- toolCallSchema;
515
- toolResultSchema;
714
+ import { createSlice, PayloadAction } from '@reduxjs/toolkit';
715
+ import { createConversation, appendUserMessage, Conversation } from 'conversationalist';
716
+
717
+ interface ChatState {
718
+ conversation: Conversation;
719
+ }
720
+
721
+ const chatSlice = createSlice({
722
+ name: 'chat',
723
+ initialState: {
724
+ conversation: createConversation(),
725
+ } as ChatState,
726
+ reducers: {
727
+ userMessageReceived: (state, action: PayloadAction<string>) => {
728
+ // Redux Toolkit's createSlice uses Immer, but since appendUserMessage
729
+ // returns a new object, we can just replace the property.
730
+ state.conversation = appendUserMessage(state.conversation, action.payload);
731
+ },
732
+ },
733
+ });
516
734
  ```
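If you are not using Redux Toolkit, the same replace-the-reference pattern works in a hand-written reducer. A self-contained sketch with simplified stand-in types (the `appendUserMessage` below is a stand-in, not the library's actual implementation):

```typescript
// Simplified stand-in types -- NOT the library's actual shapes.
interface Conversation {
  messages: readonly string[];
}

interface ChatState {
  conversation: Conversation;
}

type ChatAction = { type: 'chat/userMessageReceived'; payload: string };

// Stand-in for the library's appendUserMessage: returns a new object.
const appendUserMessage = (c: Conversation, text: string): Conversation => ({
  messages: [...c.messages, text],
});

const initialState: ChatState = { conversation: { messages: [] } };

function chatReducer(state: ChatState = initialState, action: ChatAction): ChatState {
  switch (action.type) {
    case 'chat/userMessageReceived':
      // Replace the reference; the previous state is never mutated.
      return { ...state, conversation: appendUserMessage(state.conversation, action.payload) };
    default:
      return state;
  }
}

const next = chatReducer(initialState, { type: 'chat/userMessageReceived', payload: 'Hi' });
console.log(next.conversation.messages.length); // 1
console.log(initialState.conversation.messages.length); // 0 -- untouched
```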
517
735
 
518
- ### Errors
736
+ ### Using with Svelte (Runes)
519
737
 
520
- ```ts
521
- ConversationalistError;
522
- createDuplicateIdError;
523
- createInvalidInputError;
524
- createInvalidPositionError;
525
- createInvalidToolReferenceError;
526
- createLockedError;
527
- createNotFoundError;
528
- createSerializationError;
529
- createValidationError;
738
+ In Svelte 5, you can manage conversation state using the `$state` rune. Since **Conversationalist** is immutable, you update the state by re-assigning the variable with a new conversation object.
739
+
740
+ ```svelte
741
+ <script lang="ts">
742
+ import { createConversation, appendUserMessage } from 'conversationalist';
743
+
744
+ let conversation = $state(createConversation());
745
+
746
+ function handleSend(text: string) {
747
+ conversation = appendUserMessage(conversation, text);
748
+ }
749
+ </script>
750
+
751
+ <div>
752
+ {#each conversation.messages as m (m.id)}
753
+ <div>{String(m.content)}</div>
754
+ {/each}
755
+ <button onclick={() => handleSend('Hello!')}>Send</button>
756
+ </div>
530
757
  ```
531
758
 
532
- ## Deterministic Environments (Testing)
759
+ #### Using `ConversationHistory` as a Store
533
760
 
534
- Pass a custom environment to control timestamps and IDs.
761
+ Svelte 5's runes work well with **Conversationalist**. You can use the `ConversationHistory` class directly as a store, or wrap it in a class with runes.
535
762
 
536
- ```ts
537
- import { appendUserMessage, createConversation } from 'conversationalist';
763
+ ```svelte
764
+ <script lang="ts">
765
+ import { ConversationHistory, createConversation } from 'conversationalist';
538
766
 
539
- const env = {
540
- now: () => '2024-01-01T00:00:00.000Z',
541
- randomId: () => 'fixed-id',
542
- };
767
+ // history implements the Svelte store contract
768
+ const history = new ConversationHistory(createConversation());
769
+ </script>
543
770
 
544
- let conversation = createConversation({ title: 'Test' }, env);
545
- conversation = appendUserMessage(conversation, 'Hello', undefined, env);
771
+ <div>
772
+ {#each $history.messages as m (m.id)}
773
+ <div>{String(m.content)}</div>
774
+ {/each}
775
+ <button onclick={() => history.appendUserMessage('Hello!')}>
776
+ Send
777
+ </button>
778
+ </div>
546
779
  ```
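The `$history` auto-subscription above relies on Svelte's store contract: an object whose `subscribe` method calls the subscriber immediately with the current value and returns an unsubscribe function. A minimal, self-contained model of that contract (not the library's implementation):

```typescript
type Subscriber<T> = (value: T) => void;
type Unsubscriber = () => void;

// Minimal writable store satisfying Svelte's store contract:
// subscribe() invokes the callback immediately with the current value
// and returns a function that removes the subscription.
class TinyStore<T> {
  private subscribers = new Set<Subscriber<T>>();
  constructor(private value: T) {}

  subscribe(run: Subscriber<T>): Unsubscriber {
    run(this.value);
    this.subscribers.add(run);
    return () => {
      this.subscribers.delete(run);
    };
  }

  set(value: T) {
    this.value = value;
    for (const run of this.subscribers) run(value);
  }
}

const store = new TinyStore(0);
const seen: number[] = [];
const stop = store.subscribe((v) => seen.push(v));
store.set(1);
stop();
store.set(2); // not observed: we unsubscribed
console.log(seen); // [ 0, 1 ]
```

Any object with this `subscribe` shape can be prefixed with `$` in a Svelte component, which is why the example above reads `$history.messages`.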
547
780
 
548
- ## Types
781
782
+
783
+ ## API Overview
784
+
785
+ | Category | Key Functions |
786
+ | :--------------- | :------------------------------------------------------------------------------------------------------- |
787
+ | **Creation** | `createConversation`, `serializeConversation`, `deserializeConversation` |
788
+ | **Appending** | `appendUserMessage`, `appendAssistantMessage`, `appendSystemMessage`, `appendMessages` |
789
+ | **Streaming** | `appendStreamingMessage`, `updateStreamingMessage`, `finalizeStreamingMessage`, `cancelStreamingMessage` |
790
+ | **Modification** | `redactMessageAtPosition`, `replaceSystemMessage`, `collapseSystemMessages` |
791
+ | **Context** | `truncateToTokenLimit`, `getRecentMessages`, `estimateConversationTokens` |
792
+ | **Querying** | `getConversationMessages`, `getMessageByIdentifier`, `computeConversationStatistics` |
793
+ | **Conversion** | `toMarkdown`, `fromMarkdown`, `toChatMessages`, `pairToolCallsWithResults` |
794
+ | **History** | `ConversationHistory`, `bindToConversationHistory` |
795
+
796
+ ## Deterministic Environments (Testing)
797
+
798
+ Pass a custom environment to control timestamps and IDs, making your tests fully deterministic.
549
799
 
550
800
  ```ts
551
- import type {
552
- Conversation,
553
- ConversationEnvironment,
554
- ConversationJSON,
555
- ConversationStatus,
556
- ExternalMessage,
557
- ImageContent,
558
- Message,
559
- MessageInput,
560
- MessageJSON,
561
- MessageRole,
562
- MultiModalContent,
563
- TextContent,
564
- TokenUsage,
565
- ToolCall,
566
- ToolResult,
567
- } from 'conversationalist';
801
+ import { createConversation } from 'conversationalist';
+
+ const testEnv = {
802
+ now: () => '2024-01-01T00:00:00.000Z',
803
+ randomId: () => 'fixed-id',
804
+ };
805
+
806
+ let conversation = createConversation({ title: 'Test' }, testEnv);
568
807
  ```
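With a pinned clock and ID generator, the same input always produces byte-identical output, so snapshot-style assertions stay stable. A self-contained sketch of the injection pattern (the `Env` shape mirrors the snippet above; the `makeMessage` factory is a stand-in, not the library's API):

```typescript
// Environment shape mirroring the snippet above.
interface Env {
  now(): string;
  randomId(): string;
}

// Stand-in factory that takes an injectable environment, so tests can
// pin timestamps and IDs instead of relying on Date.now() / crypto.
function makeMessage(text: string, env: Env) {
  return { id: env.randomId(), createdAt: env.now(), text };
}

const testEnv: Env = {
  now: () => '2024-01-01T00:00:00.000Z',
  randomId: () => 'fixed-id',
};

const a = makeMessage('Hello', testEnv);
const b = makeMessage('Hello', testEnv);

console.log(JSON.stringify(a) === JSON.stringify(b)); // true -- fully deterministic
```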
569
808
 
570
809
  ## Development
@@ -572,11 +811,6 @@ import type {
572
811
  ```bash
573
812
  bun install
574
813
  bun test
575
- bun run typecheck
576
- bun run lint
577
814
  bun run build
578
815
  ```
579
-
580
- ## License
581
-
582
- MIT
816
+ ````