conversationalist 0.0.1 → 0.0.3

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,47 +1,121 @@
1
1
  # Conversationalist
2
2
 
3
- A TypeScript library for managing LLM conversation state with immutable data structures and type-safe APIs.
3
+ A TypeScript-first library for managing LLM conversation state with **immutable updates**, **type-safe APIs**, and **provider-agnostic adapters**.
4
+
5
+ [![Tests](https://github.com/stevekinney/conversationalist/actions/workflows/test.yml/badge.svg)](https://github.com/stevekinney/conversationalist/actions/workflows/test.yml)
6
+ [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
7
+
8
+ ## What is Conversationalist?
9
+
10
+ **Conversationalist** is a state engine for LLM-driven applications. While most libraries focus on making the API calls themselves, Conversationalist focuses on the **state** that lives between those calls. It provides a unified, model-agnostic representation of a conversation that can be easily stored, serialized, and adapted for any major LLM provider (OpenAI, Anthropic, Gemini).
11
+
12
+ In a modern AI application, a conversation is more than just a list of strings. It involves:
13
+
14
+ - **Tool Use**: Pairing function calls with their results and ensuring they stay in sync.
15
+ - **Hidden Logic**: Internal "thought" messages or snapshots that should be saved but never sent to the provider.
16
+ - **Metadata**: Tracking tags, custom IDs, and tokens across different steps.
17
+ - **Streaming**: Gracefully handling partial messages in a UI without messy state transitions.
18
+
19
+ Conversationalist handles these complexities through a robust, type-safe API that treats your conversation as the "Single Source of Truth."
20
+
21
+ ## Why Use It?
22
+
23
+ Managing LLM conversations manually often leads to "provider lock-in" or fragile glue code. Conversationalist solves this by:
24
+
25
+ - **Decoupling Logic from Providers**: Write your business logic once using Conversationalist's message model, and use adapters to talk to OpenAI, Anthropic, or Gemini.
26
+ - **Built-in Context Management**: Automatically handle context window limits by truncating history while preserving critical system instructions or recent messages.
27
+ - **Type Safety Out-of-the-Box**: Built with Zod and TypeScript, ensuring that your conversation data is valid at runtime and compile-time.
28
+ - **Unified Serialization**: One standard format (`ConversationJSON`) for your database, your frontend, and your backend.
29
+
30
+ ## The Immutable Advantage
31
+
32
+ At its core, Conversationalist is **strictly immutable**. Every change to a conversation—whether appending a message, updating a stream, or redacting sensitive data—returns a _new_ conversation object.
33
+
34
+ This approach offers several critical advantages for modern application development:
35
+
36
+ 1. **React/Redux Friendly**: Because updates return new references, they trigger re-renders naturally and work seamlessly with state management libraries.
37
+ 2. **Concurrency Safe**: You can safely pass a conversation to multiple functions or async tasks without worrying about one part of your app mutating it out from under another.
38
+ 3. **Easy Branching & Replay**: Want to let a user "undo" an AI response or branch a conversation into two different paths? Simply keep a reference to the previous immutable state; no complex cloning is required (see the sketch after this list).
39
+ 4. **Auditability**: Timestamps and message positions are automatically managed and preserved, making it easy to reconstruct the exact state of a chat at any point in time.
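+
+ For example, branching is just keeping an extra reference around. A minimal sketch using the append helpers shown later in this README:
+
+ ```ts
+ import { appendUserMessage, createConversation } from 'conversationalist';
+
+ const base = appendUserMessage(createConversation(), 'Hello!');
+
+ // Two independent branches from the same starting point; `base` itself is untouched.
+ const branchA = appendUserMessage(base, 'Tell me a joke.');
+ const branchB = appendUserMessage(base, 'Summarize our conversation so far.');
+ ```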
40
+
41
+ ## Real-World Use Cases
42
+
43
+ - **Multi-Model Chatbots**: Build a UI where users can switch between GPT-4o and Claude 3.5 Sonnet mid-conversation without losing history.
44
+ - **Chain-of-Thought Workflows**: Use `hidden` messages to store internal reasoning or intermediate steps that the AI uses to reach a final answer, without cluttering the user's view.
45
+ - **Agentic Workflows**: Track complex tool-use loops where multiple functions are called in sequence, ensuring every result is correctly paired with its corresponding call ID.
46
+ - **Token Budgeting**: Automatically trim old messages when a conversation gets too long, ensuring your API costs stay predictable and you never hit provider limits.
47
+ - **Deterministic Testing**: Use the custom `environment` parameter to mock IDs and timestamps, allowing you to write 100% deterministic tests for your chat logic.
48
+
49
+ ---
4
50
 
5
51
  ## Installation
6
52
 
7
53
  ```bash
8
- bun add conversationalist
9
- npm add conversationalist
54
+ bun add conversationalist zod
55
+ npm add conversationalist zod
56
+ pnpm add conversationalist zod
10
57
  ```
11
58
 
12
- ## Core Concepts
13
-
14
- ### Conversation
59
+ This package is ESM-only. Use `import` syntax. Zod is a peer dependency and must be installed by your application.
15
60
 
16
- A conversation is an immutable data structure containing messages, metadata, and timestamps:
61
+ ## Quick Start
17
62
 
18
- ```typescript
63
+ ```ts
19
64
  import {
20
- createConversation,
21
- appendUserMessage,
22
65
  appendAssistantMessage,
66
+ appendUserMessage,
67
+ createConversation,
68
+ serializeConversation,
23
69
  } from 'conversationalist';
70
+ import { toOpenAIMessages } from 'conversationalist/openai';
71
+
72
+ // 1. Create a conversation
73
+ let conversation = createConversation({
74
+ title: 'Order Support',
75
+ metadata: { userId: 'user_123' },
76
+ });
24
77
 
25
- let conversation = createConversation({ title: 'My Chat' });
78
+ // 2. Add messages (returns a new conversation object)
79
+ conversation = appendUserMessage(conversation, 'Where is my order?');
80
+ conversation = appendAssistantMessage(conversation, 'Let me check that for you.');
26
81
 
27
- conversation = appendUserMessage(conversation, 'Hello!');
28
- conversation = appendAssistantMessage(conversation, 'Hi there!');
82
+ // 3. Adapt for a provider
83
+ const openAIMessages = toOpenAIMessages(conversation);
84
+ // [{ role: 'user', content: 'Where is my order?' }, ...]
85
+
86
+ // 4. Save to your database
87
+ const data = serializeConversation(conversation);
88
+ // db.save(data.id, JSON.stringify(data));
29
89
  ```
30
90
 
31
- ### Messages
91
+ ## Core Concepts
32
92
 
33
- Messages have roles (`user`, `assistant`, `system`, `tool-use`, `tool-result`, `developer`, `snapshot`) and can contain text or multi-modal content:
93
+ ### Conversations
34
94
 
35
- ```typescript
36
- import { appendMessages } from 'conversationalist';
95
+ A conversation is an immutable record with metadata, tags, timestamps, and ordered messages.
37
96
 
38
- // Text message
39
- conversation = appendMessages(conversation, {
40
- role: 'user',
41
- content: 'What is in this image?',
97
+ ```ts
98
+ import { createConversation } from 'conversationalist';
99
+
100
+ const conversation = createConversation({
101
+ title: 'My Chat',
102
+ status: 'active',
103
+ metadata: { customerId: 'cus_123' },
104
+ tags: ['support', 'vip'],
42
105
  });
106
+ ```
107
+
108
+ ### Messages
109
+
110
+ Messages have roles and can contain text or multi-modal content. Optional fields include
111
+ `metadata`, `hidden`, `tokenUsage`, `toolCall`, `toolResult`, and `goalCompleted`.
112
+
113
+ **Roles**: `user`, `assistant`, `system`, `developer`, `tool-use`, `tool-result`, `snapshot`.
114
+ The `snapshot` role is for internal state and is skipped by adapters.
115
+
116
+ ```ts
117
+ import { appendMessages } from 'conversationalist';
43
118
 
44
- // Multi-modal message with image
45
119
  conversation = appendMessages(conversation, {
46
120
  role: 'user',
47
121
  content: [
@@ -51,307 +125,582 @@ conversation = appendMessages(conversation, {
51
125
  });
52
126
  ```
53
127
 
128
+ **Hidden messages** remain in history but are skipped by default when querying or adapting to
129
+ providers. This is perfect for internal logging or "thinking" steps.
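+
+ A small sketch of the `hidden` flag (hidden messages can still be read back by passing `includeHidden: true` to `getConversationMessages`):
+
+ ```ts
+ import { appendMessages, getConversationMessages } from 'conversationalist';
+
+ // Store an internal note that adapters will never send to the provider.
+ conversation = appendMessages(conversation, {
+ role: 'assistant',
+ content: 'Internal note: the user already supplied their order number.',
+ hidden: true,
+ });
+
+ const visible = getConversationMessages(conversation); // hidden messages excluded
+ const everything = getConversationMessages(conversation, { includeHidden: true });
+ ```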
130
+
54
131
  ### Tool Calls
55
132
 
56
- Tool use is supported with linked tool calls and results:
133
+ Tool calls are represented as paired `tool-use` and `tool-result` messages. Tool results
134
+ are validated to ensure the referenced call exists.
57
135
 
58
- ```typescript
136
+ ```ts
59
137
  conversation = appendMessages(
60
138
  conversation,
61
139
  {
62
140
  role: 'tool-use',
63
141
  content: '',
64
- toolCall: { id: 'call_123', name: 'get_weather', arguments: '{"city":"NYC"}' },
142
+ toolCall: { id: 'call_123', name: 'getWeather', arguments: { city: 'NYC' } },
65
143
  },
66
144
  {
67
145
  role: 'tool-result',
68
146
  content: '',
69
- toolResult: { callId: 'call_123', outcome: 'success', content: '72°F, sunny' },
147
+ toolResult: {
148
+ callId: 'call_123',
149
+ outcome: 'success',
150
+ content: { tempF: 72, condition: 'sunny' },
151
+ },
70
152
  },
71
153
  );
72
154
  ```
73
155
 
74
- ## API Reference
156
+ ### Streaming
75
157
 
76
- ### Creating Conversations
158
+ Streaming helpers let you append a placeholder, update it as chunks arrive, and finalize
159
+ when done.
77
160
 
78
- ```typescript
79
- createConversation(options?: {
80
- id?: string;
81
- title?: string;
82
- status?: 'active' | 'archived' | 'deleted';
83
- metadata?: Record<string, unknown>;
84
- tags?: string[];
85
- }): Conversation
86
- ```
161
+ ```ts
162
+ import {
163
+ appendStreamingMessage,
164
+ finalizeStreamingMessage,
165
+ updateStreamingMessage,
166
+ } from 'conversationalist';
167
+
168
+ const { conversation: started, messageId } = appendStreamingMessage(conversation, 'assistant');
+ conversation = started; // `conversation` is assumed to exist already (see the Quick Start above)
169
+ let content = '';
87
170
 
88
- ### Appending Messages
171
+ for await (const chunk of stream) {
172
+ content += chunk;
173
+ conversation = updateStreamingMessage(conversation, messageId, content);
174
+ }
89
175
 
90
- ```typescript
91
- appendMessages(conversation: Conversation, ...inputs: MessageInput[]): Conversation
92
- appendUserMessage(conversation: Conversation, content: string | MultiModalContent[]): Conversation
93
- appendAssistantMessage(conversation: Conversation, content: string | MultiModalContent[]): Conversation
94
- appendSystemMessage(conversation: Conversation, content: string): Conversation
95
- prependSystemMessage(conversation: Conversation, content: string): Conversation
96
- replaceSystemMessage(conversation: Conversation, content: string): Conversation
176
+ conversation = finalizeStreamingMessage(conversation, messageId, {
177
+ tokenUsage: { prompt: 100, completion: 50, total: 150 },
178
+ });
97
179
  ```
98
180
 
99
- ### Querying Messages
181
+ ### Context Window Management
100
182
 
101
- ```typescript
102
- getConversationMessages(conversation: Conversation, options?: { includeHidden?: boolean }): Message[]
103
- getMessageAtPosition(conversation: Conversation, position: number): Message | undefined
104
- getMessageByIdentifier(conversation: Conversation, id: string): Message | undefined
105
- searchConversationMessages(conversation: Conversation, predicate: (m: Message) => boolean): Message[]
106
- ```
183
+ Automatically trim history to fit token budgets or to keep only recent messages.
107
184
 
108
- ### System Messages
185
+ ```ts
186
+ import { simpleTokenEstimator, truncateToTokenLimit } from 'conversationalist';
109
187
 
110
- ```typescript
111
- hasSystemMessage(conversation: Conversation): boolean
112
- getFirstSystemMessage(conversation: Conversation): Message | undefined
113
- getSystemMessages(conversation: Conversation): Message[]
114
- collapseSystemMessages(conversation: Conversation): Conversation
188
+ conversation = truncateToTokenLimit(conversation, 4000, {
189
+ preserveSystemMessages: true,
190
+ preserveLastN: 2,
191
+ });
115
192
  ```
116
193
 
117
- ### Utilities
194
+ #### Custom Token Counters
118
195
 
119
- ```typescript
120
- computeConversationStatistics(conversation: Conversation): {
121
- total: number;
122
- byRole: Record<string, number>;
123
- hidden: number;
124
- withImages: number;
125
- }
196
+ You can provide a custom token estimator (e.g. using `tiktoken` or `anthropic-tokenizer`) by passing it in the options or by binding it to your environment.
126
197
 
127
- redactMessageAtPosition(conversation: Conversation, position: number, placeholder?: string): Conversation
128
- ```
198
+ ```ts
199
+ import { ConversationHistory, truncateToTokenLimit } from 'conversationalist';
200
+ // import { get_encoding } from 'tiktoken';
129
201
 
130
- ### Serialization
202
+ const tiktokenEstimator = (message) => {
203
+ // Your logic here...
204
+ return 100;
205
+ };
206
+
207
+ // 1. Pass directly in options
208
+ conversation = truncateToTokenLimit(conversation, 4000, {
209
+ estimateTokens: tiktokenEstimator,
210
+ });
131
211
 
132
- ```typescript
133
- serializeConversation(conversation: Conversation): ConversationJSON
134
- deserializeConversation(json: ConversationJSON): Conversation
135
- toChatMessages(conversation: Conversation): ExternalMessage[]
212
+ // 2. Or bind to a history instance/environment
213
+ const history = new ConversationHistory(conversation, {
214
+ estimateTokens: tiktokenEstimator,
215
+ });
216
+
217
+ const boundTruncate = history.bind(truncateToTokenLimit);
218
+ boundTruncate(4000); // Uses tiktokenEstimator automatically
136
219
  ```
137
220
 
138
- ### Builder Pattern
221
+ ## Plugins
139
222
 
140
- For fluent API style:
223
+ **Conversationalist** supports a plugin system that allows you to transform messages as they are appended to a conversation. Plugins are functions that take a `MessageInput` and return a modified `MessageInput`.
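+
+ As a sketch, a custom plugin is just such a function, wired into an `environment` the same way the built-in PII plugin is below (the plugin here is made up for illustration):
+
+ ```ts
+ import { appendUserMessage, createConversation } from 'conversationalist';
+
+ // Hypothetical plugin: stamp every appended message with a metadata source.
+ const tagSourcePlugin = (input) => ({
+ ...input,
+ metadata: { ...input.metadata, source: 'web-app' },
+ });
+
+ const env = { plugins: [tagSourcePlugin] };
+
+ let conversation = createConversation({}, env);
+ conversation = appendUserMessage(conversation, 'Hello!', undefined, env);
+ ```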
141
224
 
142
- ```typescript
143
- import { createConversation } from 'conversationalist';
144
- import { withConversation, pipeConversation } from 'conversationalist';
145
- import { simpleTokenEstimator } from 'conversationalist';
225
+ ### PII Redaction Plugin
146
226
 
147
- // Draft pattern with callback
148
- const conversation = withConversation(createConversation(), (draft) => {
149
- draft
150
- .appendSystemMessage('You are a helpful assistant.')
151
- .appendUserMessage('Hello!')
152
- .appendAssistantMessage('Hi there!');
153
- });
227
+ The library includes a built-in `piiRedactionPlugin` that can automatically redact emails, phone numbers, and common API key patterns.
154
228
 
155
- // Streaming with the draft pattern
156
- const streamedConversation = withConversation(createConversation(), (draft) => {
157
- const { draft: d, messageId } = draft.appendStreamingMessage('assistant');
158
- d.updateStreamingMessage(messageId, 'Partial response...').finalizeStreamingMessage(
159
- messageId,
160
- { tokenUsage: { prompt: 10, completion: 5, total: 15 } },
161
- );
162
- });
229
+ ```ts
230
+ import {
231
+ appendUserMessage,
232
+ createConversation,
233
+ piiRedactionPlugin,
234
+ } from 'conversationalist';
163
235
 
164
- // Context window management with the draft pattern
165
- const truncatedConversation = withConversation(conversation, (draft) => {
166
- draft.truncateToTokenLimit(4000, simpleTokenEstimator, { preserveLastN: 2 });
167
- });
236
+ // 1. Enable by adding to your environment
237
+ const env = {
238
+ plugins: [piiRedactionPlugin],
239
+ };
168
240
 
169
- // Pipe pattern
170
- const conversation = pipeConversation(
171
- createConversation(),
172
- (c) => appendSystemMessage(c, 'You are helpful.'),
173
- (c) => appendUserMessage(c, 'Hello!'),
241
+ // 2. Use the environment when appending messages
242
+ let conversation = createConversation({}, env);
243
+ conversation = appendUserMessage(
244
+ conversation,
245
+ 'Contact me at test@example.com',
246
+ undefined,
247
+ env,
174
248
  );
249
+
250
+ console.log(conversation.messages[0].content);
251
+ // "Contact me at [EMAIL_REDACTED]"
175
252
  ```
176
253
 
177
- The `ConversationDraft` provides all conversation manipulation methods:
254
+ When using `ConversationHistory`, you only need to provide the plugin once during initialization:
178
255
 
179
- - **Message appending**: `appendMessages`, `appendUserMessage`, `appendAssistantMessage`, `appendSystemMessage`
180
- - **System messages**: `prependSystemMessage`, `replaceSystemMessage`, `collapseSystemMessages`
181
- - **Modification**: `redactMessageAtPosition`
182
- - **Streaming**: `appendStreamingMessage`, `updateStreamingMessage`, `finalizeStreamingMessage`, `cancelStreamingMessage`
183
- - **Context window**: `truncateFromPosition`, `truncateToTokenLimit`
256
+ ```ts
257
+ const history = new ConversationHistory(createConversation(), {
258
+ plugins: [piiRedactionPlugin],
259
+ });
260
+
261
+ const appendUser = history.bind(appendUserMessage);
262
+ appendUser('My key is sk-12345...'); // Automatically redacted
263
+ ```
184
264
 
185
- ### Tool Call Pairing
265
+ ## Provider Adapters
186
266
 
187
- ```typescript
188
- import { pairToolCallsWithResults } from 'conversationalist';
267
+ Convert the same conversation into provider-specific formats. Adapters automatically skip hidden/snapshot messages and map roles correctly.
189
268
 
190
- const pairs = pairToolCallsWithResults(conversation.messages);
191
- // Returns: [{ call: ToolCall, result?: ToolResult }, ...]
269
+ ```ts
270
+ import { toOpenAIMessages } from 'conversationalist/openai';
271
+ import { toAnthropicMessages } from 'conversationalist/anthropic';
272
+ import { toGeminiMessages } from 'conversationalist/gemini';
192
273
  ```
193
274
 
194
- ## Provider Adapters
275
+ - **OpenAI**: Supports `toOpenAIMessages` and `toOpenAIMessagesGrouped` (which groups consecutive tool calls; see the sketch after this list).
276
+ - **Anthropic**: Maps system messages and tool blocks to Anthropic's specific format.
277
+ - **Gemini**: Handles Gemini's unique content/part structure.
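+
+ The grouped variant is a drop-in replacement for the flat one; a short sketch (the exact output shape is provider-specific):
+
+ ```ts
+ import { toOpenAIMessages, toOpenAIMessagesGrouped } from 'conversationalist/openai';
+
+ const flat = toOpenAIMessages(conversation);
+ const grouped = toOpenAIMessagesGrouped(conversation); // consecutive tool calls are grouped
+ ```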
195
278
 
196
- Convert conversations to provider-specific formats using subpath exports:
279
+ ### Provider-Specific Examples
197
280
 
198
- ### OpenAI
281
+ #### OpenAI (with Tool Calls)
199
282
 
200
- ```typescript
283
+ ```ts
284
+ import { appendMessages } from 'conversationalist';
201
285
  import { toOpenAIMessages } from 'conversationalist/openai';
202
286
 
203
- const messages = toOpenAIMessages(conversation);
204
287
  const response = await openai.chat.completions.create({
205
- model: 'gpt-4',
206
- messages,
288
+ model: 'gpt-4o',
289
+ messages: toOpenAIMessages(conversation),
290
+ tools: [{ type: 'function', function: { name: 'getWeather', ... } }],
207
291
  });
292
+
293
+ const toolCalls = response.choices[0]?.message?.tool_calls ?? [];
294
+ for (const call of toolCalls) {
295
+ conversation = appendMessages(conversation, {
296
+ role: 'tool-use',
297
+ content: '',
298
+ toolCall: { id: call.id, name: call.function.name, arguments: call.function.arguments },
299
+ });
300
+
301
+ const result = await getWeather(JSON.parse(call.function.arguments));
302
+ conversation = appendMessages(conversation, {
303
+ role: 'tool-result',
304
+ content: '',
305
+ toolResult: { callId: call.id, outcome: 'success', content: result },
306
+ });
307
+ }
208
308
  ```
209
309
 
210
- ### Anthropic
310
+ #### Anthropic (with Tool Calls)
211
311
 
212
- ```typescript
312
+ ```ts
313
+ import { appendMessages } from 'conversationalist';
213
314
  import { toAnthropicMessages } from 'conversationalist/anthropic';
214
315
 
215
316
  const { system, messages } = toAnthropicMessages(conversation);
216
317
  const response = await anthropic.messages.create({
217
- model: 'claude-3-opus-20240229',
318
+ model: 'claude-3-5-sonnet-20240620',
218
319
  system,
219
320
  messages,
321
+ tools: [{ name: 'getWeather', ... }],
220
322
  });
323
+
324
+ for (const block of response.content) {
325
+ if (block.type !== 'tool_use') continue;
326
+ conversation = appendMessages(conversation, {
327
+ role: 'tool-use',
328
+ content: '',
329
+ toolCall: { id: block.id, name: block.name, arguments: block.input },
330
+ });
331
+
332
+ const result = await getWeather(block.input);
333
+ conversation = appendMessages(conversation, {
334
+ role: 'tool-result',
335
+ content: '',
336
+ toolResult: { callId: block.id, outcome: 'success', content: result },
337
+ });
338
+ }
221
339
  ```
222
340
 
223
- ### Google Gemini
341
+ #### Gemini (with Tool Calls)
224
342
 
225
- ```typescript
343
+ ```ts
344
+ import { appendMessages } from 'conversationalist';
226
345
  import { toGeminiMessages } from 'conversationalist/gemini';
227
346
 
228
347
  const { systemInstruction, contents } = toGeminiMessages(conversation);
229
348
  const response = await model.generateContent({
230
349
  systemInstruction,
231
350
  contents,
351
+ tools: [{ functionDeclarations: [{ name: 'getWeather', ... }] }],
232
352
  });
233
- ```
234
353
 
235
- ## Streaming Support
354
+ const parts = response.response.candidates?.[0]?.content?.parts ?? [];
355
+ for (const part of parts) {
356
+ if (!('functionCall' in part)) continue;
357
+ const callId = crypto.randomUUID(); // Gemini doesn't provide IDs, so we generate one
358
+ const args = part.functionCall.args;
236
359
 
237
- Handle streaming responses with pending message utilities:
360
+ conversation = appendMessages(conversation, {
361
+ role: 'tool-use',
362
+ content: '',
363
+ toolCall: { id: callId, name: part.functionCall.name, arguments: args },
364
+ });
238
365
 
239
- ```typescript
240
- import {
241
- appendStreamingMessage,
242
- updateStreamingMessage,
243
- finalizeStreamingMessage,
244
- cancelStreamingMessage,
245
- isStreamingMessage,
246
- getStreamingMessage,
247
- } from 'conversationalist';
366
+ const result = await getWeather(args);
367
+ conversation = appendMessages(conversation, {
368
+ role: 'tool-result',
369
+ content: '',
370
+ toolResult: { callId, outcome: 'success', content: result },
371
+ });
372
+ }
373
+ ```
248
374
 
249
- // Start a streaming message
250
- let { conversation, messageId } = appendStreamingMessage(conversation, 'assistant');
375
+ ## Builder Pattern (Fluent API)
251
376
 
252
- // Update content as chunks arrive
253
- for await (const chunk of stream) {
254
- accumulatedContent += chunk;
255
- conversation = updateStreamingMessage(conversation, messageId, accumulatedContent);
256
- }
377
+ If you prefer a more fluent style, use `withConversation` or `pipeConversation`. These let you apply changes through a mutable-style draft inside a callback while still producing a new, immutable conversation.
257
378
 
258
- // Finalize when complete
259
- conversation = finalizeStreamingMessage(conversation, messageId, {
260
- tokenUsage: { prompt: 100, completion: 50, total: 150 },
379
+ ```ts
380
+ import { withConversation, createConversation } from 'conversationalist';
381
+
382
+ const conversation = withConversation(createConversation(), (draft) => {
383
+ draft
384
+ .appendSystemMessage('You are a helpful assistant.')
385
+ .appendUserMessage('Hello!')
386
+ .appendAssistantMessage('Hi there!');
261
387
  });
388
+ ```
389
+
390
+ `pipeConversation` allows you to chain multiple transformation functions together:
262
391
 
263
- // Or cancel if needed
264
- conversation = cancelStreamingMessage(conversation, messageId);
392
+ ```ts
393
+ import {
394
+ createConversation,
395
+ pipeConversation,
396
+ appendSystemMessage,
397
+ appendUserMessage,
+ appendAssistantMessage,
398
+ } from 'conversationalist';
399
+
400
+ const conversation = pipeConversation(
401
+ createConversation(),
402
+ (c) => appendSystemMessage(c, 'You are a helpful assistant.'),
403
+ (c) => appendUserMessage(c, 'Hello!'),
404
+ (c) => appendAssistantMessage(c, 'Hi there!'),
405
+ );
265
406
  ```
266
407
 
267
- ## Context Window Utilities
408
+ ## Conversation History (Undo/Redo)
268
409
 
269
- Manage token limits and message truncation:
410
+ Use the `ConversationHistory` class to manage a stack of conversation states. Because every change returns a new immutable object, supporting undo/redo is built into the architecture.
270
411
 
271
- ```typescript
412
+ ```ts
272
413
  import {
273
- getRecentMessages,
274
- truncateFromPosition,
275
- truncateToTokenLimit,
276
- estimateConversationTokens,
277
- simpleTokenEstimator,
414
+ ConversationHistory,
415
+ createConversation,
416
+ appendUserMessage,
278
417
  } from 'conversationalist';
279
418
 
280
- // Get last N messages (excluding system by default)
281
- const recent = getRecentMessages(conversation, 10);
419
+ const history = new ConversationHistory(createConversation());
282
420
 
283
- // Truncate to messages from position onwards
284
- const truncated = truncateFromPosition(conversation, 5);
421
+ // You can use convenience methods that automatically track state
422
+ history.appendUserMessage('Hello!');
423
+ history.appendAssistantMessage('How are you?');
285
424
 
286
- // Truncate to fit token limit
287
- const fitted = truncateToTokenLimit(conversation, 4000, simpleTokenEstimator, {
288
- preserveSystemMessages: true,
289
- preserveLastN: 2,
425
+ history.undo(); // State reverts to just "Hello!"
426
+ history.redo(); // State advances back to "How are you?"
427
+
428
+ // Convenience methods for all library utilities are built-in
429
+ history.appendUserMessage('Another message');
430
+ history.redactMessageAtPosition(0);
431
+ history.truncateToTokenLimit(4000);
432
+
433
+ // Query methods work on the current state
434
+ const messages = history.getMessages();
435
+ const stats = history.getStatistics();
436
+ const tokens = history.estimateTokens();
437
+ ```
438
+
439
+ ### Event Subscription
440
+
441
+ `ConversationHistory` implements `EventTarget` and follows the Svelte store contract. You can listen for changes using standard DOM events or the `subscribe` method.
442
+
443
+ #### Using DOM Events
444
+
445
+ ```ts
446
+ const history = new ConversationHistory(createConversation());
447
+
448
+ // addEventListener returns a convenient unsubscribe function
449
+ const unsubscribe = history.addEventListener('change', (event) => {
450
+ const { type, conversation } = event.detail;
451
+ console.log(`History updated via ${type}`);
290
452
  });
291
453
 
292
- // Estimate total tokens
293
- const tokens = estimateConversationTokens(conversation, simpleTokenEstimator);
454
+ history.appendUserMessage('Hello!'); // Fires 'push' and 'change' events
455
+
456
+ unsubscribe(); // Clean up when done
294
457
  ```
295
458
 
296
- ## Types
297
-
298
- ```typescript
299
- import type {
300
- Conversation,
301
- ConversationJSON,
302
- ConversationStatus,
303
- Message,
304
- MessageInput,
305
- MessageJSON,
306
- MessageRole,
307
- TokenUsage,
308
- ToolCall,
309
- ToolResult,
310
- MultiModalContent,
311
- TextContent,
312
- ImageContent,
313
- } from 'conversationalist';
459
+ #### Using the Store Contract
460
+
461
+ ```ts
462
+ // Subscribe returns an unsubscribe function and calls the callback immediately
463
+ const unsubscribe = history.subscribe((conversation) => {
464
+ console.log('Current conversation state:', conversation);
465
+ });
314
466
  ```
315
467
 
316
- ## Validation Schemas
468
+ You can also use an `AbortSignal` for automatic cleanup:
317
469
 
318
- Zod schemas are exported for runtime validation:
470
+ ```ts
471
+ const controller = new AbortController();
472
+ history.addEventListener('change', (e) => { ... }, { signal: controller.signal });
319
473
 
320
- ```typescript
321
- import {
322
- conversationSchema,
323
- messageInputSchema,
324
- messageJSONSchema,
325
- messageRoleSchema,
326
- multiModalContentSchema,
327
- tokenUsageSchema,
328
- toolCallSchema,
329
- toolResultSchema,
330
- } from 'conversationalist';
474
+ // Later...
475
+ controller.abort();
476
+ ```
477
+
478
+ ### Conversation Branching
479
+
480
+ The `ConversationHistory` class supports branching. When you undo to a previous state and push a new update, it creates an alternate path instead of deleting the old history.
481
+
482
+ ```ts
483
+ const history = new ConversationHistory(createConversation());
484
+
485
+ history.push(appendUserMessage(history.current, 'Path A'));
486
+ history.undo();
487
+
488
+ history.push(appendUserMessage(history.current, 'Path B'));
489
+
490
+ console.log(history.branchCount); // 2
491
+ console.log(history.current.messages[0].content); // "Path B"
492
+
493
+ history.switchToBranch(0);
494
+ console.log(history.current.messages[0].content); // "Path A"
495
+ ```
496
+
497
+ ### Serialization
498
+
499
+ You can serialize the entire history tree (including all branches) to JSON and reconstruct it later.
500
+
501
+ ```ts
502
+ // 1. Save to JSON
503
+ const json = history.toJSON();
504
+ // localStorage.setItem('chat_history', JSON.stringify(json));
505
+
506
+ // 2. Restore from JSON
507
+ const restored = ConversationHistory.from(json);
508
+
509
+ // You can also provide a new environment (e.g. with fresh token counters)
510
+ const restoredWithEnv = ConversationHistory.from(json, {
511
+ estimateTokens: myNewEstimator,
512
+ });
513
+ ```
514
+
515
+ ## Integration
516
+
517
+ ### Using with React
518
+
519
+ Because **Conversationalist** is immutable, it works perfectly with React's `useState` or `useReducer`. Every update returns a new reference, which automatically triggers a re-render.
520
+
521
+ ```tsx
522
+ import { useState } from 'react';
523
+ import { createConversation, appendUserMessage } from 'conversationalist';
524
+
525
+ export function ChatApp() {
526
+ const [conversation, setConversation] = useState(() => createConversation());
331
527
 
332
- const result = messageInputSchema.safeParse(data);
528
+ const handleSend = (text: string) => {
529
+ // The new conversation object is set into state
530
+ setConversation((prev) => appendUserMessage(prev, text));
531
+ };
532
+
533
+ return (
534
+ <div>
535
+ {conversation.messages.map((m) => (
536
+ <div key={m.id}>{String(m.content)}</div>
537
+ ))}
538
+ <button onClick={() => handleSend('Hello!')}>Send</button>
539
+ </div>
540
+ );
541
+ }
333
542
  ```
334
543
 
335
- ## Error Handling
544
+ #### Custom React Hook Example
336
545
 
337
- Custom error types with codes:
546
+ For more complex applications, you can wrap the logic into a custom hook. This example uses `addEventListener` to sync the history with local React state and returns the unsubscribe function for easy cleanup in `useEffect`.
338
547
 
339
- ```typescript
548
+ ```tsx
549
+ import { useState, useCallback, useEffect } from 'react';
340
550
  import {
341
- ConversationalistError,
342
- createInvalidInputError,
343
- createInvalidPositionError,
344
- createNotFoundError,
345
- createValidationError,
551
+ createConversation,
552
+ ConversationHistory,
553
+ toChatMessages,
346
554
  } from 'conversationalist';
347
555
 
348
- try {
349
- // ...
350
- } catch (error) {
351
- if (error instanceof ConversationalistError) {
352
- console.log(error.code); // e.g., 'INVALID_POSITION'
353
- }
556
+ export function useChat(initialTitle?: string) {
557
+ // 1. Initialize history (this could also come from context or props)
558
+ const [history] = useState(
559
+ () => new ConversationHistory(createConversation({ title: initialTitle })),
560
+ );
561
+
562
+ // 2. Sync history with local state for reactivity
563
+ const [conversation, setConversation] = useState(history.current);
564
+ const [loading, setLoading] = useState(false);
565
+
566
+ useEffect(() => {
567
+ // addEventListener returns a cleanup function!
568
+ return history.addEventListener('change', (e) => {
569
+ setConversation(e.detail.conversation);
570
+ });
571
+ }, [history]);
572
+
573
+ const sendMessage = useCallback(
574
+ async (text: string) => {
575
+ history.appendUserMessage(text);
576
+ setLoading(true);
577
+
578
+ try {
579
+ const response = await fetch('/api/chat', {
580
+ method: 'POST',
581
+ body: JSON.stringify({
582
+ messages: toChatMessages(history.current),
583
+ }),
584
+ });
585
+ const data = await response.json();
586
+ history.appendAssistantMessage(data.answer);
587
+ } finally {
588
+ setLoading(false);
589
+ }
590
+ },
591
+ [history],
592
+ );
593
+
594
+ return {
595
+ conversation,
596
+ messages: conversation.messages,
597
+ loading,
598
+ sendMessage,
599
+ undo: () => history.undo(),
600
+ redo: () => history.redo(),
601
+ };
602
+ }
603
+ ```
604
+
605
+ > **Note**: `ConversationHistory.addEventListener()` returns an unsubscribe function, which is ideal for cleaning up effects in React (`useEffect`) or Svelte.
606
+
607
+ ### Using with Redux
608
+
609
+ Redux requires immutable state updates, making **Conversationalist** an ideal companion. You can store the conversation object directly in your store.
610
+
611
+ ```ts
612
+ import { createSlice, PayloadAction } from '@reduxjs/toolkit';
613
+ import { createConversation, appendUserMessage, Conversation } from 'conversationalist';
614
+
615
+ interface ChatState {
616
+ conversation: Conversation;
354
617
  }
618
+
619
+ const chatSlice = createSlice({
620
+ name: 'chat',
621
+ initialState: {
622
+ conversation: createConversation(),
623
+ } as ChatState,
624
+ reducers: {
625
+ userMessageReceived: (state, action: PayloadAction<string>) => {
626
+ // Redux Toolkit's createSlice uses Immer, but since appendUserMessage
627
+ // returns a new object, we can just replace the property.
628
+ state.conversation = appendUserMessage(state.conversation, action.payload);
629
+ },
630
+ },
631
+ });
632
+ ```
633
+
634
+ ### Using with Svelte (Runes)
635
+
636
+ In Svelte 5, you can manage conversation state using the `$state` rune. Since **Conversationalist** is immutable, you update the state by re-assigning the variable with a new conversation object.
637
+
638
+ ```svelte
639
+ <script lang="ts">
640
+ import { createConversation, appendUserMessage } from 'conversationalist';
641
+
642
+ let conversation = $state(createConversation());
643
+
644
+ function handleSend(text: string) {
645
+ conversation = appendUserMessage(conversation, text);
646
+ }
647
+ </script>
648
+
649
+ <div>
650
+ {#each conversation.messages as m (m.id)}
651
+ <div>{String(m.content)}</div>
652
+ {/each}
653
+ <button onclick={() => handleSend('Hello!')}>Send</button>
654
+ </div>
655
+ ```
656
+
657
+ #### Custom Svelte Rune Example
658
+
659
+ Svelte 5's runes pair perfectly with **Conversationalist**. You can use the `ConversationHistory` class directly as a store, or wrap it in a class with runes.
660
+
661
+ ```svelte
662
+ <script lang="ts">
663
+ import { ConversationHistory, createConversation } from 'conversationalist';
664
+
665
+ // history implements the Svelte store contract
666
+ const history = new ConversationHistory(createConversation());
667
+ </script>
668
+
669
+ <div>
670
+ {#each $history.messages as m (m.id)}
671
+ <div>{String(m.content)}</div>
672
+ {/each}
673
+ <button onclick={() => history.appendUserMessage('Hello!')}>
674
+ Send
675
+ </button>
676
+ </div>
677
+ ```
678
+
679
+ > **Note**: `ConversationHistory.addEventListener()` returns an unsubscribe function, which is ideal for cleaning up reactive effects in Svelte 5 or React hooks.
680
+
681
+ ## API Overview
682
+
683
+ | Category | Key Functions |
684
+ | :--------------- | :------------------------------------------------------------------------------------------------------- |
685
+ | **Creation** | `createConversation`, `serializeConversation`, `deserializeConversation` |
686
+ | **Appending** | `appendUserMessage`, `appendAssistantMessage`, `appendSystemMessage`, `appendMessages` |
687
+ | **Streaming** | `appendStreamingMessage`, `updateStreamingMessage`, `finalizeStreamingMessage`, `cancelStreamingMessage` |
688
+ | **Modification** | `redactMessageAtPosition`, `replaceSystemMessage`, `collapseSystemMessages` |
689
+ | **Context** | `truncateToTokenLimit`, `getRecentMessages`, `estimateConversationTokens` |
690
+ | **Querying** | `getConversationMessages`, `getMessageByIdentifier`, `computeConversationStatistics` |
691
+ | **History** | `ConversationHistory`, `bindToConversationHistory` |
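+
+ A brief sketch of the query helpers from the table above (return shapes abbreviated in the comments):
+
+ ```ts
+ import {
+ computeConversationStatistics,
+ getConversationMessages,
+ getMessageByIdentifier,
+ } from 'conversationalist';
+
+ const visible = getConversationMessages(conversation); // hidden messages excluded by default
+ const everything = getConversationMessages(conversation, { includeHidden: true });
+
+ const found = getMessageByIdentifier(conversation, visible[0].id); // Message | undefined
+
+ const stats = computeConversationStatistics(conversation);
+ // { total, byRole, hidden, withImages }
+ ```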
692
+
693
+ ## Deterministic Environments (Testing)
694
+
695
+ Pass a custom environment to control timestamps and IDs, making your tests 100% predictable.
696
+
697
+ ```ts
698
+ const testEnv = {
699
+ now: () => '2024-01-01T00:00:00.000Z',
700
+ randomId: () => 'fixed-id',
701
+ };
702
+
703
+ let conversation = createConversation({ title: 'Test' }, testEnv);
355
704
  ```
356
705
 
357
706
  ## Development
@@ -359,11 +708,6 @@ try {
359
708
  ```bash
360
709
  bun install
361
710
  bun test
362
- bun run typecheck
363
- bun run lint
364
711
  bun run build
365
712
  ```
366
-
367
- ## License
368
-
369
- MIT
713