conversationalist 0.0.2 → 0.0.3
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +418 -287
- package/dist/context.d.ts +22 -19
- package/dist/context.d.ts.map +1 -1
- package/dist/conversation.d.ts.map +1 -1
- package/dist/environment.d.ts +13 -1
- package/dist/environment.d.ts.map +1 -1
- package/dist/history.d.ts +165 -0
- package/dist/history.d.ts.map +1 -0
- package/dist/index.d.ts +3 -0
- package/dist/index.d.ts.map +1 -1
- package/dist/index.js +461 -141
- package/dist/index.js.map +9 -8
- package/dist/plugins/index.d.ts +2 -0
- package/dist/plugins/index.d.ts.map +1 -0
- package/dist/plugins/pii-redaction.d.ts +36 -0
- package/dist/plugins/pii-redaction.d.ts.map +1 -0
- package/dist/types.d.ts +22 -0
- package/dist/types.d.ts.map +1 -1
- package/dist/with-conversation.d.ts +3 -3
- package/dist/with-conversation.d.ts.map +1 -1
- package/package.json +8 -1
package/README.md
CHANGED
@@ -1,14 +1,52 @@
# Conversationalist

-A TypeScript-first library for managing LLM conversation state with immutable updates
+A TypeScript-first library for managing LLM conversation state with **immutable updates**, **type-safe APIs**, and **provider-agnostic adapters**.

-
+[](https://github.com/stevekinney/conversationalist/actions/workflows/test.yml)
+[](https://opensource.org/licenses/MIT)

-
-
-
-
-
+## What is Conversationalist?
+
+**Conversationalist** is a state engine for LLM-driven applications. While most libraries focus on making the API calls themselves, Conversationalist focuses on the **state** that lives between those calls. It provides a unified, model-agnostic representation of a conversation that can be easily stored, serialized, and adapted for any major LLM provider (OpenAI, Anthropic, Gemini).
+
+In a modern AI application, a conversation is more than just a list of strings. It involves:
+
+- **Tool Use**: Pairing function calls with their results and ensuring they stay in sync.
+- **Hidden Logic**: Internal "thought" messages or snapshots that should be saved but never sent to the provider.
+- **Metadata**: Tracking tags, custom IDs, and tokens across different steps.
+- **Streaming**: Gracefully handling partial messages in a UI without messy state transitions.
+
+Conversationalist handles these complexities through a robust, type-safe API that treats your conversation as the single source of truth.
+
+## Why Use It?
+
+Managing LLM conversations manually often leads to provider lock-in or fragile glue code. Conversationalist solves this by:
+
+- **Decoupling Logic from Providers**: Write your business logic once against Conversationalist's message model, and use adapters to talk to OpenAI, Anthropic, or Gemini.
+- **Built-in Context Management**: Automatically handle context window limits by truncating history while preserving critical system instructions or recent messages.
+- **Type Safety Out of the Box**: Built with Zod and TypeScript, ensuring that your conversation data is valid at both runtime and compile time.
+- **Unified Serialization**: One standard format (`ConversationJSON`) for your database, your frontend, and your backend.
+
+## The Immutable Advantage
+
+At its core, Conversationalist is **strictly immutable**. Every change to a conversation, whether appending a message, updating a stream, or redacting sensitive data, returns a _new_ conversation object.
+
+This approach offers several critical advantages for modern application development:
+
+1. **React/Redux Friendly**: Because updates return new references, they trigger re-renders naturally and work seamlessly with state management libraries.
+2. **Concurrency Safe**: You can safely pass a conversation to multiple functions or async tasks without worrying about one part of your app mutating it out from under another.
+3. **Easy Branching & Replay**: Want to let a user "undo" an AI response or branch a conversation into two different paths? Simply keep a reference to the previous immutable state. No complex cloning required.
+4. **Auditability**: Timestamps and message positions are automatically managed and preserved, making it easy to reconstruct the exact state of a chat at any point in time.
+
+## Real-World Use Cases
+
+- **Multi-Model Chatbots**: Build a UI where users can switch between GPT-4o and Claude 3.5 Sonnet mid-conversation without losing history.
+- **Chain-of-Thought Workflows**: Use `hidden` messages to store internal reasoning or intermediate steps that the AI uses to reach a final answer, without cluttering the user's view.
+- **Agentic Workflows**: Track complex tool-use loops where multiple functions are called in sequence, ensuring every result is correctly paired with its corresponding call ID.
+- **Token Budgeting**: Automatically trim old messages when a conversation gets too long, keeping your API costs predictable so you never hit provider limits.
+- **Deterministic Testing**: Use the custom `environment` parameter to mock IDs and timestamps, allowing you to write fully deterministic tests for your chat logic.
+
+---

## Installation

@@ -18,8 +56,7 @@ npm add conversationalist zod
pnpm add conversationalist zod
```

-This package is ESM-only. Use `import` syntax.
-Zod is a peer dependency and must be installed by your app.
+This package is ESM-only. Use `import` syntax. Zod is a peer dependency and must be installed by your application.

## Quick Start

@@ -28,68 +65,29 @@ import {
  appendAssistantMessage,
  appendUserMessage,
  createConversation,
-  deserializeConversation,
  serializeConversation,
} from 'conversationalist';
import { toOpenAIMessages } from 'conversationalist/openai';

+// 1. Create a conversation
let conversation = createConversation({
-  title: 'Support
-
+  title: 'Order Support',
+  metadata: { userId: 'user_123' },
});

+// 2. Add messages (returns a new conversation object)
conversation = appendUserMessage(conversation, 'Where is my order?');
conversation = appendAssistantMessage(conversation, 'Let me check that for you.');

-
-
+// 3. Adapt for a provider
+const openAIMessages = toOpenAIMessages(conversation);
+// [{ role: 'user', content: 'Where is my order?' }, ...]

-
-const
+// 4. Save to your database
+const data = serializeConversation(conversation);
+// db.save(data.id, JSON.stringify(data));
```

-## End-to-End Example (Store + Resume)
-
-```ts
-import {
-  appendAssistantMessage,
-  appendSystemMessage,
-  appendUserMessage,
-  createConversation,
-  deserializeConversation,
-  serializeConversation,
-} from 'conversationalist';
-import { toOpenAIMessages } from 'conversationalist/openai';
-
-let conversation = createConversation({ title: 'Order lookup' });
-conversation = appendSystemMessage(conversation, 'You are a support agent.');
-conversation = appendUserMessage(conversation, 'Where is order 123?');
-
-const response = await openai.chat.completions.create({
-  model: 'gpt-4.1-mini',
-  messages: toOpenAIMessages(conversation),
-});
-
-conversation = appendAssistantMessage(
-  conversation,
-  response.choices[0]?.message?.content ?? '',
-);
-
-const stored = serializeConversation(conversation);
-// await db.save(stored);
-
-let restored = deserializeConversation(stored);
-restored = appendUserMessage(restored, 'Can you email me the tracking link?');
-```
-
-## When to Use
-
-- Build multi-provider chat apps with a consistent message model.
-- Store and replay conversation history safely.
-- Drive streaming UIs without ad-hoc state machines.
-- Enforce tool call correctness and pair calls with results.
-- Manage context windows and token limits in one place.
-
## Core Concepts

### Conversations
@@ -112,7 +110,7 @@ const conversation = createConversation({
Messages have roles and can contain text or multi-modal content. Optional fields include
`metadata`, `hidden`, `tokenUsage`, `toolCall`, `toolResult`, and `goalCompleted`.

-Roles
+**Roles**: `user`, `assistant`, `system`, `developer`, `tool-use`, `tool-result`, `snapshot`.
The `snapshot` role is for internal state and is skipped by adapters.

```ts
@@ -127,8 +125,8 @@ conversation = appendMessages(conversation, {
});
```

-Hidden messages remain in history but are skipped by default when querying or adapting to
-providers.
+**Hidden messages** remain in history but are skipped by default when querying or adapting to
+providers. This is perfect for internal logging or "thinking" steps.

### Tool Calls

@@ -155,8 +153,6 @@ conversation = appendMessages(
);
```

-Use `pairToolCallsWithResults` to render tool calls alongside their results.
-
### Streaming

Streaming helpers let you append a placeholder, update it as chunks arrive, and finalize
@@ -182,62 +178,116 @@ conversation = finalizeStreamingMessage(conversation, messageId, {
});
```

-### Context Window
+### Context Window Management

-
+Automatically trim history to fit token budgets or to keep only recent messages.

```ts
import { simpleTokenEstimator, truncateToTokenLimit } from 'conversationalist';

-conversation = truncateToTokenLimit(conversation, 4000,
+conversation = truncateToTokenLimit(conversation, 4000, {
  preserveSystemMessages: true,
  preserveLastN: 2,
});
```

-
+#### Custom Token Counters
+
+You can provide a custom token estimator (e.g. using `tiktoken` or `anthropic-tokenizer`) by passing it in the options or by binding it to your environment.
+
+```ts
+import { truncateToTokenLimit } from 'conversationalist';
+// import { get_encoding } from 'tiktoken';
+
+const tiktokenEstimator = (message) => {
+  // Your logic here...
+  return 100;
+};
+
+// 1. Pass directly in options
+conversation = truncateToTokenLimit(conversation, 4000, {
+  estimateTokens: tiktokenEstimator,
+});

-
+// 2. Or bind to a history instance/environment
+const history = new ConversationHistory(conversation, {
+  estimateTokens: tiktokenEstimator,
+});
+
+const boundTruncate = history.bind(truncateToTokenLimit);
+boundTruncate(4000); // Uses tiktokenEstimator automatically
+```

-
+## Plugins
+
+**Conversationalist** supports a plugin system that allows you to transform messages as they are appended to a conversation. Plugins are functions that take a `MessageInput` and return a modified `MessageInput`.
+
+### PII Redaction Plugin
+
+The library includes a built-in `piiRedactionPlugin` that can automatically redact emails, phone numbers, and common API key patterns.
+
+```ts
+import {
+  appendUserMessage,
+  createConversation,
+  piiRedactionPlugin,
+} from 'conversationalist';
+
+// 1. Enable by adding to your environment
+const env = {
+  plugins: [piiRedactionPlugin],
+};
+
+// 2. Use the environment when appending messages
+let conversation = createConversation({}, env);
+conversation = appendUserMessage(
+  conversation,
+  'Contact me at test@example.com',
+  undefined,
+  env,
+);
+
+console.log(conversation.messages[0].content);
+// "Contact me at [EMAIL_REDACTED]"
+```
+
+When using `ConversationHistory`, you only need to provide the plugin once during initialization:
+
+```ts
+const history = new ConversationHistory(createConversation(), {
+  plugins: [piiRedactionPlugin],
+});
+
+const appendUser = history.bind(appendUserMessage);
+appendUser('My key is sk-12345...'); // Automatically redacted
+```
+
+## Provider Adapters
+
+Convert the same conversation into provider-specific formats. Adapters automatically skip hidden/snapshot messages and map roles correctly.

```ts
-import { toOpenAIMessages
+import { toOpenAIMessages } from 'conversationalist/openai';
import { toAnthropicMessages } from 'conversationalist/anthropic';
import { toGeminiMessages } from 'conversationalist/gemini';
```

-- OpenAI
-- Anthropic
-- Gemini
+- **OpenAI**: Supports `toOpenAIMessages` and `toOpenAIMessagesGrouped` (which groups consecutive tool calls).
+- **Anthropic**: Maps system messages and tool blocks to Anthropic's specific format.
+- **Gemini**: Handles Gemini's unique content/part structure.

-###
+### Provider-Specific Examples

-#### OpenAI
+#### OpenAI (with Tool Calls)

```ts
import { appendAssistantMessage, appendMessages } from 'conversationalist';
import { toOpenAIMessages } from 'conversationalist/openai';

-const tools = [
-  {
-    type: 'function',
-    function: {
-      name: 'getWeather',
-      description: 'Get current weather by city.',
-      parameters: {
-        type: 'object',
-        properties: { city: { type: 'string' } },
-        required: ['city'],
-      },
-    },
-  },
-];
-
const response = await openai.chat.completions.create({
-  model: 'gpt-
+  model: 'gpt-4o',
  messages: toOpenAIMessages(conversation),
-  tools,
+  tools: [{ type: 'function', function: { name: 'getWeather', ... } }],
});

const toolCalls = response.choices[0]?.message?.tool_calls ?? [];
@@ -245,11 +295,7 @@ for (const call of toolCalls) {
  conversation = appendMessages(conversation, {
    role: 'tool-use',
    content: '',
-    toolCall: {
-      id: call.id,
-      name: call.function.name,
-      arguments: call.function.arguments,
-    },
+    toolCall: { id: call.id, name: call.function.name, arguments: call.function.arguments },
  });

  const result = await getWeather(JSON.parse(call.function.arguments));
@@ -259,43 +305,20 @@ for (const call of toolCalls) {
    toolResult: { callId: call.id, outcome: 'success', content: result },
  });
}
-
-const followUp = await openai.chat.completions.create({
-  model: 'gpt-4.1-mini',
-  messages: toOpenAIMessages(conversation),
-  tools,
-});
-
-conversation = appendAssistantMessage(
-  conversation,
-  followUp.choices[0]?.message?.content ?? '',
-);
```

-#### Anthropic
+#### Anthropic (with Tool Calls)

```ts
import { appendAssistantMessage, appendMessages } from 'conversationalist';
import { toAnthropicMessages } from 'conversationalist/anthropic';

-const tools = [
-  {
-    name: 'getWeather',
-    description: 'Get current weather by city.',
-    input_schema: {
-      type: 'object',
-      properties: { city: { type: 'string' } },
-      required: ['city'],
-    },
-  },
-];
-
const { system, messages } = toAnthropicMessages(conversation);
const response = await anthropic.messages.create({
  model: 'claude-3-5-sonnet-20240620',
  system,
  messages,
-  tools,
+  tools: [{ name: 'getWeather', ... }],
});

for (const block of response.content) {
@@ -306,64 +329,33 @@ for (const block of response.content) {
      toolCall: { id: block.id, name: block.name, arguments: block.input },
    });

-    const result = await getWeather(block.input
+    const result = await getWeather(block.input);
    conversation = appendMessages(conversation, {
      role: 'tool-result',
      content: '',
      toolResult: { callId: block.id, outcome: 'success', content: result },
    });
}
-
-const followUp = await anthropic.messages.create({
-  model: 'claude-3-5-sonnet-20240620',
-  ...toAnthropicMessages(conversation),
-  tools,
-});
-
-const assistantText = followUp.content
-  .filter((block) => block.type === 'text')
-  .map((block) => block.text)
-  .join('\n');
-
-conversation = appendAssistantMessage(conversation, assistantText);
```

-#### Gemini
-
-Gemini does not include tool call IDs, so generate one to pair the tool result.
+#### Gemini (with Tool Calls)

```ts
import { appendMessages } from 'conversationalist';
import { toGeminiMessages } from 'conversationalist/gemini';

-const tools = [
-  {
-    functionDeclarations: [
-      {
-        name: 'getWeather',
-        description: 'Get current weather by city.',
-        parameters: {
-          type: 'object',
-          properties: { city: { type: 'string' } },
-          required: ['city'],
-        },
-      },
-    ],
-  },
-];
-
const { systemInstruction, contents } = toGeminiMessages(conversation);
const response = await model.generateContent({
  systemInstruction,
  contents,
-  tools,
+  tools: [{ functionDeclarations: [{ name: 'getWeather', ... }] }],
});

const parts = response.response.candidates?.[0]?.content?.parts ?? [];
for (const part of parts) {
  if (!('functionCall' in part)) continue;
-  const callId = crypto.randomUUID();
-  const args = part.functionCall.args
+  const callId = crypto.randomUUID(); // Gemini doesn't provide IDs, so we generate one
+  const args = part.functionCall.args;

  conversation = appendMessages(conversation, {
    role: 'tool-use',
@@ -378,26 +370,14 @@ for (const part of parts) {
    toolResult: { callId, outcome: 'success', content: result },
  });
}
-
-const followUp = await model.generateContent({
-  ...toGeminiMessages(conversation),
-  tools,
-});
```

-## Builder Pattern
+## Builder Pattern (Fluent API)

-
-conversations.
+If you prefer a more fluent style, use `withConversation` or `pipeConversation`. These allow you to "mutate" a draft within a scope while still resulting in an immutable object.

```ts
-import {
-  appendSystemMessage,
-  appendUserMessage,
-  createConversation,
-  pipeConversation,
-  withConversation,
-} from 'conversationalist';
+import { withConversation, createConversation } from 'conversationalist';

const conversation = withConversation(createConversation(), (draft) => {
  draft
@@ -405,166 +385,322 @@ const conversation = withConversation(createConversation(), (draft) => {
    .appendUserMessage('Hello!')
    .appendAssistantMessage('Hi there!');
});
+```
+
+`pipeConversation` allows you to chain multiple transformation functions together:

-
+```ts
+import {
+  createConversation,
+  pipeConversation,
+  appendSystemMessage,
+  appendUserMessage,
+} from 'conversationalist';
+
+const conversation = pipeConversation(
  createConversation(),
-  (c) => appendSystemMessage(c, 'You are helpful.'),
+  (c) => appendSystemMessage(c, 'You are a helpful assistant.'),
  (c) => appendUserMessage(c, 'Hello!'),
+  (c) => appendAssistantMessage(c, 'Hi there!'),
);
```

-
-context window utilities.
+## Conversation History (Undo/Redo)

-
+Use the `ConversationHistory` class to manage a stack of conversation states. Because every change returns a new immutable object, supporting undo/redo is built into the architecture.
+
+```ts
+import {
+  ConversationHistory,
+  createConversation,
+  appendUserMessage,
+} from 'conversationalist';

-
+const history = new ConversationHistory(createConversation());

-
-
-
-
-
+// You can use convenience methods that automatically track state
+history.appendUserMessage('Hello!');
+history.appendAssistantMessage('How are you?');
+
+history.undo(); // State reverts to just "Hello!"
+history.redo(); // State advances back to "How are you?"
+
+// Convenience methods for all library utilities are built-in
+history.appendUserMessage('Another message');
+history.redactMessageAtPosition(0);
+history.truncateToTokenLimit(4000);
+
+// Query methods work on the current state
+const messages = history.getMessages();
+const stats = history.getStatistics();
+const tokens = history.estimateTokens();
```

-###
+### Event Subscription
+
+`ConversationHistory` implements `EventTarget` and follows the Svelte store contract. You can listen for changes using standard DOM events or the `subscribe` method.
+
+#### Using DOM Events

```ts
-
-
-
-
+const history = new ConversationHistory(createConversation());
+
+// addEventListener returns a convenient unsubscribe function
+const unsubscribe = history.addEventListener('change', (event) => {
+  const { type, conversation } = event.detail;
+  console.log(`History updated via ${type}`);
+});
+
+history.appendUserMessage('Hello!'); // Fires 'push' and 'change' events
+
+unsubscribe(); // Clean up when done
```

-
+#### Using the Store Contract

```ts
-
-
-
-
-getFirstSystemMessage(conversation)
-getSystemMessages(conversation)
+// Subscribe returns an unsubscribe function and calls the callback immediately
+const unsubscribe = history.subscribe((conversation) => {
+  console.log('Current conversation state:', conversation);
+});
```

-
+You can also use an `AbortSignal` for automatic cleanup:

```ts
-
-
-
-
-
+const controller = new AbortController();
+history.addEventListener('change', (e) => { ... }, { signal: controller.signal });
+
+// Later...
+controller.abort();
```

-###
+### Conversation Branching
+
+The `ConversationHistory` class supports branching. When you undo to a previous state and push a new update, it creates an alternate path instead of deleting the old history.

```ts
-
+const history = new ConversationHistory(createConversation());
+
+history.push(appendUserMessage(history.current, 'Path A'));
+history.undo();
+
+history.push(appendUserMessage(history.current, 'Path B'));
+
+console.log(history.branchCount); // 2
+console.log(history.current.messages[0].content); // "Path B"
+
+history.switchToBranch(0);
+console.log(history.current.messages[0].content); // "Path A"
```

-###
+### Serialization
+
+You can serialize the entire history tree (including all branches) to JSON and reconstruct it later.

```ts
-
-
-
-
-
-
+// 1. Save to JSON
+const json = history.toJSON();
+// localStorage.setItem('chat_history', JSON.stringify(json));
+
+// 2. Restore from JSON
+const restored = ConversationHistory.from(json);
+
+// You can also provide a new environment (e.g. with fresh token counters)
+const restoredWithEnv = ConversationHistory.from(json, {
+  estimateTokens: myNewEstimator,
+});
```

477
|
-
|
|
515
|
+
## Integration
|
|
478
516
|
|
|
479
|
-
|
|
480
|
-
|
|
481
|
-
|
|
482
|
-
|
|
483
|
-
|
|
484
|
-
|
|
485
|
-
|
|
486
|
-
|
|
487
|
-
|
|
488
|
-
)
|
|
489
|
-
|
|
490
|
-
|
|
517
|
+
### Using with React
|
|
518
|
+
|
|
519
|
+
Because **Conversationalist** is immutable, it works perfectly with React's `useState` or `useReducer`. Every update returns a new reference, which automatically triggers a re-render.
|
|
520
|
+
|
|
521
|
+
```tsx
|
|
522
|
+
import { useState } from 'react';
|
|
523
|
+
import { createConversation, appendUserMessage } from 'conversationalist';
|
|
524
|
+
|
|
525
|
+
export function ChatApp() {
|
|
526
|
+
const [conversation, setConversation] = useState(() => createConversation());
|
|
527
|
+
|
|
528
|
+
const handleSend = (text: string) => {
|
|
529
|
+
// The new conversation object is set into state
|
|
530
|
+
setConversation((prev) => appendUserMessage(prev, text));
|
|
531
|
+
};
|
|
532
|
+
|
|
533
|
+
return (
|
|
534
|
+
<div>
|
|
535
|
+
{conversation.messages.map((m) => (
|
|
536
|
+
<div key={m.id}>{String(m.content)}</div>
|
|
537
|
+
))}
|
|
538
|
+
<button onClick={() => handleSend('Hello!')}>Send</button>
|
|
539
|
+
</div>
|
|
540
|
+
);
|
|
541
|
+
}
|
|
491
542
|
```
|
|
492
543
|
|
|
493
|
-
|
|
544
|
+
#### Custom React Hook Example
|
|
494
545
|
|
|
495
|
-
|
|
496
|
-
|
|
497
|
-
|
|
498
|
-
|
|
499
|
-
|
|
500
|
-
|
|
501
|
-
|
|
546
|
+
For more complex applications, you can wrap the logic into a custom hook. This example uses `addEventListener` to sync the history with local React state and returns the unsubscribe function for easy cleanup in `useEffect`.
|
|
547
|
+
|
|
548
|
+
```tsx
|
|
549
|
+
import { useState, useCallback, useEffect } from 'react';
|
|
550
|
+
import {
|
|
551
|
+
createConversation,
|
|
552
|
+
ConversationHistory,
|
|
553
|
+
toChatMessages,
|
|
554
|
+
} from 'conversationalist';
|
|
555
|
+
|
|
556
|
+
export function useChat(initialTitle?: string) {
|
|
557
|
+
// 1. Initialize history (this could also come from context or props)
|
|
558
|
+
const [history] = useState(
|
|
559
|
+
() => new ConversationHistory(createConversation({ title: initialTitle })),
|
|
560
|
+
);
|
|
561
|
+
|
|
562
|
+
// 2. Sync history with local state for reactivity
|
|
563
|
+
const [conversation, setConversation] = useState(history.current);
|
|
564
|
+
const [loading, setLoading] = useState(false);
|
|
565
|
+
|
|
566
|
+
useEffect(() => {
|
|
567
|
+
// addEventListener returns a cleanup function!
|
|
568
|
+
return history.addEventListener('change', (e) => {
|
|
569
|
+
setConversation(e.detail.conversation);
|
|
570
|
+
});
|
|
571
|
+
}, [history]);
|
|
572
|
+
|
|
573
|
+
const sendMessage = useCallback(
|
|
574
|
+
async (text: string) => {
|
|
575
|
+
history.appendUserMessage(text);
|
|
576
|
+
setLoading(true);
|
|
577
|
+
|
|
578
|
+
try {
|
|
579
|
+
const response = await fetch('/api/chat', {
|
|
580
|
+
method: 'POST',
|
|
581
|
+
body: JSON.stringify({
|
|
582
|
+
messages: history.toChatMessages(),
|
|
583
|
+
}),
|
|
584
|
+
});
|
|
585
|
+
const data = await response.json();
|
|
586
|
+
history.appendAssistantMessage(data.answer);
|
|
587
|
+
} finally {
|
|
588
|
+
setLoading(false);
|
|
589
|
+
}
|
|
590
|
+
},
|
|
591
|
+
[history],
|
|
592
|
+
);
|
|
593
|
+
|
|
594
|
+
return {
|
|
595
|
+
conversation,
|
|
596
|
+
messages: conversation.messages,
|
|
597
|
+
loading,
|
|
598
|
+
sendMessage,
|
|
599
|
+
undo: () => history.undo(),
|
|
600
|
+
redo: () => history.redo(),
|
|
601
|
+
};
|
|
602
|
+
}
|
|
502
603
|
```
|
|

> **Note**: `ConversationHistory.addEventListener()` returns an unsubscribe function, which is ideal for cleaning up effects in React (`useEffect`) or Svelte.

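The unsubscribe contract the note describes can be modeled in a few lines. The `TinyEmitter` below is a hypothetical stand-in illustrating the assumed shape (register a handler, get back a function that removes it), not the library's implementation:

```typescript
type Handler<T> = (event: T) => void;

// Hypothetical minimal emitter mirroring the assumed contract.
class TinyEmitter<T> {
  private handlers = new Set<Handler<T>>();

  // Registering returns a cleanup function that unsubscribes the handler.
  addEventListener(handler: Handler<T>): () => void {
    this.handlers.add(handler);
    return () => {
      this.handlers.delete(handler);
    };
  }

  emit(event: T): void {
    for (const handler of this.handlers) handler(event);
  }
}

const emitter = new TinyEmitter<string>();
const seen: string[] = [];
const unsubscribe = emitter.addEventListener((e) => seen.push(e));
emitter.emit('a');
unsubscribe();
emitter.emit('b'); // not recorded: the handler was removed
console.log(seen); // [ 'a' ]
```

Returning the cleanup function directly is what lets you `return history.addEventListener(...)` from a `useEffect` body.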
### Using with Redux

Redux requires immutable state updates, making **Conversationalist** an ideal companion. You can store the conversation object directly in your store.

```ts
import { createSlice, PayloadAction } from '@reduxjs/toolkit';
import { createConversation, appendUserMessage, Conversation } from 'conversationalist';

interface ChatState {
  conversation: Conversation;
}

const chatSlice = createSlice({
  name: 'chat',
  initialState: {
    conversation: createConversation(),
  } as ChatState,
  reducers: {
    userMessageReceived: (state, action: PayloadAction<string>) => {
      // Redux Toolkit's createSlice uses Immer, but since appendUserMessage
      // returns a new object, we can just replace the property.
      state.conversation = appendUserMessage(state.conversation, action.payload);
    },
  },
});
```
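The reducer above is safe precisely because `appendUserMessage` returns a new object instead of mutating the old one. Here is a dependency-free sketch of that pattern; the `Message` and `Conversation` shapes are simplified stand-ins for illustration, not the library's real types:

```typescript
// Simplified stand-in types (assumption, not the library's definitions).
type Message = { role: 'user' | 'assistant'; content: string };
type Conversation = { messages: readonly Message[] };

// Append returns a fresh object via structural sharing, so replacing
// the reference is a valid immutable update in any Redux-style store.
function appendUserMessage(conversation: Conversation, content: string): Conversation {
  return { messages: [...conversation.messages, { role: 'user', content }] };
}

const before: Conversation = { messages: [] };
const after = appendUserMessage(before, 'Hello');

// The original is untouched; only the new value sees the message.
console.log(before.messages.length, after.messages.length); // 0 1
```

Because every update produces a new reference, change detection by identity (`===`), memoized selectors, and time travel all work without extra bookkeeping.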

### Using with Svelte (Runes)

In Svelte 5, you can manage conversation state using the `$state` rune. Since **Conversationalist** is immutable, you update the state by re-assigning the variable with a new conversation object.

```svelte
<script lang="ts">
  import { createConversation, appendUserMessage } from 'conversationalist';

  let conversation = $state(createConversation());

  function handleSend(text: string) {
    conversation = appendUserMessage(conversation, text);
  }
</script>

<div>
  {#each conversation.messages as m (m.id)}
    <div>{String(m.content)}</div>
  {/each}
  <button onclick={() => handleSend('Hello!')}>Send</button>
</div>
```

#### Using `ConversationHistory` as a Svelte Store

Svelte 5's runes pair perfectly with **Conversationalist**. You can use the `ConversationHistory` class directly as a store, or wrap it in a class with runes.

```svelte
<script lang="ts">
  import { ConversationHistory, createConversation } from 'conversationalist';

  // history implements the Svelte store contract
  const history = new ConversationHistory(createConversation());
</script>

<div>
  {#each $history.messages as m (m.id)}
    <div>{String(m.content)}</div>
  {/each}
  <button onclick={() => history.appendUserMessage('Hello!')}>
    Send
  </button>
</div>
```
## API Overview

| Category         | Key Functions                                                                                            |
| :--------------- | :------------------------------------------------------------------------------------------------------- |
| **Creation**     | `createConversation`, `serializeConversation`, `deserializeConversation`                                 |
| **Appending**    | `appendUserMessage`, `appendAssistantMessage`, `appendSystemMessage`, `appendMessages`                   |
| **Streaming**    | `appendStreamingMessage`, `updateStreamingMessage`, `finalizeStreamingMessage`, `cancelStreamingMessage` |
| **Modification** | `redactMessageAtPosition`, `replaceSystemMessage`, `collapseSystemMessages`                              |
| **Context**      | `truncateToTokenLimit`, `getRecentMessages`, `estimateConversationTokens`                                |
| **Querying**     | `getConversationMessages`, `getMessageByIdentifier`, `computeConversationStatistics`                     |
| **History**      | `ConversationHistory`, `bindToConversationHistory`                                                       |

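As one example from the table, context-window trimming along the lines of `truncateToTokenLimit` can be sketched as follows. The 4-characters-per-token heuristic, the helper names, and the signature are all illustrative assumptions, not the library's actual implementation:

```typescript
type Msg = { content: string };

// Crude token estimate: roughly 4 characters per token (a common heuristic).
const estimateTokens = (m: Msg): number => Math.ceil(m.content.length / 4);

// Keep the most recent messages that fit under the token limit,
// walking backwards so older messages are the ones dropped.
function truncate(messages: Msg[], limit: number): Msg[] {
  const kept: Msg[] = [];
  let total = 0;
  for (let i = messages.length - 1; i >= 0; i--) {
    const cost = estimateTokens(messages[i]);
    if (total + cost > limit) break;
    total += cost;
    kept.unshift(messages[i]);
  }
  return kept;
}

const msgs = [
  { content: 'a'.repeat(40) }, // ~10 tokens, oldest
  { content: 'b'.repeat(40) }, // ~10 tokens
  { content: 'c'.repeat(40) }, // ~10 tokens, newest
];
console.log(truncate(msgs, 20).length); // 2
```

The real function may also preserve system messages or account for tool calls; this sketch only shows the keep-the-most-recent shape of the problem.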
## Deterministic Environments (Testing)

Pass a custom environment to control timestamps and IDs, making your tests 100% predictable.

```ts
const testEnv = {
  now: () => '2024-01-01T00:00:00.000Z',
  randomId: () => 'fixed-id',
};

let conversation = createConversation({ title: 'Test' }, testEnv);
```
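The same injection pattern can be sketched without the library. The `Environment` shape and `makeMessage` factory below are assumptions chosen to mirror the example above, not the package's internals:

```typescript
// Assumed environment shape: suppliers for time and ID generation.
type Environment = { now: () => string; randomId: () => string };

// Hypothetical factory that takes its nondeterminism from the environment.
function makeMessage(content: string, env: Environment) {
  return { id: env.randomId(), createdAt: env.now(), content };
}

const testEnv: Environment = {
  now: () => '2024-01-01T00:00:00.000Z',
  randomId: () => 'fixed-id',
};

// With a fixed environment, repeated calls produce identical objects,
// so snapshot tests never churn on timestamps or random IDs.
const a = makeMessage('hi', testEnv);
const b = makeMessage('hi', testEnv);
console.log(JSON.stringify(a) === JSON.stringify(b)); // true
```

In production you would simply omit the second argument and let the default environment use `Date` and real random IDs.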

## Development

```bash
bun install
bun test
bun run build
```