@letta-ai/letta-client 1.0.0 → 1.1.1
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/CHANGELOG.md +22 -0
- package/README.md +282 -268
- package/package.json +1 -1
- package/resources/agents/agents.d.mts +2 -2
- package/resources/agents/agents.d.ts +2 -2
- package/resources/agents/messages.d.mts +27 -1
- package/resources/agents/messages.d.mts.map +1 -1
- package/resources/agents/messages.d.ts +27 -1
- package/resources/agents/messages.d.ts.map +1 -1
- package/resources/identities/properties.d.mts +1 -1
- package/resources/identities/properties.d.ts +1 -1
- package/resources/identities/properties.js +1 -1
- package/resources/identities/properties.mjs +1 -1
- package/resources/models/models.d.mts +2 -1
- package/resources/models/models.d.mts.map +1 -1
- package/resources/models/models.d.ts +2 -1
- package/resources/models/models.d.ts.map +1 -1
- package/resources/models/models.js.map +1 -1
- package/resources/models/models.mjs.map +1 -1
- package/resources/runs/messages.d.mts +1 -1
- package/resources/runs/messages.d.ts +1 -1
- package/resources/runs/messages.js +1 -1
- package/resources/runs/messages.mjs +1 -1
- package/resources/templates/agents.d.mts +19 -2
- package/resources/templates/agents.d.mts.map +1 -1
- package/resources/templates/agents.d.ts +19 -2
- package/resources/templates/agents.d.ts.map +1 -1
- package/resources/templates/agents.js +1 -6
- package/resources/templates/agents.js.map +1 -1
- package/resources/templates/agents.mjs +1 -6
- package/resources/templates/agents.mjs.map +1 -1
- package/resources/templates/index.d.mts +1 -1
- package/resources/templates/index.d.mts.map +1 -1
- package/resources/templates/index.d.ts +1 -1
- package/resources/templates/index.d.ts.map +1 -1
- package/resources/templates/index.js.map +1 -1
- package/resources/templates/index.mjs.map +1 -1
- package/resources/templates/templates.d.mts +2 -2
- package/resources/templates/templates.d.mts.map +1 -1
- package/resources/templates/templates.d.ts +2 -2
- package/resources/templates/templates.d.ts.map +1 -1
- package/resources/templates/templates.js.map +1 -1
- package/resources/templates/templates.mjs.map +1 -1
- package/src/resources/agents/agents.ts +2 -2
- package/src/resources/agents/messages.ts +32 -0
- package/src/resources/identities/properties.ts +1 -1
- package/src/resources/models/models.ts +2 -1
- package/src/resources/runs/messages.ts +1 -1
- package/src/resources/templates/agents.ts +23 -8
- package/src/resources/templates/index.ts +1 -1
- package/src/resources/templates/templates.ts +6 -2
- package/src/version.ts +1 -1
- package/version.d.mts +1 -1
- package/version.d.ts +1 -1
- package/version.js +1 -1
- package/version.mjs +1 -1
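To pick up these changes in an existing project, installing the new version is a standard npm command (the explicit version pin below is optional):

```bash
npm install @letta-ai/letta-client@1.1.1
```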
package/README.md
CHANGED
@@ -1,390 +1,404 @@
-# Letta TypeScript
+# Letta TypeScript SDK

-[](https://www.npmjs.com/package/@letta-ai/letta-client)

+Letta is the platform for building stateful agents: open AI with advanced memory that can learn and self-improve over time.

+### Quicklinks:
+* [**Developer Documentation**](https://docs.letta.com): Learn how to create agents using Python or TypeScript
+* [**TypeScript API Reference**](./reference.md): Complete TypeScript SDK documentation
+* [**Agent Development Environment (ADE)**](https://docs.letta.com/guides/ade/overview): A no-code UI for building stateful agents
+* [**Letta Cloud**](https://app.letta.com/): The fastest way to try Letta

+## Get started

+Install the Letta TypeScript SDK:

-```
+```bash
 npm install @letta-ai/letta-client
 ```

+## Simple Hello World example

+In the example below, we'll create a stateful agent with two memory blocks. We'll initialize the `human` memory block with incorrect information, and correct the agent in our first message - which will trigger the agent to update its own memory with a tool call.

-```js
-import Letta from '@letta-ai/letta-client';
+*To run the examples, you'll need to get a `LETTA_API_KEY` from [Letta Cloud](https://app.letta.com/api-keys), or run your own self-hosted server (see [our guide](https://docs.letta.com/guides/selfhosting))*

-  environment: 'local', // defaults to 'cloud'
-});
-const archive = await client.archives.create({ name: 'name' });
-console.log(archive.id);
-```
+```typescript
+import { LettaClient } from '@letta-ai/letta-client';

+const client = new LettaClient({ apiKey: "LETTA_API_KEY" });
+// const client = new LettaClient({ baseUrl: "http://localhost:8283" }); // if self-hosting

+const agentState = await client.agents.create({
+  model: "openai/gpt-4o-mini",
+  embedding: "openai/text-embedding-3-small",
+  memoryBlocks: [
+    {
+      label: "human",
+      value: "The human's name is Chad. They like vibe coding."
+    },
+    {
+      label: "persona",
+      value: "My name is Sam, a helpful assistant."
+    }
+  ],
+  tools: ["web_search", "run_code"]
 });

-Documentation for each method, request param, and response field are available in docstrings and will appear on hover in most modern editors.
+console.log(agentState.id);
+// agent-d9be...0846

+const response = await client.agents.messages.create(agentState.id, {
+  messages: [
+    {
+      role: "user",
+      content: "Hey, nice to meet you, my name is Brad."
+    }
+  ]
+});

+// the agent will think, then edit its memory using a tool
+for (const message of response.messages) {
+  console.log(message);
+}

+// The content of this memory block will be something like
+// "The human's name is Brad. They like vibe coding."
+// Fetch this block's content with:
+const human_block = await client.agents.blocks.retrieve(agentState.id, "human");
+console.log(human_block.value);
+```

+## Core concepts in Letta:

-await client.agents.importFile({ file: fs.createReadStream('/path/to/file') });
+Letta is built on the [MemGPT](https://arxiv.org/abs/2310.08560) research paper, which introduced the concept of the "LLM Operating System" for memory management:

+1. [**Memory Hierarchy**](https://docs.letta.com/guides/agents/memory): Agents have self-editing memory split between in-context and out-of-context memory
+2. [**Memory Blocks**](https://docs.letta.com/guides/agents/memory-blocks): In-context memory is composed of persistent editable blocks
+3. [**Agentic Context Engineering**](https://docs.letta.com/guides/agents/context-engineering): Agents control their context window using tools to edit, delete, or search memory
+4. [**Perpetual Self-Improving Agents**](https://docs.letta.com/guides/agents/overview): Every agent has a perpetual (infinite) message history

-await client.agents.importFile({ file: await fetch('https://somesite/file') });
+## Local Development

-await client.agents.importFile({ file: await toFile(Buffer.from('my bytes'), 'file') });
-await client.agents.importFile({ file: await toFile(new Uint8Array([0, 1, 2]), 'file') });
+Connect to a local Letta server instead of the cloud:

-or if the API returns a non-success status code (i.e., 4xx or 5xx response),
-a subclass of `APIError` will be thrown:
-<!-- prettier-ignore -->
-```ts
-const archive = await client.archives.create({ name: 'name' }).catch(async (err) => {
-  if (err instanceof Letta.APIError) {
-    console.log(err.status); // 400
-    console.log(err.name); // BadRequestError
-    console.log(err.headers); // {server: 'nginx', ...}
-  } else {
-    throw err;
-  }
+```typescript
+const client = new LettaClient({
+  baseUrl: "http://localhost:8283"
 });
 ```

+Run Letta locally with Docker:

-| 429 | `RateLimitError` |
-| >=500 | `InternalServerError` |
-| N/A | `APIConnectionError` |
+```bash
+docker run \
+  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
+  -p 8283:8283 \
+  -e OPENAI_API_KEY="your_key" \
+  letta/letta:latest
+```

+See the [self-hosting guide](https://docs.letta.com/guides/selfhosting) for more options.

-Connection errors (for example, due to a network connectivity problem), 408 Request Timeout, 409 Conflict,
-429 Rate Limit, and >=500 Internal errors will all be retried by default.
+## Key Features

+### Memory Management ([full guide](https://docs.letta.com/guides/agents/memory-blocks))

+Memory blocks are persistent, editable sections of an agent's context window:
+
+```typescript
+// Create agent with memory blocks
+const agent = await client.agents.create({
+  memoryBlocks: [
+    { label: "persona", value: "I'm a helpful assistant." },
+    { label: "human", value: "User preferences and info." }
+  ]
 });

+// Update blocks manually
+await client.agents.blocks.update(agent.id, "human", {
+  value: "Updated user information"
 });
+
+// Retrieve a block
+const block = await client.agents.blocks.retrieve(agent.id, "human");
 ```

+### Multi-agent Shared Memory ([full guide](https://docs.letta.com/guides/agents/multi-agent-shared-memory))
+
+Memory blocks can be attached to multiple agents. All agents will have an up-to-date view on the contents of the memory block -- if one agent modifies it, the other will see it immediately.

+Here is how to attach a single memory block to multiple agents:

+```typescript
+// Create shared block
+const sharedBlock = await client.blocks.create({
+  label: "organization",
+  value: "Shared team context"
 });

+// Attach to multiple agents
+const agent1 = await client.agents.create({
+  memoryBlocks: [{ label: "persona", value: "I am a supervisor" }],
+  blockIds: [sharedBlock.id]
 });

+const agent2 = await client.agents.create({
+  memoryBlocks: [{ label: "persona", value: "I am a worker" }],
+  blockIds: [sharedBlock.id]
+});
+```

+### Sleep-time Agents ([full guide](https://docs.letta.com/guides/agents/architectures/sleeptime))

+Background agents that share memory with your primary agent:

+```typescript
+const agent = await client.agents.create({
+  model: "openai/gpt-4o-mini",
+  enableSleeptime: true // creates a sleep-time agent
+});
+```

-This method returns as soon as the headers for a successful response are received and does not consume the response body, so you are free to write custom parsing or streaming logic.
+### Agent File Import/Export ([full guide](https://docs.letta.com/guides/agents/agent-file))

-Unlike `.asResponse()` this method consumes the body, returning once it is parsed.
+Save and share agents with the `.af` file format:

-const client = new Letta();
+```typescript
+import { readFileSync } from 'fs';

+// Import agent
+const file = new Blob([readFileSync('/path/to/agent.af')]);
+const agent = await client.agents.importFile(file);

-console.log(archive.id);
+// Export agent
+const schema = await client.agents.exportFile(agent.id);
 ```

-> [!IMPORTANT]
-> All log messages are intended for debugging only. The format and content of log messages
-> may change between releases.
+### MCP Tools ([full guide](https://docs.letta.com/guides/mcp/overview))

+Connect to Model Context Protocol servers:

+```typescript
+// First, create an MCP server (example: weather server)
+const weatherServer = await client.mcpServers.create({
+  server_name: "weather-server",
+  config: {
+    mcp_server_type: "streamable_http",
+    server_url: "https://weather-mcp.example.com/mcp",
+  },
+});

+// List tools available from the MCP server
+const tools = await client.mcpServers.tools.list(weatherServer.id);

+// Create agent with MCP tool
+const agent = await client.agents.create({
+  model: "openai/gpt-4o-mini",
+  toolIds: [tool.id]
 });
 ```

-- `'debug'` - Show debug messages, info, warnings, and errors
-- `'info'` - Show info messages, warnings, and errors
-- `'warn'` - Show warnings and errors (default)
-- `'error'` - Show only errors
-- `'off'` - Disable all logging
-At the `'debug'` level, all HTTP requests and responses are logged, including headers and bodies.
-Some authentication-related headers are redacted, but sensitive data in request and response bodies
-may still be visible.
+### Filesystem ([full guide](https://docs.letta.com/guides/agents/filesystem))

+Give agents access to files:

+```typescript
+import { createReadStream } from 'fs';

-import Letta from '@letta-ai/letta-client';
-import pino from 'pino';
+// Create folder and upload file
+const folder = await client.folders.create({
+  name: "my_folder",
+});

+await client.folders.files.upload(createReadStream("file.txt"), folder.id);

-  logLevel: 'debug', // Send all messages to pino, allowing it to filter
-});
+// Attach to agent
+await client.agents.folders.attach(agent.id, folder.id);
 ```

+### Long-running Agents ([full guide](https://docs.letta.com/guides/agents/long-running))

-endpoints, params, or response properties, the library can still be used.
+Background execution with resumable streaming:

+```typescript
+const stream = await client.agents.messages.create(agent.id, {
+  messages: [{ role: "user", content: "Analyze this dataset" }],
+  background: true
+});

+let runId, lastSeqId;
+for await (const chunk of stream) {
+  runId = chunk.runId;
+  lastSeqId = chunk.seqId;
+}

+// Resume if disconnected
+for await (const chunk of client.runs.stream(runId, { startingAfter: lastSeqId })) {
+  console.log(chunk);
+}
 ```

+### Streaming ([full guide](https://docs.letta.com/guides/agents/streaming))

-parameter. This library doesn't validate at runtime that the request matches the type, so any extra values you
-send will be sent as-is.
+Stream responses in real-time:

-```
-  // @ts-expect-error baz is not yet public
-  baz: 'undocumented option',
+```typescript
+const stream = await client.agents.messages.stream(agent.id, {
+  messages: [{ role: "user", content: "Hello!" }]
 });
-```

+for await (const chunk of stream) {
+  console.log(chunk);
+}
+```

+### Message Types ([full guide](https://docs.letta.com/guides/agents/message-types))
+
+Agent responses contain different message types. Handle them with the `messageType` discriminator:
+
+```typescript
+const messages = await client.agents.messages.list(agent.id);
+
+for (const message of messages) {
+  switch (message.messageType) {
+    case "user_message":
+      console.log("User:", message.content);
+      break;
+    case "assistant_message":
+      console.log("Agent:", message.content);
+      break;
+    case "reasoning_message":
+      console.log("Reasoning:", message.reasoning);
+      break;
+    case "tool_call_message":
+      console.log("Tool:", message.toolCall.name);
+      break;
+    case "tool_return_message":
+      console.log("Result:", message.toolReturn);
+      break;
+  }
+}
+```

+## TypeScript Support

-the response object, or cast the response object to the requisite type. Like the request params, we do not
-validate or strip extra properties from the response from the API.
+Full TypeScript support with exported types:

+```typescript
+import { Letta } from "@letta-ai/letta-client";

+const request: Letta.CreateAgentRequest = {
+  model: "openai/gpt-4o-mini",
+  memoryBlocks: [...]
+};
+```

+## Error Handling

+```typescript
+import { LettaError } from "@letta-ai/letta-client";

+try {
+  await client.agents.messages.create(agentId, {...});
+} catch (err) {
+  if (err instanceof LettaError) {
+    console.log(err.statusCode);
+    console.log(err.message);
+    console.log(err.body);
+  }
+}
 ```

+## Advanced Configuration

-import Letta from '@letta-ai/letta-client';
-import fetch from 'my-fetch';
+### Retries

+```typescript
+const response = await client.agents.create({...}, {
+  maxRetries: 3 // Default: 2
+});
 ```

-If you want to set custom `fetch` options without overriding the `fetch` function, you can provide a `fetchOptions` object when instantiating the client or making a request. (Request-specific options override client options.)
-```ts
-import Letta from '@letta-ai/letta-client';
+### Timeouts

+```typescript
+const response = await client.agents.create({...}, {
+  timeoutInSeconds: 30 // Default: 60
 });
 ```

+### Custom Headers

+```typescript
+const response = await client.agents.create({...}, {
+  headers: {
+    'X-Custom-Header': 'value'
+  }
+});
+```

-import Letta from '@letta-ai/letta-client';
-import * as undici from 'undici';
+### Abort Requests

+```typescript
+const controller = new AbortController();
+const response = await client.agents.create({...}, {
+  abortSignal: controller.signal
 });
+controller.abort();
 ```

+### Raw Response Access

+```typescript
+const { data, rawResponse } = await client.agents
+  .create({...})
+  .withRawResponse();

-  fetchOptions: {
-    proxy: 'http://localhost:8888',
-  },
-});
+console.log(rawResponse.headers['X-My-Header']);
 ```

+### Custom Fetch Client

-const httpClient = Deno.createHttpClient({ proxy: { url: 'http://localhost:8888' } });
-const client = new Letta({
-  fetchOptions: {
-    client: httpClient,
-  },
+```typescript
+const client = new LettaClient({
+  fetcher: yourCustomFetchImplementation
 });
 ```

+## Runtime Compatibility

+Works in:
+- Node.js 18+
+- Vercel
+- Cloudflare Workers
+- Deno v1.25+
+- Bun 1.0+
+- React Native

-1. Changes that only affect static types, without breaking runtime behavior.
-2. Changes to library internals which are technically public but not intended or documented for external use. _(Please open a GitHub issue to let us know if you are relying on such internals.)_
-3. Changes that we do not expect to impact the vast majority of users in practice.
+## Contributing

+Letta is an open source project built by over a hundred contributors. There are many ways to get involved in the Letta OSS project!

+* [**Join the Discord**](https://discord.gg/letta): Chat with the Letta devs and other AI developers.
+* [**Chat on our forum**](https://forum.letta.com/): If you're not into Discord, check out our developer forum.
+* **Follow our socials**: [Twitter/X](https://twitter.com/Letta_AI), [LinkedIn](https://www.linkedin.com/company/letta-ai/), [YouTube](https://www.youtube.com/@letta-ai)

+This SDK is generated programmatically. For SDK changes, please [open an issue](https://github.com/letta-ai/letta-node/issues).

+README contributions are always welcome!

+## Resources

-- Bun 1.0 or later.
-- Cloudflare Workers.
-- Vercel Edge Runtime.
-- Jest 28 or greater with the `"node"` environment (`"jsdom"` is not supported at this time).
-- Nitro v2.6 or greater.
+- [Documentation](https://docs.letta.com)
+- [TypeScript API Reference](./reference.md)
+- [Example Applications](https://github.com/letta-ai/letta-chatbot-example)

+## License

+MIT

+---

+***Legal notices**: By using Letta and related Letta services (such as the Letta endpoint or hosted service), you are agreeing to our [privacy policy](https://www.letta.com/privacy-policy) and [terms of service](https://www.letta.com/terms-of-service).*