letta-nightly 0.11.6.dev20250827050912__py3-none-any.whl → 0.11.6.dev20250828104133__py3-none-any.whl
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- letta_nightly-0.11.6.dev20250828104133.dist-info/METADATA +665 -0
- {letta_nightly-0.11.6.dev20250827050912.dist-info → letta_nightly-0.11.6.dev20250828104133.dist-info}/RECORD +5 -5
- letta_nightly-0.11.6.dev20250827050912.dist-info/METADATA +0 -422
- {letta_nightly-0.11.6.dev20250827050912.dist-info → letta_nightly-0.11.6.dev20250828104133.dist-info}/WHEEL +0 -0
- {letta_nightly-0.11.6.dev20250827050912.dist-info → letta_nightly-0.11.6.dev20250828104133.dist-info}/entry_points.txt +0 -0
- {letta_nightly-0.11.6.dev20250827050912.dist-info → letta_nightly-0.11.6.dev20250828104133.dist-info}/licenses/LICENSE +0 -0
@@ -0,0 +1,665 @@
+Metadata-Version: 2.4
+Name: letta-nightly
+Version: 0.11.6.dev20250828104133
+Summary: Create LLM agents with long-term memory and custom tools
+Author-email: Letta Team <contact@letta.com>
+License: Apache License
+License-File: LICENSE
+Requires-Python: <3.14,>=3.11
+Requires-Dist: aiomultiprocess>=0.9.1
+Requires-Dist: alembic>=1.13.3
+Requires-Dist: anthropic>=0.49.0
+Requires-Dist: apscheduler>=3.11.0
+Requires-Dist: black[jupyter]>=24.2.0
+Requires-Dist: brotli>=1.1.0
+Requires-Dist: certifi>=2025.6.15
+Requires-Dist: colorama>=0.4.6
+Requires-Dist: composio-core>=0.7.7
+Requires-Dist: datamodel-code-generator[http]>=0.25.0
+Requires-Dist: demjson3>=3.0.6
+Requires-Dist: docstring-parser<0.17,>=0.16
+Requires-Dist: faker>=36.1.0
+Requires-Dist: firecrawl-py<3.0.0,>=2.8.0
+Requires-Dist: grpcio-tools>=1.68.1
+Requires-Dist: grpcio>=1.68.1
+Requires-Dist: html2text>=2020.1.16
+Requires-Dist: httpx-sse>=0.4.0
+Requires-Dist: httpx>=0.28.0
+Requires-Dist: jinja2>=3.1.5
+Requires-Dist: letta-client>=0.1.277
+Requires-Dist: llama-index-embeddings-openai>=0.3.1
+Requires-Dist: llama-index>=0.12.2
+Requires-Dist: markitdown[docx,pdf,pptx]>=0.1.2
+Requires-Dist: marshmallow-sqlalchemy>=1.4.1
+Requires-Dist: matplotlib>=3.10.1
+Requires-Dist: mcp[cli]>=1.9.4
+Requires-Dist: mistralai>=1.8.1
+Requires-Dist: nltk>=3.8.1
+Requires-Dist: numpy>=2.1.0
+Requires-Dist: openai>=1.99.9
+Requires-Dist: opentelemetry-api==1.30.0
+Requires-Dist: opentelemetry-exporter-otlp==1.30.0
+Requires-Dist: opentelemetry-instrumentation-requests==0.51b0
+Requires-Dist: opentelemetry-instrumentation-sqlalchemy==0.51b0
+Requires-Dist: opentelemetry-sdk==1.30.0
+Requires-Dist: orjson>=3.11.1
+Requires-Dist: pathvalidate>=3.2.1
+Requires-Dist: prettytable>=3.9.0
+Requires-Dist: pydantic-settings>=2.2.1
+Requires-Dist: pydantic>=2.10.6
+Requires-Dist: pyhumps>=3.8.0
+Requires-Dist: python-box>=7.1.1
+Requires-Dist: python-multipart>=0.0.19
+Requires-Dist: pytz>=2023.3.post1
+Requires-Dist: pyyaml>=6.0.1
+Requires-Dist: questionary>=2.0.1
+Requires-Dist: rich>=13.9.4
+Requires-Dist: sentry-sdk[fastapi]==2.19.1
+Requires-Dist: setuptools>=70
+Requires-Dist: sqlalchemy-json>=0.7.0
+Requires-Dist: sqlalchemy-utils>=0.41.2
+Requires-Dist: sqlalchemy[asyncio]>=2.0.41
+Requires-Dist: sqlmodel>=0.0.16
+Requires-Dist: structlog>=25.4.0
+Requires-Dist: tavily-python>=0.7.2
+Requires-Dist: tqdm>=4.66.1
+Requires-Dist: typer>=0.15.2
+Provides-Extra: bedrock
+Requires-Dist: aioboto3>=14.3.0; extra == 'bedrock'
+Requires-Dist: boto3>=1.36.24; extra == 'bedrock'
+Provides-Extra: cloud-tool-sandbox
+Requires-Dist: e2b-code-interpreter>=1.0.3; extra == 'cloud-tool-sandbox'
+Provides-Extra: desktop
+Requires-Dist: aiosqlite>=0.21.0; extra == 'desktop'
+Requires-Dist: docker>=7.1.0; extra == 'desktop'
+Requires-Dist: fastapi>=0.115.6; extra == 'desktop'
+Requires-Dist: langchain-community>=0.3.7; extra == 'desktop'
+Requires-Dist: langchain>=0.3.7; extra == 'desktop'
+Requires-Dist: locust>=2.31.5; extra == 'desktop'
+Requires-Dist: pgvector>=0.2.3; extra == 'desktop'
+Requires-Dist: sqlite-vec>=0.1.7a2; extra == 'desktop'
+Requires-Dist: uvicorn>=0.24.0.post1; extra == 'desktop'
+Requires-Dist: websockets; extra == 'desktop'
+Requires-Dist: wikipedia>=1.4.0; extra == 'desktop'
+Provides-Extra: dev
+Requires-Dist: autoflake>=2.3.0; extra == 'dev'
+Requires-Dist: black[jupyter]>=24.4.2; extra == 'dev'
+Requires-Dist: ipdb>=0.13.13; extra == 'dev'
+Requires-Dist: ipykernel>=6.29.5; extra == 'dev'
+Requires-Dist: isort>=5.13.2; extra == 'dev'
+Requires-Dist: pexpect>=4.9.0; extra == 'dev'
+Requires-Dist: pre-commit>=3.5.0; extra == 'dev'
+Requires-Dist: pyright>=1.1.347; extra == 'dev'
+Requires-Dist: pytest; extra == 'dev'
+Requires-Dist: pytest-asyncio>=0.24.0; extra == 'dev'
+Requires-Dist: pytest-json-report>=1.5.0; extra == 'dev'
+Requires-Dist: pytest-mock>=3.14.0; extra == 'dev'
+Requires-Dist: pytest-order>=1.2.0; extra == 'dev'
+Provides-Extra: experimental
+Requires-Dist: google-cloud-profiler>=4.1.0; extra == 'experimental'
+Requires-Dist: granian[reload,uvloop]>=2.3.2; extra == 'experimental'
+Requires-Dist: uvloop>=0.21.0; extra == 'experimental'
+Provides-Extra: external-tools
+Requires-Dist: docker>=7.1.0; extra == 'external-tools'
+Requires-Dist: firecrawl-py<3.0.0,>=2.8.0; extra == 'external-tools'
+Requires-Dist: langchain-community>=0.3.7; extra == 'external-tools'
+Requires-Dist: langchain>=0.3.7; extra == 'external-tools'
+Requires-Dist: turbopuffer>=0.5.17; extra == 'external-tools'
+Requires-Dist: wikipedia>=1.4.0; extra == 'external-tools'
+Provides-Extra: google
+Requires-Dist: google-genai>=1.15.0; extra == 'google'
+Provides-Extra: modal
+Requires-Dist: modal>=1.1.0; extra == 'modal'
+Provides-Extra: pinecone
+Requires-Dist: pinecone[asyncio]>=7.3.0; extra == 'pinecone'
+Provides-Extra: postgres
+Requires-Dist: asyncpg>=0.30.0; extra == 'postgres'
+Requires-Dist: pg8000>=1.30.3; extra == 'postgres'
+Requires-Dist: pgvector>=0.2.3; extra == 'postgres'
+Requires-Dist: psycopg2-binary>=2.9.10; extra == 'postgres'
+Requires-Dist: psycopg2>=2.9.10; extra == 'postgres'
+Provides-Extra: redis
+Requires-Dist: redis>=6.2.0; extra == 'redis'
+Provides-Extra: server
+Requires-Dist: fastapi>=0.115.6; extra == 'server'
+Requires-Dist: uvicorn>=0.24.0.post1; extra == 'server'
+Requires-Dist: websockets; extra == 'server'
+Provides-Extra: sqlite
+Requires-Dist: aiosqlite>=0.21.0; extra == 'sqlite'
+Requires-Dist: sqlite-vec>=0.1.7a2; extra == 'sqlite'
+Description-Content-Type: text/markdown
+
+<p align="center">
+  <picture>
+    <source media="(prefers-color-scheme: dark)" srcset="https://raw.githubusercontent.com/letta-ai/letta/refs/heads/main/assets/Letta-logo-RGB_GreyonTransparent_cropped_small.png">
+    <source media="(prefers-color-scheme: light)" srcset="https://raw.githubusercontent.com/letta-ai/letta/refs/heads/main/assets/Letta-logo-RGB_OffBlackonTransparent_cropped_small.png">
+    <img alt="Letta logo" src="https://raw.githubusercontent.com/letta-ai/letta/refs/heads/main/assets/Letta-logo-RGB_GreyonOffBlack_cropped_small.png" width="500">
+  </picture>
+</p>
+
+# Letta (formerly MemGPT)
+
+Letta is the platform for building stateful agents: open AI with advanced memory that can learn and self-improve over time.
+
+### Quicklinks:
+* [**Developer Documentation**](https://docs.letta.com): Learn how to create agents that learn using Python / TypeScript
+* [**Agent Development Environment (ADE)**](https://docs.letta.com/guides/ade/overview): A no-code UI for building stateful agents
+* [**Letta Desktop**](https://docs.letta.com/guides/ade/desktop): A fully-local version of the ADE, available on macOS and Windows
+* [**Letta Cloud**](https://app.letta.com/): The fastest way to try Letta, with agents running in the cloud
+
+## Get started
+
+To get started, install the Letta SDK (available for both Python and TypeScript):
+
+### [Python SDK](https://github.com/letta-ai/letta-python)
+```sh
+pip install letta-client
+```
+
+### [TypeScript / Node.js SDK](https://github.com/letta-ai/letta-node)
+```sh
+npm install @letta-ai/letta-client
+```
+
+## Simple Hello World example
+
+In the example below, we'll create a stateful agent with two memory blocks: one for itself (the `persona` block) and one for the human. We'll initialize the `human` memory block with incorrect information and correct the agent in our first message, which will trigger the agent to update its own memory with a tool call.
+
+*To run the examples, you'll need to get a `LETTA_API_KEY` from [Letta Cloud](https://app.letta.com/api-keys), or run your own self-hosted server (see [our guide](https://docs.letta.com/guides/selfhosting))*
+
+
+### Python
+```python
+from letta_client import Letta
+
+client = Letta(token="LETTA_API_KEY")
+# client = Letta(base_url="http://localhost:8283") # if self-hosting, set your base_url
+
+agent_state = client.agents.create(
+    model="openai/gpt-4.1",
+    embedding="openai/text-embedding-3-small",
+    memory_blocks=[
+        {
+            "label": "human",
+            "value": "The human's name is Chad. They like vibe coding."
+        },
+        {
+            "label": "persona",
+            "value": "My name is Sam, a helpful assistant."
+        }
+    ],
+    tools=["web_search", "run_code"]
+)
+
+print(agent_state.id)
+# agent-d9be...0846
+
+response = client.agents.messages.create(
+    agent_id=agent_state.id,
+    messages=[
+        {
+            "role": "user",
+            "content": "Hey, nice to meet you, my name is Brad."
+        }
+    ]
+)
+
+# the agent will think, then edit its memory using a tool
+for message in response.messages:
+    print(message)
+```
+
+### TypeScript / Node.js
+```typescript
+import { LettaClient } from '@letta-ai/letta-client'
+
+const client = new LettaClient({ token: "LETTA_API_KEY" });
+// const client = new LettaClient({ baseUrl: "http://localhost:8283" }); // if self-hosting, set your baseUrl
+
+const agentState = await client.agents.create({
+  model: "openai/gpt-4.1",
+  embedding: "openai/text-embedding-3-small",
+  memoryBlocks: [
+    {
+      label: "human",
+      value: "The human's name is Chad. They like vibe coding."
+    },
+    {
+      label: "persona",
+      value: "My name is Sam, a helpful assistant."
+    }
+  ],
+  tools: ["web_search", "run_code"]
+});
+
+console.log(agentState.id);
+// agent-d9be...0846
+
+const response = await client.agents.messages.create(
+  agentState.id, {
+    messages: [
+      {
+        role: "user",
+        content: "Hey, nice to meet you, my name is Brad."
+      }
+    ]
+  }
+);
+
+// the agent will think, then edit its memory using a tool
+for (const message of response.messages) {
+  console.log(message);
+}
+```
+
+## Core concepts in Letta:
+
+Letta is made by the creators of [MemGPT](https://arxiv.org/abs/2310.08560), a research paper that introduced the concept of the "LLM Operating System" for memory management. The core concepts in Letta for designing stateful agents follow the MemGPT LLM OS principles:
+
+1. [**Memory Hierarchy**](https://docs.letta.com/guides/agents/memory): Agents have self-editing memory that is split between in-context memory and out-of-context memory
+2. [**Memory Blocks**](https://docs.letta.com/guides/agents/memory-blocks): The agent's in-context memory is composed of persistent editable **memory blocks**
+3. [**Agentic Context Engineering**](https://docs.letta.com/guides/agents/context-engineering): Agents control the context window by using tools to edit, delete, or search for memory
+4. [**Perpetual Self-Improving Agents**](https://docs.letta.com/guides/agents/overview): Every "agent" is a single entity that has a perpetual (infinite) message history
+
+## Multi-agent shared memory ([full guide](https://docs.letta.com/guides/agents/multi-agent-shared-memory))
+
+A single memory block can be attached to multiple agents, allowing for extremely powerful multi-agent shared memory setups.
+For example, you can create two agents that have their own independent memory blocks in addition to a shared memory block.
+
+### Python
+```python
+# create a shared memory block
+shared_block = client.blocks.create(
+    label="organization",
+    description="Shared information between all agents within the organization.",
+    value="Nothing here yet, we should update this over time."
+)
+
+# create a supervisor agent
+supervisor_agent = client.agents.create(
+    model="anthropic/claude-3-5-sonnet-20241022",
+    embedding="openai/text-embedding-3-small",
+    # blocks created for this agent
+    memory_blocks=[{"label": "persona", "value": "I am a supervisor"}],
+    # pre-existing shared block that is "attached" to this agent
+    block_ids=[shared_block.id],
+)
+
+# create a worker agent
+worker_agent = client.agents.create(
+    model="openai/gpt-4.1-mini",
+    embedding="openai/text-embedding-3-small",
+    # blocks created for this agent
+    memory_blocks=[{"label": "persona", "value": "I am a worker"}],
+    # pre-existing shared block that is "attached" to this agent
+    block_ids=[shared_block.id],
+)
+```
+
+### TypeScript / Node.js
+```typescript
+// create a shared memory block
+const sharedBlock = await client.blocks.create({
+  label: "organization",
+  description: "Shared information between all agents within the organization.",
+  value: "Nothing here yet, we should update this over time."
+});
+
+// create a supervisor agent
+const supervisorAgent = await client.agents.create({
+  model: "anthropic/claude-3-5-sonnet-20241022",
+  embedding: "openai/text-embedding-3-small",
+  // blocks created for this agent
+  memoryBlocks: [{ label: "persona", value: "I am a supervisor" }],
+  // pre-existing shared block that is "attached" to this agent
+  blockIds: [sharedBlock.id]
+});
+
+// create a worker agent
+const workerAgent = await client.agents.create({
+  model: "openai/gpt-4.1-mini",
+  embedding: "openai/text-embedding-3-small",
+  // blocks created for this agent
+  memoryBlocks: [{ label: "persona", value: "I am a worker" }],
+  // pre-existing shared block that is "attached" to this agent
+  blockIds: [sharedBlock.id]
+});
+```
+
+## Sleep-time agents ([full guide](https://docs.letta.com/guides/agents/architectures/sleeptime))
+
+In Letta, you can create special **sleep-time agents** that share the memory of your primary agents, but run in the background (like an agent's "subconscious"). You can think of sleep-time agents as a special form of multi-agent architecture.
+
+To enable sleep-time agents for your agent, set the `enable_sleeptime` flag to true when creating your agent. This will automatically create a sleep-time agent alongside your main agent; the sleep-time agent handles the memory editing instead of your primary agent.
+
+### Python
+```python
+agent_state = client.agents.create(
+    ...
+    enable_sleeptime=True,  # <- enable this flag to create a sleep-time agent
+)
+```
+
+### TypeScript / Node.js
+```typescript
+const agentState = await client.agents.create({
+  ...
+  enableSleeptime: true  // <- enable this flag to create a sleep-time agent
+});
+```
+
+## Saving and sharing agents with Agent File (`.af`) ([full guide](https://docs.letta.com/guides/agents/agent-file))
+
+In Letta, all agent data is persisted to disk (Postgres or SQLite), and can be easily imported and exported using the open source [Agent File](https://github.com/letta-ai/agent-file) (`.af`) file format. You can use Agent File to checkpoint your agents, as well as move your agents (and their complete state/memories) between different Letta servers, e.g. between self-hosted Letta and Letta Cloud.
+
+<details>
+<summary>View code snippets</summary>
+
+### Python
+```python
+# Import your .af file from any location
+agent_state = client.agents.import_agent_serialized(file=open("/path/to/agent/file.af", "rb"))
+
+print(f"Imported agent: {agent_state.id}")
+
+# Export your agent into a serialized schema object (which you can write to a file)
+schema = client.agents.export_agent_serialized(agent_id="<AGENT_ID>")
+```
+
+### TypeScript / Node.js
+```typescript
+import { readFileSync } from 'fs';
+import { Blob } from 'buffer';
+
+// Import your .af file from any location
+const file = new Blob([readFileSync('/path/to/agent/file.af')])
+const agentState = await client.agents.importAgentSerialized(file, {})
+
+console.log(`Imported agent: ${agentState.id}`);
+
+// Export your agent into a serialized schema object (which you can write to a file)
+const schema = await client.agents.exportAgentSerialized("<AGENT_ID>");
+```
+</details>
+
+## Model Context Protocol (MCP) and custom tools ([full guide](https://docs.letta.com/guides/mcp/overview))
+
+Letta has rich support for MCP tools (Letta acts as an MCP client), as well as custom Python tools.
+MCP servers can be easily added within the Agent Development Environment (ADE) tool manager UI, as well as via the SDK:
+
+
+<details>
+<summary>View code snippets</summary>
+
+### Python
+```python
+# List tools from an MCP server
+tools = client.tools.list_mcp_tools_by_server(mcp_server_name="weather-server")
+
+# Add a specific tool from the MCP server
+tool = client.tools.add_mcp_tool(
+    mcp_server_name="weather-server",
+    mcp_tool_name="get_weather"
+)
+
+# Create agent with MCP tool attached
+agent_state = client.agents.create(
+    model="openai/gpt-4o-mini",
+    embedding="openai/text-embedding-3-small",
+    tool_ids=[tool.id]
+)
+
+# Or attach tools to an existing agent
+client.agents.tools.attach(
+    agent_id=agent_state.id,
+    tool_id=tool.id
+)
+
+# Use the agent with MCP tools
+response = client.agents.messages.create(
+    agent_id=agent_state.id,
+    messages=[
+        {
+            "role": "user",
+            "content": "Use the weather tool to check the forecast"
+        }
+    ]
+)
+```
+
+### TypeScript / Node.js
+```typescript
+// List tools from an MCP server
+const tools = await client.tools.listMcpToolsByServer("weather-server");
+
+// Add a specific tool from the MCP server
+const tool = await client.tools.addMcpTool("weather-server", "get_weather");
+
+// Create agent with MCP tool
+const agentState = await client.agents.create({
+  model: "openai/gpt-4o-mini",
+  embedding: "openai/text-embedding-3-small",
+  toolIds: [tool.id]
+});
+
+// Use the agent with MCP tools
+const response = await client.agents.messages.create(agentState.id, {
+  messages: [
+    {
+      role: "user",
+      content: "Use the weather tool to check the forecast"
+    }
+  ]
+});
+```
+</details>
+
+## Filesystem ([full guide](https://docs.letta.com/guides/agents/filesystem))
+
+Letta’s filesystem allows you to easily connect your agents to external files, for example: research papers, reports, medical records, or any other data in common text formats (`.pdf`, `.txt`, `.md`, `.json`, etc).
+Once you attach a folder to an agent, the agent will be able to use filesystem tools (`open_file`, `grep_file`, `search_file`) to browse the files and search for information.
+
+<details>
+<summary>View code snippets</summary>
+
+### Python
+```python
+import time
+
+# get an available embedding_config
+embedding_configs = client.embedding_models.list()
+embedding_config = embedding_configs[0]
+
+# create the folder
+folder = client.folders.create(
+    name="my_folder",
+    embedding_config=embedding_config
+)
+
+# upload a file into the folder
+job = client.folders.files.upload(
+    folder_id=folder.id,
+    file=open("my_file.txt", "rb")
+)
+
+# wait until the job is completed
+while True:
+    job = client.jobs.retrieve(job.id)
+    if job.status == "completed":
+        break
+    elif job.status == "failed":
+        raise ValueError(f"Job failed: {job.metadata}")
+    print(f"Job status: {job.status}")
+    time.sleep(1)
+
+# once you attach a folder to an agent, the agent can see all files in it
+client.agents.folders.attach(agent_id=agent_state.id, folder_id=folder.id)
+
+response = client.agents.messages.create(
+    agent_id=agent_state.id,
+    messages=[
+        {
+            "role": "user",
+            "content": "What data is inside of my_file.txt?"
+        }
+    ]
+)
+
+for message in response.messages:
+    print(message)
+```
+
+### TypeScript / Node.js
+```typescript
+import { createReadStream } from 'fs';
+
+// get an available embedding_config
+const embeddingConfigs = await client.embeddingModels.list()
+const embeddingConfig = embeddingConfigs[0];
+
+// create the folder
+const folder = await client.folders.create({
+  name: "my_folder",
+  embeddingConfig: embeddingConfig
+});
+
+// upload a file into the folder
+const uploadJob = await client.folders.files.upload(
+  createReadStream("my_file.txt"),
+  folder.id,
+);
+console.log("file uploaded")
+
+// wait until the job is completed
+while (true) {
+  const job = await client.jobs.retrieve(uploadJob.id);
+  if (job.status === "completed") {
+    break;
+  } else if (job.status === "failed") {
+    throw new Error(`Job failed: ${job.metadata}`);
+  }
+  console.log(`Job status: ${job.status}`);
+  await new Promise((resolve) => setTimeout(resolve, 1000));
+}
+
+// list files in the folder
+const files = await client.folders.files.list(folder.id);
+console.log(`Files in folder: ${files}`);
+
+// list passages in the folder
+const passages = await client.folders.passages.list(folder.id);
+console.log(`Passages in folder: ${passages}`);
+
+// once you attach a folder to an agent, the agent can see all files in it
+await client.agents.folders.attach(agentState.id, folder.id);
+
+const response = await client.agents.messages.create(
+  agentState.id, {
+    messages: [
+      {
+        role: "user",
+        content: "What data is inside of my_file.txt?"
+      }
+    ]
+  }
+);
+
+for (const message of response.messages) {
+  console.log(message);
+}
+```
+</details>
+
+## Long-running agents ([full guide](https://docs.letta.com/guides/agents/long-running))
+
+When agents need to execute multiple tool calls or perform complex operations (like deep research, data analysis, or multi-step workflows), processing time can vary significantly. Letta supports both a background mode (with resumable streaming) as well as an async mode (with polling) to enable robust long-running agent executions.
+
+
+<details>
+<summary>View code snippets</summary>
+
+### Python
+```python
+stream = client.agents.messages.create_stream(
+    agent_id=agent_state.id,
+    messages=[
+        {
+            "role": "user",
+            "content": "Run comprehensive analysis on this dataset"
+        }
+    ],
+    stream_tokens=True,
+    background=True,
+)
+run_id = None
+last_seq_id = None
+for chunk in stream:
+    if hasattr(chunk, "run_id") and hasattr(chunk, "seq_id"):
+        run_id = chunk.run_id  # Save this to reconnect if your connection drops
+        last_seq_id = chunk.seq_id  # Save this as your resumption point for cursor-based pagination
+    print(chunk)
+
+# If disconnected, resume from last received seq_id:
+for chunk in client.runs.stream(run_id, starting_after=last_seq_id):
+    print(chunk)
+```
+
+### TypeScript / Node.js
+```typescript
+const stream = await client.agents.messages.createStream({
+  agentId: agentState.id,
+  requestBody: {
+    messages: [
+      {
+        role: "user",
+        content: "Run comprehensive analysis on this dataset"
+      }
+    ],
+    streamTokens: true,
+    background: true,
+  }
+});
+
+let runId = null;
+let lastSeqId = null;
+for await (const chunk of stream) {
+  if (chunk.run_id && chunk.seq_id) {
+    runId = chunk.run_id; // Save this to reconnect if your connection drops
+    lastSeqId = chunk.seq_id; // Save this as your resumption point for cursor-based pagination
+  }
+  console.log(chunk);
+}
+
+// If disconnected, resume from last received seq_id
+for await (const chunk of client.runs.stream(runId, {startingAfter: lastSeqId})) {
+  console.log(chunk);
+}
+```
+</details>
+
+## Using local models
+
+Letta is model agnostic and supports using local model providers such as [Ollama](https://docs.letta.com/guides/server/providers/ollama) and [LM Studio](https://docs.letta.com/guides/server/providers/lmstudio). You can also easily swap models inside an agent after the agent has been created, by modifying the agent state with the new model provider via the SDK or in the ADE.
+
+## Development (only needed if you need to modify the server code)
+
+*Note: this repository contains the source code for the core Letta service (API server), not the client SDKs. The client SDKs can be found here: [Python](https://github.com/letta-ai/letta-python), [TypeScript](https://github.com/letta-ai/letta-node).*
+
+To install the Letta server from source, fork the repo, clone your fork, then use [uv](https://docs.astral.sh/uv/getting-started/installation/) to install from inside the main directory:
+```sh
+cd letta
+uv sync --all-extras
+```
+
+To run the Letta server from source, use `uv run`:
+```sh
+uv run letta server
+```
+
+## Contributing
+
+Letta is an open source project built by over a hundred contributors. There are many ways to get involved in the Letta OSS project!
+
+* [**Join the Discord**](https://discord.gg/letta): Chat with the Letta devs and other AI developers.
+* [**Chat on our forum**](https://forum.letta.com/): If you're not into Discord, check out our developer forum.
+* **Follow our socials**: [Twitter/X](https://twitter.com/Letta_AI), [LinkedIn](https://www.linkedin.com/in/letta), [YouTube](https://www.youtube.com/@letta-ai)
+
+---
+
+***Legal notices**: By using Letta and related Letta services (such as the Letta endpoint or hosted service), you are agreeing to our [privacy policy](https://www.letta.com/privacy-policy) and [terms of service](https://www.letta.com/terms-of-service).*
@@ -460,8 +460,8 @@ letta/templates/sandbox_code_file_async.py.j2,sha256=lb7nh_P2W9VZHzU_9TxSCEMUod7
 letta/templates/summary_request_text.j2,sha256=ZttQwXonW2lk4pJLYzLK0pmo4EO4EtUUIXjgXKiizuc,842
 letta/templates/template_helper.py,sha256=HkG3zwRc5NVGmSTQu5PUTpz7LevK43bzXVaQuN8urf0,1634
 letta/types/__init__.py,sha256=hokKjCVFGEfR7SLMrtZsRsBfsC7yTIbgKPLdGg4K1eY,147
-letta_nightly-0.11.6.
-letta_nightly-0.11.6.
-letta_nightly-0.11.6.
-letta_nightly-0.11.6.
-letta_nightly-0.11.6.
+letta_nightly-0.11.6.dev20250828104133.dist-info/METADATA,sha256=Z0CrxUuAAxQA_U7o5uOmT91nx0yluQ4CsZyup2vQLJI,24361
+letta_nightly-0.11.6.dev20250828104133.dist-info/WHEEL,sha256=qtCwoSJWgHk21S1Kb4ihdzI2rlJ1ZKaIurTj_ngOhyQ,87
+letta_nightly-0.11.6.dev20250828104133.dist-info/entry_points.txt,sha256=m-94Paj-kxiR6Ktu0us0_2qfhn29DzF2oVzqBE6cu8w,41
+letta_nightly-0.11.6.dev20250828104133.dist-info/licenses/LICENSE,sha256=mExtuZ_GYJgDEI38GWdiEYZizZS4KkVt2SF1g_GPNhI,10759
+letta_nightly-0.11.6.dev20250828104133.dist-info/RECORD,,
@@ -1,422 +0,0 @@
-Metadata-Version: 2.4
-Name: letta-nightly
-Version: 0.11.6.dev20250827050912
-Summary: Create LLM agents with long-term memory and custom tools
-Author-email: Letta Team <contact@letta.com>
-License: Apache License
-License-File: LICENSE
-Requires-Python: <3.14,>=3.11
-Requires-Dist: aiomultiprocess>=0.9.1
-Requires-Dist: alembic>=1.13.3
-Requires-Dist: anthropic>=0.49.0
-Requires-Dist: apscheduler>=3.11.0
-Requires-Dist: black[jupyter]>=24.2.0
-Requires-Dist: brotli>=1.1.0
-Requires-Dist: certifi>=2025.6.15
-Requires-Dist: colorama>=0.4.6
-Requires-Dist: composio-core>=0.7.7
-Requires-Dist: datamodel-code-generator[http]>=0.25.0
-Requires-Dist: demjson3>=3.0.6
-Requires-Dist: docstring-parser<0.17,>=0.16
-Requires-Dist: faker>=36.1.0
-Requires-Dist: firecrawl-py<3.0.0,>=2.8.0
-Requires-Dist: grpcio-tools>=1.68.1
-Requires-Dist: grpcio>=1.68.1
-Requires-Dist: html2text>=2020.1.16
-Requires-Dist: httpx-sse>=0.4.0
-Requires-Dist: httpx>=0.28.0
-Requires-Dist: jinja2>=3.1.5
-Requires-Dist: letta-client>=0.1.277
-Requires-Dist: llama-index-embeddings-openai>=0.3.1
-Requires-Dist: llama-index>=0.12.2
-Requires-Dist: markitdown[docx,pdf,pptx]>=0.1.2
-Requires-Dist: marshmallow-sqlalchemy>=1.4.1
-Requires-Dist: matplotlib>=3.10.1
-Requires-Dist: mcp[cli]>=1.9.4
-Requires-Dist: mistralai>=1.8.1
-Requires-Dist: nltk>=3.8.1
-Requires-Dist: numpy>=2.1.0
-Requires-Dist: openai>=1.99.9
-Requires-Dist: opentelemetry-api==1.30.0
-Requires-Dist: opentelemetry-exporter-otlp==1.30.0
-Requires-Dist: opentelemetry-instrumentation-requests==0.51b0
-Requires-Dist: opentelemetry-instrumentation-sqlalchemy==0.51b0
-Requires-Dist: opentelemetry-sdk==1.30.0
-Requires-Dist: orjson>=3.11.1
-Requires-Dist: pathvalidate>=3.2.1
-Requires-Dist: prettytable>=3.9.0
-Requires-Dist: pydantic-settings>=2.2.1
-Requires-Dist: pydantic>=2.10.6
-Requires-Dist: pyhumps>=3.8.0
-Requires-Dist: python-box>=7.1.1
-Requires-Dist: python-multipart>=0.0.19
-Requires-Dist: pytz>=2023.3.post1
-Requires-Dist: pyyaml>=6.0.1
-Requires-Dist: questionary>=2.0.1
-Requires-Dist: rich>=13.9.4
-Requires-Dist: sentry-sdk[fastapi]==2.19.1
-Requires-Dist: setuptools>=70
-Requires-Dist: sqlalchemy-json>=0.7.0
-Requires-Dist: sqlalchemy-utils>=0.41.2
-Requires-Dist: sqlalchemy[asyncio]>=2.0.41
-Requires-Dist: sqlmodel>=0.0.16
-Requires-Dist: structlog>=25.4.0
-Requires-Dist: tavily-python>=0.7.2
-Requires-Dist: tqdm>=4.66.1
-Requires-Dist: typer>=0.15.2
-Provides-Extra: bedrock
-Requires-Dist: aioboto3>=14.3.0; extra == 'bedrock'
-Requires-Dist: boto3>=1.36.24; extra == 'bedrock'
-Provides-Extra: cloud-tool-sandbox
-Requires-Dist: e2b-code-interpreter>=1.0.3; extra == 'cloud-tool-sandbox'
-Provides-Extra: desktop
-Requires-Dist: aiosqlite>=0.21.0; extra == 'desktop'
-Requires-Dist: docker>=7.1.0; extra == 'desktop'
-Requires-Dist: fastapi>=0.115.6; extra == 'desktop'
-Requires-Dist: langchain-community>=0.3.7; extra == 'desktop'
-Requires-Dist: langchain>=0.3.7; extra == 'desktop'
-Requires-Dist: locust>=2.31.5; extra == 'desktop'
-Requires-Dist: pgvector>=0.2.3; extra == 'desktop'
-Requires-Dist: sqlite-vec>=0.1.7a2; extra == 'desktop'
-Requires-Dist: uvicorn>=0.24.0.post1; extra == 'desktop'
-Requires-Dist: websockets; extra == 'desktop'
-Requires-Dist: wikipedia>=1.4.0; extra == 'desktop'
-Provides-Extra: dev
-Requires-Dist: autoflake>=2.3.0; extra == 'dev'
-Requires-Dist: black[jupyter]>=24.4.2; extra == 'dev'
-Requires-Dist: ipdb>=0.13.13; extra == 'dev'
-Requires-Dist: ipykernel>=6.29.5; extra == 'dev'
-Requires-Dist: isort>=5.13.2; extra == 'dev'
-Requires-Dist: pexpect>=4.9.0; extra == 'dev'
-Requires-Dist: pre-commit>=3.5.0; extra == 'dev'
-Requires-Dist: pyright>=1.1.347; extra == 'dev'
-Requires-Dist: pytest; extra == 'dev'
-Requires-Dist: pytest-asyncio>=0.24.0; extra == 'dev'
-Requires-Dist: pytest-json-report>=1.5.0; extra == 'dev'
-Requires-Dist: pytest-mock>=3.14.0; extra == 'dev'
-Requires-Dist: pytest-order>=1.2.0; extra == 'dev'
-Provides-Extra: experimental
-Requires-Dist: google-cloud-profiler>=4.1.0; extra == 'experimental'
-Requires-Dist: granian[reload,uvloop]>=2.3.2; extra == 'experimental'
-Requires-Dist: uvloop>=0.21.0; extra == 'experimental'
-Provides-Extra: external-tools
-Requires-Dist: docker>=7.1.0; extra == 'external-tools'
-Requires-Dist: firecrawl-py<3.0.0,>=2.8.0; extra == 'external-tools'
-Requires-Dist: langchain-community>=0.3.7; extra == 'external-tools'
-Requires-Dist: langchain>=0.3.7; extra == 'external-tools'
-Requires-Dist: turbopuffer>=0.5.17; extra == 'external-tools'
-Requires-Dist: wikipedia>=1.4.0; extra == 'external-tools'
-Provides-Extra: google
-Requires-Dist: google-genai>=1.15.0; extra == 'google'
-Provides-Extra: modal
-Requires-Dist: modal>=1.1.0; extra == 'modal'
-Provides-Extra: pinecone
-Requires-Dist: pinecone[asyncio]>=7.3.0; extra == 'pinecone'
-Provides-Extra: postgres
-Requires-Dist: asyncpg>=0.30.0; extra == 'postgres'
-Requires-Dist: pg8000>=1.30.3; extra == 'postgres'
-Requires-Dist: pgvector>=0.2.3; extra == 'postgres'
-Requires-Dist: psycopg2-binary>=2.9.10; extra == 'postgres'
-Requires-Dist: psycopg2>=2.9.10; extra == 'postgres'
-Provides-Extra: redis
-Requires-Dist: redis>=6.2.0; extra == 'redis'
-Provides-Extra: server
-Requires-Dist: fastapi>=0.115.6; extra == 'server'
-Requires-Dist: uvicorn>=0.24.0.post1; extra == 'server'
-Requires-Dist: websockets; extra == 'server'
-Provides-Extra: sqlite
-Requires-Dist: aiosqlite>=0.21.0; extra == 'sqlite'
-Requires-Dist: sqlite-vec>=0.1.7a2; extra == 'sqlite'
-Description-Content-Type: text/markdown
-
-<p align="center">
-<picture>
-<source media="(prefers-color-scheme: dark)" srcset="https://raw.githubusercontent.com/letta-ai/letta/refs/heads/main/assets/Letta-logo-RGB_GreyonTransparent_cropped_small.png">
-<source media="(prefers-color-scheme: light)" srcset="https://raw.githubusercontent.com/letta-ai/letta/refs/heads/main/assets/Letta-logo-RGB_OffBlackonTransparent_cropped_small.png">
-<img alt="Letta logo" src="https://raw.githubusercontent.com/letta-ai/letta/refs/heads/main/assets/Letta-logo-RGB_GreyonOffBlack_cropped_small.png" width="500">
-</picture>
-</p>
-
-<div align="center">
-<h1>Letta (previously MemGPT)</h1>
-<h3>
-
-[Homepage](https://letta.com) // [Documentation](https://docs.letta.com) // [ADE](https://docs.letta.com/agent-development-environment) // [Letta Cloud](https://forms.letta.com/early-access)
-
-</h3>
-
-**👾 Letta** is an open source framework for building **stateful agents** with advanced reasoning capabilities and transparent long-term memory. The Letta framework is white box and model-agnostic.
-
-[](https://discord.gg/letta)
-[](https://twitter.com/Letta_AI)
-[](https://arxiv.org/abs/2310.08560)
-
-[](LICENSE)
-[](https://github.com/cpacker/MemGPT/releases)
-[](https://hub.docker.com/r/letta/letta)
-[](https://github.com/cpacker/MemGPT)
-
-<a href="https://trendshift.io/repositories/3612" target="_blank"><img src="https://trendshift.io/api/badge/repositories/3612" alt="cpacker%2FMemGPT | Trendshift" style="width: 250px; height: 55px;" width="250" height="55"/></a>
-
-</div>
-
-> [!IMPORTANT]
-> **Looking for MemGPT?** You're in the right place!
->
-> The MemGPT package and Docker image have been renamed to `letta` to clarify the distinction between MemGPT *agents* and the Letta API *server* / *runtime* that runs LLM agents as *services*. Read more about the relationship between MemGPT and Letta [here](https://www.letta.com/blog/memgpt-and-letta).
-
----
-
-## ⚡ Quickstart
-
-_The recommended way to use Letta is to use Docker. To install Docker, see [Docker's installation guide](https://docs.docker.com/get-docker/). For issues with installing Docker, see [Docker's troubleshooting guide](https://docs.docker.com/desktop/troubleshoot-and-support/troubleshoot/). You can also install Letta using `pip` (see instructions [below](#-quickstart-pip))._
-
-### 🌖 Run the Letta server
-
-> [!NOTE]
-> Letta agents live inside the Letta server, which persists them to a database. You can interact with the Letta agents inside your Letta server via the [REST API](https://docs.letta.com/api-reference) + Python / Typescript SDKs, and the [Agent Development Environment](https://app.letta.com) (a graphical interface).
-
-The Letta server can be connected to various LLM API backends ([OpenAI](https://docs.letta.com/models/openai), [Anthropic](https://docs.letta.com/models/anthropic), [vLLM](https://docs.letta.com/models/vllm), [Ollama](https://docs.letta.com/models/ollama), etc.). To enable access to these LLM API providers, set the appropriate environment variables when you use `docker run`:
-```sh
-# replace `~/.letta/.persist/pgdata` with wherever you want to store your agent data
-docker run \
-  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
-  -p 8283:8283 \
-  -e OPENAI_API_KEY="your_openai_api_key" \
-  letta/letta:latest
-```
-
-If you have many different LLM API keys, you can also set up a `.env` file instead and pass that to `docker run`:
-```sh
-# using a .env file instead of passing environment variables
-docker run \
-  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
-  -p 8283:8283 \
-  --env-file .env \
-  letta/letta:latest
-```
-
-Once the Letta server is running, you can access it via port `8283` (e.g. sending REST API requests to `http://localhost:8283/v1`). You can also connect your server to the Letta ADE to access and manage your agents in a web interface.
-
-### 👾 Access the ADE (Agent Development Environment)
-
-> [!NOTE]
-> For a guided tour of the ADE, watch our [ADE walkthrough on YouTube](https://www.youtube.com/watch?v=OzSCFR0Lp5s), or read our [blog post](https://www.letta.com/blog/introducing-the-agent-development-environment) and [developer docs](https://docs.letta.com/agent-development-environment).
-
-The Letta ADE is a graphical user interface for creating, deploying, interacting with, and observing your Letta agents. For example, if you're running a Letta server to power an end-user application (such as a customer support chatbot), you can use the ADE to test, debug, and observe the agents in your server. You can also use the ADE as a general chat interface to interact with your Letta agents.
-
-<p align="center">
-<picture>
-<source media="(prefers-color-scheme: dark)" srcset="https://raw.githubusercontent.com/letta-ai/letta/refs/heads/main/assets/example_ade_screenshot.png">
-<source media="(prefers-color-scheme: light)" srcset="https://raw.githubusercontent.com/letta-ai/letta/refs/heads/main/assets/example_ade_screenshot_light.png">
-<img alt="ADE screenshot" src="https://raw.githubusercontent.com/letta-ai/letta/refs/heads/main/assets/example_ade_screenshot.png" width="800">
-</picture>
-</p>
-
-The ADE can connect to self-hosted Letta servers (e.g. a Letta server running on your laptop), as well as the Letta Cloud service. When connected to a self-hosted / private server, the ADE uses the Letta REST API to communicate with your server.
-
-#### 🖥️ Connecting the ADE to your local Letta server
-To connect the ADE with your local Letta server, simply:
-1. Start your Letta server (`docker run ...`)
-2. Visit [https://app.letta.com](https://app.letta.com) and you will see "Local server" as an option in the left panel
-
-<p align="center">
-<picture>
-<source media="(prefers-color-scheme: dark)" srcset="https://raw.githubusercontent.com/letta-ai/letta/refs/heads/main/assets/example_ade_screenshot_agents.png">
-<source media="(prefers-color-scheme: light)" srcset="https://raw.githubusercontent.com/letta-ai/letta/refs/heads/main/assets/example_ade_screenshot_agents_light.png">
-<img alt="Letta logo" src="https://raw.githubusercontent.com/letta-ai/letta/refs/heads/main/assets/example_ade_screenshot_agents.png" width="800">
-</picture>
-</p>
-
-🔐 To password protect your server, include `SECURE=true` and `LETTA_SERVER_PASSWORD=yourpassword` in your `docker run` command:
-```sh
-# If LETTA_SERVER_PASSWORD isn't set, the server will autogenerate a password
-docker run \
-  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
-  -p 8283:8283 \
-  --env-file .env \
-  -e SECURE=true \
-  -e LETTA_SERVER_PASSWORD=yourpassword \
-  letta/letta:latest
-```
-
-#### 🌐 Connecting the ADE to an external (self-hosted) Letta server
-If your Letta server isn't running on `localhost` (for example, you deployed it on an external service like EC2):
-1. Click "Add remote server"
-2. Enter your desired server name, the IP address of the server, and the server password (if set)
-
----
-
-## 🧑‍🚀 Frequently asked questions (FAQ)
-
-> _"Do I need to install Docker to use Letta?"_
-
-No, you can install Letta using `pip` (via `pip install -U letta`), as well as from source (via `uv sync`). See instructions below.
-
-> _"What's the difference between installing with `pip` vs `Docker`?"_
-
-Letta gives your agents persistence (they live indefinitely) by storing all your agent data in a database. Letta is designed to be used with [PostgreSQL](https://en.wikipedia.org/wiki/PostgreSQL) (the world's most popular database); however, it is not possible to install PostgreSQL via `pip`, so the `pip` install of Letta defaults to using [SQLite](https://www.sqlite.org/). If you have a PostgreSQL instance running on your own computer, you can still connect Letta (installed via `pip`) to PostgreSQL by setting the environment variable `LETTA_PG_URI`.
-
-**Database migrations are not officially supported for Letta when using SQLite**, so if you would like to ensure that you're able to upgrade to the latest Letta version and migrate your Letta agents data, make sure that you're using PostgreSQL as your Letta database backend. Full compatibility table below:
-
-| Installation method | Start server command | Database backend | Data migrations supported? |
-|---|---|---|---|
-| `pip install letta` | `letta server` | SQLite | ❌ |
-| `pip install letta` | `export LETTA_PG_URI=...` + `letta server` | PostgreSQL | ✅ |
-| *[Install Docker](https://www.docker.com/get-started/)* | `docker run ...` ([full command](#-run-the-letta-server)) | PostgreSQL | ✅ |
-
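The backend-selection rule the table above describes can be sketched as a small predicate. This is an illustrative model of the documented behavior (SQLite by default, PostgreSQL when `LETTA_PG_URI` is set), not Letta's actual implementation:

```python
# Illustrative model of the documented rule: a pip install of Letta uses
# SQLite unless LETTA_PG_URI points at a PostgreSQL instance. This is a
# sketch of the behavior described above, not Letta's real code.
def pick_backend(env: dict) -> str:
    uri = env.get("LETTA_PG_URI", "")
    return "postgresql" if uri.startswith(("postgresql://", "postgres://")) else "sqlite"

print(pick_backend({}))  # sqlite
print(pick_backend({"LETTA_PG_URI": "postgresql://letta:pw@localhost:5432/letta"}))  # postgresql
```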
-> _"How do I use the ADE locally?"_
-
-To connect the ADE to your local Letta server, simply run your Letta server (make sure you can access `localhost:8283`) and go to [https://app.letta.com](https://app.letta.com). If you would like to use the old version of the ADE (that runs on `localhost`), downgrade to Letta version `<=0.5.0`.
-
-> _"If I connect the ADE to my local server, does my agent data get uploaded to letta.com?"_
-
-No, the data in your Letta server database stays on your machine. The Letta ADE web application simply connects to your local Letta server (via the REST API) and provides a graphical interface on top of it to visualize your local Letta data in your browser's local state.
-
-> _"Do I have to use your ADE? Can I build my own?"_
-
-The ADE is built on top of the (fully open source) Letta server and Letta Agents API. You can build your own application like the ADE on top of the REST API (view the documentation [here](https://docs.letta.com/api-reference)).
-
-> _"Can I interact with Letta agents via the CLI?"_
-
-The recommended way to use Letta is via the REST API and ADE; however, you can also access your agents via the CLI.
-
-<details>
-<summary>View instructions for running the Letta CLI</summary>
-
-You can chat with your agents via the Letta CLI tool (`letta run`). If you have a Letta Docker container running, you can use `docker exec` to run the Letta CLI inside the container:
-```sh
-# replace `<letta_container_id>` with the ID of your Letta container, found via `docker ps`
-docker exec -it <letta_container_id> letta run
-```
-
-You can also use `docker ps` within the command to automatically find the ID of your Letta container:
-```
-docker exec -it $(docker ps -q -f ancestor=letta/letta) letta run
-```
-
-In the CLI tool, you'll be able to create new agents, or load existing agents:
-```
-🧬 Creating new agent...
-? Select LLM model: letta-free [type=openai] [ip=https://inference.letta.com]
-? Select embedding model: letta-free [type=hugging-face] [ip=https://embeddings.letta.com]
--> 🤖 Using persona profile: 'sam_pov'
--> 🧑 Using human profile: 'basic'
--> 🛠️ 8 tools: send_message, pause_heartbeats, conversation_search, conversation_search_date, archival_memory_insert, archival_memory_search, core_memory_append, core_memory_replace
-
-🎉 Created new agent 'InspiringSpinach'
-
-Hit enter to begin (will request first Letta message)
-
-💭 User engaged. Recognize as Chad. Acknowledge presence and establish connection. Explore interests and engage in meaningful discourse.
-🤖 Hello, Chad. It's nice to meet you. I am keen on learning and exploring new ideas and concepts. Is there a specific topic you'd like to discuss today?
-
-> Enter your message: my name is Brad, not Chad...
-
-💭 A discrepancy in identity. Reflect this change in core memory. Apologize for the oversight, and rectify the error.
-⚡🧠 [function] updating memory with core_memory_replace
-         First name: Chad
-        → First name: Brad
-💭 Apologize for the mistake in recognizing his name, and begin the conversation anew, addressing him by the correct name this time.
-🤖 I apologize for the oversight, Brad. Let's restart. It's a pleasure to meet you, Brad. Is there a specific topic you'd like to discuss today?
-
-> Enter your message:
-```
-</details>
-
----
-
-## ⚡ Quickstart (pip)
-
-> [!WARNING]
-> **Database migrations are not officially supported with `SQLite`**
->
-> When you install Letta with `pip`, the default database backend is `SQLite` (you can still use an external `postgres` service with your `pip` install of Letta by setting `LETTA_PG_URI`).
->
-> We do not officially support migrations between Letta versions with `SQLite` backends, only `postgres`. If you would like to keep your agent data across multiple Letta versions we highly recommend using the Docker install method which is the easiest way to use `postgres` with Letta.
-
-<details>
-
-<summary>View instructions for installing with pip</summary>
-
-You can also install Letta with `pip`, which will default to using `SQLite` for the database backend (whereas Docker will default to using `postgres`).
-
-### Step 1 - Install Letta using `pip`
-```sh
-pip install -U letta
-```
-
-### Step 2 - Set your environment variables for your chosen LLM / embedding providers
-```sh
-export OPENAI_API_KEY=sk-...
-```
-
-For Ollama (see our full [documentation](https://docs.letta.com/install) for examples of how to set up various providers):
-```sh
-export OLLAMA_BASE_URL=http://localhost:11434
-```
-
-### Step 3 - Run the Letta CLI
-
-You can create agents and chat with them via the Letta CLI tool (`letta run`):
-```sh
-letta run
-```
-```
-🧬 Creating new agent...
-? Select LLM model: letta-free [type=openai] [ip=https://inference.letta.com]
-? Select embedding model: letta-free [type=hugging-face] [ip=https://embeddings.letta.com]
--> 🤖 Using persona profile: 'sam_pov'
--> 🧑 Using human profile: 'basic'
--> 🛠️ 8 tools: send_message, pause_heartbeats, conversation_search, conversation_search_date, archival_memory_insert, archival_memory_search, core_memory_append, core_memory_replace
-
-🎉 Created new agent 'InspiringSpinach'
-
-Hit enter to begin (will request first Letta message)
-
-💭 User engaged. Recognize as Chad. Acknowledge presence and establish connection. Explore interests and engage in meaningful discourse.
-🤖 Hello, Chad. It's nice to meet you. I am keen on learning and exploring new ideas and concepts. Is there a specific topic you'd like to discuss today?
-
-> Enter your message: my name is Brad, not Chad...
-
-💭 A discrepancy in identity. Reflect this change in core memory. Apologize for the oversight, and rectify the error.
-⚡🧠 [function] updating memory with core_memory_replace
-         First name: Chad
-        → First name: Brad
-💭 Apologize for the mistake in recognizing his name, and begin the conversation anew, addressing him by the correct name this time.
-🤖 I apologize for the oversight, Brad. Let's restart. It's a pleasure to meet you, Brad. Is there a specific topic you'd like to discuss today?
-
-> Enter your message:
-```
-
-### Step 4 - Run the Letta server
-
-You can start the Letta API server with `letta server` (see the full API reference [here](https://docs.letta.com/api-reference)):
-```sh
-letta server
-```
-```
-Initializing database...
-Running: uvicorn server:app --host localhost --port 8283
-INFO:     Started server process [47750]
-INFO:     Waiting for application startup.
-INFO:     Application startup complete.
-INFO:     Uvicorn running on http://localhost:8283 (Press CTRL+C to quit)
-```
-</details>
-
----
-
-## 🤗 How to contribute
-
-Letta is an open source project built by over a hundred contributors. There are many ways to get involved in the Letta OSS project!
-
-* **Contribute to the project**: Interested in contributing? Start by reading our [Contribution Guidelines](https://github.com/cpacker/MemGPT/tree/main/CONTRIBUTING.md).
-* **Ask a question**: Join our community on [Discord](https://discord.gg/letta) and direct your questions to the `#support` channel.
-* **Report issues or suggest features**: Have an issue or a feature request? Please submit them through our [GitHub Issues page](https://github.com/cpacker/MemGPT/issues).
-* **Explore the roadmap**: Curious about future developments? View and comment on our [project roadmap](https://github.com/cpacker/MemGPT/issues/1533).
-* **Join community events**: Stay updated with the [event calendar](https://lu.ma/berkeley-llm-meetup) or follow our [Twitter account](https://twitter.com/Letta_AI).
-
----
-
-***Legal notices**: By using Letta and related Letta services (such as the Letta endpoint or hosted service), you are agreeing to our [privacy policy](https://www.letta.com/privacy-policy) and [terms of service](https://www.letta.com/terms-of-service).*
File without changes
File without changes