memorylayer-langchain 0.0.1 (tar.gz)

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -0,0 +1,51 @@
+ # Python
+ __pycache__/
+ *.py[cod]
+ *$py.class
+ *.so
+ .Python
+ build/
+ develop-eggs/
+ dist/
+ downloads/
+ eggs/
+ .eggs/
+ lib/
+ lib64/
+ parts/
+ sdist/
+ var/
+ wheels/
+ *.egg-info/
+ .installed.cfg
+ *.egg
+ MANIFEST
+
+ # Virtual environments
+ venv/
+ env/
+ ENV/
+ .venv
+
+ # IDE
+ .vscode/
+ .idea/
+ *.swp
+ *.swo
+ *~
+
+ # Testing
+ .pytest_cache/
+ .coverage
+ htmlcov/
+ .tox/
+ .hypothesis/
+
+ # Type checking
+ .mypy_cache/
+ .dmypy.json
+ dmypy.json
+
+ # Distribution
+ *.tar.gz
+ *.whl
@@ -0,0 +1,21 @@
+ MIT License
+
+ Copyright (c) 2026 Scitrera
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in all
+ copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ SOFTWARE.
@@ -0,0 +1,463 @@
+ Metadata-Version: 2.4
+ Name: memorylayer-langchain
+ Version: 0.0.1
+ Summary: LangChain memory integration for MemoryLayer.ai - persistent memory for AI agents
+ Project-URL: Homepage, https://memorylayer.ai
+ Project-URL: Documentation, https://docs.memorylayer.ai
+ Project-URL: Repository, https://github.com/scitrera/memorylayer
+ Project-URL: Issues, https://github.com/scitrera/memorylayer/issues
+ Author-email: Scitrera <open-source-team@scitrera.com>
+ License: Apache 2.0
+ License-File: LICENSE
+ Keywords: agents,ai,langchain,llm,memory,memorylayer
+ Classifier: Development Status :: 3 - Alpha
+ Classifier: Intended Audience :: Developers
+ Classifier: License :: OSI Approved :: Apache Software License
+ Classifier: Programming Language :: Python :: 3
+ Classifier: Programming Language :: Python :: 3.12
+ Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
+ Requires-Python: >=3.12
+ Requires-Dist: httpx>=0.24.0
+ Requires-Dist: langchain-classic>=1.0.0
+ Requires-Dist: langchain-core>=0.3.0
+ Requires-Dist: memorylayer-client>=0.0.1
+ Requires-Dist: pydantic>=2.0.0
+ Provides-Extra: dev
+ Requires-Dist: mypy>=1.8.0; extra == 'dev'
+ Requires-Dist: pytest-asyncio>=0.23.0; extra == 'dev'
+ Requires-Dist: pytest>=8.0.0; extra == 'dev'
+ Requires-Dist: respx>=0.20.0; extra == 'dev'
+ Requires-Dist: ruff>=0.1.0; extra == 'dev'
+ Description-Content-Type: text/markdown
+
+ # MemoryLayer LangChain Integration
+
+ LangChain memory integration for [MemoryLayer.ai](https://memorylayer.ai) - persistent memory for AI agents.
+
+ ## Installation
+
+ ```bash
+ pip install memorylayer-langchain
+ ```
+
+ ## Overview
+
+ This package provides LangChain-compatible memory classes that use MemoryLayer as the backend, giving you:
+
+ - **Persistent Memory** - Memory survives across agent runs and application restarts
+ - **Stable API** - Consistent interface regardless of LangChain version changes
+ - **Rich Memory Types** - Semantic, episodic, procedural, and working memory support
+ - **Drop-in Replacement** - Works with LCEL and legacy chains
+ - **Session Isolation** - Multiple conversations tracked independently
+
+ ## Quick Start
+
+ ### LCEL Chains (Recommended)
+
+ ```python
+ from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
+ from langchain_core.runnables.history import RunnableWithMessageHistory
+ from langchain_openai import ChatOpenAI
+
+ from memorylayer_langchain import MemoryLayerChatMessageHistory
+
+ # Create a chat chain with message history
+ prompt = ChatPromptTemplate.from_messages([
+     ("system", "You are a helpful assistant."),
+     MessagesPlaceholder(variable_name="history"),
+     ("human", "{input}"),
+ ])
+
+ llm = ChatOpenAI(model="gpt-4")
+ chain = prompt | llm
+
+ # Wrap with persistent history
+ chain_with_history = RunnableWithMessageHistory(
+     runnable=chain,
+     get_session_history=lambda session_id: MemoryLayerChatMessageHistory(
+         session_id=session_id,
+         base_url="https://api.memorylayer.ai",
+         api_key="your-api-key",
+         workspace_id="ws_123",
+     ),
+     input_messages_key="input",
+     history_messages_key="history",
+ )
+
+ # Use with any session - history persists automatically
+ response = chain_with_history.invoke(
+     {"input": "Hello! My name is Alice."},
+     config={"configurable": {"session_id": "user_alice_session_1"}},
+ )
+
+ # Later, even after restart, Alice's context is remembered
+ response = chain_with_history.invoke(
+     {"input": "What's my name?"},
+     config={"configurable": {"session_id": "user_alice_session_1"}},
+ )
+ ```
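Note that `RunnableWithMessageHistory` calls `get_session_history` on every invocation, so the lambda above constructs a fresh `MemoryLayerChatMessageHistory` each time. If you would rather reuse one history object per session, you can memoize the factory. A minimal sketch of that pattern, where `SessionHistory` is a stand-in for the real class (which would be constructed with the same `base_url`/`api_key`/`workspace_id` arguments shown above):

```python
from functools import lru_cache


class SessionHistory:
    """Stand-in for MemoryLayerChatMessageHistory (illustration only)."""

    def __init__(self, session_id: str):
        self.session_id = session_id


@lru_cache(maxsize=256)
def get_session_history(session_id: str) -> SessionHistory:
    # lru_cache keys on session_id, so each session gets exactly
    # one history instance per process.
    return SessionHistory(session_id)


assert get_session_history("alice") is get_session_history("alice")
assert get_session_history("alice") is not get_session_history("bob")
```

You would then pass `get_session_history` itself (instead of the lambda) to `RunnableWithMessageHistory`.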
+
+ ### Legacy Chains
+
+ ```python
+ from langchain.chains import ConversationChain
+ from langchain_openai import ChatOpenAI
+
+ from memorylayer_langchain import MemoryLayerMemory
+
+ # Create persistent memory
+ memory = MemoryLayerMemory(
+     session_id="customer_support_session_1",
+     base_url="https://api.memorylayer.ai",
+     api_key="your-api-key",
+     workspace_id="ws_123",
+ )
+
+ # Use with ConversationChain
+ llm = ChatOpenAI(model="gpt-4")
+ chain = ConversationChain(llm=llm, memory=memory)
+
+ # Conversation persists across chain invocations and restarts
+ chain.run("Hi, I need help with my order #12345")
+ chain.run("It hasn't arrived yet")
+ ```
+
+ ## Features
+
+ ### MemoryLayerChatMessageHistory
+
+ Drop-in replacement for LangChain chat history, designed for LCEL chains.
+
+ ```python
+ from memorylayer_langchain import MemoryLayerChatMessageHistory
+
+ history = MemoryLayerChatMessageHistory(
+     session_id="conversation_1",
+     base_url="https://api.memorylayer.ai",
+     api_key="your-api-key",
+     workspace_id="ws_123",
+ )
+
+ # Add messages
+ history.add_user_message("Hello!")
+ history.add_ai_message("Hi there! How can I help?")
+
+ # Retrieve all messages
+ messages = history.messages
+ for msg in messages:
+     print(f"{msg.type}: {msg.content}")
+
+ # Clear history
+ history.clear()
+ ```
+
+ ### MemoryLayerMemory
+
+ LangChain BaseMemory implementation for legacy chains. Drop-in replacement for `ConversationBufferMemory`.
+
+ ```python
+ from memorylayer_langchain import MemoryLayerMemory
+
+ memory = MemoryLayerMemory(
+     session_id="user_123_conversation",
+     base_url="https://api.memorylayer.ai",
+     api_key="your-api-key",
+     workspace_id="ws_123",
+     return_messages=False,  # True for message objects
+     human_prefix="User",
+     ai_prefix="Assistant",
+ )
+
+ # Save conversation turn
+ memory.save_context(
+     inputs={"input": "What's Python?"},
+     outputs={"output": "Python is a programming language."},
+ )
+
+ # Load memory variables
+ history = memory.load_memory_variables({})
+ print(history["history"])
+ # Output: User: What's Python?
+ #         Assistant: Python is a programming language.
+ ```
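With `return_messages=False`, `load_memory_variables` returns the transcript as a single prefix-formatted string, as the output above shows. The rendering convention is roughly the following (a sketch of the assumed behavior, not the package's actual code):

```python
def render_history(turns, human_prefix="User", ai_prefix="Assistant"):
    # turns: chronological (role, text) pairs, where role is "human" or "ai"
    lines = []
    for role, text in turns:
        prefix = human_prefix if role == "human" else ai_prefix
        lines.append(f"{prefix}: {text}")
    return "\n".join(lines)


print(render_history([
    ("human", "What's Python?"),
    ("ai", "Python is a programming language."),
]))
# User: What's Python?
# Assistant: Python is a programming language.
```

With `return_messages=True` you instead receive message objects and skip this string rendering entirely.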
+
+ ### MemoryLayerConversationSummaryMemory
+
+ Returns AI-generated summaries instead of full conversation history. Useful for long conversations that would exceed context windows.
+
+ ```python
+ from memorylayer_langchain import MemoryLayerConversationSummaryMemory
+
+ memory = MemoryLayerConversationSummaryMemory(
+     session_id="long_conversation",
+     base_url="https://api.memorylayer.ai",
+     api_key="your-api-key",
+     workspace_id="ws_123",
+     max_tokens=500,
+     summary_prompt="Summarize the key points from this conversation.",
+ )
+
+ # After many conversation turns...
+ summary = memory.load_memory_variables({})
+ print(summary["history"])  # Concise AI-generated summary
+ ```
+
+ ## Configuration Options
+
+ ### Common Parameters
+
+ | Parameter | Description | Default |
+ |-----------|-------------|---------|
+ | `session_id` | Unique identifier for the conversation session | Required |
+ | `base_url` | MemoryLayer API base URL | `http://localhost:8080` |
+ | `api_key` | API key for authentication | `None` |
+ | `workspace_id` | Workspace ID for multi-tenant isolation | `None` |
+ | `timeout` | Request timeout in seconds | `30.0` |
+ | `memory_tags` | Additional tags for stored memories | `[]` |
+
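The defaults above suit local development, but production credentials are better supplied via the environment than hard-coded as in the examples. A small sketch; the environment variable names here are arbitrary choices for illustration, not names the package reads on its own:

```python
import os

# Hypothetical environment variable names; adapt to your deployment.
config = {
    "base_url": os.environ.get("MEMORYLAYER_BASE_URL", "http://localhost:8080"),
    "api_key": os.environ.get("MEMORYLAYER_API_KEY"),
    "workspace_id": os.environ.get("MEMORYLAYER_WORKSPACE_ID"),
    "timeout": float(os.environ.get("MEMORYLAYER_TIMEOUT", "30.0")),
}

# The dict can then be unpacked into any of the memory classes, e.g.:
# history = MemoryLayerChatMessageHistory(session_id="s1", **config)
```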
+ ### MemoryLayerMemory Options
+
+ | Parameter | Description | Default |
+ |-----------|-------------|---------|
+ | `memory_key` | Key for memory variables | `"history"` |
+ | `return_messages` | Return message objects vs string | `False` |
+ | `human_prefix` | Prefix for human messages | `"Human"` |
+ | `ai_prefix` | Prefix for AI messages | `"AI"` |
+ | `input_key` | Custom input key | `None` |
+ | `output_key` | Custom output key | `None` |
+
+ ### MemoryLayerConversationSummaryMemory Options
+
+ | Parameter | Description | Default |
+ |-----------|-------------|---------|
+ | `max_tokens` | Maximum tokens in summary | `500` |
+ | `summary_prompt` | Custom summarization prompt | Built-in |
+ | `include_sources` | Include source memory IDs | `False` |
+
+ ## Advanced Usage
+
+ ### Custom Memory Tags
+
+ Tag messages for cross-session filtering and organization:
+
+ ```python
+ history = MemoryLayerChatMessageHistory(
+     session_id="support_ticket_456",
+     base_url="https://api.memorylayer.ai",
+     api_key="your-api-key",
+     workspace_id="ws_123",
+     memory_tags=["customer:enterprise", "topic:billing", "priority:high"],
+ )
+ ```
+
+ ### Multi-Session Management
+
+ Track multiple conversations independently within the same workspace:
+
+ ```python
+ # Session for user 1
+ user1_history = MemoryLayerChatMessageHistory(
+     session_id="user_1_main",
+     base_url="https://api.memorylayer.ai",
+     api_key="your-api-key",
+     workspace_id="ws_123",
+ )
+
+ # Session for user 2 - completely isolated
+ user2_history = MemoryLayerChatMessageHistory(
+     session_id="user_2_main",
+     base_url="https://api.memorylayer.ai",
+     api_key="your-api-key",
+     workspace_id="ws_123",
+ )
+ ```
+
+ ### Streaming Support
+
+ LCEL chains with RunnableWithMessageHistory support streaming natively:
+
+ ```python
+ chain_with_history = RunnableWithMessageHistory(
+     runnable=chain,
+     get_session_history=lambda session_id: MemoryLayerChatMessageHistory(
+         session_id=session_id,
+         base_url="https://api.memorylayer.ai",
+         api_key="your-api-key",
+         workspace_id="ws_123",
+     ),
+     input_messages_key="input",
+     history_messages_key="history",
+ )
+
+ # Stream the response
+ for chunk in chain_with_history.stream(
+     {"input": "Tell me a story"},
+     config={"configurable": {"session_id": "story_session"}},
+ ):
+     print(chunk.content, end="", flush=True)
+ ```
+
+ ### Custom Input/Output Keys
+
+ Match your chain's key names:
+
+ ```python
+ memory = MemoryLayerMemory(
+     session_id="qa_session",
+     base_url="https://api.memorylayer.ai",
+     api_key="your-api-key",
+     workspace_id="ws_123",
+     input_key="question",
+     output_key="answer",
+     memory_key="chat_history",
+ )
+
+ memory.save_context(
+     inputs={"question": "What is Python?"},
+     outputs={"answer": "Python is a programming language."},
+ )
+
+ result = memory.load_memory_variables({})
+ print(result["chat_history"])
+ ```
+
+ ### Synchronous Client
+
+ For direct API access without LangChain abstractions:
+
+ ```python
+ from memorylayer_langchain import SyncMemoryLayerClient, sync_client
+
+ # Using context manager
+ with sync_client(
+     base_url="https://api.memorylayer.ai",
+     api_key="your-api-key",
+     workspace_id="ws_123",
+ ) as client:
+     # Store a memory
+     memory = client.remember(
+         content="User prefers Python for backend development",
+         type="semantic",
+         importance=0.8,
+         tags=["preferences", "programming"],
+     )
+
+     # Search memories
+     results = client.recall(
+         query="what programming language does the user prefer?",
+         limit=5,
+     )
+
+     # Get a summary
+     reflection = client.reflect(
+         query="summarize user's technology preferences",
+         max_tokens=300,
+     )
+ ```
+
+ ## Migration from LangChain Memory
+
+ ### From ConversationBufferMemory
+
+ ```python
+ # Before (LangChain built-in - not persistent)
+ from langchain.memory import ConversationBufferMemory
+ memory = ConversationBufferMemory()
+
+ # After (MemoryLayer - persistent)
+ from memorylayer_langchain import MemoryLayerMemory
+ memory = MemoryLayerMemory(
+     session_id="my_session",
+     base_url="https://api.memorylayer.ai",
+     api_key="your-api-key",
+     workspace_id="ws_123",
+ )
+ ```
+
+ ### From ConversationSummaryMemory
+
+ ```python
+ # Before (LangChain built-in - not persistent)
+ from langchain.memory import ConversationSummaryMemory
+ memory = ConversationSummaryMemory(llm=llm)
+
+ # After (MemoryLayer - persistent)
+ from memorylayer_langchain import MemoryLayerConversationSummaryMemory
+ memory = MemoryLayerConversationSummaryMemory(
+     session_id="my_session",
+     base_url="https://api.memorylayer.ai",
+     api_key="your-api-key",
+     workspace_id="ws_123",
+ )
+ ```
+
+ ## Why MemoryLayer?
+
+ ### Problem: LangChain Memory Doesn't Persist
+
+ Standard LangChain memory is lost when your application restarts:
+
+ ```python
+ # LangChain's built-in memory
+ memory = ConversationBufferMemory()
+ chain = ConversationChain(llm=llm, memory=memory)
+ chain.run("My name is Alice")  # Memory stored in RAM
+
+ # Application restarts... memory is gone!
+ ```
+
+ ### Solution: MemoryLayer Provides True Persistence
+
+ ```python
+ # MemoryLayer integration
+ memory = MemoryLayerMemory(session_id="alice_session", ...)
+ chain = ConversationChain(llm=llm, memory=memory)
+ chain.run("My name is Alice")  # Memory stored in MemoryLayer
+
+ # Application restarts... memory is preserved!
+ memory2 = MemoryLayerMemory(session_id="alice_session", ...)
+ # Alice's conversation history is still available
+ ```
+
+ ### Additional Benefits
+
+ - **Stable API** - LangChain memory interfaces change frequently. MemoryLayer provides a stable abstraction.
+ - **Cross-Platform** - Access the same memories from Python, TypeScript, or any HTTP client.
+ - **Rich Memory Types** - Beyond simple chat history: semantic, episodic, procedural memories with relationships.
+ - **Built-in Search** - Semantic search across all stored memories.
+ - **Reflection** - AI-powered synthesis and summarization of memories.
+
+ ## Development
+
+ ```bash
+ # Install development dependencies
+ pip install -e ".[dev]"
+
+ # Run tests
+ pytest
+
+ # Type checking
+ mypy src/memorylayer_langchain
+
+ # Linting
+ ruff check src/memorylayer_langchain
+ ```
+
+ ## Examples
+
+ See the `examples/` directory for complete working examples:
+
+ - `lcel_example.py` - Modern LCEL chains with RunnableWithMessageHistory
+ - `legacy_chain_example.py` - Legacy ConversationChain integration
+ - `summary_memory_example.py` - Conversation summary memory usage
+
+ ## License
+
+ Apache 2.0 License - see LICENSE file for details.
+
+ ## Links
+
+ - [MemoryLayer Documentation](https://docs.memorylayer.ai)
+ - [LangChain Documentation](https://python.langchain.com)
+ - [GitHub](https://github.com/scitrera/memorylayer)
+ - [Homepage](https://memorylayer.ai)