memorylayer-py 0.1.0__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -0,0 +1,241 @@
+ Metadata-Version: 2.4
+ Name: memorylayer-py
+ Version: 0.1.0
+ Summary: Privacy-first memory API for LLMs
+ Author-email: "rec0.ai" <hello@rec0.ai>
+ License: MIT
+ Project-URL: Homepage, https://rec0.ai
+ Project-URL: Documentation, https://docs.rec0.ai
+ Project-URL: Repository, https://github.com/rec0ai/rec0-python
+ Project-URL: Bug Tracker, https://github.com/rec0ai/rec0-python/issues
+ Keywords: llm,memory,ai,privacy,rag
+ Classifier: Development Status :: 4 - Beta
+ Classifier: Intended Audience :: Developers
+ Classifier: License :: OSI Approved :: MIT License
+ Classifier: Programming Language :: Python :: 3
+ Classifier: Programming Language :: Python :: 3.9
+ Classifier: Programming Language :: Python :: 3.10
+ Classifier: Programming Language :: Python :: 3.11
+ Classifier: Programming Language :: Python :: 3.12
+ Classifier: Topic :: Software Development :: Libraries :: Python Modules
+ Requires-Python: >=3.9
+ Description-Content-Type: text/markdown
+ Requires-Dist: requests>=2.28.0
+ Requires-Dist: httpx>=0.24.0
+ Provides-Extra: dev
+ Requires-Dist: pytest>=7.0; extra == "dev"
+ Requires-Dist: responses>=0.25.0; extra == "dev"
+ Requires-Dist: pytest-asyncio>=0.23.0; extra == "dev"
+
+ # rec0 — memory for any LLM
+
+ > Give your AI a permanent memory in 3 lines of code.
+
+ [![PyPI version](https://img.shields.io/pypi/v/rec0.svg)](https://pypi.org/project/rec0/)
+ [![Python 3.9+](https://img.shields.io/badge/python-3.9+-blue.svg)](https://python.org)
+ [![License: MIT](https://img.shields.io/badge/License-MIT-green.svg)](LICENSE)
+
+ ## Install
+
+ ```bash
+ pip install rec0
+ ```
+
+ ## Quickstart
+
+ ```python
+ from rec0 import Memory
+
+ mem = Memory(api_key="r0_xxx", user_id="user_123")
+ mem.store("User prefers Python and dark mode")
+ context = mem.context("user preferences")
+ # inject context into your LLM prompt — done
+ ```
+
+ That's it. `context` returns a bullet-list string ready to prepend to any system prompt.
+
+ ---
+
+ ## Why rec0
+
+ | | rec0 | Mem0 |
+ |---|---|---|
+ | **Privacy** | Data never leaves your servers | Processed externally |
+ | **Cost** | $0.002 / 1K ops | ~$0.10 / 1K ops |
+ | **Setup** | 3 lines | OAuth + config |
+ | **LLM support** | Any model | OpenAI-first |
+ | **GDPR** | 1 API call | Manual |
+
+ - **Privacy-first:** embeddings and summaries run on YOUR infrastructure — no user data touches third-party APIs
+ - **LLM-agnostic:** works with OpenAI, Anthropic, Gemini, Llama, Mistral — anything that takes a string
+ - **Memory lifecycle:** automatic importance scoring, recall-count boosting, and time-based decay
+ - **GDPR compliant:** right-to-erasure in one call (`mem.delete_user()`)
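The three lifecycle signals in the bullet above (importance scoring, recall-count boosting, time-based decay) could combine along these lines. This is a hypothetical sketch of one plausible scoring rule, not the actual server-side formula; the `half_life_days` parameter and the exact boost/decay shapes are invented for illustration:

```python
import math
from datetime import datetime, timezone

def effective_importance(base: float, recall_count: int, created_at: datetime,
                         half_life_days: float = 30.0) -> float:
    """Blend base importance, a logarithmic recall boost, and exponential decay.

    Purely illustrative: the real scoring lives server-side.
    """
    age_days = (datetime.now(timezone.utc) - created_at).total_seconds() / 86400
    decay = 0.5 ** (age_days / half_life_days)   # halves every half_life_days
    boost = 1.0 + math.log1p(recall_count)       # diminishing returns per recall
    return min(10.0, base * boost * decay)       # clamp to the documented 1.0-10.0 scale
```

Under this sketch a never-recalled memory keeps its base score when fresh, loses half of it after one half-life, and frequently recalled memories climb toward (but never past) 10.0.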
73
+
74
+ ---
75
+
76
+ ## Full API reference
77
+
78
+ ### `Memory(user_id, api_key, app_id, base_url)`
79
+
80
+ | Parameter | Type | Default | Description |
81
+ |---|---|---|---|
82
+ | `user_id` | `str` | required | Your end-user identifier |
83
+ | `api_key` | `str` | `$REC0_API_KEY` | Your rec0 API key |
84
+ | `app_id` | `str` | `"default"` | Namespace for multi-app isolation |
85
+ | `base_url` | `str` | prod URL | Override for self-hosting |
86
+
87
+ ### Methods
88
+
89
+ #### `mem.store(content)` → `MemoryObject`
90
+ Store a new memory. Auto-generates embedding and summary server-side.
91
+
92
+ ```python
93
+ m = mem.store("User is building a SaaS product in Python")
94
+ print(m.id) # UUID
95
+ print(m.importance) # starts at 1.0, increases with each recall
96
+ ```
97
+
98
+ #### `mem.context(query, limit=5)` → `str`
99
+ **The most-used method.** Returns a bullet-list string to inject into your LLM prompt.
100
+
101
+ ```python
102
+ context = mem.context("what does the user like", limit=5)
103
+ # "- User prefers Python and dark mode\n- User is building a SaaS product"
104
+
105
+ # Typical usage with OpenAI:
106
+ messages = [
107
+ {"role": "system", "content": f"User context:\n{context}"},
108
+ {"role": "user", "content": user_message},
109
+ ]
110
+ ```
111
+
112
+ #### `mem.recall(query, limit=5)` → `List[MemoryObject]`
113
+ Returns memories ranked by semantic similarity. Use when you need scores or metadata.
114
+
115
+ ```python
116
+ memories = mem.recall("programming preferences", limit=3)
117
+ for m in memories:
118
+ print(f"{m.content} (score: {m.relevance_score})")
119
+ ```
120
+
121
+ #### `mem.list()` → `List[MemoryObject]`
122
+ All active memories for this user, ordered by creation time.
123
+
124
+ #### `mem.delete(memory_id)` → `None`
125
+ Soft-delete a specific memory (retained for audit trail).
126
+
127
+ #### `mem.delete_user()` → `dict`
128
+ GDPR right-to-erasure. Removes all memories for this user.
129
+
130
+ #### `mem.export()` → `dict`
131
+ GDPR data export. Returns all memory data as a dictionary.
132
+
133
+ #### `mem.ping()` → `bool`
134
+ Connectivity check. Returns `True` if the API is reachable.
135
+
136
+ ```python
137
+ if not mem.ping():
138
+ print("rec0 API unreachable — check your key")
139
+ ```
140
+
141
+ ---
142
+
143
+ ## Error handling
144
+
145
+ ```python
146
+ from rec0 import Memory, Rec0Error, AuthError, RateLimitError, NotFoundError
147
+
148
+ mem = Memory(api_key="r0_xxx", user_id="user_123")
149
+
150
+ try:
151
+ mem.store("User loves rec0")
152
+ except AuthError:
153
+ print("Invalid API key — check REC0_API_KEY")
154
+ except RateLimitError as e:
155
+ print(f"Rate limited — retry in {e.retry_after}s")
156
+ except NotFoundError:
157
+ print("Memory not found")
158
+ except Rec0Error as e:
159
+ print(f"Unexpected error: {e}")
160
+ ```
161
+
162
+ Rate limits are handled automatically: rec0 will wait `retry_after` seconds and retry once before raising.
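That retry-once behavior can be sketched as a generic wrapper. The `RateLimitError` stand-in and the `with_rate_limit_retry` helper below are illustrative only, not the SDK's actual internals:

```python
import time
from typing import Callable, TypeVar

T = TypeVar("T")

class RateLimitError(Exception):
    """Stand-in for rec0's exception; the real one also carries retry_after."""
    def __init__(self, retry_after: float) -> None:
        super().__init__(f"rate limited, retry in {retry_after}s")
        self.retry_after = retry_after

def with_rate_limit_retry(call: Callable[[], T]) -> T:
    """Wait retry_after seconds and retry exactly once before re-raising."""
    try:
        return call()
    except RateLimitError as e:
        time.sleep(e.retry_after)
        return call()  # a second RateLimitError propagates to the caller
```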
+
+ ---
+
+ ## Async usage
+
+ Every method has an async equivalent via `AsyncMemory`:
+
+ ```python
+ import asyncio
+ from rec0 import AsyncMemory
+
+ async def main():
+     mem = AsyncMemory(api_key="r0_xxx", user_id="user_123")
+     await mem.store("User is a night-owl developer")
+     context = await mem.context("when does the user work")
+     print(context)
+
+ asyncio.run(main())
+ ```
+
+ `AsyncMemory` uses `httpx` under the hood and is safe to use in FastAPI, Django async views, and any `asyncio` application.
+
+ ---
+
+ ## Environment variables
+
+ | Variable | Description |
+ |---|---|
+ | `REC0_API_KEY` | Your rec0 API key (used automatically if `api_key=` is not passed) |
+ | `REC0_BASE_URL` | Override the API base URL (optional, for self-hosting) |
+
+ ```bash
+ export REC0_API_KEY=r0_your_key_here
+ ```
+
+ ```python
+ # api_key is now auto-loaded — no need to hardcode it
+ mem = Memory(user_id="user_123")
+ ```
+
+ ---
+
+ ## MemoryObject fields
+
+ | Field | Type | Description |
+ |---|---|---|
+ | `id` | `str` | UUID |
+ | `content` | `str` | The original memory text |
+ | `summary` | `str \| None` | Auto-generated summary |
+ | `importance` | `float` | 1.0–10.0; increases with recall |
+ | `recall_count` | `int` | Times this memory has been recalled |
+ | `relevance_score` | `float \| None` | Similarity score (set by `recall` only) |
+ | `created_at` | `datetime` | When stored |
+ | `is_active` | `bool` | `False` if deleted |
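For local tests or type hints, the table above could be mirrored as a plain dataclass. This is a sketch under the assumption that defaults match the documented behavior (the real model ships in `rec0/models.py`):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class MemoryObject:
    id: str
    content: str
    importance: float = 1.0                  # starts at 1.0, per the table above
    recall_count: int = 0
    summary: Optional[str] = None            # filled in server-side
    relevance_score: Optional[float] = None  # only set on recall() results
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    is_active: bool = True                   # False after a soft delete
```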
+
+ ---
+
+ ## Self-hosting
+
+ rec0 is open-source. Deploy your own instance on Railway, Fly.io, or any server:
+
+ ```bash
+ git clone https://github.com/patelyash2511/memorylayer
+ # See the repository README for Railway deployment instructions
+ ```
+
+ Then point the SDK at your instance:
+
+ ```python
+ mem = Memory(
+     api_key="your_key",
+     user_id="user_123",
+     base_url="https://your-instance.up.railway.app",
+ )
+ ```
+
+ ---
+
+ [rec0.ai](https://rec0.ai) · [docs](https://docs.rec0.ai) · [discord](https://discord.gg/rec0) · [twitter](https://twitter.com/rec0ai)
@@ -0,0 +1,15 @@
+ README.md
+ pyproject.toml
+ memorylayer_py.egg-info/PKG-INFO
+ memorylayer_py.egg-info/SOURCES.txt
+ memorylayer_py.egg-info/dependency_links.txt
+ memorylayer_py.egg-info/requires.txt
+ memorylayer_py.egg-info/top_level.txt
+ rec0/__init__.py
+ rec0/async_client.py
+ rec0/client.py
+ rec0/exceptions.py
+ rec0/models.py
+ rec0/version.py
+ tests/test_client.py
+ tests/test_integration.py