llmemory 0.1.13 → 0.1.15

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: a188251e04aac90f929fcc952f7902a8e9b86728742ab88e5f209953821efd30
- data.tar.gz: 01f3cd9d68c50a1a52ed33683b973a379073890db1ae5a3073676fd598581eca
+ metadata.gz: 723fae20d0310ccaeaf9ba600148061d17b2a0b29f933d455d1cf656dee85636
+ data.tar.gz: a135ea1661af46e96843bf52744e8004d0ebe7e8d94b0c46a097c36df53d5bc4
  SHA512:
- metadata.gz: e24d411ffaca985dc360bc6a6d271f8fd57c687cef26c8242e047fab6bcda400b3764b78aeea8d58ec253bc33ce48b89a46d6a8bdb71db8152c4d175880f3b87
- data.tar.gz: d7097e4c6cc8442088f5adb0614015b8d9d0541db8f4843e729c8e94cfefe0faf49969e9abb75381b070386c53623631b937fc72989a1f7cb0aa27b3b45a9171
+ metadata.gz: 256caaee94233d5e57b8d9e6007fe1ced57d35e21d40260ce34b2803ba0ef3593b66668aa06334e647edd103aa431113e38b639776163d71153c4b9bac68c1a1
+ data.tar.gz: 33cd1726e9f7bb3328610bafabca5ebfe51f080e7d34c523fc0b363eb290b353c9109937f2782ed7e60906965236e79229838e32c698d2f0e2f73aa2d421970b
data/README.md CHANGED
@@ -2,6 +2,8 @@
 
  Persistent memory system for LLM agents. Implements short-term checkpointing, long-term memory (file-based or **graph-based**), retrieval with time decay, and maintenance jobs. You can inspect memory from the **CLI** or, in Rails apps, from an optional **dashboard**.
 
+ Includes advanced memory management features inspired by [OpenClaw](https://github.com/openclaw/openclaw): pre-compaction memory flush, hybrid search (BM25 + vector), tool result pruning, context window tracking, session lifecycle management, daily memory logs, and auto-recall.
+
  ## Installation
 
  Add to your Gemfile:
@@ -40,11 +42,14 @@ memory.compact!(max_bytes: 8192) # or use config default
  memory.clear_session!
  ```
 
- - **`add_message(role:, content:)`** — Persists messages in short-term.
+ - **`add_message(role:, content:)`** — Persists messages in short-term. Supports `user`, `assistant`, `system`, `tool`, and `tool_result` roles.
  - **`messages`** — Returns the current conversation history.
  - **`retrieve(query, max_tokens: nil)`** — Returns combined context: recent conversation + relevant long-term memories.
+ - **`recall_for(query: nil)`** — Auto-recall: returns context for the given query (or last user message if `query` is nil). Only active when `auto_recall_enabled` is true.
  - **`consolidate!`** — Extracts facts from the current conversation and stores them in long-term.
- - **`compact!(max_bytes: nil)`** — Compacts short-term memory by summarizing old messages when byte size exceeds limit. Uses LLM to create a summary, keeping recent messages intact.
+ - **`compact!(max_bytes: nil)`** — Compacts short-term memory by summarizing old messages when byte size exceeds limit. Automatically flushes to long-term before compacting when over `memory_flush_threshold_tokens`.
+ - **`prune!(mode: nil)`** — Prunes oversized tool results (soft-trim or hard-clear). Only when `prune_tool_results_enabled` is true.
+ - **`check_context_window!`** — Triggers consolidate and compact when context exceeds configured thresholds.
  - **`clear_session!`** — Clears short-term only.
 
  ## Configuration
@@ -64,6 +69,37 @@ Llmemory.configure do |config|
  config.max_retrieval_tokens = 2000
  config.prune_after_days = 90
  config.compact_max_bytes = 8192 # max bytes before compact! triggers
+
+ # Pre-compaction memory flush (prevents knowledge loss when compacting)
+ config.memory_flush_enabled = true
+ config.memory_flush_threshold_tokens = 4000
+
+ # Hybrid search (BM25 + vector) and MMR re-ranking
+ config.hybrid_search_enabled = true
+ config.bm25_weight = 0.3
+ config.mmr_enabled = false
+ config.mmr_lambda = 0.7
+
+ # Tool result pruning (soft-trim or hard-clear for tool/tool_result messages)
+ config.prune_tool_results_enabled = false
+ config.prune_tool_results_mode = :soft_trim
+ config.prune_tool_results_max_bytes = 2048
+
+ # Context window tracking and auto-consolidation
+ config.context_window_tokens = 128_000
+ config.reserve_tokens = 16_384
+ config.keep_recent_tokens = 20_000
+
+ # Session lifecycle management
+ config.session_idle_minutes = 60
+ config.session_prune_after_days = 30
+ config.session_max_entries_per_user = 500
+
+ # Daily memory logs (file-based, FileStorage only)
+ config.daily_logs_enabled = false
+
+ # Auto-recall (inject relevant memories before each LLM turn)
+ config.auto_recall_enabled = false
  end
  ```
 
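The pre-compaction flush settings above gate on an estimated token count. A minimal sketch of that decision, assuming a crude ~4-characters-per-token heuristic (`estimate_tokens` and `flush_before_compact?` are illustrative names, not the gem's internals):

```ruby
# Illustrative flush-threshold check; not llmemory's actual implementation.
FLUSH_THRESHOLD_TOKENS = 4000 # mirrors config.memory_flush_threshold_tokens

# Crude token estimate: roughly 4 characters per token.
def estimate_tokens(messages)
  messages.sum { |m| m[:content].to_s.length } / 4
end

def flush_before_compact?(messages)
  estimate_tokens(messages) >= FLUSH_THRESHOLD_TOKENS
end

flush_before_compact?([{ role: :user, content: "x" * 20_000 }]) # => true
flush_before_compact?([{ role: :user, content: "short" }])      # => false
```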
@@ -159,6 +195,71 @@ candidates = memory.search_candidates("job", top_k: 20)
 
  **Graph storage:** `:memory` (in-memory) or `:active_record` (Rails). For ActiveRecord, run `rails g llmemory:install` and migrate; the migration creates `llmemory_nodes`, `llmemory_edges`, and `llmemory_embeddings` (pgvector). Enable the `vector` extension in PostgreSQL for embeddings.
 
+ ## Advanced Memory Management
+
+ These features improve robustness and efficiency, inspired by OpenClaw's memory system.
+
+ ### Pre-Compaction Memory Flush
+
+ Before compacting short-term memory, llmemory can automatically consolidate the conversation into long-term storage. This prevents knowledge loss when the context is summarized.
+
+ - **`memory_flush_enabled`** — When true, `compact!` calls `consolidate!` first when messages exceed `memory_flush_threshold_tokens`.
+ - **`maybe_flush_memory!`** — Call explicitly to flush when approaching context limits.
+
+ ### Hybrid Search (BM25 + Vector)
+
+ Retrieval combines keyword matching (BM25) with vector similarity for more robust search. Optional MMR (Maximal Marginal Relevance) re-ranking improves result diversity.
+
+ - **`hybrid_search_enabled`** — Combines BM25 and vector scores.
+ - **`bm25_weight`** — Weight for BM25 (0–1); remainder is vector score.
+ - **`mmr_enabled`** — Re-ranks results for diversity.
+ - **`mmr_lambda`** — Balance between relevance and diversity (0–1).
+
+ ### Tool Result Pruning
+
+ Large tool outputs can consume most of the context window. Pruning selectively trims `tool` and `tool_result` messages while keeping user/assistant intact.
+
+ - **`prune_tool_results_enabled`** — When true, `retrieve` uses pruned messages and `prune!` is available.
+ - **`prune_tool_results_mode`** — `:soft_trim` (keep head+tail) or `:hard_clear` (replace with placeholder).
+ - **`prune_tool_results_max_bytes`** — Max bytes before soft-trim applies.
+
+ ### Context Window Tracking
+
+ Track estimated tokens and trigger consolidation/compaction automatically.
+
+ - **`context_tokens`** — Returns estimated token count for current messages.
+ - **`should_auto_consolidate?`** — True when over `context_window_tokens - reserve_tokens`.
+ - **`check_context_window!`** — Runs consolidate and compact when thresholds are exceeded.
+
+ ### Session Lifecycle Management
+
+ Clean up stale or idle sessions to control storage usage.
+
+ ```ruby
+ lifecycle = Llmemory::ShortTerm::SessionLifecycle.new
+ lifecycle.cleanup_idle_sessions!(user_id: "user_123", idle_minutes: 60)
+ lifecycle.cleanup_stale_sessions!(user_id: "user_123", prune_after_days: 30)
+ lifecycle.enforce_max_entries!(user_id: "user_123", max_entries: 500)
+ ```
+
+ Sessions store `last_activity_at` automatically on each save.
+
+ ### Daily Memory Logs
+
+ With `daily_logs_enabled` and FileStorage, file-based memory writes to `memory/YYYY-MM-DD.md` per user. Today's and yesterday's logs are included in retrieval. Useful for temporal organization and human-readable logs.
+
+ ### Auto-Recall
+
+ When `auto_recall_enabled` is true, call `recall_for(query: nil)` before each LLM turn. If `query` is nil, the last user message is used as the search query. Returns combined context without explicit `retrieve` calls.
+
+ ```ruby
+ Llmemory.configure { |c| c.auto_recall_enabled = true }
+ # Before each LLM call:
+ context = memory.recall_for(query: user_message)
+ # Or use last user message automatically:
+ context = memory.recall_for
+ ```
+
 
  ## Lower-Level APIs
 
  ### Short-Term Memory (Checkpointing)
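The hybrid search weights added in this README hunk combine the two scores linearly. A sketch of that blend, assuming both scores are already normalized to 0–1 (`hybrid_score` is an illustrative name, not the gem's API):

```ruby
# Illustrative BM25/vector blend; not llmemory's internal code.
BM25_WEIGHT = 0.3 # mirrors config.bm25_weight

# Assumes bm25 and vector scores are each normalized to 0..1.
def hybrid_score(bm25:, vector:)
  BM25_WEIGHT * bm25 + (1 - BM25_WEIGHT) * vector
end

hybrid_score(bm25: 1.0, vector: 0.5) # ≈ 0.65
```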
@@ -333,7 +434,7 @@ MCP_TOKEN=your-secret-token llmemory mcp serve --http --port 443 \
  | `memory_retrieve` | Get context optimized for LLM inference (supports timeline context) |
  | `memory_timeline` | Get chronological timeline of recent memories |
  | `memory_timeline_context` | Get N items before/after a specific memory |
- | `memory_add_message` | Add message to short-term conversation |
+ | `memory_add_message` | Add message to short-term conversation (roles: user, assistant, system, tool, tool_result) |
  | `memory_consolidate` | Extract facts from conversation to long-term |
  | `memory_stats` | Get memory statistics for a user |
  | `memory_info` | Documentation on how to use the tools |
@@ -16,6 +16,7 @@ class CreateLlmemoryTables < ActiveRecord::Migration[7.0]
  t.string :category, null: false
  t.text :content, null: false
  t.string :source_resource_id
+ t.float :importance, default: 0.7
  t.timestamps
  end
  add_index :llmemory_items, :user_id
@@ -16,7 +16,32 @@ module Llmemory
  :time_decay_half_life_days,
  :max_retrieval_tokens,
  :prune_after_days,
- :compact_max_bytes
+ :compact_max_bytes,
+ :memory_flush_enabled,
+ :memory_flush_threshold_tokens,
+ :hybrid_search_enabled,
+ :bm25_weight,
+ :mmr_enabled,
+ :mmr_lambda,
+ :prune_tool_results_enabled,
+ :prune_tool_results_mode,
+ :prune_tool_results_max_bytes,
+ :context_window_tokens,
+ :reserve_tokens,
+ :keep_recent_tokens,
+ :session_idle_minutes,
+ :session_prune_after_days,
+ :session_max_entries_per_user,
+ :daily_logs_enabled,
+ :auto_recall_enabled,
+ :noise_filter_enabled,
+ :noise_filter_min_chars,
+ :flush_once_per_cycle_seconds,
+ :overflow_recovery_enabled,
+ :embedding_cache_enabled,
+ :embedding_cache_max_entries,
+ :max_message_chars,
+ :message_sanitizer_enabled
 
  def initialize
  @llm_provider = :openai
@@ -34,6 +59,31 @@ module Llmemory
  @max_retrieval_tokens = 2000
  @prune_after_days = 90
  @compact_max_bytes = 8192
+ @memory_flush_enabled = true
+ @memory_flush_threshold_tokens = 4000
+ @hybrid_search_enabled = true
+ @bm25_weight = 0.3
+ @mmr_enabled = false
+ @mmr_lambda = 0.7
+ @prune_tool_results_enabled = false
+ @prune_tool_results_mode = :soft_trim
+ @prune_tool_results_max_bytes = 2048
+ @context_window_tokens = 128_000
+ @reserve_tokens = 16_384
+ @keep_recent_tokens = 20_000
+ @session_idle_minutes = 60
+ @session_prune_after_days = 30
+ @session_max_entries_per_user = 500
+ @daily_logs_enabled = false
+ @auto_recall_enabled = false
+ @noise_filter_enabled = false
+ @noise_filter_min_chars = 10
+ @flush_once_per_cycle_seconds = 60
+ @overflow_recovery_enabled = false
+ @embedding_cache_enabled = true
+ @embedding_cache_max_entries = 10_000
+ @max_message_chars = 32_000
+ @message_sanitizer_enabled = false
  end
  end
 
@@ -14,7 +14,9 @@ module Llmemory
  Extract discrete facts from this conversation.
  Focus on preferences, behaviors, and important details.
  Conversation: #{conversation_text}
- Return as JSON array of objects with "content" key. Example: [{"content": "User prefers Ruby"}, {"content": "User is vegan"}]
+ Return as JSON array of objects with "content" and "importance" (0-1) keys.
+ Importance: 0.8-0.95 for preferences/corrections/decisions, 0.5-0.8 for factual context, 0.3-0.5 for ephemeral.
+ Example: [{"content": "User prefers Ruby", "importance": 0.9}, {"content": "User mentioned the weather", "importance": 0.4}]
  PROMPT
  response = @llm.invoke(prompt.strip)
  parse_items_response(response)
@@ -56,7 +58,12 @@ module Llmemory
  def parse_items_response(response)
  json = extract_json_array(response)
  return [] unless json
- json.map { |item| item.is_a?(Hash) ? item : { "content" => item.to_s } }
+ json.map do |item|
+ h = item.is_a?(Hash) ? item : { "content" => item.to_s }
+ imp = h["importance"] || h[:importance]
+ h["importance"] = imp.nil? ? 0.7 : (imp.to_f.between?(0, 1) ? imp.to_f : 0.7)
+ h
+ end
  end
 
  def extract_json_array(response)
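The clamping rule added in `parse_items_response` can be exercised standalone. This sketch reproduces the same normalization (missing or out-of-range importance falls back to 0.7; `normalize_importance` is an illustrative helper name):

```ruby
# Reproduces the importance-normalization rule from parse_items_response.
def normalize_importance(imp)
  return 0.7 if imp.nil?
  v = imp.to_f
  v.between?(0, 1) ? v : 0.7
end

normalize_importance(0.9) # => 0.9
normalize_importance(nil) # => 0.7
normalize_importance(1.5) # => 0.7 (out of range)
```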
@@ -4,6 +4,7 @@ require_relative "resource"
  require_relative "item"
  require_relative "category"
  require_relative "storage"
+ require_relative "../../noise_filter"
 
  module Llmemory
  module LongTerm
@@ -17,16 +18,21 @@ module Llmemory
  end
 
  def memorize(conversation_text)
- resource_id = save_resource(conversation_text)
- items = @extractor.extract_items(conversation_text)
+ text = Llmemory.configuration.noise_filter_enabled ? NoiseFilter.filter?(conversation_text) : conversation_text.to_s
+ return true if text.strip.empty?
+
+ resource_id = save_resource(text)
+ append_to_daily_log(text) if Llmemory.configuration.daily_logs_enabled && @storage.respond_to?(:save_daily_log_entry)
+ items = @extractor.extract_items(text)
  updates_by_category = {}
 
  items.each do |item|
  content = item.is_a?(Hash) ? (item["content"] || item[:content]) : item.to_s
+ importance = (item["importance"] || item[:importance] || 0.7).to_f
  cat = @extractor.classify_item(content)
  updates_by_category[cat] ||= []
  updates_by_category[cat] << content.to_s
- save_item(category: cat, item: item, source_resource_id: resource_id)
+ save_item(category: cat, item: item, source_resource_id: resource_id, importance: importance)
  end
 
  updates_by_category.each do |category, new_memories|
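The new noise-filter gating in `memorize` follows a simple pattern: filter the text, bail out early when nothing survives, then extract. A standalone sketch of that flow (the `noise_filter` lambda is a stand-in for the gem's `NoiseFilter`, and the 10-character cutoff mirrors the `noise_filter_min_chars` default):

```ruby
# Stand-in for Llmemory::NoiseFilter — drops very short, low-signal text.
noise_filter = ->(text) { text.to_s.length >= 10 ? text.to_s : "" }

# Illustrative gate: skip memorization entirely when filtered text is empty.
def memorize_gate(text, filter)
  filtered = filter.call(text)
  return :skipped if filtered.strip.empty?
  :memorized
end

memorize_gate("ok", noise_filter)                 # => :skipped
memorize_gate("User prefers Ruby.", noise_filter) # => :memorized
```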
@@ -47,12 +53,20 @@ module Llmemory
  uid = user_id || @user_id
  items = @storage.search_items(uid, query)
  resources = @storage.search_resources(uid, query)
+ daily_logs = load_daily_logs_for_retrieval(uid) if Llmemory.configuration.daily_logs_enabled && @storage.respond_to?(:load_daily_logs)
+ category_summaries = load_category_summaries_as_candidates(uid, query)
  out = []
+
+ category_summaries.each do |c|
+ out << c.merge(evergreen: true)
+ end
+
  items.first(top_k).each do |i|
  out << {
  text: i[:content] || i["content"],
  timestamp: i[:created_at] || i["created_at"],
- score: 1.0
+ score: (i[:importance] || i["importance"] || 1.0).to_f,
+ evergreen: i[:evergreen] || i["evergreen"]
  }
  end
  resources.first([top_k - out.size, 0].max).each do |r|
@@ -62,6 +76,11 @@ module Llmemory
  score: 0.9
  }
  end
+ if daily_logs
+ daily_logs.each do |log|
+ out << { text: log[:content], timestamp: log[:date].to_time, score: 0.85 }
+ end
+ end
  out
  end
 
@@ -73,9 +92,37 @@ module Llmemory
  @storage.save_resource(@user_id, text)
  end
 
- def save_item(category:, item:, source_resource_id:)
+ def save_item(category:, item:, source_resource_id:, importance: 0.7)
  content = item.is_a?(Hash) ? item["content"] || item[:content] : item.to_s
- @storage.save_item(@user_id, category: category, content: content, source_resource_id: source_resource_id)
+ @storage.save_item(@user_id, category: category, content: content, source_resource_id: source_resource_id, importance: importance)
+ end
+
+ def append_to_daily_log(conversation_text)
+ summary = conversation_text.length > 500 ? "#{conversation_text[0..500]}..." : conversation_text
+ @storage.save_daily_log_entry(@user_id, Date.today, summary)
+ end
+
+ def load_daily_logs_for_retrieval(user_id)
+ today = Date.today
+ yesterday = today - 1
+ logs = @storage.load_daily_logs(user_id, from_date: yesterday, to_date: today)
+ logs.map { |l| { date: l[:date], content: "[#{l[:date]}] #{l[:content]}" } }
+ end
+
+ def load_category_summaries_as_candidates(user_id, query)
+ return [] unless @storage.respond_to?(:list_categories)
+
+ categories = @storage.list_categories(user_id)
+ return [] if categories.empty?
+
+ query_lower = query.to_s.downcase
+ categories.filter_map do |cat|
+ summary = @storage.load_category(user_id, cat)
+ next if summary.to_s.strip.empty?
+ next unless summary.to_s.downcase.include?(query_lower)
+
+ { text: "[#{cat}] #{summary}", timestamp: Time.now, score: 0.95 }
+ end
  end
  end
  end
@@ -30,16 +30,18 @@ module Llmemory
  id
  end
 
- def save_item(user_id, category:, content:, source_resource_id:)
+ def save_item(user_id, category:, content:, source_resource_id:, importance: 0.7)
  id = "item_#{SecureRandom.hex(8)}"
- LlmemoryItem.create!(
+ attrs = {
  id: id,
  user_id: user_id,
  category: category,
  content: content,
  source_resource_id: source_resource_id,
  created_at: Time.current
- )
+ }
+ attrs[:importance] = importance if LlmemoryItem.column_names.include?("importance")
+ LlmemoryItem.create!(attrs)
  id
  end
 
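The `column_names.include?("importance")` guard above keeps the new attribute backward-compatible with databases that have not yet run the migration. The pattern in isolation (illustrative, using a plain hash and a column list instead of an ActiveRecord model):

```ruby
# Illustrative: only set :importance when the schema supports it.
def build_attrs(columns, importance:)
  attrs = { content: "fact" }
  attrs[:importance] = importance if columns.include?("importance")
  attrs
end

build_attrs(%w[id content], importance: 0.9)            # => { content: "fact" }
build_attrs(%w[id content importance], importance: 0.9) # => { content: "fact", importance: 0.9 }
```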
@@ -96,14 +98,16 @@ module Llmemory
  def replace_items(user_id, ids_to_remove, merged_item)
  LlmemoryItem.where(user_id: user_id, id: ids_to_remove).destroy_all
  created_at = merged_item[:created_at] || Time.current
- LlmemoryItem.create!(
+ attrs = {
  id: "item_#{SecureRandom.hex(8)}",
  user_id: user_id,
  category: merged_item[:category],
  content: merged_item[:content],
  source_resource_id: merged_item[:source_resource_id],
  created_at: created_at
- )
+ }
+ attrs[:importance] = merged_item[:importance] if LlmemoryItem.column_names.include?("importance") && merged_item[:importance]
+ LlmemoryItem.create!(attrs)
  end
 
  def archive_items(user_id, item_ids)
@@ -177,13 +181,15 @@ module Llmemory
  end
 
  def row_to_item(r)
- {
+ h = {
  id: r.id,
  category: r.category,
  content: r.content,
  source_resource_id: r.source_resource_id,
  created_at: r.created_at
  }
+ h[:importance] = r.respond_to?(:importance) ? (r.importance || 0.7).to_f : 0.7
+ h
  end
 
  def row_to_resource(r)
@@ -9,7 +9,7 @@ module Llmemory
  raise NotImplementedError, "#{self.class}#save_resource must be implemented"
  end
 
- def save_item(user_id, category:, content:, source_resource_id:)
+ def save_item(user_id, category:, content:, source_resource_id:, importance: 0.7)
  raise NotImplementedError, "#{self.class}#save_item must be implemented"
  end
 
@@ -24,12 +24,12 @@ module Llmemory
  id
  end
 
- def save_item(user_id, category:, content:, source_resource_id:)
+ def save_item(user_id, category:, content:, source_resource_id:, importance: 0.7)
  ensure_tables!
  id = "item_#{SecureRandom.hex(8)}"
  conn.exec_params(
- "INSERT INTO llmemory_items (id, user_id, category, content, source_resource_id, created_at) VALUES ($1, $2, $3, $4, $5, $6)",
- [id, user_id, category, content, source_resource_id, Time.now.utc.iso8601]
+ "INSERT INTO llmemory_items (id, user_id, category, content, source_resource_id, importance, created_at) VALUES ($1, $2, $3, $4, $5, $6, $7)",
+ [id, user_id, category, content, source_resource_id, importance.to_f, Time.now.utc.iso8601]
  )
  id
  end
@@ -67,7 +67,7 @@ module Llmemory
  ensure_tables!
  pattern = "%#{conn.escape_string(query.to_s.downcase)}%"
  rows = conn.exec_params(
- "SELECT id, category, content, source_resource_id, created_at FROM llmemory_items WHERE user_id = $1 AND LOWER(content) LIKE $2",
+ "SELECT id, category, content, source_resource_id, importance, created_at FROM llmemory_items WHERE user_id = $1 AND LOWER(content) LIKE $2",
  [user_id, pattern]
  )
  rows_to_items(rows)
@@ -97,7 +97,7 @@ module Llmemory
  ensure_tables!
  cutoff = (Time.now - (days * 86400)).utc.iso8601
  rows = conn.exec_params(
- "SELECT id, category, content, source_resource_id, created_at FROM llmemory_items WHERE user_id = $1 AND created_at < $2 ORDER BY created_at",
+ "SELECT id, category, content, source_resource_id, importance, created_at FROM llmemory_items WHERE user_id = $1 AND created_at < $2 ORDER BY created_at",
  [user_id, cutoff]
  )
  rows_to_items(rows)
@@ -106,7 +106,7 @@ module Llmemory
  def get_all_items(user_id)
  ensure_tables!
  rows = conn.exec_params(
- "SELECT id, category, content, source_resource_id, created_at FROM llmemory_items WHERE user_id = $1 ORDER BY created_at",
+ "SELECT id, category, content, source_resource_id, importance, created_at FROM llmemory_items WHERE user_id = $1 ORDER BY created_at",
  [user_id]
  )
  rows_to_items(rows)
@@ -125,7 +125,7 @@ module Llmemory
  ensure_tables!
  cutoff = (Time.now - (hours * 3600)).utc.iso8601
  rows = conn.exec_params(
- "SELECT id, category, content, source_resource_id, created_at FROM llmemory_items WHERE user_id = $1 AND created_at >= $2 ORDER BY created_at",
+ "SELECT id, category, content, source_resource_id, importance, created_at FROM llmemory_items WHERE user_id = $1 AND created_at >= $2 ORDER BY created_at",
  [user_id, cutoff]
  )
  rows_to_items(rows)
@@ -179,7 +179,7 @@ module Llmemory
 
  def list_items(user_id:, category: nil, limit: nil)
  ensure_tables!
- sql = "SELECT id, category, content, source_resource_id, created_at FROM llmemory_items WHERE user_id = $1"
+ sql = "SELECT id, category, content, source_resource_id, importance, created_at FROM llmemory_items WHERE user_id = $1"
  params = [user_id]
  if category
  sql += " AND category = $2"
@@ -257,10 +257,12 @@ module Llmemory
  category TEXT NOT NULL,
  content TEXT NOT NULL,
  source_resource_id TEXT,
+ importance REAL DEFAULT 0.7,
  created_at TIMESTAMPTZ NOT NULL
  );
  CREATE INDEX IF NOT EXISTS idx_llmemory_items_user_id ON llmemory_items(user_id);
  SQL
+ conn.exec("ALTER TABLE llmemory_items ADD COLUMN IF NOT EXISTS importance REAL DEFAULT 0.7") rescue nil
  conn.exec(<<~SQL)
  CREATE TABLE IF NOT EXISTS llmemory_categories (
  user_id TEXT NOT NULL,
@@ -279,6 +281,7 @@ module Llmemory
  category: r["category"],
  content: r["content"],
  source_resource_id: r["source_resource_id"],
+ importance: (r["importance"] || 0.7).to_f,
  created_at: Time.parse(r["created_at"])
  }
  end
@@ -24,7 +24,7 @@ module Llmemory
  id
  end
 
- def save_item(user_id, category:, content:, source_resource_id:)
+ def save_item(user_id, category:, content:, source_resource_id:, importance: 0.7)
  ensure_user_dir(user_id)
  seq = next_seq(user_id, "item_id_seq")
  id = "item_#{seq}"
@@ -34,6 +34,7 @@ module Llmemory
  category: category,
  content: content,
  source_resource_id: source_resource_id,
+ importance: importance,
  created_at: Time.now.iso8601
  }
  File.write(path, JSON.generate(data))
@@ -124,6 +125,29 @@ module Llmemory
  resource_ids.each { |id| File.delete(resource_path(user_id, id)) if File.file?(resource_path(user_id, id)) }
  end
 
+ def save_daily_log_entry(user_id, date, content)
+ ensure_user_dir(user_id, "memory")
+ path = daily_log_path(user_id, date)
+ existing = File.file?(path) ? File.read(path) : ""
+ entry = "#{Time.now.strftime('%H:%M')} #{content}\n"
+ File.write(path, existing + entry)
+ true
+ end
+
+ def load_daily_logs(user_id, from_date:, to_date:)
+ from_date = Date.parse(from_date.to_s) if from_date.is_a?(String)
+ to_date = Date.parse(to_date.to_s) if to_date.is_a?(String)
+ dir = user_path(user_id, "memory")
+ return [] unless Dir.exist?(dir)
+
+ (from_date..to_date).filter_map do |d|
+ path = daily_log_path(user_id, d)
+ next unless File.file?(path)
+
+ { date: d, content: File.read(path) }
+ end
+ end
+
  def list_users
  return [] unless Dir.exist?(@base_path)
  Dir.children(@base_path).select { |e| File.directory?(File.join(@base_path, e)) && !e.start_with?(".") }
@@ -162,6 +186,11 @@ module Llmemory
  File.join(user_path(user_id, "items"), "#{id}.json")
  end
 
+ def daily_log_path(user_id, date)
+ date_str = date.respond_to?(:strftime) ? date.strftime("%Y-%m-%d") : date.to_s
+ File.join(user_path(user_id, "memory"), "#{date_str}.md")
+ end
+
  def category_path(user_id, category_name)
  safe = category_name.to_s.gsub(%r{[^\w\-.]}, "_")
  File.join(user_path(user_id, "categories"), "#{safe}.md")
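The `daily_log_path` helper added above maps each day to one markdown file. The same naming scheme in a self-contained sketch (the `base` directory argument is hypothetical; the gem derives it from `user_path`):

```ruby
require "date"

# One markdown file per user per day, e.g. users/user_123/memory/2024-05-01.md
def daily_log_path(base, user_id, date)
  date_str = date.respond_to?(:strftime) ? date.strftime("%Y-%m-%d") : date.to_s
  File.join(base, user_id, "memory", "#{date_str}.md")
end

daily_log_path("users", "user_123", Date.new(2024, 5, 1))
# => "users/user_123/memory/2024-05-01.md"
```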
@@ -22,7 +22,7 @@ module Llmemory
  id
  end
 
- def save_item(user_id, category:, content:, source_resource_id:)
+ def save_item(user_id, category:, content:, source_resource_id:, importance: 0.7)
  @item_id_seq += 1
  id = "item_#{@item_id_seq}"
  @items[user_id] << {
@@ -30,6 +30,7 @@ module Llmemory
  category: category,
  content: content,
  source_resource_id: source_resource_id,
+ importance: importance,
  created_at: Time.now
  }
  id
@@ -5,6 +5,7 @@ require_relative "edge"
  require_relative "knowledge_graph"
  require_relative "conflict_resolver"
  require_relative "storage"
+ require_relative "../../noise_filter"
 
  module Llmemory
  module LongTerm
@@ -21,7 +22,10 @@ module Llmemory
  end
 
  def memorize(conversation_text)
- data = @extractor.extract(conversation_text) rescue { entities: [], relations: [] }
+ text = Llmemory.configuration.noise_filter_enabled ? NoiseFilter.filter?(conversation_text) : conversation_text.to_s
+ return true if text.strip.empty?
+
+ data = @extractor.extract(text) rescue { entities: [], relations: [] }
  data = { entities: [], relations: [] } unless data.is_a?(Hash)
  entities = Array(data[:entities] || data["entities"])
  relations = Array(data[:relations] || data["relations"])
@@ -10,7 +10,7 @@ module Llmemory
  properties: {
  user_id: { type: "string", description: "User identifier" },
  session_id: { type: "string", description: "Session identifier (default: 'default')" },
- role: { type: "string", enum: ["user", "assistant", "system"], description: "Message role" },
+ role: { type: "string", enum: ["user", "assistant", "system", "tool", "tool_result"], description: "Message role" },
  content: { type: "string", description: "Message content" }
  },
  required: ["user_id", "role", "content"]