get-claudia 1.55.13 → 1.55.15

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/CHANGELOG.md CHANGED
@@ -2,6 +2,17 @@
 
  All notable changes to Claudia will be documented in this file.
 
+ ## 1.55.15 (2026-03-18)
+
+ - **Fix mixed-timezone datetime crash** -- The memory daemon could crash with `can't subtract offset-naive and offset-aware datetimes` when recall or consolidation queries hit records with timezone suffixes (e.g., `+00:00` from email or transcript timestamps). Added a shared `parse_naive()` utility that strips timezone info on parse, applied across 14 locations in 5 files (recall.py, consolidate.py, server.py, vault_sync.py, canvas_generator.py). Replaces the older `[:19]` string truncation workaround. 615 tests pass.
+ - **License updated to PolyForm Noncommercial 1.0.0** -- README, package.json, and ARCHITECTURE.md now reflect the license change from Apache 2.0 to PolyForm NC. Free for personal, research, educational, and nonprofit use. Commercial licensing available via mail@kbanc.com.
+
+ ## 1.55.14 (2026-03-16)
+
+ - **LaunchAgent no longer bakes in --project-dir** -- The standalone background daemon now starts without a `--project-dir` argument. This forces a plist content change for all existing installs, which triggers an automatic LaunchAgent reload on next `claudia setup`, picking up the current Python daemon code. Previously, the plist could be identical across updates, leaving old daemon code running indefinitely even after `pip install --upgrade`.
+ - **Cleanup of orphaned empty hash databases** -- On each startup, if the database is already unified and empty hash-named DB files exist (created by stale old-code daemon processes), they are silently removed. Prevents phantom databases from accumulating in `~/.claudia/memory/`.
+ - **Root cause:** After the v1.55 consolidation, users whose LaunchAgent was running old pre-unified-DB code would see scheduled jobs (consolidation, backups, decay) operating against an empty `6af67351bcfa.db` while all real memories lived in `claudia.db`. Health check showed `schema_version: 0` and `-1` counts.
+
  ## 1.55.13 (2026-03-16)
 
  - **Gmail & Calendar MCPs as standard options** -- The standalone `gmail` (`@gongrzhe/server-gmail-autoauth-mcp`) and `google-calendar` (`@gongrzhe/server-calendar-autoauth-mcp`) servers are now first-class options alongside workspace-mcp. Two paths for Google integration: Option A (lightweight, focused, fewer tools) and Option B (all-in-one workspace-mcp with Drive, Docs, Sheets, etc.). Both can coexist.
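The mixed-timezone crash fixed in 1.55.15 is the standard naive-vs-aware pitfall in Python's `datetime`. A minimal sketch of the failure mode and of the strip-on-parse fix the changelog describes (the function body mirrors the `parse_naive()` utility added in this release; the example timestamp is made up):

```python
from datetime import datetime

def parse_naive(dt_string: str) -> datetime:
    """Parse an ISO datetime and drop any offset (timestamps are treated as UTC)."""
    dt = datetime.fromisoformat(dt_string)
    return dt.replace(tzinfo=None) if dt.tzinfo else dt

email_ts = "2026-03-18 09:00:00+00:00"  # offset-aware, e.g. from an email header
now = datetime.utcnow()                 # offset-naive

# now - datetime.fromisoformat(email_ts)  # TypeError: can't subtract
#                                         # offset-naive and offset-aware datetimes
age = now - parse_naive(email_ts)         # both naive: subtraction works
```

The same call also accepts already-naive strings unchanged, which is why one helper can replace both the `[:19]` truncation and the plain `fromisoformat` call sites.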
package/README.md CHANGED
@@ -11,7 +11,7 @@ Remembers your people. Catches your commitments. Learns how you work.
  <p align="center">
  <a href="https://github.com/kbanc85/claudia/stargazers"><img src="https://img.shields.io/github/stars/kbanc85/claudia?style=flat-square" alt="GitHub stars"></a>
  <a href="https://www.npmjs.com/package/get-claudia"><img src="https://img.shields.io/npm/v/get-claudia?style=flat-square" alt="npm version"></a>
- <a href="https://github.com/kbanc85/claudia/blob/main/LICENSE"><img src="https://img.shields.io/badge/license-Apache%202.0-blue?style=flat-square" alt="License"></a>
+ <a href="https://github.com/kbanc85/claudia/blob/main/LICENSE"><img src="https://img.shields.io/badge/license-PolyForm%20NC%201.0.0-purple?style=flat-square" alt="License"></a>
  </p>
 
  <p align="center">
@@ -479,7 +479,7 @@ This updates daemon code, skills, and rules while preserving your databases and
 
  ## Contributing
 
- Claudia is open source under the Apache 2.0 License.
+ Claudia is source-available under the PolyForm Noncommercial License 1.0.0.
 
  - **Template (skills, rules, identity):** `template-v2/`
  - **Memory daemon (Python):** `memory-daemon/` (tests: `cd memory-daemon && pytest tests/`)
@@ -491,9 +491,9 @@ Claudia is open source under the Apache 2.0 License.
 
  ## License
 
- [Apache License 2.0](LICENSE)
+ [PolyForm Noncommercial 1.0.0](LICENSE)
 
- Open source. Free for personal and commercial use. Attribution required.
+ Free for personal, research, educational, and nonprofit use. Commercial licensing: mail@kbanc.com
 
  ---
 
package/bin/index.js CHANGED
@@ -1004,7 +1004,7 @@ async function main() {
 
  // Register LaunchAgent for standalone daemon (macOS only)
  if (daemonOk && process.platform === 'darwin') {
- await ensureLaunchAgent(venvPython, targetPath);
+ await ensureLaunchAgent(venvPython);
  }
 
  // MCP Config step: verify .mcp.json is correct and check stdio server count
@@ -1575,7 +1575,7 @@ function checkMcpConfig(targetPath) {
  * The standalone daemon runs 24/7 for scheduled jobs (consolidation, decay, vault sync).
  * This is separate from the MCP daemon that Claude Code spawns per-session.
  */
- async function ensureLaunchAgent(venvPythonPath, projectDir) {
+ async function ensureLaunchAgent(venvPythonPath) {
  const plistDir = join(homedir(), 'Library', 'LaunchAgents');
  const plistPath = join(plistDir, 'com.claudia.memory.plist');
 
@@ -1591,8 +1591,6 @@ async function ensureLaunchAgent(venvPythonPath, projectDir) {
  <string>-m</string>
  <string>claudia_memory</string>
  <string>--standalone</string>
- <string>--project-dir</string>
- <string>${projectDir}</string>
  </array>
  <key>WorkingDirectory</key>
  <string>${join(homedir(), '.claudia', 'daemon')}</string>
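The 1.55.14 reload mechanism relies on the plist bytes actually changing between releases. A hedged sketch of the write-only-if-changed pattern in illustrative Python (the real installer is the `bin/index.js` code above; the `launchctl` calls are guarded so the sketch stays portable):

```python
import shutil
import subprocess
from pathlib import Path

def ensure_launch_agent(plist_path: Path, content: str) -> bool:
    """Rewrite the LaunchAgent plist only when its content changed; reload if so."""
    old = plist_path.read_text() if plist_path.exists() else None
    if old == content:
        return False  # identical plist: launchd keeps the old daemon running
    plist_path.parent.mkdir(parents=True, exist_ok=True)
    plist_path.write_text(content)
    if shutil.which("launchctl"):  # macOS only; no-op elsewhere
        subprocess.run(["launchctl", "unload", str(plist_path)], check=False)
        subprocess.run(["launchctl", "load", str(plist_path)], check=False)
    return True
```

Under this model, dropping `--project-dir` from the plist arguments guarantees a one-time content mismatch on every existing install, so the agent gets reloaded with the current daemon code.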
@@ -195,7 +195,16 @@ def _auto_consolidate() -> None:
  fetch=True,
  )
  if rows and rows[0]["value"] == "true":
- logger.debug("Database already unified, skipping consolidation")
+ # Unified. Clean up any empty hash DBs that stale daemon instances may have
+ # created (old standalone daemons running pre-unified-DB code create a fresh
+ # empty hash DB on startup if the original was deleted by consolidation).
+ all_hash_dbs = scan_hash_databases(memory_dir)
+ empty_dbs = [d for d in all_hash_dbs if not d["has_data"]]
+ if empty_dbs:
+ logger.info(
+ f"Removing {len(empty_dbs)} empty hash DB(s) left by stale standalone daemon"
+ )
+ cleanup_old_databases(memory_dir, empty_dbs)
  return
  except Exception:
  pass # _meta table might not exist yet
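`scan_hash_databases` and `cleanup_old_databases` are internal helpers not shown in this diff. A self-contained approximation of the emptiness check, assuming "has data" means a non-zero row count in the `memories` table (the criterion is an assumption, not confirmed by the diff):

```python
import sqlite3
from pathlib import Path

def find_empty_hash_dbs(memory_dir: Path) -> list[Path]:
    """Return hash-named *.db files in memory_dir whose memories table is empty."""
    empty = []
    for db_path in sorted(memory_dir.glob("*.db")):
        if db_path.name == "claudia.db":  # never touch the unified database
            continue
        conn = sqlite3.connect(db_path)
        try:
            count = conn.execute("SELECT COUNT(*) FROM memories").fetchone()[0]
        except sqlite3.Error:
            count = 0  # missing table / unreadable file: treat as empty
        finally:
            conn.close()
        if count == 0:
            empty.append(db_path)
    return empty

# Cleanup would then be: for path in find_empty_hash_dbs(dir): path.unlink()
```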
@@ -22,6 +22,7 @@ from mcp.types import (
  )
 
  from ..database import get_db
+ from ..utils import parse_naive
  from ..services.consolidate import (
  get_consolidate_service,
  get_predictions,
@@ -3190,9 +3191,9 @@ def _build_briefing() -> str:
  "SELECT updated_at FROM _meta WHERE key = 'unified_db'", fetch=True
  )
  if ts_row and ts_row[0]["updated_at"]:
- from datetime import datetime as _dt, timedelta as _td
- consolidated_at = _dt.fromisoformat(ts_row[0]["updated_at"][:19])
- if (_dt.utcnow() - consolidated_at) < _td(minutes=5):
+ from datetime import timedelta as _td
+ consolidated_at = parse_naive(ts_row[0]["updated_at"])
+ if (datetime.utcnow() - consolidated_at) < _td(minutes=5):
  # Just consolidated, include stats
  mem_row = db.execute("SELECT COUNT(*) as c FROM memories", fetch=True)
  ent_row = db.execute("SELECT COUNT(*) as c FROM entities WHERE deleted_at IS NULL", fetch=True)
@@ -3660,7 +3661,7 @@ def _build_morning_context() -> str:
  if stale:
  sections.append(f"## Stale Commitments ({len(stale)})\n")
  for c in stale:
- days_old = (datetime.utcnow() - datetime.fromisoformat(c["created_at"])).days
+ days_old = (datetime.utcnow() - parse_naive(c["created_at"])).days
  entities = c["entity_names"] or ""
  prefix = f"[{entities}] " if entities else ""
  sections.append(f"- {prefix}{c['content'][:100]} ({days_old}d old, importance: {c['importance']:.1f})")
@@ -28,6 +28,7 @@ from pathlib import Path
  from typing import Any, Dict, List, Optional, Tuple
 
  from ..database import get_db
+ from ..utils import parse_naive
 
  logger = logging.getLogger(__name__)
 
@@ -418,7 +419,7 @@ class CanvasGenerator:
  last = r["last_contact_at"]
  if last:
  try:
- days_ago = (datetime.utcnow() - datetime.fromisoformat(last[:19])).days
+ days_ago = (datetime.utcnow() - parse_naive(last)).days
  reconnect_lines.append(f"- [[{r['name']}]] ({trend}, {days_ago}d ago)")
  except (ValueError, TypeError):
  reconnect_lines.append(f"- [[{r['name']}]] ({trend})")
@@ -14,6 +14,7 @@ from typing import Any, Dict, List, Optional, Tuple
 
  from ..config import get_config
  from ..database import get_db
+ from ..utils import parse_naive
 
  logger = logging.getLogger(__name__)
 
@@ -366,7 +367,7 @@ class ConsolidateService:
  timestamps = []
  for r in rows:
  try:
- timestamps.append(datetime.fromisoformat(r["created_at"]))
+ timestamps.append(parse_naive(r["created_at"]))
  except (ValueError, TypeError):
  continue
 
@@ -538,7 +539,7 @@ class ConsolidateService:
  days_since = 0
  if entity["last_contact_at"]:
  try:
- last_dt = datetime.fromisoformat(entity["last_contact_at"])
+ last_dt = parse_naive(entity["last_contact_at"])
  days_since = int((now - last_dt).total_seconds() / 86400)
  except (ValueError, TypeError):
  pass
@@ -671,7 +672,7 @@ class ConsolidateService:
  for row in rows:
  days_since = None
  if row["last_mention"]:
- last_dt = datetime.fromisoformat(row["last_mention"])
+ last_dt = parse_naive(row["last_mention"])
  days_since = (datetime.utcnow() - last_dt).days
 
  severity = "warning" if days_since and days_since > 60 else "observation"
@@ -1377,7 +1378,7 @@ class ConsolidateService:
  )
 
  for commitment in old_commitments:
- created = datetime.fromisoformat(commitment["created_at"])
+ created = parse_naive(commitment["created_at"])
  days_old = (datetime.utcnow() - created).days
 
  if days_old > 3:
@@ -2302,7 +2303,7 @@ class ConsolidateService:
  velocity_parts.append(f"tier: {entity['attention_tier']}")
  if entity["last_contact_at"]:
  try:
- last_dt = datetime.fromisoformat(entity["last_contact_at"])
+ last_dt = parse_naive(entity["last_contact_at"])
  days_since = (datetime.utcnow() - last_dt).days
  velocity_parts.append(f"last contact: {days_since} days ago")
  except (ValueError, TypeError):
@@ -18,6 +18,7 @@ from typing import Any, Dict, List, Optional, Tuple
 
  from ..config import get_config
  from ..database import get_db
  from ..embeddings import embed_sync, get_embedding_service
+ from ..utils import parse_naive
  from ..extraction.entity_extractor import get_extractor
 
  logger = logging.getLogger(__name__)
@@ -240,7 +241,7 @@ class RecallService:
  row = vector_rows.get(mid)
  if row:
  try:
- created = datetime.fromisoformat(row["created_at"])
+ created = parse_naive(row["created_at"])
  recency_data[mid] = (now - created).total_seconds()
  except (ValueError, TypeError):
  recency_data[mid] = float("inf")
@@ -333,7 +334,7 @@ class RecallService:
  importance_score = row["importance"]
 
  # Recency score (configurable half-life decay)
- created = datetime.fromisoformat(row["created_at"])
+ created = parse_naive(row["created_at"])
  days_old = (now - created).days
  recency_score = math.exp(-days_old / self.config.recency_half_life_days)
 
@@ -2122,8 +2123,8 @@ class RecallService:
  results = []
  now = datetime.utcnow()
  for row in rows:
- source_last = datetime.fromisoformat(row["source_last_memory"])
- target_last = datetime.fromisoformat(row["target_last_memory"])
+ source_last = parse_naive(row["source_last_memory"])
+ target_last = parse_naive(row["target_last_memory"])
  most_recent = max(source_last, target_last)
  days_dormant = (now - most_recent).days
 
@@ -2513,7 +2514,7 @@ class RecallService:
  urgency = "later"
  if deadline_str:
  try:
- deadline_dt = datetime.fromisoformat(deadline_str)
+ deadline_dt = parse_naive(deadline_str)
  if deadline_dt < now:
  urgency = "overdue"
  elif deadline_dt < now + timedelta(days=1):
@@ -2705,12 +2706,9 @@ class RecallService:
  }
 
  try:
- last_dt = datetime.strptime(last_contact, "%Y-%m-%d %H:%M:%S")
- except (ValueError, TypeError):
- try:
- last_dt = datetime.fromisoformat(last_contact.replace("Z", "+00:00")).replace(tzinfo=None)
- except Exception:
- return {"entity": entity["name"], "status": "parse_error"}
+ last_dt = parse_naive(last_contact.replace("Z", "+00:00"))
+ except Exception:
+ return {"entity": entity["name"], "status": "parse_error"}
 
  now = datetime.utcnow()
  days_since = (now - last_dt).days
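The recency term in the recall hunk above (`math.exp(-days_old / recency_half_life_days)`) is untouched by this release, but its shape is worth a note: it is an e-folding decay, reaching 1/e (~0.368) rather than 0.5 at the configured "half-life". A standalone sketch with an assumed 30-day setting (the config default is not shown in this diff):

```python
import math

def recency_score(days_old: float, half_life_days: float = 30.0) -> float:
    # Same formula as the diff: score is 1/e (~0.368) at half_life_days.
    # A strict half-life would be 0.5 ** (days_old / half_life_days).
    return math.exp(-days_old / half_life_days)

recency_score(0.0)   # 1.0
recency_score(30.0)  # ~0.368
```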
@@ -42,6 +42,7 @@ from typing import Any, Dict, List, Optional, Tuple
 
  from ..config import get_config
  from ..database import get_db
+ from ..utils import parse_naive
 
  logger = logging.getLogger(__name__)
 
@@ -1234,7 +1235,7 @@ class VaultSyncService:
  last = w["last_contact_at"]
  if last:
  try:
- days_ago = (datetime.utcnow() - datetime.fromisoformat(last[:19])).days
+ days_ago = (datetime.utcnow() - parse_naive(last)).days
  lines.append(f"- [[{w['name']}]] - {trend} ({days_ago}d)")
  except (ValueError, TypeError):
  lines.append(f"- [[{w['name']}]] - {trend}")
@@ -1599,7 +1600,7 @@ class VaultSyncService:
  last_contact = p["last_contact_at"]
  if last_contact:
  try:
- dt = datetime.fromisoformat(last_contact[:19])
+ dt = parse_naive(last_contact)
  days_ago = (now - dt).days
  last_str = f"{days_ago}d ago"
  except (ValueError, TypeError):
@@ -0,0 +1,22 @@
+ """
+ Shared utilities for Claudia Memory System.
+ """
+
+ from datetime import datetime
+
+
+ def parse_naive(dt_string: str) -> datetime:
+ """Parse an ISO datetime string and strip timezone info.
+
+ The database stores a mix of naive and offset-aware datetimes.
+ External sources (emails, transcripts, calendar events) often include
+ timezone suffixes like +00:00 or Z. Since all timestamps are treated
+ as UTC internally, we strip tzinfo to avoid:
+
+ TypeError: can't subtract offset-naive and offset-aware datetimes
+
+ This is used everywhere a parsed timestamp participates in arithmetic
+ with datetime.utcnow() (which returns a naive datetime).
+ """
+ dt = datetime.fromisoformat(dt_string)
+ return dt.replace(tzinfo=None) if dt.tzinfo else dt
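One portability note on `parse_naive()`: it delegates to `datetime.fromisoformat`, which rejects a bare `Z` suffix before Python 3.11. That is presumably why the recall.py call site above normalizes `Z` to `+00:00` before calling it:

```python
from datetime import datetime

ts = "2026-03-18T09:00:00Z"
# datetime.fromisoformat(ts) raises ValueError on Python < 3.11;
# normalizing the suffix keeps the parse working on older interpreters:
dt = datetime.fromisoformat(ts.replace("Z", "+00:00")).replace(tzinfo=None)
```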
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "get-claudia",
- "version": "1.55.13",
+ "version": "1.55.15",
  "description": "An AI assistant who learns how you work.",
  "keywords": [
  "claudia",
@@ -16,7 +16,7 @@
  "adaptive"
  ],
  "author": "Kamil Banc",
- "license": "Apache-2.0",
+ "license": "SEE LICENSE IN LICENSE",
  "repository": {
  "type": "git",
  "url": "git+https://github.com/kbanc85/claudia.git"