memory-journal-mcp 7.0.1 → 7.1.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -10,7 +10,7 @@
  [![MCP Registry](https://img.shields.io/badge/MCP_Registry-Published-green)](https://registry.modelcontextprotocol.io/v0/servers?search=io.github.neverinfamous/memory-journal-mcp)
  [![Security](https://img.shields.io/badge/Security-Enhanced-green.svg)](SECURITY.md)
  [![TypeScript](https://img.shields.io/badge/TypeScript-Strict-blue.svg)](https://github.com/neverinfamous/memory-journal-mcp)
- ![Coverage](https://img.shields.io/badge/Coverage-97.52%25-brightgreen.svg)
+ ![Coverage](https://img.shields.io/badge/Coverage-97.12%25-brightgreen.svg)
  ![Tests](https://img.shields.io/badge/Tests-1782_passed-brightgreen.svg)
  ![E2E Tests](https://img.shields.io/badge/E2E_Tests-391_passed-brightgreen.svg)
  [![CI](https://github.com/neverinfamous/memory-journal-mcp/actions/workflows/gatekeeper.yml/badge.svg)](https://github.com/neverinfamous/memory-journal-mcp/actions/workflows/gatekeeper.yml)
@@ -28,7 +28,7 @@

  ### What Sets Us Apart

- **61 MCP Tools** · **17 Workflow Prompts** · **38 Resources** · **10 Tool Groups** · **Code Mode** · **GitHub Commander** (Issue Triage, PR Review, Milestone Sprints, Security/Quality/Perf Audits) · **GitHub Integration** (Issues, PRs, Actions, Kanban, Milestones, Insights) · **Team Collaboration** (Shared DB, Vector Search, Cross-Project Insights)
+ **65 MCP Tools** · **17 Workflow Prompts** · **38 Resources** · **10 Tool Groups** · **Code Mode** · **GitHub Commander** (Issue Triage, PR Review, Milestone Sprints, Security/Quality/Perf Audits) · **GitHub Integration** (Issues, PRs, Actions, Kanban, Milestones, Insights) · **Team Collaboration** (Shared DB, Vector Search, Cross-Project Insights)

  | Feature | Description |
  | ----------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------- |
@@ -40,7 +40,8 @@
  | **Code Mode** | Execute multi-step operations in a secure sandbox — up to 90% token savings via `mj.*` API |
  | **Configurable Briefing** | 12 env vars / CLI flags control `memory://briefing` content — entries, team, GitHub detail, skills awareness |
  | **Reports & Analytics** | Standups, retrospectives, PR summaries, digests, period analyses, and milestone tracking |
- | **Team Collaboration** | 20 tools with full parity — CRUD, vector search, relationship graphs, cross-project insights, author attribution |
+ | **Team Collaboration** | 22 tools with full parity — CRUD, vector search, relationship graphs, cross-project insights, author attribution |
+ | **Data Interoperability** | Bidirectional Markdown round-tripping, a unified IO namespace, and schema-safe JSON exports with path traversal defenses |
  | **Backup & Restore** | One-command backup/restore with automated scheduling, retention policies, and safety-net auto-backups |
  | **Security & Transport** | OAuth 2.1 (RFC 9728/8414, JWT/JWKS, scopes), Streamable HTTP + SSE, rate limiting, CORS, SQL injection prevention, non-root Docker |
  | **Structured Error Handling** | Every tool returns `{success, error, code, category, suggestion, recoverable}` — agents get classification, remediation hints, and recoverability signals |
@@ -149,18 +150,18 @@ Control which tools are exposed via `MEMORY_JOURNAL_MCP_TOOL_FILTER` (or CLI: `-

  | Filter | Tools | Use Case |
  | -------------------- | ----- | ------------------------ |
- | `full` | 61 | All tools (default) |
+ | `full` | 65 | All tools (default) |
  | `starter` | ~11 | Core + search + codemode |
  | `essential` | ~7 | Minimal footprint |
  | `readonly` | ~15 | Disable all mutations |
- | `-github` | 45 | Exclude a group |
- | `-github,-analytics` | 43 | Exclude multiple groups |
+ | `-github` | 49 | Exclude a group |
+ | `-github,-analytics` | 47 | Exclude multiple groups |

  **Filter Syntax:** `shortcut` or `group` or `tool_name` (whitelist mode) · `-group` (disable group) · `-tool` (disable tool) · `+tool` (re-enable after group disable)

  **Custom Selection:** List individual tool names to create your own whitelist: `--tool-filter "create_entry,search_entries,semantic_search"`

- **Groups:** `core`, `search`, `analytics`, `relationships`, `export`, `admin`, `github`, `backup`, `team`, `codemode`
+ **Groups:** `core`, `search`, `analytics`, `relationships`, `io`, `admin`, `github`, `backup`, `team`, `codemode`

  **[Complete tool filtering guide →](https://github.com/neverinfamous/memory-journal-mcp/wiki/Tool-Filtering)**

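The `-group` / `-tool` / `+tool` syntax above can be sketched as a small set-difference pass. This is an illustrative sketch of the grammar only, not the server's actual `parseToolFilter` implementation (whitelist mode is omitted here); the group map used is hypothetical.

```javascript
// Illustrative sketch of the "-group,-tool,+tool" filter grammar described above.
// NOT the server's parseToolFilter; whitelist mode is intentionally omitted.
function applyFilter(allTools, groups, filterExpr) {
  const enabled = new Set(allTools);
  for (const token of filterExpr.split(",").map((t) => t.trim())) {
    if (token.startsWith("-")) {
      const name = token.slice(1);
      // "-name" may name a whole group or a single tool.
      const targets = groups[name] ?? [name];
      for (const t of targets) enabled.delete(t);
    } else if (token.startsWith("+")) {
      // "+tool" re-enables a tool after a group disable.
      enabled.add(token.slice(1));
    }
  }
  return [...enabled];
}
```

For example, `-github,+get_github_issues` would drop the whole (hypothetical) `github` group and then restore one tool from it.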
@@ -168,20 +169,20 @@ Control which tools are exposed via `MEMORY_JOURNAL_MCP_TOOL_FILTER` (or CLI: `-

  ## 📋 Core Capabilities

- ### 🛠️ **61 MCP Tools** (10 Groups)
-
- | Group | Tools | Description |
- | --------------- | ----- | ------------------------------------------------------------------------------------------------------------------- |
- | `codemode` | 1 | Code Mode (sandboxed code execution) 🌟 **Recommended** |
- | `core` | 6 | Entry CRUD, tags, test |
- | `search` | 4 | Text search, date range, semantic, vector stats |
- | `analytics` | 2 | Statistics, cross-project insights |
- | `relationships` | 2 | Link entries, visualize graphs |
- | `export` | 1 | JSON/Markdown export |
- | `admin` | 5 | Update, delete, rebuild/add to vector index, merge tags |
- | `github` | 16 | Issues, PRs, context, Kanban, **Milestones**, **Insights**, **issue lifecycle**, **Copilot Reviews** |
- | `backup` | 4 | Backup, list, restore, cleanup |
- | `team` | 20 | CRUD, search, stats, relationships, export, backup, vector search, cross-project insights (requires `TEAM_DB_PATH`) |
+ ### 🛠️ **65 MCP Tools** (10 Groups)
+
+ | Group | Tools | Description |
+ | --------------- | ----- | ---------------------------------------------------------------------------------------------------------------------------------------- |
+ | `codemode` | 1 | Code Mode (sandboxed code execution) 🌟 **Recommended** |
+ | `core` | 6 | Entry CRUD, tags, test |
+ | `search` | 4 | Text search, date range, semantic, vector stats |
+ | `analytics` | 2 | Statistics, cross-project insights |
+ | `relationships` | 2 | Link entries, visualize graphs |
+ | `io` | 3 | JSON/Markdown export plus file-level Markdown import/export (round-trip) |
+ | `admin` | 5 | Update, delete, rebuild/add to vector index, merge tags |
+ | `github` | 16 | Issues, PRs, context, Kanban, **Milestones**, **Insights**, **issue lifecycle**, **Copilot Reviews** |
+ | `backup` | 4 | Backup, list, restore, cleanup |
+ | `team` | 22 | CRUD, search, stats, relationships, IO (Markdown import/export), backup, vector search, cross-project insights (requires `TEAM_DB_PATH`) |

  **[Complete tools reference →](https://github.com/neverinfamous/memory-journal-mcp/wiki/Tools)**

@@ -268,7 +269,7 @@ Code executes in a **sandboxed VM context** with multiple layers of security. Al

  ### ⚡ Code Mode Only (Maximum Token Savings)

- Run with **only Code Mode enabled** — a single tool that provides access to all 61 tools' worth of capability through the `mj.*` API:
+ Run with **only Code Mode enabled** — a single tool that provides access to all 65 tools' worth of capability through the `mj.*` API:

  ```json
  {
@@ -554,7 +555,7 @@ For production deployments, enable OAuth 2.1 authentication on the HTTP transpor

  | Scope | Tool Groups |
  | ------- | ------------------------------------------------- |
- | `read` | core, search, analytics, relationships, export |
+ | `read` | core, search, analytics, relationships, io |
  | `write` | github, team (+ all read groups) |
  | `admin` | admin, backup, codemode (+ all write/read groups) |

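The scope table above reads as a cumulative hierarchy (`write` includes all `read` groups, `admin` includes everything). A minimal sketch of that assumed semantics — not the server's actual `hasScope`/`getRequiredScope` code, which is not shown in this diff:

```javascript
// Sketch of the cumulative scope model implied by the table above.
// Assumed semantics for illustration; not the server's implementation.
const SCOPE_GROUPS = {
  read: ["core", "search", "analytics", "relationships", "io"],
  write: ["github", "team"],
  admin: ["admin", "backup", "codemode"],
};
const SCOPE_ORDER = ["read", "write", "admin"];

function groupsForScope(scope) {
  const idx = SCOPE_ORDER.indexOf(scope);
  // Each scope grants its own groups plus every lower scope's groups.
  return SCOPE_ORDER.slice(0, idx + 1).flatMap((s) => SCOPE_GROUPS[s]);
}
```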
@@ -628,7 +629,7 @@ flowchart TB
  AI["🤖 AI Agent<br/>(Cursor, Windsurf, Claude)"]

  subgraph MCP["Memory Journal MCP Server"]
- Tools["🛠️ 61 Tools"]
+ Tools["🛠️ 65 Tools"]
  Resources["📡 38 Resources"]
  Prompts["💬 17 Prompts"]
  end
@@ -657,7 +658,7 @@ flowchart TB
  ┌─────────────────────────────────────────────────────────────┐
  │ MCP Server Layer (TypeScript) │
  │ ┌─────────────────┐ ┌─────────────────┐ ┌─────────────┐ │
- │ │ Tools (61) │ │ Resources (38) │ │ Prompts (17)│ │
+ │ │ Tools (65) │ │ Resources (38) │ │ Prompts (17)│ │
  │ │ with Annotations│ │ with Annotations│ │ │ │
  │ └─────────────────┘ └─────────────────┘ └─────────────┘ │
  ├─────────────────────────────────────────────────────────────┤
@@ -222,6 +222,13 @@ function assertNoPathTraversal(filename) {
    throw new PathTraversalError(filename);
  }
  }
+ function assertSafeDirectoryPath(dirPath) {
+   const normalized = dirPath.replace(/\\/g, "/");
+   const segments = normalized.split("/");
+   if (segments.some((s) => s === "..")) {
+     throw new PathTraversalError(dirPath);
+   }
+ }
  var TOKEN_PATTERNS = [
  // GitHub personal access tokens (classic and fine-grained)
  /ghp_[A-Za-z0-9_]{36,}/g,
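The new `assertSafeDirectoryPath` above rejects any path containing a `..` segment after normalizing backslashes. A standalone predicate version of the same check, for illustration:

```javascript
// Standalone sketch of the ".." segment check assertSafeDirectoryPath performs:
// normalize backslashes to forward slashes, split into segments, and reject
// any segment that is exactly "..".
function isSafeDirectoryPath(dirPath) {
  const segments = dirPath.replace(/\\/g, "/").split("/");
  return !segments.some((s) => s === "..");
}
```

Note that checking whole segments (rather than the substring `..`) correctly allows legitimate names like `my..dir` while still catching `..\etc` on Windows-style paths.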
@@ -1656,4 +1663,4 @@ var GitHubIntegration = class {
  }
  };

- export { ConfigurationError, ConnectionError, GitHubIntegration, MemoryJournalMcpError, QueryError, ResourceNotFoundError, ValidationError, assertNoPathTraversal, logger, matchSuggestion, resolveAuthor, validateDateFormatPattern };
+ export { ConfigurationError, ConnectionError, GitHubIntegration, MemoryJournalMcpError, QueryError, ResourceNotFoundError, ValidationError, assertNoPathTraversal, assertSafeDirectoryPath, logger, matchSuggestion, resolveAuthor, validateDateFormatPattern };
@@ -1,5 +1,5 @@
- import { withSessionInit, withPriority, ASSISTANT_FOCUSED, TOOL_GROUPS, HIGH_PRIORITY, LOW_PRIORITY, MEDIUM_PRIORITY, setDefaultSandboxMode, initializeAuditLogger, parseToolFilter, getFilterSummary, getToolFilterFromEnv, getTools, getEnabledGroups, callTool, getGlobalAuditLogger, sendProgress, SUPPORTED_SCOPES, getRequiredScope, hasScope, getAuditResourceDef, execQuery, transformEntryRow, resolveGitHubRepo, isResourceError, milestoneCompletionPct, parseScopes, BASE_SCOPES, getAllToolNames, globalMetrics, DEFAULT_BRIEFING_CONFIG } from './chunk-2BJHLTYP.js';
- import { logger, GitHubIntegration, ConfigurationError, ResourceNotFoundError, ConnectionError, QueryError, assertNoPathTraversal, ValidationError, MemoryJournalMcpError, validateDateFormatPattern } from './chunk-ARLH46WS.js';
+ import { withSessionInit, withPriority, ASSISTANT_FOCUSED, TOOL_GROUPS, HIGH_PRIORITY, LOW_PRIORITY, MEDIUM_PRIORITY, setDefaultSandboxMode, initializeAuditLogger, parseToolFilter, getFilterSummary, getToolFilterFromEnv, getTools, getEnabledGroups, callTool, getGlobalAuditLogger, sendProgress, SUPPORTED_SCOPES, getRequiredScope, hasScope, getAuditResourceDef, execQuery, transformEntryRow, resolveGitHubRepo, isResourceError, milestoneCompletionPct, parseScopes, BASE_SCOPES, getAllToolNames, globalMetrics, DEFAULT_BRIEFING_CONFIG } from './chunk-JEGRDY6W.js';
+ import { logger, GitHubIntegration, ConfigurationError, ResourceNotFoundError, ConnectionError, QueryError, assertNoPathTraversal, ValidationError, MemoryJournalMcpError, validateDateFormatPattern } from './chunk-37BQOJDZ.js';
  import { createRequire } from 'module';
  import { McpServer, ResourceTemplate } from '@modelcontextprotocol/sdk/server/mcp.js';
  import { StdioServerTransport } from '@modelcontextprotocol/sdk/server/stdio.js';
@@ -2445,10 +2445,10 @@ var CODE_MODE_NAMESPACE_ROWS = [
    example: '`mj.relationships.linkEntries(1, 2, "implements")`'
  },
  {
-   group: "export",
-   label: "Export",
-   namespace: "`mj.export.*`",
-   example: '`mj.export.exportEntries("json")`'
+   group: "io",
+   label: "IO",
+   namespace: "`mj.io.*`",
+   example: '`mj.io.exportEntries("json")`'
  },
  {
    group: "admin",
@@ -2502,7 +2502,7 @@ This executes JavaScript in a sandboxed environment with all tools available as
  | Search | \`mj.search.*\` | \`mj.search.searchEntries("performance")\` |
  | Analytics | \`mj.analytics.*\` | \`mj.analytics.getStatistics()\` |
  | Relationships | \`mj.relationships.*\` | \`mj.relationships.linkEntries(1, 2, "implements")\` |
- | Export | \`mj.export.*\` | \`mj.export.exportEntries("json")\` |
+ | IO | \`mj.io.*\` | \`mj.io.exportEntries("json")\` |
  | Admin | \`mj.admin.*\` | \`mj.admin.rebuildVectorIndex()\` |
  | GitHub | \`mj.github.*\` | \`mj.github.getGithubIssues({ state: "open" })\` |
  | Backup | \`mj.backup.*\` | \`mj.backup.backupJournal()\` |
@@ -2626,7 +2626,7 @@ var GOTCHAS_CONTENT = `# memory-journal-mcp \u2014 Field Notes & Gotchas

  - **Team cross-database search**: \`search_entries\` and \`search_by_date_range\` automatically merge team DB results when \`TEAM_DB_PATH\` is configured. Results include a \`source\` field ("personal" or "team").
  - **Team vector search**: Team has its own isolated vector index. Use \`team_rebuild_vector_index\` if the team index drifts. \`team_semantic_search\` works identically to personal \`semantic_search\`.
- - **Team tools without \`TEAM_DB_PATH\`**: All 20 team tools return \`{ success: false, error: "Team collaboration is not configured..." }\` \u2014 no crash, no partial results.
+ - **Team tools without \`TEAM_DB_PATH\`**: All 22 team tools return \`{ success: false, error: "Team collaboration is not configured..." }\` \u2014 no crash, no partial results.
  `;
  function generateInstructions(enabledTools, prompts, latestEntry, level = "standard", enabledGroups) {
    const groups = enabledGroups ?? getEnabledGroups(enabledTools);
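The gotcha above says unconfigured team tools return a structured error rather than crashing, and the README documents the envelope as `{success, error, code, category, suggestion, recoverable}`. A hypothetical helper showing that shape — field values here are illustrative, not the server's actual error codes:

```javascript
// Hypothetical example of the structured-error envelope the README documents:
// {success, error, code, category, suggestion, recoverable}.
// Values are illustrative only, not the server's actual codes or messages.
function teamNotConfiguredError() {
  return {
    success: false,
    error: "Team collaboration is not configured (TEAM_DB_PATH is not set)",
    code: "TEAM_NOT_CONFIGURED",       // assumed code for illustration
    category: "configuration",         // assumed category
    suggestion: "Set TEAM_DB_PATH to a shared SQLite database path",
    recoverable: true,                 // the agent can fix config and retry
  };
}
```

The point of the envelope is that an agent can branch on `recoverable` and surface `suggestion` instead of parsing free-form error strings.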
@@ -4692,7 +4692,7 @@ var GROUP_DESCRIPTIONS = {
    search: "Full-text search, semantic search, date range queries, and vector index stats",
    analytics: "Entry analytics, importance scoring, and productivity trends",
    relationships: "Link and visualize relationships between journal entries",
-   export: "Export journal data in various formats (JSON, Markdown, CSV)",
+   io: "Import and export journal data \u2014 JSON export, Markdown round-trip (export/import)",
    admin: "Update entries, rebuild indexes, and manage the vector search index",
    github: "GitHub integration \u2014 issues, PRs, milestones, workflow runs, Kanban boards",
    backup: "Create and restore database backups",
@@ -4894,7 +4894,7 @@ function getHelpResourceDefinitions() {
  var toolIndexModule = null;
  async function getAllToolDefinitionsAsync(context) {
    try {
-     toolIndexModule ??= await import('./tools-FFFGXIKN.js');
+     toolIndexModule ??= await import('./tools-O44Q52RD.js');
      if (toolIndexModule === null) return [];
      const tools = toolIndexModule.getTools(context.db, null);
      return tools.map((t) => ({
@@ -4935,8 +4935,10 @@ function inferGroupFromName(name) {
    // relationships (2)
    link_entries: "relationships",
    visualize_relationships: "relationships",
-   // export (1)
-   export_entries: "export",
+   // io (3)
+   export_entries: "io",
+   export_markdown: "io",
+   import_markdown: "io",
    // admin (5)
    update_entry: "admin",
    delete_entry: "admin",
@@ -1,14 +1,15 @@
  import { transformAutoReturn } from './chunk-OKOVZ5QE.js';
- import { GitHubIntegration, resolveAuthor, logger, MemoryJournalMcpError, matchSuggestion, ConfigurationError } from './chunk-ARLH46WS.js';
+ import { GitHubIntegration, resolveAuthor, logger, assertSafeDirectoryPath, MemoryJournalMcpError, matchSuggestion, ConfigurationError } from './chunk-37BQOJDZ.js';
  import { z, ZodError } from 'zod';
+ import { open, stat, mkdir, rename, appendFile, readdir, readFile } from 'fs/promises';
+ import { tmpdir } from 'os';
+ import * as path from 'path';
+ import { dirname, resolve, sep, join } from 'path';
  import * as vm from 'vm';
  import { MessageChannel, Worker } from 'worker_threads';
  import * as crypto2 from 'crypto';
  import { fileURLToPath } from 'url';
- import * as path from 'path';
- import { dirname } from 'path';
  import { performance as performance$1 } from 'perf_hooks';
- import { open, stat, mkdir, rename, appendFile } from 'fs/promises';

  function formatZodError(error) {
    return error.issues.map((issue) => {
@@ -58,7 +59,7 @@ async function resolveIssueUrl(context, projectNumber, issueNumber, existingUrl)
    ([_, v]) => v.project_number === projectNumber
  );
  if (entry) {
-   const { GitHubIntegration: GitHubIntegration2 } = await import('./github-integration-PDRLXKGM.js');
+   const { GitHubIntegration: GitHubIntegration2 } = await import('./github-integration-FOJ4U6I5.js');
    const targetGithub = new GitHubIntegration2(entry[1].path);
    const repoInfo = await targetGithub.getRepoInfo();
    if (repoInfo.owner && repoInfo.repo) {
@@ -1261,12 +1262,23 @@ var LinkEntriesSchemaMcp = z.object({
  var VisualizeInputSchema = z.object({
    entry_id: z.number().optional().describe("Specific entry ID to visualize (shows connected entries)"),
    tags: z.array(z.string()).optional().describe("Filter entries by tags"),
+   relationship_type: z.enum([
+     "evolves_from",
+     "references",
+     "implements",
+     "clarifies",
+     "response_to",
+     "blocked_by",
+     "resolved",
+     "caused"
+   ]).optional().describe("Filter to show only this relationship type"),
    depth: z.number().min(1).max(3).optional().default(2).describe("Relationship traversal depth"),
    limit: z.number().max(500).optional().default(20).describe("Maximum entries to include")
  });
  var VisualizeInputSchemaMcp = z.object({
    entry_id: relaxedNumber().optional().describe("Specific entry ID to visualize (shows connected entries)"),
    tags: z.array(z.string()).optional().describe("Filter entries by tags"),
+   relationship_type: z.string().optional().describe("Filter to show only this relationship type (e.g., blocked_by, implements)"),
    depth: relaxedNumber().optional().default(2).describe("Relationship traversal depth"),
    limit: relaxedNumber().optional().default(20).describe("Maximum entries to include")
  });
@@ -1463,15 +1475,18 @@ function getRelationshipTools(context) {
  }
  const entryIds = Object.keys(entries).map(Number);
  const placeholders = entryIds.map(() => "?").join(",");
- const relsResult = db.executeRawQuery(
-   `
+ let relsQuery = `
  SELECT from_entry_id, to_entry_id, relationship_type
  FROM relationships
  WHERE from_entry_id IN (${placeholders})
  AND to_entry_id IN (${placeholders})
- `,
-   [...entryIds, ...entryIds]
- );
+ `;
+ const relsParams = [...entryIds, ...entryIds];
+ if (input.relationship_type) {
+   relsQuery += " AND relationship_type = ?";
+   relsParams.push(input.relationship_type);
+ }
+ const relsResult = db.executeRawQuery(relsQuery, relsParams);
  const relationships = relsResult[0]?.values ?? [];
  const MERMAID_CONTENT_PREVIEW_LENGTH = 40;
  let mermaid = "```mermaid\\ngraph TD\\n";
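The change above appends the optional `relationship_type` clause and its bound parameter together, so user input never reaches the SQL string. The pattern, isolated as a self-contained sketch (the `db.executeRawQuery` call itself is not reproduced here):

```javascript
// Sketch of the query-building pattern in the hunk above: the optional filter
// adds an "AND ... = ?" clause and pushes the value onto the params array in
// the same branch, keeping SQL text and bindings in sync.
function buildRelsQuery(entryIds, relationshipType) {
  const placeholders = entryIds.map(() => "?").join(",");
  let sql =
    "SELECT from_entry_id, to_entry_id, relationship_type FROM relationships " +
    `WHERE from_entry_id IN (${placeholders}) AND to_entry_id IN (${placeholders})`;
  const params = [...entryIds, ...entryIds]; // each IN list binds the ids once
  if (relationshipType) {
    sql += " AND relationship_type = ?";
    params.push(relationshipType);
  }
  return { sql, params };
}
```

Only the `?` placeholders are interpolated from counted ids; the filter value travels exclusively through `params`.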
@@ -1554,8 +1569,345 @@ async function sendProgress(ctx, progress, total, message) {
  } catch {
  }
  }
+ var FrontmatterSchema = z.object({
+   mj_id: z.number().int().positive().optional(),
+   entry_type: z.string().optional(),
+   author: z.string().optional(),
+   tags: z.array(z.string()).optional(),
+   timestamp: z.string().optional(),
+   significance: z.string().optional(),
+   relationships: z.array(
+     z.object({
+       type: z.string(),
+       target_id: z.number().int().positive()
+     })
+   ).optional(),
+   source: z.string().optional()
+ });
+ function serializeFrontmatter(data) {
+   if (data === null || data === void 0) return "";
+   const lines = ["---"];
+   if (data.mj_id !== void 0) lines.push(`mj_id: ${String(data.mj_id)}`);
+   if (data.entry_type !== void 0) lines.push(`entry_type: ${data.entry_type}`);
+   if (data.author !== void 0) lines.push(`author: ${data.author}`);
+   if (data.timestamp !== void 0) lines.push(`timestamp: ${data.timestamp}`);
+   if (data.significance !== void 0) lines.push(`significance: ${data.significance}`);
+   if (data.source !== void 0) lines.push(`source: ${data.source}`);
+   if (data.tags && data.tags.length > 0) {
+     lines.push("tags:");
+     for (const tag of data.tags) {
+       lines.push(`  - ${tag}`);
+     }
+   }
+   if (data.relationships !== void 0 && data.relationships.length > 0) {
+     lines.push("relationships:");
+     for (const rel of data.relationships) {
+       lines.push(`  - type: ${rel.type}`);
+       lines.push(`    target_id: ${String(rel.target_id)}`);
+     }
+   }
+   if (lines.length === 1) {
+     return "";
+   }
+   lines.push("---");
+   return lines.join("\n") + "\n";
+ }
+ function parseFrontmatter(content) {
+   const lines = content.split("\n");
+   if (lines[0]?.trim() !== "---") {
+     return { metadata: {}, body: content };
+   }
+   let closingIndex = -1;
+   for (let i = 1; i < lines.length; i++) {
+     if (lines[i]?.trim() === "---") {
+       closingIndex = i;
+       break;
+     }
+   }
+   if (closingIndex === -1) {
+     return { metadata: {}, body: content };
+   }
+   const fmLines = lines.slice(1, closingIndex);
+   const raw = {};
+   let currentKey = null;
+   let currentArray = null;
+   let isRelationshipArray = false;
+   let currentRelObj = null;
+   for (const line of fmLines) {
+     if (line.trim() === "") continue;
+     if (/^\s{2,}- /.test(line)) {
+       const itemContent = line.replace(/^\s{2,}- /, "");
+       if (isRelationshipArray && currentKey === "relationships") {
+         const colonIdx2 = itemContent.indexOf(":");
+         if (colonIdx2 !== -1) {
+           if (currentRelObj !== null) {
+             currentArray?.push(currentRelObj);
+           }
+           currentRelObj = {};
+           const key = itemContent.slice(0, colonIdx2).trim();
+           const value = itemContent.slice(colonIdx2 + 1).trim();
+           currentRelObj[key] = parseScalar(value);
+         }
+       } else if (currentArray !== null) {
+         currentArray.push(itemContent.trim());
+       }
+       continue;
+     }
+     if (/^\s{4,}\w/.test(line) && currentRelObj !== null) {
+       const colonIdx2 = line.indexOf(":");
+       if (colonIdx2 !== -1) {
+         const key = line.slice(0, colonIdx2).trim();
+         const value = line.slice(colonIdx2 + 1).trim();
+         currentRelObj[key] = parseScalar(value);
+       }
+       continue;
+     }
+     if (currentKey !== null && currentArray !== null) {
+       if (isRelationshipArray && currentRelObj !== null) {
+         currentArray.push(currentRelObj);
+         currentRelObj = null;
+       }
+       raw[currentKey] = currentArray;
+       currentArray = null;
+       currentKey = null;
+       isRelationshipArray = false;
+     }
+     const colonIdx = line.indexOf(":");
+     if (colonIdx !== -1) {
+       const key = line.slice(0, colonIdx).trim();
+       const value = line.slice(colonIdx + 1).trim();
+       if (value === "") {
+         currentKey = key;
+         currentArray = [];
+         isRelationshipArray = key === "relationships";
+       } else {
+         raw[key] = parseScalar(value);
+       }
+     }
+   }
+   if (currentKey !== null && currentArray !== null) {
+     if (isRelationshipArray && currentRelObj !== null) {
+       currentArray.push(currentRelObj);
+     }
+     raw[currentKey] = currentArray;
+   }
+   const parseResult = FrontmatterSchema.safeParse(raw);
+   if (!parseResult.success) {
+     const issues = parseResult.error.issues.map((i) => `${i.path.join(".")}: ${i.message}`);
+     throw new Error(`Invalid frontmatter: ${issues.join("; ")}`);
+   }
+   const body = lines.slice(closingIndex + 1).join("\n").replace(/^\n/, "");
+   return { metadata: parseResult.data, body };
+ }
+ function parseScalar(value) {
+   if (/^-?\d+$/.test(value)) return parseInt(value, 10);
+   if (/^-?\d+\.\d+$/.test(value)) return parseFloat(value);
+   if (value === "true") return true;
+   if (value === "false") return false;
+   if (value.startsWith('"') && value.endsWith('"') || value.startsWith("'") && value.endsWith("'")) {
+     return value.slice(1, -1);
+   }
+   return value;
+ }
+ function generateSlug(content) {
+   const slug = content.slice(0, 50).toLowerCase().replace(/[^a-z0-9]+/g, "-").replace(/^-+|-+$/g, "").replace(/-{2,}/g, "-");
+   return slug || "untitled";
+ }
+ function generateFilename(id, content) {
+   return `${String(id)}-${generateSlug(content)}.md`;
+ }
+ async function exportEntriesToMarkdown(entries, outputDir, db) {
+   const resolvedOutputDir = resolve(outputDir);
+   const resolvedTmpDir = resolve(tmpdir());
+   const tmpPrefix = resolvedTmpDir.endsWith(sep) ? resolvedTmpDir : resolvedTmpDir + sep;
+   if (resolvedOutputDir === resolvedTmpDir || resolvedOutputDir.startsWith(tmpPrefix)) {
+     throw new Error("Refusing to export markdown files into the OS temporary directory");
+   }
+   await mkdir(resolvedOutputDir, { recursive: true });
+   const files = [];
+   let skipped = 0;
+   for (const entry of entries) {
+     if (!entry.content.trim()) {
+       skipped++;
+       continue;
+     }
+     const fmData = {
+       mj_id: entry.id,
+       entry_type: entry.entryType,
+       timestamp: entry.timestamp,
+       source: "memory-journal-mcp"
+     };
+     if (entry.tags !== void 0 && entry.tags.length > 0) {
+       fmData.tags = entry.tags;
+     }
+     if (entry.significance !== void 0) {
+       fmData.significance = entry.significance;
+     }
+     if (entry.author !== void 0) {
+       fmData.author = entry.author;
+     }
+     try {
+       const relationships = db.getRelationships(entry.id);
+       if (relationships.length > 0) {
+         fmData.relationships = relationships.map((r) => ({
+           type: r.relationshipType,
+           target_id: r.fromEntryId === entry.id ? r.toEntryId : r.fromEntryId
+         }));
+       }
+     } catch {
+     }
+     const frontmatter = serializeFrontmatter(fmData);
+     const fileContent = frontmatter + "\n" + entry.content + "\n";
+     const filename = generateFilename(entry.id, entry.content);
+     const filepath = join(resolvedOutputDir, filename);
+     const handle = await open(filepath, "w", 384);
+     try {
+       await handle.writeFile(fileContent, "utf-8");
+     } finally {
+       await handle.close();
+     }
+     files.push(filename);
+   }
+   return {
+     success: true,
+     exported_count: files.length,
+     output_dir: resolvedOutputDir,
+     files,
+     skipped
+   };
+ }
+ var VALID_RELATIONSHIP_TYPES = /* @__PURE__ */ new Set([
+   "evolves_from",
+   "references",
+   "implements",
+   "clarifies",
+   "response_to",
+   "blocked_by",
+   "resolved",
+   "caused"
+ ]);
+ var VALID_ENTRY_TYPES = /* @__PURE__ */ new Set([
+   "personal_reflection",
+   "project_decision",
+   "technical_achievement",
+   "bug_fix",
+   "feature_implementation",
+   "code_review",
+   "meeting_notes",
+   "learning",
+   "research",
+   "planning",
+   "retrospective",
+   "standup",
+   "technical_note",
+   "development_note",
+   "enhancement",
+   "milestone",
+   "system_integration_test",
+   "test_entry",
+   "other"
+ ]);
+ function toEntryType(value) {
+   if (value === void 0) return void 0;
+   return VALID_ENTRY_TYPES.has(value) ? value : void 0;
+ }
+ async function importMarkdownEntries(sourceDir, db, options = {}, vectorManager) {
+   const { dry_run = false, limit = 100 } = options;
+   const allFiles = await readdir(sourceDir);
+   const mdFiles = allFiles.filter((f) => f.endsWith(".md")).slice(0, limit);
+   const result = {
+     success: true,
+     created: 0,
+     updated: 0,
+     skipped: 0,
+     errors: [],
+     dry_run
+   };
+   for (const filename of mdFiles) {
+     try {
+       const filepath = join(sourceDir, filename);
+       const content = await readFile(filepath, "utf-8");
+       const { metadata, body } = parseFrontmatter(content);
+       if (!body.trim()) {
+         result.skipped++;
+         continue;
+       }
+       if (dry_run) {
+         if (metadata.mj_id !== void 0) {
+           const existing = db.getEntryById(metadata.mj_id);
+           if (existing) {
+             result.updated++;
+           } else {
+             result.created++;
+           }
+         } else {
+           result.created++;
+         }
+         continue;
+       }
+       let entryId;
+       const entryType = toEntryType(metadata.entry_type) ?? "personal_reflection";
+       if (metadata.mj_id !== void 0) {
+         const existing = db.getEntryById(metadata.mj_id);
+         if (existing) {
+           db.updateEntry(metadata.mj_id, {
+             content: body,
+             entryType: toEntryType(metadata.entry_type),
+             tags: metadata.tags
+           });
+           entryId = metadata.mj_id;
+           result.updated++;
+         } else {
+           const newEntry = db.createEntry({
+             content: body,
+             entryType,
+             tags: metadata.tags,
+             ...metadata.significance !== void 0 && {
+               significanceType: metadata.significance
+             }
+           });
+           entryId = newEntry.id;
+           result.created++;
+         }
+       } else {
+         const newEntry = db.createEntry({
+           content: body,
+           entryType,
+           tags: metadata.tags,
+           ...metadata.significance !== void 0 && {
+             significanceType: metadata.significance
+           }
+         });
+         entryId = newEntry.id;
+         result.created++;
+       }
+       if (metadata.relationships !== void 0) {
+         for (const rel of metadata.relationships) {
+           if (!VALID_RELATIONSHIP_TYPES.has(rel.type)) continue;
+           try {
+             const targetExists = db.getEntryById(rel.target_id);
+             if (targetExists) {
+               db.linkEntries(entryId, rel.target_id, rel.type);
+             }
+           } catch {
+           }
+         }
+       }
+       if (vectorManager) {
+         void vectorManager.addEntry(entryId, body).catch(() => {
+         });
+       }
+     } catch (err) {
+       result.errors.push({
+         file: filename,
+         error: err instanceof Error ? err.message : String(err)
+       });
+     }
+   }
+   return result;
+ }

- // src/handlers/tools/export.ts
+ // src/handlers/tools/io.ts
  var ExportEntriesSchema = z.object({
    format: z.enum(["json", "markdown"]).optional().default("json"),
    start_date: z.string().regex(DATE_FORMAT_REGEX, DATE_FORMAT_MESSAGE).optional(),
@@ -1580,14 +1932,60 @@ var ExportEntriesOutputSchema = z.object({
1580
1932
  success: z.boolean().optional(),
1581
1933
  error: z.string().optional()
1582
1934
  }).extend(ErrorFieldsMixin.shape);
1583
- function getExportTools(context) {
1935
+ var ExportMarkdownSchema = z.object({
1936
+ output_dir: z.string().min(1),
1937
+ start_date: z.string().regex(DATE_FORMAT_REGEX, DATE_FORMAT_MESSAGE).optional(),
1938
+ end_date: z.string().regex(DATE_FORMAT_REGEX, DATE_FORMAT_MESSAGE).optional(),
1939
+ entry_types: z.array(z.enum(ENTRY_TYPES)).optional(),
1940
+ tags: z.array(z.string()).optional(),
1941
+ limit: z.number().max(500).optional().default(100)
1942
+ });
1943
+ var ExportMarkdownSchemaMcp = z.object({
1944
+ output_dir: z.string().describe("Target directory for .md files"),
1945
+ start_date: z.string().optional().describe("Start date filter (YYYY-MM-DD)"),
1946
+ end_date: z.string().optional().describe("End date filter (YYYY-MM-DD)"),
1947
+ entry_types: z.array(z.string()).optional().describe("Filter by entry types"),
1948
+ tags: z.array(z.string()).optional().describe("Filter by tags"),
1949
+ limit: relaxedNumber().optional().default(100).describe("Max entries to export (default: 100, max: 500)")
1950
+ });
1951
+ var ExportMarkdownOutputSchema = z.object({
1952
+ success: z.boolean().optional(),
1953
+ exported_count: z.number().optional(),
1954
+ output_dir: z.string().optional(),
1955
+ files: z.array(z.string()).optional(),
1956
+ skipped: z.number().optional()
1957
+ }).extend(ErrorFieldsMixin.shape);
1958
+ var ImportMarkdownSchema = z.object({
1959
+ source_dir: z.string().min(1),
1960
+ dry_run: z.boolean().optional().default(false),
1961
+ limit: z.number().max(500).optional().default(100)
1962
+ });
1963
+ var ImportMarkdownSchemaMcp = z.object({
1964
+ source_dir: z.string().describe("Directory containing .md files to import"),
1965
+ dry_run: z.boolean().optional().default(false).describe("Parse and validate without writing to database"),
1966
+ limit: relaxedNumber().optional().default(100).describe("Max files to process (default: 100, max: 500)")
1967
+ });
1968
+ var ImportMarkdownOutputSchema = z.object({
1969
+ success: z.boolean().optional(),
1970
+ created: z.number().optional(),
1971
+ updated: z.number().optional(),
1972
+ skipped: z.number().optional(),
1973
+ errors: z.array(
1974
+ z.object({
1975
+ file: z.string(),
1976
+ error: z.string()
1977
+ })
1978
+ ).optional(),
1979
+ dry_run: z.boolean().optional()
1980
+ }).extend(ErrorFieldsMixin.shape);
1981
+ function getIoTools(context) {
1584
1982
  const { db, progress } = context;
1585
1983
  return [
1586
1984
  {
1587
1985
  name: "export_entries",
1588
1986
  title: "Export Entries",
1589
1987
  description: "Export journal entries to JSON or Markdown format",
1590
- group: "export",
1988
+ group: "io",
1591
1989
  inputSchema: ExportEntriesSchemaMcp,
1592
1990
  outputSchema: ExportEntriesOutputSchema,
1593
1991
  annotations: { readOnlyHint: true, idempotentHint: true, openWorldHint: false },
@@ -1643,6 +2041,111 @@ ${e.content}
1643
2041
  return formatHandlerError(err);
1644
2042
  }
1645
2043
  }
2044
+ },
2045
+ // ==================================================================
2046
+ // export_markdown
2047
+ // ==================================================================
2048
+ {
2049
+ name: "export_markdown",
2050
+ title: "Export to Markdown Files",
2051
+ description: "Export journal entries as individual frontmattered Markdown files (.md). Each file contains YAML frontmatter (mj_id, entry_type, tags, relationships) and the entry content. Files can be edited externally and re-imported via import_markdown.",
2052
+ group: "io",
2053
+ inputSchema: ExportMarkdownSchemaMcp,
2054
+ outputSchema: ExportMarkdownOutputSchema,
2055
+ annotations: {
2056
+ readOnlyHint: false,
2057
+ destructiveHint: false,
2058
+ idempotentHint: true,
2059
+ openWorldHint: true
2060
+ },
2061
+ handler: async (params) => {
2062
+ try {
2063
+ const input = ExportMarkdownSchema.parse(params);
2064
+ assertSafeDirectoryPath(input.output_dir);
2065
+ await sendProgress(progress, 0, 3, "Fetching entries...");
2066
+ const limit = input.limit ?? 100;
2067
+ let entries;
2068
+ if (input.start_date || input.end_date) {
2069
+ const startDate = input.start_date ?? DATE_MIN_SENTINEL;
2070
+ const endDate = input.end_date ?? DATE_MAX_SENTINEL;
2071
+ entries = db.searchByDateRange(startDate, endDate, {
2072
+ tags: input.tags,
2073
+ limit
2074
+ });
2075
+ } else if (input.tags && input.tags.length > 0) {
2076
+ entries = db.searchByDateRange(DATE_MIN_SENTINEL, DATE_MAX_SENTINEL, {
2077
+ tags: input.tags,
2078
+ limit
2079
+ });
2080
+ } else {
2081
+ entries = db.getRecentEntries(limit);
2082
+ }
2083
+ if (input.entry_types && input.entry_types.length > 0) {
2084
+ const allowedTypes = new Set(input.entry_types);
2085
+ entries = entries.filter((e) => allowedTypes.has(e.entryType)).slice(0, limit);
2086
+ }
2087
+ await sendProgress(
2088
+ progress,
2089
+ 1,
2090
+ 3,
2091
+ `Exporting ${String(entries.length)} entries...`
2092
+ );
2093
+ const exportable = entries.map((e) => ({
2094
+ id: e.id,
2095
+ content: e.content,
2096
+ entryType: e.entryType,
2097
+ timestamp: e.timestamp,
2098
+ tags: e.tags,
2099
+ significance: e.significanceType ?? void 0
2100
+ }));
2101
+ const result = await exportEntriesToMarkdown(
2102
+ exportable,
2103
+ input.output_dir,
2104
+ db
2105
+ );
2106
+ await sendProgress(progress, 3, 3, "Export complete");
2107
+ return result;
2108
+ } catch (err) {
2109
+ return formatHandlerError(err);
2110
+ }
2111
+ }
2112
+ },
2113
+ // ==================================================================
2114
+ // import_markdown
2115
+ // ==================================================================
2116
+ {
2117
+ name: "import_markdown",
2118
+ title: "Import from Markdown Files",
2119
+ description: "Import frontmattered Markdown files (.md) into the journal. If a file has an mj_id in its frontmatter and the entry exists, it is updated. Otherwise, a new entry is created. Tags and relationships are reconciled automatically. Use dry_run: true to preview what would happen without writing.",
2120
+ group: "io",
2121
+ inputSchema: ImportMarkdownSchemaMcp,
2122
+ outputSchema: ImportMarkdownOutputSchema,
2123
+ annotations: {
2124
+ readOnlyHint: false,
2125
+ destructiveHint: false,
2126
+ idempotentHint: false,
2127
+ openWorldHint: true
2128
+ },
2129
+ handler: async (params) => {
2130
+ try {
2131
+ const input = ImportMarkdownSchema.parse(params);
2132
+ assertSafeDirectoryPath(input.source_dir);
2133
+ await sendProgress(progress, 0, 2, "Reading markdown files...");
2134
+ const result = await importMarkdownEntries(
2135
+ input.source_dir,
2136
+ db,
2137
+ {
2138
+ dry_run: input.dry_run,
2139
+ limit: input.limit
2140
+ },
2141
+ context.vectorManager
2142
+ );
2143
+ await sendProgress(progress, 2, 2, "Import complete");
2144
+ return result;
2145
+ } catch (err) {
2146
+ return formatHandlerError(err);
2147
+ }
2148
+ }
1646
2149
  }
1647
2150
  ];
1648
2151
  }
@@ -5015,6 +5518,179 @@ function getTeamExportTools(context) {
5015
5518
  }
5016
5519
  ];
5017
5520
  }
5521
+ var TeamExportMarkdownSchema = z.object({
5522
+ output_dir: z.string().min(1),
5523
+ start_date: z.string().regex(DATE_FORMAT_REGEX, DATE_FORMAT_MESSAGE).optional(),
5524
+ end_date: z.string().regex(DATE_FORMAT_REGEX, DATE_FORMAT_MESSAGE).optional(),
5525
+ tags: z.array(z.string()).optional(),
5526
+ limit: z.number().max(500).optional().default(100)
5527
+ });
5528
+ var TeamExportMarkdownSchemaMcp = z.object({
5529
+ output_dir: z.string().describe("Target directory for .md files"),
5530
+ start_date: z.string().optional().describe("Start date filter (YYYY-MM-DD)"),
5531
+ end_date: z.string().optional().describe("End date filter (YYYY-MM-DD)"),
5532
+ tags: z.array(z.string()).optional().describe("Filter by tags"),
5533
+ limit: relaxedNumber().optional().default(100).describe("Max entries to export (default: 100, max: 500)")
5534
+ });
5535
+ var TeamExportMarkdownOutputSchema = z.object({
5536
+ success: z.boolean().optional(),
5537
+ exported_count: z.number().optional(),
5538
+ output_dir: z.string().optional(),
5539
+ files: z.array(z.string()).optional(),
5540
+ skipped: z.number().optional()
5541
+ }).extend(ErrorFieldsMixin.shape);
5542
+ var TeamImportMarkdownSchema = z.object({
5543
+ source_dir: z.string().min(1),
5544
+ dry_run: z.boolean().optional().default(false),
5545
+ limit: z.number().max(500).optional().default(100)
5546
+ });
5547
+ var TeamImportMarkdownSchemaMcp = z.object({
5548
+ source_dir: z.string().describe("Directory containing .md files to import"),
5549
+ dry_run: z.boolean().optional().default(false).describe("Parse and validate without writing to database"),
5550
+ limit: relaxedNumber().optional().default(100).describe("Max files to process (default: 100, max: 500)")
5551
+ });
5552
+ var TeamImportMarkdownOutputSchema = z.object({
5553
+ success: z.boolean().optional(),
5554
+ created: z.number().optional(),
5555
+ updated: z.number().optional(),
5556
+ skipped: z.number().optional(),
5557
+ errors: z.array(z.object({ file: z.string(), error: z.string() })).optional(),
5558
+ dry_run: z.boolean().optional()
5559
+ }).extend(ErrorFieldsMixin.shape);
5560
+ function getTeamIoTools(context) {
5561
+ const { teamDb, progress } = context;
5562
+ if (!teamDb) {
5563
+ return [
5564
+ {
5565
+ name: "team_export_markdown",
5566
+ title: "Team Export to Markdown",
5567
+ description: "Export team entries as frontmattered Markdown files",
5568
+ group: "team",
5569
+ inputSchema: TeamExportMarkdownSchemaMcp,
5570
+ outputSchema: TeamExportMarkdownOutputSchema,
5571
+ annotations: { readOnlyHint: true, idempotentHint: true, openWorldHint: true },
5572
+ handler: () => ({
5573
+ success: false,
5574
+ error: "Team collaboration is not configured. Set TEAM_DB_PATH to enable."
5575
+ })
5576
+ },
5577
+ {
5578
+ name: "team_import_markdown",
5579
+ title: "Team Import from Markdown",
5580
+ description: "Import frontmattered Markdown files into the team journal",
5581
+ group: "team",
5582
+ inputSchema: TeamImportMarkdownSchemaMcp,
5583
+ outputSchema: TeamImportMarkdownOutputSchema,
5584
+ annotations: {
5585
+ readOnlyHint: false,
5586
+ idempotentHint: false,
5587
+ openWorldHint: true
5588
+ },
5589
+ handler: () => ({
5590
+ success: false,
5591
+ error: "Team collaboration is not configured. Set TEAM_DB_PATH to enable."
5592
+ })
5593
+ }
5594
+ ];
5595
+ }
5596
+ return [
5597
+ {
5598
+ name: "team_export_markdown",
5599
+ title: "Team Export to Markdown",
5600
+ description: "Export team journal entries as individual frontmattered Markdown files (.md). Includes author field in frontmatter. Files can be re-imported via team_import_markdown.",
5601
+ group: "team",
5602
+ inputSchema: TeamExportMarkdownSchemaMcp,
5603
+ outputSchema: TeamExportMarkdownOutputSchema,
5604
+ annotations: {
5605
+ readOnlyHint: false,
5606
+ destructiveHint: false,
5607
+ idempotentHint: true,
5608
+ openWorldHint: true
5609
+ },
5610
+ handler: async (params) => {
5611
+ try {
5612
+ const input = TeamExportMarkdownSchema.parse(params);
5613
+ assertSafeDirectoryPath(input.output_dir);
5614
+ await sendProgress(progress, 0, 3, "Fetching team entries...");
5615
+ const limit = input.limit ?? 100;
5616
+ let entries;
5617
+ if (input.start_date || input.end_date) {
5618
+ const startDate = input.start_date ?? DATE_MIN_SENTINEL;
5619
+ const endDate = input.end_date ?? DATE_MAX_SENTINEL;
5620
+ entries = teamDb.searchByDateRange(startDate, endDate, {
5621
+ tags: input.tags,
5622
+ limit
5623
+ });
5624
+ } else {
5625
+ entries = teamDb.getRecentEntries(limit);
5626
+ }
5627
+ await sendProgress(
5628
+ progress,
5629
+ 1,
5630
+ 3,
5631
+ `Exporting ${String(entries.length)} team entries...`
5632
+ );
5633
+ const entryIds = entries.map((e) => e.id);
5634
+ const authorMap = batchFetchAuthors(teamDb, entryIds);
5635
+ const exportable = entries.map((e) => ({
5636
+ id: e.id,
5637
+ content: e.content,
5638
+ entryType: e.entryType,
5639
+ timestamp: e.timestamp,
5640
+ tags: e.tags,
5641
+ significance: e.significanceType ?? void 0,
5642
+ author: authorMap.get(e.id) ?? void 0
5643
+ }));
5644
+ const result = await exportEntriesToMarkdown(
5645
+ exportable,
5646
+ input.output_dir,
5647
+ teamDb
5648
+ );
5649
+ await sendProgress(progress, 3, 3, "Team export complete");
5650
+ return result;
5651
+ } catch (err) {
5652
+ return formatHandlerError(err);
5653
+ }
5654
+ }
5655
+ },
5656
+ {
5657
+ name: "team_import_markdown",
5658
+ title: "Team Import from Markdown",
5659
+ description: "Import frontmattered Markdown files (.md) into the team journal. Author is set from TEAM_AUTHOR env or git config. Use dry_run: true to preview without writing.",
5660
+ group: "team",
5661
+ inputSchema: TeamImportMarkdownSchemaMcp,
5662
+ outputSchema: TeamImportMarkdownOutputSchema,
5663
+ annotations: {
5664
+ readOnlyHint: false,
5665
+ destructiveHint: false,
5666
+ idempotentHint: false,
5667
+ openWorldHint: true
5668
+ },
5669
+ handler: async (params) => {
5670
+ try {
5671
+ const input = TeamImportMarkdownSchema.parse(params);
5672
+ assertSafeDirectoryPath(input.source_dir);
5673
+ await sendProgress(progress, 0, 2, "Reading markdown files...");
5674
+ const author = resolveAuthor();
5675
+ const result = await importMarkdownEntries(
5676
+ input.source_dir,
5677
+ teamDb,
5678
+ {
5679
+ dry_run: input.dry_run,
5680
+ limit: input.limit,
5681
+ author
5682
+ },
5683
+ context.vectorManager
5684
+ );
5685
+ await sendProgress(progress, 2, 2, "Team import complete");
5686
+ return result;
5687
+ } catch (err) {
5688
+ return formatHandlerError(err);
5689
+ }
5690
+ }
5691
+ }
5692
+ ];
5693
+ }
5018
5694
  function getTeamBackupTools(context) {
5019
5695
  const { teamDb } = context;
5020
5696
  return [
@@ -5295,6 +5971,7 @@ function getTeamTools(context) {
5295
5971
  ...getTeamAnalyticsTools(context),
5296
5972
  ...getTeamRelationshipTools(context),
5297
5973
  ...getTeamExportTools(context),
5974
+ ...getTeamIoTools(context),
5298
5975
  ...getTeamBackupTools(context),
5299
5976
  ...getTeamVectorTools(context)
5300
5977
  ];
@@ -5324,8 +6001,10 @@ var METHOD_ALIASES = {
5324
6001
  link: "linkEntries",
5325
6002
  graph: "visualizeRelationships"
5326
6003
  },
5327
- export: {
5328
- dump: "exportEntries"
6004
+ io: {
6005
+ dump: "exportEntries",
6006
+ md: "exportMarkdown",
6007
+ importMd: "importMarkdown"
5329
6008
  },
5330
6009
  admin: {
5331
6010
  edit: "updateEntry",
@@ -5400,9 +6079,10 @@ var GROUP_EXAMPLES = {
5400
6079
  'mj.relationships.linkEntries({ from_entry_id: 1, to_entry_id: 2, relationship_type: "implements" })',
5401
6080
  "mj.relationships.visualizeRelationships({ entry_id: 1 })"
5402
6081
  ],
5403
- export: [
5404
- 'mj.export.exportEntries({ format: "json" })',
5405
- 'mj.export.exportEntries({ format: "markdown", limit: 50 })'
6082
+ io: [
6083
+ 'mj.io.exportEntries({ format: "json" })',
6084
+ 'mj.io.exportMarkdown({ output_dir: "./export" })',
6085
+ 'mj.io.importMarkdown({ source_dir: "./import" })'
5406
6086
  ],
5407
6087
  admin: [
5408
6088
  'mj.admin.updateEntry({ entry_id: 1, content: "Updated content" })',
@@ -5493,7 +6173,7 @@ var GROUP_PREFIX_MAP = {
5493
6173
  search: "",
5494
6174
  analytics: "",
5495
6175
  relationships: "",
5496
- export: "",
6176
+ io: "",
5497
6177
  admin: "",
5498
6178
  github: "",
5499
6179
  backup: "",
@@ -5596,7 +6276,7 @@ var JournalApi = class {
5596
6276
  search;
5597
6277
  analytics;
5598
6278
  relationships;
5599
- export;
6279
+ io;
5600
6280
  admin;
5601
6281
  github;
5602
6282
  backup;
@@ -5617,7 +6297,7 @@ var JournalApi = class {
5617
6297
  "relationships",
5618
6298
  this.toolsByGroup.get("relationships") ?? []
5619
6299
  );
5620
- this.export = createGroupApi("export", this.toolsByGroup.get("export") ?? []);
6300
+ this.io = createGroupApi("io", this.toolsByGroup.get("io") ?? []);
5621
6301
  this.admin = createGroupApi("admin", this.toolsByGroup.get("admin") ?? []);
5622
6302
  this.github = createGroupApi("github", this.toolsByGroup.get("github") ?? []);
5623
6303
  this.backup = createGroupApi("backup", this.toolsByGroup.get("backup") ?? []);
@@ -5649,7 +6329,9 @@ var JournalApi = class {
5649
6329
  search: this.search,
5650
6330
  analytics: this.analytics,
5651
6331
  relationships: this.relationships,
5652
- export: this.export,
6332
+ io: this.io,
6333
+ // Backward-compat alias — agents using mj.export.* still work
6334
+ export: this.io,
5653
6335
  admin: this.admin,
5654
6336
  github: this.github,
5655
6337
  backup: this.backup,
@@ -5999,7 +6681,7 @@ var WorkerSandbox = class {
5999
6681
  const effectiveTimeout = timeoutMs ?? this.options.timeoutMs;
6000
6682
  const startTime = performance.now();
6001
6683
  const startRss = process.memoryUsage.rss();
6002
- return new Promise((resolve) => {
6684
+ return new Promise((resolve2) => {
6003
6685
  const methodList = {};
6004
6686
  const topLevel = [];
6005
6687
  for (const [key, value] of Object.entries(apiBindings)) {
@@ -6055,7 +6737,7 @@ var WorkerSandbox = class {
6055
6737
  cpuTimeMs: result.metrics.cpuTimeMs,
6056
6738
  memoryUsedMb: Math.round((endRss - startRss) / 1024 / 1024)
6057
6739
  };
6058
- resolve(result);
6740
+ resolve2(result);
6059
6741
  });
6060
6742
  worker.on("error", (err) => {
6061
6743
  clearTimeout(timeoutHandle);
@@ -6064,7 +6746,7 @@ var WorkerSandbox = class {
6064
6746
  const endRss = process.memoryUsage.rss();
6065
6747
  const errorMessage = err.message;
6066
6748
  const errorStack = err.stack;
6067
- resolve({
6749
+ resolve2({
6068
6750
  success: false,
6069
6751
  error: errorMessage,
6070
6752
  stack: errorStack,
@@ -6081,7 +6763,7 @@ var WorkerSandbox = class {
6081
6763
  if (exitCode !== 0) {
6082
6764
  const endTime = performance.now();
6083
6765
  const endRss = process.memoryUsage.rss();
6084
- resolve({
6766
+ resolve2({
6085
6767
  success: false,
6086
6768
  error: `Worker exited with code ${String(exitCode)} (likely timeout or OOM)`,
6087
6769
  metrics: {
@@ -6207,7 +6889,7 @@ function collectNonCodeModeTools(context) {
6207
6889
  ...getSearchTools(context),
6208
6890
  ...getAnalyticsTools(context),
6209
6891
  ...getRelationshipTools(context),
6210
- ...getExportTools(context),
6892
+ ...getIoTools(context),
6211
6893
  ...getAdminTools(context),
6212
6894
  ...getGitHubTools(context),
6213
6895
  ...getBackupTools(context),
@@ -6221,7 +6903,7 @@ function getCodeModeTools(context) {
6221
6903
  {
6222
6904
  name: "mj_execute_code",
6223
6905
  title: "Execute Code (Code Mode)",
6224
- description: "Execute JavaScript in a sandboxed environment with access to all journal tools via the `mj.*` API. Enables multi-step workflows in a single call, reducing token usage by 70-90%. API groups: mj.core.*, mj.search.*, mj.analytics.*, mj.relationships.*, mj.export.*, mj.admin.*, mj.github.*, mj.backup.*, mj.team.*. Use mj.help() for method listing. Returns the last expression value.",
6906
+ description: "Execute JavaScript in a sandboxed environment with access to all journal tools via the `mj.*` API. Enables multi-step workflows in a single call, reducing token usage by 70-90%. API groups: mj.core.*, mj.search.*, mj.analytics.*, mj.relationships.*, mj.io.*, mj.admin.*, mj.github.*, mj.backup.*, mj.team.*. Use mj.help() for method listing. Returns the last expression value.",
6225
6907
  group: "codemode",
6226
6908
  inputSchema: ExecuteCodeSchemaMcp,
6227
6909
  annotations: {
@@ -6645,7 +7327,7 @@ var TOOL_GROUPS = {
6645
7327
  search: ["search_entries", "search_by_date_range", "semantic_search", "get_vector_index_stats"],
6646
7328
  analytics: ["get_statistics", "get_cross_project_insights"],
6647
7329
  relationships: ["link_entries", "visualize_relationships"],
6648
- export: ["export_entries"],
7330
+ io: ["export_entries", "export_markdown", "import_markdown"],
6649
7331
  admin: [
6650
7332
  "update_entry",
6651
7333
  "delete_entry",
@@ -6686,6 +7368,8 @@ var TOOL_GROUPS = {
6686
7368
  "team_link_entries",
6687
7369
  "team_visualize_relationships",
6688
7370
  "team_export_entries",
7371
+ "team_export_markdown",
7372
+ "team_import_markdown",
6689
7373
  "team_backup",
6690
7374
  "team_list_backups",
6691
7375
  "team_semantic_search",
@@ -6704,14 +7388,14 @@ var META_GROUPS = {
6704
7388
  "search",
6705
7389
  "analytics",
6706
7390
  "relationships",
6707
- "export",
7391
+ "io",
6708
7392
  "admin",
6709
7393
  "github",
6710
7394
  "backup",
6711
7395
  "team",
6712
7396
  "codemode"
6713
7397
  ],
6714
- readonly: ["core", "search", "analytics", "relationships", "export"]
7398
+ readonly: ["core", "search", "analytics", "relationships", "io"]
6715
7399
  };
6716
7400
  function getAllToolNames() {
6717
7401
  const allTools = [];
@@ -6737,8 +7421,12 @@ function getEnabledGroups(enabledTools) {
6737
7421
  }
6738
7422
  return groups;
6739
7423
  }
7424
+ var GROUP_ALIASES = { export: "io" };
7425
+ function resolveGroupAlias(name) {
7426
+ return GROUP_ALIASES[name] ?? name;
7427
+ }
6740
7428
  function isGroup(name) {
6741
- return name in TOOL_GROUPS;
7429
+ return resolveGroupAlias(name) in TOOL_GROUPS;
6742
7430
  }
6743
7431
  function isMetaGroup(name) {
6744
7432
  return name in META_GROUPS;
@@ -6761,7 +7449,8 @@ function parseToolFilter(filterString) {
6761
7449
  enabledTools = /* @__PURE__ */ new Set([...enabledTools, ...TOOL_GROUPS[group]]);
6762
7450
  }
6763
7451
  } else if (isGroup(name)) {
6764
- enabledTools = /* @__PURE__ */ new Set([...enabledTools, ...TOOL_GROUPS[name]]);
7452
+ const resolved = resolveGroupAlias(name);
7453
+ enabledTools = /* @__PURE__ */ new Set([...enabledTools, ...TOOL_GROUPS[resolved]]);
6765
7454
  } else {
6766
7455
  enabledTools.add(name);
6767
7456
  }
@@ -6772,7 +7461,8 @@ function parseToolFilter(filterString) {
6772
7461
  });
6773
7462
  } else if (isRemove) {
6774
7463
  if (isGroup(name)) {
6775
- for (const tool of TOOL_GROUPS[name]) {
7464
+ const resolved = resolveGroupAlias(name);
7465
+ for (const tool of TOOL_GROUPS[resolved]) {
6776
7466
  enabledTools.delete(tool);
6777
7467
  }
6778
7468
  } else {
@@ -6789,7 +7479,8 @@ function parseToolFilter(filterString) {
6789
7479
  enabledTools = /* @__PURE__ */ new Set([...enabledTools, ...TOOL_GROUPS[group]]);
6790
7480
  }
6791
7481
  } else if (isGroup(name)) {
6792
- enabledTools = /* @__PURE__ */ new Set([...enabledTools, ...TOOL_GROUPS[name]]);
7482
+ const resolved = resolveGroupAlias(name);
7483
+ enabledTools = /* @__PURE__ */ new Set([...enabledTools, ...TOOL_GROUPS[resolved]]);
6793
7484
  } else {
6794
7485
  enabledTools.add(name);
6795
7486
  }
@@ -6805,7 +7496,8 @@ function parseToolFilter(filterString) {
6805
7496
  for (const rule of rules) {
6806
7497
  if (rule.type === "exclude") {
6807
7498
  if (isGroup(rule.target)) {
6808
- for (const tool of TOOL_GROUPS[rule.target]) {
7499
+ const resolved = resolveGroupAlias(rule.target);
7500
+ for (const tool of TOOL_GROUPS[resolved]) {
6809
7501
  enabledTools.delete(tool);
6810
7502
  }
6811
7503
  } else {
@@ -6861,7 +7553,7 @@ var TOOL_GROUP_SCOPES = {
6861
7553
  search: SCOPES.READ,
6862
7554
  analytics: SCOPES.READ,
6863
7555
  relationships: SCOPES.READ,
6864
- export: SCOPES.READ,
7556
+ io: SCOPES.READ,
6865
7557
  admin: SCOPES.ADMIN,
6866
7558
  github: SCOPES.WRITE,
6867
7559
  backup: SCOPES.ADMIN,
@@ -6914,6 +7606,8 @@ for (const [group, tools] of Object.entries(TOOL_GROUPS)) {
6914
7606
  }
6915
7607
  }
6916
7608
  }
7609
+ toolScopeMap.set("import_markdown", SCOPES.WRITE);
7610
+ toolScopeMap.set("team_import_markdown", SCOPES.WRITE);
6917
7611
  function getRequiredScope(toolName) {
6918
7612
  return toolScopeMap.get(toolName) ?? SCOPES.READ;
6919
7613
  }
@@ -7147,10 +7841,10 @@ function getToolIcon(group) {
7147
7841
  title: "Relationships",
7148
7842
  description: "Entry relationship management"
7149
7843
  },
7150
- export: {
7151
- iconUrl: "https://cdn.jsdelivr.net/npm/@mdi/svg@7.4.47/svg/export.svg",
7152
- title: "Export",
7153
- description: "Data export operations"
7844
+ io: {
7845
+ iconUrl: "https://cdn.jsdelivr.net/npm/@mdi/svg@7.4.47/svg/swap-horizontal.svg",
7846
+ title: "IO",
7847
+ description: "Import/export operations"
7154
7848
  },
7155
7849
  admin: {
7156
7850
  iconUrl: "https://cdn.jsdelivr.net/npm/@mdi/svg@7.4.47/svg/cog.svg",
@@ -7264,7 +7958,7 @@ function getAllToolDefinitions(context) {
7264
7958
  ...getSearchTools(context),
7265
7959
  ...getAnalyticsTools(context),
7266
7960
  ...getRelationshipTools(context),
7267
- ...getExportTools(context),
7961
+ ...getIoTools(context),
7268
7962
  ...getAdminTools(context),
7269
7963
  ...getGitHubTools(context),
7270
7964
  ...getBackupTools(context),
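The new `export_markdown` / `import_markdown` pair above round-trips entries through individual `.md` files with YAML frontmatter (`mj_id`, `entry_type`, `tags`), where a present `mj_id` switches import from create to update. A minimal standalone sketch of that file format, using a hand-rolled flat-YAML subset rather than the package's actual parser:

```javascript
// Sketch of the frontmattered-Markdown round trip behind
// export_markdown / import_markdown. Hand-rolled flat-YAML subset --
// NOT the package's actual serializer or parser.
function serializeEntry(entry) {
  return [
    "---",
    `mj_id: ${entry.id}`,
    `entry_type: ${entry.entryType}`,
    `tags: [${(entry.tags ?? []).join(", ")}]`,
    "---",
    "",
    entry.content,
  ].join("\n");
}

function parseEntry(markdown) {
  // Frontmatter block is everything between the opening and closing "---".
  const match = /^---\n([\s\S]*?)\n---\n\n?([\s\S]*)$/.exec(markdown);
  if (!match) return { metadata: {}, body: markdown }; // no frontmatter: whole file is content
  const metadata = {};
  for (const line of match[1].split("\n")) {
    const [key, ...rest] = line.split(":");
    const value = rest.join(":").trim();
    if (key === "mj_id") metadata.mj_id = Number(value);
    else if (key === "entry_type") metadata.entry_type = value;
    else if (key === "tags")
      metadata.tags = value
        .replace(/^\[|\]$/g, "")
        .split(",")
        .map((s) => s.trim())
        .filter(Boolean);
  }
  return { metadata, body: match[2] };
}
```

The key behavior the sketch preserves: `mj_id` survives the round trip, which is what lets a re-import update an existing entry instead of creating a duplicate.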
package/dist/cli.js CHANGED
@@ -1,7 +1,7 @@
1
- import { VERSION, createServer } from './chunk-6J4RPJ4I.js';
2
- import { DEFAULT_AUDIT_LOG_MAX_SIZE_BYTES } from './chunk-2BJHLTYP.js';
1
+ import { VERSION, createServer } from './chunk-GW5DYUQJ.js';
2
+ import { DEFAULT_AUDIT_LOG_MAX_SIZE_BYTES } from './chunk-JEGRDY6W.js';
3
3
  import './chunk-OKOVZ5QE.js';
4
- import { logger } from './chunk-ARLH46WS.js';
4
+ import { logger } from './chunk-37BQOJDZ.js';
5
5
  import { Command } from 'commander';
6
6
  import * as fs from 'fs';
7
7
 
@@ -0,0 +1 @@
1
+ export { GitHubIntegration } from './chunk-37BQOJDZ.js';
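The `JournalApi` change in the main bundle keeps `mj.export.*` working by binding the renamed `io` group object under both names. A standalone sketch of that aliasing pattern (hypothetical helper names, not the package's code):

```javascript
// Sketch: expose one group API object under its new name plus a
// backward-compat alias, so both call paths hit the same functions.
function createGroupApi(methods) {
  return Object.freeze({ ...methods });
}

function buildApi() {
  const io = createGroupApi({
    exportEntries: (opts) => ({ called: "exportEntries", opts }),
    exportMarkdown: (opts) => ({ called: "exportMarkdown", opts }),
  });
  return {
    io,
    // Backward-compat alias: agents still calling mj.export.* keep working.
    export: io,
  };
}
```

Because both properties reference the same frozen object, there is no drift risk between the legacy and new names.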
package/dist/index.d.ts CHANGED
@@ -4,7 +4,7 @@
4
4
  /**
5
5
  * Tool group identifiers for Memory Journal
6
6
  */
7
- type ToolGroup = 'core' | 'search' | 'analytics' | 'relationships' | 'export' | 'admin' | 'github' | 'backup' | 'team' | 'codemode';
7
+ type ToolGroup = 'core' | 'search' | 'analytics' | 'relationships' | 'io' | 'admin' | 'github' | 'backup' | 'team' | 'codemode';
8
8
  /**
9
9
  * Meta-group identifiers for common multi-group selections
10
10
  */
@@ -510,7 +510,7 @@ type SandboxMode = 'vm' | 'worker';
510
510
  /**
511
511
  * Tool group definitions mapping group names to tool names
512
512
  *
513
- * All 61 tools are categorized here for filtering support.
513
+ * All 65 tools are categorized here for filtering support.
514
514
  */
515
515
  declare const TOOL_GROUPS: Record<ToolGroup, string[]>;
516
516
  /**
package/dist/index.js CHANGED
@@ -1,7 +1,7 @@
1
- export { VERSION, createServer } from './chunk-6J4RPJ4I.js';
2
- export { META_GROUPS, TOOL_GROUPS, calculateTokenSavings, filterTools, getAllToolNames, getFilterSummary, getToolFilterFromEnv, getToolGroup, isToolEnabled, parseToolFilter } from './chunk-2BJHLTYP.js';
1
+ export { VERSION, createServer } from './chunk-GW5DYUQJ.js';
2
+ export { META_GROUPS, TOOL_GROUPS, calculateTokenSavings, filterTools, getAllToolNames, getFilterSummary, getToolFilterFromEnv, getToolGroup, isToolEnabled, parseToolFilter } from './chunk-JEGRDY6W.js';
3
3
  import './chunk-OKOVZ5QE.js';
4
- export { logger } from './chunk-ARLH46WS.js';
4
+ export { logger } from './chunk-37BQOJDZ.js';
5
5
 
6
6
  // src/types/index.ts
7
7
  var DEFAULT_CONFIG = {
@@ -1,3 +1,3 @@
1
- export { callTool, getGlobalAuditLogger, getTools, initializeAuditLogger } from './chunk-2BJHLTYP.js';
1
+ export { callTool, getGlobalAuditLogger, getTools, initializeAuditLogger } from './chunk-JEGRDY6W.js';
2
2
  import './chunk-OKOVZ5QE.js';
3
- import './chunk-ARLH46WS.js';
3
+ import './chunk-37BQOJDZ.js';
package/package.json CHANGED
@@ -1,6 +1,6 @@
1
1
  {
2
2
  "name": "memory-journal-mcp",
3
- "version": "7.0.1",
3
+ "version": "7.1.0",
4
4
  "description": "Project context management for AI-assisted development - Persistent knowledge graphs and intelligent context recall across fragmented AI threads",
5
5
  "type": "module",
6
6
  "main": "dist/index.js",
@@ -82,7 +82,7 @@
82
82
  "globals": "^17.4.0",
83
83
  "tsup": "^8.5.1",
84
84
  "typescript": "^6.0.2",
85
- "typescript-eslint": "^8.57.0",
85
+ "typescript-eslint": "^8.58.1",
86
86
  "vitest": "^4.1.3"
87
87
  },
88
88
  "overrides": {
@@ -1 +0,0 @@
1
- export { GitHubIntegration } from './chunk-ARLH46WS.js';
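The tool-filter changes in this release route the legacy `export` group name through `GROUP_ALIASES` to `io` before any `TOOL_GROUPS` lookup, so existing filter strings keep selecting the right tools. A minimal standalone sketch of that resolution step, over a trimmed-down group table (assumed shape, not the full `TOOL_GROUPS`):

```javascript
// Sketch of group-alias resolution in tool filtering: legacy group
// names map to their current names before the TOOL_GROUPS lookup.
const TOOL_GROUPS = {
  io: ["export_entries", "export_markdown", "import_markdown"],
  core: ["create_entry", "get_entry"],
};

const GROUP_ALIASES = { export: "io" };

function resolveGroupAlias(name) {
  return GROUP_ALIASES[name] ?? name;
}

function expandFilter(names) {
  const enabled = new Set();
  for (const name of names) {
    const resolved = resolveGroupAlias(name);
    if (resolved in TOOL_GROUPS) {
      for (const tool of TOOL_GROUPS[resolved]) enabled.add(tool);
    } else {
      enabled.add(name); // unknown names are treated as individual tool names
    }
  }
  return enabled;
}
```

With this in place, a filter naming the old `export` group still enables the new `export_markdown` and `import_markdown` tools alongside `export_entries`.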