mindlore 0.1.0 → 0.2.0

This diff shows the published contents of mindlore 0.1.0 and 0.2.0 as they appear in their respective public registries, and is provided for informational purposes only.
package/README.md CHANGED
@@ -44,7 +44,7 @@ To add your first source:
 | Skill | Version | Description |
 |-------|---------|-------------|
 | `/mindlore-ingest` | v0.1 | Add knowledge sources (URL, text, file, PDF) |
- | `/mindlore-health` | v0.1 | 16-point structural health check |
+ | `/mindlore-health` | v0.1 | 18-point structural health check |
 | `/mindlore-query` | v0.2 | Search and retrieve knowledge (4 modes) |
 | `/mindlore-log` | v0.2 | Session logging with reflect and status |
 | `/mindlore-decide` | v0.2 | Decision records with supersedes chain |
@@ -147,17 +147,19 @@ SESSION START DURING SESSION SESSIO

 ## Hooks

- 7 Claude Code lifecycle hooks (v0.1):
+ 9 Claude Code lifecycle hooks (v0.2):

 | Event | Hook | What it does |
 |-------|------|-------------|
 | SessionStart | session-focus | Injects last delta + INDEX |
- | UserPromptSubmit | search | FTS5 search, top 3 results |
+ | UserPromptSubmit | search | FTS5 search, top 3 results + tags |
+ | UserPromptSubmit | decision-detector | TR+EN decision signal detection |
 | FileChanged | index | Sync changed files to FTS5 |
 | FileChanged | fts5-sync | Incremental batch re-index |
- | SessionEnd | session-end | Write delta to diary/ |
+ | SessionEnd | session-end | Structured delta (commits, files, reads) |
 | PreCompact | pre-compact | FTS5 flush before compaction |
 | PostCompact | post-compact | Re-inject context |
+ | PreToolUse (Read) | read-guard | Repeated-read warning |

 ## Uninstall

package/SCHEMA.md CHANGED
@@ -143,14 +143,70 @@ Discover unexpected connections between sources. Cross-reference analysis.
 - Max results: 3 per query (BM25 ranking)
 - Hook injects: file path + first 2 headings

+ ### FTS5 Columns (9-col schema, v0.2)
+
+ | Column | Indexed | Source |
+ |--------|---------|--------|
+ | `path` | UNINDEXED | File system path |
+ | `slug` | Yes | Frontmatter slug |
+ | `description` | Yes | Frontmatter description |
+ | `type` | UNINDEXED | Frontmatter type |
+ | `category` | Yes | Parent directory name |
+ | `title` | Yes | Frontmatter title or first heading |
+ | `content` | Yes | Markdown body (sans frontmatter) |
+ | `tags` | Yes | Frontmatter tags (comma-separated) |
+ | `quality` | UNINDEXED | Frontmatter quality (NULL until 50+ sources) |
+
 ### Search Flow (UserPromptSubmit hook)

 1. Extract keywords from user prompt
 2. Query FTS5 with BM25 ranking
- 3. Return max 3 results as stderr additionalContext
+ 3. Return max 3 results as stdout additionalContext
 4. Agent reads full file only if needed (progressive disclosure)

- ## 6. Compounding
+ ## 6. Wiki vs Diary (Writeback Target Rules)
+
+ Knowledge goes to one of two layers. The agent MUST pick the correct one.
+
+ ### Wiki Layer (permanent knowledge)
+
+ Directories: `sources/`, `domains/`, `analyses/`, `insights/`, `connections/`, `learnings/`
+
+ - Persists across sessions — reference value
+ - Indexed by FTS5, discoverable via search hook
+ - Updated by ingest, query writeback, reflect, evolve
+ - Content should be factual, sourced, and reusable
+
+ ### Diary Layer (session-scoped logs)
+
+ Directories: `diary/`, `decisions/`
+
+ - Session-specific: deltas, logs, decision snapshots
+ - diary/ entries get `archived: true` after reflect processes them
+ - decisions/ are permanent but session-originated (context + rationale)
+ - Patterns extracted from diary → moved to `learnings/` (wiki layer)
+
+ ### Selection Rule
+
+ | Content Type | Target | Example |
+ |-------------|--------|---------|
+ | Ingested source summary | `sources/` | URL or text summary |
+ | Topic wiki page | `domains/` | Consolidated knowledge on a subject |
+ | Multi-source synthesis | `analyses/` | Comparison table, architecture decision |
+ | Short Q&A answer | `insights/` | Query writeback (<200 lines) |
+ | Cross-reference finding | `connections/` | Link between 2+ unrelated sources |
+ | Persistent rule/lesson | `learnings/` | YAPMA/BEST PRACTICE from reflect |
+ | Session log/delta | `diary/` | What happened this session |
+ | Decision record | `decisions/` | Why X was chosen over Y |
+ | Raw capture | `raw/` | Immutable original (URL dump, paste) |
+
+ ### Anti-patterns
+
+ - Do NOT write session-specific notes to `insights/` — use `diary/`
+ - Do NOT write permanent rules to `diary/` — use `learnings/`
+ - Do NOT write decision rationale to `analyses/` — use `decisions/`
+
+ ## 7. Compounding

 Knowledge compounds when outputs become inputs:

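The Selection Rule table above maps each content type to a writeback directory. As a minimal illustrative sketch (this helper and its name `writebackTarget` are hypothetical, not part of the mindlore package), the routing reduces to a lookup:

```javascript
// Hypothetical sketch of SCHEMA.md's Selection Rule table as a lookup.
// Keys are content types (lowercased), values are target directories.
const WRITEBACK_TARGETS = {
  'ingested source summary': 'sources/',
  'topic wiki page': 'domains/',
  'multi-source synthesis': 'analyses/',
  'short q&a answer': 'insights/',
  'cross-reference finding': 'connections/',
  'persistent rule/lesson': 'learnings/',
  'session log/delta': 'diary/',
  'decision record': 'decisions/',
  'raw capture': 'raw/',
};

// Returns the target directory, or null for an unrecognized content type.
function writebackTarget(contentType) {
  return WRITEBACK_TARGETS[contentType.toLowerCase()] || null;
}
```

The anti-pattern rules then follow directly: any content type not in the map has no wiki/diary home and should prompt the agent to reconsider.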
@@ -173,7 +229,7 @@ Offer to save when:
 - Large synthesis (200+ lines, 3+ sources) → analyses/
 - Cross-cutting link → connections/

- ## 7. Learnings
+ ## 8. Learnings

 Persistent rules extracted from reflect operations.
 Organized by topic: `git.md`, `testing.md`, `security.md`, etc.
@@ -201,7 +257,7 @@ tags: [testing, jest, mock]
 - Use `YAPMA:` / `BEST PRACTICE:` / `KRITIK:` prefixes
 - Reflect skill proposes, user approves before writing

- ## 8. Naming Conventions
+ ## 9. Naming Conventions

 ### Files

@@ -86,14 +86,22 @@ function extractFtsMetadata(meta, body, filePath, baseDir) {
 const headingMatch = body.match(/^#\s+(.+)/m);
 title = headingMatch ? headingMatch[1].trim() : path.basename(filePath, '.md');
 }
- return { slug, description, type, category, title };
+ let tags = '';
+ if (meta.tags) {
+ tags = Array.isArray(meta.tags) ? meta.tags.join(', ') : String(meta.tags);
+ }
+ const quality = meta.quality !== undefined && meta.quality !== null ? meta.quality : null;
+ return { slug, description, type, category, title, tags, quality };
 }

 /**
 * Shared SQL constants to prevent drift across indexing paths.
 */
+ const SQL_FTS_CREATE =
+ "CREATE VIRTUAL TABLE IF NOT EXISTS mindlore_fts USING fts5(path UNINDEXED, slug, description, type UNINDEXED, category, title, content, tags, quality UNINDEXED, tokenize='porter unicode61')";
+
 const SQL_FTS_INSERT =
- 'INSERT INTO mindlore_fts (path, slug, description, type, category, title, content) VALUES (?, ?, ?, ?, ?, ?, ?)';
+ 'INSERT INTO mindlore_fts (path, slug, description, type, category, title, content, tags, quality) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?)';

 /**
 * Extract headings (h1-h3) from markdown content.
@@ -145,6 +153,30 @@ function getAllMdFiles(dir, skip) {
 return results;
 }

+ /**
+ * Read CC hook stdin and parse JSON envelope.
+ * Returns the value of the first matching field, or raw text as fallback.
+ * @param {string[]} fields - Priority-ordered field names to extract
+ */
+ function readHookStdin(fields) {
+ let input = '';
+ try {
+ input = fs.readFileSync(0, 'utf8').trim();
+ } catch (_err) {
+ return '';
+ }
+ if (!input) return '';
+ try {
+ const parsed = JSON.parse(input);
+ for (const f of fields) {
+ if (parsed[f]) return parsed[f];
+ }
+ } catch (_err) {
+ // plain text
+ }
+ return input;
+ }
+
 module.exports = {
 MINDLORE_DIR,
 DB_NAME,
@@ -154,6 +186,8 @@ module.exports = {
 sha256,
 parseFrontmatter,
 extractFtsMetadata,
+ readHookStdin,
+ SQL_FTS_CREATE,
 SQL_FTS_INSERT,
 extractHeadings,
 requireDatabase,
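The new `readHookStdin` helper above reads file descriptor 0 directly, so it is hard to exercise in isolation. A sketch of the same envelope logic as a pure function (the name `parseHookEnvelope` is hypothetical; only the fallback behavior mirrors the shipped helper):

```javascript
// Pure-function sketch of readHookStdin's envelope handling, for illustration.
// Given raw stdin text and priority-ordered field names, return the first
// truthy field from a JSON envelope, or the raw text as a plain-text fallback.
function parseHookEnvelope(input, fields) {
  const trimmed = input.trim();
  if (!trimmed) return '';
  try {
    const parsed = JSON.parse(trimmed);
    // First truthy field in priority order wins.
    for (const f of fields) {
      if (parsed[f]) return parsed[f];
    }
  } catch (_err) {
    // Not JSON — treat as plain text.
  }
  return trimmed;
}
```

Note the edge case this preserves: valid JSON with none of the expected fields falls through to returning the raw JSON string, exactly like the diffed helper.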
@@ -0,0 +1,51 @@
+ #!/usr/bin/env node
+ 'use strict';
+
+ /**
+ * mindlore-decision-detector — UserPromptSubmit hook
+ *
+ * Detects decision signals in user messages (TR + EN).
+ * Outputs a suggestion to record the decision via /mindlore-decide.
+ * Does NOT block (exit 0) — advisory only.
+ */
+
+ const { findMindloreDir, readHookStdin } = require('./lib/mindlore-common.cjs');
+
+ const SIGNALS_TR = [
+ 'karar verdik', 'karar verildi', 'kararlastirdik', 'kararlaştırdık',
+ 'şunu seçtik', 'sunu sectik', 'bunu yapmayalım', 'bunu yapmayalim',
+ 'yerine', 'tercih ettik', 'onaylandi', 'onaylandı', 'kesinleşti', 'kesinlesti',
+ 'vazgeçtik', 'vazgectik', 'iptal ettik',
+ ];
+
+ const SIGNALS_EN = [
+ 'decided', 'decision made', "let's go with", 'lets go with',
+ "we'll use", 'well use', 'approved', 'settled on',
+ 'going with', 'chosen', 'finalized', 'rejected',
+ ];
+
+ function detectDecision(text) {
+ const lower = text.toLowerCase();
+ for (const signal of SIGNALS_TR) {
+ if (lower.includes(signal)) return signal;
+ }
+ for (const signal of SIGNALS_EN) {
+ if (lower.includes(signal)) return signal;
+ }
+ return null;
+ }
+
+ function main() {
+ const baseDir = findMindloreDir();
+ if (!baseDir) return;
+
+ const userText = readHookStdin(['prompt', 'content', 'message']);
+ if (!userText || userText.length < 10) return;
+
+ const signal = detectDecision(userText);
+ if (signal) {
+ process.stdout.write(`[Mindlore: Karar sinyali tespit edildi ("${signal}") — /mindlore-decide record ile kaydetmek ister misin?]\n`);
+ }
+ }
+
+ main();
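The detector above is case-insensitive substring matching, with Turkish signals checked before English. A standalone copy with a reduced signal list (the shipped lists are longer) shows the first-match-wins behavior:

```javascript
// Standalone illustration of the hook's matcher; reduced signal list.
const SIGNALS = ['karar verdik', 'decided', "let's go with"];

// Returns the first matching signal substring, or null.
function detectDecision(text) {
  const lower = text.toLowerCase();
  for (const signal of SIGNALS) {
    if (lower.includes(signal)) return signal; // first match wins
  }
  return null;
}
```

Because matching is plain `includes`, short generic signals (e.g. `yerine`, `chosen`) can fire on unrelated prompts; the hook mitigates this by being advisory only.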
@@ -13,24 +13,10 @@

 const fs = require('fs');
 const path = require('path');
- const { MINDLORE_DIR, DB_NAME, sha256, openDatabase, getAllMdFiles, parseFrontmatter, extractFtsMetadata, SQL_FTS_INSERT } = require('./lib/mindlore-common.cjs');
+ const { MINDLORE_DIR, DB_NAME, sha256, openDatabase, getAllMdFiles, parseFrontmatter, extractFtsMetadata, SQL_FTS_INSERT, readHookStdin } = require('./lib/mindlore-common.cjs');

 function main() {
- // Read stdin to check if this is a .mindlore/ file change
- let input = '';
- try {
- input = fs.readFileSync(0, 'utf8').trim();
- } catch (_err) {
- // No stdin — skip
- }
-
- let filePath = '';
- try {
- const parsed = JSON.parse(input);
- filePath = parsed.path || parsed.file_path || '';
- } catch (_err) {
- filePath = input;
- }
+ const filePath = readHookStdin(['path', 'file_path']);

 // Only trigger on .mindlore/ changes (empty filePath = skip)
 if (!filePath || !filePath.includes(MINDLORE_DIR)) return;
@@ -74,9 +60,9 @@ function main() {
 if (existing && existing.content_hash === hash) continue;

 const { meta, body } = parseFrontmatter(content);
- const { slug, description, type, category, title } = extractFtsMetadata(meta, body, file, baseDir);
+ const { slug, description, type, category, title, tags, quality } = extractFtsMetadata(meta, body, file, baseDir);
 deleteFts.run(file);
- insertFts.run(file, slug, description, type, category, title, body);
+ insertFts.run(file, slug, description, type, category, title, body, tags, quality);
 upsertHash.run(file, hash, now);
 synced++;
 }
@@ -10,24 +10,10 @@

 const fs = require('fs');
 const path = require('path');
- const { MINDLORE_DIR, DB_NAME, SKIP_FILES, sha256, openDatabase, parseFrontmatter, extractFtsMetadata, SQL_FTS_INSERT } = require('./lib/mindlore-common.cjs');
+ const { MINDLORE_DIR, DB_NAME, SKIP_FILES, sha256, openDatabase, parseFrontmatter, extractFtsMetadata, SQL_FTS_INSERT, readHookStdin } = require('./lib/mindlore-common.cjs');

 function main() {
- let input = '';
- try {
- input = fs.readFileSync(0, 'utf8').trim();
- } catch (_err) {
- return;
- }
-
- let filePath = '';
- try {
- const parsed = JSON.parse(input);
- filePath = parsed.path || parsed.file_path || '';
- } catch (_err) {
- filePath = input;
- }
-
+ const filePath = readHookStdin(['path', 'file_path']);
 if (!filePath) return;

 // Only process .md files inside .mindlore/
@@ -72,11 +58,11 @@ function main() {

 // Parse frontmatter for rich FTS5 columns
 const { meta, body } = parseFrontmatter(content);
- const { slug, description, type, category, title } = extractFtsMetadata(meta, body, filePath, baseDir);
+ const { slug, description, type, category, title, tags, quality } = extractFtsMetadata(meta, body, filePath, baseDir);

 // Update FTS5
 db.prepare('DELETE FROM mindlore_fts WHERE path = ?').run(filePath);
- db.prepare(SQL_FTS_INSERT).run(filePath, slug, description, type, category, title, body);
+ db.prepare(SQL_FTS_INSERT).run(filePath, slug, description, type, category, title, body, tags, quality);

 // Update hash
 db.prepare(
@@ -0,0 +1,62 @@
+ #!/usr/bin/env node
+ 'use strict';
+
+ /**
+ * mindlore-read-guard — PreToolUse hook (if: "Read")
+ *
+ * OpenWolf repeated-read pattern: detects files read multiple times
+ * in the same session and emits a soft warning.
+ * Does NOT block (exit 0) — advisory only.
+ *
+ * Storage: .mindlore/diary/_session-reads.json
+ * Cleanup: session-end hook writes stats to delta then deletes the file.
+ */
+
+ const fs = require('fs');
+ const path = require('path');
+ const { findMindloreDir, readHookStdin } = require('./lib/mindlore-common.cjs');
+
+ function main() {
+ const baseDir = findMindloreDir();
+ if (!baseDir) return;
+
+ const filePath = readHookStdin(['file_path', 'path']);
+ if (!filePath) return;
+
+ // Only track CWD-relative files, skip .mindlore/ internals
+ const cwd = process.cwd();
+ const resolved = path.resolve(filePath);
+ if (!resolved.startsWith(cwd)) return;
+ if (resolved.startsWith(path.resolve(baseDir))) return;
+
+ // Load or create session reads tracker
+ const diaryDir = path.join(baseDir, 'diary');
+ if (!fs.existsSync(diaryDir)) {
+ fs.mkdirSync(diaryDir, { recursive: true });
+ }
+
+ const readsPath = path.join(diaryDir, '_session-reads.json');
+ let reads = {};
+ if (fs.existsSync(readsPath)) {
+ try {
+ reads = JSON.parse(fs.readFileSync(readsPath, 'utf8'));
+ } catch (_err) {
+ reads = {};
+ }
+ }
+
+ const normalizedPath = path.resolve(filePath);
+ const count = (reads[normalizedPath] || 0) + 1;
+ reads[normalizedPath] = count;
+
+ // Write updated reads
+ fs.writeFileSync(readsPath, JSON.stringify(reads, null, 2), 'utf8');
+
+ // Warn on repeated reads (2nd+ time)
+ if (count > 1) {
+ const basename = path.basename(filePath);
+ process.stderr.write(`[Mindlore: ${basename} bu session'da ${count}. kez okunuyor. Değişiklik yoksa tekrar okumayı atlayabilirsin.]\n`);
+ }
+ }
+
+ main();
@@ -10,7 +10,7 @@

 const fs = require('fs');
 const path = require('path');
- const { findMindloreDir, DB_NAME, requireDatabase, extractHeadings } = require('./lib/mindlore-common.cjs');
+ const { findMindloreDir, DB_NAME, requireDatabase, extractHeadings, readHookStdin } = require('./lib/mindlore-common.cjs');

 const MAX_RESULTS = 3;
 const MIN_QUERY_WORDS = 3;
@@ -60,21 +60,7 @@ function extractKeywords(text) {
 }

 function main() {
- let input = '';
- try {
- input = fs.readFileSync(0, 'utf8');
- } catch (_err) {
- return;
- }
-
- let userMessage = '';
- try {
- const parsed = JSON.parse(input);
- userMessage = parsed.prompt || parsed.content || parsed.message || parsed.query || input;
- } catch (_err) {
- userMessage = input;
- }
-
+ const userMessage = readHookStdin(['prompt', 'content', 'message', 'query']);
 if (!userMessage || userMessage.length < MIN_QUERY_WORDS) return;

 const baseDir = findMindloreDir();
@@ -127,7 +113,7 @@ function main() {

 // Build rich inject output
 const metaStmt = db.prepare(
- 'SELECT slug, description, category, title FROM mindlore_fts WHERE path = ?'
+ 'SELECT slug, description, category, title, tags FROM mindlore_fts WHERE path = ?'
 );

 const output = [];
@@ -146,8 +132,9 @@ function main() {
 const description = meta.description || '';

 const headingStr = headings.length > 0 ? `\nBasliklar: ${headings.join(', ')}` : '';
+ const tagsStr = meta.tags ? `\nTags: ${meta.tags}` : '';
 output.push(
- `[Mindlore: ${category}/${title}] ${description}\nDosya: ${relativePath}${headingStr}`
+ `[Mindlore: ${category}/${title}] ${description}\nDosya: ${relativePath}${tagsStr}${headingStr}`
 );
 }

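The last hunk above changes the injected-context format to include a Tags line. A pure-function sketch of that format (`formatResult` is a hypothetical name; the shipped hook builds the string inline):

```javascript
// Sketch of the v0.2 injected-context format built by the search hook.
// Tags and headings lines are omitted when empty, as in the diffed code.
function formatResult({ category, title, description, relativePath, tags, headings }) {
  const tagsStr = tags ? `\nTags: ${tags}` : '';
  const headingStr =
    headings && headings.length > 0 ? `\nBasliklar: ${headings.join(', ')}` : '';
  return `[Mindlore: ${category}/${title}] ${description}\nDosya: ${relativePath}${tagsStr}${headingStr}`;
}
```

Keeping the format a single multi-line string matters here: the hook emits up to three of these blocks as additionalContext, so every extra line is prompt-budget spent.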
@@ -11,6 +11,7 @@

 const fs = require('fs');
 const path = require('path');
+ const { execSync } = require('child_process');
 const { findMindloreDir } = require('./lib/mindlore-common.cjs');

 function formatDate(date) {
@@ -22,6 +23,47 @@ function formatDate(date) {
 return `${y}-${m}-${d}-${h}${min}`;
 }

+ function getRecentGitChanges() {
+ try {
+ const raw = execSync('git diff --name-only HEAD~5..HEAD 2>/dev/null', {
+ encoding: 'utf8',
+ timeout: 5000,
+ }).trim();
+ if (!raw) return [];
+ return raw.split('\n').filter(Boolean).slice(0, 20);
+ } catch (_err) {
+ return [];
+ }
+ }
+
+ function getRecentCommits() {
+ try {
+ const raw = execSync('git log --oneline -5 2>/dev/null', {
+ encoding: 'utf8',
+ timeout: 5000,
+ }).trim();
+ if (!raw) return [];
+ return raw.split('\n').filter(Boolean);
+ } catch (_err) {
+ return [];
+ }
+ }
+
+ function getSessionReads(baseDir) {
+ const readsPath = path.join(baseDir, 'diary', '_session-reads.json');
+ if (!fs.existsSync(readsPath)) return null;
+ try {
+ const data = JSON.parse(fs.readFileSync(readsPath, 'utf8'));
+ const count = Object.keys(data).length;
+ const repeats = Object.values(data).filter((v) => v > 1).length;
+ // Clean up session file
+ fs.unlinkSync(readsPath);
+ return { count, repeats };
+ } catch (_err) {
+ return null;
+ }
+ }
+
 function main() {
 const baseDir = findMindloreDir();
 if (!baseDir) return;
@@ -38,7 +80,12 @@ function main() {
 // Don't overwrite existing delta (idempotent)
 if (fs.existsSync(deltaPath)) return;

- const content = [
+ // Gather structured data
+ const commits = getRecentCommits();
+ const changedFiles = getRecentGitChanges();
+ const reads = getSessionReads(baseDir);
+
+ const sections = [
 '---',
 `slug: delta-${dateStr}`,
 'type: diary',
@@ -48,14 +95,33 @@ function main() {
 `# Session Delta — ${dateStr}`,
 '',
 `Session ended: ${now.toISOString()}`,
- '',
- '## Changes',
- '',
- '_No structured changes tracked in v0.1. Upgrade to v0.2 for detailed deltas._',
- '',
- ].join('\n');
+ ];
+
+ // Commits section
+ sections.push('', '## Commits');
+ if (commits.length > 0) {
+ for (const c of commits) sections.push(`- ${c}`);
+ } else {
+ sections.push('- _(no commits)_');
+ }
+
+ // Changed files section
+ sections.push('', '## Changed Files');
+ if (changedFiles.length > 0) {
+ for (const f of changedFiles) sections.push(`- ${f}`);
+ } else {
+ sections.push('- _(no file changes)_');
+ }
+
+ // Read stats (from read-guard, if active)
+ if (reads) {
+ sections.push('', '## Read Stats');
+ sections.push(`- ${reads.count} files read, ${reads.repeats} repeated reads`);
+ }
+
+ sections.push('');

- fs.writeFileSync(deltaPath, content, 'utf8');
+ fs.writeFileSync(deltaPath, sections.join('\n'), 'utf8');

 // Append to log.md
 const logPath = path.join(baseDir, 'log.md');
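The v0.2 session-end hook assembles the delta body from line arrays. A sketch of that assembly as a pure function over pre-gathered data (`buildDeltaSections` is a hypothetical name; the shipped hook shells out to git for commits and changed files):

```javascript
// Pure-function sketch of the v0.2 delta body assembly (body only, no
// frontmatter). Empty inputs produce placeholder bullets, as in the hook.
function buildDeltaSections(dateStr, commits, changedFiles) {
  const sections = [`# Session Delta — ${dateStr}`];

  sections.push('', '## Commits');
  if (commits.length > 0) {
    for (const c of commits) sections.push(`- ${c}`);
  } else {
    sections.push('- _(no commits)_');
  }

  sections.push('', '## Changed Files');
  if (changedFiles.length > 0) {
    for (const f of changedFiles) sections.push(`- ${f}`);
  } else {
    sections.push('- _(no file changes)_');
  }

  return sections.join('\n');
}
```

Joining an array of lines rather than concatenating template strings keeps the sections conditionally extensible, which is how the hook bolts on the optional Read Stats section.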
package/package.json CHANGED
@@ -1,6 +1,6 @@
 {
 "name": "mindlore",
- "version": "0.1.0",
+ "version": "0.2.0",
 "description": "AI-native knowledge system for Claude Code",
 "type": "commonjs",
 "bin": {
package/plugin.json CHANGED
@@ -1,7 +1,7 @@
 {
 "name": "mindlore",
 "description": "AI-native knowledge system for Claude Code. Persistent, searchable, evolving knowledge base with FTS5.",
- "version": "0.0.1",
+ "version": "0.2.0",
 "skills": [
 {
 "name": "mindlore-ingest",
@@ -11,7 +11,22 @@
 {
 "name": "mindlore-health",
 "path": "skills/mindlore-health/SKILL.md",
- "description": "Run 16-point structural health check on .mindlore/ knowledge base"
+ "description": "Run 18-point structural health check on .mindlore/ knowledge base"
+ },
+ {
+ "name": "mindlore-query",
+ "path": "skills/mindlore-query/SKILL.md",
+ "description": "Search, ask, stats, brief — compounding knowledge pipeline"
+ },
+ {
+ "name": "mindlore-log",
+ "path": "skills/mindlore-log/SKILL.md",
+ "description": "Session logging, pattern extraction, wiki updates"
+ },
+ {
+ "name": "mindlore-decide",
+ "path": "skills/mindlore-decide/SKILL.md",
+ "description": "Record and list decisions with context, alternatives, rationale"
 }
 ],
 "hooks": [
@@ -23,6 +38,10 @@
 "event": "UserPromptSubmit",
 "script": "hooks/mindlore-search.cjs"
 },
+ {
+ "event": "UserPromptSubmit",
+ "script": "hooks/mindlore-decision-detector.cjs"
+ },
 {
 "event": "FileChanged",
 "script": "hooks/mindlore-index.cjs"
@@ -42,6 +61,11 @@
 {
 "event": "PostCompact",
 "script": "hooks/mindlore-post-compact.cjs"
+ },
+ {
+ "event": "PreToolUse",
+ "script": "hooks/mindlore-read-guard.cjs",
+ "if": "Read"
 }
 ]
 }
package/scripts/init.cjs CHANGED
@@ -15,6 +15,7 @@ const path = require('path');
 // ── Constants ──────────────────────────────────────────────────────────

 const { MINDLORE_DIR, DB_NAME, DIRECTORIES, homedir } = require('./lib/constants.cjs');
+ const { SQL_FTS_CREATE } = require('../hooks/lib/mindlore-common.cjs');

 const TEMPLATE_FILES = ['INDEX.md', 'log.md'];

@@ -91,13 +92,16 @@ function migrateDatabase(dbPath, Database) {
 const info = db.pragma('table_info(mindlore_fts)');
 const columns = info.map((r) => r.name);
 if (!columns.includes('slug') || !columns.includes('description')) {
- log('Upgrading FTS5 schema (2 → 7 columns, porter stemmer)...');
+ log('Upgrading FTS5 schema (2 → 9 columns, porter stemmer)...');
 db.exec('DROP TABLE IF EXISTS mindlore_fts');
- db.exec(`
- CREATE VIRTUAL TABLE mindlore_fts
- USING fts5(path UNINDEXED, slug, description, type UNINDEXED, category, title, content, tokenize='porter unicode61');
- `);
- // Clear hashes so full re-index happens
+ db.exec(SQL_FTS_CREATE);
+ db.exec('DELETE FROM file_hashes');
+ db.close();
+ return true;
+ } else if (!columns.includes('tags')) {
+ log('Upgrading FTS5 schema (7 → 9 columns, +tags +quality)...');
+ db.exec('DROP TABLE IF EXISTS mindlore_fts');
+ db.exec(SQL_FTS_CREATE);
 db.exec('DELETE FROM file_hashes');
 db.close();
 return true;
@@ -105,10 +109,7 @@ function migrateDatabase(dbPath, Database) {
 } catch (_err) {
 // table_info fails on FTS5 virtual tables in some versions — recreate
 db.exec('DROP TABLE IF EXISTS mindlore_fts');
- db.exec(`
- CREATE VIRTUAL TABLE mindlore_fts
- USING fts5(path UNINDEXED, slug, description, type UNINDEXED, category, title, content, tokenize='porter unicode61');
- `);
+ db.exec(SQL_FTS_CREATE);
 db.exec('DELETE FROM file_hashes');
 db.close();
 return true;
@@ -143,10 +144,7 @@ function createDatabase(baseDir) {
 const db = new Database(dbPath);
 db.pragma('journal_mode = WAL');

- db.exec(`
- CREATE VIRTUAL TABLE IF NOT EXISTS mindlore_fts
- USING fts5(path UNINDEXED, slug, description, type UNINDEXED, category, title, content, tokenize='porter unicode61');
- `);
+ db.exec(SQL_FTS_CREATE);

 db.exec(`
 CREATE TABLE IF NOT EXISTS file_hashes (
@@ -67,9 +67,9 @@ function main() {

 // Update FTS5
 const { meta, body } = parseFrontmatter(content);
- const { slug, description, type, category, title } = extractFtsMetadata(meta, body, filePath, baseDir);
+ const { slug, description, type, category, title, tags, quality } = extractFtsMetadata(meta, body, filePath, baseDir);
 deleteFts.run(filePath);
- insertFts.run(filePath, slug, description, type, category, title, body);
+ insertFts.run(filePath, slug, description, type, category, title, body, tags, quality);

 // Update hash
 upsertHash.run(filePath, hash, now);
@@ -15,35 +15,19 @@ const path = require('path');
15
15
  // ── Constants ──────────────────────────────────────────────────────────
16
16
 
17
17
  const { DIRECTORIES, TYPE_TO_DIR } = require('./lib/constants.cjs');
18
+ const { parseFrontmatter: _parseFm, getAllMdFiles: _getAllMd } = require('../hooks/lib/mindlore-common.cjs');
18
19
 
19
20
  // ── Helpers ────────────────────────────────────────────────────────────
20
21
 
22
+ // Wrapper: shared parseFrontmatter returns { meta, body }, health-check expects flat object or null
21
23
  function parseFrontmatter(content) {
22
- const match = content.match(/^---\r?\n([\s\S]*?)\r?\n---/);
23
- if (!match) return null;
24
-
25
- const fm = {};
26
- const lines = match[1].split('\n');
27
- for (const line of lines) {
28
- const colonIdx = line.indexOf(':');
29
- if (colonIdx === -1) continue;
30
- const key = line.slice(0, colonIdx).trim();
31
- let value = line.slice(colonIdx + 1).trim();
32
- // Handle arrays
33
- if (value.startsWith('[') && value.endsWith(']')) {
34
- value = value
35
- .slice(1, -1)
36
- .split(',')
37
- .map((s) => s.trim());
38
- }
39
- fm[key] = value;
40
- }
41
- return fm;
24
+ const { meta } = _parseFm(content);
25
+ return Object.keys(meta).length > 0 ? meta : null;
42
26
  }
43
27
 
44
28
  // Health check needs ALL .md files (no skip), so pass empty set
45
29
  function getAllMdFiles(dir) {
46
- return require('../hooks/lib/mindlore-common.cjs').getAllMdFiles(dir, new Set());
30
+ return _getAllMd(dir, new Set());
47
31
  }
48
32
 
49
33
  // ── Checks ─────────────────────────────────────────────────────────────
@@ -148,24 +132,30 @@ class HealthChecker {
148
132
  .prepare('SELECT count(*) as cnt FROM file_hashes')
149
133
  .get();
150
134
 
151
- // Verify 7-column schema (slug, description, type, category, title, content + path)
152
- let schemaOk = true;
135
+ // Verify 9-column schema (slug, description, type, category, title, content, tags, quality + path)
136
+ let schemaVersion = 0;
153
137
  try {
154
- db.prepare('SELECT slug, description, category, title FROM mindlore_fts LIMIT 0').run();
138
+ db.prepare('SELECT tags, quality FROM mindlore_fts LIMIT 0').run();
139
+ schemaVersion = 9;
155
140
  } catch (_err) {
156
- schemaOk = false;
141
+ try {
142
+ db.prepare('SELECT slug, description, category, title FROM mindlore_fts LIMIT 0').run();
143
+ schemaVersion = 7;
144
+ } catch (_err2) {
145
+ schemaVersion = 2;
146
+ }
157
147
  }
158
148
 
159
- if (!schemaOk) {
149
+ if (schemaVersion < 9) {
160
150
  return {
161
151
  warn: true,
162
- detail: `${result.cnt} indexed, ${hashResult.cnt} hashes — OLD SCHEMA (run: npx mindlore init to upgrade)`,
152
+ detail: `${result.cnt} indexed, ${hashResult.cnt} hashes — ${schemaVersion}-col schema (run: npx mindlore init to upgrade to 9-col)`,
163
153
  };
164
154
  }
165
155
 
166
156
  return {
167
157
  ok: true,
168
- detail: `${result.cnt} indexed, ${hashResult.cnt} hashes, 7-col schema`,
158
+ detail: `${result.cnt} indexed, ${hashResult.cnt} hashes, 9-col schema`,
169
159
  };
170
160
  } catch (err) {
171
161
  return { ok: false, detail: `FTS5 error: ${err.message}` };
@@ -279,11 +269,72 @@ class HealthChecker {
279
269
  detail: `${mdFiles.length} files validated`,
280
270
  };
281
271
  }
282
- return {
283
- ok: wrongDir > 0 ? false : undefined,
284
- warn: wrongDir === 0,
285
- detail: issues.join(', '),
286
- };
272
+ if (wrongDir > 0) {
273
+ return { ok: false, detail: issues.join(', ') };
274
+ }
275
+ return { warn: true, detail: issues.join(', ') };
276
+ });
277
+ }
278
+
279
+ // Check 17: Stale deltas (30+ days without archived: true)
280
+ checkStaleDeltas() {
281
+ this.check('Stale deltas', () => {
282
+ const diaryDir = path.join(this.baseDir, 'diary');
283
+ if (!fs.existsSync(diaryDir)) return { ok: true, detail: 'no diary dir' };
284
+
285
+ const now = Date.now();
286
+ const thirtyDays = 30 * 24 * 60 * 60 * 1000;
287
+ let stale = 0;
288
+
289
+ const files = fs.readdirSync(diaryDir).filter((f) => f.startsWith('delta-') && f.endsWith('.md'));
290
+ for (const file of files) {
291
+ const fullPath = path.join(diaryDir, file);
292
+ const content = fs.readFileSync(fullPath, 'utf8').replace(/\r\n/g, '\n');
293
+ const fm = parseFrontmatter(content);
294
+        if (fm && fm.archived === 'true') continue;
+
+        const stat = fs.statSync(fullPath);
+        if (now - stat.mtimeMs > thirtyDays) stale++;
+      }
+
+      if (stale === 0) return { ok: true, detail: `${files.length} deltas, none stale` };
+      return { warn: true, detail: `${stale} deltas older than 30 days without archived flag — run /mindlore-log reflect` };
+    });
+  }
+
+  // Check 18: Conflicting analyses (same tags, different confidence)
+  checkConflictingAnalyses() {
+    this.check('Conflicting analyses', () => {
+      const analysesDir = path.join(this.baseDir, 'analyses');
+      if (!fs.existsSync(analysesDir)) return { ok: true, detail: 'no analyses dir' };
+
+      const files = fs.readdirSync(analysesDir).filter((f) => f.endsWith('.md'));
+      if (files.length < 2) return { ok: true, detail: `${files.length} analyses, no conflict possible` };
+
+      const tagMap = {};
+      for (const file of files) {
+        const content = fs.readFileSync(path.join(analysesDir, file), 'utf8').replace(/\r\n/g, '\n');
+        const fm = parseFrontmatter(content);
+        if (!fm || !fm.tags || !fm.confidence) continue;
+
+        const tags = Array.isArray(fm.tags) ? fm.tags : String(fm.tags).split(',').map((t) => t.trim());
+        for (const tag of tags) {
+          if (!tagMap[tag]) tagMap[tag] = [];
+          tagMap[tag].push({ file, confidence: fm.confidence });
+        }
+      }
+
+      const conflicts = [];
+      for (const [tag, entries] of Object.entries(tagMap)) {
+        if (entries.length < 2) continue;
+        const confidences = new Set(entries.map((e) => e.confidence));
+        if (confidences.size > 1) {
+          conflicts.push(`${tag}: ${entries.map((e) => `${e.file}(${e.confidence})`).join(' vs ')}`);
+        }
+      }
+
+      if (conflicts.length === 0) return { ok: true, detail: `${files.length} analyses, no conflicts` };
+      return { warn: true, detail: `${conflicts.length} tag conflicts: ${conflicts.slice(0, 2).join('; ')}` };
     });
   }
 
@@ -296,6 +347,8 @@ class HealthChecker {
     this.checkDatabase();
     this.checkOrphans();
     this.checkFrontmatter();
+    this.checkStaleDeltas();
+    this.checkConflictingAnalyses();
     return this;
   }

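The checks above call a `parseFrontmatter` helper that is defined earlier in the script and not shown in this diff. As a minimal sketch of what such a helper could look like, assuming flat `key: value` YAML between `---` fences (the shipped implementation may differ):

```javascript
// Minimal frontmatter parser sketch. Assumes flat `key: value` YAML
// between `---` fences; flow-style lists like `tags: [a, b]` become arrays.
// Illustrative only — not the helper the package actually ships.
function parseFrontmatter(content) {
  const match = content.match(/^---\n([\s\S]*?)\n---/);
  if (!match) return null;
  const fm = {};
  for (const line of match[1].split('\n')) {
    const idx = line.indexOf(':');
    if (idx === -1) continue;
    const key = line.slice(0, idx).trim();
    let value = line.slice(idx + 1).trim();
    if (value.startsWith('[') && value.endsWith(']')) {
      value = value.slice(1, -1).split(',').map((t) => t.trim()).filter(Boolean);
    }
    if (key) fm[key] = value;
  }
  return fm;
}
```

Note that a parser this simple returns every scalar as a string, which is why check 17 compares `fm.archived === 'true'` rather than a boolean.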
@@ -0,0 +1,71 @@
+ # Skill: Mindlore Decide
+
+ Record and list decisions in the `.mindlore/decisions/` directory.
+
+ ## Trigger
+
+ User says `/mindlore-decide record` or `/mindlore-decide list`.
+
+ ## Modes
+
+ ### record
+
+ Record a new decision.
+
+ **Flow:**
+ 1. Ask user (or extract from context): What was decided? What alternatives were considered? Why this choice?
+ 2. Generate slug from decision title (kebab-case, max 5 words)
+ 3. Check if a previous decision on the same topic exists → set `supersedes` field
+ 4. Write to `.mindlore/decisions/{slug}.md` with frontmatter:
+
+ ```yaml
+ ---
+ slug: use-fts5-over-vector
+ type: decision
+ title: Use FTS5 over vector search for v0.1
+ tags: [search, fts5, architecture]
+ date: 2026-04-11
+ supersedes: null
+ status: active
+ description: Chose FTS5 keyword search as primary engine, vector deferred to v0.4
+ ---
+ ```
+
+ 5. Body structure:
+ ```markdown
+ # {title}
+
+ ## Context
+ Why this decision was needed.
+
+ ## Alternatives Considered
+ 1. **Option A** — pros/cons
+ 2. **Option B** — pros/cons
+
+ ## Decision
+ What was chosen and why.
+
+ ## Consequences
+ What this means going forward.
+ ```
+
+ 6. Append to `log.md`: `| {date} | decide | {slug}.md |`
+ 7. FTS5 auto-indexes via FileChanged hook
+
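The slug rule in step 2 (kebab-case, max 5 words) could be sketched as follows. This is illustrative only, since the skill itself is prompt-driven rather than scripted:

```javascript
// Sketch of the step-2 slug rule: lowercase, kebab-case, max 5 words.
function makeSlug(title, maxWords = 5) {
  return title
    .toLowerCase()
    .replace(/[^a-z0-9\s-]/g, ' ') // drop punctuation
    .trim()
    .split(/\s+/)
    .slice(0, maxWords)
    .join('-');
}
```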
55
+ ### list
56
+
57
+ List active decisions.
58
+
59
+ **Flow:**
60
+ 1. Read all `.md` files in `.mindlore/decisions/`
61
+ 2. Parse frontmatter, filter `status: active`
62
+ 3. Display as table: slug, title, date, tags
63
+ 4. Show supersedes chain if any (A → B → C)
64
+
65
+ ## Rules
66
+
67
+ - Slug must be unique in decisions/
68
+ - `supersedes` field points to the slug of the replaced decision
69
+ - When a decision is superseded, update old one: `status: superseded`
70
+ - Tags should match domain topics for FTS5 discoverability
71
+ - Keep decision body concise — context + alternatives + choice + consequences
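The supersedes-chain display from list step 4 can be sketched like this, where `decisions` is a hypothetical map of slug to parsed frontmatter (the skill itself works on files, not a map):

```javascript
// Sketch of list step 4: follow `supersedes` links into an "A → B → C"
// display chain, oldest decision first. The seen-set is a cycle guard.
function supersedesChain(decisions, slug) {
  const chain = [slug];
  const seen = new Set([slug]);
  let cur = decisions[slug];
  while (cur && cur.supersedes && !seen.has(cur.supersedes)) {
    chain.unshift(cur.supersedes);
    seen.add(cur.supersedes);
    cur = decisions[cur.supersedes];
  }
  return chain.join(' → ');
}
```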
@@ -94,9 +94,20 @@ Only update the stats line: increment source count and total count.
 N source, N analysis, N total
 ```
 
- ## Post-Ingest Verification
+ ## Post-Ingest Quality Gate
 
- After ingest, run health check:
+ After every ingest, verify all 6 checkpoints before reporting success:
+
+ 1. **raw/ file exists** — immutable capture written with frontmatter (slug, type, source_url)
+ 2. **sources/ summary exists** — processed summary with full frontmatter (slug, type, title, tags, quality, description)
+ 3. **INDEX.md updated** — stats line incremented, Recent section has new entry
+ 4. **Domain updated** — if relevant domain exists, new finding added (max 1 domain per ingest)
+ 5. **log.md entry** — append `| {date} | ingest | {slug}.md |`
+ 6. **FTS5 indexed** — FileChanged hook auto-triggers, but verify: `node scripts/mindlore-fts5-search.cjs "{keyword}"` returns the new file
+
+ If any checkpoint fails, fix it before reporting "ingest complete". Do NOT skip steps.
+
+ Optional: run full health check for structural integrity:
 ```bash
 node scripts/mindlore-health-check.cjs
 ```
@@ -0,0 +1,79 @@
+ # Skill: Mindlore Log
+
+ Session logging, pattern extraction, and wiki updates.
+
+ ## Trigger
+
+ `/mindlore-log <mode>` where mode is `log`, `reflect`, `status`, or `save`.
+
+ ## Modes
+
+ ### log
+
+ Write a manual diary entry.
+
+ **Flow:**
+ 1. User provides a note/observation (or extract from conversation context)
+ 2. Generate slug: `note-YYYY-MM-DD-HHmm`
+ 3. Write to `.mindlore/diary/{slug}.md`:
+
+ ```yaml
+ ---
+ slug: note-2026-04-11-1530
+ type: diary
+ date: 2026-04-11
+ ---
+ ```
+
+ 4. Body: user's note as-is
+ 5. Append to `log.md`: `| {date} | log | {slug}.md |`
+
+ ### reflect
+
+ Scan old deltas, extract patterns, write to learnings/.
+
+ **Flow (v0.2 — basic):**
+ 1. Read all non-archived delta files in `diary/` (no `archived: true` frontmatter)
+ 2. Present summary to user: "Found N unprocessed deltas spanning DATE1 to DATE2"
+ 3. For each delta, extract: repeated topics, recurring decisions, common file changes
+ 4. Propose learnings to user: "I found these patterns: ..."
+ 5. User approves → write to `learnings/{topic}.md` (append if exists, create if not)
+ 6. Format: rules prefixed `YAPMA:` (Turkish for "do not") / `BEST PRACTICE:` / `KRITIK:` ("critical")
+ 7. Mark processed deltas: add `archived: true` to their frontmatter
+ 8. Append to `log.md`: `| {date} | reflect | {N} deltas processed, {M} learnings written |`
+
+ **Important:** Do NOT auto-extract patterns. Present findings, user approves. This is v0.2 basic mode — automated pattern extraction is deferred to v0.2.1.
+
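The archiving stamp in reflect step 7 can be sketched as a small string transform. A minimal sketch, assuming the flat frontmatter format shown above (the skill performs this as a file edit, not via this exact helper):

```javascript
// Sketch of reflect step 7: stamp `archived: true` into a delta's
// frontmatter, just before the closing `---` fence. Idempotent.
function markArchived(content) {
  if (!content.startsWith('---\n')) return content;        // no frontmatter
  if (/^archived:\s*true$/m.test(content)) return content; // already stamped
  return content.replace(/\n---/, '\narchived: true\n---');
}
```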
+ ### status
+
+ Show recent session summary.
+
+ **Flow:**
+ 1. Read last 5 delta files from `diary/` (sorted by date, newest first)
+ 2. For each delta, extract: date, commit count, changed-file count
+ 3. Display as compact table
+ 4. Show any open items (from delta "Yarım Kalan" — unfinished work — sections if present)
+ 5. Show total: "N sessions, M commits, K unique files changed"
+
+ ### save
+
+ Structured delta + log.md append + domain wiki update.
+
+ **Flow:**
+ 1. Gather current session context:
+    - Recent git commits (last 5)
+    - Changed files
+    - Decisions made (if decision-detector captured any)
+ 2. Write structured delta to `diary/delta-YYYY-MM-DD-HHmm.md` (same format as session-end hook)
+ 3. Append to `log.md`
+ 4. Ask user: "Which domain pages should be updated with this session's findings?"
+ 5. If user specifies domains → update those `.mindlore/domains/{slug}.md` pages with new findings
+ 6. Max 2 domain updates per save (prevents sprawl)
+
+ ## Rules
+
+ - Diary files are session-scoped (temporary), learnings are permanent
+ - reflect marks deltas as `archived: true` — they stay in diary/ but are not processed again
+ - Health check warns on deltas older than 30 days without `archived: true`
+ - learnings/ files are topic-based (one per topic), append-only
+ - save mode is the manual equivalent of what the session-end hook does automatically
@@ -0,0 +1,125 @@
+ # Skill: Mindlore Query
+
+ Search, ask, analyze, and retrieve knowledge from `.mindlore/`.
+
+ ## Trigger
+
+ `/mindlore-query <mode> [query]` where mode is `search`, `ask`, `stats`, or `brief`.
+
+ ## Modes
+
+ ### search
+
+ FTS5 keyword search with direct results.
+
+ **Flow:**
+ 1. Parse user query into keywords (strip stop words)
+ 2. Run FTS5 MATCH on `mindlore_fts` table
+ 3. Return top 5 results (configurable) with: path, title, description, category, rank score
+ 4. Display as table with snippet preview
+ 5. If `--tags <tag>` flag provided: apply `WHERE tags MATCH '<tag>'` filter
+
+ **Output format:**
+ ```
+ | # | Category | Title | Description | Score |
+ |---|----------|-------|-------------|-------|
+ | 1 | sources | React Hooks | useEffect cleanup patterns | -2.34 |
+ ```
+
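Steps 1 and 2 of `search` can be sketched as a small query builder. The stop-word list here is illustrative, not the one the package ships with:

```javascript
// Sketch of search steps 1–2: strip stop words from a prompt and build
// an FTS5 MATCH string. Terms are quoted so FTS5 treats them literally,
// then OR'd together.
const STOP_WORDS = new Set(['the', 'a', 'an', 'is', 'how', 'do', 'i', 'to', 'in']);

function buildMatchQuery(prompt) {
  const keywords = prompt
    .toLowerCase()
    .replace(/[^a-z0-9\s]/g, ' ')
    .split(/\s+/)
    .filter((w) => w.length > 1 && !STOP_WORDS.has(w));
  return keywords.map((w) => `"${w}"`).join(' OR ');
}
```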
+ ### ask
+
+ Compounding query pipeline — knowledge grows with each answer.
+
+ **Flow:**
+ 1. Parse user question
+ 2. FTS5 search → find relevant files (sources + domains + analyses + insights — previous answers INCLUDED)
+ 3. Read top 3-5 relevant files using ctx_execute_file if context-mode available, else Read
+ 4. Synthesize answer from found knowledge
+ 5. Cite sources in `[kaynak: sources/x.md]` format ("kaynak" = source)
+ 6. Ask user: "Would you like me to save this answer?"
+ 7. If yes → determine target:
+    - Short answer (<200 lines, 1-2 sources) → `insights/{slug}.md`
+    - Large synthesis (200+ lines, 3+ sources) → `analyses/{slug}.md`
+ 8. Write with frontmatter:
+
+ ```yaml
+ ---
+ slug: react-hooks-cleanup-comparison
+ type: insight
+ title: React Hooks Cleanup Comparison
+ tags: [react, hooks, useEffect]
+ confidence: high
+ sources_used: [react-hooks, typescript-generics]
+ description: Comparison of cleanup patterns in useEffect vs useLayoutEffect
+ ---
+ ```
+
+ 9. FTS5 auto-indexes via FileChanged hook → next query finds this answer
+ 10. Update relevant domain page (max 1) with backlink if applicable
+ 11. Append to `log.md`: `| {date} | query-ask | {slug}.md |`
+
+ **Compounding effect:** Step 2 searches ALL of `.mindlore/` including previous `insights/` and `analyses/`. Each saved answer enriches the next query.
+
+ **Error compounding prevention:**
+ - `confidence` field is REQUIRED on writebacks (high/medium/low)
+ - `sources_used` lists exact slugs — traceability
+ - Health check flags conflicting analyses on same topic (different confidence)
+ - User approval is the quality gate — low-quality answers are not saved
+
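The routing in ask step 7 can be sketched as below. Treating either threshold (200+ lines or 3+ sources) as sufficient for `analyses/` is an interpretation of the two bullets above, not something the skill states for mixed cases:

```javascript
// Sketch of ask step 7: route a saved answer to insights/ or analyses/.
// Assumption: crossing either threshold (200 lines, 3 sources) is enough
// to count as a "large synthesis".
function writebackTarget(answer, sourcesUsed) {
  const lines = answer.split('\n').length;
  return lines >= 200 || sourcesUsed.length >= 3 ? 'analyses' : 'insights';
}
```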
+ ### stats
+
+ Knowledge base statistics.
+
+ **Flow:**
+ 1. Count files per directory (9 directories)
+ 2. Count FTS5 entries and file_hashes entries
+ 3. Find most recent ingest (latest file by modified date per directory)
+ 4. Count tag frequency (parse all frontmatter, aggregate tags)
+ 5. Display summary:
+
+ ```
+ Mindlore Stats
+ ─────────────
+ Total files: 47 (FTS5: 47 indexed)
+ - sources: 18
+ - domains: 5
+ - analyses: 6
+ - insights: 3
+ - connections: 2
+ - learnings: 4
+ - diary: 8
+ - decisions: 1
+ - raw: 0
+
+ Top tags: security (12), hooks (8), fts5 (6), testing (5)
+ Last ingest: 2026-04-11 (sources/react-hooks.md)
+ DB size: 1.2 MB
+ ```
+
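The tag aggregation in stats step 4 can be sketched as below, assuming the tags of each file have already been parsed into arrays:

```javascript
// Sketch of stats step 4: aggregate tag frequency across all frontmatter,
// producing the "Top tags: security (12), ..." entries shown above.
function tagFrequency(allTags) {
  const counts = {};
  for (const tags of allTags) {
    for (const tag of tags) counts[tag] = (counts[tag] || 0) + 1;
  }
  return Object.entries(counts)
    .sort((a, b) => b[1] - a[1]) // most frequent first (stable for ties)
    .map(([tag, n]) => `${tag} (${n})`);
}
```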
+ ### brief
+
+ Quick context on a topic — token-efficient (~50 tokens output).
+
+ **Flow:**
+ 1. FTS5 search for topic
+ 2. If a domain page exists → read first 3 lines of body (after frontmatter)
+ 3. If no domain → read description field from top FTS5 match
+ 4. Return: title + description + "Read full: {path}" pointer
+ 5. Do NOT read the full file — this mode is for quick "do I need to open this?" decisions
+
+ **Output format:**
+ ```
+ [Mindlore Brief: Security]
+ SSH hardening, firewall rules, audit checks. 5 sources, 2 analyses linked.
+ → Read full: .mindlore/domains/security.md
+ ```
+
+ ## Rules
+
+ - All modes respect the SCHEMA.md writeback rules (Section 6: Wiki vs Diary)
+ - search and brief are read-only — no writes
+ - ask writes only with user approval
+ - stats is read-only
+ - Token strategy: prefer ctx_execute_file (if context-mode installed), fall back to Read
+ - Tags filter: `--tags security` works in search and ask modes
+ - Max results: search=5, ask=3-5 (for synthesis), brief=1
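The body extraction in brief step 2 can be sketched as below. Skipping blank lines when taking the "first 3 lines" is an assumption here, not something the skill specifies:

```javascript
// Sketch of brief step 2: first 3 non-empty body lines after frontmatter.
function briefBody(content, n = 3) {
  const body = content.replace(/^---\n[\s\S]*?\n---\n?/, '');
  return body.split('\n').filter((l) => l.trim()).slice(0, n);
}
```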
@@ -143,14 +143,70 @@ Discover unexpected connections between sources. Cross-reference analysis.
 - Max results: 3 per query (BM25 ranking)
 - Hook injects: file path + first 2 headings
 
+ ### FTS5 Columns (9-col schema, v0.2)
+
+ | Column | Indexed | Source |
+ |--------|---------|--------|
+ | `path` | UNINDEXED | File system path |
+ | `slug` | Yes | Frontmatter slug |
+ | `description` | Yes | Frontmatter description |
+ | `type` | UNINDEXED | Frontmatter type |
+ | `category` | Yes | Parent directory name |
+ | `title` | Yes | Frontmatter title or first heading |
+ | `content` | Yes | Markdown body (sans frontmatter) |
+ | `tags` | Yes | Frontmatter tags (comma-separated) |
+ | `quality` | UNINDEXED | Frontmatter quality (NULL until 50+ sources) |
+
160
  ### Search Flow (UserPromptSubmit hook)
147
161
 
148
162
  1. Extract keywords from user prompt
149
163
  2. Query FTS5 with BM25 ranking
150
- 3. Return max 3 results as stderr additionalContext
164
+ 3. Return max 3 results as stdout additionalContext
151
165
  4. Agent reads full file only if needed (progressive disclosure)
152
166
 
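Step 2's ranked lookup could take roughly this shape; the exact columns selected and the result limit are taken from this document, while everything else is an assumption about the hook's internals:

```sql
-- Sketch of the hook's BM25-ranked query (lower bm25() = better match).
SELECT path, title, description, bm25(mindlore_fts) AS rank
FROM mindlore_fts
WHERE mindlore_fts MATCH ?
ORDER BY rank
LIMIT 3;
```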
153
- ## 6. Compounding
167
+ ## 6. Wiki vs Diary (Writeback Target Rules)
168
+
169
+ Knowledge goes to one of two layers. The agent MUST pick the correct one.
170
+
171
+ ### Wiki Layer (permanent knowledge)
172
+
173
+ Directories: `sources/`, `domains/`, `analyses/`, `insights/`, `connections/`, `learnings/`
174
+
175
+ - Persists across sessions — reference value
176
+ - Indexed by FTS5, discoverable via search hook
177
+ - Updated by ingest, query writeback, reflect, evolve
178
+ - Content should be factual, sourced, and reusable
179
+
180
+ ### Diary Layer (session-scoped logs)
181
+
182
+ Directories: `diary/`, `decisions/`
183
+
184
+ - Session-specific: deltas, logs, decision snapshots
185
+ - diary/ entries get `archived: true` after reflect processes them
186
+ - decisions/ are permanent but session-originated (context + rationale)
187
+ - Patterns extracted from diary → moved to `learnings/` (wiki layer)
188
+
189
+ ### Selection Rule
190
+
191
+ | Content Type | Target | Example |
192
+ |-------------|--------|---------|
193
+ | Ingested source summary | `sources/` | URL or text summary |
194
+ | Topic wiki page | `domains/` | Consolidated knowledge on a subject |
195
+ | Multi-source synthesis | `analyses/` | Comparison table, architecture decision |
196
+ | Short Q&A answer | `insights/` | Query writeback (<200 lines) |
197
+ | Cross-reference finding | `connections/` | Link between 2+ unrelated sources |
198
+ | Persistent rule/lesson | `learnings/` | YAPMA/BEST PRACTICE from reflect |
199
+ | Session log/delta | `diary/` | What happened this session |
200
+ | Decision record | `decisions/` | Why X was chosen over Y |
201
+ | Raw capture | `raw/` | Immutable original (URL dump, paste) |
202
+
203
+ ### Anti-patterns
204
+
205
+ - Do NOT write session-specific notes to `insights/` — use `diary/`
206
+ - Do NOT write permanent rules to `diary/` — use `learnings/`
207
+ - Do NOT write decision rationale to `analyses/` — use `decisions/`
208
+
209
+ ## 7. Compounding
154
210
 
155
211
  Knowledge compounds when outputs become inputs:
156
212
 
@@ -173,7 +229,7 @@ Offer to save when:
173
229
  - Large synthesis (200+ lines, 3+ sources) → analyses/
174
230
  - Cross-cutting link → connections/
175
231
 
176
- ## 7. Learnings
232
+ ## 8. Learnings
177
233
 
178
234
  Persistent rules extracted from reflect operations.
179
235
  Organized by topic: `git.md`, `testing.md`, `security.md`, etc.
@@ -201,7 +257,7 @@ tags: [testing, jest, mock]
201
257
  - Use `YAPMA:` / `BEST PRACTICE:` / `KRITIK:` prefixes
202
258
  - Reflect skill proposes, user approves before writing
203
259
 
204
- ## 8. Naming Conventions
260
+ ## 9. Naming Conventions
205
261
 
206
262
  ### Files
207
263