@giwonn/claude-daily-review 0.3.2 → 0.3.4

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -0,0 +1,195 @@
+ ---
+ description: Generate reviews from raw conversation logs (supports natural language targeting)
+ allowed-tools: ["Read", "Write", "Bash"]
+ ---
+
+ # Generate Reviews
+
+ ## Important Rules
+
+ - Write all review content in Korean.
+ - **NEVER show raw error messages or file paths to the user.**
+ - Use the user's profile to add business context.
+ - Today's date: use the system date for resolving relative expressions.
+
+ ## Step 0: Interpret User Request
+
+ The user may provide a natural language request as arguments. Parse their intent to determine:
+
+ 1. **Date scope** — which dates/periods to generate
+ 2. **Period type** — daily, weekly, monthly, quarterly, yearly (or all cascading)
+ 3. **Project filter** — a specific project name (optional)
+ 4. **Force regeneration** — whether to regenerate already-generated reviews
+
24
+ ### Interpretation Examples
+
+ | User input | from | to | force | Period type | Project |
+ |---|---|---|---|---|---|
+ | (empty) | - | - | no | all missing | all |
+ | "어제 회고" | yesterday | yesterday | yes | daily | all |
+ | "3월 15일 회고" | 2026-03-15 | 2026-03-15 | yes | daily | all |
+ | "지난주 주간 회고" | (week start) | (week end) | yes | weekly | all |
+ | "이번달 회고" | (month start) | today | yes | daily+monthly | all |
+ | "1분기 회고" | Q1 start | Q1 end | yes | quarterly | all |
+ | "my-app 프로젝트 3월 회고" | 03-01 | 03-31 | yes | daily+monthly | my-app |
+ | "올해 회고 다시 만들어줘" | year start | today | yes | all cascade | all |
+ | "3월 1일~15일" | 03-01 | 03-15 | yes | daily | all |
+
+ **Rules:**
+ - If the user specifies a specific date/period, set `force` to `yes` (they want it regenerated).
+ - If no input is given, run in default mode (generate only missing reviews, no force).
+ - For period types like "주간", "월간", "분기", "연간": generate the requested summary AND any prerequisite lower-level reviews needed as input.
+ - When the user mentions a project name, generate reviews for all dates but only include logs from that project's cwd.
+
44
+ ## Step 1: Collect Data
+
+ Based on the Step 0 interpretation, build the command:
+
+ ```bash
+ CLAUDE_PLUGIN_DATA="${CLAUDE_PLUGIN_DATA}" node "${CLAUDE_PLUGIN_ROOT}/lib/collect-raw-logs.mjs" [OPTIONS]
+ ```
+
+ Options (all optional):
+ - `--from YYYY-MM-DD` — start date filter
+ - `--to YYYY-MM-DD` — end date filter
+ - `--force` — include dates even if already generated
+
+ If no options are given, the script runs in default incremental mode (collecting only logs newer than the last generation).
+
+ This outputs JSON with:
+ - `profile`: the user's company/role/team/context
+ - `language`: review language
+ - `needs.daily`: dates that need daily review generation
+ - `needs.weekly`: weeks that need a weekly summary
+ - `needs.monthly`: months that need a monthly summary
+ - `needs.quarterly`: quarters that need a quarterly summary
+ - `needs.yearly`: years that need a yearly summary
+ - `logs`: raw conversation logs keyed by date
+
+ If every list under `needs` is empty, tell the user "해당 기간에 생성할 회고가 없습니다." and stop.
+
71
+ ## Step 2: Apply Filters
+
+ - **Project filter**: If the user requested a specific project, filter `logs[date]` entries to only include those where the last path segment of `cwd` matches the project name.
+ - **Period type filter**: If the user only requested a specific period type (e.g., "주간 회고"), only generate that type and its prerequisites. For example, if weekly is requested, generate daily reviews first (as input), then the weekly summary.
+
76
+ ## Step 3: Generate Daily Reviews
+
+ For each date in `needs.daily`:
+
+ 1. Take all log entries from `logs[date]`
+ 2. Group by `cwd` (last path segment = project name)
+ 3. For each project, analyze user/assistant pairs and extract:
+    - **작업 요약**: What was accomplished (use profile for business context)
+    - **배운 것**: New things learned
+    - **고민한 포인트**: Decisions and reasoning
+    - **질문과 답변**: Key Q&A (summarized)
+ 4. General questions go under "미분류"
+
+ Write via:
+ ```bash
+ echo '<content>' | CLAUDE_PLUGIN_DATA="${CLAUDE_PLUGIN_DATA}" node "${CLAUDE_PLUGIN_ROOT}/lib/storage-cli.mjs" write "daily/{date}.md"
+ ```
+
+ Format:
+ ```markdown
+ ---
+ date: {date}
+ type: daily-review
+ projects: [{names}]
+ ---
+
+ # {date} Daily Review
+
+ ## [{project}] {title}
+ **작업 요약:** {summary}
+ **배운 것:**
+ - {item}
+ **고민한 포인트:**
+ - {decision} → {choice} ({reason})
+ **질문과 답변:**
+ - Q: {question}
+   → A: {answer}
+
+ ## 미분류
+ - Q: {question}
+   → A: {answer}
+ ```
+
119
+ ## Step 4: Generate Weekly Summaries
+
+ For each week in `needs.weekly`:
+
+ 1. Read the daily reviews that belong to this week using:
+    ```bash
+    CLAUDE_PLUGIN_DATA="${CLAUDE_PLUGIN_DATA}" node "${CLAUDE_PLUGIN_ROOT}/lib/storage-cli.mjs" read "daily/{date}.md"
+    ```
+ 2. Summarize into a weekly review with:
+    - 주요 성과
+    - 기술 스택 활용
+    - 핵심 의사결정
+    - 성장 포인트
+
+ Write to `weekly/{week}.md`.
+
+ ## Step 5: Generate Monthly Summaries
+
+ For each month in `needs.monthly`:
+
+ 1. Read the weekly reviews for that month (or the daily reviews if weekly is empty)
+ 2. Summarize into a monthly review with:
+    - 프로젝트별 진행 요약
+    - 이번 달 핵심 성장
+    - 기술 스택
+    - 주요 의사결정 기록
+
+ Write to `monthly/{month}.md`.
+
+ ## Step 6: Generate Quarterly Summaries
+
+ For each quarter in `needs.quarterly`:
+
+ 1. Read the monthly reviews for that quarter
+ 2. Summarize with:
+    - 분기 성과 요약
+    - 핵심 역량 성장
+    - 기술 스택 총괄
+    - 경력기술서 하이라이트
+
+ Write to `quarterly/{quarter}.md`.
+
+ ## Step 7: Generate Yearly Summaries
+
+ For each year in `needs.yearly`:
+
+ 1. Read the quarterly reviews for that year
+ 2. Summarize with:
+    - 연간 프로젝트 총괄
+    - 핵심 역량 맵
+    - 이력서용 요약
+    - 경력기술서용 상세
+
+ Write to `yearly/{year}.md`.
+
174
+ ## Step 8: Update the last-generated timestamp
+
+ **Only update it when running in default mode (no user-specified scope).**
+
+ If the user requested a specific date/period/project, do NOT update `last-generated.json` — the incremental tracking should remain unchanged.
+
+ If running in default mode, save the `newTimestamp` from Step 1:
+ ```bash
+ echo '{"timestamp":"NEW_TIMESTAMP_VALUE"}' > "${CLAUDE_PLUGIN_DATA}/last-generated.json"
+ ```
+
+ ## Step 9: Report
+
+ Tell the user what was generated:
+ > "회고 생성 완료!"
+ > - 일일 회고: {count}개
+ > - 주간 요약: {count}개
+ > - 월간 요약: {count}개
+ > - 분기 요약: {count}개
+ > - 연간 요약: {count}개
+
+ Only show lines where count > 0.
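
Steps 0–2 above reduce to two small operations: resolving a project name from the last path segment of a `cwd`, and bounding dates with `--from`/`--to`. A minimal sketch of that filtering logic (illustrative only — `projectOf`, `inRange`, and `filterLogs` are hypothetical names, not code from this package):

```javascript
// Hypothetical helpers mirroring Step 2's project filter and Step 1's
// --from/--to range check. Dates are compared as YYYY-MM-DD strings,
// which sort lexicographically in chronological order.
const projectOf = (cwd) => cwd.split('/').filter(Boolean).pop() ?? '';

const inRange = (date, from, to) =>
  (!from || date >= from) && (!to || date <= to);

const filterLogs = (entries, { project, from, to }) =>
  entries.filter(
    (e) =>
      (!project || projectOf(e.cwd) === project) &&
      inRange(e.date, from, to)
  );

const sample = [
  { date: '2026-03-10', cwd: '/home/u/dev/my-app' },
  { date: '2026-03-20', cwd: '/home/u/dev/other' },
];
console.log(filterLogs(sample, { project: 'my-app', from: '2026-03-01', to: '2026-03-15' }).length); // → 1
```

String comparison works here only because the dates are zero-padded ISO strings; omitting either bound disables that side of the range.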
package/hooks/hooks.json CHANGED
@@ -12,17 +12,6 @@
        ]
      }
    ],
-   "SessionEnd": [
-     {
-       "hooks": [
-         {
-           "type": "agent",
-           "prompt": "Follow the instructions in the file at ${CLAUDE_PLUGIN_ROOT}/prompts/session-end.md exactly. The CLAUDE_PLUGIN_DATA directory is: ${CLAUDE_PLUGIN_DATA}. The plugin root is: ${CLAUDE_PLUGIN_ROOT}",
-           "timeout": 120
-         }
-       ]
-     }
-   ],
    "SessionStart": [
      {
        "matcher": "startup",
@@ -31,6 +20,12 @@
            "type": "command",
            "command": "\"${CLAUDE_PLUGIN_ROOT}/hooks/run-hook.cmd\" session-start-check",
            "timeout": 5
+         },
+         {
+           "type": "command",
+           "command": "node \"${CLAUDE_PLUGIN_ROOT}/hooks/recover-sessions.mjs\"",
+           "async": true,
+           "timeout": 30
          }
        ]
      }
package/hooks/on-stop.mjs CHANGED
@@ -4,6 +4,34 @@ import { loadConfig, createStorageAdapter } from '../lib/config.mjs';
  import { parseHookInput, appendRawLog } from '../lib/raw-logger.mjs';
  import { getRawDir } from '../lib/vault.mjs';
  import { formatDate } from '../lib/periods.mjs';
+ import { readFileSync, writeFileSync, mkdirSync } from 'fs';
+ import { dirname, join } from 'path';
+
+ /**
+  * Read the last user message and its timestamp from a transcript file
+  * @param {string} transcriptPath
+  * @returns {{ message: string, timestamp: string }}
+  */
+ function getLastUserEntry(transcriptPath) {
+   try {
+     const content = readFileSync(transcriptPath, 'utf-8');
+     const lines = content.trim().split('\n');
+     for (let i = lines.length - 1; i >= 0; i--) {
+       try {
+         const entry = JSON.parse(lines[i]);
+         if (entry.type !== 'user') continue;
+         const msgContent = entry.message?.content;
+         const ts = entry.timestamp || '';
+         if (typeof msgContent === 'string' && msgContent.trim()) return { message: msgContent, timestamp: ts };
+         if (Array.isArray(msgContent)) {
+           const textPart = msgContent.find(p => p.type === 'text' && p.text?.trim());
+           if (textPart) return { message: textPart.text, timestamp: ts };
+         }
+       } catch { continue; }
+     }
+   } catch { /* transcript not accessible */ }
+   return { message: '', timestamp: '' };
+ }
 
  async function main() {
    try {
@@ -14,11 +42,39 @@ async function main() {
      process.stdin.setEncoding('utf-8');
      for await (const chunk of process.stdin) { data += chunk; }
      const input = parseHookInput(data);
+
+     // Get user message + timestamp from transcript
+     let userMessage = '';
+     let userTimestamp = '';
+     if (input.transcript_path) {
+       const userEntry = getLastUserEntry(input.transcript_path);
+       userMessage = userEntry.message;
+       userTimestamp = userEntry.timestamp;
+     }
+     input.last_user_message = userMessage;
+     input.user_timestamp = userTimestamp;
+
      const sessionDir = getRawDir(input.session_id);
      const date = formatDate(new Date());
      await appendRawLog(storage, sessionDir, date, input);
-   } catch {
-     // async hook fail silently
+
+     // Save transcript path for session recovery
+     if (input.transcript_path) {
+       const metaPath = `${sessionDir}/.meta.json`;
+       const meta = JSON.stringify({ transcript_path: input.transcript_path, session_id: input.session_id });
+       await storage.write(metaPath, meta);
+     }
+   } catch (err) {
+     try {
+       const logDir = process.env.CLAUDE_PLUGIN_DATA;
+       if (logDir) {
+         const logPath = join(logDir, 'error.log');
+         mkdirSync(dirname(logPath), { recursive: true });
+         writeFileSync(logPath, `${new Date().toISOString()} ${err.message}\n${err.stack}\n`, { flag: 'a' });
+       }
+     } catch {
+       // truly silent
+     }
    }
  }
  main();
package/hooks/recover-sessions.mjs ADDED
@@ -0,0 +1,179 @@
+ #!/usr/bin/env node
+ // @ts-check
+ // Recovers missing raw log entries from transcript files on SessionStart.
+ // Uses a lock file to prevent concurrent recovery across multiple sessions.
+ import { readFileSync, writeFileSync, unlinkSync, existsSync, mkdirSync } from 'fs';
+ import { join, dirname, basename } from 'path';
+ import { loadConfig, createStorageAdapter } from '../lib/config.mjs';
+ import { formatDate } from '../lib/periods.mjs';
+ import { getRawDir } from '../lib/vault.mjs';
+
+ const LOCK_STALE_MS = 120_000; // 2 minutes
+
+ function getLockPath() {
+   const dataDir = process.env.CLAUDE_PLUGIN_DATA;
+   if (!dataDir) return null;
+   return join(dataDir, 'recover.lock');
+ }
+
+ /** @returns {boolean} */
+ function acquireLock() {
+   const lockPath = getLockPath();
+   if (!lockPath) return false;
+
+   mkdirSync(dirname(lockPath), { recursive: true });
+
+   // Check for stale lock
+   if (existsSync(lockPath)) {
+     try {
+       const lock = JSON.parse(readFileSync(lockPath, 'utf-8'));
+       const age = Date.now() - new Date(lock.timestamp).getTime();
+       if (age < LOCK_STALE_MS) return false; // Another session is recovering
+     } catch { /* corrupt lock, take over */ }
+   }
+
+   try {
+     writeFileSync(lockPath, JSON.stringify({ pid: process.pid, timestamp: new Date().toISOString() }));
+     return true;
+   } catch { return false; }
+ }
+
+ function releaseLock() {
+   const lockPath = getLockPath();
+   if (!lockPath) return;
+   try { unlinkSync(lockPath); } catch { /* already removed */ }
+ }
+
+ /**
+  * Parse transcript JSONL and extract user/assistant message pairs with timestamps.
+  * @param {string} transcriptPath
+  * @returns {Array<{type: string, message: string, cwd: string, timestamp: string}>}
+  */
+ function parseTranscript(transcriptPath) {
+   try {
+     const content = readFileSync(transcriptPath, 'utf-8');
+     const lines = content.trim().split('\n');
+     /** @type {Array<{type: string, message: string, cwd: string, timestamp: string}>} */
+     const entries = [];
+
+     for (const line of lines) {
+       try {
+         const entry = JSON.parse(line);
+         const ts = entry.timestamp || '';
+         const cwd = entry.cwd || '';
+
+         if (entry.type === 'user') {
+           const msgContent = entry.message?.content;
+           let text = '';
+           if (typeof msgContent === 'string') text = msgContent;
+           else if (Array.isArray(msgContent)) {
+             const textPart = msgContent.find(p => p.type === 'text' && p.text?.trim());
+             if (textPart) text = textPart.text;
+           }
+           if (text.trim()) entries.push({ type: 'user', message: text, cwd, timestamp: ts });
+         } else if (entry.type === 'assistant') {
+           const msgContent = entry.message?.content;
+           let text = '';
+           if (typeof msgContent === 'string') text = msgContent;
+           else if (Array.isArray(msgContent)) {
+             const textParts = msgContent.filter(p => p.type === 'text').map(p => p.text);
+             text = textParts.join('\n');
+           }
+           if (text.trim()) entries.push({ type: 'assistant', message: text, cwd, timestamp: ts });
+         }
+       } catch { continue; }
+     }
+     return entries;
+   } catch { return []; }
+ }
+
+ /**
+  * Count raw log entries for a session across all date files.
+  * @param {import('../lib/types.d.ts').StorageAdapter} storage
+  * @param {string} sessionDir
+  * @returns {Promise<{count: number, timestamps: Set<string>}>}
+  */
+ async function getRawLogState(storage, sessionDir) {
+   const timestamps = new Set();
+   let count = 0;
+   try {
+     const files = await storage.list(sessionDir);
+     for (const file of files) {
+       if (!file.endsWith('.jsonl')) continue;
+       const content = await storage.read(`${sessionDir}/${file}`);
+       if (!content) continue;
+       for (const line of content.trim().split('\n')) {
+         try {
+           const entry = JSON.parse(line);
+           if (entry.timestamp) timestamps.add(entry.timestamp);
+           count++;
+         } catch { continue; }
+       }
+     }
+   } catch { /* session dir might not be listable */ }
+   return { count, timestamps };
+ }
+
+ async function main() {
+   if (!acquireLock()) return; // Another session is already recovering
+
+   try {
+     const config = loadConfig();
+     if (!config) return;
+     const storage = await createStorageAdapter(config);
+
+     const sessions = await storage.list('.raw');
+
+     for (const sessionId of sessions) {
+       const sessionDir = getRawDir(sessionId);
+
+       // Read .meta.json for transcript path
+       const metaContent = await storage.read(`${sessionDir}/.meta.json`);
+       if (!metaContent) continue;
+
+       let meta;
+       try { meta = JSON.parse(metaContent); } catch { continue; }
+       if (!meta.transcript_path || !existsSync(meta.transcript_path)) continue;
+
+       // Compare transcript entries with raw log entries
+       const transcriptEntries = parseTranscript(meta.transcript_path);
+       if (transcriptEntries.length === 0) continue;
+
+       const rawState = await getRawLogState(storage, sessionDir);
+
+       // If raw logs have same or more entries, skip
+       if (rawState.count >= transcriptEntries.length) continue;
+
+       // Find missing entries (by timestamp) and append them
+       const missingByDate = {};
+       for (const entry of transcriptEntries) {
+         if (rawState.timestamps.has(entry.timestamp)) continue;
+         const date = entry.timestamp ? formatDate(new Date(entry.timestamp)) : formatDate(new Date());
+         if (!missingByDate[date]) missingByDate[date] = [];
+         missingByDate[date].push(entry);
+       }
+
+       // Append missing entries to appropriate date files
+       for (const [date, entries] of Object.entries(missingByDate)) {
+         const logPath = `${sessionDir}/${date}.jsonl`;
+         const lines = entries.map(e =>
+           JSON.stringify({ type: e.type, message: e.message, session_id: sessionId, cwd: e.cwd, timestamp: e.timestamp })
+         ).join('\n') + '\n';
+         await storage.append(logPath, lines);
+       }
+     }
+   } catch (err) {
+     try {
+       const logDir = process.env.CLAUDE_PLUGIN_DATA;
+       if (logDir) {
+         const logPath = join(logDir, 'error.log');
+         mkdirSync(dirname(logPath), { recursive: true });
+         writeFileSync(logPath, `${new Date().toISOString()} recover: ${err.message}\n`, { flag: 'a' });
+       }
+     } catch { /* silent */ }
+   } finally {
+     releaseLock();
+   }
+ }
+
+ main();
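
The core of the recovery pass above is a set difference on timestamps followed by grouping per date. A standalone distillation (illustrative only; the real script uses `formatDate` from `periods.mjs` on the parsed `Date`, while this sketch simply slices the UTC date from an ISO-8601 timestamp):

```javascript
// Hypothetical distillation of the recovery logic in recover-sessions.mjs:
// entries present in the transcript but absent from the raw logs (keyed by
// timestamp) are grouped under their YYYY-MM-DD date for re-appending.
function findMissingByDate(transcriptEntries, knownTimestamps) {
  const missing = {};
  for (const entry of transcriptEntries) {
    if (knownTimestamps.has(entry.timestamp)) continue;
    const date = entry.timestamp.slice(0, 10); // ISO timestamp → YYYY-MM-DD (UTC)
    (missing[date] ??= []).push(entry);
  }
  return missing;
}

const transcript = [
  { type: 'user', message: 'hi', timestamp: '2026-03-15T09:00:00Z' },
  { type: 'assistant', message: 'hello', timestamp: '2026-03-15T09:00:05Z' },
  { type: 'user', message: 'bye', timestamp: '2026-03-16T08:00:00Z' },
];
const known = new Set(['2026-03-15T09:00:00Z']);
console.log(Object.keys(findMissingByDate(transcript, known))); // → [ '2026-03-15', '2026-03-16' ]
```

Keying on timestamps rather than entry counts means the pass stays idempotent: re-running it after a successful append finds nothing missing.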
@@ -4,15 +4,7 @@ set -euo pipefail
  SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
  PLUGIN_ROOT="$(cd "${SCRIPT_DIR}/.." && pwd)"
 
- result=$(node -e "
-   import { loadConfig } from '${PLUGIN_ROOT}/lib/config.mjs';
-   try {
-     const config = loadConfig();
-     if (!config) process.stdout.write('NEEDS_SETUP');
-   } catch {
-     process.stdout.write('NEEDS_SETUP');
-   }
- " 2>/dev/null || echo "NEEDS_SETUP")
+ result=$(node "${PLUGIN_ROOT}/lib/check-config.mjs" 2>/dev/null || echo "NEEDS_SETUP")
 
  if [ "$result" = "NEEDS_SETUP" ]; then
    msg='<important-reminder>IN YOUR FIRST REPLY YOU MUST TELL THE USER: daily-review 플러그인이 아직 설정되지 않았습니다. /daily-review-setup 을 실행해주세요.</important-reminder>'
package/lib/check-config.mjs ADDED
@@ -0,0 +1,8 @@
+ #!/usr/bin/env node
+ import { loadConfig } from './config.mjs';
+ try {
+   const config = loadConfig();
+   if (!config) process.stdout.write('NEEDS_SETUP');
+ } catch {
+   process.stdout.write('NEEDS_SETUP');
+ }
package/lib/collect-raw-logs.mjs ADDED
@@ -0,0 +1,153 @@
+ #!/usr/bin/env node
+ // Reads raw logs since last generation, determines what reviews need updating
+ import { readFileSync, writeFileSync, existsSync, mkdirSync } from 'fs';
+ import { dirname, join } from 'path';
+ import { loadConfig, createStorageAdapter } from './config.mjs';
+ import { getISOWeek, getISOWeekYear, getQuarter } from './periods.mjs';
+
+ /** @param {Date} date @returns {string} */
+ function toWeekKey(date) {
+   return `${getISOWeekYear(date)}-W${String(getISOWeek(date)).padStart(2, '0')}`;
+ }
+
+ /** @param {Date} date @returns {string} */
+ function toMonthKey(date) {
+   return `${date.getFullYear()}-${String(date.getMonth() + 1).padStart(2, '0')}`;
+ }
+
+ /** @param {Date} date @returns {string} */
+ function toQuarterKey(date) {
+   return `${date.getFullYear()}-Q${getQuarter(date)}`;
+ }
+
+ /** @param {Date} date @returns {string} */
+ function toYearKey(date) {
+   return `${date.getFullYear()}`;
+ }
+
+ function getLastGenerated() {
+   try {
+     const dataDir = process.env.CLAUDE_PLUGIN_DATA;
+     if (!dataDir) return null;
+     const path = join(dataDir, 'last-generated.json');
+     if (!existsSync(path)) return null;
+     const data = JSON.parse(readFileSync(path, 'utf-8'));
+     return data.timestamp || null;
+   } catch { return null; }
+ }
+
+ function saveLastGenerated(timestamp) {
+   try {
+     const dataDir = process.env.CLAUDE_PLUGIN_DATA;
+     if (!dataDir) return;
+     const path = join(dataDir, 'last-generated.json');
+     mkdirSync(dirname(path), { recursive: true });
+     writeFileSync(path, JSON.stringify({ timestamp }), 'utf-8');
+   } catch {}
+ }
+
+ function parseArgs() {
+   const args = process.argv.slice(2);
+   const opts = { from: null, to: null, force: false };
+   for (let i = 0; i < args.length; i++) {
+     if (args[i] === '--from' && args[i + 1]) opts.from = args[++i];
+     else if (args[i] === '--to' && args[i + 1]) opts.to = args[++i];
+     else if (args[i] === '--force') opts.force = true;
+   }
+   return opts;
+ }
+
+ async function main() {
+   const config = loadConfig();
+   if (!config) { console.log('{}'); return; }
+   const storage = await createStorageAdapter(config);
+   const opts = parseArgs();
+
+   const lastGenerated = opts.force ? null : getLastGenerated();
+
+   // 1. Collect raw logs, filter by lastGenerated timestamp
+   const sessions = await storage.list('.raw');
+   /** @type {Record<string, Array<{type: string, message: string, cwd: string, timestamp: string}>>} */
+   const logsByDate = {};
+   const affectedDates = new Set();
+
+   for (const sess of sessions) {
+     const files = await storage.list('.raw/' + sess);
+     for (const file of files) {
+       if (!file.endsWith('.jsonl')) continue;
+       const date = file.replace('.jsonl', '');
+       const content = await storage.read('.raw/' + sess + '/' + file);
+       if (!content) continue;
+
+       // Apply date range filter
+       if (opts.from && date < opts.from) continue;
+       if (opts.to && date > opts.to) continue;
+
+       for (const line of content.trim().split('\n')) {
+         try {
+           const entry = JSON.parse(line);
+           if (!logsByDate[date]) logsByDate[date] = [];
+           logsByDate[date].push({
+             type: entry.type || 'unknown',
+             message: entry.message || '',
+             cwd: entry.cwd || '',
+             timestamp: entry.timestamp || '',
+           });
+
+           if (!lastGenerated || entry.timestamp > lastGenerated) {
+             affectedDates.add(date);
+           }
+         } catch {}
+       }
+     }
+   }
+
+   // 2. If no new entries, nothing to do
+   if (affectedDates.size === 0) {
+     console.log(JSON.stringify({ needs: { daily: [], weekly: [], monthly: [], quarterly: [], yearly: [] }, logs: {} }));
+     return;
+   }
+
+   // 3. Determine which periods are affected by the changed dates
+   const periods = config.periods;
+   const affectedWeeks = new Set();
+   const affectedMonths = new Set();
+   const affectedQuarters = new Set();
+   const affectedYears = new Set();
+
+   for (const dateStr of affectedDates) {
+     const [y, m, d] = dateStr.split('-').map(Number);
+     const date = new Date(y, m - 1, d);
+     if (periods.weekly) affectedWeeks.add(toWeekKey(date));
+     if (periods.monthly) affectedMonths.add(toMonthKey(date));
+     if (periods.quarterly) affectedQuarters.add(toQuarterKey(date));
+     if (periods.yearly) affectedYears.add(toYearKey(date));
+   }
+
+   // 4. Only include logs for affected dates
+   const filteredLogs = {};
+   for (const date of affectedDates) {
+     filteredLogs[date] = logsByDate[date];
+   }
+
+   // 5. Output
+   const now = new Date().toISOString();
+   const result = {
+     profile: config.profile,
+     language: config.language,
+     lastGenerated: lastGenerated,
+     newTimestamp: now,
+     needs: {
+       daily: [...affectedDates].sort(),
+       weekly: [...affectedWeeks].sort(),
+       monthly: [...affectedMonths].sort(),
+       quarterly: [...affectedQuarters].sort(),
+       yearly: [...affectedYears].sort(),
+     },
+     logs: filteredLogs,
+   };
+
+   console.log(JSON.stringify(result));
+ }
+
+ main().catch(() => console.log('{}'));
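
`toWeekKey` above delegates to `getISOWeek`/`getISOWeekYear` from `periods.mjs`, which this diff does not include. For reference, a self-contained ISO-8601 week key can be computed like this (a sketch, not the plugin's implementation):

```javascript
// Standalone ISO-8601 week key (Thursday-based), independent of periods.mjs.
// The ISO week-numbering year can differ from the calendar year near Jan 1.
function isoWeekKey(date) {
  const d = new Date(Date.UTC(date.getFullYear(), date.getMonth(), date.getDate()));
  const day = d.getUTCDay() || 7;          // Mon=1 … Sun=7
  d.setUTCDate(d.getUTCDate() + 4 - day);  // shift to this week's Thursday
  const year = d.getUTCFullYear();         // ISO week-numbering year
  const yearStart = new Date(Date.UTC(year, 0, 1));
  const week = Math.ceil(((d - yearStart) / 86400000 + 1) / 7);
  return `${year}-W${String(week).padStart(2, '0')}`;
}

console.log(isoWeekKey(new Date(2026, 0, 1))); // 2026-01-01 is a Thursday → "2026-W01"
```

Because 2026 starts on a Thursday it has 53 ISO weeks, and 2027-01-01 (a Friday) still falls in `2026-W53` — exactly the edge case that makes a separate week-year function necessary.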
package/lib/config.mjs CHANGED
@@ -98,7 +98,7 @@ export function createDefaultLocalConfig(basePath) {
  /** @param {string} owner @param {string} repo @param {string} token @returns {Config} */
  export function createDefaultGitHubConfig(owner, repo, token) {
    return {
-     storage: { type: 'github', github: { owner, repo, token, basePath: 'daily-review' } },
+     storage: { type: 'github', github: { owner, repo, token, basePath: '' } },
      language: 'ko',
      periods: { daily: true, weekly: true, monthly: true, quarterly: true, yearly: false },
      profile: { company: '', role: '', team: '', context: '' },
@@ -0,0 +1,13 @@
+ #!/usr/bin/env node
+ import fs from 'fs';
+ import path from 'path';
+ const configPath = path.join(process.env.CLAUDE_PLUGIN_DATA, 'config.json');
+ const config = JSON.parse(fs.readFileSync(configPath, 'utf-8'));
+ const base = config.storage.local.basePath;
+ const dirs = ['daily', 'projects', 'uncategorized', '.raw', '.reviews'];
+ if (config.periods.weekly) dirs.push('weekly');
+ if (config.periods.monthly) dirs.push('monthly');
+ if (config.periods.quarterly) dirs.push('quarterly');
+ if (config.periods.yearly) dirs.push('yearly');
+ dirs.forEach(d => fs.mkdirSync(path.join(base, d), { recursive: true }));
+ console.log('Directories created at: ' + base);