claudectx 1.1.1 → 1.1.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -269,14 +269,17 @@ Output shows per-file token counts, cache hit likelihood (based on your recent r
  Start each session with a cache hit instead of a full miss. Sends a silent priming request to Anthropic so your CLAUDE.md is cached before Claude Code touches it.

  ```bash
- claudectx warmup --api-key $ANTHROPIC_API_KEY # 5-min cache TTL (default)
- claudectx warmup --ttl 60 --api-key $ANTHROPIC_API_KEY # 60-min extended TTL
- claudectx warmup --cron "0 9 * * 1-5" --api-key ... # Install as daily cron job
- claudectx warmup --json # Output result as JSON
+ claudectx warmup --api-key $ANTHROPIC_API_KEY # 5-min cache TTL (default)
+ claudectx warmup --ttl 60 --api-key $ANTHROPIC_API_KEY # 60-min extended TTL (2× write cost)
+ claudectx warmup --model sonnet --api-key $ANTHROPIC_API_KEY # use a specific model
+ claudectx warmup --cron "0 9 * * 1-5" # Install as weekday 9am cron job
+ claudectx warmup --json # JSON output for scripting
  ```

  Reports tokens warmed, write cost, savings per cache hit, and break-even request count.

+ > **Note on `--cron`:** The API key is **not** embedded in the cron job. At runtime, the job reads `ANTHROPIC_API_KEY` from your environment. Set it in `~/.profile` or `~/.zshenv` so cron can see it.
+
  ---

  ### `claudectx drift` — CLAUDE.md stays fresh, not stale
@@ -286,8 +289,9 @@ Over time CLAUDE.md accumulates dead references and sections nobody reads. `drif
  ```bash
  claudectx drift # Scan current project
  claudectx drift --days 14 # Use 14-day read window (default: 30)
- claudectx drift --fix # Interactively remove flagged lines
+ claudectx drift --fix # Interactively remove flagged lines (creates CLAUDE.md.bak first)
  claudectx drift --json # JSON output
+ claudectx drift --path /other/proj # Scan a different project directory
  ```

  Detects 4 types of drift:
@@ -306,20 +310,24 @@ Detects 4 types of drift:
  Install named, pre-configured hooks beyond the basic read logger.

  ```bash
- claudectx hooks list # Show all available hooks
- claudectx hooks add auto-compress --config apiKey=sk-... # Install a hook
- claudectx hooks remove slack-digest # Remove an installed hook
- claudectx hooks status # Show what's installed
+ claudectx hooks list # Show all available hooks
+ claudectx hooks add auto-compress # Install with default threshold (50k tokens)
+ claudectx hooks add auto-compress --config threshold=30000 # Custom token threshold
+ claudectx hooks add slack-digest --config webhookUrl=https://hooks.slack.com/...
+ claudectx hooks remove slack-digest # Remove an installed hook
+ claudectx hooks status # Show what's installed
  ```

  **Built-in hooks:**

- | Hook | Trigger | What it does |
- |---|---|---|
- | `auto-compress` | PostToolUse (Read) | Runs `claudectx compress` after each session |
- | `daily-budget` | PreToolUse | Checks token budget before each tool call |
- | `slack-digest` | Stop | Posts session report to a Slack webhook |
- | `session-warmup` | PostToolUse (Read) | Re-warms the cache after each read |
+ | Hook | Trigger | Config | What it does |
+ |---|---|---|---|
+ | `auto-compress` | PostToolUse (Read) | `threshold` (default: 50000) | Runs `claudectx compress` after each session |
+ | `daily-budget` | PreToolUse | _(none)_ | Reports today's spend before each tool call |
+ | `slack-digest` | Stop | `webhookUrl` (required) | Posts session report to a Slack webhook |
+ | `session-warmup` | PostToolUse (Read) | _(none)_ | Re-warms the cache; reads `ANTHROPIC_API_KEY` from env |
+
+ > **Security note:** Hooks that need an API key (`compress`, `warmup`) read `ANTHROPIC_API_KEY` from your environment — no secrets are stored in `.claude/settings.local.json`.

  ---

@@ -344,16 +352,16 @@ See where the money goes across your whole team — without sharing session cont

  ```bash
  # Step 1: each developer runs this on their own machine
- claudectx teams export
+ claudectx teams export # Default: last 30 days, sonnet pricing
+ claudectx teams export --days 7 --model haiku # Custom window and model

  # Step 2: collect the JSON files in a shared directory, then aggregate
  claudectx teams aggregate --dir ./reports/
+ claudectx teams aggregate --dir ./reports/ --anonymize # Replace names with Dev 1, Dev 2...
+ claudectx teams aggregate --dir ./reports/ --json # Machine-readable JSON

  # Optional: copy your export to a shared location
  claudectx teams share --to /shared/reports/
-
- # Anonymize for review
- claudectx teams aggregate --dir ./reports/ --anonymize
  ```

  Output shows per-developer spend, cache hit rate, avg request size, and top shared files. Exports are lightweight JSON — no session content, no prompts, just aggregated token counts.
package/dist/index.js CHANGED
@@ -239,7 +239,7 @@ function findSessionFile(sessionId) {
  }
  return files[0]?.filePath ?? null;
  }
- function readSessionUsage(sessionFilePath) {
+ async function readSessionUsage(sessionFilePath) {
  const result = {
  inputTokens: 0,
  outputTokens: 0,
@@ -248,28 +248,32 @@ function readSessionUsage(sessionFilePath) {
  requestCount: 0
  };
  if (!fs9.existsSync(sessionFilePath)) return result;
- let content;
+ const { createReadStream } = await import("fs");
+ const { createInterface } = await import("readline");
  try {
- content = fs9.readFileSync(sessionFilePath, "utf-8");
- } catch {
- return result;
- }
- const lines = content.trim().split("\n").filter(Boolean);
- for (const line of lines) {
- try {
- const entry = JSON.parse(line);
- const usage = entry.usage ?? entry.message?.usage;
- if (!usage) continue;
- const isAssistant = entry.type === "assistant" || entry.message?.role === "assistant";
- if (isAssistant) {
- result.inputTokens += usage.input_tokens ?? 0;
- result.outputTokens += usage.output_tokens ?? 0;
- result.cacheCreationTokens += usage.cache_creation_input_tokens ?? 0;
- result.cacheReadTokens += usage.cache_read_input_tokens ?? 0;
- result.requestCount++;
+ const rl = createInterface({
+ input: createReadStream(sessionFilePath, { encoding: "utf-8" }),
+ crlfDelay: Infinity
+ });
+ for await (const line of rl) {
+ if (!line.trim()) continue;
+ try {
+ const entry = JSON.parse(line);
+ const usage = entry.usage ?? entry.message?.usage;
+ if (!usage) continue;
+ const isAssistant = entry.type === "assistant" || entry.message?.role === "assistant";
+ if (isAssistant) {
+ result.inputTokens += usage.input_tokens ?? 0;
+ result.outputTokens += usage.output_tokens ?? 0;
+ result.cacheCreationTokens += usage.cache_creation_input_tokens ?? 0;
+ result.cacheReadTokens += usage.cache_read_input_tokens ?? 0;
+ result.requestCount++;
+ }
+ } catch {
  }
- } catch {
  }
+ } catch {
+ return result;
  }
  return result;
  }
@@ -397,20 +401,23 @@ function Dashboard({
  const events = readAllEvents();
  const fileStats2 = aggregateStats(events);
  const sessionFile2 = sessionId ? findSessionFile(sessionId) : findSessionFile();
- const usage2 = sessionFile2 ? readSessionUsage(sessionFile2) : {
+ const usagePromise = sessionFile2 ? readSessionUsage(sessionFile2) : Promise.resolve({
  inputTokens: 0,
  outputTokens: 0,
  cacheCreationTokens: 0,
  cacheReadTokens: 0,
  requestCount: 0
- };
- setState((prev) => ({
- fileStats: fileStats2,
- usage: usage2,
- sessionFile: sessionFile2,
- lastUpdated: /* @__PURE__ */ new Date(),
- tickCount: prev.tickCount + 1
- }));
+ });
+ usagePromise.then((usage2) => {
+ setState((prev) => ({
+ fileStats: fileStats2,
+ usage: usage2,
+ sessionFile: sessionFile2,
+ lastUpdated: /* @__PURE__ */ new Date(),
+ tickCount: prev.tickCount + 1
+ }));
+ }).catch(() => {
+ });
  }, [sessionId]);
  (0, import_react.useEffect)(() => {
  refresh();
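Because `readSessionUsage` is now async, the dashboard's `refresh` callback can no longer read usage synchronously; the diff resolves the promise and applies the result through the state setter, swallowing errors so a failed read never crashes the render loop. A minimal sketch of that pattern with a stand-in state container instead of React's `useState` (all names here are illustrative, not the package's API):

```javascript
// refresh() stays synchronous; async work lands via .then, errors are dropped.
function makeRefresher(loadUsage, setState) {
  return function refresh() {
    loadUsage()
      .then((usage) => {
        // Functional update, like React's setState(prev => next).
        setState((prev) => ({ ...prev, usage, tickCount: prev.tickCount + 1 }));
      })
      .catch(() => {}); // a failed read should not take down the dashboard
  };
}

// Stand-in for useState: a mutable cell plus a functional setter.
let state = { usage: null, tickCount: 0 };
const setState = (fn) => { state = fn(state); };

const refresh = makeRefresher(() => Promise.resolve({ inputTokens: 5 }), setState);
refresh();
setTimeout(() => console.log(state.usage.inputTokens, state.tickCount), 0); // 5 1
```

Keeping the callback synchronous sidesteps the usual caveat that an effect or interval handler should not itself be an `async` function.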
@@ -2767,9 +2774,9 @@ init_models();
  function isoDate(d) {
  return d.toISOString().slice(0, 10);
  }
- function calcCost2(inputTokens, outputTokens, model) {
+ function calcCost2(inputTokens, outputTokens, cacheCreationTokens, cacheReadTokens, model) {
  const p = MODEL_PRICING[model];
- return inputTokens / 1e6 * p.inputPerMillion + outputTokens / 1e6 * p.outputPerMillion;
+ return inputTokens / 1e6 * p.inputPerMillion + outputTokens / 1e6 * p.outputPerMillion + cacheCreationTokens / 1e6 * p.cacheWritePerMillion + cacheReadTokens / 1e6 * p.cacheReadPerMillion;
  }
  async function aggregateUsage(days, model = "claude-sonnet-4-6") {
  const now = /* @__PURE__ */ new Date();
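`calcCost2` now prices all four token classes rather than only fresh input and output. A worked sketch of the same formula: the `cacheWritePerMillion` / `cacheReadPerMillion` field names match the diff, but the dollar rates below are illustrative Sonnet-class numbers, not values read out of claudectx:

```javascript
// Illustrative pricing table; same shape as the diff's MODEL_PRICING lookup.
const MODEL_PRICING = {
  "claude-sonnet-4-6": {
    inputPerMillion: 3,         // $/M fresh input tokens (example rate)
    outputPerMillion: 15,       // $/M output tokens (example rate)
    cacheWritePerMillion: 3.75, // cache writes cost more than fresh input
    cacheReadPerMillion: 0.3    // cache reads cost far less
  }
};

function calcCost(inputTokens, outputTokens, cacheCreationTokens, cacheReadTokens, model) {
  const p = MODEL_PRICING[model];
  return inputTokens / 1e6 * p.inputPerMillion
    + outputTokens / 1e6 * p.outputPerMillion
    + cacheCreationTokens / 1e6 * p.cacheWritePerMillion
    + cacheReadTokens / 1e6 * p.cacheReadPerMillion;
}

// 1M fresh input + 100k output + 200k cache writes + 2M cache reads:
// 3.00 + 1.50 + 0.75 + 0.60 = 5.85
console.log(calcCost(1e6, 1e5, 2e5, 2e6, "claude-sonnet-4-6").toFixed(2)); // "5.85"
```

Under the old two-term formula the same traffic would have been billed as if the 2.2M cached tokens were free, which is why cache-heavy sessions were previously under-counted.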
@@ -2788,6 +2795,7 @@ async function aggregateUsage(days, model = "claude-sonnet-4-6") {
  inputTokens: 0,
  outputTokens: 0,
  cacheReadTokens: 0,
+ cacheCreationTokens: 0,
  requests: 0,
  costUsd: 0
  });
@@ -2796,20 +2804,23 @@ async function aggregateUsage(days, model = "claude-sonnet-4-6") {
  let totalInput = 0;
  let totalOutput = 0;
  let totalCacheRead = 0;
+ let totalCacheCreation = 0;
  for (const sf of sessionFiles) {
  const dateStr = isoDate(new Date(sf.mtimeMs));
  const bucket = bucketMap.get(dateStr);
  if (!bucket) continue;
- const usage = readSessionUsage(sf.filePath);
+ const usage = await readSessionUsage(sf.filePath);
  bucket.sessions++;
  bucket.inputTokens += usage.inputTokens;
  bucket.outputTokens += usage.outputTokens;
  bucket.cacheReadTokens += usage.cacheReadTokens;
+ bucket.cacheCreationTokens += usage.cacheCreationTokens;
  bucket.requests += usage.requestCount;
- bucket.costUsd += calcCost2(usage.inputTokens, usage.outputTokens, model);
+ bucket.costUsd += calcCost2(usage.inputTokens, usage.outputTokens, usage.cacheCreationTokens, usage.cacheReadTokens, model);
  totalInput += usage.inputTokens;
  totalOutput += usage.outputTokens;
  totalCacheRead += usage.cacheReadTokens;
+ totalCacheCreation += usage.cacheCreationTokens;
  totalRequests += usage.requestCount;
  }
  const fileEvents = readAllEvents().filter(
@@ -2820,10 +2831,12 @@ async function aggregateUsage(days, model = "claude-sonnet-4-6") {
  filePath: s.filePath,
  readCount: s.readCount
  }));
- const totalCost = calcCost2(totalInput, totalOutput, model);
+ const totalCost = calcCost2(totalInput, totalOutput, totalCacheCreation, totalCacheRead, model);
  const cacheHitRate = totalInput > 0 ? Math.round(totalCacheRead / totalInput * 100) : 0;
  const byDay = [...bucketMap.values()].sort((a, b) => a.date.localeCompare(b.date));
  const uniqueSessions = new Set(sessionFiles.map((f) => f.sessionId)).size;
+ const dailyAvgCostUsd = days > 0 ? totalCost / days : 0;
+ const projectedMonthlyUsd = dailyAvgCostUsd * 30;
  return {
  periodDays: days,
  startDate: isoDate(cutoff),
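The two projection fields added above are plain extrapolation: average the window's cost per day, then scale to a 30-day month. A tiny sketch (input numbers invented):

```javascript
// Average daily spend over the reporting window, extrapolated to 30 days.
// Guard against a zero-length window, as the diff does.
function projectMonthly(totalCostUsd, days) {
  const dailyAvgCostUsd = days > 0 ? totalCostUsd / days : 0;
  return { dailyAvgCostUsd, projectedMonthlyUsd: dailyAvgCostUsd * 30 };
}

const p = projectMonthly(7, 7); // $7 spent over a 7-day window
console.log(p.dailyAvgCostUsd, p.projectedMonthlyUsd); // 1 30
```

The projection assumes usage is uniform across the window; quiet weekends in a 7-day export will deflate the monthly figure accordingly.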
@@ -2833,10 +2846,13 @@ async function aggregateUsage(days, model = "claude-sonnet-4-6") {
  totalInputTokens: totalInput,
  totalOutputTokens: totalOutput,
  totalCacheReadTokens: totalCacheRead,
+ totalCacheCreationTokens: totalCacheCreation,
  cacheHitRate,
  totalCostUsd: totalCost,
  avgCostPerSession: uniqueSessions > 0 ? totalCost / uniqueSessions : 0,
  avgTokensPerRequest: totalRequests > 0 ? Math.round(totalInput / totalRequests) : 0,
+ dailyAvgCostUsd,
+ projectedMonthlyUsd,
  byDay,
  topFiles,
  model,
@@ -2879,9 +2895,12 @@ function formatText(data) {
  lines.push(` Input tokens: ${fmtNum2(data.totalInputTokens)}`);
  lines.push(` Output tokens: ${fmtNum2(data.totalOutputTokens)}`);
  lines.push(` Cache reads: ${fmtNum2(data.totalCacheReadTokens)} (${data.cacheHitRate}% hit rate)`);
+ lines.push(` Cache writes: ${fmtNum2(data.totalCacheCreationTokens)}`);
  lines.push(` Total cost (est.): ${fmtCost2(data.totalCostUsd)}`);
  lines.push(` Avg cost/session: ${fmtCost2(data.avgCostPerSession)}`);
  lines.push(` Avg tokens/request: ${fmtNum2(data.avgTokensPerRequest)}`);
+ lines.push(` Daily avg cost: ${fmtCost2(data.dailyAvgCostUsd)}`);
+ lines.push(` Projected (30-day): ${fmtCost2(data.projectedMonthlyUsd)}`);
  lines.push(` Model: ${data.model}`);
  lines.push("");
  const activeDays = data.byDay.filter((d) => d.sessions > 0);
@@ -2950,9 +2969,12 @@ function formatMarkdown(data) {
  lines.push(`| Input tokens | ${fmtNum2(data.totalInputTokens)} |`);
  lines.push(`| Output tokens | ${fmtNum2(data.totalOutputTokens)} |`);
  lines.push(`| Cache hit rate | ${data.cacheHitRate}% |`);
+ lines.push(`| Cache writes | ${fmtNum2(data.totalCacheCreationTokens)} tokens |`);
  lines.push(`| Total cost (est.) | ${fmtCost2(data.totalCostUsd)} |`);
  lines.push(`| Avg cost/session | ${fmtCost2(data.avgCostPerSession)} |`);
  lines.push(`| Avg tokens/request | ${fmtNum2(data.avgTokensPerRequest)} |`);
+ lines.push(`| Daily avg cost | ${fmtCost2(data.dailyAvgCostUsd)} |`);
+ lines.push(`| Projected (30-day) | ${fmtCost2(data.projectedMonthlyUsd)} |`);
  lines.push(`| Model | \`${data.model}\` |`);
  lines.push("");
  const activeDays = data.byDay.filter((d) => d.sessions > 0);
@@ -4100,9 +4122,9 @@ function getDeveloperIdentity() {
  }
  return os4.hostname();
  }
- function calcCost3(inputTokens, outputTokens, model) {
+ function calcCost3(inputTokens, outputTokens, cacheCreationTokens, cacheReadTokens, model) {
  const p = MODEL_PRICING[model];
- return inputTokens / 1e6 * p.inputPerMillion + outputTokens / 1e6 * p.outputPerMillion;
+ return inputTokens / 1e6 * p.inputPerMillion + outputTokens / 1e6 * p.outputPerMillion + cacheCreationTokens / 1e6 * p.cacheWritePerMillion + cacheReadTokens / 1e6 * p.cacheReadPerMillion;
  }
  function isoDate2(d) {
  return d.toISOString().slice(0, 10);
@@ -4133,6 +4155,7 @@ async function buildTeamExport(days, model, anonymize) {
  inputTokens: 0,
  outputTokens: 0,
  cacheReadTokens: 0,
+ cacheCreationTokens: 0,
  requests: 0,
  costUsd: 0
  });
@@ -4144,14 +4167,15 @@ async function buildTeamExport(days, model, anonymize) {
  for (const sf of sessionFiles) {
  const dateStr = isoDate2(new Date(sf.mtimeMs));
  const bucket = bucketMap.get(dateStr);
- const usage = readSessionUsage(sf.filePath);
+ const usage = await readSessionUsage(sf.filePath);
  if (bucket) {
  bucket.sessions++;
  bucket.inputTokens += usage.inputTokens;
  bucket.outputTokens += usage.outputTokens;
  bucket.cacheReadTokens += usage.cacheReadTokens;
+ bucket.cacheCreationTokens += usage.cacheCreationTokens;
  bucket.requests += usage.requestCount;
- bucket.costUsd += calcCost3(usage.inputTokens, usage.outputTokens, model);
+ bucket.costUsd += calcCost3(usage.inputTokens, usage.outputTokens, usage.cacheCreationTokens, usage.cacheReadTokens, model);
  }
  totalInput += usage.inputTokens;
  totalOutput += usage.outputTokens;
@@ -4162,7 +4186,7 @@ async function buildTeamExport(days, model, anonymize) {
  (e) => new Date(e.timestamp).getTime() >= cutoffMs
  );
  const topWasteFiles = aggregateStats(fileEvents).slice(0, 10).map((s) => ({ filePath: s.filePath, readCount: s.readCount }));
- const totalCostUsd = calcCost3(totalInput, totalOutput, model);
+ const totalCostUsd = calcCost3(totalInput, totalOutput, 0, totalCacheRead, model);
  const cacheHitRate = totalInput > 0 ? Math.round(totalCacheRead / totalInput * 100) : 0;
  const uniqueSessions = new Set(sessionFiles.map((f) => f.sessionId)).size;
  const developer = {
@@ -4378,7 +4402,7 @@ async function teamsCommand(subcommand, options) {
  }

  // src/index.ts
- var VERSION = "1.0.0";
+ var VERSION = "1.1.2";
  var DESCRIPTION = "Reduce Claude Code token usage by up to 80%. Context analyzer, auto-optimizer, live dashboard, and smart MCP tools.";
  var program = new import_commander.Command();
  program.name("claudectx").description(DESCRIPTION).version(VERSION);