@vibescore/tracker 0.0.3 → 0.0.5

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,26 +1,117 @@
- # @vibescore/tracker
+ <div align="center">
 
- Codex CLI token usage tracker (macOS-first, notify-driven).
+ # 🟢 VIBESCORE
 
- ## Quick Start
+ **QUANTIFY YOUR AI OUTPUT**
+ _Real-time AI Analytics for Codex CLI_
+
+ [**www.vibescore.space**](https://www.vibescore.space)
+
+ [![License: MIT](https://img.shields.io/badge/License-MIT-green.svg)](https://opensource.org/licenses/MIT)
+ [![Node.js Support](https://img.shields.io/badge/Node.js-%3E%3D18-brightgreen.svg)](https://nodejs.org/)
+ [![Platform](https://img.shields.io/badge/Platform-macOS-lightgrey.svg)](https://www.apple.com/macos/)
+
+ [**English**](README.md) • [**中文说明**](README.zh-CN.md)
+
+ [**Documentation**](docs/) • [**Dashboard**](dashboard/) • [**Backend API**](BACKEND_API.md)
+
+ </div>
+
+ ---
+
+ ## 🌌 Overview
+
+ **VibeScore** is an intelligent token usage tracking system designed specifically for macOS developers. It monitors Codex CLI output in real time, transforming your **AI Output** into quantifiable metrics via a high-fidelity, **Matrix-themed** dashboard.
+
+ > [!TIP]
+ > **Core Index**: our signature metric, which reflects your flow state by analyzing token consumption rates and patterns.
+
+ ## 🚀 Key Features
+
+ - 📡 **Live Sniffer**: real-time interception of Codex CLI pipes, using low-level hooks to capture every completion event.
+ - 📊 **Matrix Dashboard**: a high-performance React + Vite dashboard featuring heatmaps, trend charts, and live logs.
+ - ⚡ **AI Analytics**: deep analysis of input/output tokens, with dedicated tracking for cached and reasoning components.
+ - 🔒 **Identity Core**: robust authentication and permission management to secure your development data.
+
+ ## 🛠️ Quick Start
+
+ ### Installation
+
+ Initialize your environment with a single command:
 
  ```bash
  npx --yes @vibescore/tracker init
+ ```
+
+ ### Sync & Status
+
+ ```bash
+ # Sync the latest local session data
  npx --yes @vibescore/tracker sync
+
+ # Check the current link status
  npx --yes @vibescore/tracker status
- npx --yes @vibescore/tracker uninstall
  ```
 
- ## Requirements
+ ## 🧰 Troubleshooting
 
- - Node.js >= 18
- - macOS (current supported platform)
+ ### Streak shows 0 days while totals look correct
+
+ - The streak is defined as consecutive active days ending today; if today's total is 0, the streak will be 0.
+ - If you expect a non-zero streak, clear cached auth/heatmap data and sign in again:
+
+ ```js
+ localStorage.removeItem('vibescore.dashboard.auth.v1');
+ Object.keys(localStorage)
+   .filter((k) => k.startsWith('vibescore.heatmap.'))
+   .forEach((k) => localStorage.removeItem(k));
+ location.reload();
+ ```
+
+ - Complete the landing-page sign-in flow again after the reload.
+ - Note: `insforge-auth-token` is not used by the dashboard; use `vibescore.dashboard.auth.v1`.
+
+ ## 🏗️ Architecture
+
+ ```mermaid
+ graph TD
+     A[Codex CLI] -->|Rollout Logs| B(Tracker CLI)
+     B -->|AI Tokens| C{Core Relay}
+     C --> D[VibeScore Dashboard]
+     C --> E[AI Analytics Engine]
+ ```
+
+ ## 💻 Developer Guide
+
+ To run locally or contribute:
+
+ ### Dashboard Development
+
+ ```bash
+ # Install dependencies
+ cd dashboard
+ npm install
+
+ # Start the dev server
+ npm run dev
+ ```
+
+ ### Architecture Validation
+
+ ```bash
+ # Validate the Copy Registry
+ npm run validate:copy
+
+ # Run smoke tests
+ npm run smoke
+ ```
 
- ## Notes
+ ## 📜 License
 
- - `init` installs a Codex CLI notify hook and issues a device token.
- - `sync` parses `~/.codex/sessions/**/rollout-*.jsonl` and uploads token_count deltas.
+ This project is licensed under the [MIT License](LICENSE).
 
- ## License
+ ---
 
- MIT
+ <div align="center">
+ <b>System_Ready // 2024 VibeScore OS</b><br/>
+ <i>"More Tokens. More Vibe."</i>
+ </div>
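The streak rule described in the Troubleshooting section above reduces to a few lines of code. A minimal sketch, assuming daily totals keyed by UTC date string (`totalsByDay` and `streakDays` are hypothetical names, not part of the package, and whether the dashboard counts days in UTC or local time is not shown in this diff):

```js
// Sketch: streak = consecutive days with a non-zero total, ending today (UTC).
// `totalsByDay` is assumed to map 'YYYY-MM-DD' -> total tokens for that day.
function streakDays(totalsByDay, now = new Date()) {
  let streak = 0;
  const day = new Date(Date.UTC(now.getUTCFullYear(), now.getUTCMonth(), now.getUTCDate()));
  for (;;) {
    const key = day.toISOString().slice(0, 10); // 'YYYY-MM-DD'
    if (!(totalsByDay[key] > 0)) break; // today at 0 tokens => streak stays 0
    streak += 1;
    day.setUTCDate(day.getUTCDate() - 1); // step back one day
  }
  return streak;
}
```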
package/README.zh-CN.md ADDED
@@ -0,0 +1,117 @@
+ <div align="center">
+
+ # 🟢 VIBESCORE
+
+ **QUANTIFY YOUR AI OUTPUT**
+ _Real-time AI analytics for Codex CLI_
+
+ [**www.vibescore.space**](https://www.vibescore.space)
+
+ [![License: MIT](https://img.shields.io/badge/License-MIT-green.svg)](https://opensource.org/licenses/MIT)
+ [![Node.js Support](https://img.shields.io/badge/Node.js-%3E%3D18-brightgreen.svg)](https://nodejs.org/)
+ [![Platform](https://img.shields.io/badge/Platform-macOS-lightgrey.svg)](https://www.apple.com/macos/)
+
+ [**English**](README.md) • [**中文说明**](README.zh-CN.md)
+
+ [**Documentation**](docs/) • [**Dashboard**](dashboard/) • [**Backend API**](BACKEND_API.md)
+
+ </div>
+
+ ---
+
+ ## 🌌 Overview
+
+ **VibeScore** is an intelligent token usage tracking system designed specifically for macOS developers. It monitors Codex CLI output in real time and, through a highly visual **Matrix**-style dashboard, turns your **AI Output** into quantifiable metrics.
+
+ > [!TIP]
+ > **Core Index**: our signature metric, which reflects your development flow state by analyzing token consumption rates and patterns.
+
+ ## 🚀 Key Features
+
+ - 📡 **Live Sniffer**: listens to Codex CLI pipes in real time, capturing every completion event through low-level hooks.
+ - 📊 **Matrix Dashboard**: a high-performance React + Vite dashboard with heatmaps, trend charts, and live logs.
+ - ⚡ **AI Analytics**: deep analysis of input/output tokens, with separate monitoring of the cached and reasoning portions.
+ - 🔒 **Identity Core**: complete authentication and permission management to protect your development data.
+
+ ## 🛠️ Quick Start
+
+ ### Installation
+
+ Initialize the environment with a single command:
+
+ ```bash
+ npx --yes @vibescore/tracker init
+ ```
+
+ ### Sync & Status
+
+ ```bash
+ # Sync the latest local session data
+ npx --yes @vibescore/tracker sync
+
+ # Check the current connection status
+ npx --yes @vibescore/tracker status
+ ```
+
+ ## 🧰 FAQ
+
+ ### Streak shows 0 days but the totals are normal
+
+ - The streak counts consecutive days of use ending today; if today's total is 0, the streak is 0.
+ - If you are sure the streak should be non-zero, clear the local cache and sign in again:
+
+ ```js
+ localStorage.removeItem('vibescore.dashboard.auth.v1');
+ Object.keys(localStorage)
+   .filter((k) => k.startsWith('vibescore.heatmap.'))
+   .forEach((k) => localStorage.removeItem(k));
+ location.reload();
+ ```
+
+ - After the reload, go through the landing-page sign-in flow again.
+ - Note: the dashboard does not use `insforge-auth-token`; the token actually lives in `vibescore.dashboard.auth.v1`.
+
+ ## 🏗️ Architecture
+
+ ```mermaid
+ graph TD
+     A[Codex CLI] -->|Rollout Logs| B(Tracker CLI)
+     B -->|AI Tokens| C{Core Relay}
+     C --> D[VibeScore Dashboard]
+     C --> E[AI Analytics Engine]
+ ```
+
+ ## 💻 Developer Guide
+
+ To run locally or contribute code:
+
+ ### Dashboard Development
+
+ ```bash
+ # Install dependencies
+ cd dashboard
+ npm install
+
+ # Start the dev server
+ npm run dev
+ ```
+
+ ### Architecture Validation
+
+ ```bash
+ # Validate the Copy Registry
+ npm run validate:copy
+
+ # Run smoke tests
+ npm run smoke
+ ```
+
+ ## 📜 License
+
+ This project is open-sourced under the [MIT License](LICENSE).
+
+ ---
+
+ <div align="center">
+ <b>System_Ready // 2024 VibeScore OS</b><br/>
+ <i>"More Tokens. More Vibe."</i>
+ </div>
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
    "name": "@vibescore/tracker",
-   "version": "0.0.3",
+   "version": "0.0.5",
    "description": "Codex CLI token usage tracker (macOS-first, notify-driven).",
    "license": "MIT",
    "publishConfig": {
package/src/cli.js CHANGED
@@ -48,7 +48,7 @@ function printHelp() {
    '',
    'Notes:',
    ' - init installs a Codex notify hook and issues a device token (default: browser sign in/up).',
-   ' - optional: set VIBESCORE_DASHBOARD_URL (or --dashboard-url) to use a hosted /connect page.',
+   ' - optional: set VIBESCORE_DASHBOARD_URL (or --dashboard-url) to use a hosted landing page.',
    ' - sync parses ~/.codex/sessions/**/rollout-*.jsonl and uploads token_count deltas.',
    ' - --debug prints original backend errors when they are normalized.',
    ''
@@ -7,6 +7,7 @@ const { prompt, promptHidden } = require('../lib/prompt');
  const { upsertCodexNotify, loadCodexNotifyOriginal } = require('../lib/codex-config');
  const { beginBrowserAuth } = require('../lib/browser-auth');
  const { issueDeviceTokenWithPassword, issueDeviceTokenWithAccessToken } = require('../lib/insforge');
+ const { cmdSync } = require('./sync');
 
  async function cmdInit(argv) {
    const opts = parseArgs(argv);
@@ -104,6 +105,13 @@ async function cmdInit(argv) {
      ''
    ].join('\n')
  );
+
+ try {
+   await cmdSync([]);
+ } catch (err) {
+   const msg = err && err.message ? err.message : 'unknown error';
+   process.stderr.write(`Initial sync failed: ${msg}\n`);
+ }
}
 
function parseArgs(argv) {
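`init` now runs a first `sync` immediately after installing the notify hook, so the dashboard receives data without waiting for the next Codex completion. For reference, the hook lives under the `notify` key of `~/.codex/config.toml`; a hypothetical sketch of the entry `upsertCodexNotify` manages (the exact command and install path are not shown in this diff):

```toml
# Hypothetical notify entry; upsertCodexNotify manages the real value.
notify = ["node", "/path/to/tracker.js", "sync", "--auto", "--from-notify"]
```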
@@ -24,6 +24,7 @@ async function cmdStatus(argv = []) {
    const notifySignalPath = path.join(trackerDir, 'notify.signal');
    const throttlePath = path.join(trackerDir, 'sync.throttle');
    const uploadThrottlePath = path.join(trackerDir, 'upload.throttle.json');
+   const autoRetryPath = path.join(trackerDir, 'auto.retry.json');
    const codexHome = process.env.CODEX_HOME || path.join(home, '.codex');
    const codexConfigPath = path.join(codexHome, 'config.toml');
 
@@ -31,6 +32,7 @@ async function cmdStatus(argv = []) {
    const cursors = await readJson(cursorsPath);
    const queueState = (await readJson(queueStatePath)) || { offset: 0 };
    const uploadThrottle = normalizeUploadState(await readJson(uploadThrottlePath));
+   const autoRetry = await readJson(autoRetryPath);
 
    const queueSize = await safeStatSize(queuePath);
    const pendingBytes = Math.max(0, queueSize - (queueState.offset || 0));
@@ -51,6 +53,12 @@ async function cmdStatus(argv = []) {
    const lastUploadError = uploadThrottle.lastError
      ? `${uploadThrottle.lastErrorAt || 'unknown'} ${uploadThrottle.lastError}`
      : null;
+   const autoRetryAt = parseEpochMsToIso(autoRetry?.retryAtMs || null);
+   const autoRetryLine = autoRetryAt
+     ? `- Auto retry after: ${autoRetryAt} (${autoRetry?.reason || 'scheduled'}, pending ${Number(
+         autoRetry?.pendingBytes || 0
+       )} bytes)`
+     : null;
 
    process.stdout.write(
      [
@@ -65,6 +73,7 @@ async function cmdStatus(argv = []) {
        `- Next upload after: ${nextUpload || 'never'}`,
        `- Backoff until: ${backoffUntil || 'never'}`,
        lastUploadError ? `- Last upload error: ${lastUploadError}` : null,
+       autoRetryLine,
        `- Codex notify: ${notifyConfigured ? JSON.stringify(codexNotify) : 'unset'}`,
        ''
      ]
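With `auto.retry.json` present, `status` prints one extra line between the backoff and notify lines. A hypothetical excerpt (all values illustrative):

```
- Next upload after: 2024-06-01T10:30:00.000Z
- Backoff until: never
- Auto retry after: 2024-06-01T10:30:00.000Z (backlog, pending 4096 bytes)
- Codex notify: ["node","/path/to/notify-hook.js"]
```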
@@ -1,6 +1,7 @@
  const os = require('node:os');
  const path = require('node:path');
  const fs = require('node:fs/promises');
+ const cp = require('node:child_process');
 
  const { ensureDir, readJson, writeJson, openLock } = require('../lib/fs');
  const { listRolloutFiles, parseRolloutIncremental } = require('../lib/rollout');
@@ -39,13 +40,14 @@ async function cmdSync(argv) {
    const config = await readJson(configPath);
    const cursors = (await readJson(cursorsPath)) || { version: 1, files: {}, updatedAt: null };
    const uploadThrottle = normalizeUploadState(await readJson(uploadThrottlePath));
+   let uploadThrottleState = uploadThrottle;
 
    const codexHome = process.env.CODEX_HOME || path.join(home, '.codex');
    const sessionsDir = path.join(codexHome, 'sessions');
    const rolloutFiles = await listRolloutFiles(sessionsDir);
 
    if (progress?.enabled) {
-     progress.start(`Parsing ${renderBar(0)} 0/${formatNumber(rolloutFiles.length)} files | queued 0`);
+     progress.start(`Parsing ${renderBar(0)} 0/${formatNumber(rolloutFiles.length)} files | buckets 0`);
    }
 
    const parseResult = await parseRolloutIncremental({
@@ -56,7 +58,9 @@ async function cmdSync(argv) {
      if (!progress?.enabled) return;
      const pct = p.total > 0 ? p.index / p.total : 1;
      progress.update(
-       `Parsing ${renderBar(pct)} ${formatNumber(p.index)}/${formatNumber(p.total)} files | queued ${formatNumber(p.eventsQueued)}`
+       `Parsing ${renderBar(pct)} ${formatNumber(p.index)}/${formatNumber(p.total)} files | buckets ${formatNumber(
+         p.bucketsQueued
+       )}`
      );
    }
  });
@@ -70,6 +74,7 @@ async function cmdSync(argv) {
    const baseUrl = config?.baseUrl || process.env.VIBESCORE_INSFORGE_BASE_URL || 'https://5tmappuk.us-east.insforge.app';
 
    let uploadResult = null;
+   let uploadAttempted = false;
    if (deviceToken) {
      const beforeState = (await readJson(queueStatePath)) || { offset: 0 };
      const queueSize = await safeStatSize(queuePath);
@@ -77,16 +82,27 @@ async function cmdSync(argv) {
      let maxBatches = opts.auto ? 3 : opts.drain ? 10_000 : 10;
      let batchSize = UPLOAD_DEFAULTS.batchSize;
      let allowUpload = pendingBytes > 0;
+     let autoDecision = null;
 
      if (opts.auto) {
-       const decision = decideAutoUpload({
+       autoDecision = decideAutoUpload({
          nowMs: Date.now(),
          pendingBytes,
          state: uploadThrottle
        });
-       allowUpload = allowUpload && decision.allowed;
-       maxBatches = decision.allowed ? decision.maxBatches : 0;
-       batchSize = decision.batchSize;
+       allowUpload = allowUpload && autoDecision.allowed;
+       maxBatches = autoDecision.allowed ? autoDecision.maxBatches : 0;
+       batchSize = autoDecision.batchSize;
+       if (!autoDecision.allowed && pendingBytes > 0 && autoDecision.blockedUntilMs > 0) {
+         const reason = deriveAutoSkipReason({ decision: autoDecision, state: uploadThrottle });
+         await scheduleAutoRetry({
+           trackerDir,
+           retryAtMs: autoDecision.blockedUntilMs,
+           reason,
+           pendingBytes,
+           source: 'auto-throttled'
+         });
+       }
      }
 
      if (progress?.enabled && pendingBytes > 0 && allowUpload) {
@@ -97,6 +113,7 @@ async function cmdSync(argv) {
      }
 
      if (allowUpload && maxBatches > 0) {
+       uploadAttempted = true;
        try {
          uploadResult = await drainQueueToCloud({
            baseUrl,
@@ -116,12 +133,26 @@ async function cmdSync(argv) {
          }
        });
        if (uploadResult.attempted > 0) {
-         const next = recordUploadSuccess({ nowMs: Date.now(), state: uploadThrottle });
+         const next = recordUploadSuccess({ nowMs: Date.now(), state: uploadThrottleState });
+         uploadThrottleState = next;
          await writeJson(uploadThrottlePath, next);
        }
      } catch (e) {
-       const next = recordUploadFailure({ nowMs: Date.now(), state: uploadThrottle, error: e });
+       const next = recordUploadFailure({ nowMs: Date.now(), state: uploadThrottleState, error: e });
+       uploadThrottleState = next;
        await writeJson(uploadThrottlePath, next);
+       if (opts.auto && pendingBytes > 0) {
+         const retryAtMs = Math.max(next.nextAllowedAtMs || 0, next.backoffUntilMs || 0);
+         if (retryAtMs > 0) {
+           await scheduleAutoRetry({
+             trackerDir,
+             retryAtMs,
+             reason: 'backoff',
+             pendingBytes,
+             source: 'auto-error'
+           });
+         }
+       }
        throw e;
      }
    } else {
@@ -135,6 +166,21 @@ async function cmdSync(argv) {
    const queueSize = await safeStatSize(queuePath);
    const pendingBytes = Math.max(0, queueSize - Number(afterState.offset || 0));
 
+   if (pendingBytes <= 0) {
+     await clearAutoRetry(trackerDir);
+   } else if (opts.auto && uploadAttempted) {
+     const retryAtMs = Number(uploadThrottleState?.nextAllowedAtMs || 0);
+     if (retryAtMs > Date.now()) {
+       await scheduleAutoRetry({
+         trackerDir,
+         retryAtMs,
+         reason: 'backlog',
+         pendingBytes,
+         source: 'auto-backlog'
+       });
+     }
+   }
+
    await maybeSendHeartbeat({
      baseUrl,
      deviceToken,
@@ -148,7 +194,7 @@ async function cmdSync(argv) {
    [
      'Sync finished:',
      `- Parsed files: ${parseResult.filesProcessed}`,
-     `- New events queued: ${parseResult.eventsQueued}`,
+     `- New 30-min buckets queued: ${parseResult.bucketsQueued}`,
      deviceToken
        ? `- Uploaded: ${uploadResult.inserted} inserted, ${uploadResult.skipped} skipped`
        : '- Uploaded: skipped (no device token)',
@@ -172,12 +218,14 @@ function parseArgs(argv) {
    const out = {
      auto: false,
      fromNotify: false,
+     fromRetry: false,
      drain: false
    };
    for (let i = 0; i < argv.length; i++) {
      const a = argv[i];
      if (a === '--auto') out.auto = true;
      else if (a === '--from-notify') out.fromNotify = true;
+     else if (a === '--from-retry') out.fromRetry = true;
      else if (a === '--drain') out.drain = true;
      else throw new Error(`Unknown option: ${a}`);
    }
@@ -217,5 +265,103 @@ async function maybeSendHeartbeat({ baseUrl, deviceToken, trackerDir, uploadResu
    }
  }
 
+ function deriveAutoSkipReason({ decision, state }) {
+   if (!decision || decision.reason !== 'throttled') return decision?.reason || 'unknown';
+   const backoffUntilMs = Number(state?.backoffUntilMs || 0);
+   const nextAllowedAtMs = Number(state?.nextAllowedAtMs || 0);
+   if (backoffUntilMs > 0 && backoffUntilMs >= nextAllowedAtMs) return 'backoff';
+   return 'throttled';
+ }
+
+ async function scheduleAutoRetry({ trackerDir, retryAtMs, reason, pendingBytes, source }) {
+   const retryMs = coerceRetryMs(retryAtMs);
+   if (!retryMs) return { scheduled: false, retryAtMs: 0 };
+
+   const retryPath = path.join(trackerDir, AUTO_RETRY_FILENAME);
+   const nowMs = Date.now();
+   const existing = await readJson(retryPath);
+   const existingMs = coerceRetryMs(existing?.retryAtMs);
+   if (existingMs && existingMs >= retryMs - 1000) {
+     return { scheduled: false, retryAtMs: existingMs };
+   }
+
+   const payload = {
+     version: 1,
+     retryAtMs: retryMs,
+     retryAt: new Date(retryMs).toISOString(),
+     reason: typeof reason === 'string' && reason.length > 0 ? reason : 'throttled',
+     pendingBytes: Math.max(0, Number(pendingBytes || 0)),
+     scheduledAt: new Date(nowMs).toISOString(),
+     source: typeof source === 'string' ? source : 'auto'
+   };
+
+   await writeJson(retryPath, payload);
+
+   const delayMs = Math.min(AUTO_RETRY_MAX_DELAY_MS, Math.max(0, retryMs - nowMs));
+   if (delayMs <= 0) return { scheduled: false, retryAtMs: retryMs };
+   if (process.env.VIBESCORE_AUTO_RETRY_NO_SPAWN === '1') {
+     return { scheduled: false, retryAtMs: retryMs };
+   }
+
+   spawnAutoRetryProcess({
+     retryPath,
+     trackerBinPath: path.join(trackerDir, 'app', 'bin', 'tracker.js'),
+     fallbackPkg: '@vibescore/tracker',
+     delayMs
+   });
+   return { scheduled: true, retryAtMs: retryMs };
+ }
+
+ async function clearAutoRetry(trackerDir) {
+   const retryPath = path.join(trackerDir, AUTO_RETRY_FILENAME);
+   await fs.unlink(retryPath).catch(() => {});
+ }
+
+ function spawnAutoRetryProcess({ retryPath, trackerBinPath, fallbackPkg, delayMs }) {
+   const script = buildAutoRetryScript({ retryPath, trackerBinPath, fallbackPkg, delayMs });
+   try {
+     const child = cp.spawn(process.execPath, ['-e', script], {
+       detached: true,
+       stdio: 'ignore',
+       env: process.env
+     });
+     child.unref();
+   } catch (_e) {}
+ }
+
+ function buildAutoRetryScript({ retryPath, trackerBinPath, fallbackPkg, delayMs }) {
+   return `'use strict';\n` +
+     `const fs = require('node:fs');\n` +
+     `const cp = require('node:child_process');\n` +
+     `const retryPath = ${JSON.stringify(retryPath)};\n` +
+     `const trackerBinPath = ${JSON.stringify(trackerBinPath)};\n` +
+     `const fallbackPkg = ${JSON.stringify(fallbackPkg)};\n` +
+     `const delayMs = ${Math.max(0, Math.floor(delayMs || 0))};\n` +
+     `setTimeout(() => {\n` +
+     `  let retryAtMs = 0;\n` +
+     `  try {\n` +
+     `    const raw = fs.readFileSync(retryPath, 'utf8');\n` +
+     `    retryAtMs = Number(JSON.parse(raw).retryAtMs || 0);\n` +
+     `  } catch (_) {}\n` +
+     `  if (!retryAtMs || Date.now() + 1000 < retryAtMs) process.exit(0);\n` +
+     `  const argv = ['sync', '--auto', '--from-retry'];\n` +
+     `  const cmd = fs.existsSync(trackerBinPath)\n` +
+     `    ? [process.execPath, trackerBinPath, ...argv]\n` +
+     `    : ['npx', '--yes', fallbackPkg, ...argv];\n` +
+     `  try {\n` +
+     `    const child = cp.spawn(cmd[0], cmd.slice(1), { detached: true, stdio: 'ignore', env: process.env });\n` +
+     `    child.unref();\n` +
+     `  } catch (_) {}\n` +
+     `}, delayMs);\n`;
+ }
+
+ function coerceRetryMs(v) {
+   const n = Number(v);
+   if (!Number.isFinite(n) || n <= 0) return 0;
+   return Math.floor(n);
+ }
+
  const HEARTBEAT_MIN_INTERVAL_MINUTES = 30;
  const HEARTBEAT_MIN_INTERVAL_MS = HEARTBEAT_MIN_INTERVAL_MINUTES * 60 * 1000;
+ const AUTO_RETRY_FILENAME = 'auto.retry.json';
+ const AUTO_RETRY_MAX_DELAY_MS = 2 * 60 * 60 * 1000;
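`scheduleAutoRetry` persists its state to `auto.retry.json` before spawning the detached retry process. Based on the payload construction above, a representative file (illustrative values) looks like:

```json
{
  "version": 1,
  "retryAtMs": 1717236000000,
  "retryAt": "2024-06-01T10:00:00.000Z",
  "reason": "backoff",
  "pendingBytes": 4096,
  "scheduledAt": "2024-06-01T09:30:00.000Z",
  "source": "auto-error"
}
```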
@@ -10,7 +10,7 @@ async function beginBrowserAuth({ baseUrl, dashboardUrl, timeoutMs, open }) {
 
  const { callbackUrl, waitForCallback } = await startLocalCallbackServer({ callbackPath, timeoutMs });
 
- const authUrl = dashboardUrl ? new URL('/connect', dashboardUrl) : new URL('/auth/sign-up', baseUrl);
+ const authUrl = dashboardUrl ? new URL('/', dashboardUrl) : new URL('/auth/sign-up', baseUrl);
  authUrl.searchParams.set('redirect', callbackUrl);
  if (dashboardUrl && baseUrl && baseUrl !== DEFAULT_BASE_URL) authUrl.searchParams.set('base_url', baseUrl);
 
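With a hosted dashboard configured, the browser auth flow now opens the landing page root instead of `/connect`, passing the local callback in the `redirect` query parameter. Schematically (port and callback path are placeholders, not values from this diff):

```
https://www.vibescore.space/?redirect=http://127.0.0.1:<port>/<callbackPath>
```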
@@ -18,12 +18,14 @@ async function collectTrackerDiagnostics({
  const notifySignalPath = path.join(trackerDir, 'notify.signal');
  const throttlePath = path.join(trackerDir, 'sync.throttle');
  const uploadThrottlePath = path.join(trackerDir, 'upload.throttle.json');
+ const autoRetryPath = path.join(trackerDir, 'auto.retry.json');
  const codexConfigPath = path.join(codexHome, 'config.toml');
 
  const config = await readJson(configPath);
  const cursors = await readJson(cursorsPath);
  const queueState = (await readJson(queueStatePath)) || { offset: 0 };
  const uploadThrottle = normalizeUploadState(await readJson(uploadThrottlePath));
+ const autoRetry = await readJson(autoRetryPath);
 
  const queueSize = await safeStatSize(queuePath);
  const offsetBytes = Number(queueState.offset || 0);
@@ -37,6 +39,7 @@ async function collectTrackerDiagnostics({
  const codexNotify = notifyConfigured ? codexNotifyRaw.map((v) => redactValue(v, home)) : null;
 
  const lastSuccessAt = uploadThrottle.lastSuccessMs ? new Date(uploadThrottle.lastSuccessMs).toISOString() : null;
+ const autoRetryAt = parseEpochMsToIso(autoRetry?.retryAtMs);
 
  return {
    ok: true,
@@ -84,7 +87,18 @@ async function collectTrackerDiagnostics({
          message: redactError(String(uploadThrottle.lastError), home)
        }
      : null
-   }
+   },
+   auto_retry: autoRetryAt
+     ? {
+         next_retry_at: autoRetryAt,
+         reason: typeof autoRetry?.reason === 'string' ? autoRetry.reason : null,
+         pending_bytes: Number.isFinite(Number(autoRetry?.pendingBytes))
+           ? Math.max(0, Number(autoRetry.pendingBytes))
+           : null,
+         scheduled_at: typeof autoRetry?.scheduledAt === 'string' ? autoRetry.scheduledAt : null,
+         source: typeof autoRetry?.source === 'string' ? autoRetry.source : null
+       }
+     : null
  };
}
 
@@ -135,4 +149,3 @@ function parseEpochMsToIso(v) {
  }
 
  module.exports = { collectTrackerDiagnostics };
-
2
2
  const fssync = require('node:fs');
3
3
  const path = require('node:path');
4
4
  const readline = require('node:readline');
5
- const crypto = require('node:crypto');
6
5
 
7
6
  const { ensureDir } = require('./fs');
8
7
 
@@ -37,10 +36,16 @@ async function listRolloutFiles(sessionsDir) {
37
36
  async function parseRolloutIncremental({ rolloutFiles, cursors, queuePath, onProgress }) {
38
37
  await ensureDir(path.dirname(queuePath));
39
38
  let filesProcessed = 0;
40
- let eventsQueued = 0;
39
+ let eventsAggregated = 0;
41
40
 
42
41
  const cb = typeof onProgress === 'function' ? onProgress : null;
43
42
  const totalFiles = Array.isArray(rolloutFiles) ? rolloutFiles.length : 0;
43
+ const hourlyState = normalizeHourlyState(cursors?.hourly);
44
+ const touchedBuckets = new Set();
45
+
46
+ if (!cursors.files || typeof cursors.files !== 'object') {
47
+ cursors.files = {};
48
+ }
44
49
 
45
50
  for (let idx = 0; idx < rolloutFiles.length; idx++) {
46
51
  const filePath = rolloutFiles[idx];
@@ -59,7 +64,8 @@ async function parseRolloutIncremental({ rolloutFiles, cursors, queuePath, onPro
59
64
  startOffset,
60
65
  lastTotal,
61
66
  lastModel,
62
- queuePath
67
+ hourlyState,
68
+ touchedBuckets
63
69
  });
64
70
 
65
71
  cursors.files[key] = {
@@ -71,7 +77,7 @@ async function parseRolloutIncremental({ rolloutFiles, cursors, queuePath, onPro
71
77
  };
72
78
 
73
79
  filesProcessed += 1;
74
- eventsQueued += result.eventsQueued;
80
+ eventsAggregated += result.eventsAggregated;
75
81
 
76
82
  if (cb) {
77
83
  cb({
@@ -79,28 +85,32 @@ async function parseRolloutIncremental({ rolloutFiles, cursors, queuePath, onPro
79
85
  total: totalFiles,
80
86
  filePath,
81
87
  filesProcessed,
82
- eventsQueued
88
+ eventsAggregated,
89
+ bucketsQueued: touchedBuckets.size
83
90
  });
84
91
  }
85
92
  }
86
93
 
87
- return { filesProcessed, eventsQueued };
94
+ const bucketsQueued = await enqueueTouchedBuckets({ queuePath, hourlyState, touchedBuckets });
95
+ hourlyState.updatedAt = new Date().toISOString();
96
+ cursors.hourly = hourlyState;
97
+
98
+ return { filesProcessed, eventsAggregated, bucketsQueued };
88
99
  }
89
100
 
90
- async function parseRolloutFile({ filePath, startOffset, lastTotal, lastModel, queuePath }) {
101
+ async function parseRolloutFile({ filePath, startOffset, lastTotal, lastModel, hourlyState, touchedBuckets }) {
91
102
  const st = await fs.stat(filePath);
92
103
  const endOffset = st.size;
93
104
  if (startOffset >= endOffset) {
94
- return { endOffset, lastTotal, lastModel, eventsQueued: 0 };
105
+ return { endOffset, lastTotal, lastModel, eventsAggregated: 0 };
95
106
  }
96
107
 
97
108
  const stream = fssync.createReadStream(filePath, { encoding: 'utf8', start: startOffset });
98
109
  const rl = readline.createInterface({ input: stream, crlfDelay: Infinity });
99
110
 
100
- const toAppend = [];
101
111
  let model = typeof lastModel === 'string' ? lastModel : null;
102
112
  let totals = lastTotal && typeof lastTotal === 'object' ? lastTotal : null;
103
- let eventsQueued = 0;
113
+ let eventsAggregated = 0;
104
114
 
105
115
  for await (const line of rl) {
106
116
  if (!line) continue;
@@ -139,26 +149,122 @@ async function parseRolloutFile({ filePath, startOffset, lastTotal, lastModel, q
139
149
  totals = totalUsage;
140
150
  }
141
151
 
142
- const event = {
143
- event_id: sha256Hex(line),
144
- token_timestamp: tokenTimestamp,
145
- model: model || null,
146
- input_tokens: delta.input_tokens || 0,
147
- cached_input_tokens: delta.cached_input_tokens || 0,
148
- output_tokens: delta.output_tokens || 0,
149
- reasoning_output_tokens: delta.reasoning_output_tokens || 0,
150
- total_tokens: delta.total_tokens || 0
151
- };
152
+ const bucketStart = toUtcHalfHourStart(tokenTimestamp);
153
+ if (!bucketStart) continue;
154
+
155
+ const bucket = getHourlyBucket(hourlyState, bucketStart);
156
+ addTotals(bucket.totals, delta);
157
+ touchedBuckets.add(bucketStart);
158
+ eventsAggregated += 1;
159
+ }
160
+
161
+ return { endOffset, lastTotal: totals, lastModel: model, eventsAggregated };
162
+ }
163
+
164
+ async function enqueueTouchedBuckets({ queuePath, hourlyState, touchedBuckets }) {
165
+ if (!touchedBuckets || touchedBuckets.size === 0) return 0;
152
166
 
153
- toAppend.push(JSON.stringify(event));
154
- eventsQueued += 1;
167
+ const toAppend = [];
168
+ for (const bucketStart of touchedBuckets) {
169
+ const bucket = hourlyState.buckets[bucketStart];
170
+ if (!bucket || !bucket.totals) continue;
171
+ const key = totalsKey(bucket.totals);
172
+ if (bucket.queuedKey === key) continue;
173
+ toAppend.push(
174
+ JSON.stringify({
175
+ hour_start: bucketStart,
176
+ input_tokens: bucket.totals.input_tokens,
177
+ cached_input_tokens: bucket.totals.cached_input_tokens,
178
+ output_tokens: bucket.totals.output_tokens,
179
+ reasoning_output_tokens: bucket.totals.reasoning_output_tokens,
180
+ total_tokens: bucket.totals.total_tokens
181
+ })
182
+ );
183
+ bucket.queuedKey = key;
155
184
  }
156
185
 
157
186
  if (toAppend.length > 0) {
158
187
  await fs.appendFile(queuePath, toAppend.join('\n') + '\n', 'utf8');
159
188
  }
160
189
 
161
- return { endOffset, lastTotal: totals, lastModel: model, eventsQueued };
190
+ return toAppend.length;
191
+ }
192
+
193
+ function normalizeHourlyState(raw) {
194
+ const state = raw && typeof raw === 'object' ? raw : {};
195
+ const buckets = state.buckets && typeof state.buckets === 'object' ? state.buckets : {};
196
+ return {
197
+ version: 1,
198
+ buckets,
199
+ updatedAt: typeof state.updatedAt === 'string' ? state.updatedAt : null
200
+ };
201
+ }
202
+
203
+ function getHourlyBucket(state, hourStart) {
204
+ const buckets = state.buckets;
205
+ let bucket = buckets[hourStart];
206
+ if (!bucket || typeof bucket !== 'object') {
207
+ bucket = { totals: initTotals(), queuedKey: null };
208
+ buckets[hourStart] = bucket;
209
+ return bucket;
210
+ }
211
+
212
+ if (!bucket.totals || typeof bucket.totals !== 'object') {
213
+ bucket.totals = initTotals();
214
+ }
215
+
216
+ if (bucket.queuedKey != null && typeof bucket.queuedKey !== 'string') {
217
+ bucket.queuedKey = null;
218
+ }
219
+
220
+ return bucket;
221
+ }
222
+
223
+ function initTotals() {
224
+ return {
225
+ input_tokens: 0,
226
+ cached_input_tokens: 0,
227
+ output_tokens: 0,
228
+ reasoning_output_tokens: 0,
229
+ total_tokens: 0
230
+ };
231
+ }
232
+
233
+ function addTotals(target, delta) {
234
+ target.input_tokens += delta.input_tokens || 0;
235
+ target.cached_input_tokens += delta.cached_input_tokens || 0;
236
+ target.output_tokens += delta.output_tokens || 0;
237
+ target.reasoning_output_tokens += delta.reasoning_output_tokens || 0;
238
+ target.total_tokens += delta.total_tokens || 0;
239
+ }
240
+
241
+ function totalsKey(totals) {
242
+ return [
243
+ totals.input_tokens || 0,
244
+ totals.cached_input_tokens || 0,
245
+ totals.output_tokens || 0,
246
+ totals.reasoning_output_tokens || 0,
247
+ totals.total_tokens || 0
248
+ ].join('|');
249
+ }
250
+
251
+ function toUtcHalfHourStart(ts) {
252
+ const dt = new Date(ts);
253
+ if (!Number.isFinite(dt.getTime())) return null;
254
+ const minutes = dt.getUTCMinutes();
255
+ const halfMinute = minutes >= 30 ? 30 : 0;
256
+ const bucketStart = new Date(
257
+ Date.UTC(
258
+ dt.getUTCFullYear(),
259
+ dt.getUTCMonth(),
260
+ dt.getUTCDate(),
261
+ dt.getUTCHours(),
262
+ halfMinute,
263
+ 0,
264
+ 0
265
+ )
266
+ );
267
+ return bucketStart.toISOString();
162
268
  }
163
269
 
164
270
  function pickDelta(lastUsage, totalUsage, prevTotals) {
@@ -209,10 +315,6 @@ function normalizeUsage(u) {
209
315
  return out;
210
316
  }
211
317
 
212
- function sha256Hex(s) {
213
- return crypto.createHash('sha256').update(s, 'utf8').digest('hex');
214
- }
215
-
216
318
  function isNonEmptyObject(v) {
217
319
  return Boolean(v && typeof v === 'object' && !Array.isArray(v) && Object.keys(v).length > 0);
218
320
  }
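The pivot from per-event queue rows to half-hour buckets hinges on `toUtcHalfHourStart`. A quick standalone check, with the function copied from the hunk above, shows how nearby timestamps collapse onto the same `hour_start` key:

```js
// toUtcHalfHourStart as introduced above: floor a timestamp to its UTC half hour.
function toUtcHalfHourStart(ts) {
  const dt = new Date(ts);
  if (!Number.isFinite(dt.getTime())) return null;
  const halfMinute = dt.getUTCMinutes() >= 30 ? 30 : 0;
  return new Date(Date.UTC(
    dt.getUTCFullYear(), dt.getUTCMonth(), dt.getUTCDate(),
    dt.getUTCHours(), halfMinute, 0, 0
  )).toISOString();
}

console.log(toUtcHalfHourStart('2024-06-01T09:14:59Z')); // 2024-06-01T09:00:00.000Z
console.log(toUtcHalfHourStart('2024-06-01T09:30:00Z')); // 2024-06-01T09:30:00.000Z
console.log(toUtcHalfHourStart('2024-06-01T09:45:10Z')); // 2024-06-01T09:30:00.000Z
console.log(toUtcHalfHourStart('not a date'));           // null
```

Because two events in the same half hour map to one bucket, later queue records for a bucket supersede earlier ones, which is what the `queuedKey` de-duplication above relies on.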
@@ -1,6 +1,6 @@
  const DEFAULTS = {
-   intervalMs: 10 * 60_000,
-   jitterMsMax: 120_000,
+   intervalMs: 30 * 60_000,
+   jitterMsMax: 60_000,
    backlogBytes: 1_000_000,
    batchSize: 300,
    maxBatchesSmall: 2,
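`decideAutoUpload` is not part of this diff, so the exact gating logic is not visible here. Given the fields the sync code reads back (`nextAllowedAtMs`, `backoffUntilMs`) and these defaults, a plausible, purely illustrative reading is "at most one auto upload every 30 to 31 minutes unless the backlog is large":

```js
// Hypothetical sketch of the throttle gate; NOT the package's decideAutoUpload.
const DEFAULTS = { intervalMs: 30 * 60_000, jitterMsMax: 60_000, backlogBytes: 1_000_000 };

function nextAllowedAt(lastSuccessMs) {
  const jitter = Math.floor(Math.random() * DEFAULTS.jitterMsMax); // spread load by 0-60 s
  return lastSuccessMs + DEFAULTS.intervalMs + jitter;
}

function mayUpload({ nowMs, lastSuccessMs, pendingBytes }) {
  if (pendingBytes >= DEFAULTS.backlogBytes) return true; // a large backlog skips the wait
  return nowMs >= nextAllowedAt(lastSuccessMs);
}
```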
@@ -3,7 +3,7 @@ const fssync = require('node:fs');
  const readline = require('node:readline');
 
  const { ensureDir, readJson, writeJson } = require('./fs');
- const { ingestEvents } = require('./vibescore-api');
+ const { ingestHourly } = require('./vibescore-api');
 
  async function drainQueueToCloud({ baseUrl, deviceToken, queuePath, queueStatePath, maxBatches, batchSize, onProgress }) {
    await ensureDir(require('node:path').dirname(queueStatePath));
@@ -16,14 +16,14 @@ async function drainQueueToCloud({ baseUrl, deviceToken, queuePath, queueStatePa
 
    const cb = typeof onProgress === 'function' ? onProgress : null;
    const queueSize = await safeFileSize(queuePath);
-   const maxEvents = Math.max(1, Math.floor(Number(batchSize || 200)));
+   const maxBuckets = Math.max(1, Math.floor(Number(batchSize || 200)));
 
    for (let batch = 0; batch < maxBatches; batch++) {
-     const res = await readBatch(queuePath, offset, maxEvents);
-     if (res.events.length === 0) break;
+     const res = await readBatch(queuePath, offset, maxBuckets);
+     if (res.buckets.length === 0) break;
 
-     attempted += res.events.length;
-     const ingest = await ingestEvents({ baseUrl, deviceToken, events: res.events });
+     attempted += res.buckets.length;
+     const ingest = await ingestHourly({ baseUrl, deviceToken, hourly: res.buckets });
      inserted += ingest.inserted || 0;
      skipped += ingest.skipped || 0;
 
@@ -47,33 +47,37 @@ async function drainQueueToCloud({ baseUrl, deviceToken, queuePath, queueStatePa
    return { inserted, skipped, attempted };
  }
 
- async function readBatch(queuePath, startOffset, maxEvents) {
+ async function readBatch(queuePath, startOffset, maxBuckets) {
    const st = await fs.stat(queuePath).catch(() => null);
-   if (!st || !st.isFile()) return { events: [], nextOffset: startOffset };
-   if (startOffset >= st.size) return { events: [], nextOffset: startOffset };
+   if (!st || !st.isFile()) return { buckets: [], nextOffset: startOffset };
+   if (startOffset >= st.size) return { buckets: [], nextOffset: startOffset };
 
    const stream = fssync.createReadStream(queuePath, { encoding: 'utf8', start: startOffset });
    const rl = readline.createInterface({ input: stream, crlfDelay: Infinity });
 
-   const events = [];
+   const bucketMap = new Map();
    let offset = startOffset;
+   let linesRead = 0;
    for await (const line of rl) {
      const bytes = Buffer.byteLength(line, 'utf8') + 1;
      offset += bytes;
      if (!line.trim()) continue;
-     let ev;
+     let bucket;
      try {
-       ev = JSON.parse(line);
+       bucket = JSON.parse(line);
      } catch (_e) {
        continue;
      }
-     events.push(ev);
-     if (events.length >= maxEvents) break;
+     const hourStart = typeof bucket?.hour_start === 'string' ? bucket.hour_start : null;
+     if (!hourStart) continue;
+     bucketMap.set(hourStart, bucket);
+     linesRead += 1;
+     if (linesRead >= maxBuckets) break;
    }
 
    rl.close();
    stream.close?.();
-   return { events, nextOffset: offset };
+   return { buckets: Array.from(bucketMap.values()), nextOffset: offset };
  }
 
  async function safeFileSize(p) {
@@ -35,13 +35,13 @@ async function issueDeviceToken({ baseUrl, accessToken, deviceName, platform = '
    return { token, deviceId };
  }
 
- async function ingestEvents({ baseUrl, deviceToken, events }) {
+ async function ingestHourly({ baseUrl, deviceToken, hourly }) {
    const data = await invokeFunctionWithRetry({
      baseUrl,
      accessToken: deviceToken,
      slug: 'vibescore-ingest',
      method: 'POST',
-     body: { events },
+     body: { hourly },
      errorPrefix: 'Ingest failed',
      retry: { maxRetries: 3, baseDelayMs: 500, maxDelayMs: 5000 }
    });
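The wire format follows from the queue records built in `enqueueTouchedBuckets`: the body key changes from `events` to `hourly`, and each element is one half-hour bucket. A representative POST body for the `vibescore-ingest` function (numbers illustrative):

```json
{
  "hourly": [
    {
      "hour_start": "2024-06-01T09:30:00.000Z",
      "input_tokens": 1200,
      "cached_input_tokens": 300,
      "output_tokens": 450,
      "reasoning_output_tokens": 128,
      "total_tokens": 1650
    }
  ]
}
```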
@@ -72,7 +72,7 @@ async function syncHeartbeat({ baseUrl, deviceToken }) {
  module.exports = {
    signInWithPassword,
    issueDeviceToken,
-   ingestEvents,
+   ingestHourly,
    syncHeartbeat
  };