@vibescore/tracker 0.0.2 → 0.0.4

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,26 +1,97 @@
- # @vibescore/tracker
+ <div align="center">
 
- Codex CLI token usage tracker (macOS-first, notify-driven).
+ # 🟢 VIBESCORE
 
- ## Quick Start
+ **QUANTIFY YOUR AI OUTPUT**
+ _Real-time AI Analytics for Codex CLI_
+
+ [![License: MIT](https://img.shields.io/badge/License-MIT-green.svg)](https://opensource.org/licenses/MIT)
+ [![Node.js Support](https://img.shields.io/badge/Node.js-%3E%3D18-brightgreen.svg)](https://nodejs.org/)
+ [![Platform](https://img.shields.io/badge/Platform-macOS-lightgrey.svg)](https://www.apple.com/macos/)
+
+ [**English**](README.md) • [**中文说明**](README.zh-CN.md)
+
+ [**Documentation**](docs/) • [**Dashboard**](dashboard/) • [**Backend API**](BACKEND_API.md)
+
+ </div>
+
+ ---
+
+ ## 🌌 Overview
+
+ **VibeScore** is an intelligent token usage tracking system designed specifically for macOS developers. It monitors Codex CLI output in real time, transforming your **AI Output** into quantifiable metrics via a high-fidelity, **Matrix-themed** dashboard.
+
+ > [!TIP]
+ > **Core Index**: Our signature metric that reflects your flow state by analyzing token consumption rates and patterns.
+
+ ## 🚀 Key Features
+
+ - 📡 **Live Sniffer**: Real-time interception of Codex CLI pipes using low-level hooks to capture every completion event.
+ - 📊 **Matrix Dashboard**: A high-performance React + Vite dashboard featuring heatmaps, trend charts, and live logs.
+ - ⚡ **AI Analytics**: Deep analysis of Input/Output tokens, with dedicated tracking for Cached and Reasoning components.
+ - 🔒 **Identity Core**: Robust authentication and permission management to secure your development data.
+
+ ## 🛠️ Quick Start
+
+ ### Installation
+
+ Initialize your environment with a single command:
 
  ```bash
  npx --yes @vibescore/tracker init
+ ```
+
+ ### Sync & Status
+
+ ```bash
+ # Sync latest local session data
  npx --yes @vibescore/tracker sync
+
+ # Check current link status
  npx --yes @vibescore/tracker status
- npx --yes @vibescore/tracker uninstall
  ```
 
- ## Requirements
+ ## 🏗️ Architecture
+
+ ```mermaid
+ graph TD
+   A[Codex CLI] -->|Rollout Logs| B(Tracker CLI)
+   B -->|AI Tokens| C{Core Relay}
+   C --> D[VibeScore Dashboard]
+   C --> E[AI Analytics Engine]
+ ```
+
+ ## 💻 Developer Guide
+
+ To run locally or contribute:
 
- - Node.js >= 18
- - macOS (current supported platform)
+ ### Dashboard Development
+
+ ```bash
+ # Install dependencies
+ cd dashboard
+ npm install
+
+ # Start dev server
+ npm run dev
+ ```
+
+ ### Architecture Validation
+
+ ```bash
+ # Validate Copy Registry
+ npm run validate:copy
+
+ # Run smoke tests
+ npm run smoke
+ ```
 
- ## Notes
+ ## 📜 License
 
- - `init` installs a Codex CLI notify hook and issues a device token.
- - `sync` parses `~/.codex/sessions/**/rollout-*.jsonl` and uploads token_count deltas.
+ This project is licensed under the [MIT License](LICENSE).
 
- ## License
+ ---
 
- MIT
+ <div align="center">
+   <b>System_Ready // 2024 VibeScore OS</b><br/>
+   <i>"More Tokens. More Vibe."</i>
+ </div>
@@ -0,0 +1,97 @@
+ <div align="center">
+
+ # 🟢 VIBESCORE
+
+ **量化你的 AI 产出**
+ _Codex CLI 实时 AI 分析工具_
+
+ [![License: MIT](https://img.shields.io/badge/License-MIT-green.svg)](https://opensource.org/licenses/MIT)
+ [![Node.js Support](https://img.shields.io/badge/Node.js-%3E%3D18-brightgreen.svg)](https://nodejs.org/)
+ [![Platform](https://img.shields.io/badge/Platform-macOS-lightgrey.svg)](https://www.apple.com/macos/)
+
+ [**English**](README.md) • [**中文说明**](README.zh-CN.md)
+
+ [**文档**](docs/) • [**控制台**](dashboard/) • [**后端接口**](BACKEND_API.md)
+
+ </div>
+
+ ---
+
+ ## 🌌 项目概述
+
+ **VibeScore** 是一个专为 macOS 开发者设计的智能令牌(Token)使用追踪系统。它能够实时监控 Codex CLI 的输出,通过高度可视化的 **Matrix** 风格仪表盘,将你的 **AI 产出 (AI Output)** 转化为可量化的指标。
+
+ > [!TIP]
+ > **Core Index (核心指数)**: 我们的标志性指标,通过分析 Token 消耗速率与模式,反映你的开发心流状态。
+
+ ## 🚀 核心功能
+
+ - 📡 **Live Sniffer (实时嗅探)**: 实时监听 Codex CLI 管道,通过底层 Hook 捕获每一次补全事件。
+ - 📊 **Matrix Dashboard (矩阵控制台)**: 基于 React + Vite 的高性能仪表盘,具备热力图、趋势图与实时日志。
+ - ⚡ **AI Analytics (AI 分析)**: 深度分析 Input/Output Token,支持缓存 (Cached) 与推理 (Reasoning) 部分的分离监控。
+ - 🔒 **Identity Core (身份核心)**: 完备的身份验证与权限管理,保护你的开发数据资产。
+
+ ## 🛠️ 快速开始
+
+ ### 安装
+
+ 只需一行命令,即可初始化环境:
+
+ ```bash
+ npx --yes @vibescore/tracker init
+ ```
+
+ ### 同步与状态查看
+
+ ```bash
+ # 同步最新的本地会话数据
+ npx --yes @vibescore/tracker sync
+
+ # 查看当前连接状态
+ npx --yes @vibescore/tracker status
+ ```
+
+ ## 🏗️ 系统架构
+
+ ```mermaid
+ graph TD
+   A[Codex CLI] -->|Rollout Logs| B(Tracker CLI)
+   B -->|AI Tokens| C{Core Relay}
+   C --> D[VibeScore Dashboard]
+   C --> E[AI Analytics Engine]
+ ```
+
+ ## 💻 开发者指南
+
+ 如果你想在本地运行或贡献代码:
+
+ ### 仪表盘开发
+
+ ```bash
+ # 安装依赖
+ cd dashboard
+ npm install
+
+ # 启动开发服务器
+ npm run dev
+ ```
+
+ ### 整体架构验证
+
+ ```bash
+ # 验证 Copy 注册表
+ npm run validate:copy
+
+ # 执行烟雾测试
+ npm run smoke
+ ```
+
+ ## 📜 开源协议
+
+ 本项目基于 [MIT](LICENSE) 协议开源。
+
+ ---
+
+ <div align="center">
+   <b>System_Ready // 2024 VibeScore OS</b><br/>
+   <i>"More Tokens. More Vibe."</i>
+ </div>
package/bin/tracker.js CHANGED
@@ -9,10 +9,19 @@ if (debug) process.env.VIBESCORE_DEBUG = '1';
  run(argv).catch((err) => {
    console.error(err?.stack || String(err));
    if (debug) {
+     if (typeof err?.status === 'number') {
+       console.error(`Status: ${err.status}`);
+     }
+     if (typeof err?.code === 'string' && err.code.trim()) {
+       console.error(`Code: ${err.code.trim()}`);
+     }
      const original = err?.originalMessage;
      if (original && original !== err?.message) {
        console.error(`Original error: ${original}`);
      }
+     if (typeof err?.nextActions === 'string' && err.nextActions.trim()) {
+       console.error(`Next actions: ${err.nextActions.trim()}`);
+     }
    }
    process.exitCode = 1;
  });
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
    "name": "@vibescore/tracker",
-   "version": "0.0.2",
+   "version": "0.0.4",
    "description": "Codex CLI token usage tracker (macOS-first, notify-driven).",
    "license": "MIT",
    "publishConfig": {
@@ -7,6 +7,7 @@ const { prompt, promptHidden } = require('../lib/prompt');
  const { upsertCodexNotify, loadCodexNotifyOriginal } = require('../lib/codex-config');
  const { beginBrowserAuth } = require('../lib/browser-auth');
  const { issueDeviceTokenWithPassword, issueDeviceTokenWithAccessToken } = require('../lib/insforge');
+ const { cmdSync } = require('./sync');
 
  async function cmdInit(argv) {
    const opts = parseArgs(argv);
@@ -104,6 +105,13 @@ async function cmdInit(argv) {
      ''
    ].join('\n')
  );
+
+   try {
+     await cmdSync([]);
+   } catch (err) {
+     const msg = err && err.message ? err.message : 'unknown error';
+     process.stderr.write(`Initial sync failed: ${msg}\n`);
+   }
  }
 
  function parseArgs(argv) {
@@ -6,6 +6,7 @@ const { ensureDir, readJson, writeJson, openLock } = require('../lib/fs');
  const { listRolloutFiles, parseRolloutIncremental } = require('../lib/rollout');
  const { drainQueueToCloud } = require('../lib/uploader');
  const { createProgress, renderBar, formatNumber, formatBytes } = require('../lib/progress');
+ const { syncHeartbeat } = require('../lib/vibescore-api');
  const {
    DEFAULTS: UPLOAD_DEFAULTS,
    normalizeState: normalizeUploadState,
@@ -44,7 +45,7 @@ async function cmdSync(argv) {
  const rolloutFiles = await listRolloutFiles(sessionsDir);
 
  if (progress?.enabled) {
-   progress.start(`Parsing ${renderBar(0)} 0/${formatNumber(rolloutFiles.length)} files | queued 0`);
+   progress.start(`Parsing ${renderBar(0)} 0/${formatNumber(rolloutFiles.length)} files | buckets 0`);
  }
 
  const parseResult = await parseRolloutIncremental({
@@ -55,7 +56,9 @@ async function cmdSync(argv) {
  if (!progress?.enabled) return;
  const pct = p.total > 0 ? p.index / p.total : 1;
  progress.update(
-   `Parsing ${renderBar(pct)} ${formatNumber(p.index)}/${formatNumber(p.total)} files | queued ${formatNumber(p.eventsQueued)}`
+   `Parsing ${renderBar(pct)} ${formatNumber(p.index)}/${formatNumber(p.total)} files | buckets ${formatNumber(
+     p.bucketsQueued
+   )}`
  );
  }
  });
@@ -130,16 +133,24 @@ async function cmdSync(argv) {
  progress?.stop();
  }
 
- if (!opts.auto) {
-   const afterState = (await readJson(queueStatePath)) || { offset: 0 };
-   const queueSize = await safeStatSize(queuePath);
-   const pendingBytes = Math.max(0, queueSize - Number(afterState.offset || 0));
+ const afterState = (await readJson(queueStatePath)) || { offset: 0 };
+ const queueSize = await safeStatSize(queuePath);
+ const pendingBytes = Math.max(0, queueSize - Number(afterState.offset || 0));
+
+ await maybeSendHeartbeat({
+   baseUrl,
+   deviceToken,
+   trackerDir,
+   uploadResult,
+   pendingBytes
+ });
 
+ if (!opts.auto) {
  process.stdout.write(
    [
      'Sync finished:',
      `- Parsed files: ${parseResult.filesProcessed}`,
-     `- New events queued: ${parseResult.eventsQueued}`,
+     `- New 30-min buckets queued: ${parseResult.bucketsQueued}`,
      deviceToken
        ? `- Uploaded: ${uploadResult.inserted} inserted, ${uploadResult.skipped} skipped`
        : '- Uploaded: skipped (no device token)',
@@ -185,3 +196,28 @@ async function safeStatSize(p) {
    return 0;
  }
  }
+
+ async function maybeSendHeartbeat({ baseUrl, deviceToken, trackerDir, uploadResult, pendingBytes }) {
+   if (!deviceToken || !uploadResult) return;
+   if (pendingBytes > 0) return;
+   if (Number(uploadResult.inserted || 0) !== 0) return;
+
+   const heartbeatPath = path.join(trackerDir, 'sync.heartbeat.json');
+   const heartbeatState = await readJson(heartbeatPath);
+   const lastPingAt = Date.parse(heartbeatState?.lastPingAt || '');
+   const nowMs = Date.now();
+   if (Number.isFinite(lastPingAt) && nowMs - lastPingAt < HEARTBEAT_MIN_INTERVAL_MS) return;
+
+   try {
+     await syncHeartbeat({ baseUrl, deviceToken });
+     await writeJson(heartbeatPath, {
+       lastPingAt: new Date(nowMs).toISOString(),
+       minIntervalMinutes: HEARTBEAT_MIN_INTERVAL_MINUTES
+     });
+   } catch (_e) {
+     // best-effort heartbeat; ignore failures
+   }
+ }
+
+ const HEARTBEAT_MIN_INTERVAL_MINUTES = 30;
+ const HEARTBEAT_MIN_INTERVAL_MS = HEARTBEAT_MIN_INTERVAL_MINUTES * 60 * 1000;
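The gating logic in `maybeSendHeartbeat` above reduces to a single timestamp comparison against the 30-minute floor. A minimal standalone sketch of that gate — the state object stands in for `sync.heartbeat.json`, and `shouldPing` is a hypothetical name, not part of the package:

```javascript
const HEARTBEAT_MIN_INTERVAL_MS = 30 * 60 * 1000;

// Returns true when enough time has passed since the recorded ping
// (or when no valid ping has been recorded yet), mirroring the diff's check.
function shouldPing(heartbeatState, nowMs) {
  const lastPingAt = Date.parse(heartbeatState?.lastPingAt || '');
  return !(Number.isFinite(lastPingAt) && nowMs - lastPingAt < HEARTBEAT_MIN_INTERVAL_MS);
}

const now = Date.parse('2024-05-01T12:00:00Z');
console.log(shouldPing(null, now));                                   // true (no state yet)
console.log(shouldPing({ lastPingAt: '2024-05-01T11:50:00Z' }, now)); // false (10 min ago)
console.log(shouldPing({ lastPingAt: '2024-05-01T11:00:00Z' }, now)); // true (60 min ago)
```

Note that `Date.parse('')` is `NaN`, so a missing or corrupt heartbeat file fails open and the ping is sent.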
@@ -2,7 +2,6 @@ const fs = require('node:fs/promises');
  const fssync = require('node:fs');
  const path = require('node:path');
  const readline = require('node:readline');
- const crypto = require('node:crypto');
 
  const { ensureDir } = require('./fs');
 
@@ -37,10 +36,16 @@ async function listRolloutFiles(sessionsDir) {
  async function parseRolloutIncremental({ rolloutFiles, cursors, queuePath, onProgress }) {
    await ensureDir(path.dirname(queuePath));
    let filesProcessed = 0;
-   let eventsQueued = 0;
+   let eventsAggregated = 0;
 
    const cb = typeof onProgress === 'function' ? onProgress : null;
    const totalFiles = Array.isArray(rolloutFiles) ? rolloutFiles.length : 0;
+   const hourlyState = normalizeHourlyState(cursors?.hourly);
+   const touchedBuckets = new Set();
+
+   if (!cursors.files || typeof cursors.files !== 'object') {
+     cursors.files = {};
+   }
 
    for (let idx = 0; idx < rolloutFiles.length; idx++) {
      const filePath = rolloutFiles[idx];
@@ -59,7 +64,8 @@ async function parseRolloutIncremental({ rolloutFiles, cursors, queuePath, onPro
      startOffset,
      lastTotal,
      lastModel,
-     queuePath
+     hourlyState,
+     touchedBuckets
    });
 
    cursors.files[key] = {
@@ -71,7 +77,7 @@ async function parseRolloutIncremental({ rolloutFiles, cursors, queuePath, onPro
    };
 
    filesProcessed += 1;
-   eventsQueued += result.eventsQueued;
+   eventsAggregated += result.eventsAggregated;
 
    if (cb) {
      cb({
@@ -79,28 +85,32 @@ async function parseRolloutIncremental({ rolloutFiles, cursors, queuePath, onPro
        total: totalFiles,
        filePath,
        filesProcessed,
-       eventsQueued
+       eventsAggregated,
+       bucketsQueued: touchedBuckets.size
      });
    }
  }
 
- return { filesProcessed, eventsQueued };
+ const bucketsQueued = await enqueueTouchedBuckets({ queuePath, hourlyState, touchedBuckets });
+ hourlyState.updatedAt = new Date().toISOString();
+ cursors.hourly = hourlyState;
+
+ return { filesProcessed, eventsAggregated, bucketsQueued };
  }
 
- async function parseRolloutFile({ filePath, startOffset, lastTotal, lastModel, queuePath }) {
+ async function parseRolloutFile({ filePath, startOffset, lastTotal, lastModel, hourlyState, touchedBuckets }) {
    const st = await fs.stat(filePath);
    const endOffset = st.size;
    if (startOffset >= endOffset) {
-     return { endOffset, lastTotal, lastModel, eventsQueued: 0 };
+     return { endOffset, lastTotal, lastModel, eventsAggregated: 0 };
    }
 
    const stream = fssync.createReadStream(filePath, { encoding: 'utf8', start: startOffset });
    const rl = readline.createInterface({ input: stream, crlfDelay: Infinity });
 
-   const toAppend = [];
    let model = typeof lastModel === 'string' ? lastModel : null;
    let totals = lastTotal && typeof lastTotal === 'object' ? lastTotal : null;
-   let eventsQueued = 0;
+   let eventsAggregated = 0;
 
    for await (const line of rl) {
      if (!line) continue;
@@ -139,26 +149,122 @@ async function parseRolloutFile({ filePath, startOffset, lastTotal, lastModel, q
      totals = totalUsage;
    }
 
-   const event = {
-     event_id: sha256Hex(line),
-     token_timestamp: tokenTimestamp,
-     model: model || null,
-     input_tokens: delta.input_tokens || 0,
-     cached_input_tokens: delta.cached_input_tokens || 0,
-     output_tokens: delta.output_tokens || 0,
-     reasoning_output_tokens: delta.reasoning_output_tokens || 0,
-     total_tokens: delta.total_tokens || 0
-   };
+   const bucketStart = toUtcHalfHourStart(tokenTimestamp);
+   if (!bucketStart) continue;
+
+   const bucket = getHourlyBucket(hourlyState, bucketStart);
+   addTotals(bucket.totals, delta);
+   touchedBuckets.add(bucketStart);
+   eventsAggregated += 1;
+ }
+
+ return { endOffset, lastTotal: totals, lastModel: model, eventsAggregated };
+ }
+
+ async function enqueueTouchedBuckets({ queuePath, hourlyState, touchedBuckets }) {
+   if (!touchedBuckets || touchedBuckets.size === 0) return 0;
 
-   toAppend.push(JSON.stringify(event));
-   eventsQueued += 1;
+   const toAppend = [];
+   for (const bucketStart of touchedBuckets) {
+     const bucket = hourlyState.buckets[bucketStart];
+     if (!bucket || !bucket.totals) continue;
+     const key = totalsKey(bucket.totals);
+     if (bucket.queuedKey === key) continue;
+     toAppend.push(
+       JSON.stringify({
+         hour_start: bucketStart,
+         input_tokens: bucket.totals.input_tokens,
+         cached_input_tokens: bucket.totals.cached_input_tokens,
+         output_tokens: bucket.totals.output_tokens,
+         reasoning_output_tokens: bucket.totals.reasoning_output_tokens,
+         total_tokens: bucket.totals.total_tokens
+       })
+     );
+     bucket.queuedKey = key;
    }
 
    if (toAppend.length > 0) {
      await fs.appendFile(queuePath, toAppend.join('\n') + '\n', 'utf8');
    }
 
-   return { endOffset, lastTotal: totals, lastModel: model, eventsQueued };
+   return toAppend.length;
+ }
+
+ function normalizeHourlyState(raw) {
+   const state = raw && typeof raw === 'object' ? raw : {};
+   const buckets = state.buckets && typeof state.buckets === 'object' ? state.buckets : {};
+   return {
+     version: 1,
+     buckets,
+     updatedAt: typeof state.updatedAt === 'string' ? state.updatedAt : null
+   };
+ }
+
+ function getHourlyBucket(state, hourStart) {
+   const buckets = state.buckets;
+   let bucket = buckets[hourStart];
+   if (!bucket || typeof bucket !== 'object') {
+     bucket = { totals: initTotals(), queuedKey: null };
+     buckets[hourStart] = bucket;
+     return bucket;
+   }
+
+   if (!bucket.totals || typeof bucket.totals !== 'object') {
+     bucket.totals = initTotals();
+   }
+
+   if (bucket.queuedKey != null && typeof bucket.queuedKey !== 'string') {
+     bucket.queuedKey = null;
+   }
+
+   return bucket;
+ }
+
+ function initTotals() {
+   return {
+     input_tokens: 0,
+     cached_input_tokens: 0,
+     output_tokens: 0,
+     reasoning_output_tokens: 0,
+     total_tokens: 0
+   };
+ }
+
+ function addTotals(target, delta) {
+   target.input_tokens += delta.input_tokens || 0;
+   target.cached_input_tokens += delta.cached_input_tokens || 0;
+   target.output_tokens += delta.output_tokens || 0;
+   target.reasoning_output_tokens += delta.reasoning_output_tokens || 0;
+   target.total_tokens += delta.total_tokens || 0;
+ }
+
+ function totalsKey(totals) {
+   return [
+     totals.input_tokens || 0,
+     totals.cached_input_tokens || 0,
+     totals.output_tokens || 0,
+     totals.reasoning_output_tokens || 0,
+     totals.total_tokens || 0
+   ].join('|');
+ }
+
+ function toUtcHalfHourStart(ts) {
+   const dt = new Date(ts);
+   if (!Number.isFinite(dt.getTime())) return null;
+   const minutes = dt.getUTCMinutes();
+   const halfMinute = minutes >= 30 ? 30 : 0;
+   const bucketStart = new Date(
+     Date.UTC(
+       dt.getUTCFullYear(),
+       dt.getUTCMonth(),
+       dt.getUTCDate(),
+       dt.getUTCHours(),
+       halfMinute,
+       0,
+       0
+     )
+   );
+   return bucketStart.toISOString();
  }
 
  function pickDelta(lastUsage, totalUsage, prevTotals) {
@@ -209,10 +315,6 @@ function normalizeUsage(u) {
    return out;
  }
 
- function sha256Hex(s) {
-   return crypto.createHash('sha256').update(s, 'utf8').digest('hex');
- }
-
  function isNonEmptyObject(v) {
    return Boolean(v && typeof v === 'object' && !Array.isArray(v) && Object.keys(v).length > 0);
  }
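The new aggregation pivots on flooring each event timestamp to a UTC half-hour boundary. The helper from the diff can be exercised on its own:

```javascript
// Mirror of toUtcHalfHourStart from the diff: floor a timestamp to its
// UTC half-hour boundary and return it as an ISO string (null if invalid).
function toUtcHalfHourStart(ts) {
  const dt = new Date(ts);
  if (!Number.isFinite(dt.getTime())) return null;
  const halfMinute = dt.getUTCMinutes() >= 30 ? 30 : 0;
  return new Date(Date.UTC(
    dt.getUTCFullYear(), dt.getUTCMonth(), dt.getUTCDate(),
    dt.getUTCHours(), halfMinute, 0, 0
  )).toISOString();
}

console.log(toUtcHalfHourStart('2024-05-01T10:29:59Z')); // 2024-05-01T10:00:00.000Z
console.log(toUtcHalfHourStart('2024-05-01T10:30:00Z')); // 2024-05-01T10:30:00.000Z
console.log(toUtcHalfHourStart('not a date'));           // null
```

All events in the same half hour collapse to the same `hour_start` key, which is what lets `totalsKey` dedup re-queued buckets whose totals have not changed.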
@@ -1,6 +1,6 @@
  const DEFAULTS = {
-   intervalMs: 10 * 60_000,
-   jitterMsMax: 120_000,
+   intervalMs: 30 * 60_000,
+   jitterMsMax: 60_000,
    backlogBytes: 1_000_000,
    batchSize: 300,
    maxBatchesSmall: 2,
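With the new defaults, an automatic sync runs roughly every 30 minutes with up to one minute of jitter, down from every 10 minutes with up to two. A hedged sketch of how those two numbers might combine — the scheduler itself is not part of this diff, so the additive combination and the `nextDelayMs` name are assumptions:

```javascript
const DEFAULTS = { intervalMs: 30 * 60_000, jitterMsMax: 60_000 };

// Hypothetical delay computation: base interval plus random jitter,
// so concurrent clients do not all sync at the same instant.
function nextDelayMs(defaults, rand = Math.random()) {
  return defaults.intervalMs + Math.floor(rand * defaults.jitterMsMax);
}

console.log(nextDelayMs(DEFAULTS, 0));   // 1800000 (30 min)
console.log(nextDelayMs(DEFAULTS, 0.5)); // 1830000 (30.5 min)
```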
@@ -3,7 +3,7 @@ const fssync = require('node:fs');
  const readline = require('node:readline');
 
  const { ensureDir, readJson, writeJson } = require('./fs');
- const { ingestEvents } = require('./vibescore-api');
+ const { ingestHourly } = require('./vibescore-api');
 
  async function drainQueueToCloud({ baseUrl, deviceToken, queuePath, queueStatePath, maxBatches, batchSize, onProgress }) {
    await ensureDir(require('node:path').dirname(queueStatePath));
@@ -16,14 +16,14 @@ async function drainQueueToCloud({ baseUrl, deviceToken, queuePath, queueStatePa
    const cb = typeof onProgress === 'function' ? onProgress : null;
    const queueSize = await safeFileSize(queuePath);
-   const maxEvents = Math.max(1, Math.floor(Number(batchSize || 200)));
+   const maxBuckets = Math.max(1, Math.floor(Number(batchSize || 200)));
 
    for (let batch = 0; batch < maxBatches; batch++) {
-     const res = await readBatch(queuePath, offset, maxEvents);
-     if (res.events.length === 0) break;
+     const res = await readBatch(queuePath, offset, maxBuckets);
+     if (res.buckets.length === 0) break;
 
-     attempted += res.events.length;
-     const ingest = await ingestEvents({ baseUrl, deviceToken, events: res.events });
+     attempted += res.buckets.length;
+     const ingest = await ingestHourly({ baseUrl, deviceToken, hourly: res.buckets });
      inserted += ingest.inserted || 0;
      skipped += ingest.skipped || 0;
@@ -47,33 +47,37 @@ async function drainQueueToCloud({ baseUrl, deviceToken, queuePath, queueStatePa
    return { inserted, skipped, attempted };
  }
 
- async function readBatch(queuePath, startOffset, maxEvents) {
+ async function readBatch(queuePath, startOffset, maxBuckets) {
    const st = await fs.stat(queuePath).catch(() => null);
-   if (!st || !st.isFile()) return { events: [], nextOffset: startOffset };
-   if (startOffset >= st.size) return { events: [], nextOffset: startOffset };
+   if (!st || !st.isFile()) return { buckets: [], nextOffset: startOffset };
+   if (startOffset >= st.size) return { buckets: [], nextOffset: startOffset };
 
    const stream = fssync.createReadStream(queuePath, { encoding: 'utf8', start: startOffset });
    const rl = readline.createInterface({ input: stream, crlfDelay: Infinity });
 
-   const events = [];
+   const bucketMap = new Map();
    let offset = startOffset;
+   let linesRead = 0;
    for await (const line of rl) {
      const bytes = Buffer.byteLength(line, 'utf8') + 1;
      offset += bytes;
      if (!line.trim()) continue;
-     let ev;
+     let bucket;
      try {
-       ev = JSON.parse(line);
+       bucket = JSON.parse(line);
      } catch (_e) {
        continue;
      }
-     events.push(ev);
-     if (events.length >= maxEvents) break;
+     const hourStart = typeof bucket?.hour_start === 'string' ? bucket.hour_start : null;
+     if (!hourStart) continue;
+     bucketMap.set(hourStart, bucket);
+     linesRead += 1;
+     if (linesRead >= maxBuckets) break;
    }
 
    rl.close();
    stream.close?.();
-   return { events, nextOffset: offset };
+   return { buckets: Array.from(bucketMap.values()), nextOffset: offset };
  }
 
  async function safeFileSize(p) {
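A behavioral note on the rewritten `readBatch`: because rows are collapsed into a `Map` keyed by `hour_start`, a bucket that was re-queued with updated totals later in the file wins. A standalone sketch of just that last-write-wins collapse (file-offset handling omitted; `collapseBuckets` is a hypothetical name for illustration):

```javascript
// Collapse queued JSONL rows so only the latest row per hour_start survives,
// mirroring the Map-based dedup added to readBatch in the diff.
function collapseBuckets(lines) {
  const bucketMap = new Map();
  for (const line of lines) {
    if (!line.trim()) continue;
    let bucket;
    try {
      bucket = JSON.parse(line);
    } catch (_e) {
      continue; // skip corrupt rows, as the diff does
    }
    if (typeof bucket?.hour_start !== 'string') continue;
    bucketMap.set(bucket.hour_start, bucket); // later rows overwrite earlier ones
  }
  return Array.from(bucketMap.values());
}

const queued = [
  '{"hour_start":"2024-05-01T10:00:00.000Z","total_tokens":10}',
  'not json',
  '{"hour_start":"2024-05-01T10:00:00.000Z","total_tokens":25}'
];
console.log(collapseBuckets(queued)); // one bucket, with total_tokens 25
```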
@@ -35,13 +35,13 @@ async function issueDeviceToken({ baseUrl, accessToken, deviceName, platform = '
    return { token, deviceId };
  }
 
- async function ingestEvents({ baseUrl, deviceToken, events }) {
+ async function ingestHourly({ baseUrl, deviceToken, hourly }) {
    const data = await invokeFunctionWithRetry({
      baseUrl,
      accessToken: deviceToken,
      slug: 'vibescore-ingest',
      method: 'POST',
-     body: { events },
+     body: { hourly },
      errorPrefix: 'Ingest failed',
      retry: { maxRetries: 3, baseDelayMs: 500, maxDelayMs: 5000 }
    });
@@ -52,10 +52,28 @@ async function ingestEvents({ baseUrl, deviceToken, events }) {
    };
  }
 
+ async function syncHeartbeat({ baseUrl, deviceToken }) {
+   const data = await invokeFunction({
+     baseUrl,
+     accessToken: deviceToken,
+     slug: 'vibescore-sync-ping',
+     method: 'POST',
+     body: {},
+     errorPrefix: 'Sync heartbeat failed'
+   });
+
+   return {
+     updated: Boolean(data?.updated),
+     last_sync_at: typeof data?.last_sync_at === 'string' ? data.last_sync_at : null,
+     min_interval_minutes: Number(data?.min_interval_minutes || 0)
+   };
+ }
+
  module.exports = {
    signInWithPassword,
    issueDeviceToken,
-   ingestEvents
+   ingestHourly,
+   syncHeartbeat
  };
 
  async function invokeFunction({ baseUrl, accessToken, slug, method, body, errorPrefix }) {
@@ -82,17 +100,30 @@ async function invokeFunctionWithRetry({ baseUrl, accessToken, slug, method, bod
  }
 
  function normalizeSdkError(error, errorPrefix) {
-   const raw = error?.message || String(error || 'Unknown error');
+   const raw = extractSdkErrorMessage(error);
    const msg = normalizeBackendErrorMessage(raw);
    const err = new Error(errorPrefix ? `${errorPrefix}: ${msg}` : msg);
    const status = error?.statusCode ?? error?.status;
+   const code = typeof error?.error === 'string' ? error.error.trim() : '';
    if (typeof status === 'number') err.status = status;
+   if (code) err.code = code;
    err.retryable = isRetryableStatus(status) || isRetryableMessage(raw);
    if (msg !== raw) err.originalMessage = raw;
    if (error?.nextActions) err.nextActions = error.nextActions;
    return err;
  }
 
+ function extractSdkErrorMessage(error) {
+   if (!error) return 'Unknown error';
+   const message = typeof error.message === 'string' ? error.message.trim() : '';
+   const code = typeof error.error === 'string' ? error.error.trim() : '';
+   if (message && message !== 'InsForgeError') return message;
+   if (code && code !== 'REQUEST_FAILED') return code;
+   if (message) return message;
+   if (code) return code;
+   return String(error);
+ }
+
  function normalizeBackendErrorMessage(message) {
    if (!isBackendRuntimeDownMessage(message)) return String(message || 'Unknown error');
    return 'Backend runtime unavailable (InsForge). Please retry later.';
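The `extractSdkErrorMessage` helper added above prefers a concrete message over the SDK's generic `InsForgeError` / `REQUEST_FAILED` placeholders, falling back through code and stringification. Copied out of the diff so its precedence can be checked standalone:

```javascript
// Mirror of extractSdkErrorMessage from the diff: prefer a specific message,
// then a specific error code, and only then the generic placeholders.
function extractSdkErrorMessage(error) {
  if (!error) return 'Unknown error';
  const message = typeof error.message === 'string' ? error.message.trim() : '';
  const code = typeof error.error === 'string' ? error.error.trim() : '';
  if (message && message !== 'InsForgeError') return message;
  if (code && code !== 'REQUEST_FAILED') return code;
  if (message) return message;
  if (code) return code;
  return String(error);
}

console.log(extractSdkErrorMessage({ message: 'quota exceeded' }));                         // quota exceeded
console.log(extractSdkErrorMessage({ message: 'InsForgeError', error: 'AUTH' }));           // AUTH
console.log(extractSdkErrorMessage({ message: 'InsForgeError', error: 'REQUEST_FAILED' })); // InsForgeError
console.log(extractSdkErrorMessage(null));                                                  // Unknown error
```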