instar 0.28.76 → 0.28.78

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (448)
  1. package/dashboard/index.html +486 -0
  2. package/dist/cli.js +5 -8
  3. package/dist/cli.js.map +1 -1
  4. package/dist/commands/discovery.d.ts.map +1 -1
  5. package/dist/commands/discovery.js +2 -2
  6. package/dist/commands/discovery.js.map +1 -1
  7. package/dist/commands/init.d.ts.map +1 -1
  8. package/dist/commands/init.js +22 -4
  9. package/dist/commands/init.js.map +1 -1
  10. package/dist/commands/job.d.ts.map +1 -1
  11. package/dist/commands/job.js +2 -2
  12. package/dist/commands/job.js.map +1 -1
  13. package/dist/commands/ledgerCleanup.d.ts.map +1 -1
  14. package/dist/commands/ledgerCleanup.js +2 -2
  15. package/dist/commands/ledgerCleanup.js.map +1 -1
  16. package/dist/commands/listener.d.ts.map +1 -1
  17. package/dist/commands/listener.js +7 -12
  18. package/dist/commands/listener.js.map +1 -1
  19. package/dist/commands/nuke.d.ts.map +1 -1
  20. package/dist/commands/nuke.js +11 -21
  21. package/dist/commands/nuke.js.map +1 -1
  22. package/dist/commands/server.d.ts.map +1 -1
  23. package/dist/commands/server.js +79 -5
  24. package/dist/commands/server.js.map +1 -1
  25. package/dist/commands/setup.d.ts.map +1 -1
  26. package/dist/commands/setup.js +11 -15
  27. package/dist/commands/setup.js.map +1 -1
  28. package/dist/commands/slack-cli.d.ts.map +1 -1
  29. package/dist/commands/slack-cli.js +5 -8
  30. package/dist/commands/slack-cli.js.map +1 -1
  31. package/dist/commands/whatsapp.d.ts.map +1 -1
  32. package/dist/commands/whatsapp.js +2 -2
  33. package/dist/commands/whatsapp.js.map +1 -1
  34. package/dist/commands/worktree.d.ts.map +1 -1
  35. package/dist/commands/worktree.js +2 -2
  36. package/dist/commands/worktree.js.map +1 -1
  37. package/dist/core/AgentConnector.d.ts.map +1 -1
  38. package/dist/core/AgentConnector.js +9 -10
  39. package/dist/core/AgentConnector.js.map +1 -1
  40. package/dist/core/AgentRegistry.d.ts.map +1 -1
  41. package/dist/core/AgentRegistry.js +3 -4
  42. package/dist/core/AgentRegistry.js.map +1 -1
  43. package/dist/core/AutoDispatcher.d.ts.map +1 -1
  44. package/dist/core/AutoDispatcher.js +2 -2
  45. package/dist/core/AutoDispatcher.js.map +1 -1
  46. package/dist/core/AutoUpdater.d.ts.map +1 -1
  47. package/dist/core/AutoUpdater.js +2 -2
  48. package/dist/core/AutoUpdater.js.map +1 -1
  49. package/dist/core/AutonomousEvolution.d.ts.map +1 -1
  50. package/dist/core/AutonomousEvolution.js +2 -2
  51. package/dist/core/AutonomousEvolution.js.map +1 -1
  52. package/dist/core/BackupManager.d.ts.map +1 -1
  53. package/dist/core/BackupManager.js +2 -2
  54. package/dist/core/BackupManager.js.map +1 -1
  55. package/dist/core/BranchManager.d.ts.map +1 -1
  56. package/dist/core/BranchManager.js +3 -3
  57. package/dist/core/BranchManager.js.map +1 -1
  58. package/dist/core/CaffeinateManager.d.ts.map +1 -1
  59. package/dist/core/CaffeinateManager.js +2 -2
  60. package/dist/core/CaffeinateManager.js.map +1 -1
  61. package/dist/core/DeferredDispatchTracker.d.ts.map +1 -1
  62. package/dist/core/DeferredDispatchTracker.js +2 -2
  63. package/dist/core/DeferredDispatchTracker.js.map +1 -1
  64. package/dist/core/DispatchManager.d.ts.map +1 -1
  65. package/dist/core/DispatchManager.js +3 -4
  66. package/dist/core/DispatchManager.js.map +1 -1
  67. package/dist/core/EvolutionManager.d.ts.map +1 -1
  68. package/dist/core/EvolutionManager.js +2 -2
  69. package/dist/core/EvolutionManager.js.map +1 -1
  70. package/dist/core/ExecutionJournal.d.ts.map +1 -1
  71. package/dist/core/ExecutionJournal.js +2 -2
  72. package/dist/core/ExecutionJournal.js.map +1 -1
  73. package/dist/core/FeedbackManager.d.ts.map +1 -1
  74. package/dist/core/FeedbackManager.js +2 -2
  75. package/dist/core/FeedbackManager.js.map +1 -1
  76. package/dist/core/FileClassifier.d.ts.map +1 -1
  77. package/dist/core/FileClassifier.js +8 -17
  78. package/dist/core/FileClassifier.js.map +1 -1
  79. package/dist/core/ForegroundRestartWatcher.d.ts.map +1 -1
  80. package/dist/core/ForegroundRestartWatcher.js +3 -4
  81. package/dist/core/ForegroundRestartWatcher.js.map +1 -1
  82. package/dist/core/GitStateManager.d.ts.map +1 -1
  83. package/dist/core/GitStateManager.js +3 -12
  84. package/dist/core/GitStateManager.js.map +1 -1
  85. package/dist/core/GitSync.d.ts.map +1 -1
  86. package/dist/core/GitSync.js +6 -6
  87. package/dist/core/GitSync.js.map +1 -1
  88. package/dist/core/GlobalInstallCleanup.d.ts.map +1 -1
  89. package/dist/core/GlobalInstallCleanup.js +3 -4
  90. package/dist/core/GlobalInstallCleanup.js.map +1 -1
  91. package/dist/core/GlobalSecretStore.d.ts.map +1 -1
  92. package/dist/core/GlobalSecretStore.js +3 -4
  93. package/dist/core/GlobalSecretStore.js.map +1 -1
  94. package/dist/core/HandoffManager.d.ts.map +1 -1
  95. package/dist/core/HandoffManager.js +5 -5
  96. package/dist/core/HandoffManager.js.map +1 -1
  97. package/dist/core/JargonDetector.d.ts +28 -0
  98. package/dist/core/JargonDetector.d.ts.map +1 -0
  99. package/dist/core/JargonDetector.js +59 -0
  100. package/dist/core/JargonDetector.js.map +1 -0
  101. package/dist/core/LedgerSessionRegistry.d.ts.map +1 -1
  102. package/dist/core/LedgerSessionRegistry.js +2 -2
  103. package/dist/core/LedgerSessionRegistry.js.map +1 -1
  104. package/dist/core/MachineIdentity.d.ts.map +1 -1
  105. package/dist/core/MachineIdentity.js +2 -2
  106. package/dist/core/MachineIdentity.js.map +1 -1
  107. package/dist/core/MessagingToneGate.d.ts +42 -5
  108. package/dist/core/MessagingToneGate.d.ts.map +1 -1
  109. package/dist/core/MessagingToneGate.js +40 -6
  110. package/dist/core/MessagingToneGate.js.map +1 -1
  111. package/dist/core/ParallelDevWiring.d.ts.map +1 -1
  112. package/dist/core/ParallelDevWiring.js +3 -6
  113. package/dist/core/ParallelDevWiring.js.map +1 -1
  114. package/dist/core/PostUpdateMigrator.d.ts +26 -0
  115. package/dist/core/PostUpdateMigrator.d.ts.map +1 -1
  116. package/dist/core/PostUpdateMigrator.js +249 -46
  117. package/dist/core/PostUpdateMigrator.js.map +1 -1
  118. package/dist/core/ProjectMapper.d.ts.map +1 -1
  119. package/dist/core/ProjectMapper.js +5 -11
  120. package/dist/core/ProjectMapper.js.map +1 -1
  121. package/dist/core/RelationshipManager.d.ts.map +1 -1
  122. package/dist/core/RelationshipManager.js +4 -5
  123. package/dist/core/RelationshipManager.js.map +1 -1
  124. package/dist/core/SafeGitExecutor.d.ts +11 -5
  125. package/dist/core/SafeGitExecutor.d.ts.map +1 -1
  126. package/dist/core/SafeGitExecutor.js +87 -1
  127. package/dist/core/SafeGitExecutor.js.map +1 -1
  128. package/dist/core/ScopeVerifier.d.ts.map +1 -1
  129. package/dist/core/ScopeVerifier.js +3 -6
  130. package/dist/core/ScopeVerifier.js.map +1 -1
  131. package/dist/core/SecretStore.d.ts.map +1 -1
  132. package/dist/core/SecretStore.js +2 -2
  133. package/dist/core/SecretStore.js.map +1 -1
  134. package/dist/core/SharedStateLedger.d.ts.map +1 -1
  135. package/dist/core/SharedStateLedger.js +2 -2
  136. package/dist/core/SharedStateLedger.js.map +1 -1
  137. package/dist/core/SoulManager.d.ts.map +1 -1
  138. package/dist/core/SoulManager.js +3 -4
  139. package/dist/core/SoulManager.js.map +1 -1
  140. package/dist/core/StateManager.d.ts.map +1 -1
  141. package/dist/core/StateManager.js +4 -6
  142. package/dist/core/StateManager.js.map +1 -1
  143. package/dist/core/SyncOrchestrator.d.ts.map +1 -1
  144. package/dist/core/SyncOrchestrator.js +6 -7
  145. package/dist/core/SyncOrchestrator.js.map +1 -1
  146. package/dist/core/UpdateChecker.d.ts.map +1 -1
  147. package/dist/core/UpdateChecker.js +3 -4
  148. package/dist/core/UpdateChecker.js.map +1 -1
  149. package/dist/core/UpgradeGuideProcessor.d.ts.map +1 -1
  150. package/dist/core/UpgradeGuideProcessor.js +3 -4
  151. package/dist/core/UpgradeGuideProcessor.js.map +1 -1
  152. package/dist/core/WorktreeManager.d.ts.map +1 -1
  153. package/dist/core/WorktreeManager.js +9 -14
  154. package/dist/core/WorktreeManager.js.map +1 -1
  155. package/dist/knowledge/KnowledgeManager.d.ts.map +1 -1
  156. package/dist/knowledge/KnowledgeManager.js +2 -2
  157. package/dist/knowledge/KnowledgeManager.js.map +1 -1
  158. package/dist/lifeline/ServerSupervisor.d.ts +28 -0
  159. package/dist/lifeline/ServerSupervisor.d.ts.map +1 -1
  160. package/dist/lifeline/ServerSupervisor.js +171 -73
  161. package/dist/lifeline/ServerSupervisor.js.map +1 -1
  162. package/dist/lifeline/TelegramLifeline.d.ts.map +1 -1
  163. package/dist/lifeline/TelegramLifeline.js +10 -4
  164. package/dist/lifeline/TelegramLifeline.js.map +1 -1
  165. package/dist/lifeline/detectLaunchdSupervised.d.ts +43 -0
  166. package/dist/lifeline/detectLaunchdSupervised.d.ts.map +1 -0
  167. package/dist/lifeline/detectLaunchdSupervised.js +106 -0
  168. package/dist/lifeline/detectLaunchdSupervised.js.map +1 -0
  169. package/dist/lifeline/droppedMessages.d.ts.map +1 -1
  170. package/dist/lifeline/droppedMessages.js +2 -2
  171. package/dist/lifeline/droppedMessages.js.map +1 -1
  172. package/dist/memory/EpisodicMemory.d.ts.map +1 -1
  173. package/dist/memory/EpisodicMemory.js +2 -2
  174. package/dist/memory/EpisodicMemory.js.map +1 -1
  175. package/dist/memory/TopicMemory.d.ts.map +1 -1
  176. package/dist/memory/TopicMemory.js +5 -8
  177. package/dist/memory/TopicMemory.js.map +1 -1
  178. package/dist/messaging/AgentTokenManager.d.ts.map +1 -1
  179. package/dist/messaging/AgentTokenManager.js +2 -2
  180. package/dist/messaging/AgentTokenManager.js.map +1 -1
  181. package/dist/messaging/DropPickup.d.ts.map +1 -1
  182. package/dist/messaging/DropPickup.js +2 -2
  183. package/dist/messaging/DropPickup.js.map +1 -1
  184. package/dist/messaging/GitSyncTransport.d.ts.map +1 -1
  185. package/dist/messaging/GitSyncTransport.js +4 -6
  186. package/dist/messaging/GitSyncTransport.js.map +1 -1
  187. package/dist/messaging/MessageStore.d.ts.map +1 -1
  188. package/dist/messaging/MessageStore.js +3 -4
  189. package/dist/messaging/MessageStore.js.map +1 -1
  190. package/dist/messaging/TelegramAdapter.d.ts.map +1 -1
  191. package/dist/messaging/TelegramAdapter.js +5 -8
  192. package/dist/messaging/TelegramAdapter.js.map +1 -1
  193. package/dist/messaging/backends/BaileysBackend.d.ts.map +1 -1
  194. package/dist/messaging/backends/BaileysBackend.js +3 -4
  195. package/dist/messaging/backends/BaileysBackend.js.map +1 -1
  196. package/dist/messaging/local-tone-check.d.ts +61 -0
  197. package/dist/messaging/local-tone-check.d.ts.map +1 -0
  198. package/dist/messaging/local-tone-check.js +78 -0
  199. package/dist/messaging/local-tone-check.js.map +1 -0
  200. package/dist/messaging/pending-relay-store.d.ts +153 -0
  201. package/dist/messaging/pending-relay-store.d.ts.map +1 -0
  202. package/dist/messaging/pending-relay-store.js +351 -0
  203. package/dist/messaging/pending-relay-store.js.map +1 -0
  204. package/dist/messaging/secret-patterns.d.ts +35 -0
  205. package/dist/messaging/secret-patterns.d.ts.map +1 -0
  206. package/dist/messaging/secret-patterns.js +70 -0
  207. package/dist/messaging/secret-patterns.js.map +1 -0
  208. package/dist/messaging/shared/EncryptedAuthStore.d.ts.map +1 -1
  209. package/dist/messaging/shared/EncryptedAuthStore.js +3 -4
  210. package/dist/messaging/shared/EncryptedAuthStore.js.map +1 -1
  211. package/dist/messaging/shared/MessageLogger.d.ts.map +1 -1
  212. package/dist/messaging/shared/MessageLogger.js +2 -2
  213. package/dist/messaging/shared/MessageLogger.js.map +1 -1
  214. package/dist/messaging/shared/PrivacyConsent.d.ts.map +1 -1
  215. package/dist/messaging/shared/PrivacyConsent.js +2 -2
  216. package/dist/messaging/shared/PrivacyConsent.js.map +1 -1
  217. package/dist/messaging/shared/SessionChannelRegistry.d.ts.map +1 -1
  218. package/dist/messaging/shared/SessionChannelRegistry.js +2 -2
  219. package/dist/messaging/shared/SessionChannelRegistry.js.map +1 -1
  220. package/dist/messaging/system-templates.d.ts +87 -0
  221. package/dist/messaging/system-templates.d.ts.map +1 -0
  222. package/dist/messaging/system-templates.js +236 -0
  223. package/dist/messaging/system-templates.js.map +1 -0
  224. package/dist/messaging/whoami-cache.d.ts +66 -0
  225. package/dist/messaging/whoami-cache.d.ts.map +1 -0
  226. package/dist/messaging/whoami-cache.js +149 -0
  227. package/dist/messaging/whoami-cache.js.map +1 -0
  228. package/dist/moltbridge/ProfileCompiler.d.ts.map +1 -1
  229. package/dist/moltbridge/ProfileCompiler.js +13 -7
  230. package/dist/moltbridge/ProfileCompiler.js.map +1 -1
  231. package/dist/monitoring/CommitmentTracker.d.ts.map +1 -1
  232. package/dist/monitoring/CommitmentTracker.js +2 -2
  233. package/dist/monitoring/CommitmentTracker.js.map +1 -1
  234. package/dist/monitoring/CredentialProvider.d.ts.map +1 -1
  235. package/dist/monitoring/CredentialProvider.js +2 -2
  236. package/dist/monitoring/CredentialProvider.js.map +1 -1
  237. package/dist/monitoring/DegradationReporter.d.ts +41 -0
  238. package/dist/monitoring/DegradationReporter.d.ts.map +1 -1
  239. package/dist/monitoring/DegradationReporter.js +96 -4
  240. package/dist/monitoring/DegradationReporter.js.map +1 -1
  241. package/dist/monitoring/HealthChecker.d.ts.map +1 -1
  242. package/dist/monitoring/HealthChecker.js +2 -2
  243. package/dist/monitoring/HealthChecker.js.map +1 -1
  244. package/dist/monitoring/HookEventReceiver.d.ts.map +1 -1
  245. package/dist/monitoring/HookEventReceiver.js +2 -2
  246. package/dist/monitoring/HookEventReceiver.js.map +1 -1
  247. package/dist/monitoring/InstructionsVerifier.d.ts.map +1 -1
  248. package/dist/monitoring/InstructionsVerifier.js +2 -2
  249. package/dist/monitoring/InstructionsVerifier.js.map +1 -1
  250. package/dist/monitoring/PresenceProxy.d.ts.map +1 -1
  251. package/dist/monitoring/PresenceProxy.js +5 -8
  252. package/dist/monitoring/PresenceProxy.js.map +1 -1
  253. package/dist/monitoring/QuotaTracker.d.ts.map +1 -1
  254. package/dist/monitoring/QuotaTracker.js +2 -2
  255. package/dist/monitoring/QuotaTracker.js.map +1 -1
  256. package/dist/monitoring/SessionMigrator.d.ts.map +1 -1
  257. package/dist/monitoring/SessionMigrator.js +2 -2
  258. package/dist/monitoring/SessionMigrator.js.map +1 -1
  259. package/dist/monitoring/SessionRecovery.d.ts.map +1 -1
  260. package/dist/monitoring/SessionRecovery.js +2 -2
  261. package/dist/monitoring/SessionRecovery.js.map +1 -1
  262. package/dist/monitoring/TelemetryAuth.d.ts.map +1 -1
  263. package/dist/monitoring/TelemetryAuth.js +3 -4
  264. package/dist/monitoring/TelemetryAuth.js.map +1 -1
  265. package/dist/monitoring/TokenLedger.d.ts +130 -0
  266. package/dist/monitoring/TokenLedger.d.ts.map +1 -0
  267. package/dist/monitoring/TokenLedger.js +523 -0
  268. package/dist/monitoring/TokenLedger.js.map +1 -0
  269. package/dist/monitoring/TokenLedgerPoller.d.ts +26 -0
  270. package/dist/monitoring/TokenLedgerPoller.d.ts.map +1 -0
  271. package/dist/monitoring/TokenLedgerPoller.js +44 -0
  272. package/dist/monitoring/TokenLedgerPoller.js.map +1 -0
  273. package/dist/monitoring/TriageOrchestrator.d.ts.map +1 -1
  274. package/dist/monitoring/TriageOrchestrator.js +3 -4
  275. package/dist/monitoring/TriageOrchestrator.js.map +1 -1
  276. package/dist/monitoring/WorktreeReaper.d.ts.map +1 -1
  277. package/dist/monitoring/WorktreeReaper.js +5 -7
  278. package/dist/monitoring/WorktreeReaper.js.map +1 -1
  279. package/dist/monitoring/delivery-failure-sentinel/recovery-policy.d.ts +83 -0
  280. package/dist/monitoring/delivery-failure-sentinel/recovery-policy.d.ts.map +1 -0
  281. package/dist/monitoring/delivery-failure-sentinel/recovery-policy.js +218 -0
  282. package/dist/monitoring/delivery-failure-sentinel/recovery-policy.js.map +1 -0
  283. package/dist/monitoring/delivery-failure-sentinel.d.ts +177 -0
  284. package/dist/monitoring/delivery-failure-sentinel.d.ts.map +1 -0
  285. package/dist/monitoring/delivery-failure-sentinel.js +598 -0
  286. package/dist/monitoring/delivery-failure-sentinel.js.map +1 -0
  287. package/dist/monitoring/probes/PlatformProbe.d.ts.map +1 -1
  288. package/dist/monitoring/probes/PlatformProbe.js +3 -4
  289. package/dist/monitoring/probes/PlatformProbe.js.map +1 -1
  290. package/dist/monitoring/templates-drift-verifier.d.ts +109 -0
  291. package/dist/monitoring/templates-drift-verifier.d.ts.map +1 -0
  292. package/dist/monitoring/templates-drift-verifier.js +324 -0
  293. package/dist/monitoring/templates-drift-verifier.js.map +1 -0
  294. package/dist/paste/PasteManager.d.ts.map +1 -1
  295. package/dist/paste/PasteManager.js +5 -8
  296. package/dist/paste/PasteManager.js.map +1 -1
  297. package/dist/publishing/PrivateViewer.d.ts.map +1 -1
  298. package/dist/publishing/PrivateViewer.js +2 -2
  299. package/dist/publishing/PrivateViewer.js.map +1 -1
  300. package/dist/scheduler/JobScheduler.d.ts.map +1 -1
  301. package/dist/scheduler/JobScheduler.js +2 -2
  302. package/dist/scheduler/JobScheduler.js.map +1 -1
  303. package/dist/server/AgentServer.d.ts +22 -0
  304. package/dist/server/AgentServer.d.ts.map +1 -1
  305. package/dist/server/AgentServer.js +199 -1
  306. package/dist/server/AgentServer.js.map +1 -1
  307. package/dist/server/WebSocketManager.d.ts +11 -0
  308. package/dist/server/WebSocketManager.d.ts.map +1 -1
  309. package/dist/server/WebSocketManager.js +28 -0
  310. package/dist/server/WebSocketManager.js.map +1 -1
  311. package/dist/server/boot-id.d.ts +58 -0
  312. package/dist/server/boot-id.d.ts.map +1 -0
  313. package/dist/server/boot-id.js +121 -0
  314. package/dist/server/boot-id.js.map +1 -0
  315. package/dist/server/middleware.d.ts +14 -1
  316. package/dist/server/middleware.d.ts.map +1 -1
  317. package/dist/server/middleware.js +81 -1
  318. package/dist/server/middleware.js.map +1 -1
  319. package/dist/server/routes.d.ts +76 -0
  320. package/dist/server/routes.d.ts.map +1 -1
  321. package/dist/server/routes.js +626 -11
  322. package/dist/server/routes.js.map +1 -1
  323. package/dist/threadline/AgentDiscovery.d.ts.map +1 -1
  324. package/dist/threadline/AgentDiscovery.js +2 -2
  325. package/dist/threadline/AgentDiscovery.js.map +1 -1
  326. package/dist/threadline/AgentTrustManager.d.ts.map +1 -1
  327. package/dist/threadline/AgentTrustManager.js +2 -2
  328. package/dist/threadline/AgentTrustManager.js.map +1 -1
  329. package/dist/threadline/BackfillCore.d.ts +70 -0
  330. package/dist/threadline/BackfillCore.d.ts.map +1 -0
  331. package/dist/threadline/BackfillCore.js +117 -0
  332. package/dist/threadline/BackfillCore.js.map +1 -0
  333. package/dist/threadline/CircuitBreaker.d.ts.map +1 -1
  334. package/dist/threadline/CircuitBreaker.js +2 -2
  335. package/dist/threadline/CircuitBreaker.js.map +1 -1
  336. package/dist/threadline/ComputeMeter.d.ts.map +1 -1
  337. package/dist/threadline/ComputeMeter.js +2 -2
  338. package/dist/threadline/ComputeMeter.js.map +1 -1
  339. package/dist/threadline/ContextThreadMap.d.ts.map +1 -1
  340. package/dist/threadline/ContextThreadMap.js +2 -2
  341. package/dist/threadline/ContextThreadMap.js.map +1 -1
  342. package/dist/threadline/HeartbeatWatchdog.d.ts +78 -0
  343. package/dist/threadline/HeartbeatWatchdog.d.ts.map +1 -0
  344. package/dist/threadline/HeartbeatWatchdog.js +212 -0
  345. package/dist/threadline/HeartbeatWatchdog.js.map +1 -0
  346. package/dist/threadline/HeartbeatWriter.d.ts +79 -0
  347. package/dist/threadline/HeartbeatWriter.d.ts.map +1 -0
  348. package/dist/threadline/HeartbeatWriter.js +109 -0
  349. package/dist/threadline/HeartbeatWriter.js.map +1 -0
  350. package/dist/threadline/InvitationManager.d.ts.map +1 -1
  351. package/dist/threadline/InvitationManager.js +2 -2
  352. package/dist/threadline/InvitationManager.js.map +1 -1
  353. package/dist/threadline/ListenerSessionManager.d.ts +59 -0
  354. package/dist/threadline/ListenerSessionManager.d.ts.map +1 -1
  355. package/dist/threadline/ListenerSessionManager.js +79 -0
  356. package/dist/threadline/ListenerSessionManager.js.map +1 -1
  357. package/dist/threadline/MCPAuth.d.ts.map +1 -1
  358. package/dist/threadline/MCPAuth.js +2 -2
  359. package/dist/threadline/MCPAuth.js.map +1 -1
  360. package/dist/threadline/PipeSessionSpawner.d.ts.map +1 -1
  361. package/dist/threadline/PipeSessionSpawner.js +3 -4
  362. package/dist/threadline/PipeSessionSpawner.js.map +1 -1
  363. package/dist/threadline/RateLimiter.d.ts.map +1 -1
  364. package/dist/threadline/RateLimiter.js +2 -2
  365. package/dist/threadline/RateLimiter.js.map +1 -1
  366. package/dist/threadline/RelaySpawnFailureHandler.d.ts +53 -0
  367. package/dist/threadline/RelaySpawnFailureHandler.d.ts.map +1 -0
  368. package/dist/threadline/RelaySpawnFailureHandler.js +73 -0
  369. package/dist/threadline/RelaySpawnFailureHandler.js.map +1 -0
  370. package/dist/threadline/SessionLifecycle.d.ts.map +1 -1
  371. package/dist/threadline/SessionLifecycle.js +2 -2
  372. package/dist/threadline/SessionLifecycle.js.map +1 -1
  373. package/dist/threadline/SpawnLedger.d.ts +94 -0
  374. package/dist/threadline/SpawnLedger.d.ts.map +1 -0
  375. package/dist/threadline/SpawnLedger.js +194 -0
  376. package/dist/threadline/SpawnLedger.js.map +1 -0
  377. package/dist/threadline/SpawnNonce.d.ts +49 -0
  378. package/dist/threadline/SpawnNonce.d.ts.map +1 -0
  379. package/dist/threadline/SpawnNonce.js +99 -0
  380. package/dist/threadline/SpawnNonce.js.map +1 -0
  381. package/dist/threadline/TelegramBridge.d.ts +140 -0
  382. package/dist/threadline/TelegramBridge.d.ts.map +1 -0
  383. package/dist/threadline/TelegramBridge.js +224 -0
  384. package/dist/threadline/TelegramBridge.js.map +1 -0
  385. package/dist/threadline/TelegramBridgeConfig.d.ts +79 -0
  386. package/dist/threadline/TelegramBridgeConfig.d.ts.map +1 -0
  387. package/dist/threadline/TelegramBridgeConfig.js +168 -0
  388. package/dist/threadline/TelegramBridgeConfig.js.map +1 -0
  389. package/dist/threadline/ThreadlineBootstrap.d.ts.map +1 -1
  390. package/dist/threadline/ThreadlineBootstrap.js +2 -2
  391. package/dist/threadline/ThreadlineBootstrap.js.map +1 -1
  392. package/dist/threadline/ThreadlineMCPServer.d.ts.map +1 -1
  393. package/dist/threadline/ThreadlineMCPServer.js +5 -0
  394. package/dist/threadline/ThreadlineMCPServer.js.map +1 -1
  395. package/dist/threadline/ThreadlineObservability.d.ts +95 -0
  396. package/dist/threadline/ThreadlineObservability.d.ts.map +1 -0
  397. package/dist/threadline/ThreadlineObservability.js +310 -0
  398. package/dist/threadline/ThreadlineObservability.js.map +1 -0
  399. package/dist/threadline/WakeSocketServer.d.ts.map +1 -1
  400. package/dist/threadline/WakeSocketServer.js +3 -4
  401. package/dist/threadline/WakeSocketServer.js.map +1 -1
  402. package/dist/threadline/listener-daemon.d.ts.map +1 -1
  403. package/dist/threadline/listener-daemon.js +3 -4
  404. package/dist/threadline/listener-daemon.js.map +1 -1
  405. package/dist/users/UserManager.d.ts.map +1 -1
  406. package/dist/users/UserManager.js +2 -2
  407. package/dist/users/UserManager.js.map +1 -1
  408. package/dist/users/UserOnboarding.d.ts.map +1 -1
  409. package/dist/users/UserOnboarding.js +2 -2
  410. package/dist/users/UserOnboarding.js.map +1 -1
  411. package/dist/utils/jsonl-rotation.d.ts.map +1 -1
  412. package/dist/utils/jsonl-rotation.js +2 -2
  413. package/dist/utils/jsonl-rotation.js.map +1 -1
  414. package/package.json +1 -1
  415. package/scripts/analyze-release.js +7 -12
  416. package/scripts/check-contract-evidence.js +27 -10
  417. package/scripts/fix-better-sqlite3.cjs +0 -2
  418. package/scripts/instar-dev-precommit.js +0 -2
  419. package/scripts/lint-no-direct-destructive.js +24 -4
  420. package/scripts/lint-template-sha-history.ts +183 -0
  421. package/scripts/migrate-incident-2026-04-17.mjs +2 -2
  422. package/scripts/run-migration.js +500 -0
  423. package/scripts/test-bootstrap-relay.mjs +2 -2
  424. package/scripts/threadline-bridge-backfill.mjs +379 -0
  425. package/scripts/verify-deployed-templates.ts +87 -0
  426. package/src/data/builtin-manifest.json +140 -132
  427. package/src/templates/scripts/git-sync-gate.sh +0 -4
  428. package/src/templates/scripts/telegram-reply.sh +318 -13
  429. package/upgrades/0.28.77.md +133 -0
  430. package/upgrades/0.28.78.md +90 -0
  431. package/upgrades/side-effects/agent-health-alert-authority-routing.md +121 -0
  432. package/upgrades/side-effects/comprehensive-destructive-tool-containment-migration.md +82 -0
  433. package/upgrades/side-effects/deferral-detector-orphan-todo.md +101 -0
  434. package/upgrades/side-effects/lifeline-self-heal-hardening.md +151 -0
  435. package/upgrades/side-effects/relay-spawn-ghost-reply-phase1.md +139 -0
  436. package/upgrades/side-effects/telegram-delivery-robustness-layer-2.md +320 -0
  437. package/upgrades/side-effects/telegram-delivery-robustness-layer-3.md +202 -0
  438. package/upgrades/side-effects/telegram-delivery-robustness-layer-7.md +339 -0
  439. package/upgrades/side-effects/telegram-delivery-robustness.md +178 -0
  440. package/upgrades/side-effects/threadline-bridge-backfill.md +203 -0
  441. package/upgrades/side-effects/threadline-canonical-inbox-write.md +218 -0
  442. package/upgrades/side-effects/threadline-observability-tab.md +206 -0
  443. package/upgrades/side-effects/threadline-tg-bridge-module.md +196 -0
  444. package/upgrades/side-effects/threadline-tg-bridge-settings-surface.md +208 -0
  445. package/upgrades/side-effects/token-ledger-bounded-scan.md +230 -0
  446. package/upgrades/side-effects/token-ledger-phase1.md +123 -0
  447. package/upgrades/NEXT.md +0 -53
  448. /package/upgrades/side-effects/{telegram-lifeline-version-missing-info.md → 0.28.76.md} +0 -0
@@ -29,7 +29,6 @@ fi
 LOCAL_CHANGES=$(git status --porcelain 2>/dev/null | head -1)
 
 # Fetch remote (silent, with timeout)
-// safe-git-allow: incremental-migration
 git fetch origin --quiet 2>/dev/null &
 FETCH_PID=$!
 sleep 5 && kill "$FETCH_PID" 2>/dev/null &
@@ -56,11 +55,8 @@ fi
 # If both sides have changes, check for potential conflicts
 if [ -n "$LOCAL_CHANGES" ] && [ "${BEHIND:-0}" -gt 0 ]; then
   # Stash local changes temporarily and try a dry-run merge
-  // safe-git-allow: incremental-migration
   git stash --quiet 2>/dev/null
-  // safe-git-allow: incremental-migration
   MERGE_OUTPUT=$(git merge-tree "$(git merge-base HEAD "$TRACKING_BRANCH")" HEAD "$TRACKING_BRANCH" 2>/dev/null)
-  // safe-git-allow: incremental-migration
   git stash pop --quiet 2>/dev/null
 
   if echo "$MERGE_OUTPUT" | grep -q "<<<<<<"; then
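The conflict probe in this hunk relies on the three-argument form of `git merge-tree`, which prints the merged result (conflict markers included) without touching the worktree. A self-contained sketch of the same pattern, using a hypothetical throwaway repo rather than anything from the package:

```shell
# Build a tiny repo where both branches edit the same line of f.txt,
# then run the same dry-run probe git-sync-gate.sh uses.
tmp=$(mktemp -d)
cd "$tmp" || exit 1
git init -q
echo base > f.txt && git add f.txt
git -c user.email=a@b -c user.name=t commit -q -m base
echo one > f.txt && git add f.txt
git -c user.email=a@b -c user.name=t commit -q -m ours
git checkout -q -b right HEAD~1            # branch "right" from base
echo two > f.txt && git add f.txt
git -c user.email=a@b -c user.name=t commit -q -m theirs
git checkout -q -                          # back to the original branch

# Three-argument merge-tree: <merge-base> <ours> <theirs>. Nothing is
# written to the worktree or index; conflicts show up as markers on stdout.
MERGE_OUTPUT=$(git merge-tree "$(git merge-base HEAD right)" HEAD right)
echo "$MERGE_OUTPUT" | grep -q "<<<<<<" && echo "conflict detected"
```

This is why the surrounding stash/pop pair is safe: the merge itself never mutates state, so the stash only exists to give `merge-tree` a clean HEAD to compare.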
@@ -15,7 +15,17 @@
 # ('html' is reserved for trusted internal callers.)
 # When absent, the server's configured default applies.
 #
-# Reads INSTAR_PORT from environment (default: 4040).
+# Port resolution (in order):
+#   1. INSTAR_PORT environment variable (explicit operator override).
+#   2. `port` field in .instar/config.json (the canonical agent-local truth).
+#   3. Hardcoded fallback to 4040, with a stderr warning. This is the path
+#      that historically caused cross-tenant misroutes on multi-agent hosts.
+#
+# Auth:
+#   Sends `Authorization: Bearer <authToken>` AND `X-Instar-AgentId: <projectName>`
+#   (both read from .instar/config.json). The agent-id header lets the server
+#   reject auth-bearing requests that hit the wrong agent's port BEFORE token
+#   comparison — a token sent to the wrong server is structurally inert.
 
 FORMAT=""
 
@@ -64,12 +74,35 @@ if [ -z "$MSG" ]; then
   exit 1
 fi
 
-PORT="${INSTAR_PORT:-4040}"
-
-# Read auth token from config (if present)
+# Resolve config-derived values from .instar/config.json (single python3
+# invocation). Env > config > 4040-warn for port; config-only for authToken
+# and agentId.
 AUTH_TOKEN=""
+AGENT_ID=""
+CONFIG_PORT=""
 if [ -f ".instar/config.json" ]; then
-  AUTH_TOKEN=$(python3 -c "import json; print(json.load(open('.instar/config.json')).get('authToken',''))" 2>/dev/null)
+  CONFIG_VALUES=$(python3 -c "
+import json, sys
+try:
+    c = json.load(open('.instar/config.json'))
+except Exception:
+    sys.exit(0)
+print(c.get('authToken', ''))
+print(c.get('projectName', ''))
+print(c.get('port', ''))
+" 2>/dev/null)
+  AUTH_TOKEN=$(printf '%s\n' "$CONFIG_VALUES" | sed -n '1p')
+  AGENT_ID=$(printf '%s\n' "$CONFIG_VALUES" | sed -n '2p')
+  CONFIG_PORT=$(printf '%s\n' "$CONFIG_VALUES" | sed -n '3p')
+fi
+
+if [ -n "$INSTAR_PORT" ]; then
+  PORT="$INSTAR_PORT"
+elif [ -n "$CONFIG_PORT" ]; then
+  PORT="$CONFIG_PORT"
+else
+  PORT=4040
+  echo "WARN: telegram-reply.sh — no INSTAR_PORT env and no port in .instar/config.json; falling back to 4040" >&2
 fi
 
 # Build JSON body (text + optional format).
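The env > config > warned-default precedence this hunk implements can be distilled into a standalone function. A sketch only, under the script's own assumptions (python3 available, config at `.instar/config.json`); the function name is illustrative:

```shell
# Resolve the agent port: INSTAR_PORT env wins, then the config file's
# `port` field, then a loudly-warned fallback to 4040.
resolve_port() {
  if [ -n "$INSTAR_PORT" ]; then
    printf '%s\n' "$INSTAR_PORT"
    return 0
  fi
  local cfg_port=""
  if [ -f ".instar/config.json" ]; then
    cfg_port=$(python3 -c "import json; print(json.load(open('.instar/config.json')).get('port', ''))" 2>/dev/null)
  fi
  if [ -n "$cfg_port" ]; then
    printf '%s\n' "$cfg_port"
  else
    # The historically dangerous path on multi-agent hosts: warn on stderr.
    echo "WARN: no INSTAR_PORT and no config port; falling back to 4040" >&2
    printf '4040\n'
  fi
}
```

The design point is that the fallback is never silent: on a multi-agent host, two scripts both defaulting quietly to 4040 is exactly how a reply lands on the wrong agent's server.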
@@ -89,16 +122,20 @@ if [ -z "$JSON_BODY" ]; then
   JSON_BODY="{\"text\":\"${ESCAPED}\"}"
 fi
 
+# Assemble curl args. Always include X-Instar-AgentId when we can resolve it
+# from config — the server uses it to reject wrong-port requests before
+# evaluating the token.
+CURL_ARGS=(-s -w "\n%{http_code}" -X POST "http://localhost:${PORT}/telegram/reply/${TOPIC_ID}"
+  -H 'Content-Type: application/json'
+  -d "$JSON_BODY")
 if [ -n "$AUTH_TOKEN" ]; then
-  RESPONSE=$(curl -s -w "\n%{http_code}" -X POST "http://localhost:${PORT}/telegram/reply/${TOPIC_ID}" \
-    -H 'Content-Type: application/json' \
-    -H "Authorization: Bearer ${AUTH_TOKEN}" \
-    -d "$JSON_BODY")
-else
-  RESPONSE=$(curl -s -w "\n%{http_code}" -X POST "http://localhost:${PORT}/telegram/reply/${TOPIC_ID}" \
-    -H 'Content-Type: application/json' \
-    -d "$JSON_BODY")
+  CURL_ARGS+=(-H "Authorization: Bearer ${AUTH_TOKEN}")
 fi
+if [ -n "$AGENT_ID" ]; then
+  CURL_ARGS+=(-H "X-Instar-AgentId: ${AGENT_ID}")
+fi
+
+RESPONSE=$(curl "${CURL_ARGS[@]}")
 
 HTTP_CODE=$(echo "$RESPONSE" | tail -1)
 BODY=$(echo "$RESPONSE" | sed '$d')
@@ -129,6 +166,274 @@ elif [ "$HTTP_CODE" = "422" ]; then
  echo " Revise the message (remove CLI commands, file paths, config syntax, API endpoints) and retry." >&2
  exit 1
  else
+   # Recoverable-class detection (spec § Layer 2b).
+   #
+   # The classification table below is the entire decision matrix. Codes
+   # marked recoverable get enqueued in the per-agent SQLite queue and a
+   # best-effort POST /events/delivery-failed is sent so the in-process
+   # Layer 3 sentinel can react in <1s. Anything not in the recoverable
+   # set is terminal (default-deny on unknown 403s, per spec round-2
+   # resolution).
+   #
+   # Recoverable:
+   #   - 5xx, conn-refused (HTTP_CODE=000), DNS failure (also 000)
+   #   - 403 with structured `agent_id_mismatch`
+   #   - 403 with structured `rate_limited` (sentinel honors Retry-After)
+   # NOT recoverable here (already handled above or terminal):
+   #   - 200 (success), 408 (ambiguous), 422 (tone gate)
+   #   - 400, 403/revoked, 403 unstructured
+   RECOVERABLE=0
+   if [ "$HTTP_CODE" = "000" ] || \
+      ( [ "$HTTP_CODE" -ge 500 ] 2>/dev/null && [ "$HTTP_CODE" -le 599 ] 2>/dev/null ); then
+     RECOVERABLE=1
+   elif [ "$HTTP_CODE" = "403" ]; then
+     # Inspect the structured error code in the body. Unstructured 403 is
+     # default-deny per spec § 2b.
+     ERROR_CODE=$(echo "$BODY" | python3 -c 'import sys,json
+ try:
+     print(json.load(sys.stdin).get("error",""))
+ except Exception:
+     print("")' 2>/dev/null)
+     case "$ERROR_CODE" in
+       agent_id_mismatch|rate_limited)
+         RECOVERABLE=1
+         ;;
+       *)
+         RECOVERABLE=0
+         ;;
+     esac
+   fi
+
+   if [ "$RECOVERABLE" = "1" ]; then
+     # Enqueue (spec § Layer 2b). Path: <stateDir>/state/pending-relay.<agentId>.sqlite
+     # Mode 0600 enforced by the Node-side store; the CLI inherits umask, so
+     # we explicitly chmod after first create as well.
+     QUEUE_DIR=".instar/state"
+     mkdir -p "$QUEUE_DIR" 2>/dev/null
+     # Sanitize agent-id for filename (mirrors src/messaging/pending-relay-store.ts).
+     SAFE_AGENT_ID=$(printf '%s' "${AGENT_ID:-unknown}" | tr -c 'A-Za-z0-9._-' '_')
+     QUEUE_DB="${QUEUE_DIR}/pending-relay.${SAFE_AGENT_ID}.sqlite"
+
+     # delivery_id — UUIDv4 via python3 (already a hard dep above).
+     DELIVERY_ID=$(python3 -c 'import uuid; print(uuid.uuid4())' 2>/dev/null)
+     if [ -z "$DELIVERY_ID" ]; then
+       echo "Failed (HTTP $HTTP_CODE): $BODY" >&2
+       echo " (also: failed to generate delivery_id; queue write skipped)" >&2
+       exit 1
+     fi
+
+     # text_hash — SHA-256 of the raw text. Whitespace normalization is the
+     # job of higher layers; for the dedup window we just need byte-stable hashing.
+     TEXT_HASH=$(printf '%s' "$MSG" | shasum -a 256 2>/dev/null | awk '{print $1}')
+     if [ -z "$TEXT_HASH" ]; then
+       TEXT_HASH=$(printf '%s' "$MSG" | python3 -c 'import sys,hashlib; print(hashlib.sha256(sys.stdin.buffer.read()).hexdigest())' 2>/dev/null)
+     fi
+
+     # 32KB text cap (spec § 2b step 3).
+     MSG_BYTES=$(printf '%s' "$MSG" | wc -c | tr -d ' ')
+     TRUNCATED=0
+     QUEUE_TEXT="$MSG"
+     if [ "$MSG_BYTES" -gt 32768 ] 2>/dev/null; then
+       QUEUE_TEXT=$(printf '%s' "$MSG" | head -c 32768)
+       TRUNCATED=1
+     fi
+
+     ATTEMPTED_AT=$(date -u +%Y-%m-%dT%H:%M:%S.000Z)
+     NOW_EPOCH=$(date -u +%s)
+
+     # Run the queue write through python3's stdlib sqlite3 module — it
+     # owns schema creation, the 5s dedup window, parameterized inserts,
+     # and 0600-mode enforcement. python3 is already a hard dependency
+     # of this script (see config-parsing block above) and Python's
+     # stdlib sqlite3 module is universally available — more so than
+     # the `sqlite3` CLI binary or a node module that requires resolution
+     # against an installed `instar` package's node_modules. The DB file
+     # produced uses the same on-disk SQLite format as one produced by
+     # `better-sqlite3` (same SQLite engine underneath).
+     #
+     # We pass values via environment variables; the message text comes
+     # via stdin so it's never escaped through a shell layer.
+     # Env vars must be set on `python3` (the consumer of stdin), not on
+     # `printf` — in `VAR=x cmd1 | cmd2`, VAR is exported only to cmd1.
+     printf '%s' "$QUEUE_TEXT" | \
+       Q_DELIVERY_ID="$DELIVERY_ID" \
+       Q_TOPIC_ID="$TOPIC_ID" \
+       Q_TEXT_HASH="$TEXT_HASH" \
+       Q_FORMAT="$FORMAT" \
+       Q_HTTP_CODE="$HTTP_CODE" \
+       Q_ERROR_BODY="$BODY" \
+       Q_PORT="$PORT" \
+       Q_ATTEMPTED_AT="$ATTEMPTED_AT" \
+       Q_TRUNCATED="$TRUNCATED" \
+       Q_DB_PATH="$QUEUE_DB" \
+       python3 -c '
+ import os, sqlite3, sys, json, datetime
+ try:
+     db_path = os.environ["Q_DB_PATH"]
+     text = sys.stdin.buffer.read()
+     conn = sqlite3.connect(db_path, timeout=5.0)
+     try:
+         os.chmod(db_path, 0o600)
+     except OSError:
+         pass
+     # Drain pragma result rows so they do not leak to stdout. Stdout is
+     # used to communicate the delivery_id back to the calling shell.
+     conn.execute("PRAGMA journal_mode = WAL").fetchall()
+     conn.execute("PRAGMA synchronous = NORMAL").fetchall()
+     conn.execute("PRAGMA busy_timeout = 5000").fetchall()
+     conn.execute("""CREATE TABLE IF NOT EXISTS entries (
+         delivery_id TEXT PRIMARY KEY,
+         topic_id INTEGER NOT NULL,
+         text_hash TEXT NOT NULL,
+         text BLOB NOT NULL,
+         format TEXT,
+         http_code INTEGER,
+         error_body TEXT,
+         attempted_port INTEGER,
+         attempted_at TEXT NOT NULL,
+         attempts INTEGER NOT NULL DEFAULT 1,
+         next_attempt_at TEXT,
+         state TEXT NOT NULL,
+         claimed_by TEXT,
+         status_history TEXT NOT NULL DEFAULT "[]",
+         truncated INTEGER NOT NULL DEFAULT 0
+     )""")
+     # Idempotent column add for older schemas.
+     try:
+         conn.execute("ALTER TABLE entries ADD COLUMN truncated INTEGER NOT NULL DEFAULT 0")
+     except sqlite3.OperationalError as e:
+         if "duplicate column name" not in str(e):
+             raise
+     conn.execute("CREATE INDEX IF NOT EXISTS idx_state_next ON entries(state, next_attempt_at)")
+     conn.execute("CREATE INDEX IF NOT EXISTS idx_text_hash_topic ON entries(text_hash, topic_id)")
+     # 5s dedup window (spec § 2b step 2).
+     cutoff = (datetime.datetime.now(datetime.timezone.utc) - datetime.timedelta(seconds=5)).strftime("%Y-%m-%dT%H:%M:%S.000Z")
+     cur = conn.execute(
+         "SELECT delivery_id FROM entries WHERE topic_id=? AND text_hash=? AND attempted_at>=? ORDER BY attempted_at DESC LIMIT 1",
+         (int(os.environ["Q_TOPIC_ID"]), os.environ["Q_TEXT_HASH"], cutoff),
+     )
+     dup = cur.fetchone()
+     if dup:
+         # Dedup match — caller already has the delivery_id. We do not
+         # write to stdout (caller does not consume our output; bash uses
+         # its own DELIVERY_ID variable).
+         conn.close()
+         sys.exit(0)
+     initial_history = json.dumps([
+         {"state": "queued", "at": os.environ["Q_ATTEMPTED_AT"], "http_code": int(os.environ["Q_HTTP_CODE"])}
+     ])
+     conn.execute(
+         """INSERT OR IGNORE INTO entries (
+             delivery_id, topic_id, text_hash, text, format,
+             http_code, error_body, attempted_port,
+             attempted_at, attempts, next_attempt_at,
+             state, claimed_by, status_history, truncated
+         ) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, 1, NULL, "queued", NULL, ?, ?)""",
+         (
+             os.environ["Q_DELIVERY_ID"],
+             int(os.environ["Q_TOPIC_ID"]),
+             os.environ["Q_TEXT_HASH"],
+             text,
+             os.environ.get("Q_FORMAT") or None,
+             int(os.environ["Q_HTTP_CODE"]),
+             os.environ.get("Q_ERROR_BODY") or None,
+             int(os.environ["Q_PORT"]),
+             os.environ["Q_ATTEMPTED_AT"],
+             initial_history,
+             int(os.environ.get("Q_TRUNCATED", "0")),
+         ),
+     )
+     conn.commit()
+     conn.close()
+     # Deliberately silent on success — bash consumed nothing from our stdout.
+ except Exception as exc:
+     sys.stderr.write("queue-write-failed: " + str(exc) + "\n")
+     sys.exit(2)
+ ' >/dev/null 2>&1
+     QUEUE_RC=$?
+     if [ "$QUEUE_RC" != "0" ] && command -v sqlite3 >/dev/null 2>&1; then
+       # Fallback: sqlite3 CLI direct write. Used only when python3's
+       # stdlib sqlite3 module is somehow unavailable (rare — e.g. a
+       # build of CPython compiled --without-sqlite). Schema is created
+       # with IF NOT EXISTS so this is safe even if the Python path
+       # partially ran.
+       printf '%s' "$QUEUE_TEXT" > "${QUEUE_DB}.tmp.text"
+       sqlite3 "$QUEUE_DB" >/dev/null 2>&1 <<SQL
+ PRAGMA journal_mode=WAL;
+ PRAGMA synchronous=NORMAL;
+ PRAGMA busy_timeout=5000;
+ CREATE TABLE IF NOT EXISTS entries (
+   delivery_id TEXT PRIMARY KEY,
+   topic_id INTEGER NOT NULL,
+   text_hash TEXT NOT NULL,
+   text BLOB NOT NULL,
+   format TEXT,
+   http_code INTEGER,
+   error_body TEXT,
+   attempted_port INTEGER,
+   attempted_at TEXT NOT NULL,
+   attempts INTEGER NOT NULL DEFAULT 1,
+   next_attempt_at TEXT,
+   state TEXT NOT NULL,
+   claimed_by TEXT,
+   status_history TEXT NOT NULL DEFAULT '[]',
+   truncated INTEGER NOT NULL DEFAULT 0
+ );
+ CREATE INDEX IF NOT EXISTS idx_state_next ON entries(state, next_attempt_at);
+ CREATE INDEX IF NOT EXISTS idx_text_hash_topic ON entries(text_hash, topic_id);
+ INSERT OR IGNORE INTO entries (
+   delivery_id, topic_id, text_hash, text, format,
+   http_code, error_body, attempted_port, attempted_at,
+   attempts, state, status_history, truncated
+ ) VALUES (
+   '$DELIVERY_ID', $TOPIC_ID, '$TEXT_HASH',
+   CAST(readfile('${QUEUE_DB}.tmp.text') AS BLOB), $( [ -n "$FORMAT" ] && printf "'%s'" "$FORMAT" || echo "NULL"),
+   $HTTP_CODE, NULL, $PORT, '$ATTEMPTED_AT',
+   1, 'queued', '[]', $TRUNCATED
+ );
+ SQL
+       rm -f "${QUEUE_DB}.tmp.text" 2>/dev/null
+       chmod 600 "$QUEUE_DB" 2>/dev/null
+     fi
+
+     # Best-effort POST /events/delivery-failed to the SAME port the
+     # original send used (NOT the live config port — see spec § Layer 2c
+     # cross-tenant safety).
+     EVENT_BODY=$(EV_DELIVERY_ID="$DELIVERY_ID" \
+       EV_TOPIC_ID="$TOPIC_ID" \
+       EV_TEXT_HASH="$TEXT_HASH" \
+       EV_HTTP_CODE="$HTTP_CODE" \
+       EV_ERROR_BODY="$BODY" \
+       EV_PORT="$PORT" \
+       python3 -c '
+ import sys, json, os
+ print(json.dumps({
+     "delivery_id": os.environ["EV_DELIVERY_ID"],
+     "topic_id": int(os.environ["EV_TOPIC_ID"]),
+     "text_hash": os.environ["EV_TEXT_HASH"],
+     "http_code": int(os.environ["EV_HTTP_CODE"]),
+     "error_body": (os.environ.get("EV_ERROR_BODY") or "")[:1024],
+     "attempted_port": int(os.environ["EV_PORT"]),
+     "attempts": 1,
+ }))
+ ' 2>/dev/null)
+
+     if [ -n "$EVENT_BODY" ] && [ -n "$AUTH_TOKEN" ]; then
+       EVENT_CURL=(-s -o /dev/null -w "%{http_code}" -X POST "http://localhost:${PORT}/events/delivery-failed"
+         -H 'Content-Type: application/json'
+         -H "Authorization: Bearer ${AUTH_TOKEN}"
+         --max-time 2
+         -d "$EVENT_BODY")
+       if [ -n "$AGENT_ID" ]; then
+         EVENT_CURL+=(-H "X-Instar-AgentId: ${AGENT_ID}")
+       fi
+       curl "${EVENT_CURL[@]}" >/dev/null 2>&1 || true
+     fi
+
+     echo "Queued for recovery (HTTP $HTTP_CODE, delivery_id ${DELIVERY_ID%%-*}…): $BODY" >&2
+     exit 1
+   fi
+
  echo "Failed (HTTP $HTTP_CODE): $BODY" >&2
  exit 1
  fi
@@ -0,0 +1,133 @@
+ # Upgrade Guide — vNEXT
+
+ <!-- bump: minor -->
+
+ ## What Changed
+
+ Four PRs have landed on `main` since v0.28.76 without a release cut. This
+ upgrade publishes them together. The headline item is the new **Token
+ Ledger**; the other three are cluster-resilience hardening already running
+ in CI.
+
+ ### 1. Token Ledger (read-only token-usage observability) — feat #112
+
+ A new core monitoring feature. Every agent now tails Claude Code's per-session
+ JSONL files at `~/.claude/projects/<encoded-cwd>/<session-uuid>.jsonl`,
+ parses each `assistant` line's `message.usage` block, and rolls token counts
+ (input, output, cache-read, cache-creation) into a SQLite ledger at
+ `<stateDir>/server-data/token-ledger.db`.
+
+ Surfaces:
+
+ - `GET /tokens/summary` — totals per agent, per project, per hour/day window.
+ - `GET /tokens/sessions` — top sessions by total tokens, with first-seen /
+   last-seen / message-count.
+ - `GET /tokens/by-project` — project-level breakdown across all sessions.
+ - `GET /tokens/orphans` — sessions still present in the JSONL tree with no
+   activity in the last 30 minutes (a signal, not an authority — does not
+   kill anything).
+ - New "Tokens" dashboard tab — top sessions, project breakdown, orphans list.
+
+ Implementation notes for future builders:
+
+ - The reader is strictly read-only against `~/.claude/projects/`. It never
+   opens those files for write.
+ - Ingest is idempotent (`INSERT OR IGNORE` on `request_id`). Mid-tick crashes
+   cannot double-count.
+ - File rotation is detected by inode change OR head-content fingerprint
+   (256-byte hash). The fingerprint guard was added because Linux can reuse
+   inode numbers on rapid unlink+recreate; macOS does not. Cross-filesystem
+   safe.
+ - The poller has a reentry guard — concurrent ticks are skipped, not stacked.
+ - Pure additive surface. No existing route, behavior, or DB changed.
+
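The per-line rollup described above can be pictured with a short sketch. This is illustrative, not the shipped reader: the `type` field and the four counter names are assumptions based on the transcript shape this section describes.

```python
import json

def roll_up_usage(jsonl_lines):
    """Sum the per-message usage counters across a transcript's lines."""
    totals = {"input_tokens": 0, "output_tokens": 0,
              "cache_read_input_tokens": 0, "cache_creation_input_tokens": 0}
    for raw in jsonl_lines:
        try:
            entry = json.loads(raw)
        except json.JSONDecodeError:
            continue  # skip partial/corrupt lines rather than abort the scan
        if entry.get("type") != "assistant":
            continue  # only assistant lines carry a usage block
        usage = (entry.get("message") or {}).get("usage") or {}
        for key in totals:
            totals[key] += int(usage.get(key, 0) or 0)
    return totals

sample = [
    json.dumps({"type": "assistant",
                "message": {"usage": {"input_tokens": 1200, "output_tokens": 80,
                                      "cache_read_input_tokens": 9000}}}),
    json.dumps({"type": "user"}),
]
print(roll_up_usage(sample))
```

Tolerating malformed lines (rather than raising) matters because the tailed file may be mid-write when a tick runs.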
+ This is Phase 1 of the token-management initiative. Phase 2 (a strategy test
+ harness comparing keep-alive vs. resume vs. fresh-spawn-with-summary vs.
+ mid-session compaction) will be designed separately once the ledger has
+ collected real data. Phase 3 (smarter compaction or budget enforcement) will
+ be informed by what 1 and 2 reveal — never bolted on without ledger evidence.
+
+ ### 2. Lifeline self-heal hardening — feat #111
+
+ Closes the three stacked failures behind Inspec's silent crash-loop on
+ 2026-04-29. A path-aware better-sqlite3 preflight now scans nested
+ `shadow-install/node_modules/instar/node_modules/better-sqlite3/...` paths
+ that the previous hoisted-only check missed. Adds a `consecutiveBindFailures`
+ counter that escalates to a forced rebuild after two back-to-back unhealthy
+ spawns. Replaces the brittle `process.ppid === 1` launchd-supervision check
+ with a multi-signal helper that also accepts an explicit env-var marker,
+ parent-process name `launchd` on darwin, and parent name `systemd`/`init`
+ on Linux — this covers user-domain launchd, which the old check did not. A
+ new plist template carries the supervised marker as belt-and-suspenders.
+
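A minimal sketch of that multi-signal decision, assuming a hypothetical marker name (`INSTAR_SUPERVISED`) and pre-fetched parent-process facts; the shipped helper is TypeScript and reads these signals from the live process, so treat this as a shape illustration only.

```python
def is_supervised(env, ppid, parent_name, platform):
    """Return True if any supervision signal fires (hypothetical names)."""
    # Explicit marker planted by the plist/unit template wins outright.
    if env.get("INSTAR_SUPERVISED") == "1":
        return True
    # Classic signal: running directly under PID 1.
    if ppid == 1:
        return True
    # Parent-name signals: user-domain launchd on macOS runs agents under a
    # non-PID-1 launchd; systemd/init cover the Linux equivalents.
    if platform == "darwin" and parent_name == "launchd":
        return True
    if platform == "linux" and parent_name in ("systemd", "init"):
        return True
    return False

print(is_supervised({}, 742, "launchd", "darwin"))  # user-domain launchd case
print(is_supervised({}, 742, "zsh", "darwin"))      # plain interactive shell
```

The point of the OR-of-signals shape is that any single false negative (as with the old `ppid === 1` check) no longer defeats detection.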
+ ### 3. Threadline spawn-guard foundation (Phase 1a) — feat #110
+
+ Ledger + heartbeat watchdog + failure authority for threadline relay spawns.
+ Adds the structural layer underneath the relay so that the spawn lifecycle
+ (launched → heartbeat → success | failure-classified) is recorded and
+ authoritative, instead of being inferred from process state. Sets up the
+ substrate for spawn-loop suppression in a follow-up phase.
+
+ ### 4. Threadline canonical inbox write at relay-ingest — fix #113
+
+ The relay handler had three routing branches (pipe-mode, warm-listener,
+ cold-spawn), but only the warm-listener branch was writing to the canonical
+ inbox at `.instar/threadline/inbox.jsonl.active`. The canonical file had been
+ frozen since 2026-04-05, hiding ~4 weeks of inbound traffic from the
+ dashboard, observability, and any downstream consumer of the canonical
+ inbox. The fix hoists a single HMAC-signed canonical-inbox append to
+ relay-ingest, before the branching, so all three paths converge on one
+ source of truth. It uses the existing HKDF-derived signing key — no new key
+ material, no ambient-key footgun.
+
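The hoisted append can be sketched in miniature. The envelope shape and the `sig` field name are assumptions for illustration; the real code uses an HKDF-derived key and its own canonical record schema.

```python
import hashlib
import hmac
import json

def signed_inbox_line(signing_key: bytes, record: dict) -> str:
    """Produce one HMAC-signed JSONL entry for the canonical inbox."""
    payload = json.dumps(record, sort_keys=True, separators=(",", ":"))
    sig = hmac.new(signing_key, payload.encode(), hashlib.sha256).hexdigest()
    return json.dumps({"payload": record, "sig": sig}, sort_keys=True)

def verify_inbox_line(signing_key: bytes, line: str) -> bool:
    """Recompute the signature over the canonical serialization."""
    envelope = json.loads(line)
    payload = json.dumps(envelope["payload"], sort_keys=True,
                         separators=(",", ":"))
    expected = hmac.new(signing_key, payload.encode(),
                        hashlib.sha256).hexdigest()
    # Constant-time comparison so the check itself leaks nothing.
    return hmac.compare_digest(expected, envelope["sig"])

key = b"demo-key-not-hkdf-derived"  # illustrative only
line = signed_inbox_line(key, {"from": "telegram", "text": "hello"})
print(verify_inbox_line(key, line))
```

Doing the append once, before any branching, is what guarantees all three routing paths leave the same audit trail.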
+ ## What to Tell Your User
+
+ - **Token visibility**: I can now see exactly how many tokens I am burning,
+   per session, per project, per hour. There is a new Tokens tab on the
+   dashboard with my top sessions, project breakdown, and a list of any
+   sessions sitting idle. This is the foundation for managing token usage
+   smartly — next phases will compare different conversation strategies and
+   add smarter context compaction once I have real numbers in hand.
+ - **Quieter, more reliable startup**: I am better at recovering from a
+   startup hiccup involving a particular native module, and I detect when I
+   am being supervised by the system in more situations. Translation: fewer
+   silent crash-loops on the rare day the install gets into a weird state.
+ - **Threadline message bookkeeping**: Inbound messages routed through my
+   relay are now all recorded to my canonical inbox, regardless of which
+   internal path delivered them. Anything that was flowing only through the
+   per-listener queue is now also visible to the dashboard and any tool that
+   reads the main inbox.
+
+ ## Summary of New Capabilities
+
+ | Capability | How to Use |
+ |-----------|-----------|
+ | Per-session, per-project token rollups | Tokens dashboard tab, or `GET /tokens/summary`, `/tokens/sessions`, `/tokens/by-project` |
+ | Idle-session detector (signal only, no kill authority) | `GET /tokens/orphans` |
+ | Resilient native-module preflight on startup | Automatic on upgrade |
+ | Multi-signal launchd-supervised detection | Automatic on upgrade |
+ | Canonical threadline inbox writes from every relay path | Automatic on upgrade |
+ | Threadline spawn ledger + heartbeat substrate | Internal foundation; surface phases follow |
+
+ ## Evidence
+
+ This release is feature + hardening, not a one-shot bug fix, but two of the
+ four PRs do close concrete failures. Evidence for those:
+
+ - **#111 lifeline self-heal**: Inspec's 2026-04-29 silent crash-loop was
+   reproduced by inducing a load failure on the nested
+   `shadow-install/node_modules/instar/node_modules/better-sqlite3`. Before:
+   preflight reported clean, and the supervisor respawned into the same
+   broken state. After: the nested path is discovered, a rebuild fires after
+   two back-to-back unhealthy spawns, and the supervisor escalates instead
+   of looping. Covered by 18 new unit tests across
+   `detect-launchd-supervised.test.ts` and `find-better-sqlite3-copies.test.ts`.
+ - **#113 canonical inbox**: Reproduction is direct — before the fix,
+   `.instar/threadline/inbox.jsonl.active` mtime was 2026-04-05 on every
+   install we checked despite ongoing relay traffic. After the fix, the file
+   receives an HMAC-signed entry at every relay-ingest. Verified end-to-end
+   by sending a Telegram message and observing the canonical-inbox tail.
+
+ For the token ledger, evidence is the 12 unit tests in
+ `tests/unit/token-ledger.test.ts` plus manual JSONL-shape verification
+ against real session files in `~/.claude/projects/`. Pure additive
+ observability — no prior failure mode to "fix."
@@ -0,0 +1,90 @@
+ # Upgrade Guide — vNEXT
+
+ <!-- bump: patch -->
+
+ ## What Changed
+
+ The Token Ledger (shipped in v0.28.77 as Phase 1 read-only token-usage
+ observability) had an unbounded synchronous first scan. On agents with
+ deep Claude Code history this blocked the Node event loop for minutes —
+ one local agent had 119,130 JSONL transcripts totaling 12 GB, and on
+ boot the server stopped responding to its own `/health` endpoint, which
+ caused the lifeline supervisor to declare the agent dead and restart it
+ in a loop.
+
+ This release bounds the scan in three independent ways:
+
+ 1. **Per-tick file cap** (default 500) with a persistent in-memory
+    cursor across ticks. The first poll backfills 500 files; the next
+    poll picks up where the previous one stopped; once the tree is fully
+    walked the cursor wraps back to the start so newly written sessions
+    are still picked up.
+ 2. **Intra-tick yielding** (default every 25 files) via `setImmediate`.
+    Even within a single tick the event loop gets to drain HTTP and
+    health-check traffic — the server stays responsive while the ledger
+    is doing its work.
+ 3. **Optional max file age** (default 30 days at the wiring layer). The
+    ledger ignores transcripts whose mtime is older than the backfill
+    window. Active sessions are never blackholed: appending a new turn
+    updates the file's mtime, which brings it back into the window. The
+    source JSONLs in `~/.claude/projects/` remain the ground truth, so an
+    operator can widen the window later by passing a larger
+    `maxFileAgeMs` and the ledger will pick up the older data on the
+    next scan.
+
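The cursor mechanics in item 1 can be sketched as follows. The list-based "tree walk", class name, and cap value are illustrative assumptions; the shipped scanner walks the filesystem and also applies the age cutoff and `setImmediate`-style yields between batches.

```python
class BoundedScanner:
    """Per-tick file cap with an in-memory cursor that wraps on completion."""

    def __init__(self, per_tick_cap=500):
        self.per_tick_cap = per_tick_cap
        self.cursor = 0  # persists across ticks, in memory only

    def tick(self, files):
        if not files:
            return []
        if self.cursor >= len(files):
            # Tree fully walked: wrap so newly written sessions are found.
            self.cursor = 0
        batch = files[self.cursor:self.cursor + self.per_tick_cap]
        self.cursor += len(batch)
        return batch

scanner = BoundedScanner(per_tick_cap=2)
files = ["a.jsonl", "b.jsonl", "c.jsonl"]
print(scanner.tick(files))  # first tick: ['a.jsonl', 'b.jsonl']
print(scanner.tick(files))  # resumes:    ['c.jsonl']
print(scanner.tick(files))  # wrapped:    ['a.jsonl', 'b.jsonl']
```

Because each tick touches at most `per_tick_cap` files, no single poll can monopolize the event loop the way the unbounded first scan did.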
+ A new `scanAllAsync()` method is the path the poller now uses; the
+ original `scanAll()` sync entry point is preserved for tests and any
+ caller that doesn't need yielding (and now honors the per-tick cap and
+ age cutoff too).
+
+ No schema migration. No new routes. No new external surfaces. A pure
+ containment fix for the v0.28.77 regression.
+
+ ## What to Tell Your User
+
+ - **Quieter, more reliable startup with the new Tokens tab**: I no
+   longer get stuck staring at years of old session transcripts on boot.
+   When I start up, I look at the most recent month of activity first,
+   in small batches, and I keep answering you in between batches. The
+   Tokens tab will fill in over the first few minutes instead of being
+   empty until everything has been read at once.
+
+ ## Summary of New Capabilities
+
+ | Capability | How to Use |
+ |-----------|-----------|
+ | Bounded first-boot scan of the token ledger | Automatic on upgrade |
+ | Configurable backfill window for the token ledger | Pass `maxFileAgeMs` to the TokenLedger constructor (defaults to 30 days at the wiring layer) |
+ | Per-tick scan cap and event-loop yielding | Automatic on upgrade |
+
+ ## Evidence
+
+ Reproduction (before the fix):
+
+ 1. Start v0.28.77 on a host with deep Claude Code history (the local
+    reproduction host had 119,130 JSONL files / 12 GB under
+    `~/.claude/projects/`).
+ 2. `curl -m 5 http://localhost:4042/health` hangs — the connection is
+    accepted but no response arrives within the timeout.
+ 3. `sample <pid> 1` shows the main thread spending 100% of its time in
+    `uv_fs_stat` callbacks under `Builtins_InterpreterEntryTrampoline` —
+    a JS loop hammering the filesystem with no event-loop yields.
+ 4. The lifeline supervisor's health probe times out, declares the
+    server unhealthy, and restarts it. The next boot starts the same
+    scan over again.
+
+ After the fix:
+
+ 1. Same host, same `~/.claude/projects/` tree. The server boots and
+    `curl http://localhost:4042/health` returns a normal JSON response
+    within a few hundred ms.
+ 2. `curl http://localhost:4042/tokens/summary` returns valid JSON
+    immediately (initially with a small subset of recent sessions).
+    Subsequent ticks fill in the rest of the 30-day window.
+ 3. The lifeline supervisor sees a healthy server and stops restarting.
+
+ Unit tests: `tests/unit/token-ledger.test.ts` — 15/15 passing locally
+ on the `fix/token-ledger-bounded-scan` branch. Three new tests cover
+ the cursor resume, age cutoff, and async yielding behavior. Typecheck
+ is clean (`npx tsc --noEmit`).