@firstlovecenter/ai-chat 0.2.0 → 0.2.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/CHANGELOG.md ADDED
@@ -0,0 +1,86 @@
+ # Changelog
+
+ All notable changes to `@firstlovecenter/ai-chat` are documented here.
+
+ The format follows [Keep a Changelog](https://keepachangelog.com/en/1.1.0/),
+ and the project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
+
+ ## [0.2.2] — 2026-05-08
+
+ ### Added
+
+ - **Live elapsed counter while the agent is thinking** — the inline thinking indicator now shows `Thinking… (Xs)` with a live ticker that advances every second. Sub-second values are suppressed, so instant responses don't flicker.
+ - **Final duration after the action buttons** — once the agent marks a turn done, the wall-clock duration appears to the right of the Copy / Retry buttons (`5s`, `1m`, `2m 30s` style). Restored history (sessions loaded from the DB) shows nothing here, since durations weren't persisted.
+
+ The new `AnswerState.startedAt` and `AnswerState.durationMs` fields are stamped at submit time and on every `done: true` flip, respectively. The format helper `formatDuration(ms)` produces `Xs`, `Xm`, or `Xm Ys`.
+
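For reference, the duration formatting described above can be sketched as follows; this mirrors the `formatDuration` helper shipped in the UI bundle (see the `dist/ui/index.cjs` hunk later in this diff):

```typescript
// Rounds to whole seconds, then renders `Xs`, `Xm`, or `Xm Ys`.
// Negative inputs (clock skew) clamp to zero.
function formatDuration(ms: number): string {
  if (ms < 0) ms = 0;
  const totalSec = Math.round(ms / 1000);
  if (totalSec < 60) return `${totalSec}s`;
  const m = Math.floor(totalSec / 60);
  const s = totalSec % 60;
  return s === 0 ? `${m}m` : `${m}m ${s}s`;
}
```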
+ ## [0.2.1] — 2026-05-08
+
+ ### Fixed
+
+ - **Chat "Thinking…" indicator visibility** — the spinning indicator was being rendered at the bottom of the thread, below the input pill, where users had to scroll to see it. The thinking state now shows the animated spinner inline next to the AI sparkle avatar where the response will appear, and the redundant bottom indicator is removed. Closes a UX bug reported during initial smoke-testing of `<AiChat />` from the package.
+
+ ## [0.2.0] — 2026-05-08
+
+ ### Added
+
+ - **`ConfigureAiChatOpts.rolePrompt`** — host-supplied persona prepended to the system blocks on every turn. Accepts a `string` for a static org-wide role or a `(ctx) => string | Promise<string>` function for per-request variation (e.g. different role per scope, locale, or A/B cohort). The role text is marked `cached: true` so it benefits from Anthropic's ephemeral prompt cache and Vertex's automatic prefix cache. Sent every turn (stateless API), but with caching the marginal cost after the first turn rounds to zero for typical 200–500 token roles.
+
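A hypothetical host-side sketch of the two accepted shapes. The `scope` and `locale` fields used here are illustrative assumptions, not confirmed parts of the package's context type:

```typescript
// Static form: one org-wide persona, sent (and prompt-cached) every turn.
const staticRole = "You are the analytics assistant for First Love Center.";

// Dynamic form: vary the persona per request. RoleCtx and its fields are
// illustrative; the real context type comes from the package.
type RoleCtx = { scope: string; locale: string };
const rolePrompt = (ctx: RoleCtx): string =>
  ctx.locale === "fr"
    ? `Tu es l'assistant d'analyse pour le périmètre ${ctx.scope}.`
    : `You are the analytics assistant for the ${ctx.scope} scope.`;
```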
+ ### Fixed
+
+ - **Cache marker cap (Claude provider)** — Anthropic limits `cache_control` markers to 4 per request. The provider now silently keeps the marker on the first 4 cached system blocks and drops the hint from any beyond, rather than letting the request fail. This guards against the natural growth of cached blocks once `rolePrompt` is in use (`role` + `system` + `schema` + `semantic-layer` + `scope-summary` would have been 5).
+
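The capping behavior can be sketched as follows. The block shape is simplified; the real provider operates on Anthropic system blocks with `cache_control` fields:

```typescript
type SystemBlock = { text: string; cached?: boolean };

// Keep the cache hint on the first `max` cached blocks and silently drop it
// from any beyond, so a request never exceeds Anthropic's 4-marker limit.
function capCacheMarkers(blocks: SystemBlock[], max = 4): SystemBlock[] {
  let seen = 0;
  return blocks.map((b) => {
    if (!b.cached) return b;
    seen += 1;
    return seen <= max ? b : { ...b, cached: false };
  });
}
```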
+ ## [0.1.1] — 2026-05-08
+
+ ### Fixed
+
+ - **`AGENT_NO_PRESENT` fallback** — when the model emits text but never calls `present()` (common for advisory questions like "what should we do to increase X?"), the agent loop now wraps the model's text as a single-block `paragraph_brief` `PresentPayload` instead of failing hard. The user always gets a usable answer; the genuine dead end (no tools AND no text) still surfaces the error. The `SELF_VERIFY_REQUIRED` gate is unaffected — it still fires only when the model explicitly calls `present()` with `raw_numbers` it might have hallucinated.
+
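The fallback logic can be sketched like this. The payload shape is illustrative, not the package's exact type:

```typescript
type PresentPayload = { kind: "paragraph_brief"; blocks: { text: string }[] };

// If the model produced prose but never called present(), wrap that prose as
// a single-block paragraph_brief so the user still gets a usable answer.
// Only the genuine dead end (no tool call AND no text) stays an error.
function fallbackPresent(modelText: string | null): PresentPayload {
  if (modelText == null || modelText.trim() === "") {
    throw new Error("AGENT_NO_PRESENT");
  }
  return { kind: "paragraph_brief", blocks: [{ text: modelText.trim() }] };
}
```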
+ ## [0.1.0] — 2026-05-08
+
+ Initial public release on npm under `@firstlovecenter/ai-chat`.
+
+ ### Added
+
+ - **`configureAiChat({...})` runtime factory** — single entry point that wires the package's agent loop, narrators, providers, and three route factories with host-supplied ports. Returns `{ runAgent, routes, registries }`.
+ - **Six injected ports**:
+   - `PersistencePort` — domain-shaped CRUD for `chat_sessions`, `chat_messages`, `ai_settings`.
+   - `AuthPort<S>` — `requireAuth(req)` + `isSuperAdmin(scope)`.
+   - `ScopePort<S>` — `resolveScopeLabel` + `buildScopeSummary` (UI chip + system-prompt context).
+   - `ToolsPort` — host's tool registry + `buildSystemBlocks(ctx)`.
+   - `VertexPort` — `projectId`, `defaultLocation`, `auth: GoogleAuth` (host constructs; package never reads `process.env.GCP_*`), `modelIds`.
+   - `LoggerPort` — optional structured logging.
+ - **Three reference persistence adapters** behind the same `PersistencePort` contract:
+   - `createDrizzlePersistence(db)` — drizzle-orm MySQL (canonical table objects shipped from `@firstlovecenter/ai-chat/server/drizzle`).
+   - `createPrismaPersistence(prisma)` — Prisma client (schema fragment shipped at `prisma/chat-models.prisma` for hosts to paste).
+   - `createMemoryPersistence()` — in-memory adapter for the package's own contract tests (internal-only).
+ - **Three route factories** returning Next-compatible handlers:
+   - `agentCustom` — POST SSE handler (Custom chat protocol).
+   - `chatSessions` — list/POST + detail GET/PATCH/DELETE.
+   - `adminSettings` — GET/PATCH for `tool_provider`, `gcp_location`, `chat_interface` (validated against registries).
+ - **Lifecycle hooks** consumed by route factories, so consuming hosts can plumb in project-specific concerns without forking:
+   - `RouteHooks<S>` (all routes): `onRequest` (before auth — shutdown gating returns 503), `onAuthenticated` (after auth — rate limiting returns 429).
+   - `AgentCustomHooks<S>` (SSE route only): adds `generateSessionId` (per-request session id used for `ToolContext.sessionId` and downstream resources like SQL view names), `onSessionStart` (per-request setup, e.g. `CREATE VIEW`), and `onSessionEnd` (always-runs cleanup with `cause: 'complete' | 'error' | 'abort'`).
+ - **Two registries** the host's settings UI maps over:
+   - `toolProviders` — built-in `claude` + `gemini` Vertex AI adapters; extensible via `extraToolProviders`.
+   - `chatInterfaces` — `custom` (this package's UI) + `vercel` (placeholder for the Vercel AI SDK build, landing in v0.x).
+ - **Custom chat UI** — `AiChat` and `AnswerBlocks` exported from `@firstlovecenter/ai-chat/ui`. Logic lifted nearly byte-for-byte from the original FLC implementation; only host-coupled imports were rewritten.
+ - **MIT license**, public `publishConfig`, and peer deps marked optional for `drizzle-orm` and `@prisma/client`, so a host installing only one ORM never pulls the other.
+
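An illustrative `RouteHooks` wiring, gating shutdown before auth (503) and rate-limiting after it (429). The hook signatures and the `{ status }` return shape here are assumptions for this sketch; consult the package's `RouteHooks<S>` type for the real contract:

```typescript
// Module-level state standing in for real shutdown and rate-limit machinery.
let isShuttingDown = false;
const counts = new Map<string, number>();

const hooks = {
  // Runs before auth: during shutdown, short-circuit every route with 503.
  onRequest: () => (isShuttingDown ? { status: 503 } : undefined),
  // Runs after auth: naive per-scope counter standing in for a rate limiter.
  onAuthenticated: (scopeId: string) => {
    const n = (counts.get(scopeId) ?? 0) + 1;
    counts.set(scopeId, n);
    return n > 100 ? { status: 429 } : undefined;
  },
};
```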
+ ### Subpath exports
+
+ - `@firstlovecenter/ai-chat/server` — runtime factory, agent loop, ports, registries, types.
+ - `@firstlovecenter/ai-chat/server/drizzle` — drizzle adapter + canonical table objects.
+ - `@firstlovecenter/ai-chat/server/prisma` — prisma adapter + schema fragment string.
+ - `@firstlovecenter/ai-chat/ui` — `AiChat`, `AnswerBlocks`, registries.
+
+ ### Build
+
+ - Dual ESM + CJS output via `tsup` with full `.d.ts` (and `.d.cts`) generation.
+ - UI bundle gets a post-build `'use client';` directive injection (tsup otherwise strips module-level directives during bundling, breaking RSC consumers that import the UI from a server file).
+
+ [0.2.2]: https://github.com/firstlovecenter/flc-ai-chat/compare/v0.2.1...v0.2.2
+ [0.2.1]: https://github.com/firstlovecenter/flc-ai-chat/compare/v0.2.0...v0.2.1
+ [0.2.0]: https://github.com/firstlovecenter/flc-ai-chat/compare/v0.1.1...v0.2.0
+ [0.1.1]: https://github.com/firstlovecenter/flc-ai-chat/compare/v0.1.0...v0.1.1
+ [0.1.0]: https://github.com/firstlovecenter/flc-ai-chat/releases/tag/v0.1.0
package/dist/ui/index.cjs CHANGED
@@ -544,6 +544,14 @@ function AiLineChart({
  ))
  ] }) }) });
  }
+ function formatDuration(ms) {
+ if (ms < 0) ms = 0;
+ const totalSec = Math.round(ms / 1e3);
+ if (totalSec < 60) return `${totalSec}s`;
+ const m = Math.floor(totalSec / 60);
+ const s = totalSec % 60;
+ return s === 0 ? `${m}m` : `${m}m ${s}s`;
+ }
  var PROVIDER_LABELS = {
  claude: "Claude",
  grok: "Grok",
@@ -718,7 +726,7 @@ function AiChat({
  }
  setAnswers((prev) => [
  ...prev,
- { question: trimmed, blocks: [], done: false }
+ { question: trimmed, blocks: [], done: false, startedAt: Date.now() }
  ]);
  const ac = new AbortController();
  abortRef.current = ac;
@@ -739,6 +747,7 @@ function AiChat({
  (prev) => updateLast(prev, (a) => ({
  ...a,
  done: true,
+ durationMs: a.startedAt != null ? Date.now() - a.startedAt : void 0,
  error: { code: "NETWORK", message }
  }))
  );
@@ -750,6 +759,7 @@ function AiChat({
  (prev) => updateLast(prev, (a) => ({
  ...a,
  done: true,
+ durationMs: a.startedAt != null ? Date.now() - a.startedAt : void 0,
  error: { code: "NO_BODY", message: "No response stream." }
  }))
  );
@@ -776,6 +786,7 @@ function AiChat({
  (prev) => updateLast(prev, (a) => ({
  ...a,
  done: true,
+ durationMs: a.startedAt != null ? Date.now() - a.startedAt : void 0,
  error: { code: "STREAM", message }
  }))
  );
@@ -808,7 +819,7 @@ function AiChat({
  /* @__PURE__ */ jsxRuntime.jsxs(
  "aside",
  {
- "aria-hidden": !sidebarOpen,
+ inert: !sidebarOpen,
  className: cn(
  "absolute inset-y-0 left-0 z-20 flex w-72 max-w-[85vw] flex-col border-r border-border bg-sidebar text-sidebar-foreground shadow-lg transition-transform duration-200 ease-out",
  sidebarOpen ? "translate-x-0" : "-translate-x-full"
@@ -940,22 +951,16 @@ function AiChat({
  ] }) : heroVisible ? /* @__PURE__ */ jsxRuntime.jsxs("div", { className: "flex h-full flex-col items-center justify-center text-center", children: [
  /* @__PURE__ */ jsxRuntime.jsx("h1", { className: "text-2xl font-medium tracking-tight text-foreground sm:text-3xl", children: greeting }),
  /* @__PURE__ */ jsxRuntime.jsx("p", { className: "mt-2 text-2xl font-light tracking-tight text-muted-foreground sm:text-3xl", children: "What's on your mind?" })
- ] }) : /* @__PURE__ */ jsxRuntime.jsxs("div", { className: "mx-auto flex w-full max-w-3xl flex-col gap-6", children: [
- answers.map((a, idx) => /* @__PURE__ */ jsxRuntime.jsx(
- AnswerView,
- {
- answer: a,
- onRetry: () => void submit(a.question),
- forwardRef: idx === answers.length - 1 ? lastAnswerRef : void 0,
- isLast: idx === answers.length - 1
- },
- idx
- )),
- pending && /* @__PURE__ */ jsxRuntime.jsxs("p", { className: "text-sm text-muted-foreground", children: [
- /* @__PURE__ */ jsxRuntime.jsx(lucideReact.Loader2, { className: "mr-1 inline size-3.5 animate-spin" }),
- "Thinking\u2026"
- ] })
- ] })
+ ] }) : /* @__PURE__ */ jsxRuntime.jsx("div", { className: "mx-auto flex w-full max-w-3xl flex-col gap-6", children: answers.map((a, idx) => /* @__PURE__ */ jsxRuntime.jsx(
+ AnswerView,
+ {
+ answer: a,
+ onRetry: () => void submit(a.question),
+ forwardRef: idx === answers.length - 1 ? lastAnswerRef : void 0,
+ isLast: idx === answers.length - 1
+ },
+ idx
+ )) })
  }
  ),
  /* @__PURE__ */ jsxRuntime.jsx("div", { className: "shrink-0 px-4 pb-4 pt-2", children: /* @__PURE__ */ jsxRuntime.jsxs(
@@ -1080,6 +1085,14 @@ function AnswerView({
  }, [answer.blocks, answer.error]);
  const showActions = answer.done && (answer.blocks.length > 0 || answer.error != null);
  const isThinking = !answer.done && answer.blocks.length === 0 && !answer.error;
+ const [, forceTick] = React.useState(0);
+ React.useEffect(() => {
+ if (answer.done || answer.startedAt == null) return;
+ const id = window.setInterval(() => forceTick((n) => n + 1), 1e3);
+ return () => window.clearInterval(id);
+ }, [answer.done, answer.startedAt]);
+ const liveElapsed = answer.startedAt != null ? Date.now() - answer.startedAt : null;
+ const finalDuration = answer.durationMs;
  return /* @__PURE__ */ jsxRuntime.jsxs(
  "div",
  {
@@ -1096,7 +1109,15 @@ function AiChat({
  /* @__PURE__ */ jsxRuntime.jsx("div", { className: "flex justify-end", children: /* @__PURE__ */ jsxRuntime.jsx(UserChip, { text: answer.question }) }),
  isThinking ? /* @__PURE__ */ jsxRuntime.jsxs("div", { className: "flex items-center gap-3", children: [
  /* @__PURE__ */ jsxRuntime.jsx("div", { className: "flex size-7 shrink-0 items-center justify-center rounded-full bg-primary/10 text-primary", children: /* @__PURE__ */ jsxRuntime.jsx(lucideReact.Sparkles, { className: "size-4" }) }),
- /* @__PURE__ */ jsxRuntime.jsx("p", { className: "text-sm text-muted-foreground", children: "Thinking\u2026" })
+ /* @__PURE__ */ jsxRuntime.jsxs("p", { className: "flex items-center text-sm text-muted-foreground", children: [
+ /* @__PURE__ */ jsxRuntime.jsx(lucideReact.Loader2, { className: "mr-1 inline size-3.5 animate-spin" }),
+ "Thinking\u2026",
+ liveElapsed != null && liveElapsed >= 1e3 && /* @__PURE__ */ jsxRuntime.jsxs("span", { className: "ml-1 tabular-nums", children: [
+ "(",
+ formatDuration(liveElapsed),
+ ")"
+ ] })
+ ] })
  ] }) : /* @__PURE__ */ jsxRuntime.jsxs("div", { className: "flex items-start gap-3", children: [
  /* @__PURE__ */ jsxRuntime.jsx("div", { className: "mt-0.5 flex size-7 shrink-0 items-center justify-center rounded-full bg-primary/10 text-primary", children: /* @__PURE__ */ jsxRuntime.jsx(lucideReact.Sparkles, { className: "size-4" }) }),
  /* @__PURE__ */ jsxRuntime.jsxs("div", { className: "flex min-w-0 flex-1 flex-col gap-3", children: [
@@ -1128,6 +1149,14 @@ function AnswerView({
  className: "inline-flex size-8 items-center justify-center rounded-md text-muted-foreground hover:bg-accent hover:text-foreground",
  children: /* @__PURE__ */ jsxRuntime.jsx(lucideReact.RotateCcw, { className: "size-4" })
  }
+ ),
+ finalDuration != null && finalDuration >= 1e3 && /* @__PURE__ */ jsxRuntime.jsx(
+ "span",
+ {
+ className: "ml-1 text-xs text-muted-foreground tabular-nums",
+ title: "Time taken to generate this response",
+ children: formatDuration(finalDuration)
+ }
  )
  ] })
  ] })
@@ -1333,7 +1362,13 @@ function handleEvent(raw, setAnswers) {
  })
  );
  } else if (event === "done") {
- setAnswers((prev) => updateLast(prev, (a) => ({ ...a, done: true })));
+ setAnswers(
+ (prev) => updateLast(prev, (a) => ({
+ ...a,
+ done: true,
+ durationMs: a.startedAt != null ? Date.now() - a.startedAt : void 0
+ }))
+ );
  } else if (event === "error") {
  setAnswers(
  (prev) => updateLast(prev, (a) => ({