@firstlovecenter/ai-chat 0.2.0 → 0.2.1
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/CHANGELOG.md +76 -0
- package/dist/ui/index.cjs +14 -17
- package/dist/ui/index.cjs.map +1 -1
- package/dist/ui/index.js +14 -17
- package/dist/ui/index.js.map +1 -1
- package/package.json +3 -2
package/CHANGELOG.md
ADDED
@@ -0,0 +1,76 @@
# Changelog

All notable changes to `@firstlovecenter/ai-chat` are documented here.

The format follows [Keep a Changelog](https://keepachangelog.com/en/1.1.0/),
and the project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [0.2.1] — 2026-05-08

### Fixed

- **Chat "Thinking…" indicator visibility** — the spinning indicator was being rendered at the bottom of the thread, below the input pill, where users had to scroll to see it. The thinking state now shows the animated spinner inline next to the AI sparkle avatar where the response will appear, and the redundant bottom indicator is removed. Closes a UX bug reported during initial smoke-testing of `<AiChat />` from the package.

## [0.2.0] — 2026-05-08

### Added

- **`ConfigureAiChatOpts.rolePrompt`** — host-supplied persona prepended to the system blocks on every turn. Accepts a `string` for a static org-wide role or a `(ctx) => string | Promise<string>` function for per-request variation (e.g. a different role per scope, locale, or A/B cohort). The role text is marked `cached: true` so it benefits from Anthropic's ephemeral prompt cache and Vertex's automatic prefix cache. It is sent every turn (the API is stateless), but with caching the marginal cost after the first turn rounds to zero for typical 200–500 token roles.
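  As an illustration, the two accepted forms and the normalization step described above might look like this. This is a hedged sketch: the `Ctx` shape, the variable names, and the `resolveRolePrompt` helper are assumptions for illustration, not the package's actual API.

  ```typescript
  // Hypothetical context shape; the real package's ctx type may differ.
  type Ctx = { scope: string; locale: string };
  type RolePrompt = string | ((ctx: Ctx) => string | Promise<string>);

  // Static org-wide role:
  const staticRole: RolePrompt = 'You are the First Love Center analytics assistant.';

  // Per-request variation (per scope, locale, or A/B cohort):
  const dynamicRole: RolePrompt = (ctx) =>
    `You assist the ${ctx.scope} team. Answer in ${ctx.locale}.`;

  // How a provider might normalize either form into a cached system block,
  // per the `cached: true` behavior described in the entry above.
  async function resolveRolePrompt(rolePrompt: RolePrompt, ctx: Ctx) {
    const text = typeof rolePrompt === 'function' ? await rolePrompt(ctx) : rolePrompt;
    return { text, cached: true };
  }
  ```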

### Fixed

- **Cache marker cap (Claude provider)** — Anthropic limits `cache_control` markers to 4 per request. The provider now silently keeps the marker on the first 4 cached system blocks and drops the hint from any beyond, rather than letting the request fail. This guards against the natural growth of cached blocks once `rolePrompt` is in use (`role` + `system` + `schema` + `semantic-layer` + `scope-summary` would have been 5).
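  A minimal sketch of that capping rule. The four-marker limit is Anthropic's; the block shape and function name here are illustrative assumptions, not the provider's real internals.

  ```typescript
  // Keep `cache_control` on the first `max` cached system blocks and strip the
  // hint from any beyond, instead of letting the request fail.
  type SystemBlock = { text: string; cache_control?: { type: 'ephemeral' } };

  function capCacheMarkers(blocks: SystemBlock[], max = 4): SystemBlock[] {
    let kept = 0;
    return blocks.map((block) => {
      if (!block.cache_control) return block;
      kept += 1;
      if (kept <= max) return block;
      const { cache_control: _dropped, ...rest } = block; // block survives, hint doesn't
      return rest;
    });
  }
  ```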

## [0.1.1] — 2026-05-08

### Fixed

- **`AGENT_NO_PRESENT` fallback** — when the model emits text but never calls `present()` (common for advisory questions like "what should we do to increase X?"), the agent loop now wraps the model's text as a single-block `paragraph_brief` `PresentPayload` instead of failing hard. The user always gets a usable answer; the genuine dead end (no tools AND no text) still surfaces the error. The `SELF_VERIFY_REQUIRED` gate is unaffected — it still fires only when the model explicitly calls `present()` with `raw_numbers` it might have hallucinated.
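  The decision this entry describes can be sketched as follows. The payload kind `paragraph_brief` and the `AGENT_NO_PRESENT` error code come from the entry; the surrounding types and function are assumed shapes, not the agent loop's real code.

  ```typescript
  // Wrap stray model text as a single-block paragraph_brief instead of failing
  // hard; only a true dead end (no text at all) surfaces the error.
  type PresentPayload = { kind: 'paragraph_brief'; blocks: { text: string }[] };
  type Outcome =
    | { ok: true; payload: PresentPayload }
    | { ok: false; error: 'AGENT_NO_PRESENT' };

  function fallbackPresent(modelText: string | undefined): Outcome {
    if (modelText && modelText.trim().length > 0) {
      return {
        ok: true,
        payload: { kind: 'paragraph_brief', blocks: [{ text: modelText }] },
      };
    }
    // No tools AND no text: the genuine dead end still errors.
    return { ok: false, error: 'AGENT_NO_PRESENT' };
  }
  ```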

## [0.1.0] — 2026-05-08

Initial public release on npm under `@firstlovecenter/ai-chat`.

### Added

- **`configureAiChat({...})` runtime factory** — single entry point that wires the package's agent loop, narrators, providers, and three route factories with host-supplied ports. Returns `{ runAgent, routes, registries }`.
- **Six injected ports**:
  - `PersistencePort` — domain-shaped CRUD for `chat_sessions`, `chat_messages`, `ai_settings`.
  - `AuthPort<S>` — `requireAuth(req)` + `isSuperAdmin(scope)`.
  - `ScopePort<S>` — `resolveScopeLabel` + `buildScopeSummary` (UI chip + system-prompt context).
  - `ToolsPort` — host's tool registry + `buildSystemBlocks(ctx)`.
  - `VertexPort` — `projectId`, `defaultLocation`, `auth: GoogleAuth` (host constructs; package never reads `process.env.GCP_*`), `modelIds`.
  - `LoggerPort` — optional structured logging.
- **Three reference persistence adapters** behind the same `PersistencePort` contract:
  - `createDrizzlePersistence(db)` — drizzle-orm MySQL (canonical table objects shipped from `@firstlovecenter/ai-chat/server/drizzle`).
  - `createPrismaPersistence(prisma)` — Prisma client (schema fragment shipped at `prisma/chat-models.prisma` for hosts to paste).
  - `createMemoryPersistence()` — in-memory adapter for the package's own contract tests (internal-only).
- **Three route factories** returning Next-compatible handlers:
  - `agentCustom` — POST SSE handler (Custom chat protocol).
  - `chatSessions` — list/POST + detail GET/PATCH/DELETE.
  - `adminSettings` — GET/PATCH for `tool_provider`, `gcp_location`, `chat_interface` (validated against registries).
- **Lifecycle hooks** consumed by route factories so hosts can plumb in project-specific concerns without forking:
  - `RouteHooks<S>` (all routes): `onRequest` (before auth — shutdown gating returns 503), `onAuthenticated` (after auth — rate limiting returns 429).
  - `AgentCustomHooks<S>` (SSE route only): adds `generateSessionId` (per-request session id used for `ToolContext.sessionId` and downstream resources such as SQL view names), `onSessionStart` (per-request setup, e.g. `CREATE VIEW`), `onSessionEnd` (always-runs cleanup with `cause: 'complete' | 'error' | 'abort'`).
- **Two registries** the host's settings UI maps over:
  - `toolProviders` — built-in `claude` + `gemini` Vertex AI adapters; extensible via `extraToolProviders`.
  - `chatInterfaces` — `custom` (this package's UI) + `vercel` (placeholder for the Vercel AI SDK build, landing in v0.x).
- **Custom chat UI** — `AiChat` and `AnswerBlocks` exported from `@firstlovecenter/ai-chat/ui`. Logic lifted byte-for-byte from the original FLC implementation; only host-coupled imports were rewritten.
- **MIT license**, public `publishConfig`, and peer deps marked optional for `drizzle-orm` and `@prisma/client` so a host installing only one ORM never pulls the other.
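  To make the lifecycle-hook contract concrete, here is a hedged sketch of host-side hooks for the shutdown-gating (503) and rate-limiting (429) cases named above. The signatures are assumptions: the sketch assumes hooks may return a `Response` to short-circuit the route or `undefined` to continue, which may differ from the package's real `RouteHooks<S>` types.

  ```typescript
  // Illustrative host-side hooks; `Request`/`Response` are the standard Fetch
  // globals available in Node 18+ and edge runtimes.
  function makeRouteHooks(opts: {
    isShuttingDown: () => boolean;
    isOverLimit: (userId: string) => boolean;
  }) {
    return {
      // Runs before auth — shutdown gating returns 503.
      onRequest: (_req: Request): Response | undefined =>
        opts.isShuttingDown()
          ? new Response('Server is shutting down', { status: 503 })
          : undefined,
      // Runs after auth — rate limiting returns 429.
      onAuthenticated: (_req: Request, user: { id: string }): Response | undefined =>
        opts.isOverLimit(user.id)
          ? new Response('Too many requests', { status: 429 })
          : undefined,
    };
  }
  ```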

### Subpath exports

- `@firstlovecenter/ai-chat/server` — runtime factory, agent loop, ports, registries, types.
- `@firstlovecenter/ai-chat/server/drizzle` — drizzle adapter + canonical table objects.
- `@firstlovecenter/ai-chat/server/prisma` — prisma adapter + schema fragment string.
- `@firstlovecenter/ai-chat/ui` — `AiChat`, `AnswerBlocks`, registries.

### Build

- Dual ESM + CJS output via `tsup` with full `.d.ts` (and `.d.cts`) generation.
- UI bundle gets a post-build `'use client';` directive injection (tsup otherwise strips module-level directives during bundling, breaking RSC consumers that import the UI from a server file).
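  The injection itself can be as small as a prepend that is safe to run twice. This is a sketch of the idea, not the package's actual post-build script:

  ```typescript
  // Prepend the directive tsup strips during bundling; idempotent so re-running
  // the post-build step never stacks duplicate directives.
  function injectUseClient(source: string): string {
    const directive = "'use client';";
    return source.startsWith(directive) ? source : `${directive}\n${source}`;
  }
  ```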

[0.2.1]: https://github.com/firstlovecenter/flc-ai-chat/compare/v0.2.0...v0.2.1
[0.2.0]: https://github.com/firstlovecenter/flc-ai-chat/compare/v0.1.1...v0.2.0
[0.1.1]: https://github.com/firstlovecenter/flc-ai-chat/compare/v0.1.0...v0.1.1
[0.1.0]: https://github.com/firstlovecenter/flc-ai-chat/releases/tag/v0.1.0
package/dist/ui/index.cjs
CHANGED
@@ -940,22 +940,16 @@ function AiChat({
 ] }) : heroVisible ? /* @__PURE__ */ jsxRuntime.jsxs("div", { className: "flex h-full flex-col items-center justify-center text-center", children: [
 /* @__PURE__ */ jsxRuntime.jsx("h1", { className: "text-2xl font-medium tracking-tight text-foreground sm:text-3xl", children: greeting }),
 /* @__PURE__ */ jsxRuntime.jsx("p", { className: "mt-2 text-2xl font-light tracking-tight text-muted-foreground sm:text-3xl", children: "What's on your mind?" })
-] }) : /* @__PURE__ */ jsxRuntime.
-
-
-
-
-
-
-
-
-)),
-pending && /* @__PURE__ */ jsxRuntime.jsxs("p", { className: "text-sm text-muted-foreground", children: [
-/* @__PURE__ */ jsxRuntime.jsx(lucideReact.Loader2, { className: "mr-1 inline size-3.5 animate-spin" }),
-"Thinking\u2026"
-] })
-] })
+] }) : /* @__PURE__ */ jsxRuntime.jsx("div", { className: "mx-auto flex w-full max-w-3xl flex-col gap-6", children: answers.map((a, idx) => /* @__PURE__ */ jsxRuntime.jsx(
+AnswerView,
+{
+answer: a,
+onRetry: () => void submit(a.question),
+forwardRef: idx === answers.length - 1 ? lastAnswerRef : void 0,
+isLast: idx === answers.length - 1
+},
+idx
+)) })
 }
 ),
 /* @__PURE__ */ jsxRuntime.jsx("div", { className: "shrink-0 px-4 pb-4 pt-2", children: /* @__PURE__ */ jsxRuntime.jsxs(
@@ -1096,7 +1090,10 @@ function AnswerView({
 /* @__PURE__ */ jsxRuntime.jsx("div", { className: "flex justify-end", children: /* @__PURE__ */ jsxRuntime.jsx(UserChip, { text: answer.question }) }),
 isThinking ? /* @__PURE__ */ jsxRuntime.jsxs("div", { className: "flex items-center gap-3", children: [
 /* @__PURE__ */ jsxRuntime.jsx("div", { className: "flex size-7 shrink-0 items-center justify-center rounded-full bg-primary/10 text-primary", children: /* @__PURE__ */ jsxRuntime.jsx(lucideReact.Sparkles, { className: "size-4" }) }),
-/* @__PURE__ */ jsxRuntime.
+/* @__PURE__ */ jsxRuntime.jsxs("p", { className: "flex items-center text-sm text-muted-foreground", children: [
+/* @__PURE__ */ jsxRuntime.jsx(lucideReact.Loader2, { className: "mr-1 inline size-3.5 animate-spin" }),
+"Thinking\u2026"
+] })
 ] }) : /* @__PURE__ */ jsxRuntime.jsxs("div", { className: "flex items-start gap-3", children: [
 /* @__PURE__ */ jsxRuntime.jsx("div", { className: "mt-0.5 flex size-7 shrink-0 items-center justify-center rounded-full bg-primary/10 text-primary", children: /* @__PURE__ */ jsxRuntime.jsx(lucideReact.Sparkles, { className: "size-4" }) }),
 /* @__PURE__ */ jsxRuntime.jsxs("div", { className: "flex min-w-0 flex-1 flex-col gap-3", children: [