@ainyc/canonry 1.48.2 → 1.48.4

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -2,7 +2,7 @@
  
  [![npm version](https://img.shields.io/npm/v/@ainyc/canonry)](https://www.npmjs.com/package/@ainyc/canonry) [![License: FSL-1.1-ALv2](https://img.shields.io/badge/License-FSL--1.1--ALv2-blue.svg)](https://fsl.software/) [![Node.js >= 22.14](https://img.shields.io/badge/node-%3E%3D22.14-brightgreen)](https://nodejs.org)
  
- Canonry is an agent-first AEO platform powered by [OpenClaw](https://openclaw.ai). It tracks how ChatGPT, Gemini, Claude, and Perplexity cite your site, detects regressions, diagnoses causes, coordinates fixes, and reports results.
+ Canonry is an agent-first AEO platform, CLI- and API-native, with a bundled AI agent. It tracks how ChatGPT, Gemini, Claude, and Perplexity cite your site, detects regressions, diagnoses causes, coordinates fixes, and reports results.
  
  AEO (Answer Engine Optimization) is about making sure your content shows up accurately in AI-generated answers. As search shifts from links to synthesized responses, you need something that can monitor, analyze, and act across these engines continuously.
  
@@ -15,7 +15,7 @@ npm install -g @ainyc/canonry
  canonry agent setup
  ```
  
- One command. It installs [OpenClaw](https://openclaw.ai), configures the agent's LLM, sets up monitoring providers, and seeds the workspace. Interactive prompts guide you through everything, or pass flags for fully automated setup:
+ One command. It installs the agent runtime, configures the agent's LLM, sets up monitoring providers, and seeds the workspace. Interactive prompts guide you through everything, or pass flags for fully automated setup:
  
  ```bash
  canonry agent setup --gemini-key <key> --agent-key <key> --format json
@@ -42,7 +42,7 @@ canonry serve
  
  ## What the Agent Does
  
- The Canonry agent ("Aero") is an [OpenClaw](https://openclaw.ai)-powered operator:
+ The Canonry agent ("Aero") is an autonomous operator:
  
  - **Monitors** visibility sweeps across providers on schedule, tracking citation changes over time
  - **Analyzes** regressions, emerging opportunities, and correlates visibility shifts with site changes
@@ -53,7 +53,7 @@ Every action the agent takes goes through the same CLI and API available to ever
  
  ## Features
  
- - **Agent-operated.** The OpenClaw agent monitors, analyzes, and acts autonomously. Humans supervise via the dashboard.
+ - **Agent-operated.** The bundled agent monitors, analyzes, and acts autonomously. Humans supervise via the dashboard.
  - **Multi-provider.** Query Gemini, OpenAI, Claude, Perplexity, and local LLMs from a single platform.
  - **Config-as-code.** Kubernetes-style YAML files. Version control your monitoring, let agents apply changes declaratively.
  - **Self-hosted.** Runs locally with SQLite. No cloud account required.
@@ -148,9 +148,9 @@ Integration setup guides: [Google Search Console](docs/google-search-console-set
  
  ## Skills
  
- The agent learns how to operate canonry through bundled [OpenClaw skills](https://clawhub.dev) that cover CLI commands, provider setup, analysis workflows, and troubleshooting. Skills are seeded into the agent workspace during `canonry agent setup`.
+ The agent learns how to operate canonry through bundled skills that cover CLI commands, provider setup, analysis workflows, and troubleshooting. Skills are seeded into the agent workspace during `canonry agent setup`.
  
- **Claude Code** also picks up the skill automatically from `.claude/skills/canonry-setup/` when you open this repo. **ClawHub** hosts the same skill at [clawhub.dev](https://clawhub.dev) for any MCP-equipped agent.
+ **Claude Code** also picks up the skill automatically from `.claude/skills/canonry-setup/` when you open this repo.
  
  ## Deployment
  
@@ -2,7 +2,7 @@
  
  ## Per-Client State Template
  
- Store in OpenClaw agent memory after each significant event:
+ Store in agent memory after each significant event:
  
  ```
  Client: <business name>
@@ -3,7 +3,7 @@ name: canonry
  description: "Agent-first AEO monitoring and operating platform."
  metadata:
  {
- "openclaw":
+ "agent":
  {
  "emoji": "📡",
  "requires": { "bins": ["canonry"] },
@@ -296,10 +296,10 @@ canonry export <project> --include-results > project.yaml
  canonry sitemap inspect <project>
  ```
  
- ## Agent (OpenClaw Integration)
+ ## Agent
  
  `canonry agent setup` is the single entry point for configuring the agent. It handles everything:
- canonry initialization, OpenClaw installation, profile setup, LLM credential configuration,
+ canonry initialization, agent runtime installation, profile setup, LLM credential configuration,
  and workspace seeding. If canonry is not yet configured, it runs the interactive init flow first
  (prompting for monitoring provider keys and agent LLM credentials).
  
@@ -313,7 +313,7 @@ canonry agent setup --agent-provider openrouter --agent-key <key> --agent-model
  GEMINI_API_KEY=<key> canonry agent setup --agent-key <key> --format json
  
  # Lifecycle
- canonry agent start # start OpenClaw gateway as background process
+ canonry agent start # start agent gateway as background process
  canonry agent stop # stop the gateway process
  canonry agent status # check if gateway is running
  canonry agent status --format json # JSON output
@@ -336,9 +336,9 @@ canonry agent detach <project> # remove agent webhook from pro
  canonry agent detach <project> --format json # JSON output
  ```
  
- **Setup flow:** init canonry (if needed) → install OpenClaw (if needed) → configure profile → configure gateway → set agent LLM credentials → seed workspace with skills.
+ **Setup flow:** init canonry (if needed) → install agent runtime (if needed) → configure profile → configure gateway → set agent LLM credentials → seed workspace with skills.
  
- **Agent LLM credentials** are stored in `~/.openclaw-aero/.env` (e.g. `ANTHROPIC_API_KEY=...`) and loaded into the gateway process at start time. The model is set via `openclaw models set`.
+ **Agent LLM credentials** are stored in the agent env file (e.g. `ANTHROPIC_API_KEY=...`) and loaded into the gateway process at start time. The model is set via the agent CLI.
  
  **Re-running is safe:** setup is idempotent — it skips steps that are already configured.
  
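The "re-running is safe" behavior described above can be sketched as a loop that skips already-configured steps. This is a minimal illustration in plain JavaScript; the step names and the `isConfigured`/`run` shapes are hypothetical, not canonry's real internals:

```javascript
// Hypothetical setup steps: each knows how to detect whether it already ran.
const steps = [
  { name: "init", isConfigured: (s) => s.init, run: (s) => { s.init = true; } },
  { name: "runtime", isConfigured: (s) => s.runtime, run: (s) => { s.runtime = true; } },
  { name: "credentials", isConfigured: (s) => s.creds, run: (s) => { s.creds = true; } },
];

function setup(state) {
  for (const step of steps) {
    if (step.isConfigured(state)) continue; // already done: skip, don't redo
    step.run(state);
  }
  return state;
}

const state = { init: true, runtime: false, creds: false };
setup(state); // completes only the missing steps
setup(state); // re-running is a no-op
console.log(state); // { init: true, runtime: true, creds: true }
```

Because each step is guarded by its own "already configured" check, a partially failed run can simply be retried, which is the property the setup flow advertises.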
@@ -6,6 +6,8 @@ import {
  bingUrlInspections,
  competitors,
  createLogger,
+ dropLegacyCredentialColumns,
+ extractLegacyCredentials,
  gaAiReferrals,
  gaSocialReferrals,
  gaTrafficSnapshots,
@@ -23,7 +25,7 @@ import {
  runs,
  schedules,
  usageCounters
- } from "./chunk-JTKHPNGL.js";
+ } from "./chunk-ZZ57GRV6.js";
  
  // src/config.ts
  import fs from "fs";
@@ -338,7 +340,7 @@ import crypto23 from "crypto";
  import fs7 from "fs";
  import path8 from "path";
  import { fileURLToPath as fileURLToPath2 } from "url";
- import { eq as eq24, sql as sql6 } from "drizzle-orm";
+ import { eq as eq24 } from "drizzle-orm";
  import Fastify from "fastify";
  
  // ../contracts/src/config-schema.ts
@@ -15959,68 +15961,46 @@ function serializeSessionCookie(opts) {
  }
  return parts.join("; ");
  }
- function migrateDbCredentialsToConfig(db, config) {
- try {
- const googleColCheck = db.all(sql6.raw(
- `SELECT COUNT(*) as c FROM pragma_table_info('google_connections') WHERE name = 'access_token'`
- ));
- if (googleColCheck[0]?.c) {
- const rows = db.all(sql6.raw(
- `SELECT domain, connection_type, property_id, sitemap_url, access_token, refresh_token, token_expires_at, scopes, created_at, updated_at FROM google_connections WHERE refresh_token IS NOT NULL AND refresh_token != ''`
- ));
- let migrated = 0;
- for (const row of rows) {
- const connType = row.connection_type;
- const existing = getGoogleConnection(config, row.domain, connType);
- if (existing?.refreshToken) continue;
- upsertGoogleConnection(config, {
- domain: row.domain,
- connectionType: connType,
- propertyId: row.property_id ?? null,
- sitemapUrl: row.sitemap_url ?? null,
- accessToken: row.access_token ?? void 0,
- refreshToken: row.refresh_token ?? null,
- tokenExpiresAt: row.token_expires_at ?? null,
- scopes: parseJsonColumn(row.scopes, []),
- createdAt: row.created_at,
- updatedAt: row.updated_at
- });
- migrated++;
- }
- if (migrated > 0) {
- saveConfigPatch({ google: config.google });
- log9.info("credentials.migrated", { type: "google", count: migrated });
- }
- }
- const gaColCheck = db.all(sql6.raw(
- `SELECT COUNT(*) as c FROM pragma_table_info('ga_connections') WHERE name = 'private_key'`
- ));
- if (gaColCheck[0]?.c) {
- const rows = db.all(sql6.raw(
- `SELECT id, project_id, property_id, client_email, private_key, created_at, updated_at FROM ga_connections WHERE private_key IS NOT NULL AND private_key != ''`
- ));
- let migrated = 0;
- for (const row of rows) {
- const project = db.select({ name: projects.name }).from(projects).where(eq24(projects.id, row.project_id)).get();
- if (!project) continue;
- const existing = getGa4Connection(config, project.name);
- if (existing?.privateKey) continue;
- upsertGa4Connection(config, {
- projectName: project.name,
- propertyId: row.property_id,
- clientEmail: row.client_email,
- privateKey: row.private_key,
- createdAt: row.created_at,
- updatedAt: row.updated_at
- });
- migrated++;
- }
- if (migrated > 0) {
- saveConfigPatch({ ga4: config.ga4 });
- log9.info("credentials.migrated", { type: "ga4", count: migrated });
- }
- }
- } catch {
+ function applyLegacyCredentials(rows, config) {
+ let migratedGoogle = 0;
+ for (const row of rows.google) {
+ const existing = getGoogleConnection(config, row.domain, row.connectionType);
+ if (existing?.refreshToken) continue;
+ upsertGoogleConnection(config, {
+ domain: row.domain,
+ connectionType: row.connectionType,
+ propertyId: row.propertyId,
+ sitemapUrl: row.sitemapUrl,
+ accessToken: row.accessToken ?? void 0,
+ refreshToken: row.refreshToken,
+ tokenExpiresAt: row.tokenExpiresAt,
+ scopes: row.scopes,
+ createdAt: row.createdAt,
+ updatedAt: row.updatedAt
+ });
+ migratedGoogle++;
+ }
+ if (migratedGoogle > 0) {
+ saveConfigPatch({ google: config.google });
+ log9.info("credentials.migrated", { type: "google", count: migratedGoogle });
+ }
+ let migratedGa4 = 0;
+ for (const row of rows.ga4) {
+ const existing = getGa4Connection(config, row.projectName);
+ if (existing?.privateKey) continue;
+ upsertGa4Connection(config, {
+ projectName: row.projectName,
+ propertyId: row.propertyId,
+ clientEmail: row.clientEmail,
+ privateKey: row.privateKey,
+ createdAt: row.createdAt,
+ updatedAt: row.updatedAt
+ });
+ migratedGa4++;
+ }
+ if (migratedGa4 > 0) {
+ saveConfigPatch({ ga4: config.ga4 });
+ log9.info("credentials.migrated", { type: "ga4", count: migratedGa4 });
  }
  }
  async function createServer(opts) {
@@ -16047,7 +16027,15 @@ async function createServer(opts) {
  quota: opts.config.geminiQuota
  };
  }
- migrateDbCredentialsToConfig(opts.db, opts.config);
+ try {
+ const legacyRows = extractLegacyCredentials(opts.db);
+ applyLegacyCredentials(legacyRows, opts.config);
+ dropLegacyCredentialColumns(opts.db);
+ } catch (err) {
+ log9.warn("credentials.migration.failed", {
+ error: err instanceof Error ? err.message : String(err)
+ });
+ }
  log9.info("providers.configured", { providers: Object.keys(providers).filter((k) => {
  const p = providers[k];
  return p?.apiKey || p?.baseUrl || p?.vertexProject;
@@ -854,6 +854,12 @@ var MIGRATIONS = [
  `CREATE INDEX IF NOT EXISTS idx_snapshots_created_at ON query_snapshots(created_at)`
  // v36: Transaction handling and SQL injection review: verified all strings use SQLite ? binding via Drizzle.
  // No changes required for parameterization.
+ // v37: The legacy credential columns (private_key on ga_connections; access_token,
+ // refresh_token, token_expires_at on google_connections) are removed by the
+ // extractLegacyCredentials / dropLegacyCredentialColumns pair below. Callers
+ // read the rows, persist them to config.yaml, and only then drop the columns
+ // so a failed config write doesn't permanently lose credentials. Keeping the
+ // DROPs as raw SQL here would race with that read.
  ];
  function isDuplicateColumnError(err) {
  if (!(err instanceof Error)) return false;
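The v37 note above describes an extract → persist → drop ordering: legacy rows are read while the columns still exist, written to config, and only then are the columns dropped, so a failed config write cannot destroy credentials. A minimal sketch of that ordering in plain JavaScript, using hypothetical in-memory stand-ins for the database and config (not canonry's real APIs):

```javascript
// Hypothetical stand-ins: db.rows plays the legacy table, config the YAML file.
function migrateCredentials(db, config, saveConfig) {
  // 1. Extract: read legacy rows while the column still exists.
  const legacy = db.rows.filter((r) => r.refreshToken);
  // 2. Persist: copy into config and save. A throw here aborts the whole
  //    migration BEFORE any destructive step, so nothing is lost.
  for (const row of legacy) config.google.push(row);
  saveConfig(config);
  // 3. Drop: only after a successful save is the legacy column removed.
  db.rows = db.rows.map(({ refreshToken, ...rest }) => rest);
}

const db = { rows: [{ domain: "example.com", refreshToken: "tok" }] };
const config = { google: [] };
migrateCredentials(db, config, () => {}); // saveConfig succeeds here
console.log(config.google.length); // 1 credential migrated
console.log("refreshToken" in db.rows[0]); // false: dropped only after persist
```

If the order were reversed (drop first, persist second), a crash between the two steps would leave the credentials in neither store, which is exactly the race the migration comment warns about.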
@@ -861,6 +867,83 @@ function isDuplicateColumnError(err) {
  if (err.cause instanceof Error && err.cause.message.includes("duplicate column name")) return true;
  return false;
  }
+ function columnExists(db, table, column) {
+ const rows = db.all(sql.raw(
+ `SELECT COUNT(*) as c FROM pragma_table_info('${table}') WHERE name = '${column}'`
+ ));
+ return (rows[0]?.c ?? 0) > 0;
+ }
+ function dropColumnIfExists(db, table, column) {
+ try {
+ db.run(sql.raw(`ALTER TABLE ${table} DROP COLUMN ${column}`));
+ } catch (err) {
+ if (!(err instanceof Error)) throw err;
+ const msg = err.message;
+ const causeMsg = err.cause instanceof Error ? err.cause.message : "";
+ const expected = `no such column: "${column}"`;
+ const expectedAlt = `no such column: ${column}`;
+ if (msg.includes(expected) || msg.includes(expectedAlt)) return;
+ if (causeMsg.includes(expected) || causeMsg.includes(expectedAlt)) return;
+ throw err;
+ }
+ }
+ function extractLegacyCredentials(db) {
+ const out = { google: [], ga4: [] };
+ if (columnExists(db, "google_connections", "access_token")) {
+ const rows = db.all(sql.raw(
+ `SELECT domain, connection_type, property_id, sitemap_url, access_token, refresh_token, token_expires_at, scopes, created_at, updated_at
+ FROM google_connections
+ WHERE refresh_token IS NOT NULL AND refresh_token != ''`
+ ));
+ for (const row of rows) {
+ out.google.push({
+ domain: row.domain,
+ connectionType: row.connection_type,
+ propertyId: row.property_id,
+ sitemapUrl: row.sitemap_url,
+ accessToken: row.access_token,
+ refreshToken: row.refresh_token,
+ tokenExpiresAt: row.token_expires_at,
+ scopes: parseJsonColumn(row.scopes, []),
+ createdAt: row.created_at,
+ updatedAt: row.updated_at
+ });
+ }
+ }
+ if (columnExists(db, "ga_connections", "private_key")) {
+ const rows = db.all(sql.raw(
+ `SELECT p.name AS project_name, ga.property_id, ga.client_email, ga.private_key, ga.created_at, ga.updated_at
+ FROM ga_connections ga
+ INNER JOIN projects p ON p.id = ga.project_id
+ WHERE ga.private_key IS NOT NULL AND ga.private_key != ''`
+ ));
+ for (const row of rows) {
+ out.ga4.push({
+ projectName: row.project_name,
+ propertyId: row.property_id,
+ clientEmail: row.client_email,
+ privateKey: row.private_key,
+ createdAt: row.created_at,
+ updatedAt: row.updated_at
+ });
+ }
+ }
+ return out;
+ }
+ function dropLegacyCredentialColumns(db) {
+ if (columnExists(db, "google_connections", "access_token")) {
+ dropColumnIfExists(db, "google_connections", "access_token");
+ }
+ if (columnExists(db, "google_connections", "refresh_token")) {
+ dropColumnIfExists(db, "google_connections", "refresh_token");
+ }
+ if (columnExists(db, "google_connections", "token_expires_at")) {
+ dropColumnIfExists(db, "google_connections", "token_expires_at");
+ }
+ if (columnExists(db, "ga_connections", "private_key")) {
+ dropColumnIfExists(db, "ga_connections", "private_key");
+ }
+ }
  function migrate(db) {
  const statements = MIGRATION_SQL.split(";").map((s) => s.trim()).filter((s) => s.length > 0);
  for (const statement of statements) {
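The `dropColumnIfExists` helper added above swallows only SQLite's two "no such column" message variants and rethrows everything else. The error-matching logic in isolation, as a sketch with a hypothetical `runAlter` callback standing in for the real `(db, table, column)` SQLite call:

```javascript
// Sketch of the "ignore missing column, propagate real failures" pattern.
// runAlter is a hypothetical stand-in for the ALTER TABLE ... DROP COLUMN call.
function dropColumnIfExists(runAlter, column) {
  try {
    runAlter(column);
  } catch (err) {
    if (!(err instanceof Error)) throw err;
    // SQLite reports either `no such column: "x"` or `no such column: x`.
    if (
      err.message.includes(`no such column: "${column}"`) ||
      err.message.includes(`no such column: ${column}`)
    ) return; // column already gone: the drop is a no-op
    throw err; // anything else (I/O error, locked db, ...) is a real failure
  }
}

// Missing-column errors are swallowed, so re-running a drop is safe:
dropColumnIfExists(() => {
  throw new Error("no such column: access_token");
}, "access_token");

// Unrelated errors still propagate to the caller:
try {
  dropColumnIfExists(() => { throw new Error("disk I/O error"); }, "access_token");
} catch (e) {
  console.log(e.message);
}
```

This is what makes `dropLegacyCredentialColumns` idempotent: a second run against an already-migrated database hits only the tolerated error class.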
@@ -1308,6 +1391,8 @@ export {
  healthSnapshots,
  createClient,
  parseJsonColumn,
+ extractLegacyCredentials,
+ dropLegacyCredentialColumns,
  migrate,
  createLogger,
  IntelligenceService
package/dist/cli.js CHANGED
@@ -47,7 +47,7 @@ import {
  trackEvent,
  usageError,
  writeAgentEnv
- } from "./chunk-YPTVJRJY.js";
+ } from "./chunk-IPOVH342.js";
  import {
  apiKeys,
  competitors,
@@ -57,7 +57,7 @@ import {
  projects,
  querySnapshots,
  runs
- } from "./chunk-JTKHPNGL.js";
+ } from "./chunk-ZZ57GRV6.js";
  
  // src/cli.ts
  import { pathToFileURL } from "url";
@@ -304,7 +304,7 @@ async function backfillAnswerVisibilityCommand(opts) {
  console.log(` Errors: ${providerErrors}`);
  }
  async function backfillInsightsCommand(project, opts) {
- const { IntelligenceService } = await import("./intelligence-service-Q4WX46MJ.js");
+ const { IntelligenceService } = await import("./intelligence-service-MZ7SXEGE.js");
  const config = loadConfig();
  const db = createClient(config.database);
  migrate(db);
package/dist/index.js CHANGED
@@ -1,8 +1,8 @@
  import {
  createServer,
  loadConfig
- } from "./chunk-YPTVJRJY.js";
- import "./chunk-JTKHPNGL.js";
+ } from "./chunk-IPOVH342.js";
+ import "./chunk-ZZ57GRV6.js";
  export {
  createServer,
  loadConfig
@@ -1,6 +1,6 @@
  import {
  IntelligenceService
- } from "./chunk-JTKHPNGL.js";
+ } from "./chunk-ZZ57GRV6.js";
  export {
  IntelligenceService
  };
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "@ainyc/canonry",
- "version": "1.48.2",
+ "version": "1.48.4",
  "type": "module",
  "description": "The ultimate open-source AEO monitoring tool - track how answer engines cite your domain",
  "license": "FSL-1.1-ALv2",
@@ -58,16 +58,16 @@
  "@ainyc/canonry-contracts": "0.0.0",
  "@ainyc/canonry-config": "0.0.0",
  "@ainyc/canonry-db": "0.0.0",
- "@ainyc/canonry-intelligence": "0.0.0",
  "@ainyc/canonry-integration-bing": "0.0.0",
  "@ainyc/canonry-integration-wordpress": "0.0.0",
- "@ainyc/canonry-integration-google": "0.0.0",
  "@ainyc/canonry-provider-cdp": "0.0.0",
+ "@ainyc/canonry-intelligence": "0.0.0",
  "@ainyc/canonry-provider-claude": "0.0.0",
- "@ainyc/canonry-provider-gemini": "0.0.0",
  "@ainyc/canonry-provider-local": "0.0.0",
  "@ainyc/canonry-provider-openai": "0.0.0",
- "@ainyc/canonry-provider-perplexity": "0.0.0"
+ "@ainyc/canonry-integration-google": "0.0.0",
+ "@ainyc/canonry-provider-perplexity": "0.0.0",
+ "@ainyc/canonry-provider-gemini": "0.0.0"
  },
  "scripts": {
  "build": "tsx scripts/copy-agent-assets.ts && tsup && tsx build-web.ts",