@ainyc/canonry 1.0.0 → 1.0.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (2)
  1. package/README.md +223 -0
  2. package/package.json +7 -6
package/README.md ADDED
@@ -0,0 +1,223 @@
# Canonry

[![npm version](https://img.shields.io/npm/v/@ainyc/canonry)](https://www.npmjs.com/package/@ainyc/canonry) [![License: AGPL-3.0](https://img.shields.io/badge/License-AGPL--3.0-blue.svg)](https://www.gnu.org/licenses/agpl-3.0) [![Node.js >= 20](https://img.shields.io/badge/node-%3E%3D20-brightgreen)](https://nodejs.org)

**Open-source AEO monitoring for your domain.** Canonry tracks how AI answer engines (ChatGPT, Gemini, Claude, and others) cite or omit your website for the keywords you care about.

AEO (Answer Engine Optimization) is the practice of ensuring your content is accurately represented in AI-generated answers. As search shifts from links to synthesized responses, monitoring your visibility across answer engines is essential.

## Quick Start

```bash
npm install -g @ainyc/canonry
canonry init
canonry serve
```

Open [http://localhost:4100](http://localhost:4100) to access the web dashboard.

## Features

- **Multi-provider monitoring** -- query Gemini, OpenAI, Claude, and local LLMs (Ollama, LM Studio, or any OpenAI-compatible endpoint) from a single tool.
- **Three equal surfaces** -- CLI, REST API, and web dashboard, all backed by the same underlying API. No surface is privileged.
- **Config-as-code** -- manage projects with Kubernetes-style YAML files. Version control your monitoring setup.
- **Self-hosted** -- runs locally with SQLite. No cloud account, no external dependencies beyond the LLM API keys you choose to configure.
- **Scheduled monitoring** -- set up cron-based recurring runs to track citation changes over time.
- **Webhook notifications** -- get alerted when your citation status changes.
- **Audit logging** -- full history of every action taken through any surface.

## CLI Reference

### Setup

```bash
canonry init     # Initialize config and database
canonry serve    # Start server (API + web dashboard)
canonry settings # View/edit configuration
```

### Projects

```bash
canonry project create <name> --domain <domain> --country US --language en
canonry project list
canonry project show <name>
canonry project delete <name>
```

### Keywords and Competitors

```bash
canonry keyword add <project> "keyword one" "keyword two"
canonry keyword list <project>
canonry keyword import <project> <file.csv>

canonry competitor add <project> competitor1.com competitor2.com
canonry competitor list <project>
```

### Visibility Runs

```bash
canonry run <project>                   # Run all configured providers
canonry run <project> --provider gemini # Run a single provider
canonry runs <project>                  # List past runs
canonry status <project>                # Current visibility summary
canonry evidence <project>              # View citation evidence
canonry history <project>               # Per-keyword citation timeline
canonry export <project>                # Export project as YAML
```

### Config-as-Code

```bash
canonry apply canonry.yaml # Declarative project apply
```

### Scheduling and Notifications

```bash
canonry schedule set <project> --cron "0 8 * * *" # Daily at 08:00
canonry schedule show <project>
canonry schedule enable <project>
canonry schedule disable <project>
canonry schedule remove <project>

canonry notify add <project> --url https://hooks.slack.com/...
canonry notify list <project>
canonry notify remove <project> <id>
canonry notify test <project> <id>
```
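For local testing of `canonry notify`, you can point it at a small receiver of your own. The notification payload shape is not documented in this README, so the handler below treats the body as opaque JSON; the field names in the example call are purely hypothetical.

```typescript
// Sketch of a handler you could put behind the URL given to `canonry notify add`.
// The payload shape is an assumption -- Canonry's actual notification fields
// are not documented here, so the body is treated as opaque JSON.
function describePayload(rawBody: string): string {
  const payload = JSON.parse(rawBody) as Record<string, unknown>;
  // Report the top-level keys that arrived, sorted for stable output.
  return Object.keys(payload).sort().join(", ");
}

// Example with a hypothetical payload (field names are illustrative only):
const keys = describePayload('{"project":"my-project","event":"citation_change"}');
console.log(keys); // event, project
```

Wire `describePayload` into any HTTP framework's POST handler to log what your deployment actually sends before building real alerting logic on top of it.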
## Config-as-Code

Define your monitoring projects in version-controlled YAML files:

```yaml
apiVersion: canonry/v1
kind: Project
metadata:
  name: my-project
spec:
  displayName: My Project
  canonicalDomain: example.com
  country: US
  language: en
  keywords:
    - best dental implants near me
    - emergency dentist open now
  competitors:
    - competitor.com
  providers:
    - gemini
    - openai
    - claude
    - local
```
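For reference, the manifest above maps to the following TypeScript shape. This is a sketch inferred from the example YAML, not the project's actual contract types (those live in `@ainyc/canonry-contracts` and may differ):

```typescript
// Shape of a Project manifest, inferred from the YAML example above.
// The real types ship in @ainyc/canonry-contracts and may differ.
interface ProjectManifest {
  apiVersion: "canonry/v1";
  kind: "Project";
  metadata: { name: string };
  spec: {
    displayName: string;
    canonicalDomain: string;
    country: string;
    language: string;
    keywords: string[];
    competitors: string[];
    providers: ("gemini" | "openai" | "claude" | "local")[];
  };
}

// The example manifest, expressed as a typed object:
const manifest: ProjectManifest = {
  apiVersion: "canonry/v1",
  kind: "Project",
  metadata: { name: "my-project" },
  spec: {
    displayName: "My Project",
    canonicalDomain: "example.com",
    country: "US",
    language: "en",
    keywords: ["best dental implants near me", "emergency dentist open now"],
    competitors: ["competitor.com"],
    providers: ["gemini", "openai", "claude", "local"],
  },
};
console.log(manifest.spec.keywords.length); // 2
```

A typed wrapper like this is useful if you generate manifests programmatically before feeding them to `canonry apply`.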
Apply with the CLI or the API:

```bash
canonry apply canonry.yaml
```

```bash
curl -X POST http://localhost:4100/api/v1/apply \
  -H "Authorization: Bearer cnry_..." \
  -H "Content-Type: application/yaml" \
  --data-binary @canonry.yaml
```

The database is authoritative. Config files are input, not state.

## Provider Setup

Canonry queries multiple AI answer engines. Configure the providers you want during `canonry init`, or add them later via the settings page or API.

### Gemini

Get an API key from [Google AI Studio](https://aistudio.google.com/apikey).

### OpenAI

Get an API key from [platform.openai.com](https://platform.openai.com/api-keys).

### Claude

Get an API key from [console.anthropic.com](https://console.anthropic.com/settings/keys).

### Local LLMs

Any OpenAI-compatible endpoint works -- Ollama, LM Studio, llama.cpp, vLLM, and similar tools. Configure via CLI or API:

```bash
canonry settings provider local --base-url http://localhost:11434/v1
```

The base URL is the only required field; an API key is optional (most local servers don't need one). You can also set a specific model:

```bash
canonry settings provider local --base-url http://localhost:11434/v1 --model llama3
```

> **Note:** Unless your local model has web search capabilities, responses will be based solely on its training data. Cloud providers (Gemini, OpenAI, Claude) use live web search to ground their answers, which produces more accurate citation results. Local LLMs are best used for comparing how different models perceive your brand without real-time search context.

## API

All endpoints are served under `/api/v1/`. Authenticate with a bearer token:

```
Authorization: Bearer cnry_...
```

Key endpoints:

| Method | Path | Description |
|--------|------|-------------|
| `PUT` | `/api/v1/projects/{name}` | Create or update a project |
| `POST` | `/api/v1/projects/{name}/runs` | Trigger a visibility sweep |
| `GET` | `/api/v1/projects/{name}/timeline` | Per-keyword citation history |
| `GET` | `/api/v1/projects/{name}/snapshots/diff` | Compare two runs |
| `POST` | `/api/v1/apply` | Config-as-code apply |
| `GET` | `/api/v1/openapi.json` | OpenAPI spec (no auth required) |
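As a sketch of scripting against the API, the following builds the request that triggers a visibility sweep (the `POST /api/v1/projects/{name}/runs` endpoint from the table above). The base URL and token are placeholders for your own deployment, and URL-encoding the project name is a defensive assumption, not documented behavior:

```typescript
// Build the request that triggers a visibility sweep for a project,
// using the `POST /api/v1/projects/{name}/runs` endpoint from the table above.
const BASE_URL = "http://localhost:4100/api/v1"; // placeholder deployment URL
const TOKEN = "cnry_example"; // placeholder, not a real token

function runSweepRequest(project: string): {
  url: string;
  method: string;
  headers: Record<string, string>;
} {
  return {
    // Encode the name defensively in case it contains special characters.
    url: `${BASE_URL}/projects/${encodeURIComponent(project)}/runs`,
    method: "POST",
    headers: { Authorization: `Bearer ${TOKEN}` },
  };
}

const req = runSweepRequest("my-project");
console.log(`${req.method} ${req.url}`);
// POST http://localhost:4100/api/v1/projects/my-project/runs

// To actually send it (requires a running `canonry serve`):
//   const res = await fetch(req.url, { method: req.method, headers: req.headers });
```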
## Web Dashboard

The bundled web dashboard provides five views:

- **Overview** -- portfolio-level visibility scores across all projects with sparkline trends.
- **Project** -- command center with score gauges, keyword evidence tables, and competitor analysis.
- **Runs** -- history of all visibility sweeps with per-provider breakdowns.
- **Settings** -- provider configuration, scheduling, and notification management.
- **Setup** -- guided wizard for first-time onboarding.

Access it at [http://localhost:4100](http://localhost:4100) after running `canonry serve`.

## Requirements

- Node.js >= 20
- At least one provider API key (or a local LLM endpoint)

## Development

```bash
git clone https://github.com/ainyc/canonry.git
cd canonry
pnpm install
pnpm run typecheck
pnpm run test
pnpm run lint
pnpm run dev:web # Run SPA in dev mode
```

## Contributing

Contributions are welcome. See [CONTRIBUTING.md](./CONTRIBUTING.md) for setup instructions.

## License

[AGPL-3.0-only](./LICENSE)

---

Built by [AI NYC](https://ainyc.ai)
package/package.json CHANGED
@@ -1,6 +1,6 @@
 {
   "name": "@ainyc/canonry",
-  "version": "1.0.0",
+  "version": "1.0.1",
   "type": "module",
   "license": "AGPL-3.0-only",
   "bin": {
@@ -17,7 +17,8 @@
     "bin/",
     "dist/",
     "assets/",
-    "src/"
+    "src/",
+    "README.md"
   ],
   "engines": {
     "node": ">=20"
@@ -42,11 +43,11 @@
     "tsx": "^4.19.0",
     "@ainyc/canonry-api-routes": "0.0.0",
     "@ainyc/canonry-contracts": "0.0.0",
-    "@ainyc/canonry-provider-claude": "0.0.0",
-    "@ainyc/canonry-provider-gemini": "0.0.0",
     "@ainyc/canonry-db": "0.0.0",
-    "@ainyc/canonry-provider-local": "0.0.0",
-    "@ainyc/canonry-provider-openai": "0.0.0"
+    "@ainyc/canonry-provider-gemini": "0.0.0",
+    "@ainyc/canonry-provider-claude": "0.0.0",
+    "@ainyc/canonry-provider-openai": "0.0.0",
+    "@ainyc/canonry-provider-local": "0.0.0"
   },
   "scripts": {
     "build": "tsup && tsx build-web.ts",