pagesight 0.1.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -0,0 +1,40 @@
+ name: CI
+
+ on:
+   push:
+     branches: [main]
+   pull_request:
+     branches: [main]
+
+ jobs:
+   check:
+     runs-on: ubuntu-latest
+     steps:
+       - uses: actions/checkout@v4
+       - uses: oven-sh/setup-bun@v2
+       - run: bun install --frozen-lockfile
+       - run: bunx biome check src/
+       - run: bunx tsc --noEmit
+
+   release:
+     needs: check
+     if: github.ref == 'refs/heads/main' && github.event_name == 'push'
+     runs-on: ubuntu-latest
+     permissions:
+       contents: write
+       issues: write
+       pull-requests: write
+     steps:
+       - uses: actions/checkout@v4
+         with:
+           fetch-depth: 0
+       - uses: actions/setup-node@v4
+         with:
+           node-version: 22
+       - uses: oven-sh/setup-bun@v2
+       - run: bun install --frozen-lockfile
+       - run: npx semantic-release
+         env:
+           GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+           NPM_TOKEN: ${{ secrets.NPM_TOKEN }}
+           ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
@@ -0,0 +1,10 @@
+ {
+   "branches": ["main"],
+   "plugins": [
+     "@semantic-release/commit-analyzer",
+     "semantic-release-ai-notes",
+     "@semantic-release/npm",
+     "@semantic-release/github",
+     "@semantic-release/git"
+   ]
+ }
package/CLAUDE.md ADDED
@@ -0,0 +1,52 @@
+ # Pagesight
+
+ MCP server for SEO, GEO, and web performance analysis. npm package: `pagesight`.
+
+ ## Stack
+
+ - **Runtime**: Bun (not Node.js)
+ - **Language**: TypeScript
+ - **MCP SDK**: `@modelcontextprotocol/sdk`
+ - **Linter**: Biome
+ - **Git hooks**: Lefthook (pre-commit: biome + tsc)
+
+ ## Architecture
+
+ ```
+ src/
+   index.ts         # MCP server entry, registers all tools
+   lib/
+     auth.ts        # OAuth 2.0 + Service Account auth for GSC
+     gsc.ts         # Google Search Console API client
+     psi.ts         # PageSpeed Insights API client
+     crux.ts        # Chrome UX Report API client
+   tools/
+     inspect.ts     # URL Inspection tool
+     pagespeed.ts   # PageSpeed Insights tool
+     crux.ts        # CrUX + CrUX History tools
+     performance.ts # Search Analytics tool
+     sitemaps.ts    # Sites + Sitemaps tool
+     setup.ts       # Auth setup helper
+ ```
+
+ ## APIs Used
+
+ | API | Auth | Env Var |
+ |-----|------|---------|
+ | Google Search Console | OAuth 2.0 / Service Account | `GSC_CLIENT_ID`, `GSC_CLIENT_SECRET`, `GSC_REFRESH_TOKEN` |
+ | PageSpeed Insights v5 | API key (optional) | `GOOGLE_API_KEY` |
+ | Chrome UX Report | API key (required) | `GOOGLE_API_KEY` |
+
+ ## Commands
+
+ - `bun run src/index.ts` — start MCP server
+ - `bun run lint` — biome check
+ - `bun run format` — biome format
+ - `bun test` — run tests
+
+ ## Conventions
+
+ - All tools have try/catch error handling with clean error messages
+ - Use Bun built-in APIs over third-party packages
+ - No HTML parsing or on-page analysis — only authoritative data sources
+ - Every check must be backed by an official API or standard (Google APIs, RFC 9309), not industry conventions
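The try/catch convention above can be sketched as a small wrapper. This is a hypothetical illustration, not the package's actual code: the `safeTool` helper and the result shape (loosely modeled on MCP text results) are assumptions.

```typescript
// Hypothetical sketch of the "try/catch with clean error messages" convention.
// `safeTool` and `ToolResult` are illustrative names, not the package's API.
type ToolResult = { content: { type: "text"; text: string }[]; isError?: boolean };

function safeTool(fn: () => Promise<string>): () => Promise<ToolResult> {
  return async () => {
    try {
      const text = await fn();
      return { content: [{ type: "text", text }] };
    } catch (err) {
      // Surface a clean, single-line message instead of a raw stack trace.
      const message = err instanceof Error ? err.message : String(err);
      return { content: [{ type: "text", text: `Error: ${message}` }], isError: true };
    }
  };
}
```

Wrapping every tool handler this way keeps API failures (quota, auth, network) from surfacing as unhandled rejections.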
package/LICENSE ADDED
@@ -0,0 +1,21 @@
+ MIT License
+
+ Copyright (c) 2026 Caio Pizzol
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in all
+ copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ SOFTWARE.
package/README.md ADDED
@@ -0,0 +1,183 @@
+ # Pagesight
+
+ Google's data + AI crawler intelligence, your AI assistant's hands.
+
+ An open-source MCP server that gives AI assistants direct access to Google Search Console, PageSpeed Insights, Chrome UX Report, and a robots.txt analyzer that audits 139+ AI crawlers. No made-up rules. No invented scores. Just data from authoritative sources.
+
+ Most SEO tools flag "title over 60 characters" and "only one H1 allowed." [Google's own engineers say those rules don't exist.](#why-not-other-seo-tools) Pagesight skips the myths and asks the sources directly.
+
+ ## Tools
+
+ Eight tools. Three Google APIs. 139+ AI bots tracked. One install.
+
+ ### `inspect`
+
+ Ask Google: is this page indexed? What canonical did you choose? Any crawl errors? Structured data issues?
+
+ Returns index status, canonical (yours vs Google's), crawl status, rich results validation, sitemaps, and referring URLs — directly from Google's index.
+
+ ### `pagespeed`
+
+ Run Google Lighthouse on any URL:
+
+ - **Scores**: performance, accessibility, best-practices, seo
+ - **Core Web Vitals (lab)**: FCP, LCP, TBT, CLS, Speed Index, TTI
+ - **CrUX field data**: real Chrome user metrics when available (page + origin)
+ - **Opportunities**: ranked by severity with potential savings
+ - **Strategy**: `mobile` or `desktop`
+ - **Locale**: localized results (e.g., `pt-BR`)
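A call like this maps onto the public PageSpeed Insights v5 `runPagespeed` endpoint. A minimal sketch of the request URL construction; the `psiUrl` helper is hypothetical, not the package's actual API (the endpoint, `strategy`, `category`, `locale`, and `key` parameters are Google's documented ones):

```typescript
// Illustrative builder for a PageSpeed Insights v5 request URL.
// `psiUrl` is a hypothetical helper, not part of the pagesight package.
function psiUrl(opts: { url: string; strategy?: "mobile" | "desktop"; locale?: string; key?: string }): string {
  const u = new URL("https://www.googleapis.com/pagespeedonline/v5/runPagespeed");
  u.searchParams.set("url", opts.url);
  u.searchParams.set("strategy", opts.strategy ?? "mobile");
  // Request the four Lighthouse categories the tool reports.
  for (const c of ["PERFORMANCE", "ACCESSIBILITY", "BEST_PRACTICES", "SEO"]) {
    u.searchParams.append("category", c);
  }
  if (opts.locale) u.searchParams.set("locale", opts.locale);
  if (opts.key) u.searchParams.set("key", opts.key); // API key is optional for PSI
  return u.toString();
}
```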
+
+ ### `crux`
+
+ Real-world Core Web Vitals from Chrome users (28-day rolling window):
+
+ - **Metrics**: LCP, FCP, INP, CLS, TTFB, RTT, navigation types, form factors
+ - **Granularity**: by URL or origin, by device (DESKTOP, PHONE, TABLET)
+ - **Data**: p75 values + histogram distributions (good/needs improvement/poor)
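The `crux` tool wraps the Chrome UX Report API's `records:queryRecord` endpoint, which takes exactly one of `url` or `origin` in a POST body. A sketch with a hypothetical builder function (the endpoint and field names are from the public CrUX API; `cruxRequest` itself is illustrative):

```typescript
// Illustrative request builder for the CrUX API. `cruxRequest` is a
// hypothetical helper, not the pagesight package's actual code.
type CruxQuery = {
  url?: string;     // query a specific page...
  origin?: string;  // ...or a whole origin (exactly one of the two)
  formFactor?: "DESKTOP" | "PHONE" | "TABLET";
  metrics?: string[]; // e.g. ["largest_contentful_paint", "interaction_to_next_paint"]
};

const CRUX_ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord";

function cruxRequest(q: CruxQuery, apiKey: string): { url: string; body: string } {
  if (!q.url === !q.origin) throw new Error("Pass exactly one of url or origin");
  return {
    url: `${CRUX_ENDPOINT}?key=${encodeURIComponent(apiKey)}`,
    body: JSON.stringify(q), // POSTed as application/json
  };
}
```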
+
+ ### `crux_history`
+
+ Core Web Vitals trends over time — up to 40 weekly data points (~10 months):
+
+ - Trend detection (improved/stable/worse) with percentage change
+ - Recent data points table for LCP, INP, CLS
+ - Custom period count (1-40)
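Trend detection over weekly p75 values can be sketched as a pure function. The 5% threshold, the function name, and the comparison of first vs last period are all assumptions for illustration; the tool's actual heuristic may differ.

```typescript
// Hypothetical sketch of improved/stable/worse classification.
// The 5% threshold is illustrative, not the tool's documented behavior.
function classifyTrend(
  first: number, // p75 in the earliest period
  last: number,  // p75 in the most recent period
  thresholdPct = 5,
): { change: number; trend: "improved" | "stable" | "worse" } {
  const change = ((last - first) / first) * 100;
  // For LCP, INP, and CLS, lower is better: a drop in p75 is an improvement.
  if (change <= -thresholdPct) return { change, trend: "improved" };
  if (change >= thresholdPct) return { change, trend: "worse" };
  return { change, trend: "stable" };
}
```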
+
+ ### `performance`
+
+ Google Search Console search analytics with full API coverage:
+
+ - **Dimensions**: `query`, `page`, `country`, `device`, `date`, `searchAppearance`, `hour`
+ - **Search types**: `web`, `image`, `video`, `news`, `discover`, `googleNews`
+ - **Filters**: `equals`, `contains`, `notEquals`, `notContains`, `includingRegex`, `excludingRegex`
+ - **Aggregation**: `auto`, `byPage`, `byProperty`, `byNewsShowcasePanel`
+ - **Data freshness**: `all`, `final`, `hourly_all`
+ - **Pagination**: up to 25,000 rows with offset
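These options map onto the request body of Search Console's `searchanalytics.query` method. A sketch with a hypothetical builder; the endpoint and body field names follow Google's public API, but `searchAnalyticsRequest` itself is illustrative:

```typescript
// Illustrative request builder for the Search Console Search Analytics API.
// `searchAnalyticsRequest` is a hypothetical helper, not the package's code.
type SearchAnalyticsQuery = {
  startDate: string;     // YYYY-MM-DD
  endDate: string;
  dimensions?: string[]; // e.g. ["query", "page"]
  type?: string;         // web | image | video | news | discover | googleNews
  dimensionFilterGroups?: {
    filters: { dimension: string; operator: string; expression: string }[];
  }[];
  aggregationType?: string;
  dataState?: string;
  rowLimit?: number;     // API maximum is 25,000
  startRow?: number;     // offset for pagination
};

function searchAnalyticsRequest(siteUrl: string, q: SearchAnalyticsQuery): { url: string; body: string } {
  // Both sc-domain: properties and URL-prefix properties need escaping in the path.
  const site = encodeURIComponent(siteUrl);
  return {
    url: `https://www.googleapis.com/webmasters/v3/sites/${site}/searchAnalytics/query`,
    body: JSON.stringify(q), // POSTed as application/json with an OAuth bearer token
  };
}
```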
+
+ ### `robots`
+
+ Fetch and analyze any site's robots.txt:
+
+ - **Syntax validation** per [RFC 9309](https://www.rfc-editor.org/rfc/rfc9309)
+ - **AI crawler audit** — checks 139+ bots from the [ai-robots-txt](https://github.com/ai-robots-txt/ai.robots.txt) community registry
+ - **Bot categories**: training scrapers, AI search crawlers, AI assistants, AI agents
+ - **Per-bot status**: blocked or allowed, with the matched rule and group
+ - **Path checking**: is a specific path allowed for a specific user-agent?
+ - **Sitemaps**: lists all sitemaps declared in robots.txt
+
+ ```
+ === robots.txt: https://www.cnn.com ===
+ AI Crawlers: 55 blocked, 84 allowed (of 139 known)
+ Source: github.com/ai-robots-txt/ai.robots.txt
+
+ BLOCKED GPTBot (OpenAI) — GPT model training
+ BLOCKED ClaudeBot (Anthropic) — Claude model training
+ ALLOWED Claude-User (Anthropic) — User-initiated fetching
+ BLOCKED PerplexityBot (Perplexity) — Search indexing
+ ```
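The per-bot verdicts above hinge on RFC 9309 rule precedence: the longest matching rule wins, and on a tie between allow and disallow, allow wins. A minimal sketch of that rule, ignoring `*` and `$` wildcard handling (which a real implementation needs); the types and function are illustrative, not the package's code:

```typescript
// Minimal sketch of RFC 9309 rule precedence (longest match wins, allow wins
// ties). Wildcards are omitted for brevity; this is not the package's parser.
type Rule = { kind: "allow" | "disallow"; path: string };

function isAllowed(path: string, rules: Rule[]): boolean {
  let best: Rule | null = null;
  for (const r of rules) {
    if (!path.startsWith(r.path)) continue; // simple prefix match only
    if (
      best === null ||
      r.path.length > best.path.length ||
      (r.path.length === best.path.length && r.kind === "allow")
    ) {
      best = r;
    }
  }
  return best === null || best.kind === "allow"; // no matching rule means allowed
}
```

So `Disallow: /` plus `Allow: /public/` still permits `/public/page`, because the longer `Allow` rule wins.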
+
+ ### `sitemaps`
+
+ Search Console properties and sitemaps (read-only):
+
+ - `list_sites` — all GSC properties with permission level
+ - `get_site` — details for a specific property
+ - `list_sitemaps` — sitemaps with error/warning counts and content types
+ - `get_sitemap` — full details for a specific sitemap
+
+ ### `setup`
+
+ Check auth status or walk through OAuth interactively.
+
+ ## Setup
+
+ ### 1. Google Cloud project
+
+ 1. Go to [Google Cloud Console](https://console.cloud.google.com/)
+ 2. Create a project (or use existing)
+ 3. Enable three APIs:
+    - **Google Search Console API**
+    - **PageSpeed Insights API**
+    - **Chrome UX Report API**
+ 4. Create **OAuth client ID** (Desktop app) — for Search Console
+ 5. Create **API key** — for PageSpeed and CrUX
+
+ ### 2. Authorize Search Console
+
+ Use the `setup` tool to walk through OAuth, or manually:
+
+ 1. Visit the auth URL with your client ID
+ 2. Authorize access to Search Console
+ 3. Copy the code from the redirect URL
+ 4. Exchange it for a refresh token
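Step 1 of the manual flow can be sketched in code. The endpoint, `response_type=code`, and `access_type=offline` (required to receive a refresh token) are Google's standard OAuth 2.0 values; the readonly Search Console scope, the redirect URI, and the helper name are assumptions for illustration:

```typescript
// Illustrative construction of the Google OAuth 2.0 authorization URL.
// `gscAuthUrl` is a hypothetical helper; the package may use a broader scope.
function gscAuthUrl(clientId: string, redirectUri = "http://localhost"): string {
  const u = new URL("https://accounts.google.com/o/oauth2/v2/auth");
  u.searchParams.set("client_id", clientId);
  u.searchParams.set("redirect_uri", redirectUri);
  u.searchParams.set("response_type", "code");
  u.searchParams.set("scope", "https://www.googleapis.com/auth/webmasters.readonly");
  u.searchParams.set("access_type", "offline"); // needed to get a refresh token
  return u.toString();
}
```

The code returned on the redirect is then POSTed to Google's standard token endpoint (`https://oauth2.googleapis.com/token`) together with the client ID and secret to obtain the refresh token.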
+
+ ### 3. Configure
+
+ ```env
+ GSC_CLIENT_ID=your-client-id.apps.googleusercontent.com
+ GSC_CLIENT_SECRET=your-client-secret
+ GSC_REFRESH_TOKEN=your-refresh-token
+ GOOGLE_API_KEY=your-api-key
+ ```
+
+ Note: The `robots` tool works without any credentials — it fetches the public `/robots.txt` file directly.
+
+ ## Usage
+
+ Add to Claude Code, Cursor, or any MCP client:
+
+ ```json
+ {
+   "mcpServers": {
+     "pagesight": {
+       "command": "bun",
+       "args": ["run", "/path/to/pagesight/src/index.ts"],
+       "env": {
+         "GSC_CLIENT_ID": "your-client-id",
+         "GSC_CLIENT_SECRET": "your-secret",
+         "GSC_REFRESH_TOKEN": "your-token",
+         "GOOGLE_API_KEY": "your-api-key"
+       }
+     }
+   }
+ }
+ ```
+
+ Then just talk to your AI assistant:
+
+ ```
+ "Is https://mysite.com indexed?"
+ "What canonical did Google choose for this page?"
+ "Run pagespeed on my homepage, mobile"
+ "Show me CrUX data for my site on phones"
+ "How have my Core Web Vitals changed over the last 10 months?"
+ "Which queries bring traffic to this page?"
+ "Which AI crawlers can access my site?"
+ "Is GPTBot blocked on reddit.com?"
+ "Any sitemap errors?"
+ ```
+
+ ## Why not other SEO tools?
+
+ We researched every common SEO "rule" against official Google documentation. Most are myths:
+
+ - **"Title must be under 60 characters"** — Google: "there's no limit." Gary Illyes called it "an externally made-up metric."
+ - **"Meta description must be 155 characters"** — Google: "there's no limit on how long a meta description can be."
+ - **"Only one H1 per page"** — John Mueller: "You can use H1 tags as often as you want. There's no limit."
+ - **"Minimum 300 words per page"** — Mueller: "the number of words on a page is not a quality factor, not a ranking factor."
+ - **"Text-to-HTML ratio matters"** — Mueller: "it makes absolutely no sense at all for SEO."
+
+ Tools that flag these "issues" are reporting their opinions, not data. Pagesight only reports what authoritative sources actually return — Google's APIs for search data, RFC 9309 for robots.txt, and a community-maintained registry for AI crawlers.
+
+ ## Development
+
+ ```bash
+ bun install      # install dependencies
+ bun run start    # start MCP server
+ bun run lint     # biome check
+ bun run format   # biome format
+ ```
+
+ ## License
+
+ MIT
package/biome.json ADDED
@@ -0,0 +1,14 @@
+ {
+   "$schema": "https://biomejs.dev/schemas/2.4.10/schema.json",
+   "files": {
+     "includes": ["src/**"]
+   },
+   "formatter": {
+     "indentStyle": "space",
+     "indentWidth": 2,
+     "lineWidth": 120
+   },
+   "linter": {
+     "enabled": true
+   }
+ }
package/brand.md ADDED
@@ -0,0 +1,275 @@
+ ---
+ name: "Pagesight"
+ tagline: "Lint your site for search engines and AI."
+ version: 2
+ language: en
+ ---
+
+ # Pagesight
+
+ ## Strategy
+
+ ### Overview
+
+ Pagesight is an open-source MCP server that gives AI assistants direct access to how search engines and AI crawlers see your site. Google's index status, Core Web Vitals, search performance, structured data validation — plus a robots.txt analyzer that audits 139+ AI crawlers. All from authoritative sources, not made-up rules.
+
+ It was born from a developer maintaining multiple personal projects — tired of juggling SEO tools that flag "title too long" and "only one H1 allowed" when Google itself says those rules don't exist. Pagesight skips the myths and asks the sources directly: is this page indexed? What canonical did Google choose? Is GPTBot blocked or allowed? How fast is it for real Chrome users?
+
+ What it really does: it wraps three Google APIs — Search Console, PageSpeed Insights, and Chrome UX Report — plus an RFC 9309-compliant robots.txt analyzer with a live AI crawler registry into a single MCP server. Your assistant can inspect a URL, check performance, audit which AI bots can access your content, and diagnose search issues without you opening a dashboard.
+
+ The problem it solves: SEO tooling is built for marketers staring at dashboards, not developers shipping code. The MCP ecosystem fragmented it further — 15+ single-purpose servers, each wrapping one API, each with its own auth flow. And most SEO tools report rules that Google has explicitly debunked.
+
+ **Before Pagesight:** Install 5 MCP servers. Configure 5 auth flows. Get flagged for "multiple H1 tags" and "title over 60 characters" — neither of which Google cares about.
+
+ **After Pagesight:** One server. Google APIs + AI crawler intelligence. Only data from authoritative sources.
+
+ Long-term ambition: become the standard way AI assistants understand and optimize web presence — for both traditional search and generative AI.
+
+ ### Positioning
+
+ **Category:** Open-source Google Search intelligence for AI assistants.
+
+ Not a SaaS platform. Not a dashboard. Not a wrapper for a paid API. Not another single-purpose MCP server. Not a tool that invents rules.
+
+ **Competitive landscape:**
+ - **Top layer:** Enterprise SaaS (Semrush, Ahrefs, Moz) — $100-500/mo, built for marketing teams, dashboard-centric, hundreds of made-up "SEO scores"
+ - **Mid layer:** Focused SaaS (Frase, Otterly.AI, Profound) — $25-500/mo, narrower scope, still subscription-based
+ - **Bottom layer:** Open-source MCP servers — free but fragmented. 5+ PageSpeed MCPs, 9+ Search Console MCPs, all silos with partial API coverage. DataForSEO MCP is comprehensive but wraps a paid API.
+
+ Pagesight sits below all of them in cost (free) and above all of them in accuracy (Google's own data, not approximations). It trades dashboards for raw API access that agents can use programmatically.
+
+ **Structural differentials:**
+ - Only MCP server unifying Google Search Console + PageSpeed Insights + Chrome UX Report + AI crawler audit
+ - Reports what authoritative sources report. Doesn't invent rules, scores, or thresholds.
+ - robots.txt analysis per RFC 9309 with 139+ AI crawlers from a live community registry
+ - Full Google API coverage — every read-only method, every parameter, every dimension
+ - Designed for AI assistants, not humans clicking through tabs
+ - Zero vendor lock-in — MIT license, works with Claude, Cursor, Codex, any MCP client
+
+ **Territory:** Pagesight owns the intersection of "developer tooling" and "Google Search data." The tool that makes Google's APIs conversational.
+
+ ### Personality
+
+ **Dominant archetype:** The Architect — systematic, precise, evidence-based. Builds on verified foundations, not assumptions.
+
+ **Attributes the brand transmits:** precise, honest, unified, developer-native, evidence-based, practical
+
+ **What it is:**
+ - Google's data, made accessible to AI assistants
+ - A tool for builders who want facts, not opinions
+ - Evidence over convention
+ - Free as in freedom
+ - One tool that replaces five
+
+ **What it is not:**
+ - Not a tool that invents SEO rules
+ - Not a score generator that makes up numbers
+ - Not trying to be your SEO consultant
+ - Not a platform that wants your monthly payment
+ - Not enterprise software dressed up as open source
+
+ ### Promise
+
+ Your site, seen through Google's eyes.
+
+ One install. Three APIs. No made-up rules.
+
+ If your AI assistant can't ask Google about your site, your SEO tool has failed.
+
+ **Base message:** Pagesight gives AI assistants direct access to what Google knows about your site — index status, performance, Core Web Vitals, search queries — so they can diagnose and fix issues with real data.
+
+ **Synthesizing phrase:** Pagesight exists so developers never have to open an SEO dashboard again.
+
+ ### Guardrails
+
+ **Tone summary:** direct, technical, confident, evidence-based, helpful
+
+ **What the brand cannot be:**
+ - A tool that promises "10x your traffic"
+ - A score generator that assigns arbitrary numbers
+ - A tool that flags "issues" Google says don't matter
+ - A project that treats developers as secondary users
+ - Anything that feels like Salesforce
+
+ **Litmus test:** If Google's own documentation doesn't support the claim, don't make it.
+
+ ---
+
+ ## Voice
+
+ ### Identity
+
+ We don't make up rules. We ask Google. Pagesight is an open-source MCP server that connects AI assistants to Google Search Console, PageSpeed Insights, and Chrome UX Report — the same data Google uses to evaluate your site.
+
+ We are not a marketing platform. We are not consultants. We are not SaaS. We are a developer tool that does one job well: make Google's search data accessible to AI assistants.
+
+ Most SEO tools flag "title over 60 characters" and "only one H1 allowed." Google's own engineers have said these rules are made up. We researched every common SEO rule against official documentation and built a tool that only reports what Google actually tells us.
+
+ **Essence:** Google's data, your AI assistant's hands.
+
+ ### Tagline & Slogans
+
+ **Primary tagline:** Lint your site for search engines and AI.
+ *Use everywhere: README, homepage, social bios, conference talks.*
+
+ **Alternatives:**
+ - Google's search data + AI crawler intelligence for AI assistants.
+ - One MCP server. Three Google APIs. 139+ AI bots tracked.
+ - Ask Google, not SEO tools.
+
+ **Slogans for different contexts:**
+ - README hero: "Google Search Console + PageSpeed + CrUX in one MCP server."
+ - Technical: "Three Google APIs. Eight tools. One install."
+ - Community: "Stop guessing. Ask Google."
+ - Myth-busting: "Title length limits? Google says they don't exist. We checked."
+ - Developer pitch: "Your AI assistant's direct line to Google Search."
+
+ ### Manifesto
+
+ Most SEO tools are built on myths.
+
+ "Title must be under 60 characters." Google: "there's no limit." "Only one H1 per page." John Mueller: "You can use H1 tags as often as you want." "Meta description must be 155 characters." Google: "There's no limit on how long a meta description can be." "Minimum 300 words per page." Mueller: "Word count is not a quality factor."
+
+ We checked every rule. We read the documentation. We watched the office hours. Most of what SEO tools flag has no basis in anything Google has ever said.
+
+ So we built something different.
+
+ Pagesight doesn't make up rules. It asks Google directly. Three APIs — Search Console, PageSpeed Insights, Chrome UX Report — unified into one MCP server that your AI assistant can use.
+
+ Is this page indexed? Ask Google. What canonical did Google choose? Ask Google. How fast is this page for real users? Ask Google. What queries bring traffic? Ask Google.
+
+ No scores we invented. No thresholds we made up. No "best practices" that are actually just blog posts from 2015.
+
+ Just Google's data. In your AI assistant's hands.
+
+ **Pagesight.**
+
+ ### Message Pillars
+
+ **Evidence-Based**
+ Every data point comes from authoritative sources — Google's APIs, RFC 9309, community-maintained bot registries. We don't invent rules, assign scores, or set thresholds. If the source doesn't support it, we don't report it.
+
+ **Unified**
+ Three Google APIs plus AI crawler intelligence in one MCP server. Search Console for indexing and search data. PageSpeed Insights for Lighthouse audits. Chrome UX Report for real-world performance. robots.txt analyzer for AI bot access control. Eight tools. One install.
+
+ **Open**
+ MIT license. Free APIs only. No paid dependencies. No vendor lock-in. Works with Claude, Cursor, Codex, and every MCP-compatible client.
+
+ **Developer-Native**
+ Built with Bun. Configured with env vars. Speaks MCP over stdio. Designed to run inside AI workflows, not beside them. No dashboard. No browser. No login page.
+
+ **Myth-Free**
+ We researched every common SEO rule against Google's official documentation. Title length limits, H1 requirements, word count minimums — all debunked. Pagesight only reports what Google actually cares about.
+
+ ### Phrases
+
+ - "Lint your site for search engines and AI."
+ - "Google's data, your AI assistant's hands."
+ - "One server. Three APIs. 139 bots. No made-up rules."
+ - "Stop guessing. Ask Google."
+ - "Your AI assistant's direct line to Google Search."
+ - "Is GPTBot blocked? Ask Pagesight."
+ - "Title length limits? Google says they don't exist."
+ - "We checked every SEO rule. Most are myths."
+ - "If the source doesn't report it, neither do we."
+
+ ### Social Bios
+
+ **LinkedIn:**
+ Pagesight is an open-source MCP server that gives AI assistants direct access to Google Search Console, PageSpeed Insights, and Chrome UX Report. Inspect indexing status, track Core Web Vitals, analyze search performance — with real data from Google's APIs, not made-up rules. Built with Bun. MIT license.
+
+ **Instagram:**
+ Open-source MCP server for AI assistants
+ Google Search Console + PageSpeed + CrUX
+ No dashboards. No made-up rules. Just Google's data.
+ MIT License | Built with Bun
+
+ **X/Twitter:**
+ Open-source MCP server that gives AI assistants direct access to Google Search Console, PageSpeed Insights, and CrUX. No made-up SEO rules. Just Google's data.
+
+ **GitHub:**
+ Lint your site for search engines and AI. Google Search Console + PageSpeed Insights + Chrome UX Report in one MCP server.
+
+ ### Tonal Rules
+
+ 1. Speak in short, declarative sentences. No hedging.
+ 2. Lead with what the tool does, not what it is.
+ 3. Use developer vocabulary: install, configure, query, inspect — not "solutions," "empower," "leverage."
+ 4. When citing an SEO claim, always reference the source. "Google says X" with a link, not "best practice suggests."
+ 5. Never use marketing superlatives: "revolutionary," "game-changing," "next-generation," "cutting-edge."
+ 6. When debunking a myth, quote the Google engineer directly. Names and dates.
+ 7. Respect the reader's time. If it can be said in one line, don't use three.
+ 8. Use code examples and terminal output over screenshots and diagrams.
+ 9. Acknowledge what we don't cover. "Pagesight doesn't do X" is better than hiding it.
+ 10. Never promise traffic results. Promise access to data.
+ 11. Open source is a fact, not a selling point. Don't over-celebrate it.
+ 12. "Google says" is our strongest argument. Use it.
+
+ **Identity boundaries:**
+ - We are not consultants who leave a report behind.
+ - We are not a platform trying to become your workflow.
+ - We are not a score generator that makes up numbers.
+ - We are not a startup looking for your email address.
+ - We are not building the next Semrush. We are building a bridge to Google's APIs.
+
+ | We Say | We Never Say |
+ |---|---|
+ | "Install Pagesight" | "Sign up for Pagesight" |
+ | "Ask Google" | "Optimize your digital presence" |
+ | "Google reports this page is not indexed" | "Our analysis shows SEO score 73/100" |
+ | "One MCP server, three Google APIs" | "All-in-one platform" |
+ | "CrUX shows p75 LCP of 2.1s" | "Your page speed needs improvement" |
+ | "Open source, MIT license" | "Free forever (with Pro tier)" |
+ | "Built for AI assistants" | "Built for teams of all sizes" |
+ | "Works with any MCP client" | "Seamless integration ecosystem" |
+
+ ---
+
+ ## Visual
+
+ ### Colors
+
+ **Primary:** `#0A0A0A` (near-black)
+ Headlines, body text, primary UI elements. The default state. Clean, authoritative, no decoration.
+
+ **Secondary:** `#18181B` (zinc-900)
+ Card backgrounds, code blocks, secondary surfaces. Provides depth without competing.
+
+ **Accent:** `#4285F4` (Google blue)
+ Links, interactive elements, data highlights. References Google as the data source. Used sparingly.
+
+ **Good:** `#22C55E` (green-500)
+ Pass states, "indexed" indicators, good CrUX ratings. The "everything is fine" color.
+
+ **Warning:** `#EAB308` (yellow-500)
+ Warnings, "needs improvement" CrUX ratings. Attention without alarm.
+
+ **Error:** `#EF4444` (red-500)
+ Errors, "not indexed" states, poor CrUX ratings. Demands action.
+
+ **Muted:** `#71717A` (zinc-500)
+ Secondary text, descriptions, metadata. Stays out of the way.
+
+ **Background:** `#FAFAFA` (zinc-50)
+ Page backgrounds, light mode surfaces. Clean, not sterile.
+
+ **Avoid:** Gradients, purples (too SaaS), any neon or saturated palette that signals "marketing tool." Don't overuse Google blue — it's an accent, not a primary.
+
+ ### Typography
+
+ **Display:** Inter — 700 weight
+ Headlines, hero text, brand name. Clean geometric sans-serif. Same family used by Linear and Vercel.
+
+ **Body:** Inter — 400/500 weight
+ Paragraphs, descriptions, UI text. Highly legible at all sizes.
+
+ **Mono:** JetBrains Mono — 400 weight
+ Code examples, terminal output, tool names, API responses, configuration snippets. This is a developer tool — monospace is a first-class citizen, not an afterthought.
+
+ ### Style
+
+ **Design keywords:** systematic, minimal, structured, terminal-native, data-forward
+
+ **Reference brands:** Linear (systematic clarity), Vercel (developer-native), Resend (clean minimalism), Google Search Console (data authority)
+
+ **Direction:** The identity should communicate precision and data integrity, not decoration. Every visual element should feel like it belongs in a terminal or a well-designed README. Data tables and API responses are the hero content, not illustrations. If it wouldn't look right next to a `curl` output, reconsider it.