@diegovelasquezweb/a11y-engine 0.7.0 → 0.7.2

package/README.md CHANGED
@@ -10,8 +10,9 @@ Accessibility automation engine for web applications. It orchestrates multi engi
  | **Multi engine scanning** | Runs axe-core, CDP accessibility tree checks, and pa11y HTML CodeSniffer against each page, then merges and deduplicates findings across all three engines |
  | **Stack detection** | Detects framework and CMS from runtime signals and from project source signals such as package.json and file structure |
  | **Fix intelligence** | Enriches each finding with WCAG mapping, fix code snippets, framework and CMS specific notes, UI library ownership hints, effort estimates, and persona impact |
+ | **AI enrichment** | Optional Claude-powered analysis that adds contextual fix suggestions based on detected stack, repo structure, and finding patterns |
  | **Report generation** | Produces HTML dashboard, PDF compliance report, manual testing checklist, and Markdown remediation guide |
- | **Source code scanning** | Static regex analysis of project source for accessibility patterns that runtime engines cannot detect |
+ | **Source code scanning** | Static regex analysis of project source for accessibility patterns that runtime engines cannot detect — works with local paths or remote GitHub repos |
 
  ## Installation
 
@@ -40,35 +41,48 @@ import {
    getScannerHelp,
    getPersonaReference,
    getUiHelp,
+   getConformanceLevels,
+   getWcagPrinciples,
+   getSeverityLevels,
    getKnowledge,
  } from "@diegovelasquezweb/a11y-engine";
  ```
 
  #### runAudit
 
- Runs the full scan pipeline: route discovery crawler, scan, merge, and analyze. Returns a payload ready for `getFindings`.
+ Runs the full scan pipeline: route discovery, scan, merge, analyze, and optional AI enrichment. Returns a payload ready for `getFindings`.
 
  ```ts
  const payload = await runAudit({
    baseUrl: "https://example.com",
    maxRoutes: 5,
    axeTags: ["wcag2a", "wcag2aa", "best-practice"],
-   onProgress: (step, status) => console.log(`${step}: ${status}`),
+   engines: { axe: true, cdp: true, pa11y: true },
+   onProgress: (step, status, extra) => console.log(`${step}: ${status}`, extra),
  });
  ```
 
- Progress steps emitted: `page`, `axe`, `cdp`, `pa11y`, `merge`, `intelligence`.
+ Progress steps emitted: `repo`, `page`, `axe`, `cdp`, `pa11y`, `merge`, `intelligence`, `patterns`, `ai`. Steps are conditional — `repo` only fires when `repoUrl` is set, `patterns` when source scanning is active, and `ai` when AI enrichment is configured.
 
- **Common `runAudit` options**
+ **`runAudit` options**
 
- | Option | Description |
- | :--- | :--- |
- | `baseUrl` | Target URL to scan |
- | `maxRoutes` | Maximum routes to scan |
- | `axeTags` | WCAG tag filters |
- | `projectDir` | Project source path for source-aware detection and pattern scanning |
- | `skipPatterns` | Disable source pattern scanning |
- | `onProgress` | Progress callback for UI updates |
+ | Option | Type | Description |
+ | :--- | :--- | :--- |
+ | `baseUrl` | `string` | Target URL to scan |
+ | `maxRoutes` | `number` | Maximum routes to discover and scan |
+ | `crawlDepth` | `number` | How many link levels to follow from the starting URL |
+ | `axeTags` | `string[]` | WCAG tag filters (e.g. `["wcag2a", "wcag2aa"]`) |
+ | `engines` | `{ axe?, cdp?, pa11y? }` | Enable or disable individual scan engines |
+ | `waitUntil` | `string` | Page load strategy: `"domcontentloaded"`, `"load"`, or `"networkidle"` |
+ | `timeoutMs` | `number` | Per-page timeout in milliseconds |
+ | `viewport` | `{ width, height }` | Browser viewport size |
+ | `colorScheme` | `string` | Emulated color scheme: `"light"` or `"dark"` |
+ | `projectDir` | `string` | Local project path for stack detection and source pattern scanning |
+ | `repoUrl` | `string` | GitHub repo URL for remote stack detection and source pattern scanning |
+ | `githubToken` | `string` | GitHub token for API access when using `repoUrl` |
+ | `skipPatterns` | `boolean` | Disable source pattern scanning |
+ | `ai` | `{ enabled?, apiKey?, githubToken?, model? }` | AI enrichment configuration |
+ | `onProgress` | `(step, status, extra?) => void` | Progress callback for UI updates |
 
  See [API Reference](docs/api-reference.md) for the full `RunAuditOptions` contract.
 
@@ -119,15 +133,17 @@ These functions render final artifacts from scan payload data.
 
  ### Knowledge API
 
- These functions expose scanner help content, persona explanations, and UI copy
- so frontends or agents can render tooltips and guidance from engine-owned data.
+ These functions expose scanner help content, persona explanations, conformance levels, and UI copy so frontends or agents can render tooltips and guidance from engine-owned data.
 
  | Function | Returns | Description |
  | :--- | :--- | :--- |
  | `getScannerHelp(options?)` | `{ locale, version, title, engines, options }` | Scanner option and engine help metadata |
  | `getPersonaReference(options?)` | `{ locale, version, personas }` | Persona labels, descriptions, and mapping hints |
  | `getUiHelp(options?)` | `{ locale, version, tooltips, glossary }` | Shared tooltip copy and glossary entries |
- | `getKnowledge(options?)` | `{ locale, version, scanner, personas, tooltips, glossary }` | Full documentation pack for UI or agent flows |
+ | `getConformanceLevels(options?)` | `{ locale, version, conformanceLevels }` | WCAG conformance level definitions with axe tag mappings |
+ | `getWcagPrinciples(options?)` | `{ locale, version, wcagPrinciples }` | The four WCAG principles with criterion prefix patterns |
+ | `getSeverityLevels(options?)` | `{ locale, version, severityLevels }` | Severity level definitions with labels and ordering |
+ | `getKnowledge(options?)` | Full knowledge pack | Combines all knowledge APIs into a single response for UI or agent flows |
 
  See [API Reference](docs/api-reference.md) for exact options and return types.
 
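The conditional progress stream documented above is easiest to consume through a small status tracker passed as `onProgress`. A minimal illustrative sketch: the `(step, status, extra?)` callback shape comes from the README, while `createProgressTracker` is a hypothetical helper, not part of the package.

```javascript
// Hypothetical tracker for runAudit's onProgress callback.
// Conditional steps ("repo", "patterns", "ai") simply never
// appear in the map when they are not configured.
function createProgressTracker() {
  const steps = new Map();
  return {
    onProgress(step, status, extra) {
      // Later emissions for the same step overwrite earlier ones.
      steps.set(step, { status, ...(extra ? { extra } : {}) });
    },
    summary() {
      return [...steps.entries()].map(([step, { status }]) => `${step}:${status}`);
    },
  };
}

// Usage (runAudit call sketched from the README above):
// const tracker = createProgressTracker();
// await runAudit({ baseUrl: "https://example.com", onProgress: tracker.onProgress });
```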
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
    "name": "@diegovelasquezweb/a11y-engine",
-   "version": "0.7.0",
+   "version": "0.7.2",
    "description": "WCAG 2.2 accessibility audit engine — scanner, analyzer, and report builders",
    "type": "module",
    "license": "MIT",
@@ -104,8 +104,87 @@ export async function fetchRepoFile(repoUrl, filePath, token) {
    return null;
  }
 
+ const SKIP_DIRS = new Set([
+   "node_modules", ".git", "dist", "build", ".next", ".nuxt",
+   "coverage", ".cache", "out", ".turbo", ".vercel", ".netlify",
+   "public", "static", "wp-includes", "wp-admin",
+ ]);
+
+ // Common source root directories to walk when the Trees API is truncated.
+ const FALLBACK_SOURCE_DIRS = [
+   "src", "app", "pages", "components", "lib", "utils", "hooks",
+ ];
+
+ /**
+  * Filters a flat list of tree items to only matching files.
+  * @param {Array<{type: string, path: string}>} tree
+  * @param {Set<string>} extSet
+  * @returns {string[]}
+  */
+ function filterTree(tree, extSet) {
+   return tree
+     .filter((item) => {
+       if (item.type !== "blob") return false;
+       const parts = item.path.split("/");
+       if (parts.some((p) => SKIP_DIRS.has(p) || p.startsWith("."))) return false;
+       const ext = item.path.slice(item.path.lastIndexOf(".")).toLowerCase();
+       return extSet.has(ext);
+     })
+     .map((item) => item.path);
+ }
+
+ /**
+  * Recursively lists files in a directory using the GitHub Contents API.
+  * Used as fallback when the Trees API returns a truncated response.
+  *
+  * @param {string} owner
+  * @param {string} repo
+  * @param {string} branch
+  * @param {string} dir
+  * @param {Set<string>} extSet
+  * @param {Record<string, string>} headers
+  * @param {number} depth
+  * @returns {Promise<string[]>}
+  */
+ async function walkContentsApi(owner, repo, branch, dir, extSet, headers, depth = 0) {
+   if (depth > 6) return [];
+
+   try {
+     const url = `${GITHUB_API}/repos/${owner}/${repo}/contents/${dir}?ref=${branch}`;
+     const res = await fetch(url, { headers });
+     if (!res.ok) return [];
+
+     const items = await res.json();
+     if (!Array.isArray(items)) return [];
+
+     const files = [];
+     const subdirPromises = [];
+
+     for (const item of items) {
+       if (item.type === "file") {
+         const ext = item.path.slice(item.path.lastIndexOf(".")).toLowerCase();
+         if (extSet.has(ext)) files.push(item.path);
+       } else if (item.type === "dir") {
+         const name = item.name;
+         if (!SKIP_DIRS.has(name) && !name.startsWith(".")) {
+           subdirPromises.push(
+             walkContentsApi(owner, repo, branch, item.path, extSet, headers, depth + 1)
+           );
+         }
+       }
+     }
+
+     const nested = await Promise.all(subdirPromises);
+     return [...files, ...nested.flat()];
+   } catch {
+     return [];
+   }
+ }
+
  /**
   * Lists files in a repository directory using the GitHub Trees API.
+  * Falls back to the Contents API for each common source directory if
+  * the Trees API response is truncated (large monorepos).
   * Returns file paths matching the given extensions.
   *
   * @param {string} repoUrl
@@ -118,32 +197,29 @@ export async function listRepoFiles(repoUrl, extensions, token) {
    if (!parsed) return [];
 
    const { owner, repo, branch } = parsed;
+   const extSet = new Set(extensions.map((e) => e.toLowerCase()));
+   const headers = authHeaders(token);
 
    try {
      const url = `${GITHUB_API}/repos/${owner}/${repo}/git/trees/${branch}?recursive=1`;
-     const res = await fetch(url, { headers: authHeaders(token) });
+     const res = await fetch(url, { headers });
      if (!res.ok) return [];
 
-     const { tree } = await res.json();
-     if (!Array.isArray(tree)) return [];
-
-     const extSet = new Set(extensions.map((e) => e.toLowerCase()));
-     const skipDirs = new Set([
-       "node_modules", ".git", "dist", "build", ".next", ".nuxt",
-       "coverage", ".cache", "out", ".turbo", ".vercel", ".netlify",
-       "public", "static", "wp-includes", "wp-admin",
-     ]);
-
-     return tree
-       .filter((item) => {
-         if (item.type !== "blob") return false;
-         const path = item.path;
-         const parts = path.split("/");
-         if (parts.some((p) => skipDirs.has(p) || p.startsWith("."))) return false;
-         const ext = path.slice(path.lastIndexOf(".")).toLowerCase();
-         return extSet.has(ext);
-       })
-       .map((item) => item.path);
+     const data = await res.json();
+     if (!Array.isArray(data.tree)) return [];
+
+     if (!data.truncated) {
+       return filterTree(data.tree, extSet);
+     }
+
+     // Trees API truncated — fall back to Contents API for common source dirs
+     const results = await Promise.all(
+       FALLBACK_SOURCE_DIRS.map((dir) =>
+         walkContentsApi(owner, repo, branch, dir, extSet, headers)
+       )
+     );
+
+     return [...new Set(results.flat())];
    } catch {
      return [];
    }
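The truncation fallback above hinges on `filterTree`'s path filtering. The following standalone sketch mirrors that logic (with a shortened `SKIP_DIRS` for brevity) so it can be exercised against a mock Trees API payload:

```javascript
// Shortened skip list for illustration; the engine's full list also
// covers framework build output and CMS directories.
const SKIP_DIRS = new Set(["node_modules", ".git", "dist", "build"]);

// Mirrors the diff's filterTree: keep only blobs whose path avoids
// skipped or hidden directories and whose extension is in extSet.
function filterTree(tree, extSet) {
  return tree
    .filter((item) => {
      if (item.type !== "blob") return false;
      const parts = item.path.split("/");
      if (parts.some((p) => SKIP_DIRS.has(p) || p.startsWith("."))) return false;
      const ext = item.path.slice(item.path.lastIndexOf(".")).toLowerCase();
      return extSet.has(ext);
    })
    .map((item) => item.path);
}

// Mock Trees API payload: only src/App.jsx survives the filter.
const mockTree = [
  { type: "blob", path: "src/App.jsx" },
  { type: "blob", path: "node_modules/x/index.js" },
  { type: "blob", path: ".github/workflows/ci.yml" },
  { type: "tree", path: "src" },
];
```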
@@ -175,9 +175,9 @@ function resolveCmsNotesKey(framework) {
175
175
  function filterNotes(notes, framework) {
176
176
  if (!notes || typeof notes !== "object") return null;
177
177
  const intelKey = resolveFrameworkNotesKey(framework);
178
- if (intelKey && notes[intelKey]) return { [intelKey]: notes[intelKey] };
178
+ if (intelKey && notes[intelKey]) return notes[intelKey];
179
179
  const cmsKey = resolveCmsNotesKey(framework);
180
- if (cmsKey && notes[cmsKey]) return { [cmsKey]: notes[cmsKey] };
180
+ if (cmsKey && notes[cmsKey]) return notes[cmsKey];
181
181
  return null;
182
182
  }
183
183
 
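The `filterNotes` change above alters the return shape: 0.7.0 wrapped the matched notes in a single-key object, while 0.7.2 returns the value directly. An illustrative before/after sketch, with `resolveKey` standing in for the engine's key resolvers (the mapping below is hypothetical):

```javascript
// Hypothetical stand-in for resolveFrameworkNotesKey / resolveCmsNotesKey.
const resolveKey = (framework) => (framework === "next" ? "nextjs" : null);

function filterNotesOld(notes, framework) {
  const key = resolveKey(framework);
  if (key && notes[key]) return { [key]: notes[key] }; // 0.7.0: wrapped in { key: value }
  return null;
}

function filterNotesNew(notes, framework) {
  const key = resolveKey(framework);
  if (key && notes[key]) return notes[key]; // 0.7.2: value returned directly
  return null;
}
```

Consumers that previously unwrapped the single-key object now receive the note content itself.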
@@ -1184,6 +1184,10 @@ async function _runDomScannerInternal(args) {
 
    writeProgress("page", "running");
 
+   // Track which steps have already emitted "done" so multi-route scans
+   // don't reset progress back to "running" for routes after the first.
+   const emittedDone = new Set();
+
    for (let i = 0; i < routes.length; i += tabPages.length) {
      const batch = [];
      for (let j = 0; j < tabPages.length && i + j < routes.length; j++) {
@@ -1195,7 +1199,10 @@ async function _runDomScannerInternal(args) {
        log.info(`[${idx + 1}/${total}] Scanning: ${routePath}`);
        const targetUrl = new URL(routePath, baseUrl).toString();
 
-       writeProgress("page", "done");
+       if (!emittedDone.has("page")) {
+         writeProgress("page", "done");
+         emittedDone.add("page");
+       }
 
        let result = { url: targetUrl, violations: [], incomplete: [], passes: [], metadata: {} };
        let cdpViolations = [];
@@ -1203,7 +1210,7 @@ async function _runDomScannerInternal(args) {
 
        // Step 1: axe-core (conditional)
        if (args.engines.axe) {
-         writeProgress("axe", "running");
+         if (!emittedDone.has("axe")) writeProgress("axe", "running");
          result = await analyzeRoute(
            tabPage,
            targetUrl,
@@ -1216,7 +1223,10 @@ async function _runDomScannerInternal(args) {
            args.axeTags,
          );
          const axeViolationCount = result.violations?.length || 0;
-         writeProgress("axe", "done", { found: axeViolationCount });
+         if (!emittedDone.has("axe")) {
+           writeProgress("axe", "done", { found: axeViolationCount });
+           emittedDone.add("axe");
+         }
          log.info(`axe-core: ${axeViolationCount} violation(s) found`);
        } else {
          // Navigate for CDP/pa11y even if axe is off
@@ -1228,9 +1238,12 @@ async function _runDomScannerInternal(args) {
 
        // Step 2: CDP checks (conditional)
        if (args.engines.cdp) {
-         writeProgress("cdp", "running");
+         if (!emittedDone.has("cdp")) writeProgress("cdp", "running");
          cdpViolations = await runCdpChecks(tabPage);
-         writeProgress("cdp", "done", { found: cdpViolations.length });
+         if (!emittedDone.has("cdp")) {
+           writeProgress("cdp", "done", { found: cdpViolations.length });
+           emittedDone.add("cdp");
+         }
          log.info(`CDP checks: ${cdpViolations.length} issue(s) found`);
        } else {
          log.info("CDP checks: skipped (disabled)");
@@ -1238,9 +1251,12 @@ async function _runDomScannerInternal(args) {
 
        // Step 3: pa11y (conditional)
        if (args.engines.pa11y) {
-         writeProgress("pa11y", "running");
+         if (!emittedDone.has("pa11y")) writeProgress("pa11y", "running");
          pa11yViolations = await runPa11yChecks(targetUrl, args.axeTags);
-         writeProgress("pa11y", "done", { found: pa11yViolations.length });
+         if (!emittedDone.has("pa11y")) {
+           writeProgress("pa11y", "done", { found: pa11yViolations.length });
+           emittedDone.add("pa11y");
+         }
          log.info(`pa11y: ${pa11yViolations.length} issue(s) found`);
        } else {
          log.info("pa11y: skipped (disabled)");
@@ -1248,18 +1264,21 @@ async function _runDomScannerInternal(args) {
 
        // Step 4: Merge results
        const axeViolationCount = result.violations?.length || 0;
-       writeProgress("merge", "running");
+       if (!emittedDone.has("merge")) writeProgress("merge", "running");
        const mergedViolations = mergeViolations(
          result.violations || [],
          cdpViolations,
          pa11yViolations,
        );
-       writeProgress("merge", "done", {
-         axe: axeViolationCount,
-         cdp: cdpViolations.length,
-         pa11y: pa11yViolations.length,
-         merged: mergedViolations.length,
-       });
+       if (!emittedDone.has("merge")) {
+         writeProgress("merge", "done", {
+           axe: axeViolationCount,
+           cdp: cdpViolations.length,
+           pa11y: pa11yViolations.length,
+           merged: mergedViolations.length,
+         });
+         emittedDone.add("merge");
+       }
        log.info(`Merged: ${mergedViolations.length} total unique violations (axe: ${axeViolationCount}, cdp: ${cdpViolations.length}, pa11y: ${pa11yViolations.length})`);
 
        // Screenshots for merged violations
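The `emittedDone` guard threaded through the hunks above generalizes to a small emit-once wrapper around a progress sink. This sketch is illustrative, not the engine's code:

```javascript
// Wraps a progress sink so each step stops emitting after its first
// "done", mirroring the emittedDone guard: later routes in a
// multi-route scan no longer flip finished steps back to "running".
function emitOnce(writeProgress) {
  const emittedDone = new Set();
  return (step, status, extra) => {
    if (emittedDone.has(step)) return;
    writeProgress(step, status, extra);
    if (status === "done") emittedDone.add(step);
  };
}

// Usage: route 2's emissions for an already-finished step are dropped.
// const progress = emitOnce((step, status) => ui.update(step, status));
```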