@diegovelasquezweb/a11y-engine 0.7.1 → 0.7.3

This diff shows the changes between two publicly released versions of the package, as published to its public registry. It is provided for informational purposes only.
package/README.md CHANGED
@@ -10,8 +10,9 @@ Accessibility automation engine for web applications. It orchestrates multi engi
  | **Multi engine scanning** | Runs axe-core, CDP accessibility tree checks, and pa11y HTML CodeSniffer against each page, then merges and deduplicates findings across all three engines |
  | **Stack detection** | Detects framework and CMS from runtime signals and from project source signals such as package.json and file structure |
  | **Fix intelligence** | Enriches each finding with WCAG mapping, fix code snippets, framework and CMS specific notes, UI library ownership hints, effort estimates, and persona impact |
+ | **AI enrichment** | Optional Claude-powered analysis that adds contextual fix suggestions based on detected stack, repo structure, and finding patterns |
  | **Report generation** | Produces HTML dashboard, PDF compliance report, manual testing checklist, and Markdown remediation guide |
- | **Source code scanning** | Static regex analysis of project source for accessibility patterns that runtime engines cannot detect |
+ | **Source code scanning** | Static regex analysis of project source for accessibility patterns that runtime engines cannot detect — works with local paths or remote GitHub repos |
 
  ## Installation
 
@@ -40,37 +41,28 @@ import {
    getScannerHelp,
    getPersonaReference,
    getUiHelp,
+   getConformanceLevels,
+   getWcagPrinciples,
+   getSeverityLevels,
    getKnowledge,
  } from "@diegovelasquezweb/a11y-engine";
  ```
 
  #### runAudit
 
- Runs the full scan pipeline: route discovery crawler, scan, merge, and analyze. Returns a payload ready for `getFindings`.
+ Runs the full scan pipeline: route discovery, scan, merge, analyze, and optional AI enrichment. Returns a payload ready for `getFindings`.
 
  ```ts
  const payload = await runAudit({
    baseUrl: "https://example.com",
    maxRoutes: 5,
    axeTags: ["wcag2a", "wcag2aa", "best-practice"],
-   onProgress: (step, status) => console.log(`${step}: ${status}`),
+   engines: { axe: true, cdp: true, pa11y: true },
+   onProgress: (step, status, extra) => console.log(`${step}: ${status}`, extra),
  });
  ```
 
- Progress steps emitted: `page`, `axe`, `cdp`, `pa11y`, `merge`, `intelligence`.
-
- **Common `runAudit` options**
-
- | Option | Description |
- | :--- | :--- |
- | `baseUrl` | Target URL to scan |
- | `maxRoutes` | Maximum routes to scan |
- | `axeTags` | WCAG tag filters |
- | `projectDir` | Project source path for source-aware detection and pattern scanning |
- | `skipPatterns` | Disable source pattern scanning |
- | `onProgress` | Progress callback for UI updates |
-
- See [API Reference](docs/api-reference.md) for the full `RunAuditOptions` contract.
+ See [API Reference](docs/api-reference.md) for options, progress steps, and return types.
 
  #### getFindings
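The new `engines` field in the `runAudit` example above is a plain optional-flags object. A minimal sketch of how such per-engine flags can be normalized against all-on defaults; `resolveEngines` is a hypothetical helper for illustration, not part of the package's API:

```javascript
// Hypothetical helper, not part of a11y-engine's public API.
// Normalizes a partial engines object against all-on defaults,
// mirroring the documented shape { axe?: boolean; cdp?: boolean; pa11y?: boolean }.
function resolveEngines(engines = {}) {
  const defaults = { axe: true, cdp: true, pa11y: true };
  return { ...defaults, ...engines };
}

// Opt out of a single engine; the others stay enabled.
const resolved = resolveEngines({ pa11y: false });
// → { axe: true, cdp: true, pa11y: false }
```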
 
@@ -119,15 +111,17 @@ These functions render final artifacts from scan payload data.
 
  ### Knowledge API
 
- These functions expose scanner help content, persona explanations, and UI copy
- so frontends or agents can render tooltips and guidance from engine-owned data.
+ These functions expose scanner help content, persona explanations, conformance levels, and UI copy so frontends or agents can render guidance from engine-owned data.
 
  | Function | Returns | Description |
  | :--- | :--- | :--- |
  | `getScannerHelp(options?)` | `{ locale, version, title, engines, options }` | Scanner option and engine help metadata |
  | `getPersonaReference(options?)` | `{ locale, version, personas }` | Persona labels, descriptions, and mapping hints |
- | `getUiHelp(options?)` | `{ locale, version, tooltips, glossary }` | Shared tooltip copy and glossary entries |
- | `getKnowledge(options?)` | `{ locale, version, scanner, personas, tooltips, glossary }` | Full documentation pack for UI or agent flows |
+ | `getUiHelp(options?)` | `{ locale, version, concepts, glossary }` | Shared concept definitions and glossary entries |
+ | `getConformanceLevels(options?)` | `{ locale, version, conformanceLevels }` | WCAG conformance level definitions with axe tag mappings |
+ | `getWcagPrinciples(options?)` | `{ locale, version, wcagPrinciples }` | The four WCAG principles with criterion prefix patterns |
+ | `getSeverityLevels(options?)` | `{ locale, version, severityLevels }` | Severity level definitions with labels and ordering |
+ | `getKnowledge(options?)` | Full knowledge pack | Combines all knowledge APIs into a single response for UI or agent flows |
 
  See [API Reference](docs/api-reference.md) for exact options and return types.
 
@@ -8,7 +8,7 @@
 
  ### `runAudit(options)`
 
- Runs route discovery, runtime scan, merge, and analyzer enrichment.
+ Runs route discovery, runtime scan, merge, analyzer enrichment, and optional AI enrichment. Supports local project paths or remote GitHub repos for stack detection and source pattern scanning.
 
  `options` (`RunAuditOptions`):
 
@@ -30,10 +30,28 @@ Runs route discovery, runtime scan, merge, and analyzer enrichment.
  | `ignoreFindings` | `string[]` |
  | `framework` | `string` |
  | `projectDir` | `string` |
+ | `repoUrl` | `string` |
+ | `githubToken` | `string` |
  | `skipPatterns` | `boolean` |
  | `screenshotsDir` | `string` |
+ | `engines` | `{ axe?: boolean; cdp?: boolean; pa11y?: boolean }` |
+ | `ai` | `{ enabled?: boolean; apiKey?: string; githubToken?: string; model?: string }` |
  | `onProgress` | `(step: string, status: string, extra?: Record<string, unknown>) => void` |
 
+ Progress steps emitted via `onProgress`:
+
+ | Step | When |
+ | :--- | :--- |
+ | `page` | Always — page load |
+ | `axe` | Always — axe-core scan |
+ | `cdp` | Always — CDP accessibility tree check |
+ | `pa11y` | Always — pa11y HTML CodeSniffer scan |
+ | `merge` | Always — finding deduplication |
+ | `intelligence` | Always — enrichment and WCAG mapping |
+ | `repo` | When `repoUrl` is set |
+ | `patterns` | When source scanning is active |
+ | `ai` | When AI enrichment is configured |
+
  Returns: `Promise<ScanPayload>`
 
  ### `getFindings(input, options?)`
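The `(step, status, extra?)` contract documented above lends itself to a simple aggregating handler. A self-contained sketch of a collector a UI might pass as `onProgress`; the loop driving it is a stand-in for `runAudit` itself, and the `"done"` status string is illustrative, not a documented value:

```javascript
// Sketch: collect the latest status per step, keyed by step name,
// using the documented (step, status, extra?) callback signature.
function makeProgressCollector() {
  const seen = new Map();
  const onProgress = (step, status, extra) => {
    seen.set(step, { status, ...(extra || {}) });
  };
  return { seen, onProgress };
}

const { seen, onProgress } = makeProgressCollector();

// Stand-in driver: emit the six always-present pipeline steps.
for (const step of ["page", "axe", "cdp", "pa11y", "merge", "intelligence"]) {
  onProgress(step, "done");
}
// seen now holds one entry per step.
```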
@@ -123,14 +141,35 @@ Returns: `PersonaReference` (`{ locale, version, personas }`)
  - `options`: `KnowledgeOptions`
    - `locale?: string`
 
- Returns: `UiHelp` (`{ locale, version, tooltips, glossary }`)
+ Returns: `UiHelp` (`{ locale, version, concepts, glossary }`)
+
+ ### `getConformanceLevels(options?)`
+
+ - `options`: `KnowledgeOptions`
+   - `locale?: string`
+
+ Returns: `ConformanceLevelsResult` (`{ locale, version, conformanceLevels }`)
+
+ ### `getWcagPrinciples(options?)`
+
+ - `options`: `KnowledgeOptions`
+   - `locale?: string`
+
+ Returns: `WcagPrinciplesResult` (`{ locale, version, wcagPrinciples }`)
+
+ ### `getSeverityLevels(options?)`
+
+ - `options`: `KnowledgeOptions`
+   - `locale?: string`
+
+ Returns: `SeverityLevelsResult` (`{ locale, version, severityLevels }`)
 
  ### `getKnowledge(options?)`
 
  - `options`: `KnowledgeOptions`
    - `locale?: string`
 
- Returns: `EngineKnowledge` (`{ locale, version, scanner, personas, tooltips, glossary }`)
+ Returns: `EngineKnowledge` (`{ locale, version, scanner, personas, concepts, glossary, docs, conformanceLevels, wcagPrinciples, severityLevels }`)
 
  ---
 
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
    "name": "@diegovelasquezweb/a11y-engine",
-   "version": "0.7.1",
+   "version": "0.7.3",
    "description": "WCAG 2.2 accessibility audit engine — scanner, analyzer, and report builders",
    "type": "module",
    "license": "MIT",
package/src/cli/audit.mjs CHANGED
@@ -140,6 +140,9 @@ async function main() {
   const timeoutMs = getArgValue("timeout-ms") || DEFAULTS.timeoutMs;
   const axeTags = getArgValue("axe-tags");
 
+  const repoUrl = getArgValue("repo-url");
+  const githubToken = getArgValue("github-token");
+
   const sessionFile = getInternalPath("a11y-session.json");
   let projectDir = getArgValue("project-dir");
   if (projectDir) {
@@ -268,8 +271,14 @@ async function main() {
   if (framework) analyzerArgs.push("--framework", framework);
   await runScript("../enrichment/analyzer.mjs", analyzerArgs);
 
-  if (projectDir && !skipPatterns) {
-    const patternArgs = ["--project-dir", path.resolve(projectDir)];
+  if ((projectDir || repoUrl) && !skipPatterns) {
+    const patternArgs = [];
+    if (projectDir) {
+      patternArgs.push("--project-dir", path.resolve(projectDir));
+    } else {
+      patternArgs.push("--repo-url", repoUrl);
+      if (githubToken) patternArgs.push("--github-token", githubToken);
+    }
     let resolvedFramework = framework;
     if (!resolvedFramework) {
       try {
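The argv-building branch in the hunk above can be exercised in isolation. A self-contained replica (the surrounding CLI plumbing is omitted, and the repo URL and token values are illustrative):

```javascript
// Replica of the audit.mjs branch: build the pattern-scanner argv
// from either a local project path or a remote GitHub repo URL.
import path from "node:path";

function buildPatternArgs({ projectDir, repoUrl, githubToken }) {
  const patternArgs = [];
  if (projectDir) {
    patternArgs.push("--project-dir", path.resolve(projectDir));
  } else {
    patternArgs.push("--repo-url", repoUrl);
    if (githubToken) patternArgs.push("--github-token", githubToken);
  }
  return patternArgs;
}

// Remote form: repo URL plus optional token.
const remote = buildPatternArgs({
  repoUrl: "https://github.com/org/repo",
  githubToken: "tok",
});
// → ["--repo-url", "https://github.com/org/repo", "--github-token", "tok"]

// Local form: projectDir wins and is resolved to an absolute path.
const local = buildPatternArgs({ projectDir: "." });
```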
@@ -104,8 +104,87 @@ export async function fetchRepoFile(repoUrl, filePath, token) {
   return null;
 }
 
+const SKIP_DIRS = new Set([
+  "node_modules", ".git", "dist", "build", ".next", ".nuxt",
+  "coverage", ".cache", "out", ".turbo", ".vercel", ".netlify",
+  "public", "static", "wp-includes", "wp-admin",
+]);
+
+// Common source root directories to walk when the Trees API is truncated.
+const FALLBACK_SOURCE_DIRS = [
+  "src", "app", "pages", "components", "lib", "utils", "hooks",
+];
+
+/**
+ * Filters a flat list of tree items to only matching files.
+ * @param {Array<{type: string, path: string}>} tree
+ * @param {Set<string>} extSet
+ * @returns {string[]}
+ */
+function filterTree(tree, extSet) {
+  return tree
+    .filter((item) => {
+      if (item.type !== "blob") return false;
+      const parts = item.path.split("/");
+      if (parts.some((p) => SKIP_DIRS.has(p) || p.startsWith("."))) return false;
+      const ext = item.path.slice(item.path.lastIndexOf(".")).toLowerCase();
+      return extSet.has(ext);
+    })
+    .map((item) => item.path);
+}
+
+/**
+ * Recursively lists files in a directory using the GitHub Contents API.
+ * Used as fallback when the Trees API returns a truncated response.
+ *
+ * @param {string} owner
+ * @param {string} repo
+ * @param {string} branch
+ * @param {string} dir
+ * @param {Set<string>} extSet
+ * @param {Record<string, string>} headers
+ * @param {number} depth
+ * @returns {Promise<string[]>}
+ */
+async function walkContentsApi(owner, repo, branch, dir, extSet, headers, depth = 0) {
+  if (depth > 6) return [];
+
+  try {
+    const url = `${GITHUB_API}/repos/${owner}/${repo}/contents/${dir}?ref=${branch}`;
+    const res = await fetch(url, { headers });
+    if (!res.ok) return [];
+
+    const items = await res.json();
+    if (!Array.isArray(items)) return [];
+
+    const files = [];
+    const subdirPromises = [];
+
+    for (const item of items) {
+      if (item.type === "file") {
+        const ext = item.path.slice(item.path.lastIndexOf(".")).toLowerCase();
+        if (extSet.has(ext)) files.push(item.path);
+      } else if (item.type === "dir") {
+        const name = item.name;
+        if (!SKIP_DIRS.has(name) && !name.startsWith(".")) {
+          subdirPromises.push(
+            walkContentsApi(owner, repo, branch, item.path, extSet, headers, depth + 1)
+          );
+        }
+      }
+    }
+
+    const nested = await Promise.all(subdirPromises);
+    return [...files, ...nested.flat()];
+  } catch {
+    return [];
+  }
+}
+
 /**
  * Lists files in a repository directory using the GitHub Trees API.
+ * Falls back to the Contents API for each common source directory if
+ * the Trees API response is truncated (large monorepos).
  * Returns file paths matching the given extensions.
  *
  * @param {string} repoUrl
@@ -118,32 +197,29 @@ export async function listRepoFiles(repoUrl, extensions, token) {
   if (!parsed) return [];
 
   const { owner, repo, branch } = parsed;
+  const extSet = new Set(extensions.map((e) => e.toLowerCase()));
+  const headers = authHeaders(token);
 
   try {
     const url = `${GITHUB_API}/repos/${owner}/${repo}/git/trees/${branch}?recursive=1`;
-    const res = await fetch(url, { headers: authHeaders(token) });
+    const res = await fetch(url, { headers });
     if (!res.ok) return [];
 
-    const { tree } = await res.json();
-    if (!Array.isArray(tree)) return [];
-
-    const extSet = new Set(extensions.map((e) => e.toLowerCase()));
-    const skipDirs = new Set([
-      "node_modules", ".git", "dist", "build", ".next", ".nuxt",
-      "coverage", ".cache", "out", ".turbo", ".vercel", ".netlify",
-      "public", "static", "wp-includes", "wp-admin",
-    ]);
-
-    return tree
-      .filter((item) => {
-        if (item.type !== "blob") return false;
-        const path = item.path;
-        const parts = path.split("/");
-        if (parts.some((p) => skipDirs.has(p) || p.startsWith("."))) return false;
-        const ext = path.slice(path.lastIndexOf(".")).toLowerCase();
-        return extSet.has(ext);
-      })
-      .map((item) => item.path);
+    const data = await res.json();
+    if (!Array.isArray(data.tree)) return [];
+
+    if (!data.truncated) {
+      return filterTree(data.tree, extSet);
+    }
+
+    // Trees API truncated — fall back to Contents API for common source dirs
+    const results = await Promise.all(
+      FALLBACK_SOURCE_DIRS.map((dir) =>
+        walkContentsApi(owner, repo, branch, dir, extSet, headers)
+      )
+    );
+
+    return [...new Set(results.flat())];
   } catch {
     return [];
   }
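To see the extracted filtering in action, here is a self-contained demo of the `filterTree` logic from the hunks above, with a trimmed `SKIP_DIRS` set and hand-written sample Trees API items (the real function receives the full recursive tree from GitHub):

```javascript
// Demo of the filterTree logic: keep only blob entries whose path
// avoids skip dirs and dot-directories and whose extension matches.
const SKIP_DIRS = new Set(["node_modules", ".git", "dist", "build"]);

function filterTree(tree, extSet) {
  return tree
    .filter((item) => {
      if (item.type !== "blob") return false;
      const parts = item.path.split("/");
      if (parts.some((p) => SKIP_DIRS.has(p) || p.startsWith("."))) return false;
      const ext = item.path.slice(item.path.lastIndexOf(".")).toLowerCase();
      return extSet.has(ext);
    })
    .map((item) => item.path);
}

const files = filterTree(
  [
    { type: "blob", path: "src/App.jsx" },               // kept
    { type: "blob", path: "node_modules/react/index.js" }, // skip dir
    { type: "blob", path: ".github/workflows/ci.yml" },  // dot directory
    { type: "tree", path: "src" },                       // not a blob
    { type: "blob", path: "README.md" },                 // extension not allowed
  ],
  new Set([".jsx", ".js"]),
);
// → ["src/App.jsx"]
```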
@@ -175,9 +175,9 @@ function resolveCmsNotesKey(framework) {
 function filterNotes(notes, framework) {
   if (!notes || typeof notes !== "object") return null;
   const intelKey = resolveFrameworkNotesKey(framework);
-  if (intelKey && notes[intelKey]) return { [intelKey]: notes[intelKey] };
+  if (intelKey && notes[intelKey]) return notes[intelKey];
   const cmsKey = resolveCmsNotesKey(framework);
-  if (cmsKey && notes[cmsKey]) return { [cmsKey]: notes[cmsKey] };
+  if (cmsKey && notes[cmsKey]) return notes[cmsKey];
   return null;
 }
 
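The behavior change above is easiest to see with stub resolvers. In this sketch, `resolveFrameworkNotesKey` and `resolveCmsNotesKey` are simplified stand-ins for the analyzer's real lookups, and the note text is made up; 0.7.1 returned the note re-wrapped in an object, while 0.7.3 returns the note value directly:

```javascript
// Stub resolvers for illustration only; the real ones live in the analyzer.
const resolveFrameworkNotesKey = (framework) => (framework === "next" ? "react" : null);
const resolveCmsNotesKey = (framework) => (framework === "wordpress" ? "wordpress" : null);

// The 0.7.3 version: returns the matching note value itself.
function filterNotes(notes, framework) {
  if (!notes || typeof notes !== "object") return null;
  const intelKey = resolveFrameworkNotesKey(framework);
  if (intelKey && notes[intelKey]) return notes[intelKey];
  const cmsKey = resolveCmsNotesKey(framework);
  if (cmsKey && notes[cmsKey]) return notes[cmsKey];
  return null;
}

const note = filterNotes({ react: "Use next/image alt text" }, "next");
// → "Use next/image alt text" (0.7.1 returned { react: "Use next/image alt text" })
const missing = filterNotes(null, "next");
// → null
```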
@@ -46,6 +46,8 @@ function parseArgs(argv) {
 
   const args = {
     projectDir: null,
+    repoUrl: null,
+    githubToken: null,
     framework: null,
     output: getInternalPath("a11y-pattern-findings.json"),
     onlyPattern: null,
@@ -56,13 +58,15 @@ function parseArgs(argv) {
     const value = argv[i + 1];
     if (!key.startsWith("--") || value === undefined) continue;
     if (key === "--project-dir") args.projectDir = value;
+    if (key === "--repo-url") args.repoUrl = value;
+    if (key === "--github-token") args.githubToken = value;
     if (key === "--framework") args.framework = value;
     if (key === "--output") args.output = value;
     if (key === "--only-pattern") args.onlyPattern = value;
     i++;
   }
 
-  if (!args.projectDir) throw new Error("Missing required --project-dir");
+  if (!args.projectDir && !args.repoUrl) throw new Error("Missing required --project-dir or --repo-url");
   return args;
 }
 
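A condensed, runnable replica of the updated `parseArgs` showing the either-or requirement (only the three new-path flags are kept; the repo URL is illustrative):

```javascript
// Condensed replica: --project-dir and --repo-url are now alternatives,
// and the error message names both flags.
function parseArgs(argv) {
  const args = { projectDir: null, repoUrl: null, githubToken: null };
  for (let i = 0; i < argv.length; i++) {
    const key = argv[i];
    const value = argv[i + 1];
    if (!key.startsWith("--") || value === undefined) continue;
    if (key === "--project-dir") args.projectDir = value;
    if (key === "--repo-url") args.repoUrl = value;
    if (key === "--github-token") args.githubToken = value;
    i++; // consume the value
  }
  if (!args.projectDir && !args.repoUrl) {
    throw new Error("Missing required --project-dir or --repo-url");
  }
  return args;
}

const args = parseArgs(["--repo-url", "https://github.com/org/repo"]);
// → { projectDir: null, repoUrl: "https://github.com/org/repo", githubToken: null }

let threwWithoutFlags = false;
try {
  parseArgs([]);
} catch {
  threwWithoutFlags = true; // neither flag given
}
```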
@@ -320,7 +324,7 @@ export async function scanPatternRemote(pattern, repoUrl, githubToken, framework
 /**
  * Main execution function for the pattern scanner.
  */
-function main() {
+async function main() {
   const args = parseArgs(process.argv.slice(2));
   const { patterns } = loadAssetJson(
     ASSET_PATHS.remediation.codePatterns,
@@ -339,23 +343,44 @@ function main() {
     process.exit(0);
   }
 
-  const scanDirs = resolveScanDirs(args.framework, args.projectDir);
-  log.info(`Scanning source code at: ${args.projectDir}`);
-  if (scanDirs.length > 1 || scanDirs[0] !== args.projectDir) {
-    log.info(`  Scoped to: ${scanDirs.map((d) => relative(args.projectDir, d)).join(", ")}`);
-  }
-  log.info(`Running ${activePatterns.length} pattern(s)...`);
-
   const allFindings = [];
-  for (const pattern of activePatterns) {
-    const findings = [];
-    for (const scanDir of scanDirs) {
-      findings.push(...scanPattern(pattern, scanDir, args.projectDir));
-    }
-    if (findings.length > 0) {
-      log.info(`  ${pattern.id}: ${findings.length} match(es)`);
-    }
-    allFindings.push(...findings);
-  }
+
+  if (args.repoUrl) {
+    // Remote scan via GitHub API — no clone needed
+    log.info(`Scanning source code at: ${args.repoUrl}`);
+    log.info(`Running ${activePatterns.length} pattern(s) via GitHub API...`);
+
+    for (const pattern of activePatterns) {
+      const findings = await scanPatternRemote(
+        pattern,
+        args.repoUrl,
+        args.githubToken || null,
+        args.framework || null,
+      );
+      if (findings.length > 0) {
+        log.info(`  ${pattern.id}: ${findings.length} match(es)`);
+      }
+      allFindings.push(...findings);
+    }
+  } else {
+    // Local filesystem scan
+    const scanDirs = resolveScanDirs(args.framework, args.projectDir);
+    log.info(`Scanning source code at: ${args.projectDir}`);
+    if (scanDirs.length > 1 || scanDirs[0] !== args.projectDir) {
+      log.info(`  Scoped to: ${scanDirs.map((d) => relative(args.projectDir, d)).join(", ")}`);
+    }
+    log.info(`Running ${activePatterns.length} pattern(s)...`);
+
+    for (const pattern of activePatterns) {
+      const findings = [];
+      for (const scanDir of scanDirs) {
+        findings.push(...scanPattern(pattern, scanDir, args.projectDir));
+      }
+      if (findings.length > 0) {
+        log.info(`  ${pattern.id}: ${findings.length} match(es)`);
+      }
+      allFindings.push(...findings);
    }
+  }
 
   const confirmed = allFindings.filter((f) => f.status === "confirmed").length;
@@ -363,7 +388,7 @@ function main() {
 
   writeJson(args.output, {
     generated_at: new Date().toISOString(),
-    project_dir: args.projectDir,
+    project_dir: args.repoUrl || args.projectDir,
     findings: allFindings,
     summary: {
       total: allFindings.length,
@@ -378,10 +403,8 @@ function main() {
 }
 
 if (process.argv[1] === fileURLToPath(import.meta.url)) {
-  try {
-    main();
-  } catch (error) {
+  main().catch((error) => {
     log.error(error.message);
     process.exit(1);
-  }
+  });
 }
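The entry-point change above swaps a `try/catch` around a synchronous `main()` for promise-based handling of the now-async `main()`. A minimal sketch of the rejection path, with a deliberately failing stand-in `main` (assumes an ES module context for top-level `await`):

```javascript
// Stand-in for the pattern scanner's async main(): always rejects here
// so the .catch() path is exercised.
async function main() {
  throw new Error("scan failed");
}

let exitMessage = null;
await main().catch((error) => {
  // In the real entry point this is log.error(error.message) + process.exit(1).
  exitMessage = error.message;
});
// exitMessage === "scan failed"
```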