Automated Accessibility Testing
Automated accessibility testing is the fastest way to identify common WCAG violations in your web projects. These tools scan your HTML, CSS, and ARIA attributes to flag issues that can be programmatically detected — things like missing alt text, insufficient color contrast, and duplicate IDs. While automated testing is essential, it is only the starting point. Understanding what these tools can and cannot catch is critical to building a complete accessibility testing strategy.
The Major Automated Testing Tools
axe-core
axe-core is the industry-leading open-source accessibility testing engine, developed and maintained by Deque Systems. It is the engine that powers CodeFrog's accessibility testing as well as the Lighthouse accessibility audit in Chrome DevTools.
- Rule set: Over 90 rules covering WCAG 2.0, 2.1, and 2.2 success criteria at Levels A and AA
- Zero false positives policy: axe-core is designed to report an issue only when it can verify it programmatically. If it reports an issue, the issue is almost certainly real.
- Integration: Available as a JavaScript library, browser extension (axe DevTools), Playwright/Puppeteer integrations (see the example after this list), and a CI/CD-compatible CLI
- How CodeFrog uses it: CodeFrog runs axe-core directly against your localhost environment, scanning every page it crawls and reporting violations with severity levels and remediation guidance
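For teams already writing end-to-end tests, the Playwright integration mentioned above can turn the audit into a failing test. The sketch below is a minimal example assuming the @axe-core/playwright package and a standard Playwright test setup; the localhost URL is a placeholder.

import { test, expect } from '@playwright/test';
import AxeBuilder from '@axe-core/playwright';

test('home page has no detectable WCAG A/AA violations', async ({ page }) => {
  await page.goto('http://localhost:3000/'); // placeholder URL

  // Run the axe-core audit in the page context, limited to WCAG A/AA rules
  const results = await new AxeBuilder({ page })
    .withTags(['wcag2a', 'wcag2aa', 'wcag21aa'])
    .analyze();

  // Fail the test if axe-core reports any violations
  expect(results.violations).toEqual([]);
});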
Pa11y
Pa11y is a free, open-source accessibility testing tool that runs in Node.js. It wraps the HTML CodeSniffer engine (with axe-core support available) and provides both a CLI and a CI integration layer.
- Pa11y CLI: Test individual URLs from the command line
- Pa11y CI: Run accessibility tests against multiple URLs as part of your CI/CD pipeline, with configurable thresholds
- Pa11y Dashboard: A web dashboard for tracking accessibility issues over time
- Strengths: Excellent for CI/CD integration, supports WCAG 2.1 AA by default, configurable to ignore specific rules or elements
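Pa11y also exposes a Node.js API, which the CLI and CI layers are built on. Below is a minimal sketch assuming pa11y is installed with type declarations available and the project compiles TypeScript with esModuleInterop; the URL is a placeholder.

import pa11y from 'pa11y';

async function audit(url: string): Promise<void> {
  // Audit a single page, using the axe-core runner instead of the default HTML CodeSniffer
  const results = await pa11y(url, {
    standard: 'WCAG2AA',
    runners: ['axe'],
    timeout: 30000,
  });

  // Each issue carries a rule code, a human-readable message, and a CSS selector for the element
  for (const issue of results.issues) {
    console.log(`${issue.type} ${issue.code}: ${issue.message} (${issue.selector})`);
  }
}

audit('http://localhost:3000/');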
WAVE
WAVE (Web Accessibility Evaluation Tool) is developed by WebAIM and provides a visual overlay showing accessibility issues directly on your page.
- Browser extensions: Available for Chrome and Firefox
- Visual approach: Icons are injected into the page next to the elements with issues, making it easy to see exactly where problems are
- Structural view: Shows heading hierarchy, landmark regions, and other structural elements
- Best for: Quick visual audits during development, non-technical stakeholders who need to see issues in context
Lighthouse Accessibility Audits
Google Lighthouse includes an accessibility audit category powered by axe-core. It runs as part of the Lighthouse suite in Chrome DevTools, as a CLI tool, or via PageSpeed Insights.
- Scoring: Provides a 0–100 accessibility score based on axe-core results
- Integration: Built into Chrome DevTools (Lighthouse panel), also available via the npx lighthouse CLI (example below)
- Limitation: Uses a subset of axe-core rules, so running axe-core directly will catch more issues than Lighthouse alone
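To get only the accessibility category from the CLI, a command like the following works; the URL and output path are placeholders:

npx lighthouse http://localhost:3000 --only-categories=accessibility --output=html --output-path=./a11y-report.html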
What Automated Tools CAN Catch
Automated tools typically catch 30–40% of all WCAG issues. The types of issues they excel at detecting include:
- Missing alt text — Images without alt attributes are easily detected by scanning the DOM
- Color contrast failures — Tools can compute the contrast ratio between text color and background color and compare it against WCAG thresholds (4.5:1 for normal text, 3:1 for large text at Level AA); a sketch of that calculation follows this list
- Missing form labels — Input elements without associated <label> elements or aria-label/aria-labelledby attributes
- Duplicate IDs — Multiple elements sharing the same id attribute, which breaks ARIA references and label associations
- Missing landmark regions — Pages without <main>, <nav>, <header>, or <footer> elements (or equivalent ARIA roles)
- Missing document language — The <html> element lacking a lang attribute
- Empty links and buttons — Interactive elements with no accessible name
- Invalid ARIA attributes — Using ARIA roles, states, or properties incorrectly
- Missing page title — Pages without a <title> element
- Tabindex misuse — Elements with tabindex values greater than 0, which disrupts natural tab order
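Color contrast is a good illustration of why these checks are automatable: the result falls out of the WCAG relative-luminance formula. Here is a minimal sketch of the calculation, not taken from any particular tool:

// WCAG relative luminance of an sRGB color, with channels given as 0-255 values
function relativeLuminance(rgb: [number, number, number]): number {
  const [R, G, B] = rgb.map((channel) => {
    const s = channel / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * R + 0.7152 * G + 0.0722 * B;
}

// Contrast ratio = (lighter + 0.05) / (darker + 0.05), ranging from 1:1 to 21:1
function contrastRatio(fg: [number, number, number], bg: [number, number, number]): number {
  const l1 = relativeLuminance(fg);
  const l2 = relativeLuminance(bg);
  return (Math.max(l1, l2) + 0.05) / (Math.min(l1, l2) + 0.05);
}

// #767676 text on a white background is about 4.54:1, just past the 4.5:1 AA threshold
console.log(contrastRatio([118, 118, 118], [255, 255, 255]).toFixed(2));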
What Automated Tools CANNOT Catch
The remaining 60–70% of WCAG issues require human judgment. Automated tools fundamentally cannot evaluate:
- Alt text quality — A tool can detect that an image has an alt attribute, but it cannot determine whether alt="image" is meaningfully descriptive. Only a human can judge if the alternative text conveys the same information as the image.
- Logical reading order — CSS can visually reorder content in ways that do not match the DOM order. A screen reader follows the DOM, so visually reordered content may be read in a confusing sequence. Automated tools cannot evaluate whether the reading order is logical.
- Keyboard trap nuances — While tools can detect some keyboard traps (focus that enters a component but cannot leave), complex interactive widgets like modal dialogs, date pickers, and custom dropdowns require manual testing to verify that focus is managed correctly (see the focus-management sketch after this list).
- Complex widget accessibility — Custom components like carousels, accordions, tab panels, and tree views need manual verification that ARIA roles, states, and properties are implemented correctly and that the expected keyboard interaction patterns work.
- Meaningful heading hierarchy — Tools can detect if heading levels are skipped (e.g., jumping from <h2> to <h4>), but they cannot evaluate whether the heading text accurately describes the section content.
- Error handling and form guidance — Whether error messages are clear, specific, and help users correct their input requires human evaluation.
- Consistent navigation — Whether navigation patterns are consistent across pages is a cross-page evaluation that most tools do not perform.
- Timing and animations — Whether users have enough time to complete tasks, whether animations can be paused, and whether motion is reduced when the user requests it.
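To make the keyboard trap point concrete, here is a minimal sketch of the focus trapping a tester checks for inside a modal dialog. A scanner can confirm the dialog's role and ARIA attributes, but only a person or a hand-written test can confirm this behavior actually works. The function name and selector list below are illustrative, not taken from any library.

// Keep Tab and Shift+Tab cycling within an open modal dialog
function trapFocus(modal: HTMLElement): void {
  const focusable = modal.querySelectorAll<HTMLElement>(
    'a[href], button:not([disabled]), input, select, textarea, [tabindex]:not([tabindex="-1"])'
  );
  const first = focusable[0];
  const last = focusable[focusable.length - 1];

  modal.addEventListener('keydown', (event: KeyboardEvent) => {
    if (event.key !== 'Tab') return;
    if (event.shiftKey && document.activeElement === first) {
      // Shift+Tab on the first element wraps around to the last
      event.preventDefault();
      last.focus();
    } else if (!event.shiftKey && document.activeElement === last) {
      // Tab on the last element wraps back to the first
      event.preventDefault();
      first.focus();
    }
  });

  // Move focus into the dialog as soon as it opens
  first.focus();
}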
How CodeFrog Uses axe-core
CodeFrog integrates axe-core to provide automated accessibility testing directly on your localhost development environment. Here is how it works:
- Local crawling: CodeFrog crawls your localhost site, discovering pages by following links
- axe-core injection: For each page, CodeFrog injects axe-core into the page context and runs a full accessibility audit (a rough sketch of this general pattern follows the list)
- Violation reporting: Results are organized by severity (critical, serious, moderate, minor) with specific WCAG success criteria references
- Remediation guidance: Each violation includes a description of the issue, the affected HTML element, and guidance on how to fix it
- Pre-production testing: Because testing happens on localhost, you catch and fix issues before they ever reach production
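CodeFrog's crawler itself is not something you configure, but the inject-and-run pattern described above is the same one you can reproduce with Puppeteer and the axe-core npm package. The following is an illustrative sketch under those assumptions, not CodeFrog's actual implementation; it assumes both packages are installed and a CommonJS build where require.resolve is available.

import puppeteer from 'puppeteer';

async function auditPage(url: string) {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: 'networkidle0' });

  // Inject the axe-core script (resolved from node_modules) into the page context...
  await page.addScriptTag({ path: require.resolve('axe-core') });

  // ...then run the full audit inside the page and pull the results back out
  const results = await page.evaluate(() => (window as any).axe.run());
  await browser.close();

  // Each violation carries a rule id, an impact level, help text, and the affected nodes
  return results.violations;
}

auditPage('http://localhost:3000/').then((violations) => {
  console.log(`${violations.length} violations found`);
});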
Integrating Automated Testing into CI/CD
Running accessibility checks as part of your continuous integration pipeline ensures that new code does not introduce accessibility regressions. Here is how to set up Pa11y CI as a quality gate:
Step 1: Install Pa11y CI
npm install --save-dev pa11y-ci
Step 2: Create a Configuration File
Create a .pa11yci file in your project root:
{
"defaults": {
"standard": "WCAG2AA",
"timeout": 30000,
"wait": 1000
},
"urls": [
"http://localhost:3000/",
"http://localhost:3000/about",
"http://localhost:3000/contact",
"http://localhost:3000/products"
]
}
Step 3: Add to Your CI Pipeline
In a GitHub Actions workflow, you can run Pa11y CI after starting your development server:
# In your GitHub Actions workflow
- name: Start dev server
run: npm start &
- name: Wait for server
run: npx wait-on http://localhost:3000
- name: Run accessibility tests
run: npx pa11y-ci
Step 4: Configure Thresholds
Pa11y CI exits with a non-zero status code when a page reports more errors than its threshold (zero by default), failing your CI build. You can raise the threshold for specific URLs to tolerate a known number of existing errors while keeping every other page strict, as shown below.
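For example, a per-URL entry in .pa11yci can carry its own threshold; the legacy-report URL and threshold value here are placeholders:

{
  "defaults": {
    "standard": "WCAG2AA",
    "timeout": 30000
  },
  "urls": [
    "http://localhost:3000/",
    {
      "url": "http://localhost:3000/legacy-report",
      "threshold": 5
    }
  ]
}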
Choosing the Right Tool
Each tool has its strengths. Here is a practical guide to choosing:
- For local development: Use CodeFrog (axe-core on localhost) for the most comprehensive automated scanning with zero setup
- For browser-based auditing: Use the axe DevTools browser extension or WAVE for quick visual audits during development
- For CI/CD pipelines: Use Pa11y CI or the axe-core CLI to prevent accessibility regressions in automated builds
- For quick checks: Use Lighthouse in Chrome DevTools for a fast accessibility score alongside performance and SEO audits
- For all of the above: Combine tools. Run CodeFrog locally during development, axe DevTools in the browser for spot checks, and Pa11y CI in your pipeline for continuous monitoring
Resources
- axe-core on GitHub — The open-source accessibility testing engine
- Pa11y Documentation — Free accessibility testing tools for CLI and CI/CD
- Deque University — Accessibility training and resources from the axe-core team