# A11y Compliance Testing
Your company just received a demand letter citing ADA non-compliance on your web application. The legal team is alarmed, and engineering discovers that the signup form has no label associations, the navigation is unusable with a keyboard, and the color contrast on half the call-to-action buttons fails WCAG AA standards. Accessibility is not a nice-to-have — it is a legal requirement and an engineering discipline that AI tools can dramatically accelerate.
## What You’ll Walk Away With

- Automated WCAG 2.2 AA compliance testing integrated into your CI pipeline
- AI-powered accessibility auditing that explains violations and generates fixes
- Playwright-based accessibility test patterns for E2E flows
- Keyboard navigation and screen reader testing workflows
- Prompts that generate accessible code from the start
## Automated WCAG Auditing

### Using axe-core with AI Analysis
Section titled “Using axe-core with AI Analysis”Set up automated accessibility testing for our Next.js application:
1. Install @axe-core/playwright for Playwright integration2. Create an accessibility test helper at /tests/a11y/audit-helper.ts that: - Runs axe-core on any page - Filters results by WCAG 2.2 AA criteria - Categorizes violations by severity (critical, serious, moderate, minor) - Generates fix suggestions for each violation
3. Create accessibility tests for our critical pages: - /tests/a11y/pages/homepage.a11y.spec.ts - /tests/a11y/pages/login.a11y.spec.ts - /tests/a11y/pages/dashboard.a11y.spec.ts - /tests/a11y/pages/settings.a11y.spec.ts
4. Each test should: - Run the full axe audit - Fail on critical and serious violations - Report moderate violations as warnings - Test at both desktop and mobile viewports
Follow our Playwright config in playwright.config.tsclaude "Set up accessibility testing infrastructure:
1. Install: npm install -D @axe-core/playwright2. Create /tests/a11y/run-audit.ts: - Accept a URL and Playwright page - Run axe-core with WCAG 2.2 AA rules - Return structured results with fix suggestions3. Create tests for all routes in our application: - Discover routes from /src/pages/ directory - Generate an a11y test for each page - Save to /tests/a11y/pages/4. Run all accessibility tests5. Generate a compliance report at /docs/a11y-report.md
For each violation found, include:- The WCAG criterion it violates- The HTML element causing the violation- A specific code fix"Set up WCAG 2.2 AA compliance testing:1. Add axe-core to our Playwright test infrastructure2. Create accessibility tests for every page3. Run the audit and generate a compliance report4. Create fixes for any critical violations found5. Create a PR with tests, fixes, and the compliance reportKeyboard Navigation Testing
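Keyboard coverage can be requested in the same prompt-driven style as the audits above. An illustrative prompt (the specific checks and output path are assumptions, adapt to your flows):

```
Create keyboard navigation tests for our critical flows:
1. Verify every interactive element is reachable with Tab / Shift+Tab
2. Verify the tab order matches the visual order of the page
3. Verify a visible focus indicator appears on every focusable element
4. Verify Escape closes modals and returns focus to the triggering element
5. Verify there are no keyboard traps: focus can always move forward and back

Save tests to /tests/a11y/keyboard/
```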
## Generating Accessible Code from the Start
Section titled “Generating Accessible Code from the Start”Instead of auditing and fixing, generate accessible code from the beginning.
## CI Integration for Accessibility

1. **Run axe-core on every PR**

   Add accessibility testing to your CI pipeline. Fail the build on critical and serious violations.

2. **Track compliance over time**

   Store audit results and trend them. Your compliance score should improve over time, not degrade.

3. **Prevent new violations**

   The most important gate: no new critical or serious accessibility violations in any PR. Existing violations can be tracked in a backlog.

4. **Weekly accessibility report**

   Generate a report showing: total violations by severity, new violations this week, resolved violations, compliance percentage by page.
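The PR gate can be sketched as a GitHub Actions job. This is a minimal example, assuming the a11y specs from earlier live under `tests/a11y/` and fail the run on critical or serious violations (workflow name, paths, and commands are assumptions; adapt to your pipeline):

```yaml
name: a11y
on: pull_request
jobs:
  axe-audit:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm ci
      - run: npx playwright install --with-deps chromium
      # Runs only the accessibility specs; the helper inside them
      # is responsible for failing on critical/serious violations.
      - run: npx playwright test tests/a11y
```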
## When This Breaks

“axe-core reports hundreds of violations and it is overwhelming.” Start by fixing only critical violations. These are the ones that completely block users from accessing content. Move to serious, then moderate. Do not try to fix everything at once.
“The AI generates accessible code but it looks different from our design system.” Add your design system constraints to the prompt: “Use our existing color palette. Focus indicator must be our brand blue (#2563eb) with a 2px offset.” Accessible does not mean ugly — it means intentional.
“Screen reader testing is manual and slow.” Automated tests catch 30-40% of accessibility issues. The rest require manual testing with screen readers (VoiceOver, NVDA). Use AI to generate a manual test checklist for QA, focusing on the interactions that automated tools miss.
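Such a checklist prompt might look like this (the flow and screen reader choices are illustrative):

```
Generate a manual screen reader test checklist for our checkout flow.
Assume VoiceOver on macOS and NVDA on Windows. Cover:
- Form field labels and error announcements
- Dynamic content updates (live regions)
- Modal focus management and dismissal
- Image alt text quality (not just presence)
Output a markdown checklist QA can follow step by step.
```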
“Developers do not know enough about accessibility to fix violations.” The AI prompts in this guide generate both the fix and the explanation. Developers learn accessibility by fixing real violations with context, not by reading documentation.