Finding the hidden clues
Illustration by Andrea Fuentes
As a designer deeply involved in web accessibility, I often encounter questions about manual testing and how it can appear overwhelming, like trying to piece together a puzzle with missing pieces or navigate a room full of hidden clues. But I’ve found ways to make it approachable and effective. In this article, I’ll share how I’ve integrated manual testing into my workflow, focusing on how we can make our websites better. We build not just for today, but for those who come after.
At its core, manual accessibility testing involves a real person looking for issues that could create barriers for someone using assistive technology. While it can be time-consuming and requires specific knowledge, I believe that even partial manual audits offer significant value, revealing crucial ‘clues’ that automated tools often miss.
Why manual testing matters (and isn’t always intimidating)
Many people find manual testing intimidating because it seems to demand extensive time and resources, along with specialized knowledge. But a valuable manual test doesn’t always have to be a comprehensive audit.
Think of it this way: a designer reviewing color contrast on a new button, a developer ensuring the code order matches the visual reading order, or a tester using evaluation tools to guide their template checks. All of these are forms of manual testing, demonstrating that small, focused efforts can be highly impactful. The key is sampling.

Sampling in manual testing
Sampling is how I combat the time and resource challenges of manual testing: picking specific clues or puzzle sections to focus on. This keeps manual testing manageable while still yielding valuable insights.
I find that automated and manual testing are much more powerful when used together. Automated tools are like the basic scan of a room, excellent at checking every corner for issues and reducing the number of problems you need to find manually. Manual testing, conversely, uncovers issues that automated tools miss and helps you understand their real-world impact.
My testing samples
I typically operate within a partial manual audit framework:
- Focus on specific issues: You might test all images on a page, specifically looking at their alternative text, or check color contrast issues within the main content rather than across the entire website.
- Focus on specific pages: Instead of testing every page, you might manually test four key pages: the homepage, a complex page, a crucial user flow, and a random page. This often provides significant insight into common issues across the site, like checking different rooms.
- Focus on certain tests: Sometimes, you’ll only perform a WAVE review, or just zoom and keyboard tests, rather than every single accessibility test. This is like focusing on lock picking in one round, then deciphering codes in another.
These sampling methods make manual testing doable within my daily workflow.
Testing techniques: Your accessibility toolkit
Checking for content reflow
Zoom tests ensure that content remains readable and usable when a user magnifies the page. This is like checking if the puzzle pieces still fit together when you look at them from a different angle.
Here are two main zoom tests you can perform:
- Zooming to 200%: Use your browser’s zoom function (Command or Control + plus key) to magnify the page. Then, simply scroll through to check for overlapping content or horizontal scrolling for the entire page. Also, make sure the menu navigation reflows correctly.
- Zooming to 400% at 1280 pixels wide: First, reset your zoom to 100%. Then, open your browser’s developer tools (right-click and select ‘Inspect’) to resize the viewport to exactly 1280 pixels wide. After closing the inspect panel, zoom in to 400%. Then, scroll through the page, looking for overlapping content, excessive horizontal scrolling, or content that becomes unreadable. This test often reveals issues with sticky headers or carousels.
It’s common for a website to switch to its mobile version at 400% zoom, especially when starting at 1280 pixels. This often triggers media queries, which is a good sign that the responsiveness is working.
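If you want to script the overflow part of this check, here’s a minimal sketch using Playwright (my tooling assumption here, not a requirement, and the URL is a placeholder). At 400% zoom on a 1280-pixel-wide window, the effective CSS viewport is 320 pixels, so the script loads the page at that width and asks whether anything forces horizontal scrolling. The visual scroll-through is still yours to do by hand.

```typescript
// A scripted version of the 400% zoom reflow check, assuming Playwright.
// At 400% zoom in a 1280px-wide window, the effective CSS viewport is
// 1280 / 4 = 320px, so we load the page at that width directly.
import { chromium } from 'playwright';

(async () => {
  const browser = await chromium.launch();
  const page = await browser.newPage({
    viewport: { width: 320, height: 256 }, // 1280 x 1024 at 400% zoom
  });
  await page.goto('https://example.com'); // placeholder URL

  // If the document is wider than the viewport, something is forcing
  // horizontal scrolling: a likely reflow failure worth inspecting.
  const overflows = await page.evaluate(() => {
    const el = document.scrollingElement!;
    return el.scrollWidth > el.clientWidth;
  });
  console.log(overflows ? 'Horizontal overflow detected' : 'Content reflows');

  await browser.close();
})();
```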
Navigating without a mouse
Keyboard testing is crucial because it ensures that all interactive elements on a page can be accessed and operated using only a keyboard. This is your key to navigating the digital space.
- Tab: To navigate forward through interactive elements.
- Shift + Tab: To navigate backward.
- Enter: To select links or buttons.
- Space bar: To select checkboxes, radio buttons, or dropdowns.
- Arrow keys: To navigate within interactive elements like forms or carousels.
- Escape: To close dialog boxes or modals.
When performing a keyboard test, look for several key indicators:
- Focus indicator: Every interactive element you tab to must have a visible focus indicator: a clear visual outline showing where the keyboard focus is. This indicator needs sufficient contrast to stand out.
- Logical navigation order: The tab order should follow the visual reading order of the page. It shouldn’t jump around erratically, creating confusing “jumps” in your path.
- Keyboard accessibility of pointer gestures: Any components that rely on finger movements on touchscreens or trackpads must be fully operable with a keyboard or offer an alternative method to perform the action.
- Skip to content link: For pages with lengthy navigation, a “Skip to main content” link is essential. This allows keyboard users to bypass repetitive navigation.
- Form completion: I ensure that forms can be completed entirely using only the keyboard.
Essentially, if something can’t be done with just a keyboard, it’s an accessibility issue.
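To make the tab-order review easier to record, here’s a small sketch along the same lines, again assuming Playwright and a placeholder URL. It logs each tab stop so erratic jumps stand out; whether the focus indicator is actually visible and high-contrast still needs your eyes.

```typescript
// Walks the tab order, assuming Playwright: presses Tab repeatedly and logs
// which element receives focus, so the sequence can be compared against the
// visual reading order of the page.
import { chromium } from 'playwright';

(async () => {
  const browser = await chromium.launch();
  const page = await browser.newPage();
  await page.goto('https://example.com'); // placeholder URL

  for (let i = 1; i <= 20; i++) { // sample the first 20 tab stops
    await page.keyboard.press('Tab');
    const stop = await page.evaluate(() => {
      const el = document.activeElement as HTMLElement | null;
      if (!el) return '(nothing focused)';
      // Tag name plus a rough label: enough to spot confusing jumps.
      const label = (el.getAttribute('aria-label') || el.innerText || '')
        .trim()
        .slice(0, 40);
      return `${el.tagName.toLowerCase()} "${label}"`;
    });
    console.log(`${i}. ${stop}`);
  }

  await browser.close();
})();
```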
Automated/manual review: Guiding my accessibility checks
I frequently use the free WAVE extension as a guide for my accessibility reviews. While it performs automated checks, it highlights areas for manual inspection. Think of it as a blueprint, showing you where to look for specific types of clues.
The main areas I focus on are:
- Order tab: This tab shows the order of elements on the page, which directly corresponds to the tab order. It also displays the “accessible name” for interactive elements, which is what a screen reader announces. I quickly scroll through this to check for logical order and correct accessible names.
- Structure tab: Here, I review the structure of regions (like header, main content, footer) and, critically, the heading structure (H1, H2, H3, etc.). Screen readers use these to navigate, so I ensure they are logical and correctly applied. For example, a long paragraph marked as an H3 is a red flag, like a misplaced sign leading to nowhere.
- Contrast tab: While automated tools are good at checking text contrast, they struggle with contrast in images or infographics. The WAVE Contrast tab provides a color picker tool, allowing me to manually sample colors from an image or graphic and check their contrast ratio against accessibility guidelines. This ensures all visual clues are perceivable.
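If you’re curious what that color picker is computing, the check boils down to the WCAG 2.x contrast formula: the relative luminance of each color, then the ratio between the lighter and darker values. Here’s a minimal, self-contained sketch of the math:

```typescript
// The WCAG 2.x contrast math that checkers such as WAVE evaluate against:
// relative luminance of each color, then the lighter-to-darker ratio.

function relativeLuminance([r, g, b]: [number, number, number]): number {
  // Linearize each 0-255 sRGB channel per the WCAG definition.
  const lin = (c: number) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b);
}

function contrastRatio(
  a: [number, number, number],
  b: [number, number, number],
): number {
  const [lighter, darker] = [relativeLuminance(a), relativeLuminance(b)].sort(
    (x, y) => y - x,
  );
  return (lighter + 0.05) / (darker + 0.05);
}

// Mid-grey text (#767676) on white: about 4.54, just above the 4.5:1
// minimum for normal-size text.
console.log(contrastRatio([118, 118, 118], [255, 255, 255]).toFixed(2));
```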
Getting started with screen readers
Screen reader testing is vital, but it does have a learning curve. Think of it as learning a new language to interpret the puzzle’s audio clues.
- Set up your screen reader: I recommend setting up NVDA on Windows or VoiceOver on Apple devices. NVDA can be a bit easier to learn due to its output and navigation.
- Learn the stop button: This is crucial! Knowing how to stop the screen reader’s speech (the Control key in NVDA and most screen readers) is essential to avoid frustration.
- Practice on a known good website: I start by testing a website I know has few accessibility issues. This helps me understand what normal screen reader output and navigation sound like, so I can differentiate between a screen reader issue and a website issue.
- Embrace the learning curve: The best way to learn is by doing. When I encounter something I’m unsure about, I research it. Little by little, I become more comfortable.
Before I even begin screen reader testing, I make sure automated errors are fixed and I’ve completed my keyboard reviews. This significantly streamlines the process. When I use a screen reader, I focus on:
- Correct announcements: Ensuring all focusable elements are announced correctly. Do the audio cues provide the expected information?
- Dynamic content accessibility: Checking that content changes triggered by user input are announced clearly. Do new messages get announced as they appear?
- Form instructions and error feedback: Verifying that form instructions and error messages are announced logically. Are you clearly guided on how to enter the text?
- Custom widget navigation: Confirming that custom components are navigable and announced properly. Can you interact with all the interactive elements?
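Much of what makes dynamic content announceable comes down to ARIA live regions. Here’s a minimal sketch of the pattern I’m listening for during these tests (the helper name is just illustrative):

```typescript
// A polite live region: screen readers announce text inserted into it
// without moving the user's focus. If a status message is not in a live
// region (or an element with an equivalent role), many screen readers
// simply stay silent when it appears.

const status = document.createElement('div');
status.setAttribute('role', 'status');      // implies polite live semantics
status.setAttribute('aria-live', 'polite'); // announce at the next pause
document.body.appendChild(status);

// "announce" is just an illustrative helper name.
function announce(message: string): void {
  // Changing the region's text content is what triggers the announcement.
  status.textContent = message;
}

// For example, after a successful form submission:
announce('3 items added to your cart');
```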
Incorporating manual testing into my workflows: Solving the puzzle together
Here are a few ways I’ve seen manual testing successfully integrated into different workflows:
- Designers: As you refine a new UI component or a single screen, do a quick manual review focused on color contrast and heading structure. This ensures your part of the puzzle is clearly defined and visually accessible.
- Minor development releases: For minor releases, someone on my team might do a zoom, keyboard, and WAVE review on just the updated parts of the website or template before it goes live.
- Major development releases: For larger releases, I might perform all the manual tests (zoom, keyboard, WAVE review, and screen reader tests), but I’ll limit my focus to the affected parts of the application.
- Annual audit: Once a year, my organization might outsource a comprehensive third-party manual audit. Alternatively, I might perform a full set of manual tests on a sample of four key pages as a periodic check of the entire experience.
My journey: unlocking digital experiences together
So, what’s it really like to do these audits? Honestly, the actual testing is just the start. For every hour I spend hunting for accessibility clues, I often spend more time chatting about what I found, explaining why it matters, and retesting fixes. My goal is to partner with clients to build better experiences, not just deliver a list of problems. The most rewarding moments happen when we work together to untangle a “mess,” which is how teams truly learn to build inclusively from the start.
This hands-on, team effort has a real impact. For example, I worked with an education platform whose commitment to accessibility is loud and clear, even though they don’t publicly share audit specifics. We performed several audits on their design system and specific user flows. By continuously evaluating their products internally and with outside help, they ensure their materials work for all students. Their dedication to Universal Design for Learning shows that we’re all in this together, creating a great and fair learning experience for everyone.
Conclusion: your role in the inclusive digital world
Manual accessibility testing, often seen as a huge task, becomes manageable through sampling. By choosing carefully what and how to test, even partial manual tests deliver great results. The best approach combines automated and manual testing, since each complements the other. And quick manual checks, like keyboard or WAVE reviews, make screen reader testing easier later.
I encourage you to do a zoom, keyboard, or WAVE review of your web pages today. Remember, there are many ways to add even a small manual test to your work, making your websites better for everyone, and for those who come after.