Accessibility: Automated Testing of Websites
You may have read in our previous blog post what the Accessibility Act means for websites. In this article, we’ll guide you step by step through getting a quick, automated overview of your website’s accessibility. We’ll introduce two widely used tools for automated accessibility testing: Google Lighthouse and WAVE. Both are free to use and require no technical expertise.
Automated vs. Manual Testing
Before we dive into the practical part of this article, we’d like to take a closer look at the two types of accessibility testing and explain why a combination of both methods is essential to thoroughly evaluate a website’s accessibility.
Automated tests require no prior knowledge and can be run very quickly. They primarily assess the technical implementation of the site, checking elements embedded in the website’s underlying code that shape its static appearance. However, automated tests fall short when it comes to the actual usability of the site, such as whether it can be navigated with a keyboard.
Manual testing, in contrast, is more complex and time-consuming. To perform it effectively, a solid understanding of accessibility barriers and how they can appear on websites is necessary. Manual tests reflect how well a website can truly be used by people with disabilities, capturing the real-world user experience.
In summary, automated tests are useful for gaining an initial impression of a website’s structure and technical soundness. A good result indicates that the site has a solid foundation for accessibility. Only through manual testing, however, can the genuine accessibility of the website be assessed. Relying solely on automated tests is therefore never sufficient.
Testing with Lighthouse
- If you’re using Google Chrome as your browser, you already have Lighthouse installed and simply need to open Chrome. If you’re using a different browser, you can add Lighthouse as an extension. This example will focus on using Lighthouse within Chrome.
- Enter the URL of the website you want to test into the browser's address bar. For this demonstration, we’ve used the ORF website.
- Next, open Chrome’s Developer Tools. You can do this by pressing the F12 key or by right-clicking on the page and selecting “Inspect” from the context menu.
- Once in Developer Tools, navigate to the “Lighthouse” tab.
- To test the ORF homepage, select the “Navigation” mode, set the device to “Desktop,” and choose only the “Accessibility” category.
- Click the “Analyze Page Load” button, and after a brief moment, you’ll receive the results.
The result
It’s immediately noticeable that Lighthouse gives the page an accessibility score of 84 out of 100. This is a strong score, but there’s still room for improvement.
A full breakdown of the results would be beyond the scope of this article. However, we’d like to highlight one specific example. Lighthouse points out that some images are missing the “alt” attribute (“alt” stands for “alternative”). The “alt” attribute is used to describe the content of important images, and this description is read aloud by screen readers.
The “alt” attribute must always be present for images, though in some cases it should remain empty—for example, when the element is purely decorative and doesn’t convey any important information to the user. In this situation, an empty “alt” attribute signals to a screen reader that the image should be ignored. You can learn more about this in the Web Content Accessibility Guidelines (WCAG).
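To make this concrete, here is a minimal HTML sketch (the file names and descriptions are invented for illustration):

```html
<!-- Informative image: the alt text is read aloud by screen readers -->
<img src="team-photo.jpg" alt="The editorial team at a press conference">

<!-- Purely decorative image: an empty alt tells screen readers to skip it -->
<img src="ornament.png" alt="">

<!-- Missing alt attribute: this is what Lighthouse flags; screen readers
     may fall back to reading the file name aloud -->
<img src="ornament.png">
```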
Lighthouse lists all affected elements. On the left, you’ll see the elements visually highlighted with a green outline. The text in the right column shows the element's class—an attribute that web developers use to identify and style elements.
In our example, Lighthouse flags only the “alt” attributes of icons placed by orf.at alongside links that lead to subpages (e.g., https://wien.orf.at). In these cases, an “alt” attribute definitely needs to be added. The question remains whether it can be left empty. A quick check with a screen reader reveals that, while the article headline indicates a link, it doesn’t specify where the link leads. To improve navigation, the “alt” attribute should mention that the link directs to a subpage.
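As a rough sketch of what such a fix could look like (the markup and class name here are invented; orf.at’s actual code will differ):

```html
<!-- Before: the icon has no alt text, so screen reader users hear
     only the headline and not where the link leads -->
<a href="https://wien.orf.at">
  <img src="icon-wien.svg" class="story-icon">
</a>

<!-- After: the alt text announces the link target -->
<a href="https://wien.orf.at">
  <img src="icon-wien.svg" class="story-icon" alt="Go to the wien.ORF.at subpage">
</a>
```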
Testing with WAVE
- To test with WAVE, open https://wave.webaim.org/.
- Enter the web page address you wish to test into the “Web Page address” field and click the small arrow next to the input box. We’ll once again use the ORF website for this test.
- You will receive the results after a short moment.
If you don’t want to visit the WAVE website every time you run a test, you can also install the official browser extension.
The result
Unlike Lighthouse, WAVE displays the tested areas directly on the page being analyzed. To view the details, simply click on the “Details” tab within the WAVE section.
It is immediately noticeable that, like Lighthouse, WAVE flags the missing “alt” texts as errors. WAVE also displays several warnings. A warning indicates that there might be an issue, but it requires human review for further evaluation. By clicking on the icons, you can see directly on the website which element is affected. Just as we did with Lighthouse, we’ll take a closer look at one of the problematic aspects here.
Like Lighthouse, WAVE highlights that the contrast in some areas is insufficient. High contrast is crucial to help people with visual impairments recognize and use all important elements on a website, including those affected by conditions such as cataracts. Even people without visual impairments benefit from high contrast, for instance when screen brightness is inadequate.
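For reference, the WCAG defines contrast as a ratio between the relative luminance of the lighter color (L1) and the darker color (L2):

contrast ratio = (L1 + 0.05) / (L2 + 0.05)

For normal-sized text, WCAG level AA requires a ratio of at least 4.5:1; for large text, 3:1 is enough. Black on white yields the maximum possible ratio of 21:1.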
Looking at WAVE’s results, two main contrast issues stand out, which we have marked graphically in the illustration below: active links in the header (1) and image captions (2). For clarity, we have hidden all other notifications.
In the image captions, it is immediately clear that contrast is an issue in some cases. For instance, in area (2), consider the article “Studie: Klimakrise trifft Olympiastandorte” (“Study: Climate Crisis Hits Olympic Venues”, second image from the left in the bottom row): here, white text is displayed on a very light background.
The contrast issue in the header is less obvious: the current weather is shown on a slightly lighter background compared to the darker row above it (containing links like Television, ORF ON, etc.). But what about the last error on the top right of the header? Here, WAVE has unfortunately placed the error icon in a confusing spot. Looking at the page in its original form, you can see that the currently active link (“News”) has insufficient contrast, as the text is rendered in gray.
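As an illustration of the kind of adjustment involved (the color values below are invented for this example, not orf.at’s actual styles):

```html
<!-- Problematic: gray text on a light background, a ratio of roughly 2.5:1 -->
<a href="#" style="color: #999999; background-color: #f2f2f2;">News</a>

<!-- Better: darker text, roughly 6.7:1, comfortably above the 4.5:1 AA minimum -->
<a href="#" style="color: #555555; background-color: #f2f2f2;">News</a>
```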
Learn more about contrast in the WCAG.
Conclusion
You have successfully completed your first automated accessibility tests — congratulations! Discuss the test results with your web developer, who can interpret the findings and implement the appropriate measures.
As mentioned at the beginning, automated tests are great for quickly and easily identifying technical weaknesses. True accessibility, however, can only be assessed through manual testing. In our example with orf.at, Lighthouse’s score of 84 points and WAVE’s 27 errors may paint a bleaker picture than the experience of actually using the site. In our next article, we’ll show you how to conduct manual tests and what to keep in mind, using Microsoft Accessibility Insights as a guide.