A 3-stage guide to digital accessibility assessment

21 March 2019 | Anna Gotfryd

Digital accessibility is a big issue. In the UK alone, 13 million people have a disability: that’s one in five of the population*. And we’re all living longer, which increases both the incidence of disability and the range of conditions it covers. So why is it so often an afterthought in digital design and build?

Building a great digital experience is a complex and time-consuming process. Building experiences at scale increases the complexity exponentially, and time constraints usually dictate that certain elements “need to be sacrificed” to speed up the whole process. Too often, accessibility is the consideration that suffers.

The upshot is that accessibility is currently more of a buzzword than a defined part of the process. Everybody talks about it, but the reality is that most people know very little about it. And many businesses and institutions have learned about accessibility the hard way, via a lawsuit. For website owners, developers and testers, accessibility is almost always about legal requirements, and rarely considered with the end user in mind.

It’s not until you step into the shoes of a customer with disabilities that you really understand the situation. Recently, working with a visually impaired person who was struggling to buy a train ticket online made me reappraise my view of accessibility standards.

However, inclusion in digital design and development is far from straightforward. It requires planned, well-thought-through actions at every stage of the process. But it’s never too late to audit your digital properties and assess their level of accessibility.

What is an accessibility audit?

The word audit doesn’t necessarily have positive connotations. However, the aim of an accessibility audit is to find solutions. So it’s sensible to begin by defining a set of objectives: where do you want to be once you’ve acted on the audit recommendations? Global WCAG standards help with establishing those objectives, e.g. “we would like our website to be AA compliant”. Introducing inclusion improvements retrospectively can be difficult, so we’ve developed a 3-stage audit process to help you put things right:

[Image: the 3-stage audit process]

Bug identification

Manual testing to assess whether accessibility criteria have been met is an extremely time-consuming process. In searching for a smart way to identify bugs of this nature, we use the same method we use when scoping a project from scratch: when we define a project, we don’t know every detail, but that doesn’t prevent us from capturing its complete scope.

In the same way, testing all website components (navigation, carousels, article lists etc.) against the Web Content Accessibility Guidelines (WCAG 2.1, level AA) should result in a meaningful audit, regardless of the number of pages those components are used on. To achieve this, we carry out a combination of extensive and essential testing, employing both automated and manual inspection methods. Extensive testing looks at a set of reference pages, chosen to reflect key templates; essential testing ensures that all pages ultimately meet the same set of criteria.

When assessing reference pages, testers use end-user assistive technology (Object Inspector 32, Java Access Bridge, Java Ferret, WAT Toolbar, WAT-C Toolbar, JAWS, Aria Markup, Frames Named Anchors Bookmarklet, Color Oracle, Contrast checker, WAVE, VoiceOver, the accessibility features built into Windows/Mac, Accessibility Developer Tool) to mirror the experience of a user with a disability. For instance, they’ll navigate the website in the same way, checking whether all interactive elements of the webpage are available with both mouse and keyboard. They’ll assess whether it’s possible to “view” images and videos via meaningful alternative texts, and check that a reference page is readable and usable when fonts are enlarged (e.g. to 150–200%).

For all other pages based on the same templates, essential testing uses our automated testing tool, AET (to be precise, the HTML Sniffer in AET). AET crawls every web page against the following criteria to generate an extensive report, raising red flags where appropriate for the HTML-related page elements:

1.1.1 Non-text Content (level A)
1.3.1 Info and Relationships (level A)
1.4.3 Contrast (Minimum) (level AA)
2.4.1 Bypass Blocks (level A)
2.4.2 Page Titled (level A)
3.1.1 Language of Page (level A)
4.1.1 Parsing (level A)
4.1.2 Name, Role, Value (level A)

We’ve even developed a module that aggregates bugs from across different pages, helping to identify the size of the problem and find quick wins.

For example, AET can check whether the HTML has a language attribute to pass criterion 3.1.1, and it can check whether an image has alternative text (1.1.1). But it can’t replace a human in deciding whether that alternative text is appropriate or not. That’s where manual testing from an experienced QA is essential.
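
To make the machine-checkable part concrete, here’s a minimal TypeScript sketch of those two checks using the standard browser DOM. It’s purely illustrative and is not AET’s actual implementation:

```ts
interface Flag {
  criterion: string; // WCAG success criterion, e.g. "3.1.1"
  message: string;
}

function runBasicChecks(doc: Document): Flag[] {
  const flags: Flag[] = [];

  // 3.1.1 Language of Page: the <html> element must carry a lang attribute.
  const lang = doc.documentElement.getAttribute("lang");
  if (!lang || lang.trim() === "") {
    flags.push({ criterion: "3.1.1", message: "Missing lang attribute on <html>." });
  }

  // 1.1.1 Non-text Content: every <img> needs an alt attribute.
  // A machine can only confirm the attribute exists; judging whether
  // the text is *meaningful* still needs a human tester.
  doc.querySelectorAll("img:not([alt])").forEach((img) => {
    flags.push({
      criterion: "1.1.1",
      message: `Image without alt text: ${img.getAttribute("src") ?? "(no src)"}`,
    });
  });

  return flags;
}

// Run against the current page, e.g. from the browser dev console:
console.table(runBasicChecks(document));
```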

As bugs are raised, they should be stored and filtered against the following criteria to simplify the process for the teams that will solve the identified issues (a sketch of one possible record format follows this list):

  • WCAG criteria
  • Website components that need fixing
  • Design issues
  • Content issues
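
Purely as an illustration of such a record, here is one possible shape in TypeScript. The field names are our own invention, not a schema from AET or any other tool:

```ts
type IssueArea = "design" | "content" | "front-end" | "back-end";

interface AccessibilityBug {
  wcagCriterion: string;   // e.g. "1.4.3 Contrast (Minimum)"
  level: "A" | "AA" | "AAA";
  component: string;       // the website component to fix, e.g. "carousel"
  area: IssueArea;         // which team is likely responsible for the fix
  pages: string[];         // URLs on which the bug occurs
  description: string;
}

// Filtering then becomes trivial, e.g. all content issues:
const contentIssues = (bugs: AccessibilityBug[]) =>
  bugs.filter((bug) => bug.area === "content");
```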

Recommendations

Solutions to accessibility issues are not straightforward and will always involve dispersed responsibility:

  • Design assurance (agency)
  • Implementation (software house)
  • CMS platform capabilities (the platform)
  • Content authoring (content team)

Take, for example, an issue with alternative text. It might be a content issue: the author has omitted the image’s alternative text, or the alternative text is misleading. Or it might be a platform issue: the component used doesn’t support alternative text. Another example might be around website contrast. The issue could be connected with incorrect design: the contrast ratios proposed by the designers are too low. But it might be due to bad content authoring: text placed on an image where it isn’t visible, even for people without a sight impairment.
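
Contrast, at least, is objectively measurable. WCAG defines the contrast ratio as (L1 + 0.05) / (L2 + 0.05), where L1 and L2 are the relative luminances of the lighter and darker colours, and level AA (criterion 1.4.3) requires at least 4.5:1 for normal-size text. A small TypeScript sketch of that calculation:

```ts
// Relative luminance of one sRGB channel, per the WCAG definition.
function channel(c: number): number {
  const s = c / 255;
  return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
}

function luminance([r, g, b]: [number, number, number]): number {
  return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b);
}

function contrastRatio(
  fg: [number, number, number],
  bg: [number, number, number]
): number {
  // Sort so l1 is the lighter of the two colours.
  const [l1, l2] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (l1 + 0.05) / (l2 + 0.05);
}

// Light grey text on white fails AA for normal text (needs >= 4.5):
console.log(contrastRatio([150, 150, 150], [255, 255, 255]).toFixed(2)); // ~2.96
```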

Finding the right solution obviously depends on correctly identifying the issue and assigning the right tasks to the right team. The advantage of a good CMS is that non-technical content teams will probably be able to resolve most issues without the need for code changes and back-end development.

Prioritisation

The reality is that prioritisation is an ongoing process of ranking bugs and identifying solutions. When testing for accessibility issues, it’s sometimes hard to see the wood for the trees: each and every issue seems important. However, what you really need is an aerial view of the wood, so that you can see exactly where the fires are and what needs extinguishing.

To make the selection of the most important issues objective, we have created an algorithm based on the WCAG criteria rankings (A, AA or AAA), their severity and their occurrence. This allows us to create a heatmap outlining the most glaring errors that should be prioritised. Issues around website navigation tend to be the most common first fix.
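
The weights in our algorithm vary by project; the numbers below are invented purely to illustrate the idea of combining the three factors into a single score:

```ts
interface RankedBug {
  level: "A" | "AA" | "AAA"; // WCAG conformance level of the failed criterion
  severity: 1 | 2 | 3;       // 3 = completely blocks the user
  occurrences: number;       // how many pages exhibit the bug
}

// Level A failures matter most, so they get the highest weight.
const levelWeight = { A: 3, AA: 2, AAA: 1 } as const;

// Higher score = fix sooner. Occurrence is log-scaled so a cosmetic bug
// on thousands of pages doesn't drown out a severe single-page blocker.
const priorityScore = (bug: RankedBug): number =>
  levelWeight[bug.level] * bug.severity * (1 + Math.log10(bug.occurrences));

const backlog: RankedBug[] = [
  { level: "A", severity: 3, occurrences: 120 }, // e.g. keyboard trap in navigation
  { level: "AA", severity: 1, occurrences: 4 },
];
backlog.sort((a, b) => priorityScore(b) - priorityScore(a));
```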

We usually base the prioritisation of solutions on a responsibility model. We tackle the easiest fixes (content issues) first, then move on to JS and CSS fixes (front-end issues), design fixes and, finally, back-end fixes. Alternatively, we can tackle bugs and solutions in separate workstreams, running in parallel.

The path to compliance

Different approaches work in different situations, but there’s no point identifying a huge sack of bugs without coming up with appropriate solutions.


But even this is not the end of the road. What comes next is the implementation phase, where we fix the accessibility issues, re-run the tests and roll out the fixes. However, simply fixing issues without drawing attention to them within operational teams will only lead to the same mistakes in future. So it’s very important to define a governance framework and document standards, assigning ownership within your organisation and enforcing those standards through internal processes. This is the only way to ensure WCAG compliance and bring about real change.

You can read more about the potential value of inclusion-led thinking from Open Inclusion, guest speakers at our Unconference last year. Or get in touch to find out more about how we may be able to help you revisit your digital experiences with every audience in mind.

*Source: Open Inclusion
