Hypergraph accessibility audit

Cultural and individual differences of any kind should not make it harder to use software or technology in everyday life. Regrettably, they still very much do, as the default user is an able-bodied white man in an overdeveloped country. We live in an ableist, sexist, racist, and colonial system that treats this as the default experience and builds for it, even though that experience is worth no more than anybody else's.

Take, for example, the experiences of disabled people. Computer experiences are largely designed for people with full visual acuity and motor capabilities. Any of us can be hit with a disability rather abruptly, which is likely to make the things we take for granted much harder. It takes only one accident, one disease, or the manifestation of a latent issue that is already there. We are able-bodied, until we are not.

We recognise that Liberate Science is non-diverse: we are able-bodied, mostly male, mostly white, and come from a historically privileged position. This means we have blind spots in our work, because our lived experiences are in many ways close to that default. It also means we have a responsibility towards our fellow humans from all walks of life to do our best to empathise with and build for their experiences.

To take that responsibility, we conducted an in-house accessibility audit before our public beta release, and we share our findings here to recognise our shortcomings. All of these issues will be addressed before we leave beta, because accessibility is not an afterthought to us.

A11y checklist

Using the A11y checklist for web accessibility, we audited Hypergraph (v0.5.4; see our complete audit table here). This checklist provides a way to go through (web) applications systematically, covering everything from visual contrast to the code structure of the pages.

Our key findings provide clear and practical improvements:

  • Hypergraph contains no alternative text, which will make it difficult to navigate using voice assist (see below).
  • The page structure is set up visually, but the code does not make it equally clear. For example, our heading levels are not used consistently to convey document structure.
  • Our form validation and errors need to rely less on colour indication (visual) and more on explicit text.
  • Our pages and sections do not indicate what language they are in, and do not allow Right-to-Left (RTL) languages to be displayed.
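The four findings above all translate into concrete markup changes. The sketch below is purely illustrative; the element names and text are our own assumptions, not Hypergraph's actual code:

```html
<!-- Hypothetical sketch of the fixes the findings call for. -->
<html lang="en" dir="ltr">            <!-- declare the page language; dir enables RTL display -->
  <body>
    <h1>My profile</h1>               <!-- one h1, then h2, h3, … in order, so the code mirrors the visual structure -->
    <h2>Research content</h2>

    <!-- alternative text so voice assist can describe the image -->
    <img src="avatar.png" alt="Profile photo of the author" />

    <form>
      <label for="title">Title</label>
      <input id="title" aria-invalid="true" aria-describedby="title-error" />
      <!-- error communicated as explicit text, not colour alone -->
      <p id="title-error" role="alert">Please enter a title.</p>
    </form>
  </body>
</html>
```

With `aria-describedby` pointing at the error paragraph, a screen reader announces the message when the field gains focus, instead of the user having to notice a red border.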

The A11y checklist has proven very helpful, but it is also abstract and removed from the lived experiences we are trying to serve. To empathize better with those experiences, we tried to use Hypergraph with special color displays and with voice assist to discover additional improvements.

Colour displays

One in twenty men is red-green colorblind, and many other visual impairments exist. To make sure the visual designs are accessible across different ways of seeing, we tested them using the various macOS colour filters. These screenshots indicate we do quite well, although there is some room for improvement in the consistency of colours across the design, as the inverted screen shows.

From top-left to bottom-right, clockwise: Inverted colours, blue-yellow (tritanopia), green-red filter (deuteranopia), red-green filter (protanopia), and greyscale (monochromacy).

Voice assist

With the macOS VoiceOver tool, Hypergraph became difficult to use. With our eyes closed, we attempted to create a profile, add content, and add it to our Hypergraph profile. Using our visual memory of the application, we succeeded; without it, we clearly would not have.

The voice assist test highlighted the lack of alternative text for some buttons. For example, we could not have added content to Hypergraph had we not known its location by heart (the voice assist read "link index.html", which is hardly comprehensible). Filling out the content form was also difficult because of the lack of alternative text describing the functions of the various buttons and boxes.

Ultimately, adding a next research step was impossible through just voice assist and keyboard navigation within Hypergraph. This means that a core functionality is at this moment not available to those who cannot use Hypergraph visually, and that is insufficient.
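The "link index.html" announcement is the classic symptom of an unlabelled, icon-only control. A hedged before/after sketch of the kind of labelling that fixes it (the file names and label text are assumptions, not Hypergraph's actual markup):

```html
<!-- Before: voice assist can only read the link target's file name -->
<a href="index.html"><img src="add-icon.svg" /></a>

<!-- After: an explicit label gives voice assist something meaningful to announce -->
<button type="button" aria-label="Add content to your profile">
  <img src="add-icon.svg" alt="" />  <!-- empty alt marks the icon as decorative -->
</button>
```

A screen reader would then announce "Add content to your profile, button" rather than the underlying file name.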

Conclusion

The A11y checklist provided us with concrete improvements to how we structure the Hypergraph pages that will help improve accessibility. It also showed that our visual design is quite accessible. The accessibility tests using voice assist and the colour adjustments highlighted some inconsistencies in background colour, and that the application is at the moment unusable without sight.

If you have feedback on our accessibility audit, we would love to hear it. You can reach us through the chat button (bottom right) or on [email protected].

Liberate Science GmbH, August 11, 2020