Data sonification & case study: presenting astronomical events to the visually impaired via sound

Kim-Marie Jones

Arup, L5 Barrack Place 151 Clarence Street, Sydney, NSW, 2000, Australia

Additional authors: Mitchell Allen (Arup), Kashlin McCutcheon

Popular version of 3aSP4 – Development of a Data Sonification Toolkit and Case Study Sonifying Astrophysical Phenomena for Visually Impaired Individuals
Presented at the 185th ASA Meeting
Read the abstract at

Please keep in mind that the research described in this Lay Language Paper may not have yet been peer reviewed.

Have you ever listened to stars appearing in the night sky?

Image courtesy of NASA & ESA; CC BY 4.0

Data is typically presented visually. Sonification, by contrast, is the use of non-speech audio to convey information.

Acousticians at Arup had the exciting opportunity to collaborate with astrophysicist Chris Harrison to produce data sonifications of astronomical events for visually impaired individuals. The sonifications were presented at the 2019 British Science Festival (at a show entitled A Dark Tour of The Universe).

There are many sonification tools available online. However, many of these tools require in-depth knowledge of computer programming or audio software.

The researchers aimed to develop a sonification toolkit which would allow engineers working at Arup to produce accurate representations of complex datasets in Arup’s spatial audio lab (called the SoundLab), without needing to have an in-depth knowledge of computer programming or audio software.

Using sonifications to analyse data has some benefits over data visualisation. For example:

  • Humans can process and interpret many different sounds simultaneously in the background while carrying out a task (for example, a pilot can focus on flying while interpreting important alarms in the background, without turning their attention to a screen or gauge),
  • The human auditory system is incredibly powerful and flexible, effortlessly performing extremely complex pattern recognition (for example, the health and emotional state of a speaker, as well as the meaning of a sentence, can be determined from just a few spoken words) [source],
  • and of course, sonification also offers visually impaired individuals the opportunity to understand and interpret data.

The researchers scaled and mapped each stream of astronomical data to a parameter of sound, and used their toolkit to create accurate sonifications of astronomical events for the show at the British Science Festival. The sonifications were vetted by visually impaired astronomer Nicolas Bonne to confirm that they faithfully represented the underlying data.
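The core idea of parameter mapping can be illustrated with a short sketch: each data value is linearly rescaled into an audible range of some sound parameter, such as pitch. This is a minimal, hypothetical example in Python, not the Arup toolkit itself; the function name, data values, and frequency range are illustrative assumptions.

```python
# Illustrative parameter-mapping sketch (not the Arup toolkit):
# linearly rescale a data stream into an audible frequency range,
# so larger data values are heard as higher pitches.

def map_to_frequency(values, f_min=220.0, f_max=880.0):
    """Linearly rescale data values to frequencies in Hz (A3 to A5 by default)."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero for constant data
    return [f_min + (v - lo) / span * (f_max - f_min) for v in values]

# Hypothetical star brightness values mapped to pitches
brightness = [2.1, 4.7, 3.3, 9.8, 6.0]
freqs = map_to_frequency(brightness)
```

In a full sonification, each mapped frequency would drive a tone generator, and further data streams (e.g. event time or sky position) could be mapped to other parameters such as onset time, loudness, or spatial position in the SoundLab.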

Information on A Dark Tour of the Universe is available at the European Southern Observatory website, as are links to the sonifications. Make sure you listen to stars appearing in the night sky and galaxies merging! Table 1 gives specific examples of parameter mapping for these two sonifications. The concept of parameter mapping is further illustrated in Figure 1.

Table 1
Figure 1: image courtesy of NASA’s Space Physics Data Facility