Accessibility audio map using Highcharts

A French map featuring audio, built with the Highcharts sonification module

In this article, I will walk through the process of creating an accessibility audio map using the Highcharts sonification module. This interactive map presents the population density across various regions of France. As users explore the map, they will encounter distinct sounds that vary in speed to represent the differing population densities. Additionally, the names of the French regions are audibly announced. The result is the demo below:

As you explore our interactive map, you’ll notice how it dynamically responds to user interactions such as mouseovers and clicks. Each action triggers auditory feedback, playing specific sounds and verbally announcing the names of the regions.

So, here are the requirements for such a map:

  1. Each time a user clicks on a region, a distinct sound plays, signaling a change in focus to a new region.
  2. The population density of each region is conveyed through a series of repetitive sounds.
  3. As part of its accessibility features, the map also includes a text-to-speech function that verbally announces the name of each region.

Let’s dive into the code and understand the ins and outs of this effective audio map that fits the requirements above.


The libraries used in this demo are the following (the standard Highcharts CDN builds):

<script src="https://code.highcharts.com/maps/highmaps.js"></script>
<script src="https://code.highcharts.com/modules/sonification.js"></script>
<script src="https://code.highcharts.com/modules/accessibility.js"></script>

Before going further, I would like to highlight the two distinct Highcharts accessibility modules used in the demo:

  • sonification.js to add audio features to the map.
  • accessibility.js for enhancing the map’s accessibility features.


For the dataset, I am using a simple data array of [region key, population density] pairs:

data: [
  ['fr-hdf', 189],
  // ... one [region-key, density] pair per region
]
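For context, here is a minimal sketch of how such data typically joins onto a Highcharts map of France. This is not the article's exact code: the topology URL, titles, and option values are assumptions based on the official Highcharts map collection, where French regions use `hc-key` identifiers such as `'fr-hdf'`.

```javascript
// Sketch: load the France regions topology and join the density data
// to it via the 'hc-key' identifiers. Details are assumptions.
fetch('https://code.highcharts.com/mapdata/countries/fr/fr-all.topo.json')
  .then(response => response.json())
  .then(topology => {
    Highcharts.mapChart('container', {
      chart: { map: topology },
      title: { text: 'Population density of French regions' },
      colorAxis: { min: 0 },
      series: [{
        name: 'Population density',
        joinBy: 'hc-key', // match data keys to map areas
        data: [
          ['fr-hdf', 189]
          // ... remaining regions
        ]
      }]
    });
  });
```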


The most crucial part of the code is located within the series object:

series: [{
  sonification: {
    // audio tracks (detailed below)
  },
  accessibility: {
    // screen reader configuration (detailed below)
  },
  point: {
    // click event handling (detailed below)
  }
}]

Let’s break down each section.


The tracks array within the sonification object defines the individual sound configurations. Each track answers one specific requirement defined above and represents a different aspect of the sonification:

  1. First track: To play a note on a vibraphone instrument, with a fixed pitch and volume (pitch: 'c10', volume: 0.3), indicating that the user has moved to a new region.
  2. Second track: To play repeated g2 notes according to the value of each region. The gapBetweenNotes option varies with the population density: smaller values produce larger gaps between the notes, while bigger values produce shorter ones. Notice the use of the logarithmic option, which makes changes in density easier to perceive.
  3. Third track: To speak the region's name, using type: 'speech' to activate the speech synthesis.
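Putting the three tracks together, the sonification configuration looks roughly like this. This is a sketch based on the description above, not the demo's exact code; any option value beyond those mentioned in the article (such as the mapping target) is an assumption.

```javascript
sonification: {
  tracks: [{
    // Requirement 1: a short vibraphone note marks entry to a new region
    instrument: 'vibraphone',
    mapping: {
      pitch: 'c10',
      volume: 0.3
    }
  }, {
    // Requirement 2: repeated g2 notes; the gap between notes shrinks
    // as population density grows (note the inverse mapping)
    mapping: {
      pitch: 'g2',
      gapBetweenNotes: {
        mapTo: '-value',           // higher density -> smaller gap
        mapFunction: 'logarithmic' // easier to perceive changes
      }
    }
  }, {
    // Requirement 3: announce the region name with speech synthesis
    type: 'speech',
    mapping: {
      text: '{point.name}'
    }
  }]
}
```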

As I explained in the previous accessibility audio boxplot article, it is important to understand that the sonification property configures the audio representation of the map. The mapping object inside each track is a critical part of the sonification configuration, as it defines how the different elements, such as a region's value and name, are translated into sound or speech.
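To build intuition for the logarithmic mapping, here is a small standalone function illustrating the idea (this is my own illustration, not a Highcharts API): density values are normalized on a log scale and mapped inversely onto a gap in milliseconds, so denser regions play faster note repetitions. All of the min/max values below are made up for demonstration.

```javascript
// Map a density value onto a note gap (ms) using a log scale.
// Higher density -> smaller gap -> faster repetition.
function gapForDensity(value, minValue, maxValue, minGap, maxGap) {
  const logMin = Math.log(minValue);
  const logMax = Math.log(maxValue);
  // Normalized position of this value on the log scale, in [0, 1]
  const t = (Math.log(value) - logMin) / (logMax - logMin);
  // Invert: t = 0 gives the largest gap, t = 1 the smallest
  return maxGap - t * (maxGap - minGap);
}

// Hauts-de-France (189 people/km²) gets a noticeably smaller gap
// than a sparsely populated region would:
gapForDensity(189, 10, 1000, 50, 400);
```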


The accessibility part guides the screen reader to announce each region's description first, followed by its density value, through the valueDescriptionFormat option:

accessibility: {
  point: {
    valueDescriptionFormat: '{xDescription}, {point.value} people per square kilometer.'
  }
}


This part of the code handles the click event. When a user clicks on a region, the code checks if sonifyOnHover is false (initially set at the beginning of the script). If so, it calls this.sonify() to start sonifying the clicked region. If sonifyOnHover is true, it stops any ongoing sonification by calling this.series.chart.sonification.cancel().
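That logic can be sketched as follows. The flag name sonifyOnHover and the two calls (this.sonify() and this.series.chart.sonification.cancel()) come from the description above; the surrounding wiring, including the seriesPointOptions name, is an illustrative assumption.

```javascript
// Flag defined at the beginning of the script (initially false)
let sonifyOnHover = false;

// Event configuration for the series' point option
const seriesPointOptions = {
  events: {
    click: function () {
      if (!sonifyOnHover) {
        // Start sonifying the clicked region
        this.sonify();
      } else {
        // Stop any ongoing sonification
        this.series.chart.sonification.cancel();
      }
    }
  }
};
```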

Now, you have a good idea and a starting point to create audio maps using Highcharts sonification and accessibility modules.