MuseWeb 2020: voice user interfaces and multi-modal accessibility


10 minute read
Jo Morrison


Director of Digital Innovation & Research

Arts & Culture

Digital Insights

Image from Cheryl Fogle-Hatch's research. Replica stone points, with affixed QR-code “coins”

This year’s MuseWeb 2020 conference, the world’s largest international museum technology conference, successfully transitioned its entire five-day Los Angeles programme online in just two weeks, a magnificent and Herculean effort necessitated by the worldwide lockdown.

To kick off this annual showcase of peer-reviewed practice from innovators around the globe, conference co-chair Rich Cherry said, “You are the pioneers who spend your day to day inventing the future of museums. This year is the best example of support and perseverance. It’s amazing and humbling.”

I had the honour of chairing a session entitled “Multi-Modal Access to Exhibits”, which gave me the privilege of listening to two inspiring presentations of papers authored by Scott Gillam and Ben Bergman (“The Power of Speech: Connecting Audiences in Dialogue with Voice User Interfaces”) and Cheryl Fogle-Hatch (“‘Bring Your Own Device’ (BYOD) programming facilitates accessibility for people who are blind or have low vision”).

In this article, I’ll explore their research and findings.

The Power of Speech

Scott started his session with an overview of the Canadian Museum for Human Rights (CMHR), explaining that it exists to create space for transformative conversations about human rights and to promote respect for others.

He also said that the museum revolves around the idea that respect for others and understanding the importance of human rights can serve as a positive force for change in the world.

Canadian Museum for Human Rights, Canada. Source: flickr.com

Scott went on to explain the impetus for creating the voice-controlled interface used as a key input method in the museum’s 2019 temporary exhibition. He noted that the voice user interface (VUI) has emerged as potentially the most ubiquitous user-facing solution since mobile computing and responsive design, with 1 in 4 American households having access to a smart speaker. Given such a high adoption rate, it is imperative that cultural institutions explore the engagement opportunities these technologies afford.

Methodology

Created in collaboration with photojournalist Kevin Frayer and Rohingya and Burmese community members in Canada, Time to Act: Rohingya Voices is an exhibit that portrays the struggles of the Rohingya people of Myanmar, who have faced decades of violent persecution and genocide. The exhibit showcases large-scale projections of black-and-white photos telling the story of the Rohingya people’s exodus. These images are accompanied by two large-scale tactile photographs with audio descriptions and quotes from the community members.

The secondary experience, meanwhile, involves a wall of photos and videos taken by the refugees themselves during their time in refugee camps and in their present lives in Canada. Key to this experience is the Oral History Interactive (OHI), which is the focus of this section.

To create the OHI, the institution developed a custom voice-first interface. Visitors ask the interface a question using a tactile dial or by speaking aloud; a video then plays of one of the Rohingya people now living in Canada answering that question.
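To make the mechanism concrete, here is a minimal sketch of how a question-to-clip router of this kind might work, assuming a fixed set of questions and pre-recorded answers. The clip filenames and matching logic are illustrative assumptions, not CMHR’s actual implementation.

```python
# Illustrative sketch only: routing a dial position or a recognised spoken
# question to a pre-recorded video answer. Filenames are hypothetical.

from difflib import SequenceMatcher

# The seven questions from the exhibit, each mapped to an assumed clip file.
QUESTIONS = {
    "Where were you born?": "clips/born.mp4",
    "Can you tell me about your family?": "clips/family.mp4",
    "What does it mean to be Rohingya?": "clips/identity.mp4",
    "What was your experience coming to Canada?": "clips/journey.mp4",
    "How is your life in Canada?": "clips/life_in_canada.mp4",
    "What do you want people to know about the Rohingya genocide?": "clips/genocide.mp4",
    "What can people do to take action?": "clips/action.mp4",
}
QUESTION_LIST = list(QUESTIONS)


def clip_for_dial(position: int) -> str:
    """Tactile dial input: each detent selects one question directly."""
    question = QUESTION_LIST[position % len(QUESTION_LIST)]
    return QUESTIONS[question]


def clip_for_speech(transcript: str) -> str:
    """Spoken input: match the recognised transcript to the closest question."""
    best = max(
        QUESTION_LIST,
        key=lambda q: SequenceMatcher(None, transcript.lower(), q.lower()).ratio(),
    )
    return QUESTIONS[best]


if __name__ == "__main__":
    print(clip_for_dial(3))                              # clips/journey.mp4
    print(clip_for_speech("tell me about your family"))  # clips/family.mp4
```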

One of the challenges for this exhibition was how to immerse visitors in its content in a way that emphasized its relevance, encouraged empathy, and directly connected them to the subject matter. The museum did this by making the VUI the primary mode of interaction and by providing a rich, textured multi-sensory experience through the supplementary use of visual, auditory and tactile components.

“Careful consideration of the CMHR’s universal design approach paved the way for innovation, allowing for verbal, non-verbal, tactile, visual, and auditory elements that supported a fully accessible experience for audiences of all abilities.” 

To create a smooth and successful visitor experience, CMHR employed an iterative and inclusive design approach with four rounds of user testing involving staff members, volunteers, and students. For instance, to mimic the rhythm of a conversation, video clips were assessed for length. After several rounds of review and testing, the following questions were developed to form the basis of the VUI experience:

  • Where were you born?
  • Can you tell me about your family?
  • What does it mean to be Rohingya?
  • What was your experience coming to Canada?
  • How is your life in Canada?
  • What do you want people to know about the Rohingya genocide?
  • What can people do to take action?

Results

This study found that the Oral History Interactive received the highest engagement: 60% of visitors stopped to interact with the VUI or to watch a companion or another visitor use the interactive. The OHI attracted more visitors than any other interactive in the museum’s galleries.

“This experience was more successful at getting visitors to stop than any interactive in the museum’s core galleries.”

Other highlighted observations include: visitors preferred to use the tactile dial to ask their questions rather than speak them aloud; groups with children were more inclined to ask verbal questions; and digital interactives with physical components attract more visitors than screens alone. The team hopes that VUIs will be integrated into other galleries in the museum and can eventually incorporate sign languages for the deaf community.

Scott concluded that using VUIs is an effective way to tell stories and give agency to previously unheard voices. However, he recommends further research into the emotional and cognitive effects of engaging with such subject matter through this technology.

Bring Your Own Device

For Cheryl Fogle-Hatch’s paper, non-visual accessibility and smartphone technology were integral components of the two case studies she presented. 

3D printed replica stone point with lanyard attached. Sourced from Cheryl’s paper.

The first project was a prototype design for a travelling archaeological exhibit with 3D replicas of stone tools; users scan QR codes to get more information about the artefacts. The second case study looked at using Near Field Communication (NFC) and Bluetooth technologies, via the WayAround mobile app, to scan tags and access the text available in a tactile art exhibit.

For the travelling archaeological exhibit, two archaeologists and one engineering professor designed a prototype of an inclusive and accessible exhibit for both blind and sighted visitors (Fogle-Hatch et al. 2018). As Cheryl describes: “The exhibit consisted of 3-D replicas of stone projectile points (spear tips and arrowheads) that could be physically manipulated to be explored tactually. Additionally, QR-codes were included that could be scanned with a smartphone (Lacoma, 2018). The QR-codes directed the user to a webpage that displayed information about the artifacts. This text could be accessed using screen magnification or voice output while also being available for sighted users. Consequently, the exhibit was inclusive since it could be experienced by mixed groups of visitors, whether they happened to be sighted or blind.”
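As a rough illustration of the QR-code approach (not the project’s own code), the snippet below generates one code per replica pointing at a plain, screen-reader-friendly webpage; the URLs and artefact identifiers are assumptions made for the example.

```python
# Illustrative sketch: generate a QR code image for each replica that links to
# an accessible webpage about the artefact. URLs and identifiers are hypothetical.

import qrcode  # pip install qrcode[pil]

ARTEFACT_PAGES = {
    "stone-point-01": "https://example.org/exhibit/stone-point-01",
    "stone-point-02": "https://example.org/exhibit/stone-point-02",
}

for artefact_id, url in ARTEFACT_PAGES.items():
    img = qrcode.make(url)          # returns a PIL image of the code
    img.save(f"{artefact_id}.png")  # printed and affixed to the replica
```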

The WayAround App

Information about each artwork was programmed via Bluetooth into a WayTag contained in a round sticker affixed to the top left of each print label. To accommodate sighted visitors, the stickers were confined to the margins of the print label so that the text was not obscured. The stickers could be identified by touch, and visitors could point their phone towards a sticker to scan the tag. A rounded bump (a door stop from a local hardware store) was placed on the baseboard immediately below each WayTag so that visitors could find it with a white cane as they walked along the gallery wall.

Comparisons

Both of the BYOD solutions were designed to utilize the accessibility features of smartphones that are routinely carried by people who are blind or who have low vision. Off-the-shelf solutions were implemented that allowed people to access information on their own devices using their preferred accessibility settings. In both cases the apps give visitors a greater degree of independence in accessing label text, and the WayAround tags also assist with navigation.
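A minimal, hypothetical sketch of that BYOD principle follows: the exhibit stores one piece of label text per tag, and the visitor’s own device decides how to present it based on its stored accessibility preferences. The tag identifiers, label text and preference fields are invented for illustration and do not come from either project.

```python
# Illustrative sketch of the BYOD idea: one label text per scanned tag,
# delivered according to the visitor's own device preferences.

from dataclasses import dataclass

# Label text keyed by the identifier encoded in a QR code or WayTag (hypothetical).
LABELS = {
    "tag-001": "Replica stone projectile point, cast from a spear tip.",
    "tag-002": "Tactile print: raised-line reproduction of the gallery artwork.",
}


@dataclass
class DevicePreferences:
    """Stand-in for the accessibility settings a visitor already has on their phone."""
    speak_aloud: bool = False
    font_scale: float = 1.0


def present_label(tag_id: str, prefs: DevicePreferences) -> str:
    """Describe how the label would be delivered on this particular device."""
    text = LABELS.get(tag_id, "No label found for this tag.")
    if prefs.speak_aloud:
        return f"[spoken via screen reader] {text}"
    return f"[displayed at {prefs.font_scale:.0%} text size] {text}"


if __name__ == "__main__":
    print(present_label("tag-001", DevicePreferences(speak_aloud=True)))
    print(present_label("tag-002", DevicePreferences(font_scale=2.0)))
```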

Both case studies have their advantages and disadvantages. QR codes are non-proprietary, meaning visitors can use them assured that the software does not collect their data. The QR-code solution, being open-source, is recommended when a project is supported by infrastructure to store and serve information relevant to the exhibit. The WayAround app, by contrast, uses proprietary hardware and software that collects private information, and is recommended in cases where the project lacks the infrastructure to store and access information itself.

The teams for both projects were a mix of blind and sighted members. They chose to implement off-the-shelf accessibility solutions rather than building their own infrastructure. The projects provide examples of how exhibition content can be made accessible and engaging to visitors who are blind or have low vision, whilst being inclusive to sighted audiences.

Insights

The trend of ‘bring your own device’ (BYOD) can potentially increase the accessibility of museums to people who are blind or have low vision. In fact, a 2019 survey cited in Cheryl’s paper showed that people who are blind or have low vision use their own mobile devices in a variety of ways: to listen to audiobooks, read ebooks, identify objects, navigate, and convert files into accessible electronic formats.

While this shows there is an opportunity for museums to offer greater access to and engagement with their exhibits via smartphones, exhibits still need to have their media content designed with this technology in mind.

Meanwhile, as discussed earlier, the open-source nature of QR codes makes them suitable for museums that already have the infrastructure to store and access the information needed for their exhibit. The WayAround app (available for Apple and Android devices) will suit museums that lack such infrastructure.

One of the most notable findings of this study is that BYOD programming can make museums accessible not only to those who are blind or have low vision but also to sighted visitors. A key insight is that the apps respond to the preference settings on a user’s own phone, enabling visitors with sight loss to experience galleries more independently, through familiar digital means, whenever they wish.

UCAN GO

This second study is reminiscent of the UCAN GO app, our own free indoor wayfinding app designed with and for visually impaired users. Our goal was to create an app that makes cultural buildings accessible for people with sight loss via step-by-step audio instructions that are personalised according to a user’s particular phone settings.

What makes our app unique, however, is that we created a mobile solution that does not depend on digital connectivity or on hardware installed at the venue. The response from our users was unanimously positive: they felt more independent and confident while navigating a new building.

UCAN GO user interface

Reflections on the multi-modal session

Scott, Ben and Cheryl have conducted practice-led research that has the potential to positively influence the museum community, both institutions and visitors. It was a pleasure to learn of the considered multi-modal, multi-sensory experiences they had all helped to produce. It is also inspiring to see such important real-world projects undertaken at vastly different scales and benefitting visitors in vastly different settings: the first a large, world-leading institution, the second more personal and community-focused, but both creating new ways for people to explore and critique the world around them.

My own essay, entitled “Mobile Digital Wayfinding Tools: Enabling and Enhancing the Experience of Visitors with Different Access Needs”, will be published soon by MuseWeb and the Smithsonian Institution, along with other essays presented at the conference.
