27-Aug-19 | News

Accessibility devices use sound to allow the visually impaired to envision stars

Bieryla has, in fact, made accessibility and inclusivity a top priority, pioneering tools and redesigning spaces to help people with physical disabilities experience the wonders of astronomy. For instance, the lab uses a tactile printer to create a sort of topographical map of star systems that people can explore with their hands.

“Think of it like Braille,” Bieryla said. “The printer produces heat and, using special heat-sensitive paper, creates images that are raised so a student who can’t see the images can feel them and understand what other students are seeing.”

The printer represented the beginning of bigger efforts. With design help from Harvard science demonstrator Daniel Davis, Bieryla and Sóley Hyman ’19 built and distributed a device called LightSound. The devices use simple circuit board technology with sensors that convert light into sound — brighter light translates to higher pitch — allowing those with visual impairments to experience solar eclipses. The Harvard team’s efforts extend the work of the blind astronomer Wanda Diaz Merced, who pioneered “sonification,” the practice of turning data into sound.
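The core idea, as described, is a simple mapping from light intensity to pitch. The sketch below illustrates that mapping in Python; the sensor range and frequency bounds are illustrative assumptions, not the lab's actual calibration, and `light_to_pitch` is a hypothetical name.

```python
# Illustrative sketch of the LightSound idea: map a raw light-sensor
# reading to an audible frequency, so brighter light -> higher pitch.
# Sensor range (0..1023) and frequency bounds are assumptions.

def light_to_pitch(reading, sensor_max=1023, f_min=100.0, f_max=2000.0):
    """Linearly map a raw sensor reading (0..sensor_max) to Hz."""
    reading = max(0, min(reading, sensor_max))  # clamp to the valid range
    return f_min + (f_max - f_min) * reading / sensor_max

# As an eclipse darkens the sky, the reading falls and the pitch drops:
for reading in (1000, 500, 50):
    print(round(light_to_pitch(reading), 1))
```

On the actual device a microcontroller would feed such a value to a tone generator continuously, so listeners hear the eclipse as a gradually falling pitch.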

The lab’s second device, called Orchestar, uses a color sensor to translate different colors of light into different pitches of sound. Blue light, for example, produces a higher pitch, and red a lower one. This matters because a star’s color relates to its temperature, so Orchestar can help the visually impaired understand the differences between stars. Both devices can interface with computers to collect and analyze data, and the lab has posted assembly instructions for both devices, including computer code, online so anyone can build one.
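The color-to-pitch mapping can be sketched the same way. The example below is a minimal illustration of the principle the article describes (blue mapped higher than red, echoing how bluer stars are hotter); the channel names, ranges, and base frequencies are assumptions, not Orchestar's real design.

```python
# Illustrative sketch of the Orchestar principle: assign each color
# channel of an RGB sensor its own tone, with blue mapped to a higher
# pitch than red. All frequencies and ranges here are assumptions.

BASE_HZ = {"red": 220.0, "green": 440.0, "blue": 880.0}  # low -> high

def color_to_pitch(rgb):
    """Return (dominant channel, frequency) for an (r, g, b) reading."""
    r, g, b = rgb
    # Pick the channel with the strongest reading.
    dominant = max(zip((r, g, b), ("red", "green", "blue")))[1]
    return dominant, BASE_HZ[dominant]

print(color_to_pitch((200, 30, 10)))  # reddish light -> lower tone
print(color_to_pitch((10, 30, 200)))  # bluish light -> higher tone
```

A fuller version might play all three tones at once, with each channel's loudness scaling with its intensity, so mixed colors produce distinguishable chords.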

“Data is data, whether you plot the numbers visually or plot the numbers with sound. You are tracking the same information,” Bieryla said. “We also know that your ears can be more sensitive than your eyes, so sometimes you can pick out subtleties in the data with your ears that you might miss with your eyes.”

“Sonification isn’t just a tool for the visually impaired, but for anyone to analyze data,” Hyman added.