A lot goes through Chieko Asakawa’s mind as she walks across the campus of Carnegie Mellon University in Pittsburgh. Her cane taps the ground ahead of her. She counts steps between one building and another. She has a few paths memorized. She wonders whether the people she hears saying hi and hello are talking to her or answering their cellphones. She worries there may be a dog nearby; she has a phobia. Any distraction—rain, construction noise, chattering students—can disrupt her ability to perceive her environment and get to her destination.
Asakawa, who lost her sight at age 14, is a computer scientist who has worked for IBM’s Tokyo arm since 1982. She has developed a variety of technologies that make computers easier for blind people to use: a Braille word processor, a digital library for documents written in Braille, and the IBM Home Page Reader, a text-to-speech Internet plug-in (now defunct, superseded by IBM’s Easy Web Browsing). Now, as a visiting researcher at CMU, she is taking assistive technology from the digital to the physical world. Instead of using technology to tell sight-impaired people what’s trending on Twitter, she’s using it to tell them what’s down the hallway.
Asakawa and her collaborators on the new project, called NavCog, have dotted a part of the CMU campus with beacons: Bluetooth emitters about the size of smoke detectors. Users connect with the beacons via a smartphone app (NavCog’s code is open-source, but the app currently only runs on iOS) and then a Siri-like voice guides them through the campus step by step. The team hopes to pair the beacons with facial recognition software that can identify acquaintances in the stream of passersby and even inform users when people they encounter are holding a cellphone to their ear or being led around by a dog. This system could make Asakawa’s walks through campus a whole lot simpler.
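The core idea behind beacon-based guidance like this is straightforward: the phone listens for nearby Bluetooth emitters and uses their signal strength to judge which one is closest, anchoring the user to a known spot on a map. The sketch below illustrates that principle in Python; it is not NavCog’s actual code, and the beacon IDs, coordinates, and path-loss parameters are invented for illustration.

```python
# Hypothetical sketch of beacon-based positioning (not NavCog's code).
# Known beacon positions on an imaginary floor plan (x, y in meters).
BEACONS = {
    "beacon-hall-1": (0.0, 0.0),
    "beacon-hall-2": (10.0, 0.0),
    "beacon-lobby": (5.0, 8.0),
}

def rssi_to_distance(rssi_dbm, tx_power_dbm=-59, path_loss_exp=2.0):
    """Convert received signal strength (dBm) to an approximate distance
    in meters using the standard log-distance path-loss model.
    tx_power_dbm is the calibrated signal strength at 1 meter."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

def nearest_beacon(readings):
    """Given {beacon_id: rssi_dbm}, return (beacon_id, estimated_distance)
    for the strongest (i.e., likely closest) beacon."""
    best_id = max(readings, key=readings.get)  # highest RSSI wins
    return best_id, rssi_to_distance(readings[best_id])

# Example: the phone hears three beacons; the lobby beacon is strongest,
# so the app would anchor the user near the lobby and speak a cue.
readings = {"beacon-hall-1": -75, "beacon-hall-2": -82, "beacon-lobby": -62}
bid, dist = nearest_beacon(readings)
print(f"Nearest: {bid}, roughly {dist:.1f} m away")
```

A real system refines this considerably (filtering noisy RSSI readings, fusing several beacons at once, and snapping the estimate to walkable paths), but nearest-beacon ranging is the basic building block.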