For people who are blind or visually impaired, a cane can help with detecting obstacles, as long as those obstacles are at ground level. For anything above the knees, or more than a few feet away, a cane provides little help. Soon, though, a new app could help people better navigate their environments.
Researchers at the University of Alicante in Spain have developed an app that uses a phone's built-in 3D camera to detect obstacles, and then produces a vibration or tone to alert the user.
The scientists tested the app with nine participants, all of whom had visual impairments severe enough that they couldn't see objects in their path without assistance. During the test, the participants wore an LG Optimus 3D Max phone around their necks on a lanyard, with the cameras facing forward.
As the subjects walked, the cameras detected objects. Because the 3D camera has two lenses, the phone had binocular vision, just like human eyes. This allowed the software to estimate the distance to objects within its field of view. When the app calculated that an object was closer than about 6.5 feet (2 meters), the phone vibrated or sounded a tone.
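The binocular range check described above can be sketched in a few lines: with two lenses a known distance apart, the offset (disparity) of an object between the two views tells you how far away it is. This is a minimal illustration of the general stereo-triangulation idea, not the study's actual software; the focal length, lens baseline, and the way the threshold is applied are all assumed values.

```python
# Illustrative stereo range check. The constants below are assumptions
# for demonstration, not figures from the Alicante study.

FOCAL_LENGTH_PX = 700.0   # camera focal length in pixels (assumed)
BASELINE_M = 0.024        # separation between the two lenses in meters (assumed)
ALERT_DISTANCE_M = 2.0    # alert when an object is closer than 2 meters

def distance_from_disparity(disparity_px: float) -> float:
    """Triangulate distance from the pixel disparity between the two views."""
    if disparity_px <= 0:
        return float("inf")  # zero disparity means the object is effectively at infinity
    return FOCAL_LENGTH_PX * BASELINE_M / disparity_px

def should_alert(disparity_px: float) -> bool:
    """True when the estimated distance falls inside the alert zone."""
    return distance_from_disparity(disparity_px) < ALERT_DISTANCE_M
```

With these assumed constants, a large disparity (an object shifted far between the two views) maps to a short distance and triggers the alert, while a small disparity maps to a distant object and stays silent.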
As the obstacle got closer, the vibration frequency or sound level increased. "It's like the systems for parking cars," Sáez told Live Science in an email.
Although the app doesn't give users exact directional information like sonar does, or tell the wearer how high the obstacle is, it does alert blind or visually impaired users that there's something they need to avoid.
The study appeared in the IEEE Journal of Biomedical and Health Informatics on May 7.
The app isn't yet ready for prime time, Sáez said; the particular phone model the researchers used in their testing has been discontinued, he noted.
However, the team is developing a version for Google Glass, after winning a grant from the Vodafone Spain Foundation in 2013 for an earlier version of the app. Sáez said he hopes to have a full version available to consumers in 2015.
Google Glass could actually be a more convenient device for using the app than a smartphone. "The camera is much more stable on the head than any other part of the body," Sáez said. "That improves the results for all vision algorithms, because the camera swings much less."