Wearable System Helps Users with Vision Disabilities Navigate the Environment

A team of researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) has developed a prototype navigation system that uses a 3-D camera, haptic feedback, and a Braille display to give people with vision disabilities greater independent mobility. Designed to replace or augment cane-only navigation, the device pairs a depth-sensing 3-D camera with an algorithm that identifies obstacles in the user’s path. The system consists of a wearable camera, a vibrating belt, and a customizable Braille interface that conveys additional information about objects in the environment, such as whether a chair is occupied or empty.
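The article does not describe the team's algorithm in detail, but the basic idea of turning a depth frame into directional haptic alerts can be sketched. The following is a minimal, illustrative example (not the researchers' actual method) that assumes a per-pixel depth map in meters: it splits the frame into left, center, and right zones, which a belt could map to individual vibration motors, and flags any zone containing a nearby obstacle. The function name and thresholds are hypothetical.

```python
import numpy as np

def detect_obstacles(depth_m, max_range=2.0, min_valid=0.2):
    """Flag near obstacles in the left/center/right thirds of a depth frame.

    depth_m: 2-D array of distances in meters, one value per pixel.
    Returns a dict mapping each zone to True if an obstacle lies
    closer than max_range; readings below min_valid are treated as
    sensor noise and ignored.
    """
    h, w = depth_m.shape
    zones = {
        "left": depth_m[:, : w // 3],
        "center": depth_m[:, w // 3 : 2 * w // 3],
        "right": depth_m[:, 2 * w // 3 :],
    }
    alerts = {}
    for name, block in zones.items():
        valid = block[block > min_valid]
        # Treat the zone as blocked if enough pixels fall under the cutoff,
        # so a few spurious readings don't trigger the motor.
        near_fraction = np.mean(valid < max_range) if valid.size else 0.0
        alerts[name] = bool(near_fraction > 0.05)
    return alerts

# Synthetic 240x320 frame: open space 4 m away, with a chair-sized
# object 1 m away on the right side of the user's path.
frame = np.full((240, 320), 4.0)
frame[100:200, 240:300] = 1.0
print(detect_obstacles(frame))
# {'left': False, 'center': False, 'right': True}
```

In a real system each zone's alert would drive the corresponding belt motor, with vibration intensity scaled by how close the nearest obstacle is.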

Robert Katzschmann, a graduate student at MIT and one of the lead authors of the paper, wrote, “We did a couple of different tests with users who are blind. Having something that didn’t infringe on their other senses was important. So we didn’t want to have audio; we didn’t want to have something around the head, vibrations on the neck — all of those things, we tried them out, but none of them were accepted. We found that the one area of the body that is the least used for other senses is around your abdomen.”

Though still in the prototype phase, the device shows promise in helping people with vision disabilities navigate environments more safely and independently.


The contents of this website were developed under a grant from the National Institute on Disability, Independent Living, and Rehabilitation Research (NIDILRR grant number 90RE5025-01-00). NIDILRR is a Center within the Administration for Community Living (ACL), Department of Health and Human Services (HHS). The contents of this website do not necessarily represent the policy of NIDILRR, ACL, or HHS, and you should not assume endorsement by the Federal Government.