Five Innovations Harness New Technologies for People with Visual Impairment, Blindness

February is Low Vision Awareness Month

Newswise — During Low Vision Awareness Month, the National Eye Institute (NEI), part of the National Institutes of Health, is highlighting new technologies and tools in the works to help the 4.1 million Americans living with low vision or blindness. The innovations aim to help people with vision loss more easily accomplish daily tasks, from navigating office buildings to crossing a street. Many of the innovations take advantage of computer vision, a technology that enables computers to recognize and interpret the complex assortment of images, objects and behaviors in the surrounding environment.
Low vision means that even with glasses, contact lenses, medicine, or surgery, people find everyday tasks difficult to do. It can affect many aspects of life, from walking in crowded places to reading or preparing a meal, explained Cheri Wiggs, Ph.D., program director for low vision and blindness rehabilitation at the NEI. The tools needed to stay engaged in everyday activities vary based on the degree and type of vision loss. For example, glaucoma causes loss of peripheral vision, which can make walking or driving difficult. By contrast, age-related macular degeneration affects central vision, creating difficulty with tasks such as reading, she said.
Here’s a look at a few NEI-funded technologies under development that aim to lessen the impact of low vision and blindness.
Co-robotic cane
Navigating indoors can be especially challenging for people with low vision or blindness. While existing GPS-based assistive devices can guide someone to a general location such as a building, GPS isn’t much help in finding specific rooms, said Cang Ye, Ph.D., of the University of Arkansas at Little Rock. Ye has developed a co-robotic cane that provides feedback on a user’s surrounding environment.
Ye’s prototype cane has a computerized 3-D camera to “see” on behalf of the user. It also has a motorized roller tip that can propel the cane toward a desired location, allowing the user to follow the cane’s direction. Along the way, the user can speak into a microphone, and a speech recognition system interprets verbal commands and guides the user via a wireless earpiece. The cane’s credit card-sized computer stores pre-loaded floor plans, though Ye envisions users eventually downloading a building’s floor plan via Wi-Fi upon entering. The computer analyzes 3-D information in real time and alerts the user to hallways and stairs. The cane gauges a person’s location in the building by measuring the camera’s movement using a computer vision method…
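The idea of tracking location from camera movement can be illustrated with a toy dead-reckoning sketch: each estimated camera motion (a forward distance and a turn) is accumulated into a running 2-D pose. This is only a simplified illustration of the general principle, not Ye’s actual method; the function and step values here are hypothetical.

```python
import math

def integrate_motion(pose, step):
    """Fold one estimated motion step (forward distance dx in meters,
    turn dtheta in radians) into a 2-D pose (x, y, heading)."""
    x, y, theta = pose
    dx, dtheta = step
    theta += dtheta                 # update heading first
    x += dx * math.cos(theta)       # advance along new heading
    y += dx * math.sin(theta)
    return (x, y, theta)

# Hypothetical example: three 1-meter steps, with a right turn on the last
pose = (0.0, 0.0, 0.0)
for step in [(1.0, 0.0), (1.0, 0.0), (1.0, -math.pi / 2)]:
    pose = integrate_motion(pose, step)
# pose is now roughly (2.0, -1.0, -pi/2): two meters forward,
# then one meter to the right after the turn
```

A real system would estimate each step from successive camera frames (visual odometry) and correct the accumulated drift against the stored floor plan; the sketch above shows only the accumulation step.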
Source: NIH, National Eye Institute (NEI)