Wearable artificial vision device shows promise in helping legally blind people ‘read’
Study shows potential of assistive technology to improve quality of life for the visually impaired
CHICAGO – Oct. 17, 2016 – A unique wearable artificial vision device may help people who are legally blind “read” and recognize faces. It may also help these individuals accomplish everyday tasks with significantly greater ease than using traditional assistive reading devices, suggests a study presented today at AAO 2016, the 120th annual meeting of the American Academy of Ophthalmology.
Approximately 246 million people worldwide have low vision, a level of sight loss that impairs a person’s ability to do simple daily tasks. Optical and electronic devices such as hand-held magnifiers, tele-microscopic glasses, and computer and video magnifiers can help, but these devices are typically bulky, cumbersome, or not readily portable. With recent advancements in wearable electronic devices and optical character recognition technology, which converts images to computer-readable text, University of California, Davis researchers hypothesized that these newer technologies could improve patients’ ability to function in daily life. To test their theory, the researchers asked a group of visually impaired patients to use a wearable artificial vision device and measured its impact. They found that the device vastly improved patients’ daily productivity.
The researchers used the OrCam MyEye for their study. The device is unique because it clips to glasses, making it hands-free. It features a miniature camera that sees and recognizes what the user is viewing, whether text or a face, and then reads it aloud to the user via a small bone-conduction earpiece. The user activates the device simply by pointing a finger at the object or text, tapping it, or pressing a trigger button.
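The workflow described above amounts to a simple event-driven pipeline: a pointing gesture triggers an image capture, optical character recognition converts the image to text, and the text is spoken to the user. As a purely illustrative sketch of that flow, the following stubs each stage in plain Python; every function and class name here is hypothetical, and the device’s actual software is not public. A real implementation would replace the stubs with a camera driver, an OCR engine, and a text-to-speech system.

```python
# Illustrative sketch of the gesture -> capture -> OCR -> read-aloud loop.
# All components (camera, OCR engine, speech output) are hypothetical stubs.

from dataclasses import dataclass


@dataclass
class Frame:
    """A captured image, stubbed here as the text it happens to contain."""
    embedded_text: str


def capture_frame(scene: str) -> Frame:
    # Stand-in for grabbing an image from the clip-on camera.
    return Frame(embedded_text=scene)


def recognize_text(frame: Frame) -> str:
    # Stand-in for optical character recognition, which converts
    # the captured image into computer-readable text.
    return frame.embedded_text.strip()


def speak(text: str) -> str:
    # Stand-in for audio output through the bone-conduction earpiece.
    return f"Reading aloud: {text}"


def on_pointing_gesture(scene: str) -> str:
    """Fired when the user points at text, taps it, or presses the trigger."""
    frame = capture_frame(scene)
    text = recognize_text(frame)
    return speak(text)


print(on_pointing_gesture("  Gate 24B  "))
```

The key design point the press release highlights is that the trigger is a natural gesture rather than a menu or keyboard, which is what makes the device hands-free and usable without sight.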
Researchers tested the device on 12 legally blind people, who all had a visual acuity of less than 20/200…
Read more: https://www.eurekalert.org/pub_releases/2016-10/aaoo-wav101616.php
Source: EurekAlert!