What do you get when you cross a computer and a seeing-eye dog?
That’s easy, according to a group of student researchers at the University of Washington’s Human Interface Technology Laboratory: a first-of-its-kind Wearable Low Vision Aid. And, they add, the digital helper has distinct advantages over a canine: no feeding, no drooling and, best of all, no need to worry about a pooper-scooper.
But the biggest advantage is that, in many ways, the computer has the potential to do a better job identifying walking hazards.
“Even with a cane or a guide dog, low-vision people can have a difficult time identifying obstacles that can be hazardous,” said Eric Seibel, research assistant professor in mechanical engineering at the lab, who for the past four years has overseen the student team that developed the device. “This is another set of eyes looking out for them.”
The group demonstrated the latest prototype of the technology recently at the Society for Information Display’s annual conference, held in downtown Seattle. The work is funded by the National Science Foundation.
The Wearable Low Vision Aid was designed to be both portable and low cost, according to Seibel. A laptop computer that provides the “brains” for the system is carried in a backpack, and can be used for other tasks once the user gets to work or school. The imaging system is mounted on a pair of glasses and combines a ring of light-emitting diodes that fire bursts of infrared light in coordination with a small camera that collects images of the infrared-illuminated landscape.
Software created by the student team compares that infrared scene with the normally lit scene. Since closer objects reflect more light than distant ones, the system can “see” which objects remain in the field of view and grow in size, indicating a possible collision. The computer assesses the situation and, if appropriate, generates a flashing icon to warn the wearer of the danger.
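The comparison described above can be sketched in a few lines. Everything here — the threshold values, frame sizes and function names — is an illustrative assumption, not the team’s actual software:

```python
import numpy as np

# Sketch of the idea described above: the scene lit by the infrared burst
# is compared with the ambient-light scene.  Closer objects reflect more
# IR, so the brightness difference is a rough proxy for proximity.
# All constants below are illustrative assumptions.

NEAR_THRESHOLD = 60   # min IR-minus-ambient brightness to count as "near"
GROWTH_FACTOR = 1.3   # area growth between frames that triggers a warning

def proximity_mask(ir_frame: np.ndarray, ambient_frame: np.ndarray) -> np.ndarray:
    """Pixels much brighter under IR illumination are likely close."""
    diff = ir_frame.astype(int) - ambient_frame.astype(int)
    return diff > NEAR_THRESHOLD

def collision_warning(prev_mask: np.ndarray, cur_mask: np.ndarray) -> bool:
    """Warn when a nearby region stays in view and grows frame over frame."""
    prev_area = prev_mask.sum()
    cur_area = cur_mask.sum()
    overlap = (prev_mask & cur_mask).sum()
    if prev_area == 0:
        return False
    return bool(overlap > 0 and cur_area > GROWTH_FACTOR * prev_area)

# Toy frames: an object brighter under IR that grows between frames.
ambient = np.zeros((8, 8), dtype=np.uint8)
ir_1 = ambient.copy(); ir_1[3:5, 3:5] = 200   # small near object
ir_2 = ambient.copy(); ir_2[2:6, 2:6] = 200   # same object, larger in view

m1 = proximity_mask(ir_1, ambient)
m2 = proximity_mask(ir_2, ambient)
print(collision_warning(m1, m2))  # True: the object grew, possible collision
```

A real system would also have to cope with sunlight, sensor noise and multiple moving objects, but the core test — does a strongly IR-reflective region persist and expand? — is the one the article describes.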
The device can be programmed to generate different icons for different hazards, so it not only warns of a hazard, but also lets wearers know exactly what they are dealing with.
“One of the beauties of this is that we can customize it for different users,” Seibel said. “For some people, overhanging branches might be a big problem, and we can customize for that. In the city, it might be something else. All of these people have specific obstacles that they regularly encounter that are quite hazardous to them, and we can tailor the device to fit their unique needs.”
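The per-user customization Seibel describes amounts to a mapping from hazard types to warning icons. A minimal sketch, with hazard names and icon labels invented purely for illustration:

```python
# Hypothetical per-user hazard-to-icon profile; the hazard categories and
# icon names are assumptions, not the device's actual vocabulary.

DEFAULT_ICONS = {
    "overhanging_branch": "down-arrow",
    "curb": "step-icon",
    "approaching_object": "flashing-triangle",
}

def build_profile(overrides=None):
    """Start from the defaults, then apply a user's custom mappings."""
    profile = dict(DEFAULT_ICONS)
    profile.update(overrides or {})
    return profile

# A city user who cares most about curbs and signposts:
city_user = build_profile({"signpost": "pole-icon", "curb": "bright-step-icon"})
print(city_user["curb"])  # bright-step-icon
```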
The device’s display consists of a vibrating crystal fiber, constructed of parts that cost less than $1, attached to a laser diode. The fiber vibrates at more than a thousand times a second to trace a series of horizontal lines and form a complete, translucent “screen.” That image is beamed into the user’s eye and painted onto the retina. The brightness of the display is adjustable so it can be seen both indoors and outside in full sunlight.
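As a rough back-of-envelope check of how a line-scanned display like this adds up (the specific numbers below are assumptions, not the device’s published specifications):

```python
# The article says the fiber traces horizontal lines more than a thousand
# times a second.  If it swept, say, 4,000 lines per second and a frame
# needed 200 lines, the "screen" would refresh 20 times a second -- both
# figures are assumed here purely to show the arithmetic.
line_rate_hz = 4000       # assumed fiber sweep rate (lines per second)
lines_per_frame = 200     # assumed vertical resolution
refresh_hz = line_rate_hz / lines_per_frame
print(refresh_hz)  # 20.0 frames per second
```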
The method also bypasses many users鈥 visual problems.
“Because this can be very bright, directed laser light, it actually goes straight through optical problems of the eye and paints a bigger, brighter image on the retina, even for people with retinal problems,” Seibel said.
By using visual prompts, Seibel added, the device leaves the user鈥檚 hearing intact.
While portable, the current prototype is still a bit bulky. A smaller version is already in the works, according to Ryland Bryant, a recently graduated master鈥檚 degree student who was lead author of the Seattle conference paper. He has built a new circuit board that drops the overall weight of the system by about a half-pound. Currently, the backpack and computer weigh in at about 10 pounds.
The next step, Seibel says, could be to try using laser light to directly stimulate neurons in the eye, allowing people to “see” even if the photoreceptors in their eyes are dead. Seibel and his collaborators have worked out some possibilities, but the concept is still preliminary.
“It’s in the proposal-writing stage and we need exploratory research funding to continue,” he said.
