For the past several years, various research institutions and organizations have been experimenting with electronic “white canes” for the blind. One of these was the ultrasound-enabled UltraCane, which we profiled five years ago. Now, however, an associate professor of applied science at the University of Arkansas is working on something more advanced – a white cane that utilizes laser technology to give users the lay of the land.
U Arkansas’ Dr. Cang Ye and his colleagues plan to use a Flash LADAR (laser detection and ranging) three-dimensional imaging sensor to create a detailed model of the user’s environment. Unlike other laser ranging systems that require the laser to mechanically scan back and forth across the environment, Flash takes everything in at once, within sequential floodlit exposures that each typically last less than a nanosecond – making it particularly well-suited to people on the move.
The Flash system obtains two images per exposure: one that measures the physical range (or distance away) of each pixel, and one that measures its intensity. Ye’s team has created an algorithm called VR-Odometry (VRO) that uses this data to calculate the user’s position within their environment. VRO matches the same feature across each pair of adjacent intensity images, and observes the differences between the two to determine how the user and that feature are moving relative to one another. By combining this information with the range information for that same feature, the system lets the user know where they are in their environment, and where they’re going.
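To give a rough sense of how this kind of odometry works, here is a minimal sketch of the rigid-motion estimation step it rests on. This is not Ye’s actual VRO implementation – it simply assumes that intensity-image feature matching has already paired up features between two frames, and that the range image supplies each feature’s 3D position. The standard Kabsch/Procrustes alignment then recovers how the user rotated and translated between frames; all data below is synthetic.

```python
import numpy as np

def estimate_rigid_motion(p_prev, p_curr):
    """Recover rotation R and translation t such that p_curr ~= R @ p_prev + t.

    p_prev, p_curr: (N, 3) arrays holding the 3D positions of the SAME
    features in two consecutive frames (intensity matching supplies the
    correspondence; the range image supplies the depth).
    """
    c_prev, c_curr = p_prev.mean(axis=0), p_curr.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (p_prev - c_prev).T @ (p_curr - c_curr)
    U, _, Vt = np.linalg.svd(H)
    # Sign correction guards against a reflection solution.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_curr - R @ c_prev
    return R, t

# Hypothetical example: ten matched features, user turns 5 degrees
# and takes a small step forward between exposures.
rng = np.random.default_rng(0)
pts = rng.uniform(0.5, 4.0, size=(10, 3))
theta = np.deg2rad(5.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.1, 0.0, 0.02])
pts2 = pts @ R_true.T + t_true

R_est, t_est = estimate_rigid_motion(pts, pts2)
print(np.allclose(R_est, R_true), np.allclose(t_est, t_true))
```

Accumulating these frame-to-frame motions over time is what lets such a system track where the user is and which way they are heading.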