Shanghai Jiao Tong University Professor Develops Wearable AI Navigation System for the Visually Impaired

The international academic journal Nature Machine Intelligence has published a research paper introducing a wearable system designed to assist visually impaired individuals with navigation.

The paper was authored by Professor Gu Leilei's team from the Qiyuan Research Institute at the School of Computer Science and Engineering, Shanghai Jiao Tong University. The system captures visual information through a camera and uses AI algorithms to analyze the environment, identify obstacles and key objects, and make real-time decisions. It then communicates navigation instructions to the user via stereo sound delivered through bone-conduction headphones and tactile feedback through artificial electronic skin on the wrist, guiding the user to move forward, left, or right as needed. These instructions are updated continuously as the user moves, providing step-by-step guidance toward the destination.
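
To make the described pipeline concrete, the sketch below shows one way such a capture-decide-cue loop could be structured. It is an illustrative simplification only: the function names, the three-sector depth heuristic, and the clearance threshold are assumptions made for this example, not details taken from the paper.

```python
# Minimal sketch of a capture -> decide -> cue loop of the kind described above.
# All names and thresholds here are hypothetical placeholders, not the authors' code.
import random
import time
from enum import Enum


class Cue(Enum):
    FORWARD = "forward"
    LEFT = "left"
    RIGHT = "right"


def capture_rgbd_frame():
    """Stand-in for reading an RGB-D frame from the glasses' camera."""
    # Fake nearest-obstacle distances (metres) for left / centre / right sectors.
    return {"left": random.uniform(0.3, 4.0),
            "center": random.uniform(0.3, 4.0),
            "right": random.uniform(0.3, 4.0)}


def choose_cue(depth_by_sector, clearance_m=1.0):
    """If the path ahead is clear, go forward; otherwise steer toward the more open side."""
    if depth_by_sector["center"] >= clearance_m:
        return Cue.FORWARD
    return Cue.LEFT if depth_by_sector["left"] > depth_by_sector["right"] else Cue.RIGHT


def emit_feedback(cue):
    """Stand-in for stereo audio (bone conduction) plus a pulse on the wrist skin patch."""
    print(f"audio: pan {cue.value} | haptics: pulse {cue.value} wrist patch")


def navigation_loop(cycle_s=0.25):  # ~250 ms per cycle, within the reported 200-300 ms window
    for _ in range(10):  # a real system would run until the destination is reached
        frame = capture_rgbd_frame()
        emit_feedback(choose_cue(frame))
        time.sleep(cycle_s)


if __name__ == "__main__":
    navigation_loop()
```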

Professor Gu noted that previous navigation systems for the visually impaired often treated users like autonomous vehicles. “But people aren’t machines—they have their own characteristics and specific needs from such systems,” he emphasized.

The goal of this newly developed system is to enhance the mobility of visually impaired individuals by helping them avoid obstacles through timely cues. The entire system weighs about 200 grams and includes smart glasses equipped with an RGB-D (Red, Green, Blue, and Depth) camera, two small patches of artificial electronic skin, and a miniature single-board computer. The system can process input and deliver output within 200–300 milliseconds, matching the average human reaction time and allowing for seamless interaction with the user.
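
Because the cues are only useful if the whole capture-to-feedback cycle stays within roughly a human reaction time, a deployment would typically monitor per-cycle latency against that budget. The snippet below is a generic illustration of such a check, using a 300 ms upper bound taken from the figure quoted above; the helper and the stand-in workload are assumptions, not measurements from the paper.

```python
# Hypothetical latency-budget check; the 300 ms bound comes from the reported
# 200-300 ms processing window, and the workload is a stand-in, not real inference.
import time


def timed_cycle(process_frame, budget_s=0.3):
    """Run one perception-to-feedback cycle and report whether it met the budget."""
    start = time.perf_counter()
    process_frame()
    elapsed = time.perf_counter() - start
    return elapsed, elapsed <= budget_s


if __name__ == "__main__":
    elapsed, ok = timed_cycle(lambda: time.sleep(0.25))  # stand-in for capture + inference + cue
    print(f"cycle took {elapsed * 1000:.0f} ms; within budget: {ok}")
```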
