Bats, dolphins, and other creatures that use sonar signals to “see” what’s in front of them served as inspiration for the assistive technology.
Australian researchers have created a pair of smart glasses that help blind users perceive their surroundings using a novel approach. The glasses produce “sound icons” inspired by bats, dolphins, and other animals that use sound to sense their physical environment. These sound icons repeat faster or slower depending on how far an object is from the user, contributing to what the researchers call “acoustic touch.”
Most assistive eyewear for the blind uses cameras and other visual sensors to record the surroundings. Some devices then run the data through an artificial intelligence system that names what’s in front of the wearer: a crosswalk, a cab, a particular storefront, and so on. But this kind of translation emphasizes labeling objects over helping the user comprehend their environment as a whole. As a result, such devices overlook small or seemingly unimportant items, such as a coffee cup or a pillar. That gap affects the user’s safety and keeps them from interacting with objects necessary for daily living.
By contrast, the glasses, which the researchers formally describe as a foveated audio device (FAD), sonify objects in close proximity to the wearer regardless of their size. The FAD consists of an OPPO Find X3 Pro Android phone and NReal’s Light augmented reality glasses with binaural speakers, according to the team of Sydney-based biomedical engineers, who published their findings in a recent report in PLOS ONE. The NReal glasses give the wearer a trapezoid-shaped auditory field of view (FOV) that is surveyed with sound icons. Like a game of “hot or cold,” these repeating sounds speed up or slow down depending on how close an object within the field of view is.
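The paper’s exact distance-to-tempo mapping is not given here, but a minimal sketch of the idea, assuming a simple linear interpolation between a nearest and farthest sonified distance, might look like the following. All parameter values and names below are illustrative assumptions, not the study’s settings.

```python
def icon_repeat_interval(distance_m: float,
                         min_distance: float = 0.3,
                         max_distance: float = 3.0,
                         fastest_s: float = 0.1,
                         slowest_s: float = 1.0) -> float:
    """Return the assumed seconds between repetitions of an object's sound icon.

    Closer objects repeat faster (shorter interval), like the "hot" end
    of a game of hot or cold; distant objects repeat more slowly.
    """
    # Clamp the distance to the sonified range of the auditory FOV.
    d = max(min_distance, min(distance_m, max_distance))
    # Linearly interpolate: nearest distance -> fastest tempo,
    # farthest distance -> slowest tempo.
    t = (d - min_distance) / (max_distance - min_distance)
    return fastest_s + t * (slowest_s - fastest_s)


if __name__ == "__main__":
    for d in (0.3, 1.0, 2.0, 3.0):
        print(f"object at {d:.1f} m -> icon repeats every "
              f"{icon_repeat_interval(d):.2f} s")
```

Under this assumed mapping, an object at arm’s length would repeat its icon several times per second, while one at the edge of the auditory FOV would repeat only about once per second, mirroring the “hot or cold” behavior the researchers describe.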
The researchers tested the FAD with fourteen participants who had little to no natural eyesight. The participants were asked to locate and recognize bowls, books, glasses, and bottles while sitting, standing, and moving around. They located and identified objects with relative ease when seated at a table, and they proved remarkably proficient at doing the same while moving around the test area.
The study’s lead scientist, Dr. Howe Zhu, stated in a release from the University of Technology Sydney, “The auditory feedback empowers users to identify and reach for objects with remarkable accuracy.” “Our results suggest that acoustic touch may provide the visually impaired community with a wearable and efficient means of sensory augmentation.”