Introduction
Nov 17, 2023
According to current statistics, there are about 43.3 million blind people worldwide, accounting for about 0.48% of the total human population. Although modern medicine has had increasing success in treating blindness over the last two decades, some current studies estimate that in the next 30 years the number of blind people will increase by up to three times due to the growing and aging population. As one can imagine, blindness has a major impact on a person's life, from social integration to an increased susceptibility to accidents. For instance, the unemployment rate among blind individuals is three times higher than the average, whereas the risks associated with navigating sidewalks are at least twice as high. Moreover, in the case of young people, blindness restricts access to mainstream education and limits personal development.
One of the most basic definitions of blindness is an inability to see, caused by an incapacity to discern light from darkness. In this context, blind people "see" the world through their other senses, most often hearing and touch. Thus, blindness and severe visual impairment disturb a person's capacity to receive visual information and can have various causes, including congenital conditions, eye injuries, diseases, or degenerative conditions.
The development of new solutions to assist blind people is a major research domain with great potential to enhance their daily lives. These innovative solutions aim to augment the perception of the surrounding environment for the blind and severely visually impaired. Since a Visually Impaired Person (VIP) cannot rely on their sight, these systems must possess the capability to sense the surroundings, identify pertinent information, and convey it to the user through their other senses, with hearing and touch being the most suitable sensory channels for this purpose. To effectively perceive the environment, these devices incorporate various arrays of sensors, including ultrasound sensors, Passive InfraRed (PIR) sensors, Inertial Measurement Unit (IMU) sensors, LiDAR, GPS, and/or cameras. Once the sensory data are collected, a data fusion algorithm analyzes them and provides the user with information that is not only accurate but also relevant, useful, and presented in an appropriate manner. At present, some of the most advanced solutions designed to assist VIPs are based on Artificial Intelligence (AI), specifically using neural networks for tasks like computer vision and data analysis. These AI-driven systems play a crucial role in processing sensory input and providing meaningful insights to help individuals with visual impairments navigate and interact with their surroundings more effectively.
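To make the sense–fuse–convey pipeline concrete, here is a minimal, purely illustrative sketch in Python. All names (`SensorReading`, `fuse_readings`) and the fusion rule itself are hypothetical: this toy version simply selects the closest detection across sensors and turns it into a short spoken cue, whereas a real assistive system would weight sensor reliability, track objects over time, and feed an AI perception model.

```python
from dataclasses import dataclass

# Hypothetical reading produced by one sensor in the device's array.
@dataclass
class SensorReading:
    source: str        # e.g. "ultrasound", "camera", "lidar"
    distance_m: float  # estimated distance to the detected object, in meters
    label: str         # what the sensor believes it detected

def fuse_readings(readings: list[SensorReading]) -> str:
    """Toy 'data fusion': pick the most urgent (closest) detection
    and phrase it as a short message suitable for text-to-speech."""
    if not readings:
        return "Path clear."
    nearest = min(readings, key=lambda r: r.distance_m)
    if nearest.distance_m < 1.0:
        # Imminent obstacle: prioritize a warning over a description.
        return f"Stop: {nearest.label} ahead at {nearest.distance_m:.1f} meters."
    return f"{nearest.label} at {nearest.distance_m:.1f} meters."

# Example: the camera's detection is closer, so it wins the fusion step.
readings = [
    SensorReading("ultrasound", 2.4, "obstacle"),
    SensorReading("camera", 0.8, "person"),
]
print(fuse_readings(readings))  # -> Stop: person ahead at 0.8 meters.
```

The point of the sketch is the division of labor the paragraph describes: raw sensor values come in, a fusion step decides what is relevant, and only a concise, actionable message reaches the user's remaining sensory channels.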