When people who are blind or visually impaired try to move around a city like Paris, they can easily get lost the moment they step out of a Metro station.
A pair of young French engineering students are working to tackle this problem by developing an augmented reality (AR) navigation app that identifies the most convenient route and uses “spatial sound” – also known as 3D sound – to guide users in the right direction.
“We’re trying to make something very simple where you get the 3D sound from the right direction. You turn in the direction the sound is coming from and you’re ready,” SonarVision’s co-founder and CEO Nathan Daix told Euronews Next.
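The core of that guidance loop can be sketched in a few lines: take the user’s compass heading and position, work out the bearing to the next waypoint, and pan the audio cue toward that side. This is an illustrative sketch, not SonarVision’s actual implementation; the function names and the simple sine-based pan are assumptions.

```python
import math

def relative_bearing(user_heading_deg, user_pos, target_pos):
    """Bearing to the target relative to where the user is facing,
    in degrees from -180 (hard left) to 180 (hard right).
    Positions are (east_m, north_m) offsets on a local flat map."""
    dx = target_pos[0] - user_pos[0]  # metres east of the user
    dy = target_pos[1] - user_pos[1]  # metres north of the user
    absolute = math.degrees(math.atan2(dx, dy)) % 360  # 0 deg = north, clockwise
    return (absolute - user_heading_deg + 180) % 360 - 180

def stereo_pan(rel_bearing_deg):
    """Map the relative bearing to a left/right pan in [-1, 1],
    which a spatial-audio engine could use to place the sound cue."""
    return max(-1.0, min(1.0, math.sin(math.radians(rel_bearing_deg))))
```

For example, a user facing north with the waypoint due east hears the cue fully on the right (`pan = 1.0`); as they turn toward it, the pan converges on centre, which is the “you’re ready” state Daix describes.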
The app is currently under development and testing, but the young startup aims to make it available in 2023. The prototype is working in Paris but could “easily” be rolled out to other major European capitals, he said.
There are already apps out there that alert the user to surrounding points of interest — such as Blindsquare and Soundscape — but SonarVision’s added value, he said, is to guide users from point A to B like a “super high-precision GPS” that’s also highly intuitive.
Common wayfinding apps such as Google Maps and Apple Maps are not designed to accommodate the needs of people who are visually impaired and are difficult to use with screen readers, he said.
“One of the really frustrating issues with a lot of these products is precision,” Daix explained.
“GPS in cities in good times can be about 4 to 5 meters of precision. But in the worst moments, which is at least 30 percent of the time, you get 10-plus meters of inaccuracy.
“That means the GPS will tell you that you’ve arrived at your bus stop, but you’re actually on the other side of the street, and you still have to figure out a way to get to the bus stop, and you have no idea where it is”.
Scans buildings and city streets
To solve this, SonarVision uses the phone’s camera to scan buildings using AR technology and compares them to Apple’s database of scanned buildings for a given city.
“This allows us to accurately geotrack our users with anywhere from 20 centimeters to a meter of precision,” Daix said, adding that this allows the app to keep the user on crossings and sidewalks while avoiding stairs and construction areas where possible.
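One way to see why the camera-based fix matters is to treat GPS and the visual positioning result as two noisy estimates and weight them by their uncertainty. The sketch below is a generic inverse-variance fusion for illustration only – it is an assumption, not SonarVision’s method – but it shows how a 0.5-metre visual fix swamps a 5-metre GPS fix.

```python
def fuse_estimates(gps_pos, gps_sigma_m, vps_pos, vps_sigma_m):
    """Inverse-variance weighted fusion of two 2-D position estimates.
    Each position is an (east_m, north_m) pair; sigma is the estimated
    standard deviation of that source in metres."""
    w_gps = 1.0 / gps_sigma_m ** 2  # low weight for a coarse fix
    w_vps = 1.0 / vps_sigma_m ** 2  # high weight for a precise fix
    total = w_gps + w_vps
    return tuple((w_gps * g + w_vps * v) / total
                 for g, v in zip(gps_pos, vps_pos))
```

With the figures Daix quotes (roughly 5 m for urban GPS against 0.5 m for the AR fix), the visual estimate carries about a hundred times the weight, so the fused position sits almost exactly on the camera-derived fix.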
For the technology to work, all the user needs is a headset and an iPhone with its camera pointed at the road – although in the future the camera may be built into AR glasses for more convenience.
“Your phone would be in your pocket and you’d have the glasses on your face doing the seeing part and the 3D sound coming out of the branches [the arms of the glasses],” Daix said.
No substitute for a white cane
However, the app does not do real-time obstacle detection and is only designed to be a “complement” to a white cane, a guide dog or other devices that visually impaired people use to get around.
The technology to do that does exist, however: LiDAR, or light detection and ranging, technology has the potential to “really help people who are visually impaired,” he said.
“What it allows is to scan the depth of the environment. We’ve actually started working with LiDAR on an iPhone 12 Pro and have been able to develop a prototype that basically replaces the white stick.
“It allows us to detect obstacles, but not just obstacles on the floor – obstacles that are at head level, at body level… Those are really powerful things”.
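The idea Daix describes – using a LiDAR depth image to flag obstacles at floor, body and head level rather than only at cane height – can be sketched as a band-wise scan of a depth map. This is a toy illustration under assumed names and thresholds, not the startup’s prototype: it splits the frame into three horizontal bands and reports the nearest return in each.

```python
def detect_obstacles(depth_m, warn_within_m=1.5):
    """Scan a depth image (list of rows, top of the frame first, values in
    metres) and report the nearest return per height band if it falls
    inside the warning distance."""
    third = len(depth_m) // 3
    bands = {"head": (0, third),               # top of the frame
             "body": (third, 2 * third),       # middle of the frame
             "floor": (2 * third, len(depth_m))}  # bottom of the frame
    alerts = {}
    for name, (top, bottom) in bands.items():
        nearest = min(d for row in depth_m[top:bottom] for d in row)
        if nearest <= warn_within_m:
            alerts[name] = nearest
    return alerts
```

A real implementation would work on the phone’s full-resolution depth buffer and account for the camera’s tilt, but the principle is the same: a low tree branch shows up in the head band long before a white cane would ever touch it.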
The main reason SonarVision isn’t focusing on using LiDAR yet, Daix said, is that the smartphones that have it (like the iPhone 12 Pro) are expensive, and SonarVision aims to make the technology as accessible as possible.
“Today, the most important function we can work on is wayfinding – accurate and affordable wayfinding. It’s really one of the biggest problems yet to be solved for people with visual impairments,” he said.