Silicon Valley giant Google’s Project Guideline uses a smartphone’s camera to track a guideline on a course. This technology then beams audio cues to the user via the latest in bone-conducting headphones.
Thomas Panek, who lost his vision in his early 20s due to a genetic condition, on Thursday completed a 5km (3.1-mile) run in New York’s Central Park to put this next-generation AI navigation aid to the test.
I ain’t sitting still
Mr Panek, who runs guide dog school Guiding Eyes for the Blind, said: “The safest thing for a blind man is to sit still. I ain’t sitting still.
The 50-year-old marathon enthusiast became increasingly frustrated with having to follow slower runners serving as guides, so he recently decided to look for ways to run solo.
He wrote in a Google blog post: “I even qualified for the New York City and Boston Marathons five years in a row.
“But as grateful as I was to my human guides, I wanted more independence.”
He turned to Google to find a way for a phone to “tell me where to go,” adding: “Humans are born to run.”
The runner worked with Google, the Alphabet Inc unit, to help create the research program responsible for the prototype.
Xuan Yang, a Google researcher involved with the project, said in a statement: “It’s like teaching a kid how to learn where the line is.”
Should an athlete stray too far from the pre-determined course centre, a sound grows louder in the ear on the side they have drifted toward.
The camera on a phone attached to a chest harness uses AI to constantly scan the road for the line, tracking the runner’s position.
Mr Panek said: “If I drifted to the left of the line, the sound would get louder and more dissonant in my left ear.
“If I drifted to the right, the same thing would happen, but in my right ear.”
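The feedback Mr Panek describes can be sketched in a few lines of code. This is a minimal illustration, not Google’s implementation: it assumes a signed lateral offset from the line (negative for left, positive for right) and a hypothetical maximum deviation at which the cue reaches full volume, and returns per-ear gains that rise on the side of the drift.

```python
def audio_cue(offset_m, max_offset_m=1.0):
    """Map lateral deviation from the guideline to stereo feedback.

    offset_m: signed distance from the line centre in metres
              (negative = drifted left, positive = drifted right).
    max_offset_m: assumed deviation at which the cue reaches full volume.
    Returns (left_gain, right_gain) in [0, 1]; the gain rises in the
    ear on the side the runner has drifted toward.
    """
    # Normalise the deviation to [0, 1], clamping at the course edge.
    severity = min(abs(offset_m) / max_offset_m, 1.0)
    if offset_m < 0:        # drifted left -> louder in the left ear
        return severity, 0.0
    elif offset_m > 0:      # drifted right -> louder in the right ear
        return 0.0, severity
    return 0.0, 0.0         # on the line -> silence
```

A runner half a metre left of the line would hear the cue at half volume in the left ear only; back on the line, the cue falls silent.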
Within a few months, and after a few adjustments, he was able to run laps on an indoor track without either a human or canine guide.
Mr Panek added: “It was the first unguided mile I had run in decades.”
He hopes Project Guideline can be adapted and expanded to provide independence to more people like him.
Mr Panek said: “Collaborating on this project helped me realise a personal dream of mine,” before thanking Google “and whoever came up with the idea of a hackathon.”
Tech pioneer Google has been increasingly investing in accessibility technology and last month unveiled Sound Notifications, a new feature that informs the hearing impaired when water is running, a dog is barking or a fire alarm has been activated.
Users can be notified about ‘critical’ sounds through push notifications, smartphone vibrations or a flashing camera light.
While the feature is designed for the estimated 466 million people in the world with hearing loss, it is also expected to help those wearing headphones or simply distracted.
The company has also expanded Lookout, which can read mail aloud and verbally identify packaged goods.