No Touch Required, Just Look ‘N Speak
December 10, 2020

Throughout his career, Mr Richard Cave has worked with people living with speech and motor impairments, especially those who are non-verbal and require assistance to communicate.
This is more than a job; it's a passion. Every day he tries to help people find easier, more accessible ways to express their everyday needs, opinions, feelings and identity. Assistive technology can help people with ALS in many ways, including eye gaze technology that lets them type a message on a communication device and share it using eye movement alone.
As machine learning has been built into them, mobile devices have become more popular and powerful, and he has considered how they might work more seamlessly with assistive technologies to create adaptive products and augment adaptive devices.
This technology can open up significant opportunities, especially for people around the world who may now have access to it for the first time. Google has a small group of people experimenting with similar ideas about how the latest technology can be used to help people communicate.
The result is Look to Speak, an app that lets people use their eyes to choose prewritten phrases and have their phone speak them out loud. Look to Speak is available for devices running Android 9.0 and above, including Android One.
All you have to do is glance from side to side or up and down to quickly choose from a list of phrases. The app is tuned to be sensitive to eye gaze, and all of the data stays private and never leaves the phone. With this in mind, the team has created a tutorial and a guide that highlight top tips, such as how to position your phone and use the eye gaze interaction.
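To make the interaction concrete, here is a minimal conceptual sketch (not Google's actual implementation) of how a gaze-driven picker can narrow a list of phrases with repeated left/right glances: each glance keeps one half of the remaining phrases until a single phrase is left. The function name, phrase list and gaze-event encoding are all illustrative assumptions.

```python
def select_phrase(phrases, gaze_events):
    """Narrow `phrases` to one item using a sequence of gaze
    directions: 'left' keeps the first half, 'right' the second."""
    remaining = list(phrases)
    for direction in gaze_events:
        if len(remaining) == 1:
            break  # already narrowed down to a single phrase
        mid = (len(remaining) + 1) // 2  # split point (first half gets the extra item)
        remaining = remaining[:mid] if direction == "left" else remaining[mid:]
    return remaining[0]

# Hypothetical phrase list; two glances pick one of four phrases.
phrases = ["Yes", "No", "Thank you", "I need help"]
print(select_phrase(phrases, ["right", "left"]))  # prints "Thank you"
```

With n phrases, any one of them can be reached in about log2(n) glances, which is why a quick side-to-side movement is enough to select from the list.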
Look to Speak is a ‘Start with One’ project on the “Experiments with Google” platform. It all started with an idea that could be impactful for one person and their community.
Go to g.co/looktospeak to learn more about the project and download the app.