Google recently announced two new tools that let people with disabilities control Android phones with facial gestures. By raising their eyebrows or smiling, people with speech or physical disabilities can now use their Android smartphones hands-free, Google said.
The newly launched tools use machine learning and the front-facing camera to detect facial and eye movements. Users can scan their phone screen and select a task by raising their eyebrows, smiling, opening their mouths, or looking left, right, or up.
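As an illustration only, here is a minimal Python sketch of the kind of threshold logic such a tool could use to turn per-frame gesture confidences (as a face-detection model might report them) into switch actions. The gesture names, action names, and threshold here are invented for this example and are not Google's implementation.

```python
# Hypothetical mapping from detected facial gestures to switch actions.
# Gesture and action names are illustrative, not from Camera Switches.
GESTURE_ACTIONS = {
    "smile": "select",
    "raise_eyebrows": "next_item",
    "open_mouth": "pause_scanning",
    "look_left": "scroll_left",
    "look_right": "scroll_right",
    "look_up": "scroll_up",
}

def detect_action(probabilities, threshold=0.8):
    """Return the action for the most confident known gesture at or above
    the threshold, or None if no gesture is confident enough."""
    candidates = {
        gesture: p
        for gesture, p in probabilities.items()
        if gesture in GESTURE_ACTIONS and p >= threshold
    }
    if not candidates:
        return None
    best = max(candidates, key=candidates.get)
    return GESTURE_ACTIONS[best]
```

A per-frame confidence map such as `{"smile": 0.95, "look_left": 0.2}` would trigger the `select` action, while low-confidence frames trigger nothing, which is why a confidence threshold matters for avoiding accidental activations.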
The Centers for Disease Control and Prevention estimates that 61 million adults in the United States live with a disability. Thus, Google and its competitors Apple and Microsoft are determined to provide them with products and services that are easy to use.
Here’s how people with disabilities can use Android phones with facial gestures
“Every day, people use voice commands, like ‘Hey Google,’ or their hands to navigate their phones. However, this is not always possible for people with severe motor and speech disabilities,” Google said in a blog post.
“To make Android more accessible to everyone, we are launching new tools that make it easier to control your phone and communicate using facial gestures,” Google said.
One of the tools is “Camera Switches,” which lets people use their face instead of swipes and taps to interact with their smartphones.
The other is Project Activate, a new Android app that lets people use these gestures to trigger an action, like playing a recorded phrase on a phone, sending a text, or making a call.
“Now, it’s possible for anyone to use eye movements and facial gestures tailored to their range of motion to navigate their phone – without hands or voices,” Google said.
The free Activate app is available on the Google Play Store in Australia, Canada, Great Britain, and the United States, and Google says it will come to more countries soon.
Apple, Google, and Microsoft have consistently deployed innovations that make Internet technology more accessible to people with disabilities. Examples include voice-activated digital assistants built into speakers and smartphones, which allow people with sight or movement impairments to tell computers what to do.
Likewise, there is software that identifies text on web pages or in images and then reads it aloud, as well as automatic caption generation that displays what is said in videos.
Apple has also built an “AssistiveTouch” feature into the Apple Watch that controls the screen by detecting movements such as finger pinches or hand clenches.
“This feature also works with VoiceOver so you can navigate Apple Watch with one hand while using a cane or leading a service animal,” Apple said in an article.
Likewise, Microsoft describes accessibility as essential to enable everyone to use technological tools.
“To enable transformative change, accessibility must be a priority. We aim to integrate it into what we design for every team, organization, classroom and home,” Microsoft said.