
Google’s Project GameFace will let Android users control devices hands-free


Google has expanded Project GameFace, its open-source project aimed at making tech devices more accessible, to Android, where it can now be used to control the smartphone interface. The project was first introduced at Google I/O 2023 as a hands-free gaming mouse controlled with head movements and facial expressions, designed for people with physical disabilities who cannot use their hands or voice to control devices. Keeping that functionality intact, the Android version adds a virtual cursor so users can control their device without touching it.

In an announcement on its developer-focused blog, Google said, “We are open-sourcing more code for Project GameFace to help developers create Android applications to make every Android device more accessible. Through the device’s camera, it intuitively tracks facial expressions and head movements, translating them into intuitive and personalized controls.” The company also asked developers to use the tool to add accessibility features to their own apps.

For Project GameFace, Google collaborated with Inclusa, an Indian organization that supports people with disabilities. Through this collaboration, the project learned how its technology could be extended to other use cases, such as typing messages and searching for jobs. It uses MediaPipe’s face landmarks detection API and Android’s accessibility service to create a new virtual cursor for Android devices. The front camera tracks the user’s face, and the cursor moves to follow their head movements.
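To make the idea concrete, the cursor logic can be imagined as mapping the change in a tracked face landmark (such as the nose tip, which MediaPipe reports in normalized 0–1 coordinates) to a cursor delta on screen. The sketch below is purely illustrative; the resolution, sensitivity gain, and function name are assumptions, not taken from the Project GameFace source.

```python
# Illustrative sketch only: translating head movement into cursor motion.
# SCREEN_W/SCREEN_H and SENSITIVITY are hypothetical values.

SCREEN_W, SCREEN_H = 1080, 2400  # assumed phone resolution
SENSITIVITY = 3.0                # hypothetical gain applied to head motion


def head_to_cursor(nose_x, nose_y, prev_x, prev_y, cursor_x, cursor_y):
    """Move the virtual cursor by the change in the nose landmark's
    normalized (0..1) position between frames, scaled by a gain."""
    dx = (nose_x - prev_x) * SCREEN_W * SENSITIVITY
    dy = (nose_y - prev_y) * SCREEN_H * SENSITIVITY
    # Clamp the cursor so it stays within the screen bounds.
    new_x = min(max(cursor_x + dx, 0), SCREEN_W - 1)
    new_y = min(max(cursor_y + dy, 0), SCREEN_H - 1)
    return new_x, new_y
```

A real implementation would feed this from the camera frame loop and dispatch the resulting position through Android's accessibility service rather than moving a cursor directly.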

The API recognizes 52 facial gestures, including raising an eyebrow, opening the mouth, and moving the lips. These gestures can be mapped to a wide range of functions on Android devices. One interesting feature is dragging, which users can employ, for example, to swipe up on the home screen. To create a drag, users define a start and an end point: this could mean opening the mouth to set the start point, moving the head, and closing the mouth again once the end point is reached.
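The drag gesture described above is essentially a small state machine driven by one of the facial-gesture scores. The sketch below is a hypothetical illustration of that idea; the threshold value, class name, and score semantics are assumptions, not code from Project GameFace.

```python
# Hypothetical sketch of the drag gesture: opening the mouth marks the
# start point, head movement extends the drag, and closing the mouth
# marks the end point. The threshold is an assumed value.

MOUTH_OPEN_THRESHOLD = 0.4  # assumed score above which the mouth counts as open


class DragGesture:
    def __init__(self):
        self.dragging = False
        self.start = None

    def update(self, mouth_open_score, cursor_pos):
        """Feed one frame's mouth-open score (0..1) and the current
        cursor position; returns a completed (start, end) pair once
        the mouth closes again, else None."""
        is_open = mouth_open_score > MOUTH_OPEN_THRESHOLD
        if is_open and not self.dragging:
            self.dragging = True      # mouth opened: drag begins here
            self.start = cursor_pos
        elif not is_open and self.dragging:
            self.dragging = False     # mouth closed: drag ends here
            return (self.start, cursor_pos)
        return None
```

The completed start/end pair would then be handed to the accessibility layer to synthesize the actual swipe on screen.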

Notably, while this technology has been made available on GitHub, it is now up to developers to build apps with it to bring these accessibility features to users. Apple also recently introduced a feature that uses eye tracking to control the iPhone.




Surja, a dedicated blog writer and explorer of diverse topics, holds a Bachelor's degree in Science. Her writing journey unfolds as a fascinating exploration of knowledge and creativity. With a background in B.Sc., Surja brings a unique perspective to the world of blogging. Her articles delve into a wide array of subjects, showcasing her versatility and passion for learning. Whether she's decoding scientific phenomena or sharing insights from her explorations, Surja's blogs reflect a commitment to making complex ideas accessible.