Developers can now integrate the accessibility feature into their apps, allowing users to control the cursor with facial gestures or by moving their heads. For example, they can open their mouth to move the cursor or raise their eyebrows to click and drag.
Project Gameface was announced at Google's I/O developer conference last year. It uses the device's camera and a database of facial expressions from the MediaPipe Face Landmarker API to manipulate the cursor.
"Through the device's camera, it seamlessly tracks facial expressions and head movements, translating them into intuitive and personalized control," Google explained in its announcement. "Developers can now build applications where their users can configure their experience by customizing facial expressions, gesture sizes, cursor speed, and more."
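As a rough illustration of the kind of pipeline Google describes, the sketch below (not Project Gameface's actual code) uses MediaPipe's Face Landmarker task to read per-expression blendshape scores from webcam frames and nudges the cursor when the mouth-open score crosses a threshold. The pyautogui dependency, the threshold value, the speed constant, and the gesture-to-action mapping are all illustrative assumptions, standing in for the customization knobs the announcement mentions.

```python
# Minimal sketch: map a MediaPipe face blendshape score to cursor movement.
# Assumes the downloadable "face_landmarker.task" model file and the
# third-party pyautogui package for cursor control.
import cv2
import mediapipe as mp
import pyautogui
from mediapipe.tasks import python as mp_python
from mediapipe.tasks.python import vision

options = vision.FaceLandmarkerOptions(
    base_options=mp_python.BaseOptions(model_asset_path="face_landmarker.task"),
    output_face_blendshapes=True,  # per-expression scores such as jawOpen
    num_faces=1,
)
landmarker = vision.FaceLandmarker.create_from_options(options)

CURSOR_SPEED = 15           # pixels per frame; a "cursor speed" style setting
MOUTH_OPEN_THRESHOLD = 0.4  # hypothetical "gesture size" sensitivity knob

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB; OpenCV captures BGR.
    mp_image = mp.Image(
        image_format=mp.ImageFormat.SRGB,
        data=cv2.cvtColor(frame, cv2.COLOR_BGR2RGB),
    )
    result = landmarker.detect(mp_image)
    if result.face_blendshapes:
        scores = {c.category_name: c.score for c in result.face_blendshapes[0]}
        # Example mapping only: an open mouth nudges the cursor to the right.
        if scores.get("jawOpen", 0.0) > MOUTH_OPEN_THRESHOLD:
            pyautogui.moveRel(CURSOR_SPEED, 0)
cap.release()
```

In a real application, the threshold and speed constants would be user-configurable, and additional blendshapes (such as raised eyebrows) would map to clicking and dragging, as described above.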
Although Gameface was initially designed for gamers, Google says it has also partnered with Incluzza, a social enterprise in India that supports people with disabilities, to explore how the technology can extend to other settings such as work, school, and social situations.
Project Gameface was inspired by quadriplegic video game streamer Lance Carr, who has muscular dystrophy. Carr collaborated with Google on the project, which aims to offer a less expensive, more accessible alternative to costly head-tracking systems.