Inspiration
Gaming is a source of joy, connection, and creativity for many people, but those with limited motor skills often face significant barriers to fully experiencing it. We believe everyone should be able to enjoy everyday activities, such as gaming, without limitation. Inspired by the desire to make gaming more inclusive and accessible, we set out to create a platform that enables people to play classic games such as Pac-Man using alternative input methods like voice commands. Our goal is to break down barriers and empower individuals to enjoy gaming in a way that suits their abilities, making fun and entertainment truly accessible to all.
What it does
Our project provides an adaptive solution that allows individuals with limited motor skills to interact with digital experiences, such as gaming, through voice commands instead of physical input devices.
The platform's core functionality:
- Voice Command Control: Saying directional commands like "up," "down," "left," or "right" allows users to navigate and interact effortlessly without the need for physical input devices.
This system provides an accessible and engaging way for users to participate in digital activities, ensuring they can enjoy gaming and other interactive experiences.
How we built it
The game: EchoMotion's Pac-Man game was built using HTML, CSS, and JavaScript, with the browser's Web Speech API integrated to provide voice-command functionality for an accessible gaming experience. The game board is created dynamically in JavaScript from a predefined grid layout that encodes the different elements: walls, pac-dots, power pellets, and ghosts.
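As a rough sketch of that grid-based board, the layout can be stored as a flat array of cell codes and rendered into DOM squares. The specific codes, class names, and board size below are our own illustrative assumptions, not the exact values used in the game:

```javascript
// Illustrative cell codes (assumed, not the game's actual encoding):
// 0 = pac-dot, 1 = wall, 2 = ghost lair, 3 = power pellet, 4 = empty
const CELL_CLASSES = ['pac-dot', 'wall', 'ghost-lair', 'power-pellet', 'empty'];

// A tiny 4x4 example layout; the real board is much larger.
const layout = [
  1, 1, 1, 1,
  1, 0, 3, 1,
  1, 0, 0, 1,
  1, 1, 1, 1,
];

// Build one <div> per cell and tag it with a class so CSS can style
// walls, dots, and pellets differently.
function createBoard(grid, layout, squares) {
  layout.forEach((code) => {
    const square = document.createElement('div');
    square.classList.add(CELL_CLASSES[code]);
    grid.appendChild(square);
    squares.push(square);
  });
}
```

Storing the board as a flat array makes movement simple: moving up or down is an index offset of ± the board width, and left/right is ±1, with wall cells blocking the move.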
Voice detection: We used the Web Speech API to let players control Pac-Man by voice, continuously listening for the words "up," "down," "left," and "right." When a command is detected, Pac-Man's movement within the grid is updated, providing a hands-free way to play. The game also supports a voice-activated start by detecting the command "start."
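A minimal sketch of that wiring, assuming the standard Web Speech API (exposed in Chrome as `webkitSpeechRecognition`); `movePacman` and `startGame` are placeholders for the game's own handlers:

```javascript
// Map a recognized transcript to a direction; returns null for unknown words.
const COMMANDS = ['up', 'down', 'left', 'right'];
function commandToDirection(transcript) {
  const word = transcript.trim().toLowerCase();
  return COMMANDS.includes(word) ? word : null;
}

// Browser-only wiring, guarded so the sketch also loads outside a browser.
if (typeof window !== 'undefined' && 'webkitSpeechRecognition' in window) {
  const recognition = new window.webkitSpeechRecognition();
  recognition.continuous = true;      // keep listening between commands
  recognition.interimResults = false; // only act on final transcripts

  recognition.onresult = (event) => {
    // Inspect only the most recent recognition result.
    const result = event.results[event.results.length - 1][0];
    const direction = commandToDirection(result.transcript);
    if (direction) {
      movePacman(direction);          // placeholder: the game's move handler
    } else if (result.transcript.trim().toLowerCase() === 'start') {
      startGame();                    // placeholder: the game's start handler
    }
  };

  recognition.start();
}
```

Setting `continuous = true` keeps the microphone session open so players don't have to re-trigger recognition before every move.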
Challenges we ran into
One of the biggest challenges we faced was trying to use WebGazer for eye motion detection to control the game. We encountered difficulties with calibrating the eye tracking and integrating it effectively into our game mechanics. Despite our efforts, we struggled to get it working properly within our setup, which led us to explore other ways to enhance accessibility in gaming.
To accommodate players with limited motor abilities who may also face speech impairments, we aimed to implement facial movement detection as an alternative control method. Using TensorFlow.js and the Face Mesh model, we were able to successfully track head movements in multiple directions—up, down, left, and right—providing an intuitive and hands-free way to interact with the game.
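The head-movement classification can be sketched as follows. FaceMesh (`@tensorflow-models/facemesh`) returns dense 3D face keypoints via `estimateFaces`; comparing the nose-tip position against a calibrated neutral position yields a direction. The pixel threshold and the nose-tip keypoint index here are illustrative assumptions, not our tuned values:

```javascript
// Classify head direction from the nose-tip position relative to a
// calibrated neutral (resting) position. The threshold is in pixels
// and is an assumed value that needs per-user tuning.
function headDirection([x, y], [neutralX, neutralY], threshold = 25) {
  const dx = x - neutralX;
  const dy = y - neutralY;
  if (Math.abs(dx) < threshold && Math.abs(dy) < threshold) return null;
  if (Math.abs(dx) > Math.abs(dy)) {
    return dx > 0 ? 'left' : 'right'; // mirrored: the webcam image is flipped
  }
  return dy > 0 ? 'down' : 'up';      // image y grows downward
}

// Browser-only loop (sketch): poll FaceMesh on the webcam feed.
async function trackHead(video, neutral, onDirection) {
  const facemesh = require('@tensorflow-models/facemesh');
  const model = await facemesh.load();
  setInterval(async () => {
    const faces = await model.estimateFaces(video);
    if (faces.length > 0) {
      const nose = faces[0].scaledMesh[1]; // keypoint 1 ≈ nose tip (assumed index)
      const dir = headDirection(nose, neutral);
      if (dir) onDirection(dir);
    }
  }, 100);
}
```

The dead zone around the neutral point prevents small, involuntary head motions from triggering moves.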
However, integrating facial movement detection into the Pac-Man game posed several challenges, primarily due to time constraints. Fine-tuning the system to ensure smooth and responsive gameplay required extensive effort in terms of calibration, optimization, and debugging. Given our limited time, we ultimately decided to prioritize integrating a speech detection system, which offered a more immediate and reliable accessibility solution within our project timeline.
Although facial movement detection could not be fully incorporated into the final version of the game, we successfully implemented it on a separate computer. This allows us to demonstrate its potential and showcase how it could be seamlessly integrated into future versions, offering more inclusive gameplay options for individuals with diverse accessibility needs.
Accomplishments that we're proud of
We are incredibly proud of successfully integrating voice command controls to create a more accessible gaming experience. This achievement allowed us to provide an intuitive, hands-free control option that can benefit individuals with limited mobility. We are also proud of how quickly we were able to develop the game logic and implement the Web Speech API, overcoming challenges and learning new technologies along the way.
This project was our first experience working with voice detection, and we are especially proud of how effectively we were able to implement it, ensuring a seamless and responsive gameplay experience. Additionally, we are proud of our ability to think outside the box and pursue an idea that is underrepresented in the gaming industry—making accessibility a core focus of our project. It’s rewarding to work on something we are truly passionate about and to contribute to an area that can have a meaningful impact on people's lives.
Even though we were unable to fully integrate facial movement detection into our Pac-Man game due to time constraints, we are proud that we successfully got it working on a separate setup. This achievement excites us about the potential for further development, and we are eager to continue refining and enhancing it in the future to make our game even more accessible and inclusive.
What we learned
We learned how to use FaceMesh, a machine learning model for face detection and tracking, to capture and analyze head movements. By leveraging this technology, we were able to implement intuitive controls where users can move their heads up, down, left, or right to navigate the game.
Additionally, we explored the implementation of voice recognition using the Web Speech API to detect spoken commands such as "up," "down," "left," and "right," translating them into in-game movements.
What's next for EchoMotion
We are committed to enhancing EchoMotion by introducing new features and expanding accessibility options. Our future plans include:
- Integrating Facial Movement Detection into the Pac-Man Game: One of our key goals is to successfully incorporate Face Mesh technology into our Pac-Man game, allowing players to control gameplay using head movements. This will offer an alternative control option for individuals who may have difficulty with traditional input methods.
- Implementing Eye Motion Detection: Adding eye movement tracking as a control method to provide users with even more intuitive interaction options.
- Expanding Game Options: Introducing a variety of new games to offer a more diverse and engaging experience for users of all interests and abilities.
- Adaptive Calibration System: Implementing a system that analyzes user movement patterns and automatically adjusts sensitivity for improved accuracy and ease of use.
Built With
- api
- css
- facemesh
- html
- javascript
- react
- sass
- tensorflow
- web-api