Inspiration ☁️:
Special thanks to https://github.com/Pawandeep-prog/Squid-Game, from which our game was adapted.
- We used this to frame our logic and to see how MediaPipe can be applied in this context.
We were inspired by the TV show Squid Game, in which computer vision and AI were used in a modernized, child-like game of Red Light, Green Light. We believe computer-vision and AI interactive games are a growing trend in the gaming market, as the rise of VR games shows, so we felt this was the perfect opportunity to begin exploring the field!
What it does 💁♂️:
The user (you) plays Santa, who is on a mission to deliver gifts once he comes down the chimney. Your computer's webcam sees your movements and translates them into Santa's movements; the objective is not to be caught by little Johnny as he randomly turns around. Johnny isn't wearing his glasses, so he has trouble making out still objects and people, but when he sees something moving he can tell who it is! That's why Santa must not move when Johnny turns around!
Rules:
You can only jog (move) when the music is playing and the tree has a green dot on top.
When the music pauses and there is a red dot on top of the Christmas tree, you must stop.
The white dot moves as you move; to win, the white dot must pass the finish line.
Make sure your whole body is in the frame for the game to start.
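The red-light/green-light rules above can be sketched in a few lines. The threshold value and function names here are illustrative, not taken from the actual game code:

```python
import random

# Illustrative: maximum total landmark displacement tolerated on a red light.
MOVEMENT_THRESHOLD = 0.05

def next_light_duration():
    """Johnny turns around at random intervals: pick how long the
    current light lasts, in seconds (range is an assumption)."""
    return random.uniform(2.0, 6.0)

def eliminated(light, movement):
    """Santa is caught only if he moves while the light is red."""
    return light == "red" and movement > MOVEMENT_THRESHOLD
```

In the real game loop, `movement` would come from frame-to-frame pose-landmark displacement, and the light state would toggle each time the timer from `next_light_duration()` expires.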
To play the game, download the files from GitHub and install the dependencies with pip install opencv-python, pip install mediapipe, and pip install pygame. For macOS/Linux, follow the setup manual: https://google.github.io/mediapipe/getting_started/install.html
How we built it 🔨:
We used Python to implement the computer-vision component that translates your physical movements, captured through the webcam, into in-game movements in real time, using the OpenCV and MediaPipe frameworks. OpenCV is an open-source library for computer vision, machine learning, and image processing. MediaPipe provided the pose-detection ML solution, which takes frames as input and outputs landmarks on the body that our Python program can use. Additionally, we used the Pygame library to load the Christmas (Jingle Bells) song and pause/unpause it while the program is running. We also included a safeguard so the game only starts when the person's entire body is in frame.
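A minimal sketch of the movement check and the full-body safeguard described above, assuming landmarks arrive as (x, y, visibility) tuples like those MediaPipe Pose produces; the function names and the visibility threshold are our own illustration, not the project's actual code:

```python
def body_in_frame(landmarks, min_visibility=0.6):
    """Safeguard: start the game only when every landmark is
    confidently visible (threshold is an assumption)."""
    return all(v >= min_visibility for (_x, _y, v) in landmarks)

def total_movement(prev, curr):
    """Sum of per-landmark displacement between two consecutive frames;
    compared against a threshold during red light to decide whether
    Santa 'moved'."""
    return sum(abs(cx - px) + abs(cy - py)
               for (px, py, _), (cx, cy, _) in zip(prev, curr))

# Hypothetical wiring with MediaPipe Pose (not executed here):
#   import cv2, mediapipe as mp
#   pose = mp.solutions.pose.Pose()
#   results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
#   if results.pose_landmarks:
#       landmarks = [(lm.x, lm.y, lm.visibility)
#                    for lm in results.pose_landmarks.landmark]
```

Pygame's mixer would sit alongside this loop, pausing the music whenever the light turns red and unpausing it on green.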
Challenges we ran into 😳:
Creating a real-time translation between the user's movements and the character on screen proved difficult at first, as any lag in a time-sensitive game can ruin it. We fixed this by delaying the music stop time so that it matches the movement delay from the webcam. We profiled the run-time of the player's movements along with Johnny's (203 ms on average) to match the two.
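The latency matching described above can be sketched as follows. The 203 ms figure comes from the write-up itself; the measurement helper and delay function are our own illustration:

```python
import time

def timed(fn, *args):
    """Run one call and report its wall-clock latency in milliseconds."""
    start = time.perf_counter()
    result = fn(*args)
    return result, (time.perf_counter() - start) * 1000.0

# Illustrative: if pose processing averages ~203 ms per frame, hold the
# music stop back by the same amount so the red light lines up with what
# the webcam actually captured.
POSE_LATENCY_MS = 203

def music_stop_delay(pose_latency_ms=POSE_LATENCY_MS):
    """Seconds to wait before pausing the music on a red light."""
    return pose_latency_ms / 1000.0
```

In practice you would wrap the per-frame pose call in `timed()` over many frames and average the results to obtain the latency figure.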
Accomplishments that we're proud of 💪:
We are really proud that we were able to make a fully functional video game on our first attempt at using computer vision and machine learning! We are also proud of how well we collaborated and split up roles and tasks.
What we learned 🧠:
We learned how to tackle a game-development project, how to collaborate and use GitHub to track our progress, and gained general Python coding experience building games with webcam/vision functionality.
What's next for SneakSanta 💼:
We aim to create fully integrated multiplayer games using the same Red Light, Green Light concept, so we can relive our very own childhood games! Much further down the line, if we can integrate this into VR, it would be as if we were playing with each other in person! There is really a multitude of possibilities for computer vision, and these are very viable avenues.
Built With
- ai
- computer-vision
- machine-learning
- mediapipe
- opencv
- pygame
- python
- tkinter