Exhibited at OpenSauce 2025
A robotic system that physically interacts with Nintendo Switch Joy-Cons to autonomously play Super Mario Party minigames. The system uses computer vision to understand game state and executes optimal strategies through precise mechanical control.
DEEP-BOO combines computer vision, real-time robotics, and game AI to compete in Mario Party minigames. The system reads gameplay via HDMI capture, processes frames to detect game state, and controls physical actuators to manipulate Joy-Cons with superhuman precision.
Key Capabilities:
- Vision System: 1280x720 @ 30fps HDMI capture with OpenCV-based state detection
- Mechanical Control: Spherical parallel manipulator for joystick + solenoid array for buttons
- Game AI: Custom algorithms optimized for reaction time, button mashing, and deterministic strategies
- Live Override: Manual Joy-Con control with seamless autonomous/manual mode switching
- Design: Spherical parallel manipulator with 2 degrees of freedom
- Actuators: 2x stepper motors (TMC2209 drivers, 16 microsteps, 1000mA RMS)
- Range: ±175 steps per axis (~±1.0 normalized range)
- Frame: 20mm x 40mm aluminum extrusion with T-nuts (pelican case compatible)
- Actuators: 4x push-pull solenoids (A, B, X, Y buttons)
- Drivers: 2x TB6612FNG motor drivers
- Power Control: 8-bit PWM (0-255) with configurable press duration
- Performance: 22.22 Hz sustained button mashing for Domination minigame
- Microcontroller: ESP32 (dual-core, 115200 baud serial)
- Vision: BlueAVS HDMI to USB Video Capture Card
- Firmware: `joycon-bluetooth/dual_stepper_joystick/dual-stepper-bluetooth/dual-stepper-bluetooth.ino`
- Power: External supply for solenoids; USB power for the ESP32
- Bluetooth: Joy-Con controller for manual override mode
- Joystick: Binary packet `0x01 + int16_x + int16_y` (little-endian)
- Solenoids: Binary packet `0x02 + solenoid_id + power + duration`
- Servo (auxiliary): Binary packet `0x03 + angle`
- Status/Config: JSON commands over serial
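The packet layout above can be sketched in Python with `struct`. The command bytes and the little-endian int16 joystick fields follow the protocol listed above; the one-byte widths assumed for the solenoid and servo fields are illustrative, not confirmed from the firmware:

```python
import struct

def joystick_packet(x: int, y: int) -> bytes:
    """0x01 command followed by two little-endian int16 axis values."""
    return struct.pack("<Bhh", 0x01, x, y)

def solenoid_packet(solenoid_id: int, power: int, duration: int) -> bytes:
    """0x02 command; solenoid id, 8-bit PWM power (0-255), press duration.
    One byte per field is an assumption for this sketch."""
    return struct.pack("<BBBB", 0x02, solenoid_id, power, duration)

def servo_packet(angle: int) -> bytes:
    """0x03 command followed by the servo angle (one byte assumed)."""
    return struct.pack("<BB", 0x03, angle)
```

These byte strings would be written directly to the ESP32's serial port (e.g. via PySerial's `Serial.write`).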
See HARDWARE.md for detailed electronics documentation and pinout diagrams.
- Language: Python 3.10+
- Vision: OpenCV 4.x (template matching, color masking, HSV detection)
- Hardware Interface: PySerial (binary protocol with command queue)
- Manual Control: PyGame (Joy-Con Bluetooth input with background thread)
The system uses a state pattern to track game phases:
```
MINIGAME_SELECT → MINIGAME_INFO → MINIGAME_INSTRUCTIONS →
MINIGAME_STARTING → MINIGAME_STARTED → MINIGAME_ENDED → (loop)
```
Each state implements detection logic to identify when to transition. During MINIGAME_STARTED, the active minigame's process_frame() method runs at 30 FPS to analyze gameplay and issue robot commands.
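A minimal sketch of such a state pattern (class names and the detector hook are illustrative, not the project's actual API):

```python
from abc import ABC, abstractmethod

class GameState(ABC):
    """One game phase; process_frame returns the next state (or self)."""
    @abstractmethod
    def process_frame(self, frame) -> "GameState":
        ...

class MinigameStarting(GameState):
    def __init__(self, next_state: GameState, detect_start):
        self.next_state = next_state
        self.detect_start = detect_start  # e.g. a template match for the start banner

    def process_frame(self, frame) -> GameState:
        # Transition only once the detector fires on the current frame.
        return self.next_state if self.detect_start(frame) else self

class MinigameStarted(GameState):
    def process_frame(self, frame) -> GameState:
        # The active minigame's AI would analyze the frame and issue
        # robot commands here, at up to 30 FPS.
        return self

class StateMachine:
    def __init__(self, initial: GameState):
        self.state = initial

    def step(self, frame) -> None:
        self.state = self.state.process_frame(frame)
```

The main loop then reduces to grabbing a frame and calling `step` on it each iteration.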
- Sled to the Edge - Frame similarity analysis for optimal reaction timing
- UFO - Multi-phase: block tower building + side drop positioning
- On-Off/Off-On - Speedrun-derived button press sequence + randomization
- Domination - Sustained 22.22 Hz button mashing
- Cookie Cutters - Real-time shape detection (circles, stars, squares)
- Thwomp the Difference - HSV-based fruit color detection
See MINIGAMES.md for strategy breakdowns and implementation details.
```
deep-boo/
├── game-state-tracker/       # Main application
│   ├── main.py               # Entry point (camera, robot, game loop)
│   ├── state_machine.py      # State pattern orchestrator
│   ├── constants.py          # Configuration and mappings
│   ├── states/               # State implementations
│   ├── minigames/            # Minigame-specific AI
│   ├── detectors/            # Vision detection modules
│   └── utils/                # Robot interface, Joy-Con control
├── joycon-bluetooth/         # Joy-Con input and manual control
├── button-control/           # Solenoid testing and firmware
├── cv-utils/                 # Computer vision utilities
├── game-capture/             # Minigame selection detection
├── minigame-development/     # Experimental scripts
├── circuit-design/           # PCB gerber files
└── reference/                # Architecture documentation
```
See ARCHITECTURE.md for detailed code organization and LLM context.
- Python 3.10+
- ESP32 with Arduino IDE
- BlueAVS (or compatible) HDMI capture card
- Nintendo Switch with Super Mario Party
- Joy-Con controller (for manual override)
```bash
# Clone repository
git clone https://github.com/yourusername/deep-boo.git
cd deep-boo

# Install Python dependencies
pip install opencv-python numpy pyserial pygame matplotlib

# Flash Arduino firmware
# Open joycon-bluetooth/dual_stepper_joystick/dual-stepper-bluetooth/dual-stepper-bluetooth.ino
# Upload to ESP32 via Arduino IDE
```
- Connect the HDMI capture card to the Nintendo Switch dock output
- Connect the USB capture card to the PC (verify the device index with `python cv-utils/cap_test.py`)
- Flash the ESP32 firmware via the Arduino IDE
- Connect ESP32 to PC via USB (auto-discovers COM port)
- Pair Joy-Con controller via Bluetooth for manual override
- Mount joystick manipulator on Joy-Con left stick
- Position solenoid array over ABXY buttons
```bash
cd game-state-tracker
python run.py          # Standard mode
python run.py --debug  # With debug overlay
```
Keyboard Controls (during runtime):
- `q` - Quit application
- `d` - Toggle debug overlay
- `u` - Force state to UNKNOWN

Joy-Con Controls:
- `HOME + SL` - Toggle manual/autonomous mode
- `PLUS` - Adjust servo angle
- Face Buttons + Stick - Manual robot control (when in manual mode)
```bash
# View camera feed
python cv-utils/get-game-feed.py

# Test solenoid control
python button-control/solenoid_test_gui.py

# Read Joy-Con inputs
python joycon-bluetooth/read_inputs.py

# Develop color thresholds
python cv-utils/crop_and_threshold.py
```
To add a new minigame:
- Create a new directory in `game-state-tracker/minigames/your_minigame/`
- Implement a `BaseMinigame` subclass with a `process_frame(frame)` method
- Add templates to `game-state-tracker/minigames/your_minigame/templates/`
- Register it in `MINIGAME_LIST` in `game-state-tracker/constants.py`
- Test with `python game-state-tracker/run.py --debug`
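A skeleton of such a subclass might look like the following. The `BaseMinigame` stand-in and the `robot.press_button` call are illustrative, not the project's real interface; see `game-state-tracker/minigames/base_minigame.py` for the actual API:

```python
import numpy as np

# Illustrative stand-in for the real base class in
# game-state-tracker/minigames/base_minigame.py.
class BaseMinigame:
    name = "base"

    def process_frame(self, frame: np.ndarray) -> None:
        raise NotImplementedError

class YourMinigame(BaseMinigame):
    """Skeleton minigame AI: inspect each frame, issue robot commands."""
    name = "your_minigame"

    def __init__(self, robot=None):
        self.robot = robot  # robot interface; None while unit testing

    def process_frame(self, frame: np.ndarray) -> None:
        # Toy heuristic: press A whenever the frame is mostly bright.
        # (press_button is a hypothetical robot-interface method.)
        if frame.mean() > 128 and self.robot is not None:
            self.robot.press_button("A")
```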
See game-state-tracker/minigames/base_minigame.py for full API reference.
Joystick Range:
- Adjust `MAX_STEPS` in `game-state-tracker/utils/robot_interface.py:109` (default: 175)
- Match the firmware `MIN_POSITION`/`MAX_POSITION` in the `.ino` file (default: ±200)

Button Power:
- Configure per-button PWM in the `button_power_settings` dict in `robot_interface.py`
- Test with `button-control/solenoid_test_gui.py`

Camera Settings:
- Verify the device index via `CAMERA_INDEX` in `game-state-tracker/constants.py` (default: 1)
- Adjust resolution/FPS if needed (current: 1280x720 @ 30fps)
Uses the structural similarity index (SSIM) between consecutive frames to detect the moment the character lands, triggering a precisely timed button press with a tunable delay offset.
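A simplified sketch of this idea, using mean absolute difference as a dependency-free stand-in for full SSIM (the project uses SSIM proper):

```python
import numpy as np

def frame_similarity(prev: np.ndarray, curr: np.ndarray) -> float:
    """Crude similarity in [0, 1]: 1.0 means identical frames.
    Stand-in for SSIM to keep this sketch dependency-free."""
    diff = np.abs(prev.astype(np.int16) - curr.astype(np.int16))
    return 1.0 - float(diff.mean()) / 255.0

def landing_detected(prev: np.ndarray, curr: np.ndarray,
                     threshold: float = 0.98) -> bool:
    # A sudden drop in similarity between consecutive frames
    # marks the landing; the threshold is a tunable placeholder.
    return frame_similarity(prev, curr) < threshold
```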
Employs contour detection with circularity and aspect-ratio analysis to classify cookie shapes in real time, mapping each detected shape to the corresponding button press.
A background thread continuously reads Joy-Con input via PyGame. On the HOME+SL combo, it immediately centers the joystick and releases all buttons, then passes Joy-Con input directly to the robot interface as higher-priority commands.
Robot interface maintains a priority queue where manual override commands (priority 10) preempt autonomous commands (priority 0-5), ensuring safe human intervention at any time.
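The preemption behavior can be sketched with Python's `queue.PriorityQueue` (class and method names are illustrative, not the project's actual robot interface):

```python
import itertools
import queue

class CommandQueue:
    """Manual override commands (priority 10) preempt autonomous ones (0-5).

    queue.PriorityQueue pops the *smallest* tuple first, so priorities are
    negated; a sequence counter keeps FIFO order within a priority level.
    """
    def __init__(self):
        self._q = queue.PriorityQueue()
        self._seq = itertools.count()

    def put(self, priority: int, command: str) -> None:
        self._q.put((-priority, next(self._seq), command))

    def get(self) -> str:
        return self._q.get_nowait()[2]
```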
[Space for exhibit stories, crowd reactions, win rates, etc. - to be filled in later]
- Active Development: The core system is production-ready; additional minigames are in progress
- Archive: Event booth materials and legacy scripts have moved to the `archive/` folder
- Recordings: Empty `recordings/` and `screenshots/` directories for runtime output
- CAD Files: Mechanical design files are available in a separate repository (TBD)
- Additional minigame implementations (9 remaining from original 15-game roster)
- Persistent score tracking and session logging
- Adaptive difficulty tuning based on performance metrics
- Expanded servo control for auxiliary game mechanics
[To be determined]
Built for OpenSauce 2025. Special thanks to the maker community and everyone who stopped by the booth!
Project Name Origin: DEEP-BOO is a play on "DeepBlue" (the chess AI) and "Boo" (the Mario ghost character).
For questions or collaboration: [Your contact info]
