JoshuaMosier/deep-boo
DEEP-BOO: Autonomous Mario Party Minigame Robot

Exhibited at OpenSauce 2025

A robotic system that physically interacts with Nintendo Switch Joy-Cons to autonomously play Super Mario Party minigames. The system uses computer vision to understand game state and executes optimal strategies through precise mechanical control.

Project Overview

DEEP-BOO combines computer vision, real-time robotics, and game AI to compete in Mario Party minigames. The system reads gameplay via HDMI capture, processes frames to detect game state, and controls physical actuators to manipulate Joy-Cons with superhuman precision.

Key Capabilities:

  • Vision System: 1280x720 @ 30fps HDMI capture with OpenCV-based state detection
  • Mechanical Control: Spherical parallel manipulator for joystick + solenoid array for buttons
  • Game AI: Custom algorithms optimized for reaction time, button mashing, and deterministic strategies
  • Live Override: Manual Joy-Con control with seamless autonomous/manual mode switching

Hardware Components

Joystick Control

  • Design: Spherical parallel manipulator with 2 degrees of freedom
  • Actuators: 2x stepper motors (TMC2209 drivers, 16 microsteps, 1000mA RMS)
  • Range: ±175 steps per axis (~±1.0 normalized range)
  • Frame: 20mm x 40mm aluminum extrusion with T-nuts (pelican case compatible)

Button Control

  • Actuators: 4x push-pull solenoids (A, B, X, Y buttons)
  • Drivers: 2x TB6612FNG motor drivers
  • Power Control: 8-bit PWM (0-255) with configurable press duration
  • Performance: 22.22 Hz sustained button mashing for Domination minigame
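For sustained mashing at a fixed rate, a drift-free timing loop matters more than raw speed. A minimal sketch of such a loop (the `send_press` callback and the absolute-time scheduling approach are illustrative assumptions, not the project's actual firmware or host code):

```python
import time

MASH_HZ = 22.22
PERIOD = 1.0 / MASH_HZ  # ~45 ms per press cycle

def mash(send_press, duration_s: float) -> int:
    """Fire send_press() at a fixed rate for duration_s seconds.

    Scheduling against an accumulated absolute deadline (next_t) instead of
    sleeping a fixed period prevents timing drift from accumulating.
    """
    next_t = time.monotonic()
    end = next_t + duration_s
    count = 0
    while next_t < end:
        send_press()
        count += 1
        next_t += PERIOD
        time.sleep(max(0.0, next_t - time.monotonic()))
    return count
```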

Electronics

  • Microcontroller: ESP32 (dual-core, 115200 baud serial)
  • Vision: BlueAVS HDMI to USB Video Capture Card
  • Firmware: joycon-bluetooth/dual_stepper_joystick/dual-stepper-bluetooth/dual-stepper-bluetooth.ino
  • Power: External supply for solenoids, USB power for ESP32
  • Bluetooth: Joy-Con controller for manual override mode

Communication Protocol

  • Joystick: Binary packet 0x01 + int16_x + int16_y (little-endian)
  • Solenoids: Binary packet 0x02 + solenoid_id + power + duration
  • Servo (auxiliary): Binary packet 0x03 + angle
  • Status/Config: JSON commands over serial
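Given those packet layouts, host-side packet assembly could look like the following sketch (the field widths for `solenoid_id`, `power`, and `duration` are assumed to be one byte each, and the helper names are hypothetical):

```python
import struct

def pack_joystick(x: int, y: int) -> bytes:
    """0x01 opcode followed by two little-endian int16 axis values."""
    return struct.pack("<Bhh", 0x01, x, y)

def pack_solenoid(solenoid_id: int, power: int, duration: int) -> bytes:
    """0x02 opcode, then one byte each for id, PWM power (0-255), duration."""
    return struct.pack("<BBBB", 0x02, solenoid_id, power, duration)

def pack_servo(angle: int) -> bytes:
    """0x03 opcode followed by a one-byte servo angle."""
    return struct.pack("<BB", 0x03, angle)
```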

See HARDWARE.md for detailed electronics documentation and pinout diagrams.

Software Architecture

Core System

  • Language: Python 3.10+
  • Vision: OpenCV 4.x (template matching, color masking, HSV detection)
  • Hardware Interface: PySerial (binary protocol with command queue)
  • Manual Control: PyGame (Joy-Con Bluetooth input with background thread)

State Machine

The system uses a state pattern to track game phases:

MINIGAME_SELECT → MINIGAME_INFO → MINIGAME_INSTRUCTIONS →
MINIGAME_STARTING → MINIGAME_STARTED → MINIGAME_ENDED → (loop)

Each state implements detection logic to identify when to transition. During MINIGAME_STARTED, the active minigame's process_frame() method runs at 30 FPS to analyze gameplay and issue robot commands.
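The transition chain above can be sketched as a simple enum-driven loop (state names are taken from this README; the actual classes in state_machine.py and states/ will differ):

```python
from enum import Enum, auto

class GameState(Enum):
    MINIGAME_SELECT = auto()
    MINIGAME_INFO = auto()
    MINIGAME_INSTRUCTIONS = auto()
    MINIGAME_STARTING = auto()
    MINIGAME_STARTED = auto()
    MINIGAME_ENDED = auto()

# Linear transition order; MINIGAME_ENDED loops back to MINIGAME_SELECT.
ORDER = list(GameState)

def next_state(state: GameState) -> GameState:
    return ORDER[(ORDER.index(state) + 1) % len(ORDER)]
```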

Implemented Minigames

  1. Sled to the Edge - Frame similarity analysis for optimal reaction timing
  2. UFO - Multi-phase: block tower building + side drop positioning
  3. On-Off/Off-On - Speedrun-derived button press sequence + randomization
  4. Domination - Sustained 22.22 Hz button mashing
  5. Cookie Cutters - Real-time shape detection (circles, stars, squares)
  6. Thwomp the Difference - HSV-based fruit color detection

See MINIGAMES.md for strategy breakdowns and implementation details.

Project Structure

deep-boo/
├── game-state-tracker/          # Main application
│   ├── main.py                  # Entry point (camera, robot, game loop)
│   ├── state_machine.py         # State pattern orchestrator
│   ├── constants.py             # Configuration and mappings
│   ├── states/                  # State implementations
│   ├── minigames/               # Minigame-specific AI
│   ├── detectors/               # Vision detection modules
│   └── utils/                   # Robot interface, Joy-Con control
├── joycon-bluetooth/            # Joy-Con input and manual control
├── button-control/              # Solenoid testing and firmware
├── cv-utils/                    # Computer vision utilities
├── game-capture/                # Minigame selection detection
├── minigame-development/        # Experimental scripts
├── circuit-design/              # PCB gerber files
└── reference/                   # Architecture documentation

See ARCHITECTURE.md for detailed code organization and LLM context.

Getting Started

Prerequisites

  • Python 3.10+
  • ESP32 with Arduino IDE
  • BlueAVS (or compatible) HDMI capture card
  • Nintendo Switch with Super Mario Party
  • Joy-Con controller (for manual override)

Installation

# Clone repository
git clone https://github.com/JoshuaMosier/deep-boo.git
cd deep-boo

# Install Python dependencies
pip install opencv-python numpy pyserial pygame matplotlib

# Flash Arduino firmware
# Open joycon-bluetooth/dual_stepper_joystick/dual-stepper-bluetooth/dual-stepper-bluetooth.ino
# Upload to ESP32 via Arduino IDE

Hardware Setup

  1. Connect HDMI capture card to Nintendo Switch dock output
  2. Connect USB capture card to PC (verify device index with python cv-utils/cap_test.py)
  3. Flash ESP32 firmware via Arduino IDE
  4. Connect ESP32 to PC via USB (auto-discovers COM port)
  5. Pair Joy-Con controller via Bluetooth for manual override
  6. Mount joystick manipulator on Joy-Con left stick
  7. Position solenoid array over ABXY buttons
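Step 4's COM-port auto-discovery typically keys off the USB-UART bridge chip named in the port description. A hedged stand-in for that matching logic (the chip list is an assumption, not taken from the project's discovery code):

```python
def looks_like_esp32_bridge(description: str) -> bool:
    """Heuristic match for the USB-UART bridge chips commonly found on
    ESP32 dev boards (CP210x, CH340, FTDI)."""
    desc = description.lower()
    return any(chip in desc for chip in ("cp210", "ch340", "ftdi"))

# With pyserial installed, discovery could then look like:
#
#   from serial.tools import list_ports
#   port = next((p.device for p in list_ports.comports()
#                if looks_like_esp32_bridge(p.description or "")), None)
```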

Running the System

cd game-state-tracker
python run.py              # Standard mode
python run.py --debug      # With debug overlay

Keyboard Controls (during runtime):

  • q - Quit application
  • d - Toggle debug overlay
  • u - Force state to UNKNOWN

Joy-Con Controls:

  • HOME + SL - Toggle manual/autonomous mode
  • PLUS - Adjust servo angle
  • Face Buttons + Stick - Manual robot control (when in manual mode)

Testing Tools

# View camera feed
python cv-utils/get-game-feed.py

# Test solenoid control
python button-control/solenoid_test_gui.py

# Read Joy-Con inputs
python joycon-bluetooth/read_inputs.py

# Develop color thresholds
python cv-utils/crop_and_threshold.py

Development

Adding New Minigames

  1. Create new directory in game-state-tracker/minigames/your_minigame/
  2. Implement BaseMinigame subclass with process_frame(frame) method
  3. Add templates to game-state-tracker/minigames/your_minigame/templates/
  4. Register in game-state-tracker/constants.py MINIGAME_LIST
  5. Test with python game-state-tracker/run.py --debug

See game-state-tracker/minigames/base_minigame.py for full API reference.
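A hypothetical skeleton for steps 1-2 (only process_frame(frame) is documented above, so the stand-in base class and the constructor signature are assumptions; consult base_minigame.py for the real API):

```python
class BaseMinigame:  # stand-in for the real base class in base_minigame.py
    def process_frame(self, frame):
        raise NotImplementedError

class YourMinigame(BaseMinigame):
    def __init__(self, robot):
        self.robot = robot  # robot interface for joystick/button commands

    def process_frame(self, frame):
        # Called at 30 FPS with the current 1280x720 capture frame.
        # Analyze the frame and issue robot commands here, e.g.
        # self.robot.press_button("A") when a target is detected.
        pass
```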

Calibration

Joystick Range:

  • Adjust MAX_STEPS in game-state-tracker/utils/robot_interface.py:109 (default: 175)
  • Match firmware MIN_POSITION/MAX_POSITION in .ino file (default: ±200)

Button Power:

  • Configure per-button PWM in robot_interface.py button_power_settings dict
  • Test with button-control/solenoid_test_gui.py

Camera Settings:

  • Verify device index in game-state-tracker/constants.py CAMERA_INDEX (default: 1)
  • Adjust resolution/FPS if needed (current: 1280x720 @ 30fps)

Technical Highlights

Frame Similarity Analysis (Sled to the Edge)

Uses the structural similarity index (SSIM) between consecutive frames to detect the moment the character lands, triggering a precisely timed button press with a tunable delay offset.
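The trigger logic reduces to watching a stream of per-frame similarity scores. A sketch, assuming the landing shows up as a drop in inter-frame similarity and that the delay offset is counted in frames (both assumptions; the project's thresholds and units may differ):

```python
def detect_landing(scores, threshold=0.95, delay_frames=2):
    """Given per-frame similarity scores (e.g. SSIM between consecutive
    frames), return the frame index at which to press the button: the first
    frame where similarity drops below the threshold (sudden motion taken
    as the landing), plus a tunable delay offset. None if never triggered."""
    for i, score in enumerate(scores):
        if score < threshold:
            return i + delay_frames
    return None
```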

Shape Detection (Cookie Cutters)

Employs contour detection with circularity and aspect-ratio analysis to classify cookie shapes in real time, mapping each detected shape to the corresponding button press.
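The standard circularity metric is 4πA/P², which equals 1.0 for a perfect circle and falls for less compact shapes. A sketch of the classification step (thresholds are illustrative, not the project's tuned values):

```python
import math

def classify_shape(area: float, perimeter: float, aspect_ratio: float) -> str:
    """Classify a contour by circularity (4*pi*A / P^2) plus the bounding-box
    aspect ratio. A circle scores ~1.0, a square ~0.785 (pi/4), and a spiky
    star much lower because its perimeter is long relative to its area."""
    circularity = 4 * math.pi * area / (perimeter ** 2)
    if circularity > 0.85:
        return "circle"
    if circularity > 0.6 and 0.9 < aspect_ratio < 1.1:
        return "square"
    return "star"
```

In the real pipeline, area, perimeter, and bounding box would come from OpenCV contour functions on the thresholded frame.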

Manual Override System

A background thread continuously reads Joy-Con input via PyGame. On the HOME+SL combo, the system immediately centers the joystick and releases all buttons, then forwards Joy-Con input directly to the robot interface as higher-priority commands.

Command Queue with Priority

The robot interface maintains a priority queue in which manual override commands (priority 10) preempt autonomous commands (priority 0-5), ensuring safe human intervention at any time.
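Python's heapq is a min-heap, so a higher-priority-wins queue negates the priority; a tie-breaking counter preserves FIFO order within a priority level. A sketch of that pattern (class and method names hypothetical, not the project's actual interface):

```python
import heapq
import itertools

class CommandQueue:
    """Priority queue where higher priority wins: manual override commands
    (priority 10) preempt autonomous commands (priority 0-5)."""

    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # FIFO tie-breaker

    def push(self, command, priority=0):
        heapq.heappush(self._heap, (-priority, next(self._counter), command))

    def pop(self):
        return heapq.heappop(self._heap)[2]
```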

Exhibit Experience (OpenSauce 2025)

[Space for exhibit stories, crowd reactions, win rates, etc. - to be filled in later]

Repository Notes

  • Active Development: Core system is production-ready; additional minigames in progress
  • Archive: Event booth materials and legacy scripts moved to archive/ folder
  • Recordings: Empty recordings/ and screenshots/ directories for runtime output
  • CAD Files: Mechanical design files available in separate repository (TBD)

Future Work

  • Additional minigame implementations (9 remaining from original 15-game roster)
  • Persistent score tracking and session logging
  • Adaptive difficulty tuning based on performance metrics
  • Expanded servo control for auxiliary game mechanics

License

[To be determined]

Acknowledgments

Built for OpenSauce 2025. Special thanks to the maker community and everyone who stopped by the booth!


Project Name Origin: DEEP-BOO is a play on "Deep Blue" (the IBM chess computer) and "Boo" (the Mario ghost character).

For questions or collaboration: [Your contact info]
