67 Royale - may the best six-sevenner win

team name: PIP

  • Roshan Banisetti
  • Misbah Ahmed Nauman
  • Raahim Khan

Inspiration

The idea started with a ridiculous TikTok trend. People were alternating their hands up and down while chanting “six-seven”, and it had become this absurd cultural moment that somehow everyone knew about. When we saw the hackathon theme asking for something silly, dumb, and creative, we immediately thought:

What if we could turn this brainrot trend into an actual competitive game?

The concept was straightforward but technically interesting: use computer vision to track hand movements in real time, count the alternations accurately, and let people compete either solo for speed or head-to-head in live matches. We wanted to capture the chaotic energy of the trend while building something that actually worked well enough for competitive play.


Building the Project

Initial Setup and Computer Vision (Hours 0–8)

The first challenge was getting motion detection working reliably. We evaluated several options for hand tracking and settled on MediaPipe Hands because it provides 21 landmark points per hand, giving enough precision to distinguish real alternating movements from random hand waving.

The core algorithm had to solve a specific problem: how do you count a “rep” in the 67 motion?
We developed a velocity-based detection system that checks several conditions simultaneously:

  • Both hands must be vertically separated by at least 0.06 normalized units
  • Both hands must be actively moving (minimum velocity threshold of 0.008 per frame)
  • Hands must move in opposite directions to prevent cheating
  • State must flip between “left-up” and “right-up” to increment the counter

We added exponential smoothing with a factor of 0.4 to eliminate camera jitter and implemented a calibration phase where users hold their hands level for 3 seconds to establish a baseline. This step was critical for normalizing tracking across different camera angles and user heights.
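The checks above can be sketched as a small state machine. This is an illustrative reconstruction using the thresholds quoted in the text, not the project's actual detector code; the class and field names are assumptions.

```typescript
// Illustrative rep detector: exponential smoothing (factor 0.4), then
// four simultaneous checks before a state flip counts as a rep.
// y is the normalized wrist height from hand tracking, 0 at the top.
type HandY = { left: number; right: number };

class RepDetector {
  reps = 0;
  private smoothed: HandY | null = null;
  private prev: HandY | null = null;
  private state: "left-up" | "right-up" | null = null;

  update(raw: HandY): void {
    // Exponential smoothing damps camera jitter.
    const a = 0.4;
    const s: HandY = this.smoothed
      ? {
          left: a * raw.left + (1 - a) * this.smoothed.left,
          right: a * raw.right + (1 - a) * this.smoothed.right,
        }
      : raw;

    if (this.prev) {
      const vLeft = s.left - this.prev.left; // per-frame velocity
      const vRight = s.right - this.prev.right;

      const separated = Math.abs(s.left - s.right) >= 0.06; // vertical gap
      const moving = Math.abs(vLeft) >= 0.008 && Math.abs(vRight) >= 0.008;
      const opposite = vLeft * vRight < 0; // opposite vertical directions

      if (separated && moving && opposite) {
        // Smaller y means higher on screen.
        const now = s.left < s.right ? "left-up" : "right-up";
        if (this.state !== null && now !== this.state) this.reps += 1;
        this.state = now;
      }
    }
    this.smoothed = s;
    this.prev = s;
  }
}
```

Because the velocity is computed on the smoothed signal, a single noisy frame cannot flip the state; the hands have to sustain real opposite motion for a rep to register.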


WebRTC Implementation (Hours 8–16)

This was by far the hardest part of the project. None of us had worked with WebRTC before, and getting peer-to-peer video connections working was far more complex than expected.

Our initial implementation used only STUN servers for NAT traversal. It worked perfectly on localhost but failed entirely across different networks. After hours of debugging, we realized most restrictive networks (university WiFi, mobile data, corporate firewalls) block direct P2P connections.

The solution was adding TURN relay servers. We integrated Metered.ca’s free tier, which relays traffic when direct connections fail. The signaling flow works as follows:

  1. Both players establish local camera streams
  2. Players are sorted by Firestore document ID to determine host/guest roles
  3. Host creates a WebRTC offer and writes it to Firestore
  4. Guest reads the offer, creates an answer, and writes it back
  5. Both exchange ICE candidates through Firestore subcollections
  6. Once connected, video streams flow peer-to-peer (or via TURN if required)
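Step 2 needs no extra coordination round-trip because both clients can compute the same answer locally. A minimal sketch of that step and the signaling document shape (the field, collection, and function names here are assumptions, not the real schema):

```typescript
// Illustrative signaling-document shape for the Firestore exchange.
interface SignalDoc {
  offer?: { type: "offer"; sdp: string };   // written by the host (step 3)
  answer?: { type: "answer"; sdp: string }; // written by the guest (step 4)
  // ICE candidates (step 5) go in per-role subcollections,
  // e.g. "offerCandidates" and "answerCandidates".
}

// Deterministic host election: both peers sort the two Firestore
// document IDs and whichever sorts first becomes host, so the two
// clients agree on roles without talking to each other.
function assignRole(myId: string, peerId: string): "host" | "guest" {
  return [myId, peerId].sort()[0] === myId ? "host" : "guest";
}
```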

The breakthrough came from two details: setting iceCandidatePoolSize: 10 to speed up connection setup, and formatting the TURN credentials correctly. Once those were in place, we had smooth dual-camera feeds with no noticeable gameplay latency.
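For reference, the connection configuration looks roughly like this. The TURN URL and credential values are placeholders, not the project's real ones; real values come from the TURN provider's dashboard.

```typescript
// Illustrative RTCPeerConnection configuration with STUN plus TURN fallback.
const rtcConfig = {
  iceServers: [
    // Public STUN for NAT discovery; works when direct P2P is possible.
    { urls: "stun:stun.l.google.com:19302" },
    // TURN relay fallback for restrictive networks (placeholder values).
    {
      urls: "turn:global.relay.metered.ca:443?transport=tcp",
      username: "TURN_USERNAME",
      credential: "TURN_PASSWORD",
    },
  ],
  // Pre-gathers ICE candidates before the offer/answer exchange,
  // shortening connection setup.
  iceCandidatePoolSize: 10,
};

// Browser-side usage (not runnable outside a browser):
// const pc = new RTCPeerConnection(rtcConfig);
```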


Real-time State Management (Hours 16–20)

We initially tried Firestore’s onSnapshot listeners for real-time duel updates. While intuitive, this caused permission issues since client-side Firestore was read-only for security reasons, and loosening write permissions introduced vulnerabilities.

We pivoted to API polling at a 2-second interval. The main challenge was avoiding stale closures in React. Polling callbacks need access to current game state, but closures capture outdated values.

We solved this using refs instead of state for values accessed inside polling callbacks.
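A framework-free sketch of why the ref fixes this. In the app the role of the ref object is played by React's useRef, whose .current field is updated on every render; the names below are illustrative.

```typescript
// A ref is just a stable object whose .current is mutated over time.
type Ref<T> = { current: T };

// Returns the callback that would run inside setInterval(..., 2000).
// It closes over the ref (stable identity), not over a snapshot of the
// score, so each tick reads the latest value.
function makePollTick<T>(ref: Ref<T>, out: T[]): () => void {
  return () => out.push(ref.current);
}

const scoreRef: Ref<number> = { current: 0 };
const polled: number[] = [];
const tick = makePollTick(scoreRef, polled);

tick();               // score is still 0
scoreRef.current = 7; // user lands more reps; render updates the ref
tick();               // same closure, fresh value
```

Had the callback captured the score value directly, both ticks would have pushed 0; reading through the ref means the interval never has to be torn down and recreated when the score changes.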

This allowed us to read up-to-date values inside setInterval without recreating the interval on every render.


Anti-Cheat and Validation (Hours 20–22)

We implemented several layers of validation to prevent score manipulation and abuse:

  • JWT session tokens generated server-side with timestamps and duration metadata
  • Token verification on score submission to prevent replay attacks
  • Rate limiting (1 submission per 10 seconds per IP)
  • Server-side score sanity checks (maximum 20 reps per second)
  • Profanity filtering on usernames

The computer vision algorithm itself also acts as an anti-cheat mechanism. By enforcing opposite vertical motion using velocity sign checks, users cannot simply wiggle both hands at the same height. Both hands must be actively moving in opposite directions with sufficient vertical separation for a rep to be counted.
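The server-side sanity check boils down to a pure function. A sketch with assumed names, on the assumption that the duration comes from the verified JWT rather than the client-supplied body:

```typescript
// Illustrative server-side plausibility check for submitted scores.
interface ScoreSubmission {
  reps: number;
  durationSeconds: number; // from the verified session token, not the client
}

const MAX_REPS_PER_SECOND = 20;

function isPlausibleScore(s: ScoreSubmission): boolean {
  if (!Number.isInteger(s.reps) || s.reps < 0) return false;
  if (s.durationSeconds <= 0) return false;
  // Physically plausible upper bound: 20 reps per second of session time.
  return s.reps <= s.durationSeconds * MAX_REPS_PER_SECOND;
}
```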


UI Polish and Final Features (Hours 22–24)

Misbah joined during the final stretch to refine the UI and overall feel of the product. We implemented a glassmorphism-inspired design with a clean, minimal aesthetic. The original canvas-based overlays were visually heavy and obstructed the camera feed, so we replaced them with lightweight React overlays that display only essential information such as rep count, timers, and opponent status.

One subtle but important fix was mirroring the camera feed. Users naturally expect to see themselves mirrored, similar to looking in a mirror, but our initial implementation showed the raw camera output. We applied a horizontal flip transformation to the rendering context, which immediately made the experience feel more natural and intuitive.
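The flip itself is a one-liner on the canvas context; the subtlety is that after scaling by -1 the image must be drawn at a negative x to land back in view. A sketch (the helper name is ours, and the drawing calls are browser-only):

```typescript
// After ctx.scale(-1, 1) flips the x-axis, drawing at x = -width
// maps the image back onto [0, width]. This helper computes that offset.
function mirroredDrawX(canvasWidth: number): number {
  return -canvasWidth;
}

// Browser-side usage (not runnable here):
// ctx.save();
// ctx.scale(-1, 1); // horizontal flip, like a mirror
// ctx.drawImage(video, mirroredDrawX(canvas.width), 0,
//               canvas.width, canvas.height);
// ctx.restore();    // restore so HUD overlays are not mirrored too
```

Wrapping the flip in save/restore matters: without it, any text drawn afterward (rep counts, timers) would render backwards.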


Technical Architecture

The project was organized as a monorepo, allowing tight integration between frontend, backend, and real-time systems:

  • Frontend: Next.js 16 with the App Router, TypeScript 5, and Tailwind CSS 4. React hooks combined with refs for performance-critical game state.
  • Computer Vision: MediaPipe Hands running client-side at approximately 30 FPS. All detection logic is encapsulated in a BrainrotDetector class with tunable sensitivity parameters.
  • Backend: Firebase Firestore for persistence, Next.js API routes for server-side validation, Firebase Admin SDK for secure writes, and custom JWTs for session management.
  • Real-time Communication: WebRTC for peer-to-peer video with TURN relay fallback. Firestore is used for WebRTC signaling, while API polling handles game state synchronization.
  • Deployment: Vercel for frontend hosting and Firebase for database and security rules, using only free-tier services.
