Inspiration

We had recently been having discussions about race and how it shapes the lived experience. When we heard about last year's project that slapped people based on a sound, we were inspired to make racial discrepancies tangible.

What it does

Our robot uses an ML model to slap you - with a catch. Based on your appearance, you have a higher or lower probability of being slapped. Simple, but effective.

How we built it

This is put together with Python, wood, an Arduino, a servo, and a free trial of an online facial-analysis API. OpenCV first records from the webcam and draws basic information on the screen (frame rate, etc.), and then we use the Kairos API to estimate ethnicity.
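The on-screen frame-rate readout comes down to timing successive frames. Here is a minimal sketch of that calculation in plain Python (the class and its names are ours, not the project's code):

```python
import time
from collections import deque

class FpsMeter:
    """Rolling frames-per-second estimate over the last `window` frames."""

    def __init__(self, window=30):
        self.stamps = deque(maxlen=window)

    def tick(self, now=None):
        """Record one frame timestamp and return the current FPS estimate."""
        self.stamps.append(time.monotonic() if now is None else now)
        if len(self.stamps) < 2:
            return 0.0
        elapsed = self.stamps[-1] - self.stamps[0]
        # N timestamps span N - 1 frame intervals.
        return (len(self.stamps) - 1) / elapsed if elapsed > 0 else 0.0
```

In the capture loop, each iteration would call `tick()` and draw the returned value onto the frame (e.g. with `cv2.putText`) before displaying it.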

Challenges we ran into

The API is unreliable. Or so we thought - after several hours of trying to diagnose the ridiculous number of timeouts, we realized the issue was the laptop itself. After resetting its networking agent, the code worked like a charm!
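Before we found the real culprit, the obvious software-side mitigation for timeouts is a retry wrapper around the API call. A minimal sketch of that pattern (the wrapper and its names are ours, not the project's code; the actual Kairos request would be passed in as `call`):

```python
import time

def with_retries(call, attempts=3, delay=1.0):
    """Run `call()` up to `attempts` times, sleeping `delay` seconds
    between tries; re-raise the error if the final attempt also fails."""
    for attempt in range(attempts):
        try:
            return call()
        except TimeoutError:
            if attempt == attempts - 1:
                raise
            time.sleep(delay)
```

Had the timeouts been genuine network flakiness rather than a broken local networking stack, this would have kept the demo running through intermittent failures.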

What we learned

The best things are often the simplest.

Built With

arduino, kairos, opencv, python, wood
