Theme: music

Prize: Best Use of ElevenLabs

Inspiration

AeroMix was inspired by the idea of making DJ performance more accessible, affordable, and gesture-driven, turning music control into a natural, intuitive movement experience rather than a traditional interface. DJ equipment and music production tools can be expensive, and they typically rely on clicking buttons, turning knobs, or adjusting sliders on a screen. While effective, this interaction style can feel limiting and less immersive. We wanted to create a more interactive and expressive experience, one that lets users physically engage with music rather than just click controls in software like Ableton Live.

What it does

AeroMix is a low-cost DJ hand gesture recognition glove that allows users to apply effects in Ableton Live using natural hand movements. The glove integrates three flex sensors mounted on the thumb, index, and middle fingers, along with an ultrasonic distance sensor mounted on the palm. These sensors function as toggles and sliding controls that are mapped directly to effects and features in Ableton. Activation requires nothing more than simple, intuitive gestures such as curling one or more fingers or covering the palm. Each sensor controls a specific audio parameter:

  • Thumb flex sensor – adjusts the track volume
  • Index finger flex sensor – adds or controls reverb
  • Middle finger flex sensor – changes the pitch of the song
  • Palm-mounted ultrasonic distance sensor – controls the tempo of the track

By transforming physical gestures into real-time audio manipulation, AeroMix creates a more immersive and interactive DJ experience without the need for traditional knobs, sliders, or controllers.
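The per-sensor mapping above can be sketched as a small lookup table. The CC numbers below are illustrative assumptions, since the real bindings are whatever each control gets assigned through Ableton Live's MIDI mapping:

```python
# Hypothetical MIDI CC assignments for each sensor; the actual numbers are
# whatever you bind via Ableton Live's MIDI Learn, so these are placeholders.
SENSOR_TO_CC = {
    "thumb_flex": 20,      # track volume
    "index_flex": 21,      # reverb amount
    "middle_flex": 22,     # pitch
    "palm_distance": 23,   # tempo
}

def to_cc_value(normalized: float) -> int:
    """Scale a 0.0-1.0 sensor reading to the 0-127 MIDI CC range."""
    clamped = min(max(normalized, 0.0), 1.0)
    return round(clamped * 127)

print(to_cc_value(0.5))  # 64
```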

How we built it

Hardware

The glove’s sensors were connected to an Arduino Uno R4 WiFi, which served as the main microcontroller for reading and processing sensor data. We used a breadboard to organize the circuit and provide sufficient 5V and GND distribution for all components. Each flex sensor was configured in a voltage divider circuit with a 10 kΩ resistor to ensure stable and accurate analog readings. The three flex sensors were mounted on the thumb, index, and middle fingers, while the ultrasonic distance sensor was mounted on the palm to detect hand distance for tempo control. To keep the system organized and portable, we designed and 3D-printed a small enclosure to house the Arduino and wiring. This helped protect the hardware, reduce loose connections, and improve the overall durability and presentation of the glove.
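The voltage-divider reading can be converted back into an approximate flex-sensor resistance. A minimal sketch, assuming a 5 V supply, the 10 kΩ resistor on the ground side of the divider, and the Arduino's default 10-bit analogRead range (0–1023):

```python
def flex_resistance(adc_reading: int, r_fixed: float = 10_000.0,
                    v_in: float = 5.0, adc_max: int = 1023) -> float:
    """Estimate the flex sensor's resistance (ohms) from a raw ADC reading.

    Assumes the divider is wired 5V -> flex sensor -> ADC pin -> 10k -> GND,
    so V_out = V_in * r_fixed / (r_flex + r_fixed).
    """
    v_out = v_in * adc_reading / adc_max
    if v_out <= 0:
        raise ValueError("ADC reading of 0 implies an open flex sensor")
    return r_fixed * (v_in / v_out - 1)

# A mid-scale reading (~512) corresponds to roughly equal resistances:
print(round(flex_resistance(512)))  # 9980
```

With the two resistances roughly equal, the ADC sits near mid-scale, which is why a mid-range reading maps back to about 10 kΩ.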

Software

To connect the physical hardware to our digital audio workstation (DAW), we built a desktop application using Python and PySide6. The GUI acts as our command center, providing live progress bars that show the raw analog reading of each finger and the ultrasonic distance in centimeters. Raw sensor data is inherently noisy, so sending it directly to a music engine would produce jittery audio. We therefore implemented a processing pipeline in the Python backend that smooths each reading and normalizes it to a clean 0–1 range. The app then uses the mido library to translate these processed readings into virtual MIDI Control Change (CC) messages for the thumb, index, and middle fingers, along with the distance from the ultrasonic sensor.
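The smoothing and normalization steps can be sketched as below. The exponential-moving-average weight and the calibration bounds are illustrative assumptions that would be tuned per sensor; in the actual app, the resulting 0–1 value is wrapped in a mido control_change message and sent over a virtual MIDI port:

```python
class SensorChannel:
    """Smooths raw ADC readings and normalizes them to a 0.0-1.0 range.

    alpha and the calibration bounds are illustrative; in practice they are
    tuned per sensor (e.g. the resting vs. fully-curled flex readings).
    """

    def __init__(self, raw_min: float, raw_max: float, alpha: float = 0.2):
        self.raw_min, self.raw_max = raw_min, raw_max
        self.alpha = alpha          # EMA weight: higher = less smoothing
        self.smoothed = None

    def update(self, raw: float) -> float:
        # Exponential moving average damps jitter between readings.
        if self.smoothed is None:
            self.smoothed = raw
        else:
            self.smoothed += self.alpha * (raw - self.smoothed)
        # Min-max normalization into [0, 1], clamped against overshoot.
        norm = (self.smoothed - self.raw_min) / (self.raw_max - self.raw_min)
        return min(max(norm, 0.0), 1.0)

# Feeding a noisy step change shows the jitter being absorbed; the result
# would then become e.g. mido.Message('control_change', control=cc,
# value=round(value * 127)) in the real app.
ch = SensorChannel(raw_min=200, raw_max=800)
for raw in (200, 800, 780, 810):
    value = ch.update(raw)
print(round(value, 3))
```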

Challenges we ran into

One major challenge we faced was getting the Bluetooth module to connect reliably. We spent several hours troubleshooting the wiring, code, and communication settings, but the link still did not work properly. After multiple attempts, we decided to move on and focus on completing the other essential parts of the project to stay on schedule. This experience taught us the importance of time management and of knowing when to pivot instead of getting stuck on one issue.

Accomplishments that we're proud of

We successfully built a hand gesture recognition glove that enables an interactive DJ environment. We integrated three flex sensors and an ultrasonic distance sensor to detect hand movements and gestures accurately. We were also able to transmit data from the Arduino to the app we built using ElevenLabs, and then route the processed output into Ableton Live. This full pipeline, from physical gesture input to real-time audio manipulation, was one of our biggest achievements. Seeing our hardware and software work together smoothly was very rewarding.
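The Arduino-to-app leg of that pipeline boils down to parsing serial lines into per-sensor values. A minimal sketch, assuming a hypothetical comma-separated wire format (the real firmware's format may differ):

```python
def parse_sensor_line(line: str) -> dict:
    """Parse one serial line from the Arduino.

    Assumes a line like "512,300,410,15.2": thumb, index, and middle
    ADC readings followed by the ultrasonic distance in centimeters.
    This format is an illustrative assumption, not the project's exact one.
    """
    thumb, index, middle, distance = line.strip().split(",")
    return {
        "thumb": int(thumb),
        "index": int(index),
        "middle": int(middle),
        "distance_cm": float(distance),
    }

print(parse_sensor_line("512,300,410,15.2"))
```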

What we learned

Through this project, we learned how to integrate hardware and software systems effectively. We improved our skills in sensor calibration, serial communication, and debugging. We also learned how important system architecture is, especially when transferring data between multiple platforms (Arduino → App → ElevenLabs → Ableton). Small communication errors can break the entire pipeline, so clear structure and testing at each stage are critical. Most importantly, we learned resilience and teamwork. When something didn't work (like Bluetooth), we adapted and found alternative ways to achieve our goal.

What's next for AeroMix

Next, we want to implement a Bluetooth connection between the glove and our application so that a physical cable to the laptop hosting the app is no longer necessary. Another planned feature is an accelerometer to detect rapid hand swings for more expressive interaction with DJ features.

Built With

  • arduino
  • elevenlabs
  • flexsensor
  • pyqt
  • ultrasonicdistancemeter