Inspiration

This project was inspired by a desire to free lighting designers from being trapped at their control boards when adjusting lighting. It provides a portable alternative, with simplified, more intuitive controls.

What it does

Gestures are interpreted into lighting commands, which are sent wirelessly to a server connected to the lighting controller.

How we built it

The Leap Motion device, which detects the user's hand movements, is attached to a portable computer. The computer runs a client program, which doubles as a custom visualizer, and interprets gestures into hue, saturation, and brightness data. These are converted to RGB values and sent over Wi-Fi to a server on the lighting computer. The server converts the RGB signal into commands for the lighting controller, which adjusts the lighting accordingly.
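The hue/saturation/brightness-to-RGB step can be sketched in Python with the standard library. The JSON-over-UDP wire format, function names, and server address below are illustrative assumptions, not the project's actual protocol:

```python
import colorsys
import json
import socket

def hsb_to_rgb(hue, sat, bri):
    """Convert hue, saturation, brightness (each 0.0-1.0) to 0-255 RGB."""
    r, g, b = colorsys.hsv_to_rgb(hue, sat, bri)
    return round(r * 255), round(g * 255), round(b * 255)

def send_color(sock, server_addr, hue, sat, bri):
    # Package the RGB triple as a small JSON datagram for the lighting server
    # (hypothetical wire format -- the real client may encode this differently)
    r, g, b = hsb_to_rgb(hue, sat, bri)
    sock.sendto(json.dumps({"r": r, "g": g, "b": b}).encode(), server_addr)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_color(sock, ("127.0.0.1", 9000), 0.0, 1.0, 1.0)  # full-brightness red
```

Keeping the conversion on the client means the server only ever deals in ready-to-use RGB values.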

Challenges we ran into

We originally planned to use a DragonBoard 410c as the portable computer, but were unsuccessful in establishing communication between the Leap Motion device and the DragonBoard. We eventually determined that it was not possible to access the Leap Motion without being able to run the leapd service on the computer, and the Leap software is not currently available for ARM processors.

Accomplishments that we're proud of

The gestural interface worked very well, and the visualizer is fun and exciting to use on its own, even without the lights.

What we learned

We learned to code OSC commands, work with Leap Motion, and develop for the DragonBoard 410c and Grove sensor kit.

What's next for Leap Lighting

We could create an integrated, portable device using Leap Motion and a simple computer and transmitter, to replace the current laptop or tablet setup. We could also add functionality for controlling different lighting instruments and stage equipment. The gestural interface could be improved and expanded to cover moving lights, simultaneous control of multiple lights and more complex gestures. The system could also be enhanced by adding a VR or AR component, to let the user see more of the lighting setup and the space than would normally be visible.

Updates

After a long night, we have most things working. Since we couldn't get the DragonBoard to work, a laptop now receives the Leap Motion data instead. That laptop is connected over a network to the lighting console computer, which receives the color data, converts it into an Open Sound Control (OSC) command, and sends it to the lighting software, which interprets and executes the commands. We are still working on putting a basic GUI together and on making all the pieces talk to each other, but we are very close to finishing.
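For reference, a raw OSC message of the kind described can be assembled with just the standard library. The `/leap/rgb` address is a made-up example, not the project's actual address space:

```python
import struct

def osc_pad(data: bytes) -> bytes:
    # OSC strings are null-terminated, then zero-padded to a 4-byte boundary
    data += b"\x00"
    while len(data) % 4:
        data += b"\x00"
    return data

def build_osc_message(address: str, *args: float) -> bytes:
    """Encode an OSC message whose arguments are all float32s."""
    msg = osc_pad(address.encode("ascii"))                    # address pattern
    msg += osc_pad(("," + "f" * len(args)).encode("ascii"))   # type tag string
    for value in args:
        msg += struct.pack(">f", value)                       # big-endian float32
    return msg

packet = build_osc_message("/leap/rgb", 1.0, 0.5, 0.0)
```

In practice a library such as python-osc handles this encoding, but hand-rolling it shows exactly what travels over the wire between the two computers.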