Rune

A flexible gesture-motion controller for IoT devices, letting you interface intuitively with your newest smart products. Leveraging a custom machine learning model built with TensorFlow and the communication capabilities of both the Qualcomm DragonBoard and the NodeMCU, the system can distinguish between 10 discrete gestures in real time.

Hardware

  • Qualcomm DragonBoard 410c
  • NodeMCU ESP32
  • MPU-9250 9-axis IMU

Software

  • Python
  • C
  • TensorFlow
  • NumPy
  • MQTT Messaging Protocol (Mosquitto Client)

Nothing is Ever Easy

We faced the challenge of having a Raspberry Pi break on us during the event, turning a comfortable pace into an intense grind in an instant. It was a major setback, but we recovered by bringing up a Qualcomm DragonBoard development kit from scratch, which let us pick up right where we left off and continue developing Rune.

Learning Through Failure

Our attempts at implementing a complicated multi-device communication scheme may have cost us time, but they afforded us the opportunity to take a deep dive into the inner workings of Debian Linux. From the protocols behind IoT communication to the nitty-gritty issues of dependency resolution, as a team we gained an immense amount of knowledge as we solved the problems which blocked us.

Architecture

Low-level communication is handled via the MQTT messaging protocol, with devices running on a shared wireless network. The DragonBoard subscribes to a topic using an instance of the Mosquitto client, where it receives messages pushed by the NodeMCU-IMU sensor combination.
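A minimal sketch of the subscriber side in Python, using the paho-mqtt package (the Python counterpart to the Mosquitto command-line client). The topic name, broker address, and JSON field names are illustrative assumptions, not the project's actual values:

```python
import json

TOPIC = "rune/imu"       # assumed topic name published by the NodeMCU
BROKER = "192.168.0.10"  # assumed broker address on the shared network

def parse_imu_payload(payload: bytes):
    """Flatten one JSON-encoded IMU reading into [ax, ay, az, gx, gy, gz]."""
    sample = json.loads(payload)
    return [float(sample[axis]) for axis in ("ax", "ay", "az", "gx", "gy", "gz")]

def on_message(client, userdata, msg):
    # Each message carries one IMU reading; real code would buffer
    # these into a fixed-length window before classification.
    print(parse_imu_payload(msg.payload))

def main():
    # Requires the paho-mqtt package: pip install paho-mqtt
    import paho.mqtt.client as mqtt
    client = mqtt.Client()
    client.on_message = on_message
    client.connect(BROKER)
    client.subscribe(TOPIC)
    client.loop_forever()

if __name__ == "__main__":
    main()
```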

A Python script parses received MQTT messages and feeds them into our machine learning model, which then determines what gesture was performed. This gesture is then free to be handled however is convenient for the user, e.g. submitted as a POST request to a web front-end.
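The classification step can be sketched as below. The window length, feature count, and gesture labels are placeholder assumptions; `predict_fn` stands in for the trained TensorFlow model's predict call, which is assumed to return per-class probabilities:

```python
import numpy as np

WINDOW = 50      # samples per gesture window (assumed)
N_FEATURES = 6   # accelerometer + gyroscope axes from the MPU-9250
GESTURES = [f"gesture_{i}" for i in range(10)]  # placeholder labels

def classify_window(samples, predict_fn):
    """Shape a window of IMU samples for the model and map its output
    to a gesture label via argmax over the class probabilities."""
    x = np.asarray(samples, dtype=np.float32).reshape(1, WINDOW, N_FEATURES)
    probs = predict_fn(x)
    return GESTURES[int(np.argmax(probs))]
```

In the real pipeline `predict_fn` would be the TensorFlow model (e.g. `model.predict`), and the returned label would then be forwarded, such as in a POST request body.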

License

This project is licensed under the terms of the MIT license.
