Inspiration
When you are in imminent danger, you don’t have the luxury of unlocking your phone and calling a friend. You want them to come and help you with the wave of a finger. We wanted to build an app that lets users cast a “charm” to protect themselves, without having to interact with their phone directly.
What it does
Moco allows users to cast “charms”, which can be any user-specified event, such as waving your arms or shaking your wrist. These charms then trigger user-specified actions, such as sending a text to your friend with your location, or calling your friend to come pick you up.
By tapping into your phone or watch’s accelerometer, Moco can detect custom motion patterns. Once Moco detects these patterns, it triggers an action. Think of it as iPhone shortcuts, but you don’t even have to interact with your iPhone.
How we built it
Using iOS’s CoreMotion framework, we access the motion sensors of iOS devices, including the accelerometer and gyroscope. This data is logged and streamed in the background to our Flask web server.
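The server side of this stream could look roughly like the sketch below: a Flask endpoint that accepts batches of JSON-encoded sensor samples. The route name and payload fields here are assumptions for illustration, not Moco’s actual API.

```python
# Minimal sketch of a Flask endpoint receiving streamed motion samples.
# The "/motion" route and the sample fields (t, ax, ay, az) are assumed
# names, not Moco's real API.
from flask import Flask, request, jsonify

app = Flask(__name__)
samples = []  # in-memory buffer of (timestamp, ax, ay, az) tuples


@app.route("/motion", methods=["POST"])
def motion():
    payload = request.get_json()
    batch = payload.get("samples", [])
    for s in batch:
        samples.append((s["t"], s["ax"], s["ay"], s["az"]))
    return jsonify({"received": len(batch)})
```

In a real deployment the buffer would be per-user and bounded (e.g. a sliding window), rather than a global list.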
Under the hood, we implement two powerful algorithms for accurate charm-casting: 1) Iterative Closest Point (ICP): we take moving averages of our time series and apply ICP to correct for rotation and translation. 2) Dynamic Time Warping (DTW): we then compare the input stream against the stored references, warping to account for temporal variance.
Using these algorithms, we can detect whether a user’s movement pattern matches any of their previously specified charms. If there is a match, our backend dispatches a command to the user’s iOS Shortcuts to execute the specified action. We also built a web-app front end with React that allows users to visualize their charms, as well as input new custom charms and specify shortcut actions.
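The match step itself reduces to a nearest-neighbor lookup under a distance threshold. A hedged sketch, with the function name, charm dictionary, and threshold all assumed for illustration (the distance callable would be a DTW distance in practice):

```python
def best_charm(window, charms, distance, threshold):
    """Return the name of the closest matching charm, or None.

    window:    the incoming motion series
    charms:    dict mapping charm name -> reference motion series
    distance:  callable(series, series) -> float, e.g. a DTW distance
    threshold: maximum distance still accepted as a match
    """
    best_name, best_dist = None, float("inf")
    for name, ref in charms.items():
        d = distance(window, ref)
        if d < best_dist:
            best_name, best_dist = name, d
    # only fire a charm when the closest match is close enough
    return best_name if best_dist <= threshold else None
```

Returning `None` when nothing clears the threshold is what prevents ordinary motion from accidentally triggering an action.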
Challenges we ran into
We had some difficulty wrangling various iOS APIs. Matching a user’s live movements against a previously specified charm was also challenging.
Accomplishments that we're proud of
Firstly, we are happy that we’ve developed a novel technical solution to societal problems that currently lack good answers: people finding themselves in uncomfortable social situations, facing imminent danger, or simply having had too much to drink.
Next, we are proud of integrating various frameworks and technologies into our solution. We had to work with iOS APIs, Flask for our backend, and React for our frontend. We also had to implement a technically complex pattern-matching algorithm to compare two different user motions. This was complicated by the inherent noise in a device’s accelerometer and gyroscope, and we also wanted detection to succeed even when the user performs the motion in a slightly different manner. It turns out to be quite complex to determine how similar two 3-D motions are to each other.
What we learned
We learnt about the technical details of various iOS APIs and also the forms of motion data that iOS devices have access to. We learnt how to integrate various frontend and backend technologies into one cohesive project.
What's next for Moco
We envision Moco as a general app for all kinds of motion commands that grant people the power of telekinesis (almost). For example, harnessing the power of wearable tech, users will be able to turn on the lights with a simple gesture.
We also envision Moco as an app that empowers people with disabilities to benefit fully from modern technology. For example, a person who is blind would be able to perform various daily tasks by gesturing with their wearable devices instead.
With further development work, we believe we will be able to add many more gestures and actions that can revolutionize how people interact with their personal devices.