Inspiration

I wanted to build something with ARKit.

What it does

Tracks your facial expressions and translates them into emoji. You can record the video and post it to your Snapchat Story.

How I built it

ARKit gives me data about specific facial expressions (its face-tracking blend shapes), which I map to emoji: when the value for an expression passes a certain threshold, the matching emoji is shown.
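The threshold idea can be sketched like this with ARKit's `ARFaceAnchor.blendShapes` dictionary, whose values range from 0.0 to 1.0 per expression. Names like `emojiLabel`, the chosen blend shapes, and the 0.5 cutoff are illustrative assumptions, not the app's actual values:

```swift
import ARKit
import UIKit

class FaceEmojiViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!
    let emojiLabel = UILabel()  // hypothetical label showing the current emoji

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        sceneView.delegate = self
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    // Called whenever ARKit updates the tracked face anchor.
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor else { return }
        let shapes = faceAnchor.blendShapes

        // Each blend-shape value is an NSNumber in [0, 1];
        // 0.5 is a hypothetical threshold.
        let smile = shapes[.mouthSmileLeft]?.floatValue ?? 0
        let jawOpen = shapes[.jawOpen]?.floatValue ?? 0

        DispatchQueue.main.async {
            if jawOpen > 0.5 {
                self.emojiLabel.text = "😮"
            } else if smile > 0.5 {
                self.emojiLabel.text = "😄"
            } else {
                self.emojiLabel.text = "😐"
            }
        }
    }
}
```

Requires a device with a TrueDepth camera, since `ARFaceTrackingConfiguration` is not supported elsewhere.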

Challenges I ran into

ReplayKit cost me hours to "fix". It turned out to be a system bug, and I simply had to restart my iPhone... I couldn't believe the solution posted on Stack Overflow was real and that simple.
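For context, in-app recording with ReplayKit boils down to very little code, which is why a silent system-level failure is so confusing. A minimal sketch (the function names and presenting view controller are mine):

```swift
import ReplayKit
import UIKit

// Start capturing the app's screen; ReplayKit shows its own permission prompt.
func startRecording() {
    RPScreenRecorder.shared().startRecording { error in
        if let error = error {
            print("Could not start recording: \(error.localizedDescription)")
        }
    }
}

// Stop capturing and present the system preview so the user can trim and share.
func stopRecording(from presenter: UIViewController) {
    RPScreenRecorder.shared().stopRecording { previewController, error in
        if let preview = previewController {
            presenter.present(preview, animated: true)
        } else if let error = error {
            print("Could not stop recording: \(error.localizedDescription)")
        }
    }
}
```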

SnapKit also kind of broke along the way...
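Sharing the recorded clip to a Snapchat Story goes through Snap Kit's Creative Kit. The class and method names below are from memory of that SDK and may differ between Snap Kit versions, so treat this as a rough, unverified sketch:

```swift
import SCSDKCreativeKit

// Hand a recorded video off to Snapchat; the user confirms posting there.
func shareToSnapchat(videoURL: URL) {
    let video = SCSDKSnapVideo(videoUrl: videoURL)
    let content = SCSDKVideoSnapContent(snapVideo: video)
    let api = SCSDKSnapAPI()
    api.startSending(content) { error in
        if let error = error {
            print("Sharing failed: \(error.localizedDescription)")
        }
    }
}
```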

Accomplishments that I'm proud of

Facial recognition and displaying a fitting emoji, as well as recording and sharing, all work.

What I learned

Sometimes it's just Apple's fault when your code fails. Also, they seem to like changing Swift a lot...

What's next for snapMoji

Cut it down from a full app to an iMessage extension or something similar.
