Inspiration

We wanted to make an app that was interactive, fun, and ultimately useful. We wanted to bring Morse code from the days of the antiquated telegraph into today's technology. That's why we decided to make MorseTalk, an app that merges computer vision with Morse code.

What it does

MorseTalk uses Vision and CoreML to recognize different hand gestures and convert them to Morse code, all in a seamless and intuitive user interface. It takes an age-old communication method and combines it with today's cutting-edge technology to create a unique and useful user experience.
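Once a gesture is classified as a dot or a dash, turning the resulting sequence back into text is a simple lookup. As a rough sketch of the idea (the table and helper below are illustrative, not MorseTalk's actual code, and the table is abbreviated):

```swift
// Abbreviated Morse table for illustration; a real app would cover A–Z and 0–9.
let morseTable: [Character: String] = [
    "a": ".-", "b": "-...", "c": "-.-.",
    "e": ".", "o": "---", "s": "...", "t": "-"
]

/// Hypothetical helper: encodes text as space-separated Morse symbols.
func encode(_ text: String) -> String {
    text.lowercased().compactMap { morseTable[$0] }.joined(separator: " ")
}

// encode("SOS") == "... --- ..."
```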

How we built it

We started by designing and writing our app in SwiftUI, leveraging the powerful capabilities of CoreML combined with Vision to detect hand poses.
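The Vision side of this pipeline centers on `VNDetectHumanHandPoseRequest`, which returns the positions of individual hand joints per frame. A minimal sketch of what that detection step might look like (the function name and confidence threshold here are illustrative assumptions, not MorseTalk's actual implementation):

```swift
import Vision

// Detect at most one hand per frame to keep classification simple.
let handPoseRequest = VNDetectHumanHandPoseRequest()

func detectHandPose(in pixelBuffer: CVPixelBuffer) {
    handPoseRequest.maximumHandCount = 1
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer,
                                        orientation: .up,
                                        options: [:])
    do {
        try handler.perform([handPoseRequest])
        guard let observation = handPoseRequest.results?.first else { return }

        // Pull out individual joint positions, e.g. thumb and index fingertips.
        let thumbTip = try observation.recognizedPoint(.thumbTip)
        let indexTip = try observation.recognizedPoint(.indexTip)

        // Ignore low-confidence detections (threshold chosen arbitrarily here).
        guard thumbTip.confidence > 0.3, indexTip.confidence > 0.3 else { return }

        // The joint coordinates would then be fed to a CoreML classifier
        // that maps the hand pose to a Morse symbol (dot or dash).
    } catch {
        // Handle or log the Vision error.
    }
}
```

In a camera-driven app, this would typically run inside an `AVCaptureVideoDataOutput` delegate callback, once per frame.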

Everyone on our team was new to SwiftUI when we started this project. It has been a great learning experience, and it was fulfilling to finally see our app come to fruition.

What's next for MorseTalk

MorseTalk is already published on the App Store, and this is a huge achievement for us! We plan to incorporate more features and polish the app even further in the coming months. Here's a sneak peek:

  • SharePlay Support
  • More Customization
  • Convert Morse to Sounds
