Astro ASL

Inspiration

We were inspired by the space theme and the fact that there's no sound in space. That gave us the idea to use computer vision and ASL to help people communicate in silent environments, not just in space, but anywhere sound doesn't carry.

What it does

Astro ASL is an AI-powered American Sign Language (ASL) transcriber: it recognizes hand signs from a camera feed and displays the transcription in real time.

How we built it

We used the Fullyhacks Flask template for the website and kept building on it along the way, mainly with HTML. OpenCV decodes each incoming frame, and MediaPipe tracks and isolates the hand region. The cropped hand image is then passed to our classification model for ASL prediction, and the result is sent back and displayed in real time.
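The hand-isolation step above can be sketched as follows. This is a minimal illustration, not our exact code: it assumes MediaPipe Hands has already returned normalized landmark coordinates for one hand, and shows how those landmarks can be turned into a padded pixel bounding box for cropping before classification. The function name and margin value are our own for this example.

```python
def hand_bbox(landmarks, img_w, img_h, margin=0.2):
    """Compute a padded pixel bounding box around a detected hand.

    `landmarks` is a list of (x, y) pairs normalized to [0, 1], as
    MediaPipe Hands reports them. The box is expanded by `margin` of
    its own size so fingertips are not clipped, then clamped to the
    image bounds. The resulting crop is what gets fed to the model.
    """
    xs = [x for x, _ in landmarks]
    ys = [y for _, y in landmarks]
    pad_x = (max(xs) - min(xs)) * margin
    pad_y = (max(ys) - min(ys)) * margin
    x0 = max(0, int((min(xs) - pad_x) * img_w))
    y0 = max(0, int((min(ys) - pad_y) * img_h))
    x1 = min(img_w, int((max(xs) + pad_x) * img_w))
    y1 = min(img_h, int((max(ys) + pad_y) * img_h))
    return x0, y0, x1, y1
```

In the real pipeline the frame would be sliced with this box (e.g. `frame[y0:y1, x0:x1]`), resized to the model's input shape, and classified.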

Challenges we ran into

  • Learning how to sample data and train the model
  • Setting up the development environment
  • The HTML background kept disappearing

Accomplishments that we're proud of

  • We were able to troubleshoot a lot of problems
  • Improved prediction model accuracy from 14% to 95%

What we learned

  • Neural network basics
  • Flask and web development
  • Git
  • Version control
  • Figma
  • Communication, teamwork, and leadership

What's next for Astro ASL

Train a better model for more accurate results and for recognizing more signs than just the alphabet. Possibly add login, user authentication, and a database so the website can save previously transcribed sentences. Eventually deploy the website, since it currently only runs locally.
