Inspiration
We wanted to create an app that directly impacts the lives of visually impaired people, so we considered which everyday tasks would be hardest to do without sight. Once we realized that tasks like grocery shopping and exercising could become very difficult, we decided to help solve this problem.
What it does
This app allows users to take photos of a food item's nutrition label. It then tells the user whether that item is healthy and reads out its fat, protein, sodium, carbohydrate, and cholesterol levels. The text on nutrition labels can be very small, so this part of the app helps users decide whether to buy a food product without straining their eyes. The app also lets users choose from a list of 20 exercises and perform one. The app then gives the user feedback on their form and tells them how to improve. Typically, someone who wants to check whether they are doing an exercise correctly would search for it on YouTube or Google and watch how others do it. That is difficult for visually impaired people, who may not be able to see exactly how the exercise is performed. This part of the app solves that problem by telling users exactly how to improve.
How we built it
We built this app using Swift in Xcode. We used AVFoundation, specifically AVSpeechSynthesizer and AVAudioSession, for the text-to-speech functionality. We also used the Vision framework, specifically VNDocumentCameraViewController, VNRecognizeTextRequest, and VNDetectHumanBodyPoseRequest, for the image analysis. Finally, we used the Speech framework for speech recognition.
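The label-reading pipeline can be sketched by combining two of these pieces: a VNRecognizeTextRequest extracts the label text, and an AVSpeechSynthesizer speaks it. This is a minimal illustration, not our exact code; the helper name and its wiring into the camera flow are assumptions.

```swift
import Foundation
import Vision
import AVFoundation

let synthesizer = AVSpeechSynthesizer()

// Hypothetical helper: recognize text in a scanned label image and read it aloud.
func readNutritionLabel(from image: CGImage) {
    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
        // Take the top candidate string from each detected text region.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        let utterance = AVSpeechUtterance(string: lines.joined(separator: ". "))
        synthesizer.speak(utterance)
    }
    // Small label print benefits from the slower, more accurate recognition mode.
    request.recognitionLevel = .accurate

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```

In the app, the CGImage would come from the pages returned by VNDocumentCameraViewController.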
Challenges we ran into
It was very time-consuming to write a separate analysis function for each exercise, because each exercise has its own set of joint-angle positions that we needed to test and tune.
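The per-exercise checks all reduce to measuring angles between detected joints. Below is a minimal sketch of that idea, assuming the elbow angle is the one being checked; the function names and the 0.3 confidence threshold are illustrative choices, not our exact implementation.

```swift
import Foundation
import Vision

// Angle (in degrees) at a middle joint, e.g. the elbow angle formed by
// the shoulder–elbow–wrist points of a body-pose observation.
func jointAngle(_ a: CGPoint, _ vertex: CGPoint, _ c: CGPoint) -> CGFloat {
    let angle = atan2(c.y - vertex.y, c.x - vertex.x)
              - atan2(a.y - vertex.y, a.x - vertex.x)
    var degrees = abs(angle * 180 / .pi)
    if degrees > 180 { degrees = 360 - degrees }
    return degrees
}

// Hypothetical per-exercise check: extract the right-arm joints from a
// VNDetectHumanBodyPoseRequest result and compute the elbow angle.
func rightElbowAngle(from observation: VNHumanBodyPoseObservation) throws -> CGFloat? {
    let points = try observation.recognizedPoints(.rightArm)
    guard let shoulder = points[.rightShoulder],
          let elbow = points[.rightElbow],
          let wrist = points[.rightWrist],
          min(shoulder.confidence, elbow.confidence, wrist.confidence) > 0.3
    else { return nil }  // skip frames where a joint was not detected reliably
    return jointAngle(shoulder.location, elbow.location, wrist.location)
}
```

Each exercise then compares angles like this one against its own target ranges (for example, a nearly straight elbow at the top of a push-up) to generate spoken feedback.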
Accomplishments that we're proud of
It was our first time using AVFoundation, and we're really proud that we made a real-time app that will be helpful for visually impaired people.
What we learned
We learned a lot about Xcode, AVFoundation, the Vision framework, and text-to-speech technology.
What's next for NutriSight
We hope to release it on the App Store to help visually impaired people.
Built With
- avfoundation
- speech
- swift
- vision
- xcode