Inspiration
Keeping track of medications, from throwing out expired ones to renewing prescriptions, is a hassle for anyone, and it is even more of a challenge for people who have trouble reading the tiny text on the labels. We designed Blake to help visually impaired people interpret and track all of their medications in a way that is convenient for them.
What it does
We created a system that helps people with visual impairments track and interpret their prescriptions. When users pick up a new prescription, they can scan it with our mobile app, which automatically identifies and saves the drug name, directions for use, and duration (in days) of the prescription. Our database keeps track of current prescriptions, decrementing the remaining duration each day. Each day, the user can ask Alexa for an overview of that day's prescriptions (a list of active prescriptions and a list of prescriptions with fewer than three days left in their duration), or for the directions for use of a specific drug they are currently taking. The verbal communication between the user and Alexa, along with the mobile app's ability to automatically interpret prescriptions, especially aids users with visual impairments.
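The daily bookkeeping described above can be sketched roughly as follows. This is a minimal illustration, not Blake's actual implementation (which stores prescriptions in MySQL); the class and field names here are assumptions.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of the prescription bookkeeping described above.
// Names are hypothetical, not Blake's actual schema.
class Prescription {
    String drugName;
    String directions;
    int daysRemaining;

    Prescription(String drugName, String directions, int daysRemaining) {
        this.drugName = drugName;
        this.directions = directions;
        this.daysRemaining = daysRemaining;
    }
}

class PrescriptionTracker {
    List<Prescription> prescriptions = new ArrayList<>();

    // Run once per day: decrement each active prescription's remaining duration.
    void dailyTick() {
        for (Prescription p : prescriptions) {
            if (p.daysRemaining > 0) p.daysRemaining--;
        }
    }

    // Prescriptions that the daily overview would flag as running low
    // (fewer than three days left).
    List<Prescription> runningLow() {
        List<Prescription> low = new ArrayList<>();
        for (Prescription p : prescriptions) {
            if (p.daysRemaining > 0 && p.daysRemaining < 3) low.add(p);
        }
        return low;
    }
}
```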
How we built it
We used Android Studio to build the mobile app, which uses the phone's camera to scan labels and OCR to recognize the text. We wrote our own algorithm to process the detected text and passed the information to our .php scripts, which talked to a MySQL database hosted on Microsoft Azure. Amazon Alexa can also relay information (such as the directions for each drug and which drugs are low on supply) to our visually impaired users. We used Adobe Photoshop and Illustrator for the front-end design. When designing the user interface, we aimed for a clean and clear layout so that users with various visual impairments can navigate it easily.
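As a rough illustration of the kind of text processing involved, the sketch below pulls two fields out of OCR output. The regular expressions assume one common pharmacy label layout; our actual algorithm handles more formats and is not shown here.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical sketch of extracting structured fields from OCR'd label text.
// The patterns assume a label like "QTY: 30" and "TAKE 1 CAPSULE DAILY".
class LabelParser {
    // Grab the directions-for-use sentence, which typically starts with "TAKE".
    static String extractDirections(String ocrText) {
        Matcher m = Pattern.compile("(?i)(take .+?)(?:\\.|\\n|$)").matcher(ocrText);
        return m.find() ? m.group(1).trim() : "";
    }

    // For a once-daily dose, a quantity of N tablets lasts N days.
    static int extractDurationDays(String ocrText) {
        Matcher qty = Pattern.compile("(?i)qty:?\\s*(\\d+)").matcher(ocrText);
        return qty.find() ? Integer.parseInt(qty.group(1)) : 0;
    }
}
```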
Challenges we ran into
As first-time hackers, we ran into many problems and bugs we had never encountered before: learning a new IDE (Android Studio), working within the limitations of Alexa skills (e.g., the inability to initiate conversations), accounting for the variety of formats prescriptions come in, and making our code run efficiently.
What's next for Blake
In the future, we would like Blake to be able to process a wider variety of prescriptions. More broadly, we hope to expand Blake's use of verbal communication to improve the lives of the visually impaired.