Inspiration

It was a cold Sunday morning in Rochester when Team Storks met with the good doctor at Strong Memorial Hospital. Dr. Li needed a crack development team to build an app that could detect a stroke. So what were the three handsome young undergrads to do but take up the challenge at DandyHacks 2017!

What it does

We're building an iOS app that uses several metrics to determine whether a person is having a stroke. The app is designed for a caregiver with no prior medical experience. Our first feature takes two images of the eye, one before and one after a light is shined into it, and tests whether the pupil dilated. Our second feature uses 3D Touch on iOS to measure how much grip force the subject can exert with each hand. Both are common metrics for determining whether a patient is having a stroke. Based on the results, the app gives its recommendation and, if warranted, advises the user to seek medical attention.
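The final recommendation step combines the two metrics. The function name, threshold value, and return strings below are hypothetical, not the app's actual values; this is just a sketch of how the metrics might feed into the advice shown to the caregiver:

```python
def recommend(pupil_responded: bool, left_grip: float, right_grip: float) -> str:
    """Combine the two stroke metrics into a recommendation string.

    pupil_responded -- True if the pupil visibly changed size in the light test
    left_grip, right_grip -- normalized 3D Touch force readings in [0, 1]
    """
    GRIP_ASYMMETRY_THRESHOLD = 0.3  # hypothetical cutoff, for illustration only

    # A large difference in grip strength between hands is a classic
    # one-sided-weakness sign; a non-responsive pupil is another red flag.
    asymmetric_grip = abs(left_grip - right_grip) > GRIP_ASYMMETRY_THRESHOLD
    if not pupil_responded or asymmetric_grip:
        return "Possible stroke indicators detected - seek medical attention immediately."
    return "No stroke indicators detected."
```

Either metric alone is enough to trigger the warning, since a false negative is far more costly here than a false positive.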

How we built it

We split our development into three stages: front end, back end, and communication. Our front end was written by Adam Rosenstein, a senior at the UR. He used Xcode and Swift to build the app and its functionality. Our back end was written by Dominic Giambra, a junior at the UR. He used Python to write the image processing functions. The communication between the two was handled by Unni Kurumbail, a junior at the UR. He used AWS to host the back end and wrote a REST API in CherryPy.

The app itself contains much of the functionality for the survey and pressure sensor, as both are relatively simple to implement in Swift. The image processing is hosted entirely on our AWS EC2 back end; the app sends the images to the server and receives the result of the test.
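The app-to-server round trip boils down to packaging the two eye photos into a request body and unpacking them on the other side. The field names and encoding below are hypothetical (the real payload format may differ), but this sketches the contract between the Swift front end and the Python back end:

```python
import base64
import json


def build_dilation_request(before_jpeg: bytes, after_jpeg: bytes) -> str:
    """Client side: package the two eye photos as a JSON body for a POST
    to the back end. Raw image bytes are base64-encoded so they survive
    transport inside a JSON string."""
    return json.dumps({
        "before": base64.b64encode(before_jpeg).decode("ascii"),
        "after": base64.b64encode(after_jpeg).decode("ascii"),
    })


def parse_dilation_request(body: str):
    """Server side: recover the raw image bytes from the JSON body,
    ready to hand to the image processing code."""
    data = json.loads(body)
    return (base64.b64decode(data["before"]),
            base64.b64decode(data["after"]))
```

On the server, a CherryPy handler would receive the POST body, call something like `parse_dilation_request`, run the dilation check, and return the result as JSON.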

Challenges we ran into

This was a very difficult project, especially for three Chemical Engineering majors: none of us had ever worked with Swift, AWS, REST APIs, CherryPy, or image processing. Because of the precision needed to measure pupil dilation, standard facial-recognition functions weren't accurate enough, so Dominic had to devise his own algorithm to determine whether the pupil had dilated between images. Unni spent quite a bit of time setting up our AWS EC2 machine, as well as learning CherryPy from scratch and figuring out how to pull information from the front end. Another issue we stumbled into: Dominic has absolutely zero ability to spell on so little sleep, so Unni struggled for half an hour to figure out why he couldn't get Dominic's back-end scripts working. It turned out Dominic had misspelled the word "Dilated" in a call as "Dialated".

What we learned

We are very proud of how much we learned. Coming into this with little relevant experience, we taught ourselves app development and some key networking concepts. We now understand much more about these processes and picked up some truly valuable skills. We also had to learn what characterizes a stroke victim and which metrics to implement, so some medical research was required during the 36 hours as well.

What's next for Stroke Detector

We need to polish the app's appearance and improve the sensitivity of our pupil measurements.
