Inspiration
Every year, around 500,000 people are diagnosed with melanoma, the deadliest form of skin cancer. Melanoma is difficult for primary care physicians to detect, and cases are frequently missed. When melanoma is not caught at an early stage, mortality rates rise significantly.
What it does
We built a mobile app that primary care providers can easily use. All a physician needs to do is take a picture of the mole; the app sends a request to the server, where the picture is evaluated, and the physician sees the probability of melanoma along with the image regions that influenced the decision.
How we built it
We built a deep learning model that estimates the probability of melanoma from a picture. We tested pre-trained models such as ResNet and EfficientNet and fine-tuned them on the SIIM-ISIC dataset. To visualize the regions that influence the prediction, we used Grad-CAM. Our best model achieves 81% accuracy on the validation set.
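The core of the Grad-CAM step can be sketched in a few lines of NumPy: average the gradients of the melanoma score over each feature map to get per-channel weights, take the weighted sum of the activation maps, and apply ReLU. This is a minimal illustration of the technique, not our production code; the function name and toy shapes below are our own.

```python
import numpy as np

def grad_cam(activations: np.ndarray, gradients: np.ndarray) -> np.ndarray:
    """Compute a Grad-CAM heatmap from the last conv layer's
    activations (C, H, W) and the gradients of the melanoma
    score with respect to those activations (same shape)."""
    # Global-average-pool the gradients: one importance weight per channel.
    weights = gradients.mean(axis=(1, 2))  # shape (C,)
    # Weighted sum of activation maps, then ReLU to keep only
    # regions that push the melanoma score up.
    cam = np.maximum((weights[:, None, None] * activations).sum(axis=0), 0)
    # Normalize to [0, 1] so the map can be overlaid on the photo.
    if cam.max() > 0:
        cam /= cam.max()
    return cam

# Tiny toy example: 2 channels of 4x4 feature maps.
acts = np.random.rand(2, 4, 4)
grads = np.random.rand(2, 4, 4)
heatmap = grad_cam(acts, grads)
print(heatmap.shape)
```

In the real pipeline the activations and gradients come from a backward pass through the fine-tuned network, and the heatmap is upsampled to the input image size before being overlaid on the mole photo.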
We deployed this model to the Azure cloud so that the mobile app can communicate with it. The backend was built with Flask in Python.
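As a rough sketch, the backend exposes a single prediction endpoint that accepts an uploaded photo and returns the melanoma probability as JSON. The route name, field name, and the stubbed `predict_melanoma` function below are illustrative assumptions, not our deployed code.

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

def predict_melanoma(image_bytes: bytes) -> float:
    # Placeholder: the deployed service runs the fine-tuned
    # deep learning model here instead of a fixed value.
    return 0.5

@app.route("/predict", methods=["POST"])
def predict():
    # The mobile app uploads the mole photo as a multipart form file.
    file = request.files.get("image")
    if file is None:
        return jsonify({"error": "no image provided"}), 400
    probability = predict_melanoma(file.read())
    # The real service also returns the Grad-CAM overlay image.
    return jsonify({"melanoma_probability": probability})
```

The iOS app then only needs to POST multipart form data to this endpoint and parse the JSON response.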
So far, the mobile app has been developed for iOS in Swift. It lets the physician take a picture or select one from the gallery, upload it to the cloud, and retrieve the melanoma probability along with a modified image highlighting the impact regions.
Challenges we ran into
The dataset was huge (100 GB, 30 GB of which are images); even downloading that much data is difficult. There were also obvious time constraints: training large deep learning models in a short period limits how much you can experiment, but the end result was very satisfying given the 48-hour limit.
Accomplishments that we're proud of
We are honored to participate in the Medtech Challenge. It's fantastic to have the chance to build something that can save people's lives. We are proud of our deep learning model, which makes fairly accurate predictions, and of our mobile app, with its clean design and simple user interface.
What we learned
Two of our team members participated in the Makeathon on-site, while the other two participated online. We grew our technical skills on this project. Using Azure was especially beneficial: we discovered it offers a wealth of valuable features. We also learned about the importance of applying deep learning in medicine and building tools that help doctors make better decisions.
What's next for SkinSkanner
We plan to keep improving the deep learning model and to incorporate Explainable AI techniques so that doctors can learn more about the AI's decisions. We also plan to add patient management and automatic data storage on the server, so that doctors can view the history of their predictions and share them with other doctors.