Inspiration:
As Floridians, we know firsthand how chaotic large-scale natural disasters can be, especially given the recent surge in hurricane activity. During these crises, essential questions become surprisingly difficult to answer: Where can we find stores with fresh water in stock? Which roads are safe to drive? How would we know if a neighbor needs critical assistance when emergency services are overwhelmed? When traditional telecommunication and emergency service networks are strained to their limits, communities must step up to support each other. This isn't just theoretical for us – one of our team members lost power for three weeks during Hurricane Irma. During those challenging weeks, he survived thanks to food and water provided by neighbors who came together to help. That powerful example of community support inspired us to develop Rescue Radar, a tool designed to help organize and strengthen these vital community efforts through up-to-date information sharing.
What it does
With a glance, Rescue Radar shows you what's happening in your neighborhood during a natural disaster through an interactive map. Every marker on the map represents real-time community reports, from road hazards to relief efforts. Simply tap any marker to see verified photos, detailed descriptions, and timestamps – helping you locate available resources, avoid dangerous areas, or find neighbors in need of assistance. Whether you're offering help or seeking it, Rescue Radar connects our community when it matters most.
How we built it
To make this application accessible to everyone, we leveraged React Native, which let us build both an iOS and an Android version of the app from a single codebase. We used Expo to build and test the application and to add key libraries such as react-native-maps, which renders our interactive map. On the server side we used ExpressJS, MongoDB, and Firebase: ExpressJS handles server-side routing through our application, MongoDB stores all report data except the images, and Firebase stores the images tied to our markers. IBM watsonx.ai generates a title for each report and semantically compares new reports against existing ones, ensuring that multiple reports for the same incident do not accumulate.
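The split described above – report metadata in MongoDB, image bytes in Firebase Storage – can be sketched as a small helper that shapes an incoming request into a database document. The field names here are assumptions for illustration, not our exact schema:

```javascript
// Shape a submitted report into the document we would insert into MongoDB.
// The client uploads the photo to Firebase Storage first and sends back only
// the resulting download URL, so MongoDB never stores image bytes.
function buildReport(body) {
  const { title, description, lat, lng, imageUrl } = body;
  if (typeof lat !== 'number' || typeof lng !== 'number') {
    throw new Error('lat/lng required');
  }
  return {
    title: title || 'Untitled report',
    description: description || '',
    location: { lat, lng },   // drives the map marker position
    imageUrl,                 // Firebase Storage download URL
    createdAt: new Date().toISOString(),
  };
}

// Inside an Express route this would be used roughly like:
//   app.post('/reports', async (req, res) => {
//     const report = buildReport(req.body);
//     await reports.insertOne(report);   // MongoDB collection
//     res.status(201).json(report);
//   });
```

Keeping images out of MongoDB keeps report documents small and lets the map fetch photos lazily, only when a marker is tapped.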
Challenges we ran into
Our user-report pipeline showcases sophisticated engineering under the hood. We successfully integrated the watsonx.ai API with our database infrastructure, achieving full functionality despite the technical hurdles we faced. The learning curve was steep: Firebase was new territory for us, and we had to learn many of the skills needed to implement it during the hackathon itself, so coordinating watsonx.ai API calls with the data needs of a live app was daunting at first. While our initial attempts hit many errors, we persevered through the debugging process. The end result is a robust system that seamlessly handles user reports, a long way from those first challenging hours of implementation.
Accomplishments that we're proud of
We understood that for this tool to be truly helpful during a natural disaster, it must handle many simultaneous reports without allowing duplicate reports of the same incident. Our pipeline takes a new report's location data and checks it against reports already in our database. If multiple reports share a location and have similar descriptions, our AI call classifies them as either distinct reports or the same report. As a full-stack project, we're particularly proud of how we balanced sophisticated functionality with the practical limitations of disaster scenarios.
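The duplicate check described above can be sketched as a two-part filter: a geographic proximity test plus a description-similarity test. The 100 m threshold is illustrative, and the word-overlap score below is only a simple stand-in for the semantic comparison the app delegates to watsonx.ai:

```javascript
const EARTH_RADIUS_M = 6371000;

// Great-circle distance between two lat/lng points (haversine formula).
function distanceMeters(a, b) {
  const toRad = (d) => (d * Math.PI) / 180;
  const dLat = toRad(b.lat - a.lat);
  const dLng = toRad(b.lng - a.lng);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(a.lat)) * Math.cos(toRad(b.lat)) * Math.sin(dLng / 2) ** 2;
  return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(h));
}

// Jaccard overlap of description words — a crude local proxy for the LLM's
// semantic classification, used here only to make the sketch self-contained.
function descriptionSimilarity(d1, d2) {
  const words = (s) => new Set(s.toLowerCase().split(/\W+/).filter(Boolean));
  const w1 = words(d1), w2 = words(d2);
  const inter = [...w1].filter((w) => w2.has(w)).length;
  const union = new Set([...w1, ...w2]).size;
  return union === 0 ? 0 : inter / union;
}

// A new report is a likely duplicate if it is both close to and worded
// similarly to an existing one; borderline cases go to the AI call.
function isLikelyDuplicate(newReport, existing) {
  return (
    distanceMeters(newReport.location, existing.location) < 100 &&
    descriptionSimilarity(newReport.description, existing.description) > 0.5
  );
}
```

Pre-filtering by distance keeps the expensive AI comparison limited to reports that could plausibly describe the same incident.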
What we learned
Our team had a wide range of experience: some members were first-time hackathon competitors, while others had significant experience in particular aspects of app development. It was only by combining those strengths that we could complete a full-stack project like this one.
Our team was fairly familiar with full-stack development and building apps with Expo. The main piece that was unfamiliar to us was IBM's LLM tooling. We learned how to pass data from our own database into a model and then turn that model's response back into data usable by our database and user reports.
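That round trip – database records in, structured data out – can be sketched as a pair of helpers. The prompt wording and the JSON contract here are assumptions for illustration, not the exact prompts we send to IBM's models:

```javascript
// Turn a new report plus nearby database records into a classification prompt.
function buildClassificationPrompt(newReport, nearbyReports) {
  const lines = nearbyReports
    .map((r, i) => `${i + 1}. ${r.description}`)
    .join('\n');
  return (
    'A new disaster report was submitted:\n' +
    `"${newReport.description}"\n\n` +
    'Existing nearby reports:\n' +
    `${lines}\n\n` +
    'Reply with JSON like {"duplicateOf": 2} if the new report describes ' +
    'the same incident as one listed, or {"duplicateOf": null} otherwise.'
  );
}

// Parse the model's text reply defensively — LLM output is not guaranteed
// to be clean JSON, so fall back to "not a duplicate" on any failure.
function parseClassificationReply(text) {
  try {
    const match = text.match(/\{[^}]*\}/);
    const parsed = JSON.parse(match ? match[0] : text);
    return Number.isInteger(parsed.duplicateOf) ? parsed.duplicateOf : null;
  } catch {
    return null;
  }
}
```

Treating the model's reply as untrusted text and extracting only the JSON fragment kept one malformed response from ever corrupting the database.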
Collaboration statement
Our team brought together a wide range of skills and experiences, including four computer science majors and a math major, creating a space for cross-disciplinary collaboration. With more than half of the team—Raymon, CJ, and Gabriel—participating in their first hackathon, we focused on building an inclusive and supportive environment where everyone contributed actively. Each member brought their unique strengths: CJ handled backend development, setting up the database and routing; Saurabh developed the project's front end; Gabriel crafted the prompting for the AI model; and Raymon integrated the AI model into the project. John, our math major, played a key role in developing the necessary mathematical operations. While each member had a primary focus, collaboration was central to every aspect of the project. From integrating the AI model to refining the user interface and ensuring smooth database performance, we worked together, sharing knowledge and offering feedback to create a cohesive and functional project. This teamwork kept everyone engaged and allowed us to harness our diverse skills, resulting in a well-rounded and successful outcome.
What's next for our Hackathon Team:
Rescue Radar has a lot of growth potential. Right now it's a relatively simple real-time reporting map, and there is a lot to implement before it is truly ready for public distribution.
Future functionality:
- AI sanitization of all user reports
- Automatic deletion of posts after an expiration time
- AI verification of report images and descriptions
- General bug fixes
Built With
- expo.io
- express.js
- firebase
- ibm-watson
- mistral
- mongodb
- node.js
- react-native
- react-native-maps
- terraform