Inspiration
Providing basic aid, such as CPR, can be critical in life-or-death situations. Unfortunately, many people are not equipped with the basic skills to deliver this essential care when it's needed most.
What it does
XR Aid provides step-by-step CPR instructions by rendering an overlay on the injured person and visualizing exactly where to perform chest compressions. We then "stream" the wearer's perspective to an emergency responder, who can give further feedback on how to provide aid where the technology cannot.
How we built it
User Providing the Aid
- Using Snap Spectacles' Lens Studio, we built a mixed-reality experience in TypeScript/JavaScript, backed by calls to several APIs, that guides the user through performing CPR.
On the Emergency Operator's Side
- Created a relay server: a Node.js server using Express and WebSocket to receive frames from the Spectacles and forward them to connected clients.
- Built a React (JS/HTML) web client that connects to the relay server and displays the live stream in the browser.
- Used the Perplexity API (JavaScript) to generate textual step-by-step instructions grounded in the Red Cross's CPR guidance. The steps are visualized one at a time in a UI rendered on the Snap Spectacles. When the operator gives advice, they can type it into a chat box, and it is turned into textual instructions in the mixed-reality UI display.
- Used T-Mobile's Network Slicing and Quality-of-Service on Demand (QoD) to reduce latency and maintain stable bandwidth.
- Used T-Mobile's Location Retrieval API to send the Spectacles wearer's location to the EMS operator.
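To illustrate the operator-to-headset instruction flow, here is a minimal sketch of how a chat-box message could be packaged for the Perplexity chat-completions API. The model name and system prompt are assumptions for illustration, not the project's exact values:

```javascript
// Turn an EMS operator's chat-box message into a Perplexity
// chat-completions request body (hypothetical model/prompt values).
function buildInstructionRequest(operatorMessage) {
  return {
    model: "sonar", // assumed model name
    messages: [
      {
        role: "system",
        content:
          "You are assisting a bystander performing CPR. Rewrite the " +
          "operator's advice as one short instruction consistent with " +
          "Red Cross guidance.",
      },
      { role: "user", content: operatorMessage },
    ],
  };
}
```

The resulting payload would be POSTed to Perplexity's chat-completions endpoint with an `Authorization: Bearer <API key>` header, and the returned text displayed as the next step in the Spectacles UI.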
Challenges we ran into
- Figuring out how to work around the Spectacles' limited developer resources and documentation.
- Creating a relay server to receive frames from the Spectacles and display them to a connected client.
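The core of that relay is a fan-out: frames pushed from the Spectacles are broadcast to every connected viewer. A minimal sketch of that routing logic, with the Express/WebSocket transport abstracted away (function names here are hypothetical, not the project's actual code):

```javascript
// Minimal relay fan-out: one frame source (the Spectacles), many viewers
// (EMS operator web clients). In the real app, `send` would wrap a
// WebSocket connection's send() method.
function createRelay() {
  const viewers = new Set();
  return {
    // Register a viewer; returns an unsubscribe handle for disconnects.
    addViewer(send) {
      viewers.add(send);
      return () => viewers.delete(send);
    },
    // Push one frame (e.g. a JPEG buffer) to every connected viewer;
    // returns how many viewers received it.
    pushFrame(frame) {
      for (const send of viewers) send(frame);
      return viewers.size;
    },
  };
}
```

In the actual server, a WebSocket upgrade handler would call `addViewer` when an operator connects and `pushFrame` whenever the Spectacles deliver a frame.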
Accomplishments that we're proud of
- Created an augmented reality (AR) visual that instructs how to perform CPR on an unconscious individual.
- Integrated Perplexity's API to provide CPR instructions for the AR visual.
- Integrated T-Mobile's Network Slicing, Quality-of-Service on Demand, and Location Retrieval APIs to reduce latency and provide the Spectacles wearer's location to EMS.
What we learned
- How to develop AR visual components for the Spectacles via Lens Studio.
What's next for XR Aid
- Identify other bodily injuries and provide instructions on how to render immediate aid before EMS arrives.
Built With
- api
- ar
- express.js
- javascript
- node.js
- perplexity
- react
- react-native
- server
- spectacles
- typescript
- vr