Story of Your Life
A VR-assisted system for exploring alternate future selves through immersive simulations, sensory feedback, and reflection.
Inspiration
People often have a “gut feeling” that their life should move in a different direction but struggle to articulate what that change might be.
Current tools such as journaling, personality tests, and career research rely on imagination and reflection. They rarely allow people to experience what a different life path might actually feel like.
At the same time, our bodies respond to situations before we consciously interpret them. Changes in physiological signals such as heart rate can indicate stress, excitement, or emotional engagement.
We designed this project to explore a simple idea:
What if people could experience possible futures and observe how their body reacts to them?
What it does
Story of Your Life allows users to explore possible future life scenarios through immersive simulations.
Users select a scenario such as:
- a different career
- moving to a new city
- a major life transition
The system places them in a simulated moment from that possible future.
During the experience, biometric signals such as heart rate are monitored to identify emotional responses.
After the simulation, users write reflections. The system visualizes biometric signals alongside the reflection timeline to highlight moments of strong response.
Over time, users can compare experiences and identify patterns in what environments or life paths resonate with them.
How we built it
We built the prototype by combining interaction design and generative AI.
Interaction Design
We designed the user flow in Figma, structuring the experience into three steps (sketched as a simple state flow after this list):
- Select a life scenario
- Experience the simulation
- Reflect on the experience
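As a rough sketch of this flow (the names below are hypothetical, not taken from our actual prototype), the three steps behave like a simple linear state machine:

```python
from enum import Enum, auto
from typing import Optional

class Step(Enum):
    SELECT_SCENARIO = auto()
    EXPERIENCE_SIMULATION = auto()
    REFLECT = auto()

# The flow is strictly linear: completing one step unlocks the next.
FLOW = [Step.SELECT_SCENARIO, Step.EXPERIENCE_SIMULATION, Step.REFLECT]

def next_step(current: Step) -> Optional[Step]:
    """Return the step after `current`, or None once reflection is done."""
    i = FLOW.index(current)
    return FLOW[i + 1] if i + 1 < len(FLOW) else None
```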
Generative AI Simulations
We used generative AI to quickly produce visual environments representing different life scenarios, including workplaces and living environments.
This allowed us to create immersive simulations without building full VR worlds.
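Our write-up doesn't name a specific model, but as a minimal sketch of how a scenario prompt could become an environment image, here is a call to OpenAI's image API; the prompt template and example scenario are illustrative assumptions:

```python
from openai import OpenAI

client = OpenAI()  # assumes an OPENAI_API_KEY environment variable is set

def generate_environment(scenario: str) -> str:
    """Render a first-person view of a moment from a possible future; returns an image URL."""
    prompt = (
        f"First-person view of a realistic moment from this possible future: {scenario}. "
        "Immersive, wide-angle, natural lighting."
    )
    response = client.images.generate(model="dall-e-3", prompt=prompt, size="1024x1024", n=1)
    return response.data[0].url

# generate_environment("working as a marine biologist at a coastal research station")
```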
Sensory Data Layer
We simulated biometric signals such as heart rate to represent physiological responses during the simulation.
These signals are displayed as a timeline so users can see when their emotional response changes.
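A minimal sketch of how such a mock trace can be produced; the baseline, event times, spike size, and noise level are illustrative assumptions rather than tuned values:

```python
import numpy as np

def simulate_heart_rate(duration_s=180, baseline_bpm=72.0, event_times=(45, 120), seed=0):
    """Simulate a 1 Hz heart-rate trace: baseline + noise + decaying spikes at emotional events."""
    rng = np.random.default_rng(seed)
    t = np.arange(duration_s)                             # one sample per second
    hr = baseline_bpm + rng.normal(0.0, 1.5, duration_s)  # small beat-to-beat noise
    for event in event_times:
        # Each event adds a ~15 bpm rise that decays over roughly 20 seconds.
        hr += 15.0 * np.exp(-np.maximum(t - event, 0) / 20.0) * (t >= event)
    return t, hr
```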
Reflection Interface
Users record thoughts after the experience. The system connects reflections with biometric signals to support interpretation.
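As a sketch of that connection (the reflection record format is a hypothetical assumption), each written reflection carries a timestamp and is paired with the surrounding window of heart-rate samples:

```python
import numpy as np

def signal_window(t, hr, reflection_time_s, half_width_s=10):
    """Return heart-rate samples within +/- half_width_s seconds of a reflection timestamp."""
    t, hr = np.asarray(t), np.asarray(hr)
    mask = (t >= reflection_time_s - half_width_s) & (t <= reflection_time_s + half_width_s)
    return t[mask], hr[mask]

# Usage with the simulated trace from the previous sketch:
# t, hr = simulate_heart_rate()
# for r in [{"time_s": 47, "text": "I felt surprisingly calm here."}]:
#     window_t, window_hr = signal_window(t, hr, r["time_s"])
```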
Challenges we ran into
One challenge was identifying which sensory signals are useful for reflection. Many physiological signals exist, but not all of them provide meaningful insight into emotional responses. We had to decide which signals, such as heart rate or stress indicators, could highlight moments of engagement without overinterpreting the data.
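One concrete way to stay conservative, sketched against the 1 Hz mock trace above (the two-standard-deviation threshold is an illustrative choice, not a validated one):

```python
import numpy as np

def strong_response_moments(hr, z_threshold=2.0):
    """Flag sample indices where heart rate deviates strongly from the session mean.

    A deliberately high z-score threshold keeps only clear deviations, so ordinary
    fluctuations are not overinterpreted as emotional responses.
    """
    hr = np.asarray(hr, dtype=float)
    z = (hr - hr.mean()) / hr.std()
    return np.flatnonzero(np.abs(z) >= z_threshold)
```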
Another challenge was building immersive environments within hackathon time limits. Using generative AI allowed us to rapidly create believable scenarios.
What we learned
Designing immersive experiences requires focusing on emotional engagement and sensory feedback, not just visual interfaces.
Biometric signals can help people reflect on their reactions, but they must be interpreted cautiously: a spike in heart rate alone does not reveal whether a moment was exciting or distressing.
We also learned that generative AI is effective for rapidly prototyping complex environments.
What's next
Future work includes:
- integrating real biometric sensors
- expanding the range of life simulations
- analyzing patterns across multiple simulations (sketched below)
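For the last point, a minimal sketch of what cross-simulation comparison could look like; the session record format is a hypothetical assumption:

```python
import numpy as np

def scenario_engagement(sessions):
    """Average heart-rate elevation above each session's baseline, grouped by scenario."""
    by_scenario = {}
    for s in sessions:  # each session: {"scenario": str, "hr": sequence, "baseline_bpm": float}
        elevation = float(np.mean(np.asarray(s["hr"]) - s["baseline_bpm"]))
        by_scenario.setdefault(s["scenario"], []).append(elevation)
    return {name: float(np.mean(vals)) for name, vals in by_scenario.items()}

# Scenarios that consistently elevate heart rate across sessions may merit deeper reflection.
```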
The goal is to help people explore possible futures while understanding their emotional responses to them.
Built With
- figma