Inspiration
In the U.S., nearly 90,000 people are diagnosed with Parkinson's disease annually, a figure roughly 50% higher than earlier estimates. Of those diagnosed, 60% experience anxiety. Changes in brain neurochemistry, fear of social abandonment due to physical weakness, and cognitive decline cause Parkinson's patients to experience chronic anxiety and isolation while living with the disease.
Studies have shown that walking meditation can decrease anxiety levels by at least 37%. However, loud environmental sounds easily disrupt the meditation process, and cluttered surroundings with limited room to move diminish its benefits. Furthermore, individuals with Parkinson's are easily startled by such noise, which increases the risk of collisions with obstacles and heightens anxiety and panic.
What it does
Unlike conventional walking meditation, MindfulStride is designed around the needs of people with Parkinson's. During a walking session, MindfulStride transforms the items in the user's surroundings into natural objects, providing fluid boundaries that open up a larger, safer walking space. In this experience, users feel connected to society and the natural world. By incorporating spatial audio, the user is presented with an immersive natural scene that dynamically responds to their movements and behavior.
Even for those without Parkinson's, MindfulStride provides an easy way to convert any dreary scene into a more tranquil and immersive experience. In the modern day, where urbanization is rampant, it never hurts to reconnect with nature.
How we built it
Technology
We utilized the Mixed Reality features of the Meta Presence Platform to build this project. Specifically, spatial anchors allowed us to place virtual objects where the physical objects are, and the Scene Manager gave us an understanding of the environment.
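As a rough illustration of the approach described above, the sketch below pins a virtual natural object to a physical location using the Meta XR SDK's OVRSpatialAnchor component. The component name is from the SDK, but the surrounding class, prefab, and method are hypothetical placeholders, not our actual project code:

```csharp
using UnityEngine;

// Illustrative sketch (not the project's actual code): replace a detected
// physical object with a virtual natural one, anchored to the real world.
public class NatureAnchor : MonoBehaviour
{
    [SerializeField] private GameObject treePrefab; // placeholder prefab

    // Called with the world-space pose of a physical object reported by
    // the scene-understanding API.
    public void PlaceNatureObject(Vector3 position, Quaternion rotation)
    {
        GameObject tree = Instantiate(treePrefab, position, rotation);
        // Attaching an OVRSpatialAnchor keeps the virtual object locked to
        // its physical location across head movement and tracking resets.
        tree.AddComponent<OVRSpatialAnchor>();
    }
}
```

The key idea is that the anchor, not the camera rig, owns the object's pose, so the tree stays where the physical furniture is even as the user walks around it.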
For a more lightweight experience, we utilize the Looking Glass Portrait, which displays a fully animated, holographic scene of a woman talking to the user. If you don't feel like fully immersing yourself in the mixed-reality world, this is another way to practice meditation. This woman is Sensei Hopkins, the voice you heard earlier in the Meta Quest 3!
Challenges
While we had decided to address mental health issues, we originally planned to give the user a pet-simulator experience, where they could overlay a 3D scan of their own pet onto a robotic one, so as to enjoy both the tactile and visual experience of playing with a pet while away from it. Unfortunately, we ended up dropping this idea after realizing the difficulties of image tracking on headsets, and pivoted to this new idea!
Even after the pivot to what would become our final project, there were difficulties. The Meta spatial anchors and Scene Manager took time to get used to.
Accomplishments that we're proud of
- Getting the spatial anchors to work
- Creating grass that grows wherever you step
- Fully working animated Looking Glass hologram
What we learned
Rather than immediately jumping into development on one idea, it's best to spend the first part of the hackathon experimenting with new technologies and learning, so as to discern what is possible. This not only keeps you from being too ambitious, but also shows you just how far and aspirational you can be with what you can accomplish. XR allows you to do some incredibly mind-blowing things!
- How to create spatial audio in Unity
- How to integrate the Looking Glass with Unity
- How to create basic 3D animations
- How to use Unity in general
- How to use Meta's spatial anchors
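On the spatial-audio point above: in Unity this mostly comes down to marking an AudioSource as fully 3D. A minimal sketch, with the clip and class names as placeholders rather than our actual assets:

```csharp
using UnityEngine;

// Minimal sketch: play a looping nature sound (e.g. birdsong) as fully
// spatialized 3D audio from a virtual object's position in the scene.
public class SpatialBirdsong : MonoBehaviour
{
    [SerializeField] private AudioClip birdsongClip; // placeholder clip

    void Start()
    {
        AudioSource source = gameObject.AddComponent<AudioSource>();
        source.clip = birdsongClip;
        source.loop = true;
        source.spatialBlend = 1.0f; // 0 = 2D, 1 = fully 3D positional audio
        source.rolloffMode = AudioRolloffMode.Logarithmic; // natural distance falloff
        source.Play();
    }
}
```

With spatialBlend at 1, Unity attenuates and pans the sound based on the listener's position relative to the object, which is what makes the scene respond to the user's movement.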
What's next for MindfulStride
- Refining the visuals to be more immersive
- Deeper research of Parkinson's disease to create a more tailored and beneficial experience
- LLM backend so as to allow the user to speak with the meditation assistant
- Finding a work-around to room setup (perhaps real-time 3D scanning) to allow the user to walk anywhere in the real world
Acknowledgements
We would like to thank a few people whose support and help are the only reason we managed to have a functional project by the end of this weekend.
Paul Sorenson was our go-to person for all things Unity. Any time it behaved unexpectedly, or we realized a package we were using wasn't designed for VR, he was the one who helped us get back on the right track. If not for him, we would not have had a finished scene for the Looking Glass aspect of the project.
Sean Ong was the person who helped us figure out how to get spatial anchors working for the Meta Quest 3 experience. If not for him, the whole draw of our application, the ability to safely recreate an environment to walk and meditate in without fear of running into objects, would not exist.
Finally, thank you to the LookingGlass team for helping us figure out how to use it to create some insanely cool visuals.