Inspiration
We’ve spent countless hours playing Lidar-based mods and games, from the iconic GMOD Lidar mod to a range of experimental scanning experiences. When the Quest 3 arrived with its next-gen XR capabilities, we finally had the tech to bring that magic into the real world.
Lidar IRL is our attempt to bring that dream into reality, letting players physically walk inside a Lidar experience instead of watching one on a screen.
What it does
Lidar IRL transforms your home into a mixed reality playground layered with a hidden, parallel dimension. Using your Void Scanner, you sweep your environment and reveal points of shimmering light wherever the beam touches.
Rooms unfold. Walls dissolve. Your familiar space becomes strange and newly perceivable, like gaining an entirely new sense.
With every scan, you “paint” the world into existence. Even simple actions like turning around or stepping into a hallway feel different when the void and your reality fuse into one impossible space. But as you stare into the abyss, it stares back. You’ll see what your body cannot feel, because you aren’t alone in the dark.
How we built it
Leveraging the open-source "continuous scanning system" from Anaglyph’s Lasertag, we built a live mesh-scanning pipeline capable of capturing the player’s world in real time. On top of that, we created custom Lidar shaders to place millions of persistent dots across the environment.
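The core loop behind that pipeline can be sketched in a few lines. This is an illustrative, simplified model (not the project's actual Unity/shader code): each frame the scanner casts a fan of rays, and every surface hit becomes a persistent dot appended to a capped point buffer. The names (`PointBuffer`, `scan_cone`) and the flat-floor stand-in for the scanned mesh are our own assumptions for the sketch.

```python
import math

MAX_POINTS = 1_000_000  # illustrative dot budget; the real shaders handle millions on GPU

class PointBuffer:
    """Persistent store of scanned dots, capped at a fixed capacity."""
    def __init__(self, capacity=MAX_POINTS):
        self.capacity = capacity
        self.points = []

    def add(self, p):
        if len(self.points) < self.capacity:
            self.points.append(p)

def ray_hits_floor(origin, direction, floor_y=0.0):
    """Intersect a ray with a horizontal floor plane (stand-in for the scanned room mesh)."""
    if direction[1] >= 0:
        return None  # ray points away from the floor
    t = (floor_y - origin[1]) / direction[1]
    return tuple(o + t * d for o, d in zip(origin, direction))

def scan_cone(buffer, origin, pitch=-0.5, rays=64):
    """Sweep a ring of downward-angled rays and record each hit as a persistent dot."""
    for i in range(rays):
        yaw = 2 * math.pi * i / rays
        direction = (math.cos(yaw), pitch, math.sin(yaw))
        hit = ray_hits_floor(origin, direction)
        if hit:
            buffer.add(hit)

buf = PointBuffer()
scan_cone(buf, origin=(0.0, 1.6, 0.0))  # scanner held at roughly head height
print(len(buf.points))  # prints 64: one dot per ray that reached the floor
```

Because dots only accumulate (up to the cap), repeated sweeps gradually "paint" the environment into the buffer, which is the effect the shaders render as shimmering points.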
To add structure and progression, we designed an object insertion system that lets players discover items like keys, encounter entities, and ultimately unlock the exit to complete the experience.
Challenges we ran into
The biggest hurdle was building a reliable, flexible object insertion system. Creating objects that behave consistently in a constantly shifting, player-generated environment pushed us to rethink how mixed-reality games handle placement, discovery, and interaction.
Accomplishments that we're proud of
The pure joy of scanning your entire home never gets old. Despite being a one-month prototype, Lidar IRL delivers moments that feel impossible: stepping outside your front door, turning around, and seeing your whole house reconstructed in points of light.
It’s surreal, mesmerizing, and unlike anything we’ve experienced in MR.
What we learned
Truly seamless mixed reality is still in its infancy. Many MR experiences limit players to a single room, but your whole home, and the world beyond, is ripe for exploration. We learned that even non-XR players immediately understand and connect with the experience.
Rediscovering your own space in a dark void of shimmering Lidar dots feels transformative, almost like being given new senses. This response has encouraged us to keep pushing the boundaries of real-world exploration in XR.
On the technical side, we learned about Signed Distance Fields (SDFs) and how they can be used to store 3D environment data. This was our first time writing and using compute shaders in Unity, and we extended existing open-source implementations with our own raymarching logic to process multiple SDF sources.
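To make those two ideas concrete, here is a minimal sketch (not the project's actual shader code) of an SDF scene and a sphere-tracing raymarcher: several SDF sources are merged with `min()`, which yields the union of the shapes, and the marcher steps along the ray by the distance to the nearest surface. The specific shapes and function names are illustrative assumptions.

```python
import math

def sdf_sphere(p, center, radius):
    """Signed distance from point p to a sphere's surface."""
    return math.dist(p, center) - radius

def sdf_floor(p, height=0.0):
    """Signed distance to a horizontal plane at the given height."""
    return p[1] - height

def scene_sdf(p):
    """Combine multiple SDF sources; min() gives the union of all shapes."""
    sources = [
        sdf_sphere(p, center=(0.0, 1.0, 4.0), radius=1.0),
        sdf_floor(p),
    ]
    return min(sources)

def raymarch(origin, direction, max_steps=128, max_dist=50.0, eps=1e-4):
    """Sphere tracing: advance by the distance to the nearest surface until we hit it."""
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        d = scene_sdf(p)
        if d < eps:
            return t  # hit distance along the ray
        t += d
        if t > max_dist:
            break
    return None  # no hit

# A ray fired straight ahead from eye height should hit the sphere at distance 3.
hit = raymarch(origin=(0.0, 1.0, 0.0), direction=(0.0, 0.0, 1.0))
print(hit)  # prints 3.0
```

The same `min()` trick is what lets a single marcher walk through multiple SDF sources at once, which is the shape of the extension described above.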
What's next for Lidar IRL
This prototype is just the beginning. Our vision includes deeper mechanics, richer environments, and a stronger sense of mystery and discovery.
Future Ideas
Local Multiplayer: use the Spatial SDK so players can explore the same hidden world together.
More Discoverables: from playful easter-egg posters to eerie anomalous objects inspired by SCP-style lore.
Full Sandbox Mode: customize dot limits, decay behavior, color palettes, and more—turn your home into the Lidar art studio of your dreams.
Improved Stability: fully address recentering issues to keep meshes and dots perfectly aligned.
Built With
- fmod
- meshy
- unity



