Inspiration
I was captivated by deep-sea lagoon creatures and wanted to create a VR world where I could bring them to life. The discovery of glossy, gummy-like materials that create reflective, translucent textures pushed me to build an immersive underwater ecosystem you could interact with using your hands.
What it does
Flow and Ground is a VR creature-nurturing game where you grow adorable sea creatures from babies to adults using hand gestures. Poke the life orb to spawn algae for babies, raise your hands to create seaweed for adults, and pinch crabs to protect your ecosystem. Watch your creatures swim, walk, run away from crabs, and evolve in a vibrant underwater world.
How we built it
Built in Unity 6 with the Meta XR SDK for Quest hand tracking. I created custom GLB creature models using NomadSculpt with special reflective materials, implemented gesture-based interactions using raycasting from fingertips, and developed an AI system for creature behaviour, including feeding, movement, and ageing mechanics.
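The Unity build casts rays from the tracked fingertips to detect pokes; here is a minimal, engine-agnostic sketch of that idea (all names, distances, and thresholds are illustrative, not the actual project code):

```python
def ray_hits_sphere(origin, direction, center, radius, max_dist):
    """Return True if a ray from `origin` along (normalised) `direction`
    hits a sphere of `radius` at `center` within `max_dist` units."""
    # Vector from the ray origin (fingertip) to the sphere centre.
    oc = [c - o for o, c in zip(origin, center)]
    # Distance along the ray to the point of closest approach.
    t = sum(a * b for a, b in zip(oc, direction))
    if t < 0 or t > max_dist:
        return False  # target is behind the finger or out of poke range
    # Squared distance from the sphere centre to that closest point.
    closest = [o + d * t for o, d in zip(origin, direction)]
    dist_sq = sum((a - b) ** 2 for a, b in zip(closest, center))
    return dist_sq <= radius * radius

# Index fingertip at the origin pointing along +z; a life orb 0.2 m ahead.
print(ray_hits_sphere((0, 0, 0), (0, 0, 1), (0, 0.02, 0.2), 0.05, 0.3))  # True
```

In Unity this corresponds to a short `Physics.Raycast` from the fingertip joint, with the `max_dist` cap keeping a poke distinct from merely pointing at something across the scene.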
The scatter of shells, stars, small rocks, and seaweed is procedurally generated and appears different each time. I also made 2D sprites in Adobe Fresco: three plankton varieties and a set of bubbles that spawn as particles throughout the level.
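The writeup doesn't include the generation code; one common way to get a fresh, non-overlapping scatter each run is rejection sampling with a minimum spacing. A minimal sketch (function names and parameters are illustrative):

```python
import random

def scatter(count, area, min_gap, rng=None):
    """Place up to `count` props in a rectangular `area` (width, depth),
    rejecting any candidate closer than `min_gap` to an existing prop."""
    rng = rng or random.Random()
    placed = []
    attempts = 0
    # Cap attempts so a too-dense request can't loop forever.
    while len(placed) < count and attempts < count * 50:
        attempts += 1
        p = (rng.uniform(0, area[0]), rng.uniform(0, area[1]))
        if all((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 >= min_gap ** 2
               for q in placed):
            placed.append(p)
    return placed

# An unseeded RNG gives a different layout each run, as in the game.
shells = scatter(30, (10.0, 10.0), 0.5)
```

Seeding the RNG instead would make a layout reproducible, which is handy for debugging a specific arrangement.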
I made the ambient backing music using Suno and added additional SFX, such as a pop when algae appears and a crunch when food is eaten.
Challenges we ran into
Implementing reliable hand tracking gestures in the Meta simulator was tricky; getting raycast distances and collision detection right took multiple iterations. Learning to work with GLB files in Unity required new workflows and custom editor tools. The biggest challenge was simplifying from a 4-stage life cycle (baby/child/teen/adult) to 2 stages due to time constraints, while keeping the game engaging.
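The simplified two-stage cycle (baby to adult, driven by feeding) can be sketched roughly like this; the threshold, food names, and class shape are illustrative, not the actual implementation:

```python
class Creature:
    """Minimal 2-stage lifecycle: a baby matures into an adult after
    eating enough algae; adults eat seaweed instead."""
    GROWTH_THRESHOLD = 3  # feedings needed to mature (illustrative value)

    def __init__(self):
        self.stage = "baby"
        self.fed = 0

    def preferred_food(self):
        return "algae" if self.stage == "baby" else "seaweed"

    def eat(self, food):
        if food != self.preferred_food():
            return False  # wrong food type for this life stage
        self.fed += 1
        if self.stage == "baby" and self.fed >= self.GROWTH_THRESHOLD:
            self.stage = "adult"  # in-engine: swap model and animations here
        return True

c = Creature()
for _ in range(3):
    c.eat("algae")
print(c.stage)  # adult
```

Restoring the 4-stage cycle would mostly mean more entries in the stage/food table and per-stage thresholds, which is why cutting to 2 stages was a clean scope reduction.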
Accomplishments that we're proud of
The glossy, gummy creature materials look stunning in VR and really bring the deep-sea aesthetic to life. Hand gesture interactions such as poking, pinching, and raising hands feel natural, responsive, and intuitive. Creating a comprehensive creature lifecycle system with AI-driven behaviour, feeding mechanics, and visual feedback within a short timeframe was a major achievement.
What we learned
I gained deep experience with Meta's hand tracking API and learned how to optimise gesture detection for VR. Working with GLB models taught me new Unity workflows and the importance of proper asset pipelines. Most importantly, I learned to scope aggressively for hackathons, cutting features early to polish what matters most.
What's next for Flow and Ground
Reintroduce the environmental balance mechanic, where creature colours are affected by the ecosystem state. Add more interaction types, such as petting creatures or rearranging decorations. Expand the creature lifecycle back to 4 stages with unique animations for each. Implement multiplayer so friends can nurture the ecosystem together.

