Inspiration

We were inspired by the idea of transforming live visuals into a shared creative experience rather than a one-way performance. VJing usually isolates the performer from the audience, and we wanted to explore how Spectacles glasses could make this art form more collective and interactive. Our goal was to turn real-time visuals into a space where people could connect, influence, and co-create together.

What it does

((Sync)scape) is a collaborative VJ experience built for Spectacles. It allows multiple people to join a shared session where a main VJ hosts the scene while others can influence live visuals in real time. Participants can adjust parameters such as colors, effects, and movements, creating evolving 3D compositions together. All visuals are rendered in augmented space using Spectacles, blending digital VFX and real environments into a single live performance.

How we built it

We developed ((Sync)scape) using Lens Studio 5.15 and the Spectacles SDK.

Spectacles Sync Kit enabled real-time collaboration between users

Spectacles Interaction Kit handled gesture and control inputs

Depth Edge detection allowed visuals to blend naturally with the environment

Sound-reactive visuals respond to the microphone input

Blender and Rhino 7 were used to model 3D assets

Custom shaders and particle systems powered the live VFX

A simple shared state synchronized the main host’s visuals with participant inputs
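To illustrate the sound-reactive part of the pipeline, here is a minimal sketch of how microphone samples can drive a visual parameter: compute the frame's RMS amplitude, then smooth it with an exponential moving average before feeding it to a shader. The class and method names (`AudioReactive`, `update`) are illustrative, not the Lens Studio audio API.

```typescript
// Hypothetical sketch: mapping microphone samples to a smoothed
// visual intensity in [0, 1], e.g. for a shader uniform.
class AudioReactive {
  /** Smoothed intensity in [0, 1]. */
  intensity = 0;

  // alpha: smoothing factor (0 = frozen, 1 = no smoothing)
  // gain: scales raw RMS into the 0..1 visual range
  constructor(private alpha = 0.2, private gain = 4.0) {}

  /** Feed one frame of mic samples in [-1, 1]; returns the new intensity. */
  update(samples: number[]): number {
    // Root-mean-square amplitude of the frame
    const rms = Math.sqrt(
      samples.reduce((acc, s) => acc + s * s, 0) / Math.max(samples.length, 1)
    );
    // Clamp the scaled RMS, then blend toward it for a smooth response
    const target = Math.min(1, rms * this.gain);
    this.intensity += this.alpha * (target - this.intensity);
    return this.intensity;
  }
}
```

The smoothing step matters in practice: raw RMS flickers frame to frame, which reads as visual noise in AR.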
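The "simple shared state" above can be sketched as a host-side parameter map that merges timestamped participant updates last-write-wins. This is a hypothetical illustration of the merge logic, not the Spectacles Sync Kit API; all names (`SharedVisualState`, `ParamUpdate`) are ours.

```typescript
// Hypothetical sketch: host keeps a canonical parameter map and merges
// incoming participant updates; later timestamps win, stale ones are dropped.
interface ParamUpdate {
  key: string;       // e.g. "hue", "particleSpeed"
  value: number;
  timestamp: number; // sender clock, used for last-write-wins
}

class SharedVisualState {
  private values = new Map<string, { value: number; timestamp: number }>();

  /** Apply an incoming update; older or equal timestamps are ignored. */
  apply(update: ParamUpdate): void {
    const cur = this.values.get(update.key);
    if (!cur || update.timestamp > cur.timestamp) {
      this.values.set(update.key, {
        value: update.value,
        timestamp: update.timestamp,
      });
    }
  }

  /** Read a parameter for rendering, with a default for unset keys. */
  get(key: string, fallback = 0): number {
    return this.values.get(key)?.value ?? fallback;
  }
}
```

Last-write-wins keeps the merge conflict-free: any participant can push a parameter at any time, and every device converges on the same values without coordination.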

Challenges we ran into

Multi-user synchronization was one of the main challenges: getting smooth, real-time updates across devices required balancing visual fidelity with performance. We also ran into issues when displaying a depth-filtered camera feed on a virtual screen (a plane), and the VJ booth sometimes spawned in the wrong position. Beyond that, we had to adapt our design to the constraints of AR, ensuring that visuals felt integrated in physical space without overwhelming the viewer. Finally, creating intuitive controls for both the host and participants while keeping the experience artistic and fluid took several iterations.

Accomplishments that we're proud of

We managed to create a working multi-user visual experience on Spectacles in less than 24 hours. The connection between users worked reliably, and the visuals responded smoothly to real-time input. We are especially proud of how the system kept its artistic focus, encouraging creativity and connection rather than distraction. Seeing the visuals come alive collaboratively was a rewarding moment for the team.

What we learned

We learned how to design for AR interactions that feel expressive and human. The Spectacles SDK opened our eyes to how spatial computing can support artistic collaboration instead of isolating people. We also deepened our understanding of Lens Studio’s real-time rendering capabilities, synchronization kits, and visual optimization for wearable devices.

What's next for ((Sync)scape)

This opens up new forms of shared creativity, from live concerts, theater, opera, and dance performances to art jams and immersive storytelling. It transforms AR from an isolated experience into a collective canvas. In the future, we'd love to expand it with music synchronization and location-based collaboration.
