Inspiration

Drawing from advancements in hand tracking and real-time 3D rendering, the project was intended to provide an immersive and intuitive way to interact with virtual objects and merge natural human gestures with digital spaces.

What it does

Virtual Hands lets users interact with 3D objects in a browser-based environment using real-time, webcam-based hand tracking. By recognizing specific gestures like pinch and grab, users can manipulate 3D objects such as cubes, spheres, and pyramids using just their hands.
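One common way to detect a pinch from MediaPipe Hands output is to measure the distance between the thumb tip (landmark 4) and the index fingertip (landmark 8) in normalized coordinates. The sketch below illustrates that idea; the threshold value and function name are illustrative assumptions, not the project's actual code.

```javascript
// MediaPipe Hands returns 21 landmarks per hand with normalized x/y/z.
// Thumb tip is landmark 4; index fingertip is landmark 8.
const THUMB_TIP = 4;
const INDEX_TIP = 8;

// Returns true when thumb and index fingertips are close enough to
// count as a pinch. The threshold is an assumed tuning value.
function isPinching(landmarks, threshold = 0.05) {
  const a = landmarks[THUMB_TIP];
  const b = landmarks[INDEX_TIP];
  const dist = Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
  return dist < threshold;
}
```

In practice the threshold would be tuned per camera setup, and a grab gesture could be detected similarly by checking that all fingertips are curled toward the palm.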

How we built it

The project was built using MediaPipe for real-time hand tracking; the detected hand landmarks are mapped into a 3D scene rendered with Three.js. Socket.io handles communication between the front end and the back end.
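Mapping MediaPipe's normalized landmark coordinates (x and y in [0, 1], with y growing downward) into a Three.js world space typically involves mirroring x (the webcam acts as a mirror), flipping y, and scaling. The following is a minimal sketch of that mapping; the scene dimensions and function name are assumptions for illustration, and the returned values would feed a `THREE.Vector3` in the real scene.

```javascript
// Convert a normalized MediaPipe landmark {x, y, z} into world-space
// coordinates. width/height are assumed scene dimensions.
function handToWorld(lm, width = 10, height = 6) {
  return {
    x: (1 - lm.x - 0.5) * width,  // mirror x so movement matches the user
    y: (0.5 - lm.y) * height,     // flip y: screen y increases downward
    z: lm.z * -width              // MediaPipe z is roughly depth from the wrist
  };
}
```

The resulting position can then be assigned to a mesh each frame, e.g. `cube.position.set(p.x, p.y, p.z)`.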

Challenges we ran into

We particularly struggled with integrating the four components we built: the UI, the webcam feed, the pinch-detection code, and the block creation and manipulation code.

Accomplishments that we're proud of

We are proud of enabling real-time interaction with 3D objects through hand gestures, as well as successfully integrating all the components of our code.

What we learned

We gained a deeper understanding of both computer vision and 3D web development, and learned how to apply hand-tracking techniques to real-time interactive experiences.

What's next for Virtual Hands

We hope to introduce multiplayer support, enabling users to interact with each other in the same 3D space, and to expand the gesture set for richer object manipulation.
