Inspiration
What it does
This project focuses on an affordable, innovative hardware device that integrates with the Meta Quest 3 to create a tactile sculpting experience. The hardware is a 3-degree-of-freedom arm with a handle. As the user moves the device, position data is collected and sent to Unity to drive the virtual sculpting tool. When a collision is detected, a signal is sent back to the device, which activates the haptics by stopping the motors. This solution is aimed at artists, designers, and enthusiasts, offering a powerful yet budget-friendly alternative for 3D modeling and sculpting in immersive environments.
Eventually, any tele-operator could use the system to receive tactile feedback in virtually any environment or situation, making its commercial applications nearly endless, especially for medical professionals in training.
How we built it
We designed the hardware using easily accessible metal parts and CAD for the 3D-printed components, with three servo motors connecting the component shafts. On the hardware side, a Python program hosted on an ESP microcontroller sends the servo motors' rotational data, using Singularity as the medium connecting the microcontroller and the Meta Quest 3 over Wi-Fi. The software side uses the Meta MRUK SDK for the mixed reality experience. The scene uses passthrough, and for demonstration purposes a voxel cube simulates a sculpting environment. The signal received through Singularity drives the rotational transforms of the hinges of a virtual tool that replicates the real hardware instrument. Any collision interaction is sent back to the microcontroller, which allows the Python program to provide tactile feedback.
Challenges we ran into
Much of our time was spent on calibrating the transforms of the hinges. The inverse kinematics algorithm for driving the motors based on collision data is non-linear and still incomplete, leaving much room for improvement on the software side.
Accomplishments that we're proud of
This project was hard. Finishing within a 72-hour timeframe and showing a complete demo to the judges makes us confident that we can deliver a functional product within a short period of time.
What we learned
Getting the hardware and software to communicate was the greatest learning experience we can take from this project. The latency and noisy real-world data reminded us that the analog-to-digital pipeline is never as smooth as one expects.
What's next for Tac-Man
We aim to improve the movement and rotational calibration so the product can sculpt and provide feedback precisely. It can also serve other use cases, mainly medical training applications.