Experience the application live: https://tree-d-two.vercel.app/
View our demo here: https://youtu.be/5ANYjuCz39E
Collectors often struggle to accurately judge the physical size and presence of artwork when viewing it online. Tree-D Studio addresses this "Scalar Gap" by converting 2D paintings into real-world scaled, depth-aware 3D reliefs, bridging the sensory gap between digital images and physical presence.
Tree-D Studio is a cutting-edge web application that transforms traditional 2D paintings into immersive 3D reliefs. Utilizing a FastAPI backend powered by the Marigold depth estimation model (from Hugging Face), it generates high-quality normal, displacement, and roughness maps. This process captures intricate details like impasto and defining textures, making it particularly effective for oil paintings. The resulting 3D models can be interactively viewed and exported in GLTF or USDZ formats for AR/VR integration.
In cases where the Marigold backend server is unavailable, the application gracefully falls back to a TypeScript-based procedural method. While this fallback provides a rough surface normal estimation, it allows for continuous operation, albeit with less fidelity compared to the AI-driven approach.
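The procedural fallback can be sketched as follows. This is an illustrative example, not the project's actual implementation: it estimates surface normals from luminance gradients of the image, a common heightfield-to-normal-map technique. The function names, the `strength` parameter, and the RGBA buffer layout are assumptions.

```typescript
// Perceptual luminance from RGB (Rec. 601 weights).
function luminance(pixels: Uint8ClampedArray, i: number): number {
  return 0.299 * pixels[i] + 0.587 * pixels[i + 1] + 0.114 * pixels[i + 2];
}

// Estimate a tangent-space normal map from an RGBA image by treating
// luminance as a heightfield and taking central differences.
function proceduralNormals(
  pixels: Uint8ClampedArray,
  width: number,
  height: number,
  strength = 2.0
): Uint8ClampedArray {
  const out = new Uint8ClampedArray(width * height * 4);
  const clampX = (v: number) => Math.min(width - 1, Math.max(0, v));
  const clampY = (v: number) => Math.min(height - 1, Math.max(0, v));
  const idx = (px: number, py: number) => (py * width + px) * 4;
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      // Central differences approximate the height gradient.
      const dx =
        luminance(pixels, idx(clampX(x + 1), y)) -
        luminance(pixels, idx(clampX(x - 1), y));
      const dy =
        luminance(pixels, idx(x, clampY(y + 1))) -
        luminance(pixels, idx(x, clampY(y - 1)));
      // Normal = normalize(-dx, -dy, nz), remapped from [-1, 1] to [0, 255].
      const nz = 255 / strength;
      const len = Math.hypot(dx, dy, nz);
      const o = (y * width + x) * 4;
      out[o] = ((-dx / len) * 0.5 + 0.5) * 255;
      out[o + 1] = ((-dy / len) * 0.5 + 0.5) * 255;
      out[o + 2] = ((nz / len) * 0.5 + 0.5) * 255;
      out[o + 3] = 255;
    }
  }
  return out;
}
```

A flat (uniform-color) region produces normals pointing straight out of the canvas, which is why this approach only captures texture where brightness varies, and why the AI-driven Marigold path yields higher fidelity.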
- 2D Painting Input: The process begins with a 2D image (either from the Met Museum API or user upload).
- Surface Normal Generation: The Marigold model (or fallback) estimates the surface normals, defining the orientation of each pixel in 3D space.
- Roughness Calculation: Luminosity values for each pixel are analyzed to determine surface roughness.
- Displacement Mapping: Mesh vertices are then displaced ("moved") based on the normal and roughness data to create a tactile 3D relief.
- 3D Object Output: A Three.js 3D object is generated, ready for interactive viewing and export.
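Steps 3 and 4 above can be sketched as small per-pixel helpers. These are hedged illustrations, not the project's code: the inversion convention (brighter paint treated as smoother) and the `maxHeightCm` cap are assumptions made for the example.

```typescript
// Step 3 sketch: roughness from luminosity. Brighter paint (e.g. thick
// impasto highlights) is treated as smoother, so roughness is the
// inverted, normalised luminance in [0, 1].
function roughnessFromLuminosity(r: number, g: number, b: number): number {
  const lum = (0.299 * r + 0.587 * g + 0.114 * b) / 255;
  return 1 - lum;
}

// Step 4 sketch: a simple heightfield displacement. Each vertex is pushed
// out in proportion to luminance, capped at maxHeightCm of relief depth.
function displacementHeight(lum: number, maxHeightCm = 0.5): number {
  return (lum / 255) * maxHeightCm;
}
```

In the real pipeline these maps are fed to the Three.js material (as roughness and displacement textures) rather than applied per vertex in JavaScript.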
- Intuitive User Interface: Landing, Search, View, and Demo pages with a museum-inspired aesthetic.
- Met Museum API Integration: Explore and transform public-domain artworks from the Metropolitan Museum of Art.
- Custom Image Uploads: Transform your own 2D images with local-only processing.
- AI Normal Map Generation: Leverage the Marigold model for high-fidelity depth perception, with a procedural fallback.
- Tactile 3D Relief: Dynamic Displacement and Roughness Maps create a realistic, textured surface.
- Accurate 3D Scaling: Models are scaled precisely (100cm = 1 Three.js unit) for real-world AR/VR placement.
- Universal 3D Export: Export models in GLTF and USDZ (for iOS AR Quick Look) formats.
- Decision Support: Visualize artwork at accurate scale and preview in AR to understand fit in your environment.
- Next.js 14 (App Router)
- TypeScript
- Three.js (3D rendering library)
- Tailwind CSS (Styling)
- GLTFExporter (3D model export)
- FastAPI (Python web framework)
- Python 3.11+
- Hugging Face prs-eth/marigold-normals-v1-1 (AI model for normal map generation)
- Uvicorn (ASGI server)
- Clone the repository:

  ```bash
  git clone https://github.com/mattw23n/tree-d
  cd tree-d
  ```

- Frontend Setup:

  ```bash
  cd frontend
  npm install
  cd ..
  ```

- Backend Setup:

  ```bash
  cd backend
  python -m venv .venv
  # On Windows:
  .\.venv\Scripts\activate
  # On macOS/Linux:
  # source .venv/bin/activate
  pip install --upgrade pip
  pip install -r requirements.txt
  cd ..
  ```

- Start the Backend Server (Marigold):

  ```bash
  cd backend
  # Activate virtual environment (Windows):
  .\.venv\Scripts\activate
  # Activate virtual environment (macOS/Linux):
  # source .venv/bin/activate
  uvicorn marigold_server:app --host 127.0.0.1 --port 8000
  ```

  The backend server will lazily load the AI model weights on its first request.

- Start the Frontend Development Server:

  ```bash
  cd frontend
  npm run dev
  ```

- Open in Browser:

  Navigate to http://localhost:3000 in your web browser.
Built for PINUS Hack 2026, Track 4.
Licensed under the MIT License.
