Wenjia Wang1,*
Liang Pan1,*
Huaijin Pi1
Yuke Lou1
Xuqian Ren2
Yifan Wu1
Zhouyingcheng Liao1
Lei Yang3
Rishabh Dabral4
Christian Theobalt4
Taku Komura1
(*: Core Contributor)
1The University of Hong Kong
2Tampere University
3The Chinese University of Hong Kong
4Max-Planck Institute for Informatics
- 🎆 2026.Mar.10: We have released the code and data. Please give them a try!
- 🎆 2026.Feb.22: EmbodMocap has been accepted to CVPR 2026. Code and data will be released soon.
For new users, we recommend working through the documentation in this order:

1. **Main Pipeline**: quick downloads, preview/visualization, running the pipeline, and step-by-step workflow notes
2. **Installation**: set up the environment, core dependencies, and manual download references
3. **Visualization**: generate rendered videos or inspect scenes and motions interactively with Viser
Notes:
- Compared to the paper version, the open-source release replaces PromptDA with LingbotDepth.
- `fast` is mainly for users who only care about mesh + motion for embodied tasks; `standard` is for users who also need RGBD/mask assets for training reconstruction models.
- We provide an interactive visualization tool based on Viser - give it a try!
Our Viser-based visualization tool allows you to interactively browse scenes, sequences, and SMPL motions in 3D:
Features:
- Switch between multiple scenes and sequences
- Interactive 3D viewing of scene mesh and SMPL motion
- Real-time camera trajectory visualization
- Frame-by-frame playback control
See docs/visualization.md for detailed usage.
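As a minimal illustration of the frame-by-frame playback above, the sketch below loads a motion sequence and steps through its frames. The `.npz` layout and the field names `poses` and `trans` are assumptions for illustration only; see docs/visualization.md for the release's actual data format and the full Viser-based viewer.

```python
# Hypothetical sketch of frame-by-frame SMPL motion playback.
# Assumes each sequence is an .npz with 'poses' (T, 72) axis-angle SMPL
# parameters and 'trans' (T, 3) root translations; the actual field names
# and layout in the released data may differ.
import numpy as np

def load_motion(path):
    """Load a motion sequence and sanity-check frame counts."""
    data = np.load(path)
    poses, trans = data["poses"], data["trans"]
    assert poses.shape[0] == trans.shape[0], "pose/translation frame mismatch"
    return poses, trans

def iter_frames(poses, trans, start=0, end=None):
    """Yield (frame_index, pose, translation) tuples for playback control."""
    end = poses.shape[0] if end is None else end
    for t in range(start, end):
        yield t, poses[t], trans[t]

if __name__ == "__main__":
    # Synthetic example: 5 frames of zero pose.
    np.savez("demo_motion.npz", poses=np.zeros((5, 72)), trans=np.zeros((5, 3)))
    poses, trans = load_motion("demo_motion.npz")
    frames = list(iter_frames(poses, trans))
    print(len(frames))
```

The interactive tool exposes the same loop through a frame slider instead of a Python generator.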
If you find this project useful in your research, please consider citing us:
@inproceedings{wang2026embodmocap,
title = {EmbodMocap: In-the-Wild 4D Human-Scene Reconstruction for Embodied Agents},
booktitle = {CVPR},
author = {Wang, Wenjia and Pan, Liang and Pi, Huaijin and Lou, Yuke and Ren, Xuqian and Wu, Yifan and Liao, Zhouyingcheng and Yang, Lei and Dabral, Rishabh and Theobalt, Christian and Komura, Taku},
year = {2026}
}
We acknowledge VGGT, TRAM, ViTPose, Lang-Segment-Anything, PromptDA, Lingbot-Depth, SAM, and COLMAP for their excellent open-source code.
Feel free to contact me with questions or for collaboration: [email protected]
