DOPE-Uncertainty

This repository contains the code for ensemble-based uncertainty quantification in deep object pose estimation. For more details, see our project website, ICRA 2021 paper, and video.

Dependencies

  • The ADD (average distance) metric is computed with NVISII, so visii needs to be installed.
  • Download the neural network weights (.pth files) and save them to the content folder, replacing the proxy files. Note that we currently provide weights and models (already in content/models/grocery) only for the Corn object.
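For reference, the ADD metric mentioned above measures the mean Euclidean distance between the 3-D model points transformed by the ground-truth pose and by the estimated pose. The sketch below is a minimal pure-Python illustration of that definition, not the repository's NVISII-based implementation; the function names and pose representation (rotation matrix plus translation) are our own assumptions.

```python
import math


def transform(points, R, t):
    """Apply a 3x3 rotation matrix R (nested lists, row-major) and a
    translation vector t to each 3-D point."""
    return [
        [sum(R[i][k] * p[k] for k in range(3)) + t[i] for i in range(3)]
        for p in points
    ]


def add_metric(points, pose_gt, pose_est):
    """ADD: mean distance between model points under the ground-truth
    pose and the same points under the estimated pose."""
    R_gt, t_gt = pose_gt
    R_est, t_est = pose_est
    pts_gt = transform(points, R_gt, t_gt)
    pts_est = transform(points, R_est, t_est)
    return sum(math.dist(a, b) for a, b in zip(pts_gt, pts_est)) / len(points)
```

For example, with an identity rotation and estimated translation off by 0.1 m along x, the ADD is exactly 0.1 for any model point set.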

Running Examples

We provide demo images in uncertainty_quantification/output/test for testing and demonstration. These demo images were generated with the NVISII renderer. There are two example scripts:

  1. uncertainty_quantification/run.py requires ground-truth poses for computing statistics. This script first performs pose estimation based on DOPE (you do not need to install DOPE or ROS), and then performs post-inference uncertainty quantification. It is expected to generate all files in uncertainty_quantification/output/test_result, including inference results, a confidence plot, the most-confident frame selection, an uncertainty-quantification correlation analysis, etc.
  2. uncertainty_quantification/run_realworld.py is similar, but does not need ground-truth poses. It is expected to generate all files in uncertainty_quantification/output/test_result_realworld. This script corresponds to the real-world grasping experiment in our paper, where no ground-truth poses are available.
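The ensemble-based idea behind both scripts can be sketched as follows: several independently trained networks each predict a pose for the same image, and the spread of their predictions serves as an uncertainty score. This is a minimal illustration of that principle using only the translation components, assuming a mean-pairwise-distance disagreement measure; it is not the repository's actual scoring code.

```python
import itertools
import math


def ensemble_disagreement(translations):
    """Mean pairwise Euclidean distance between the translation estimates
    of an ensemble. A larger value indicates higher pose uncertainty; an
    ensemble of fewer than two members yields 0.0 by convention."""
    pairs = list(itertools.combinations(translations, 2))
    if not pairs:
        return 0.0
    return sum(math.dist(a, b) for a, b in pairs) / len(pairs)
```

For instance, three ensemble members predicting translations spread 2 m apart along one axis give a disagreement of (2 + 4 + 2) / 3 ≈ 2.67, whereas identical predictions give 0. A pose estimate whose disagreement exceeds some threshold could then be rejected, as in the real-world grasping setting where no ground truth is available.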
