SonoGym is a scalable training platform for challenging (orthopedic) surgical tasks with robotic ultrasound, built on NVIDIA IsaacLab (https://github.com/isaac-sim/IsaacLab). We provide model-based and learning-based ultrasound (US) simulation based on 3D label maps and CT scans from real patient datasets. Our tasks include path planning for US navigation, bone surface reconstruction, and US-guided robotic surgery. Our platform enables benchmarking of various algorithms, including reinforcement learning (RL), safe RL, and imitation learning.
Install Isaac Lab by following the installation guide (IsaacLab V2.1.0). We recommend using the conda installation as it simplifies calling Python scripts from the terminal.
Clone this repository separately from the Isaac Lab installation (i.e. outside the `IsaacLab` directory):
```bash
# Option 1: HTTPS
git clone https://github.com/SonoGym/SonoGym.git

# Option 2: SSH
git clone git@github.com:SonoGym/SonoGym.git
```

Using a Python interpreter that has Isaac Lab installed, install the library:

```bash
python -m pip install -e source/spinal_surgery --no-dependencies
```

Download the simulation assets and ultrasound simulation models from https://huggingface.co/datasets/yunkao/SonoGym_assets_models.
The download contains two archived folders:
- `assets`: simulation assets, including medical imaging data, human models, robots, and end-effectors.
- `models`: pix2pix models for learning-based ultrasound simulation.

Unzip the archives and place the resulting directories at the following paths, respectively:

```
assets -> SonoGym/source/spinal_surgery/spinal_surgery/assets
models -> SonoGym/models
```
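After unzipping, a quick sanity check can confirm that the directories landed in the right place. The following sketch is a hypothetical helper (not part of SonoGym) that lists any expected paths still missing under the repo root:

```python
from pathlib import Path

# Expected locations of the unzipped downloads, relative to the repo root
EXPECTED = [
    "source/spinal_surgery/spinal_surgery/assets",
    "models",
]

def missing_paths(repo_root):
    """Return the expected directories that do not exist yet."""
    root = Path(repo_root)
    return [p for p in EXPECTED if not (root / p).is_dir()]

# Example: with a fresh clone and nothing unzipped yet, both paths are reported
# print(missing_paths("SonoGym"))
```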
Download the dataset from https://huggingface.co/datasets/yunkao/SonoGym_lerobot_dataset, following the instructions in https://huggingface.co/docs/hub/en/datasets-downloading or https://huggingface.co/docs/huggingface_hub/main/en/guides/download.
This dataset allows training imitation learning policies for surgery and navigation with the lerobot repo (https://github.com/huggingface/lerobot); it is not required for training RL agents.
Specifically, the datasets are collected with the following settings:
- `Isaac-robot-US-guidance-v0-single`: ultrasound guidance, single patient, model-based US simulation
- `Isaac-robot-US-guidance-v0-single-net`: ultrasound guidance, single patient, learning-based US simulation
- `Isaac-robot-US-guidance-5-models-v0`: ultrasound guidance, single patient, 4 learning-based US simulation networks
- `Isaac-robot-US-guided-surgery-v0-single-new`: ultrasound-guided surgery, single patient, model-based US simulation
- `Isaac-robot-US-guided-surgery-v0-single-net-new`: ultrasound-guided surgery, single patient, learning-based US simulation
- `Isaac-robot-US-guided-surgery-v0-5-net`: ultrasound-guided surgery, single patient, 4 learning-based US simulation networks
- `Isaac-robot-US-guided-surgery-v0-5`: ultrasound-guided surgery, 5 patients, model-based US simulation
It is recommended to place the downloaded dataset at:

```
SonoGym/lerobot-dataset
```
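If you prefer scripting the download, the Hugging Face Hub's Python client can fetch the whole dataset repo in one call. A minimal sketch, assuming `huggingface_hub` is installed and using the recommended `SonoGym/lerobot-dataset` location (the `download_dataset` helper here is illustrative, not part of SonoGym):

```python
from pathlib import Path

def dataset_dir(repo_root, name="lerobot-dataset"):
    """Recommended local location for the expert dataset."""
    return Path(repo_root) / name

def download_dataset(repo_root):
    # Imported lazily so dataset_dir() works without huggingface_hub installed
    from huggingface_hub import snapshot_download
    snapshot_download(
        repo_id="yunkao/SonoGym_lerobot_dataset",
        repo_type="dataset",
        local_dir=dataset_dir(repo_root),
    )

# download_dataset("SonoGym")  # fetches into SonoGym/lerobot-dataset
```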
You can teleoperate the robot with the keyboard in a task environment using the following command (note: the first time you run it, Isaac Sim may take a long time to load):

```bash
python workflows/teleoperation/teleop_se3_agent.py --task=Isaac-robot-US-guidance-v0
```

The `--task` argument can be one of:
- `Isaac-robot-US-guidance-v0`
- `Isaac-robot-US-guided-surgery-v0`
- `Isaac-robot-US-reconstruction-v0`
Before running the training scripts below, disable all environment visualization as described in the Change environment settings section.
You can train a PPO agent with skrl using the following command:

```bash
python workflows/skrl/train.py --task=Isaac-robot-US-guidance-v0
```

You can also train PPO with a cost predictor (only for `Isaac-robot-US-guided-surgery-v0`) using:

```bash
python workflows/skrl/train_sppo.py
```

The default reinforcement learning agent configs reproduce the results in our paper. You can modify the configs at:

```
source/spinal_surgery/spinal_surgery/tasks/{task_name}/agents/skrl_{algorithm}_cfg.yaml
```
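As an illustration of the kind of fields you would tune there, a skrl PPO config typically exposes rollout and optimization settings like the following. The key names below follow the usual skrl YAML layout and are shown as an example only; check your task's actual cfg file for the real keys and values:

```yaml
agent:
  class: PPO
  rollouts: 16          # environment steps collected per policy update
  learning_epochs: 8    # optimization epochs per update
  mini_batches: 4
  discount_factor: 0.99
  learning_rate: 1.0e-03
```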
To train imitation learning agents, first install lerobot from https://github.com/huggingface/lerobot/tree/a445d9c9da6bea99a8972daa4fe1fdd053d711d2.
After downloading the expert dataset, you can train an ACT or diffusion policy with:

```bash
python /path-to-lerobot/lerobot/scripts/train.py --config_path=workflows/lerobot/train_surgery_{method}_cfg.json
```

where `method` is either `diffusion` or `act`. Remember to change the `"root"` field under `"dataset"` to the path of your local dataset, e.g. `path-to-repo/SonoGym/lerobot-dataset/Isaac-robot-US-guidance-v0-single-net`.
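Updating the dataset path can also be scripted. A minimal standard-library sketch, assuming the train cfg is a JSON file with a nested `dataset.root` field (verify the exact key layout in your cfg file; the helper name is hypothetical):

```python
import json
from pathlib import Path

def set_dataset_root(cfg_path, dataset_root):
    """Point a lerobot train config at a local dataset copy."""
    cfg = json.loads(Path(cfg_path).read_text())
    cfg["dataset"]["root"] = str(dataset_root)  # assumed key layout
    Path(cfg_path).write_text(json.dumps(cfg, indent=2))

# set_dataset_root(
#     "workflows/lerobot/train_surgery_act_cfg.json",
#     "/path-to-repo/SonoGym/lerobot-dataset/Isaac-robot-US-guidance-v0-single-net",
# )
```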
Environment configurations are located under:

```
source/spinal_surgery/spinal_surgery/tasks/{task_name}/cfgs/{task_name}.yaml
```

For example, you can choose model-based or learning-based ultrasound simulation for the `robot_US_guidance` task by setting

```yaml
sim:
  us: 'net'  # for learning-based, or 'conv' for model-based
```

in `source/spinal_surgery/spinal_surgery/tasks/robot_US_guidance/cfgs/robot_US_guidance.yaml`.
To turn the visualization on or off, set

```yaml
sim:
  vis_us: true  # or false
```

If the program crashes only when `vis_us` is enabled, try uninstalling PyQt5 and reinstalling opencv-python.
A PPO checkpoint can be played via:

```bash
python workflows/skrl/play.py --task {task_id} --checkpoint /path-to-checkpoint --num_envs 100
```

A PPO + safety filter checkpoint for the ultrasound-guided surgery environment can be played via:

```bash
python workflows/skrl/play_sppo.py --checkpoint /path-to-checkpoint --num_envs 100
```

lerobot checkpoints (for example, ACT for ultrasound guidance) can be played via:

```bash
python workflows/lerobot/play_lerobot_guidance.py --policy.config=workflows/lerobot/play_guidance_act_cfg.json --policy.path=/path-to-checkpoint/pretrained_model
```
Currently, we do not provide a public Docker image for Isaac Lab. Hence, you need to build the Isaac Lab docker image locally by following the steps here.
Once you have built the base Isaac Lab image, you can check that it exists with:

```bash
docker images

# Output should look something like:
#
# REPOSITORY       TAG      IMAGE ID       CREATED          SIZE
# isaac-lab-base   latest   28be62af627e   32 minutes ago   18.9GB
```

Following the above, you can build the docker container for this project, called `isaac-lab-template`. You can modify this name in `docker/docker-compose.yaml`.
```bash
cd docker
docker compose --env-file .env.base --file docker-compose.yaml build isaac-lab-template
```

You can verify that the image was built successfully using the same command as earlier:

```bash
docker images

# Output should look something like:
#
# REPOSITORY           TAG      IMAGE ID       CREATED             SIZE
# isaac-lab-template   latest   00b00b647e1b   2 minutes ago       18.9GB
# isaac-lab-base       latest   892938acb55c   About an hour ago   18.9GB
```

After building, the usual next step is to start the containers associated with your services:

```bash
docker compose --env-file .env.base --file docker-compose.yaml up
```

This starts the services defined in your `docker-compose.yaml` file, including `isaac-lab-template`. To run it in detached mode (in the background), use:

```bash
docker compose --env-file .env.base --file docker-compose.yaml up -d
```

To run commands inside the running container, use `exec`:

```bash
docker exec --interactive --tty -e DISPLAY=${DISPLAY} isaac-lab-template /bin/bash
```

When you are done or want to stop the running containers, bring down the services:

```bash
docker compose --env-file .env.base --file docker-compose.yaml down
```

This stops and removes the containers but keeps the images.
Copy the `docker/cluster_extension` folder to `IsaacLab/docker`.
Following https://isaac-sim.github.io/IsaacLab/main/source/deployment/cluster.html, configure the cluster parameters and export to a singularity image by running under `IsaacLab`:

```bash
./docker/cluster_extension/cluster_interface.sh push
```

By default, this exports the image named `isaac-lab-template`.
The data in `source/spinal_surgery/spinal_surgery/assets/data` can be huge. Instead of copying it along with the code every time a job is submitted, we can copy it to a path on the cluster, `/cluster/path/to/dir/data`, just once:

```bash
scp -r source/spinal_surgery/spinal_surgery/assets/data /cluster/path/to/dir/
```

Then specify in `docker/cluster_extension/.env.cluster`:

```
CLUSTER_DATA_PATH=/cluster/path/to/dir/data
```

The script will automatically link `/cluster/path/to/dir/data` to the `/workspace/isaac_extension_template/source/spinal_surgery/spinal_surgery/assets/data` folder.
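The linking step amounts to a symlink from the container workspace to the shared copy. If you ever need to recreate it by hand, a small sketch (the helper is hypothetical; the paths are the ones from above, so adjust them to your cluster):

```python
import os

def link_data(shared_data, workspace_data):
    """Symlink the shared dataset copy into the extension's assets folder."""
    if os.path.islink(workspace_data):
        os.unlink(workspace_data)  # replace a stale link
    os.symlink(shared_data, workspace_data)

# link_data(
#     "/cluster/path/to/dir/data",
#     "/workspace/isaac_extension_template/source/spinal_surgery/spinal_surgery/assets/data",
# )
```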
First import your extension template in the file specified by `CLUSTER_PYTHON_EXECUTABLE` in `.env.cluster` by adding `import spinal_surgery`.
Then you can submit the job with:

```bash
./docker/cluster_extension/cluster_interface.sh job "argument1" "argument2" ...
```

By default, this submits a job with the image `isaac-lab-template`. You can change `profile="template"` to your own profile name at line 138 of `docker/cluster_extension/cluster_interface.sh`.
We have a pre-commit template to automatically format your code. To install pre-commit:

```bash
pip install pre-commit
```

Then you can run pre-commit with:

```bash
pre-commit run --all-files
```

In some VS Code versions, the indexing of part of the extensions is missing. In this case, add the path to your extension in `.vscode/settings.json` under the key `"python.analysis.extraPaths"`:
```json
{
    "python.analysis.extraPaths": [
        "<path-to-ext-repo>/source/spinal_surgery"
    ]
}
```

If you encounter a crash in Pylance, it is likely that too many files are being indexed and you have run out of memory.
A possible solution is to exclude some of the omniverse packages that are not used in your project.
To do so, modify `.vscode/settings.json` and comment out packages under the key `"python.analysis.extraPaths"`.
Some examples of packages that can likely be excluded are:

```json
"<path-to-isaac-sim>/extscache/omni.anim.*"       // Animation packages
"<path-to-isaac-sim>/extscache/omni.kit.*"        // Kit UI tools
"<path-to-isaac-sim>/extscache/omni.graph.*"      // Graph UI tools
"<path-to-isaac-sim>/extscache/omni.services.*"   // Services tools
...
```