IMRL/OriField


Learning Orientation Field for OSM-Guided Autonomous Navigation

Abstract

OpenStreetMap (OSM) has recently gained popularity in autonomous navigation thanks to its public accessibility, low maintenance cost, and broad geographical coverage. However, existing methods often struggle with noisy OSM data and incomplete sensor observations, leading to inaccurate trajectory planning. These challenges are particularly evident in complex driving scenarios, such as at intersections or under occlusion. To address them, we propose a robust and explainable two-stage framework that learns an Orientation Field (OrField) for robot navigation by integrating LiDAR scans and OSM routes. In the first stage, we introduce a novel representation, OrField, which provides an orientation for each grid cell on the map by reasoning jointly over noisy LiDAR scans and OSM routes. To generate a robust OrField, we train a deep neural network that encodes a versatile initial OrField and outputs an optimized one. In the second stage, building on OrField, we propose two trajectory planners for OSM-guided robot navigation, Field-RRT* and Field-Bezier, which extend the Rapidly-exploring Random Tree (RRT) algorithm and the Bezier curve, respectively, to estimate trajectories. Because OrField captures both global and local information, Field-RRT* and Field-Bezier generate accurate and reliable trajectories even in challenging conditions. We validate our approach through experiments on the SemanticKITTI dataset and our own campus dataset. The results demonstrate the effectiveness of our method, which achieves superior performance in complex and noisy conditions. The code for network training and real-world deployment will be released.
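To make the OrField idea concrete, here is a toy sketch of the core signal: one preferred heading per grid cell, and a cost that penalizes motion directions disagreeing with the field. Everything in this snippet is illustrative (the grid, the cost function, and all names are ours, not the paper's implementation); in the actual pipeline, OrField is predicted by a network from LiDAR scans and OSM routes rather than hand-built.

```python
import math

# Toy orientation field: one preferred heading (radians) per grid cell.
# Left half of the map prefers +x, right half prefers +y.
H, W = 8, 8
orfield = [[0.0 if c < W // 2 else math.pi / 2 for c in range(W)]
           for r in range(H)]

def alignment_cost(cell, heading):
    """0 when the motion heading matches the field, 2 when it opposes it."""
    r, c = cell
    return 1.0 - math.cos(heading - orfield[r][c])

# A planner in the spirit of Field-RRT* could add such a term to its edge
# cost, so tree extensions that run against the field are penalized.
print(alignment_cost((0, 0), 0.0))      # heading aligned with the field
print(alignment_cost((0, 0), math.pi))  # heading opposing the field
```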

PDF | Video | Dataset | Supplementary Material

News

  • 2025-03-18: The paper was submitted to the Journal of Field Robotics.
  • 2025-08-14: The paper was accepted to the Journal of Field Robotics.
  • 2026-02-23: The source code, pre-trained models, and Gradio app are released.

Orientation Field

The implementation of the 'Orientation Field' pipeline for navigation.

This is the official implementation for Learning Orientation Field for OSM-Guided Autonomous Navigation. We provide the source code, pre-trained models, and instructions to reproduce our results.

To-Do Checklist

This is a list of features we plan to add or improvements we are working on.

  • Initial release of the source code and pre-trained models.
  • Elaborate README.md with instructions for training and inference.
  • Provide a Gradio app to demo OriField.
  • Release ROS implementation.
  • Add support for multi-GPU training.
  • Create a Dockerfile for easier environment setup and deployment.
  • Add more pre-trained models for different backbones.

Installation

Follow these steps to set up the project environment. This project is tested on Ubuntu 18.04 with Python 3.9.7 and PyTorch 1.9.1.

  1. Clone the repository:

    git clone [email protected]:IMRL/OriField.git
    cd OriField
  2. Create a virtual environment (recommended):

    conda create -n OriField python=3.9
    conda activate OriField
  3. Install dependencies:

    Install PyTorch

    pip install torch==1.9.1+cu111 torchvision==0.10.1+cu111 torchaudio==0.9.1 -f https://download.pytorch.org/whl/torch_stable.html

    Install Detectron2

    python -m pip install detectron2==0.6 -f https://dl.fbaipublicfiles.com/detectron2/wheels/cu111/torch1.9/index.html

    Other required packages are listed in requirements.txt.

    pip install -r requirements.txt
  4. Install the pybind bindings. This installs the C++ implementation of the BEV projection and the Field Planner:

    pip install -v -e .
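After the steps above, a quick sanity check can confirm that the pinned dependencies resolved. This sketch only probes the package names taken from the install commands; it does not import them, so it is cheap and safe to run:

```python
import importlib.util

# Probe the packages pinned in the install steps above. find_spec() only
# locates a package on the path; it does not execute its import machinery.
status = {
    name: importlib.util.find_spec(name) is not None
    for name in ("torch", "torchvision", "detectron2")
}
for name, ok in status.items():
    print(f"{name}: {'found' if ok else 'MISSING'}")
```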

Start Training

python train_net.py

See run.sh or run_um.sh for specific settings.

Run Inference

You can use the pre-trained models in this repository to perform inference:

python export_model.py

See run.sh or run_um.sh for specific settings.

Gradio App

python app.py --config-file ./output.example/config.yaml

Acknowledgements

This project is built upon the excellent work of several other open-source projects. We are deeply grateful to their authors and contributors. Our implementation is based on the following repositories:

  • trajectory-prediction: We utilized their BEV projection pipeline and adapted it for our project.

  • PMF: We utilized their point cloud data processing pipeline and adapted it for our project.

  • MaskFormer: We utilized their model training pipeline and adapted it for our project.

License

This project is licensed under the MIT License. See the LICENSE file for more details.

About

This is the code repository for the JFR paper "Learning Orientation Field for OSM-Guided Autonomous Navigation".
