LiveNet is a Robust, Minimally Invasive Multi-Robot Controller for Safe and Live Navigation in Constrained Environments. This project is developed by Srikar Gouru, Sid Lakkoju, and Rohan Chandra.
Visit our website or read the paper for more info!
As defined in scenarios.py, the repository tests two scenarios: a Doorway scenario and an Intersection scenario. The environment uses position (x, y), heading, and velocity as the state vector, and linear acceleration and angular velocity as the control vector. The model itself is defined in models.py and is implemented using PyTorch and CVXOPT. Model training is done in train.py, and main.py runs either the trained model or the MPC algorithm on custom scenarios. Various parameters for tuning the environment, MPC setup, or model setup are located in config.py.
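The state/control layout above corresponds to standard unicycle dynamics. As a minimal sketch (the `step` function and the timestep `dt` are illustrative, not the repository's actual integrator):

```python
import numpy as np

def step(state, control, dt=0.1):
    """Advance one unicycle state [x, y, heading, v] by one timestep.

    control is [a, omega]: linear acceleration and angular velocity,
    matching the state/control vectors described above. dt is an
    assumed timestep, not a value taken from config.py.
    """
    x, y, theta, v = state
    a, omega = control
    return np.array([
        x + v * np.cos(theta) * dt,   # position advances along the heading
        y + v * np.sin(theta) * dt,
        theta + omega * dt,           # heading changes with angular velocity
        v + a * dt,                   # speed changes with linear acceleration
    ])

# A robot heading along +x at 1 m/s with zero control input:
s = step(np.array([0.0, 0.0, 0.0, 1.0]), np.array([0.0, 0.0]))
```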
Various utility scripts also exist to ease the development process. datagen.py and data_logger.py can be used to generate testing data by running the MPC algorithm on various scenarios. plotter.py uses Matplotlib to visualize the scenario live or at the end of a run, or to save it as an MP4 animation.
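The data-generation workflow amounts to rolling out a controller and recording per-step states and controls. A minimal sketch of such logging, assuming the state/control layout above (the function name and CSV schema are hypothetical, not the repository's actual format):

```python
import csv

def log_trajectory(path, states, controls):
    """Write per-step states [x, y, heading, v] and controls [a, omega]
    to a CSV file. Column names are illustrative only."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["x", "y", "heading", "v", "a", "omega"])
        for s, u in zip(states, controls):
            writer.writerow(list(s) + list(u))
```

Logged rollouts like these can then serve both as training data for the learned model and as fixtures for offline visualization.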
To recreate the results presented in the paper, run_experiments.py and summarize_experiments.py can be used to run N = 50 iterations of any algorithm on the chosen scenario, and will output metrics on the number of collisions, number of deadlocks, makespan, change in velocity, path deviation, and compute time. To test robustness, run_suite.py and summarize_suite.py will run the algorithm on a suite of symmetric Doorway scenarios, each with varying initial or ending conditions. The weights of the LiveNet and BarrierNet models used for the paper can be found in the weights/ folder.
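To make the reported quantities concrete, here are illustrative definitions of three of them computed from a logged trajectory. The exact formulas used by summarize_experiments.py may differ; this is only a sketch under the state layout described above:

```python
import numpy as np

def summarize_run(states, dt=0.1):
    """Illustrative versions of three reported metrics: makespan (total run
    time), change in velocity (summed absolute velocity differences), and
    path deviation (max distance from the straight start-to-goal line).
    dt is an assumed timestep."""
    states = np.asarray(states, dtype=float)   # rows of [x, y, heading, v]
    makespan = (len(states) - 1) * dt          # number of steps x timestep
    delta_v = np.abs(np.diff(states[:, 3])).sum()
    start, goal = states[0, :2], states[-1, :2]
    d = (goal - start) / np.linalg.norm(goal - start)
    offsets = states[:, :2] - start
    # perpendicular distance of each waypoint from the start-goal line
    path_dev = np.abs(offsets[:, 0] * d[1] - offsets[:, 1] * d[0]).max()
    return {"makespan": makespan, "delta_v": delta_v, "path_dev": path_dev}
```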