yunfanLu/UniINR

If you like our project, please give us a star ⭐ on GitHub to follow the latest updates. For any questions, contact me via a GitHub Issue or by email (email preferred): ylu066 at connect.hkust-gz.edu.cn

UniINR is an advanced neural framework designed to handle rolling shutter correction, deblurring, and interpolation in a unified approach guided by event-based inputs. This repository provides the code, datasets, and pre-trained models for reproducing results from our ECCV 2024 paper.

Overview

Our framework, UniINR, aims to enhance video quality by addressing key challenges in event-guided imaging: rolling shutter distortion, motion blur, and frame interpolation. This repository includes all essential tools to train, test, and extend our approach.

UniINR Framework
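At the heart of the framework is an implicit neural representation that can be queried at arbitrary spatio-temporal coordinates, which is what lets a single model unify correction, deblurring, and interpolation. As a purely illustrative sketch (this is not the paper's actual architecture, and the class name and layer sizes are made up here), a minimal coordinate-based MLP mapping (x, y, t) to RGB looks like:

```python
import torch
import torch.nn as nn

class CoordMLP(nn.Module):
    """Toy implicit neural representation: maps (x, y, t) -> RGB.

    Illustrative only; UniINR's real network is described in the paper.
    """

    def __init__(self, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 3), nn.Sigmoid(),  # RGB in [0, 1]
        )

    def forward(self, coords):
        # coords: (N, 3) batch of normalized (x, y, t) samples
        return self.net(coords)

model = CoordMLP()
coords = torch.rand(1024, 3)  # random (x, y, t) queries in [0, 1]
rgb = model(coords)           # (1024, 3) predicted colors
```

Because the representation is continuous in time, querying it at per-row timestamps yields rolling-shutter-corrected frames, while querying at arbitrary intermediate times yields interpolated frames.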

Pre-trained Models and Logs

For quicker experiments, we provide pre-trained models along with their visualization results and log files. They should assist in replicating our results and running further experiments. Release Link: OneDrive

Dataset

Our work is built on two primary datasets used for training and testing:

  1. DeepUnrollNet Dataset (GitHub Link)
    • Since the original dataset might no longer be accessible, I provide this backup link.
  2. EvUnroll Dataset (GitHub Link)

Additionally, our training incorporates:

Simulating New Data

For generating new rolling shutter and blurred datasets, use the following scripts. Please note that both event simulation and rolling shutter simulation take the original high-frame-rate video as input.

  1. Rolling Shutter Blurred Frames:
    tools/1-rs-blur-dataset-generation/generate_rs_blur_frames_fastec.sh
  2. Rolling Shutter Sharp Frames:
    tools/1-rs-blur-dataset-generation/generate_rs_sharp_frames_fastec.sh

Or, customize dataset generation with:

python tools/1-rs-blur-dataset-generation/generate_rs_blur_frames.py \
    --dataset_path="./dataset/2-Fastec-Simulated/Train/" \
    --blur_accumulate_frames=<level of blur> \
    --blur_accumulate_step=<rolling step size> \
    --dataset="Fastec"
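To make the simulation concrete: a rolling-shutter blurred frame can be synthesized from a high-frame-rate stack by exposing each image row at a time offset and averaging several consecutive frames for that row. The sketch below is a minimal, hypothetical illustration of this idea, not the repository's actual code; the `blur_frames` and `row_step` parameters loosely mirror the `--blur_accumulate_frames` and `--blur_accumulate_step` flags above.

```python
import numpy as np

def simulate_rs_blur(frames, blur_frames, row_step):
    """Synthesize a rolling-shutter blurred image from a high-FPS stack.

    Illustrative sketch, not the repo's generator. Row r starts exposing
    at frame index r * row_step and integrates `blur_frames` frames.
    frames: (T, H, W) grayscale array.
    """
    T, H, W = frames.shape
    out = np.zeros((H, W), dtype=np.float64)
    for r in range(H):
        # Clamp so the exposure window stays inside the stack.
        start = min(r * row_step, T - blur_frames)
        out[r] = frames[start:start + blur_frames, r].mean(axis=0)
    return out

# Tiny demo on a synthetic 16-frame video where frame t is constant t:
video = np.stack([np.full((8, 8), t, dtype=np.float64) for t in range(16)])
rs_blur = simulate_rs_blur(video, blur_frames=4, row_step=1)
# Each row r averages frames r..r+3, so row r has value r + 1.5.
```

Increasing `blur_frames` lengthens the per-row exposure (more blur), while `row_step` controls how fast the shutter "rolls" down the image (more distortion).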

Training

To begin training, use the following command structure:

python egrsdb/main.py \
  --yaml_file=<YAML_FILE> \
  --log_dir=<LOG_DIR> \
  --alsologtostderr=True

Ensure that the specified YAML_FILE contains the correct configuration for your experiment setup.

Citation

If you find UniINR helpful in your research, please consider citing our paper:

@inproceedings{yunfanuniinr,
    title = {UniINR: Event-guided Unified Rolling Shutter Correction, Deblurring, and Interpolation},
    author = {Lu, Yunfan and Liang, Guoqiang and Wang, Yusheng and Wang, Lin and Xiong, Hui},
    booktitle = {European Conference on Computer Vision (ECCV)},
    year = {2024},
}

Acknowledgements

We would like to thank the authors of the following works for making their datasets available, which greatly facilitated our research:
