This repository contains the implementation of the paper "DCEA: DETR With Concentrated Deformable Attention for End-to-End Ship Detection in SAR Images".
Ensure the following dependencies are installed:
```
python==3.8.0
torch==2.0.1
torchvision==0.15.2
onnx==1.14.0
onnxruntime==1.15.1
pycocotools
PyYAML
scipy
```
You can install these dependencies using:
```
pip install -r requirements.txt
```

To ensure seamless integration, prepare your dataset in the COCO standard format as outlined below.
- Place the dataset in the following path: `configs/dataset/coco/`.
- Structure the dataset files as follows:
```
coco/
  annotations/   # COCO annotation JSON files
  train2017/     # training images
  val2017/       # validation images
```
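Before training, it can help to confirm the dataset directory matches the layout above. The sketch below is a minimal stdlib-only check; the `check_coco_layout` helper and the default root path are illustrative, not part of the repository.

```python
from pathlib import Path

def check_coco_layout(root: str) -> list:
    """Return the expected COCO subdirectories missing under `root`.

    The expected entries (annotations/, train2017/, val2017/) follow the
    layout described above; the helper itself is a hypothetical utility.
    """
    root_path = Path(root)
    expected = ["annotations", "train2017", "val2017"]
    return [name for name in expected if not (root_path / name).is_dir()]

missing = check_coco_layout("configs/dataset/coco")
if missing:
    print("Missing dataset entries:", ", ".join(missing))
else:
    print("COCO layout looks complete.")
```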
To train the model, use:
```
python train.py -c path/to/config -r path/to/checkpoint
```

Replace `path/to/config` with the path to your configuration file, and `path/to/checkpoint` with the path to an existing checkpoint if resuming training (optional).
To evaluate the model, run:
```
python train.py -c path/to/config -r path/to/checkpoint --test-only
```

Adding `--test-only` will run evaluation only, without further training.
For inference, use:
```
python inference.py
```

Before running inference, configure `inference.py` with the correct paths and parameters as needed.
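A typical configuration step is preparing the input tensor the model expects. The sketch below shows one plausible preprocessing routine; the 640x640 input size and the `preprocess` function are assumptions for illustration, not values taken from `inference.py`.

```python
import numpy as np

def preprocess(image: np.ndarray, size: int = 640) -> np.ndarray:
    """Convert an HxWx3 uint8 image to a 1x3xSxS float32 tensor in [0, 1].

    Nearest-neighbor resizing keeps the sketch dependency-free; the
    640x640 target size is an assumption, not the repository's setting.
    """
    h, w = image.shape[:2]
    rows = np.arange(size) * h // size            # source row indices
    cols = np.arange(size) * w // size            # source column indices
    resized = image[rows][:, cols]                # nearest-neighbor resize
    tensor = resized.astype(np.float32) / 255.0   # scale to [0, 1]
    tensor = tensor.transpose(2, 0, 1)[None]      # HWC -> 1xCxHxW
    return tensor

batch = preprocess(np.zeros((512, 768, 3), dtype=np.uint8))
print(batch.shape)  # (1, 3, 640, 640)
```

The batched `1x3xSxS` layout matches what DETR-style PyTorch and ONNX Runtime models conventionally consume.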
This project is released under the MIT License.
This implementation is based on the DETR and RT-DETR frameworks. We thank the original authors for their contributions.