English | 简体中文
Longfei Liu\*‡, Yongjie Hou\*, Yang Li\*, Qirui Wang\*, Youyang Sha, Yongjun Yu, Yinzhi Wang, Peizhe Ru, Xuanlong Yu†, Xi Shen†
* Equal Contribution ‡ Project Leader † Corresponding Author
- [2026-04-20] Our previous version, DEIMv2, was used by two winning teams at the CVPR 2026 Maritime Computer Vision Workshop, taking home 2nd place in the Thermal Object Detection Challenge and 3rd place in the Vision-to-Chart Data Association Challenge.
- [2026-03-21] We have uploaded our models on 🤗 Hugging Face.
- [2026-03-19] Initial release of EdgeCrafter.
We have uploaded our models on 🤗 Hugging Face! You can also access these models via hf_models.ipynb. Have a try!
- Detection & Instance Segmentation: Instructions
- Pose Estimation: Instructions
Note: Latency is measured on an NVIDIA T4 GPU with batch size 1 under FP16 precision using TensorRT (v10.6).
| Model | Size | AP50:95 | #Params (M) | GFLOPs | Latency (ms) | Config | Log | Checkpoint |
|---|---|---|---|---|---|---|---|---|
| ECDet-S | 640 | 51.7 | 10 | 26 | 5.41 | config | log | model |
| ECDet-M | 640 | 54.3 | 18 | 53 | 7.98 | config | log | model |
| ECDet-L | 640 | 57.0 | 31 | 101 | 10.49 | config | log | model |
| ECDet-X | 640 | 57.9 | 49 | 151 | 12.70 | config | log | model |
| Model | Size | AP50:95 | #Params (M) | GFLOPs | Latency (ms) | Config | Log | Checkpoint |
|---|---|---|---|---|---|---|---|---|
| ECSeg-S | 640 | 43.0 | 10 | 33 | 6.96 | config | log | model |
| ECSeg-M | 640 | 45.2 | 20 | 64 | 9.85 | config | log | model |
| ECSeg-L | 640 | 47.1 | 34 | 111 | 12.56 | config | log | model |
| ECSeg-X | 640 | 48.4 | 50 | 168 | 14.96 | config | log | model |
| Model | Size | AP50:95 | #Params (M) | GFLOPs | Latency (ms) | Config | Log | Checkpoint |
|---|---|---|---|---|---|---|---|---|
| ECPose-S | 640 | 68.9 | 10 | 30 | 5.54 | config | log | model |
| ECPose-M | 640 | 72.4 | 20 | 63 | 9.25 | config | log | model |
| ECPose-L | 640 | 73.5 | 34 | 112 | 11.83 | config | log | model |
| ECPose-X | 640 | 74.8 | 51 | 172 | 14.31 | config | log | model |
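The latency numbers above come from TensorRT on a T4 GPU, but the measurement protocol itself (warm up first, then time repeated single-batch calls) is easy to reproduce. Below is an illustrative, framework-agnostic harness, not the authors' benchmarking script; substitute a real model forward pass for the dummy workload:

```python
import time
import statistics

def measure_latency(fn, warmup=10, iters=100):
    """Time repeated calls to `fn` after a warmup phase.

    Mirrors the usual batch-size-1 latency protocol: warmup runs stabilize
    caches and clocks, then each timed call is recorded in milliseconds.
    """
    for _ in range(warmup):  # warmup iterations are discarded
        fn()
    samples = []
    for _ in range(iters):
        t0 = time.perf_counter()
        fn()
        samples.append((time.perf_counter() - t0) * 1000.0)  # ms
    return statistics.mean(samples), statistics.median(samples)

# Dummy workload standing in for a model forward pass:
mean_ms, median_ms = measure_latency(lambda: sum(range(10_000)))
```

Note that GPU timing additionally requires synchronizing the device before reading the clock; the sketch above only shows the host-side structure.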
```bash
# Create conda environment
conda create -n ec python=3.11 -y
conda activate ec

# Install dependencies
pip install -r requirements.txt
```

The easiest way to test EdgeCrafter is to run inference on a sample image using a pre-trained model.
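After installing, a quick import check confirms the environment is usable before moving on. This is an illustrative stdlib-only helper; the package names are an assumption about what `requirements.txt` contains, so adjust them to the actual file:

```python
import importlib.util

def check_install(packages=("torch", "torchvision")):
    """Return a dict mapping each package name to whether it is importable.

    Package names are a guess at the repo's dependencies, not confirmed.
    Uses find_spec so nothing is actually imported.
    """
    return {name: importlib.util.find_spec(name) is not None
            for name in packages}

# e.g. check_install(("sys",)) -> {"sys": True}
```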
```bash
# 1. Download a pre-trained model (e.g., ECDet-L)
cd ecdetseg
wget https://github.com/capsule2077/edgecrafter/releases/download/edgecrafterv1/ecdet_l.pth

# 2. Run PyTorch inference
# Make sure to replace `path/to/your/image.jpg` with an actual image path
python tools/inference/torch_inf.py -c configs/ecdet/ecdet_l.yml -r ecdet_l.pth -i path/to/your/image.jpg
```

This project is released under the Apache 2.0 License.
We thank the authors of the following open-source projects that made this work possible: RT-DETR, D-FINE, DEIM, lightly-train, DETRPose, RF-DETR, and DINOv3.
If you find this project useful in your research, please consider citing:
```bibtex
@article{liu2026edgecrafter,
  title={EdgeCrafter: Compact ViTs for Edge Dense Prediction via Task-Specialized Distillation},
  author={Liu, Longfei and Hou, Yongjie and Li, Yang and Wang, Qirui and Sha, Youyang and Yu, Yongjun and Wang, Yinzhi and Ru, Peizhe and Yu, Xuanlong and Shen, Xi},
  journal={arXiv},
  year={2026}
}
```