Releases · IDEA-Research/detrex
detrex v0.5.0 Release
Support New Algorithms and Benchmarks, including:
- Support Focus-DETR (ICCV'2023)
- Support SQR-DETR (CVPR'2023), credits to Fangyi Chen
- Support EVA-01 (CVPR'2023 Highlight)
- Support EVA-02 (ArXiv'2023)
- Support DINO-EVA benchmarks, including dino-eva-01 and dino-eva-02 with LSJ augmentation.
- Support Align-DETR (ArXiv'2023), credits to Zhi Cai
All the pretrained DINO-EVA checkpoints can be downloaded from the Huggingface Space.
detrex v0.4.0 Release
Main updates
- Support CO-MOT for end-to-end multi-object tracking, contributed by Feng Yan.
- Release DINO with optimized hyper-parameters, which achieves 50.0 AP under the 1x setting.
- Release pretrained DINO based on InternImage and ConvNeXt-1K pretrained backbones.
- Release Deformable-DETR-R50 pretrained weights.
- Release DETA and better H-DETR pretrained weights, achieving 50.2 AP and 49.1 AP respectively.
Pretrained Models
- DETA-R50-5scale-12ep (bs=8): 50.0 AP
- DETA-R50-5scale-12ep (aligned hyper-params): 49.9 AP
- DETA-R50-5scale-12ep (only freeze the stem of the backbone): 50.2 AP
- H-Deformable-DETR-two-stage-R50-12ep (aligned optimizer hyper-params): 49.1 AP
- DINO-R50-4scale-12ep (aligned optimizer hyper-params): 49.4 AP
- DINO-Focal-3level-4scale-36ep: 58.3 AP
Benchmark ConvNeXt on DINO
- convnext-tiny-384: 52.4 AP
- convnext-small-384: 54.2 AP
- convnext-base-384: 55.1 AP
- convnext-large-384: 55.5 AP
Benchmark InternImage on DINO
- internimage-tiny: 52.3 AP
- internimage-small: 53.5 AP
- internimage-base: 54.7 AP
- internimage-large: 57.0 AP
Benchmark FocalNet on DINO
- focalnet-tiny
- focalnet-small
- focalnet-base
Other pre-trained weights
- Deformable-DETR-R50: 44.9 AP (better than the 44.5 AP from the original repo)
- Group-DETR-R50-12ep: 37.8 AP
detrex v0.3.0 Release
What's New
New Algorithms
- Support Anchor-DETR
- Support DETA
More training techniques
- Support EMAHook during training by setting train.model_ema.enabled=True, which can further improve model performance (see the config sketch after this list).
- Fully support mixed-precision training by setting train.amp.enabled=True, which can reduce GPU memory usage by 20% to 30%.
- Support encoder-decoder checkpointing in DINO, which may reduce GPU memory usage by 30%. For more details about the checkpoint usage, please refer to PR #200.
- Support fast debugging by setting train.fast_dev_run=True.
- Support Slurm training scripts contributed by @rayleizhu; please check issue #213 for more details.
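As a rough illustration, the sketch below shows how these options could be toggled in a derived detrex LazyConfig file. The base config path projects/dino/configs/dino_r50_4scale_12ep.py and the exported node names are assumptions based on detrex's usual config layout, so verify them against the repository; the same keys can also be passed as key=value overrides on the training command line.

```python
# Minimal sketch, not an official config: toggling the training options above
# in a derived detrex LazyConfig. The base config path and exported node names
# are assumptions based on detrex's usual layout.
from projects.dino.configs.dino_r50_4scale_12ep import (  # assumed base config
    dataloader,
    lr_multiplier,
    model,
    optimizer,
    train,
)

# Keep an exponential moving average (EMA) of the weights during training,
# which the release notes say can further improve performance.
train.model_ema.enabled = True

# Enable automatic mixed precision to cut GPU memory usage by roughly 20-30%.
train.amp.enabled = True

# Run a tiny train/eval cycle first to surface config errors quickly
# (flag name taken verbatim from the release notes above).
train.fast_dev_run = True
```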
Release more than 10 pretrained checkpoints
Method | Pretrained | Epochs | Box AP |
---|---|---|---|
DETR-R50-DC5 | IN1k | 500 | 43.4 |
DETR-R101-DC5 | IN1k | 500 | 44.9 |
Anchor-DETR-R50 | IN1k | 50 | 42.2 |
Anchor-DETR-R50-DC5 | IN1k | 50 | 44.2 |
Anchor-DETR-R101 | IN1k | 50 | 43.5 |
Anchor-DETR-R101-DC5 | IN1k | 50 | 45.1 |
Conditional-DETR-R50-DC5 | IN1k | 50 | 43.8 |
Conditional-DETR-R101 | IN1k | 50 | 43.0 |
Conditional-DETR-R101-DC5 | IN1k | 50 | 45.1 |
DAB-DETR-R50-3patterns | IN1k | 50 | 42.8 |
DAB-DETR-R50-DC5 | IN1k | 50 | 44.6 |
DAB-DETR-R50-DC5-3patterns | IN1k | 50 | 45.7 |
DAB-DETR-R101-DC5 | IN1k | 50 | 45.7 |
DN-DETR-R50-DC5 | IN1k | 50 | 46.3 |
DINO with EMA | IN1k | 12 | 49.4 |
DETA-R50-5scale | IN1k | 12 | 50.1 |
DETA-Swin-Large | Objects365 | 24 | 62.9 |
Part of the pretrained weights are converted from their official repos, and all of them can be downloaded from the detrex Model Zoo.
MaskDINO Release
- Release the MaskDINO source code: MaskDINO
- The detrex version can be found in projects/maskdino
detrex v0.2.1 Release
Highlights
- DINO has been accepted to ICLR 2023!
- Thanks a lot to @powermano for providing a detailed guide on ONNX export in detrex. Please see issue #192
What's New
New Algorithm
- MaskDINO COCO instance-seg/panoptic-seg pre-release #154
New Features
- New baselines for Res/Swin-DINO-5scale, ViTDet-DINO, FocalNet-DINO, etc. #138, #155
- Support FocalNet backbone #145
- Support Swin-V2 backbone #152
Documentation
- Add ViTDet / FocalNet download links and a usage example; please refer to Download Pretrained Weights.
- Add a tutorial on how to verify the correct installation of detrex. #194
Bug Fixes
- Fix the demo confidence filter so that it does not remove mask predictions #156
Code Refinement
- Make the logging info for the criterion and matcher more readable #151
- Modify the learning-rate scheduler config usage and add a fundamental scheduler configuration #191
New Pretrained Models
All the pretrained weights can be downloaded from the detrex Model Zoo
DINO
Method | Pretrained | Epochs | Box AP |
---|---|---|---|
DINO-ViTDet-Base-4scale | MAE | 12 | 50.2 |
DINO-ViTDet-Base-4scale | MAE | 50 | 55.0 |
DINO-ViTDet-Large-4scale | MAE | 12 | 50.2 |
DINO-ViTDet-Large-4scale | MAE | 50 | 55.0 |
DINO-FocalNet-Large-3level-4scale | IN22k | 12 | 57.5 |
DINO-FocalNet-Large-4level-4scale | IN22k | 12 | 58.0 |
DINO-FocalNet-Large-4level-5scale | IN22k | 12 | 58.5 |
detrex v0.2.0 Release
What's New
- Rebuild cleaner config files for projects.
- Support H-Deformable-DETR, thanks a lot to @JiaDingCN.
- Release H-Deformable-DETR pretrained weights, including H-Deformable-DETR-R50, H-Deformable-DETR-Swin-Tiny, and H-Deformable-DETR-Swin-Large.
- Add a demo for visualizing customized input images or videos using pretrained weights; please check our documentation about the usage (a rough inference sketch follows this list).
- Release new baselines for DINO-Swin-Large-36ep, DAB-Deformable-DETR-R50-50ep, and DAB-Deformable-DETR-Two-Stage-50ep.
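For readers who prefer to script inference directly rather than use the demo, the snippet below is a rough sketch built on detectron2's LazyConfig utilities. The config path, checkpoint path, and the assumption that the model accepts detectron2-style input dicts are unverified assumptions, and it skips the test-time resizing the real demo applies.

```python
# Rough sketch, not the official demo: single-image inference with a pretrained
# detrex model via detectron2's LazyConfig utilities. Config/checkpoint paths are
# assumptions, and the real demo additionally applies test-time resizing.
import torch
from detectron2.checkpoint import DetectionCheckpointer
from detectron2.config import LazyConfig, instantiate
from detectron2.data.detection_utils import read_image

cfg = LazyConfig.load("projects/dino/configs/dino_r50_4scale_12ep.py")  # assumed path
model = instantiate(cfg.model).eval().to("cuda")
DetectionCheckpointer(model).load("path/to/pretrained_weights.pth")  # assumed checkpoint

image = read_image("input.jpg", format="BGR")  # HWC uint8 array
height, width = image.shape[:2]
inputs = [{
    "image": torch.as_tensor(image.transpose(2, 0, 1).copy()).float(),  # CHW tensor
    "height": height,
    "width": width,
}]
with torch.no_grad():
    outputs = model(inputs)  # detectron2-style: list of dicts with an "instances" field
print(outputs[0]["instances"].pred_boxes)
```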
New Pretrained Models
All the pretrained weights can be downloaded from the detrex Model Zoo
H-Deformable-DETR
Method | Pretrained | Epochs | Query Num | Box AP |
---|---|---|---|---|
H-Deformable-DETR-R50 + tricks | IN1k | 12 | 300 | 48.9 |
H-Deformable-DETR-R50 + tricks | IN1k | 36 | 300 | 50.3 |
H-Deformable-DETR-Swin-T + tricks | IN1k | 12 | 300 | 50.6 |
H-Deformable-DETR-Swin-T + tricks | IN1k | 36 | 300 | 53.5 |
H-Deformable-DETR-Swin-L + tricks | IN22k | 12 | 300 | 56.2 |
H-Deformable-DETR-Swin-L + tricks | IN22k | 36 | 300 | 57.5 |
H-Deformable-DETR-Swin-L + tricks | IN22k | 12 | 900 | 56.4 |
H-Deformable-DETR-Swin-L + tricks | IN22k | 36 | 900 | 57.7 |
DINO
Method | Pretrained | Epochs | Box AP |
---|---|---|---|
DINO-R50-4Scale-12ep | IN1k | 12 | 49.2 |
DAB-Deformable-DETR
Method | Pretrained | Epochs | Box AP |
---|---|---|---|
DAB-Deformable-DETR-R50 | IN1k | 50 | 49.0 |
DAB-Deformable-DETR-R50-Two-Stage | IN1k | 50 | 49.7 |
detrex v0.1.1 Release
What's New
- Add model analysis tools in tools.
- Support visualization of COCO eval results and annotations in tools.
- Support Group-DETR.
- Release more DINO training results in DINO.
- Release better Deformable-DETR baselines in Deformable-DETR.
- Fix ConvNeXt bugs.
- Add pretrained weights download links and usage in the documentation; see Download Pretrained Backbone Weights.
- Add documentation for tools, see Practical Tools and Scripts.
New Pretrained Models
All the pretrained weights can be downloaded from the detrex Model Zoo.
DINO
Method | Pretrained | Epochs | Box AP |
---|---|---|---|
DINO-R50-4Scale | IN1k | 24 | 50.60 |
DINO-R101-4Scale | IN1k | 12 | 49.95 |
DINO-Swin-Tiny-224-4Scale | IN1k | 12 | 51.30 |
DINO-Swin-Tiny-224-4Scale | IN22k to IN1k | 12 | 51.30 |
DINO-Swin-Small-224-4Scale | IN1k | 12 | 52.96 |
DINO-Swin-Base-384-4Scale | IN22k to IN1k | 12 | 55.83 |
DINO-Swin-Large-224-4Scale | IN22k to IN1k | 12 | 56.92 |
DINO-Swin-Large-384-4Scale | IN22k to IN1k | 12 | 56.93 |
Deformable-DETR
Method | Pretrained | Epochs | Box AP |
---|---|---|---|
Deformable-DETR-R50 + Box-Refinement | IN1k | 50 | 46.99 |
Deformable-DETR-R50 + Box-Refinement + Two-Stage | IN1k | 50 | 48.19 |