# Model Zoo

## Image Classification on CIFAR-10

| Model | Model Size (M) | FLOPs (M) | Top-1 Acc (%) | Inference Time (μs) | Inference Device | Download |
| --- | --- | --- | --- | --- | --- | --- |
| CARS-A | 7.72 | 469 | 95.923477 | 51.28 | V100 | tar |
| CARS-B | 8.45 | 548 | 96.584535 | 69.00 | V100 | tar |
| CARS-C | 9.32 | 620 | 96.744791 | 71.62 | V100 | tar |
| CARS-D | 10.5 | 729 | 97.055288 | 82.72 | V100 | tar |
| CARS-E | 11.3 | 786 | 97.245592 | 88.98 | V100 | tar |
| CARS-F | 16.7 | 1234 | 97.295673 | 244.07 | V100 | tar |
| CARS-G | 19.1 | 1439 | 97.375801 | 391.20 | V100 | tar |
| CARS-H | 19.7 | 1456 | 97.425881 | 398.88 | V100 | tar |
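
The CARS variants above trade accuracy against compute almost monotonically, so the table can be queried programmatically. A minimal sketch (the helper name is ours and only a subset of rows is shown; values are copied from the table above):

```python
# (name, FLOPs in M, top-1 accuracy %) for a few CARS models, from the table above.
cars = [
    ("CARS-A", 469, 95.923477),
    ("CARS-D", 729, 97.055288),
    ("CARS-H", 1456, 97.425881),
]

def best_under_budget(models, flops_budget_m):
    """Return the most accurate model whose FLOPs fit within the budget, or None."""
    feasible = [m for m in models if m[1] <= flops_budget_m]
    return max(feasible, key=lambda m: m[2]) if feasible else None

print(best_under_budget(cars, 800))  # CARS-D is the best fit under 800 MFLOPs
```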

## Image Classification on ImageNet

| Model | Model Size (M) | FLOPs (G) | Top-1 Acc (%) | Inference Time (s/iter) | Download |
| --- | --- | --- | --- | --- | --- |
| EfficientNet:B0 | 20.3 | 0.40 | 76.82 | 0.0088 | tar |
| EfficientNet:B4 | 74.3 | 4.51 | 82.87 | 0.015 | tar |
| EfficientNet:B8:672 | 88 | 63 | 85.7 | 0.98 | tar |
| EfficientNet:B8:800 | 88 | 97 | 85.8 | 1.49 | tar |

## Detection on COCO-minival

| Model | Model Size (M) | FLOPs (G) | mAP | Inference Time (ms) | Inference Device | Download |
| --- | --- | --- | --- | --- | --- | --- |
| SM-NAS:E0 | 16.23 | 22.01 | 27.11 | 24.56 | V100 | tar |
| SM-NAS:E1 | 37.83 | 64.72 | 34.20 | 32.07 | V100 | tar |
| SM-NAS:E2 | 33.02 | 77.04 | 40.04 | 39.50 | V100 | tar |
| SM-NAS:E3 | 52.05 | 116.22 | 42.68 | 50.71 | V100 | tar |
| SM-NAS:E4 | 92 | 115.51 | 43.89 | 80.22 | V100 | tar |
| SM-NAS:E5 | 90.47 | 249.14 | 46.05 | 108.07 | V100 | tar |

## Detection on CULane

| Model | FLOPs (G) | F1 Score | Inference Time (ms) | Inference Device | Download |
| --- | --- | --- | --- | --- | --- |
| AutoLane: CULane-s | 66.5 | 71.5 | - | V100 | tar |
| AutoLane: CULane-m | 66.9 | 74.6 | - | V100 | tar |
| AutoLane: CULane-l | 273 | 75.2 | - | V100 | tar |

## Super-Resolution on Urban100, B100, Set14, Set5

| Model | Model Size (M) | FLOPs (G) | Urban100 PSNR | Urban100 SSIM | B100 PSNR | B100 SSIM | Set14 PSNR | Set14 SSIM | Set5 PSNR | Set5 SSIM | Inference Time (ms) | Download |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| ESR-EA:ESRN-V-1 | 1.32 | 40.616 | 31.65 | 0.8814 | 32.09 | 0.8802 | 33.37 | 0.8887 | 37.79 | 0.9566 | 29.38 | tar |
| ESR-EA:ESRN-V-2 | 1.31 | 40.21 | 31.69 | 0.8829 | 32.08 | 0.8810 | 33.37 | 0.8911 | 37.84 | 0.9569 | 31.25 | tar |
| ESR-EA:ESRN-V-3 | 1.31 | 41.676 | 31.47 | 0.8803 | 32.05 | 0.8789 | 33.35 | 0.8878 | 37.79 | 0.9570 | 21.78 | tar |
| ESR-EA:ESRN-V-4 | 1.35 | 40.17 | 31.58 | 0.8814 | 32.06 | 0.8810 | 33.35 | 0.8902 | 37.83 | 0.9567 | 30.98 | tar |
| SR_EA:M2Mx2-A | 3.20 | 196.27 | 32.20 | 0.8948 | 32.20 | 0.8842 | 33.65 | 0.8943 | 38.06 | 0.9588 | 11.41 | tar |
| SR_EA:M2Mx2-B | 0.61 | 35.03 | 31.77 | 0.8796 | 32.00 | 0.8989 | 33.32 | 0.8870 | 37.73 | 0.9562 | 8.55 | tar |
| SR_EA:M2Mx2-C | 0.24 | 13.49 | 30.92 | 0.8717 | 31.89 | 0.8783 | 33.13 | 0.8829 | 37.56 | 0.9556 | 5.59 | tar |
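
The PSNR columns above follow the standard definition (higher is better; SSIM is a separate structural-similarity metric). A minimal reference implementation, assuming 8-bit pixels flattened to a plain sequence — the repo's own evaluation scripts may differ (e.g. they may compute PSNR on the Y channel only):

```python
import math

def psnr(ref, img, peak=255.0):
    """Peak signal-to-noise ratio in dB between two equal-length pixel sequences."""
    mse = sum((r - t) ** 2 for r, t in zip(ref, img)) / len(ref)
    if mse == 0:
        return math.inf  # identical images: PSNR is unbounded
    return 10.0 * math.log10(peak ** 2 / mse)
```

A maximally wrong 8-bit pixel (0 vs. 255) gives 0 dB, while typical x2 super-resolution results land in the 30-38 dB range seen in the table.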

## Segmentation on VOC2012

| Model | Model Size (M) | FLOPs (G) | Params (K) | mIOU | Download |
| --- | --- | --- | --- | --- | --- |
| Adelaide | 10.6 | 0.5784 | 3822.14 | 0.7602 | tar |

## DNet

### Part 1

The six inference-time columns are all in ms: the first three are reported per input resolution (D-224p/D-512p/D-720p), the last three per runtime (Pytorch-V100/Caffe-V100/Caffe-CPU).

| Model | Accuracy (%) | FLOPS (G) | Params (M) | D-224p | D-512p | D-720p | Pytorch-V100 | Caffe-V100 | Caffe-CPU | Download |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| D-Net-21 | 61.51 | 0.21 | 2.59 | 2.02 | 2.84 | 6.11 | 3.02 | 2.26 | 22.30 | tar |
| D-Net-24 | 62.06 | 0.24 | 2.62 | 1.98 | 2.77 | 6.60 | 2.89 | 2.51 | 23.50 | tar |
| D-Net-30 | 64.49 | 0.30 | 3.16 | 1.95 | 2.81 | 7.22 | 2.86 | 2.71 | 27.30 | tar |
| D-Net-39 | 67.71 | 0.39 | 4.37 | 2.06 | 3.01 | 7.12 | 3.10 | 2.70 | 23.80 | tar |
| D-Net-40 | 66.92 | 0.40 | 3.94 | 1.98 | 2.96 | 6.42 | 2.97 | 2.48 | 17.60 | tar |
| D-Net-59 | 71.29 | 0.59 | 7.13 | 2.30 | 3.31 | 8.10 | 3.28 | 2.71 | 33.10 | tar |
| D-Net-94 | 70.23 | 0.94 | 5.80 | 2.19 | 3.54 | 8.75 | 2.93 | 2.84 | 39.10 | tar |
| D-Net-124 | 76.09 | 1.24 | 11.80 | 2.87 | 5.09 | 15.42 | 4.36 | 3.65 | 56.30 | tar |
| D-Net-147 | 71.91 | 1.47 | 10.47 | 2.50 | 5.28 | 23.84 | 2.29 | 2.24 | 45.20 | tar |
| D-Net-156 | 74.46 | 1.56 | 15.24 | 2.52 | 4.13 | 11.89 | 3.02 | 2.86 | 32.40 | tar |
| D-Net-159 | 75.21 | 1.59 | 19.13 | 3.05 | 4.71 | 12.76 | 4.55 | 3.78 | 43.30 | tar |
| D-Net-166 | 72.93 | 1.66 | 10.82 | 2.27 | 4.26 | 11.42 | 2.97 | 2.68 | 50.60 | tar |
| D-Net-167 | 74.18 | 1.67 | 10.56 | 2.51 | 4.21 | 12.03 | 2.92 | 2.84 | 43.60 | tar |
| D-Net-172 | 76.41 | 1.72 | 17.44 | 4.02 | 10.72 | 36.41 | 3.51 | 34.33 | 106.20 | tar |
| D-Net-177 | 75.55 | 1.77 | 19.48 | 3.65 | 5.40 | 14.09 | 5.66 | 4.64 | 58.80 | tar |
| D-Net-234 | 78.80 | 2.34 | 28.45 | 5.03 | 8.01 | 21.35 | 8.69 | 7.44 | 87.10 | tar |
| D-Net-263 | 76.87 | 2.63 | 21.42 | 3.42 | 6.04 | 19.13 | 4.44 | 4.08 | 90.40 | tar |
| D-Net-264 | 76.52 | 2.64 | 20.17 | 3.12 | 5.54 | 16.88 | 4.27 | 4.01 | 62.50 | tar |
| D-Net-275 | 78.28 | 2.75 | 30.76 | 4.09 | 10.56 | 34.76 | 4.22 | 4.03 | 96.60 | tar |
| D-Net-367 | 79.37 | 3.67 | 41.83 | 5.56 | 15.09 | 66.57 | 6.86 | 6.05 | 130.90 | tar |
| D-Net-394 | 77.91 | 3.94 | 25.15 | 3.35 | 7.79 | 24.97 | 4.38 | 4.12 | 75.80 | tar |
| D-Net-426 | 78.24 | 4.24 | 25.31 | 3.56 | 8.37 | 27.06 | - | - | - | tar |
| D-Net-504 | 78.96 | 5.04 | 28.47 | 3.57 | 9.07 | 30.32 | 4.59 | 4.90 | 93.50 | tar |
| D-Net-538 | 80.92 | 5.38 | 44.00 | 5.90 | 13.89 | 46.83 | 9.73 | 8.52 | 156.80 | tar |
| D-Net-572 | 80.41 | 5.72 | 49.29 | 6.24 | 18.56 | 87.48 | 5.17 | 5.54 | 182.20 | tar |
| D-Net-626 | 79.21 | 6.26 | 29.39 | 4.27 | 11.46 | 38.53 | 6.60 | 6.51 | 171.80 | tar |
| D-Net-662 | 80.83 | 6.62 | 70.45 | 7.84 | 23.57 | 116.79 | 6.67 | 6.51 | 163.70 | tar |
| D-Net-676 | 79.76 | 6.76 | 36.17 | 4.60 | 12.32 | 46.65 | 6.55 | 6.47 | 182.20 | tar |
| D-Net-695 | 79.53 | 6.95 | 29.38 | 5.25 | 12.33 | 40.84 | 8.75 | 8.31 | 160.70 | tar |
| D-Net-834 | 80.23 | 8.34 | 46.10 | 5.53 | 13.19 | 42.65 | 8.11 | 8.68 | 262.50 | tar |
| D-Net-876 | 81.67 | 8.76 | 47.83 | 14.87 | 41.51 | 150.69 | 19.05 | 16.23 | 317.90 | tar |
| D-Net-1092 | 80.39 | 10.92 | 42.21 | 5.18 | 17.18 | 80.49 | 7.11 | 7.68 | 232.50 | tar |
| D-Net-1156 | 80.61 | 11.56 | 43.03 | 5.34 | 17.92 | 83.32 | 7.31 | 8.02 | 260.50 | tar |
| D-Net-1195 | 80.63 | 11.95 | 45.49 | 5.55 | 18.40 | 85.05 | 7.95 | 8.63 | 259.10 | tar |
| D-Net-1319 | 81.38 | 13.19 | 72.44 | 8.08 | 19.88 | 63.23 | 14.14 | 14.15 | 300.40 | tar |
| D-Net-1414 | 81.22 | 14.14 | 79.39 | 8.05 | 21.49 | 76.60 | 12.34 | 12.17 | 251.90 | tar |
| D-Net-1549 | 81.11 | 15.49 | 51.96 | 6.37 | 22.53 | 112.33 | 8.35 | 9.51 | 295.50 | tar |
| D-Net-1772 | 81.52 | 17.72 | 77.81 | 7.67 | 28.05 | 128.57 | 11.10 | 12.29 | 357.60 | tar |
| D-Net-1822 | 82.08 | 18.22 | 103.00 | 11.80 | 50.53 | 298.63 | 9.51 | 12.11 | 434.10 | tar |
| D-Net-2354 | 82.65 | 23.54 | 130.45 | 20.94 | 77.97 | 397.44 | 19.08 | 21.13 | 670.70 | tar |
| D-Net-2524 | 82.04 | 25.24 | 76.66 | 11.20 | 35.08 | 129.15 | 18.71 | 19.39 | 504.90 | tar |
| D-Net-2763 | 82.42 | 27.63 | 87.34 | 12.19 | 38.15 | 140.61 | 19.96 | 21.15 | 599.60 | tar |
| D-Net-2883 | 82.38 | 28.83 | 93.44 | 12.25 | 39.51 | 146.81 | 20.05 | 21.54 | 554.10 | tar |

### Part 2

| model_name | input | performance | Params (M) | FLOPS (G) | Acts (M) | bs1 | bs8 | bs64 | 224_1 | 224_8 | 512_1 | 512_8 | 720_1 | 720_8 | ptm_time_bs1 | ptm_time_bs8 | Download |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| `2-_32_11-12-112-1121112` | 224 | 61.51 | 2.59 | 0.20 | 1.07 | 3.35 | 3.33 | 5.11 | 2.00 | 3.40 | 2.74 | 11.63 | 6.15 | 37.03 | 0.50 | 2.22 | pth |
| `2-_32_2-11-112-1121112` | 224 | 62.06 | 2.62 | 0.23 | 1.27 | 3.23 | 3.25 | 5.30 | 1.87 | 3.58 | 2.74 | 11.64 | 6.59 | 41.28 | 0.53 | 2.42 | pth |
| `2-_32_2-11-1212-111112` | 224 | 64.49 | 3.16 | 0.29 | 1.38 | 2.52 | 3.11 | 5.94 | 2.04 | 3.62 | 2.92 | 12.88 | 7.15 | 45.50 | 0.60 | 2.65 | pth |
| `031-_32_1-1-221-11121` | 224 | 66.92 | 3.94 | 0.39 | 1.28 | 3.12 | 2.69 | 5.92 | 2.05 | 3.53 | 2.84 | 12.08 | 6.33 | 38.83 | 0.66 | 2.63 | pth |
| `011-_32_2-1-221-11121` | 224 | 67.71 | 4.37 | 0.38 | 1.56 | 3.52 | 3.38 | 6.48 | 2.05 | 3.80 | 2.99 | 13.06 | 7.19 | 45.45 | 0.75 | 3.08 | pth |
| `010-_64_1-211-2-11112` | 224 | 70.23 | 5.80 | 0.93 | 2.13 | 3.53 | 3.33 | 10.44 | 2.12 | 4.82 | 3.41 | 18.64 | 8.93 | 68.09 | 1.04 | 4.49 | pth |
| `011-_32_2-211-2-111122` | 224 | 71.29 | 7.13 | 0.58 | 1.96 | 3.78 | 3.91 | 8.94 | 2.39 | 4.56 | 3.23 | 15.13 | 8.01 | 57.70 | 1.11 | 4.05 | pth |
| `011-_32_2-121-1121-11112111` | 224 | 71.27 | 7.59 | 0.66 | 2.10 | 5.14 | 4.12 | 9.48 | 2.69 | 5.19 | 3.65 | 16.70 | 8.69 | 62.01 | 1.19 | 4.39 | pth |
| `2-_64_2-211-2-11112` | 224 | 71.91 | 10.47 | 1.46 | 3.04 | 2.46 | 3.16 | 15.53 | 2.57 | 10.70 | 5.31 | 38.41 | 23.90 | 151.78 | 1.69 | 6.57 | pth |
| `031-_64_1-211-2-11112` | 224 | 74.18 | 10.56 | 1.66 | 3.07 | 3.26 | 3.48 | 16.38 | 2.65 | 7.43 | 4.25 | 27.33 | 12.33 | 99.13 | 1.71 | 6.83 | pth |
| `020-_64_1-211-2-11112` | 224 | 72.93 | 10.82 | 1.65 | 2.45 | 3.28 | 3.34 | 14.08 | 2.40 | 6.47 | 4.16 | 24.49 | 11.53 | 86.73 | 1.61 | 5.83 | pth |
| `10001-_48_41-1-221-11121` | 224 | 76.09 | 11.80 | 1.21 | 5.03 | 4.93 | 5.12 | 20.30 | 2.78 | 7.72 | 5.13 | 32.45 | 15.27 | 115.47 | 2.29 | 10.36 | pth |
| `211-_32_41-211-121-11111121` | 224 | 76.29 | 12.85 | 1.42 | 4.19 | 4.92 | 4.46 | 20.04 | 3.36 | 11.30 | 5.97 | 39.53 | 18.11 | 145.97 | 2.20 | 8.88 | pth |
| `333-a01_64_111-2111-211111-211` | 224 | 76.86 | 13.47 | 2.34 | 8.08 | 6.97 | 6.75 | 35.23 | 4.99 | 23.57 | 12.69 | 96.26 | 39.08 | 345.12 | 2.96 | 14.54 | pth |
| `31311-a02a12_64_211-2111-211111-211` | 224 | 78.04 | 14.17 | 2.32 | 8.46 | 7.89 | 8.19 | 40.55 | 5.62 | 22.02 | 12.19 | 96.61 | 40.86 | 359.24 | 3.47 | 18.11 | pth |
| `031-_64_1-1-221-11121` | 224 | 74.46 | 15.24 | 1.55 | 2.56 | 3.50 | 3.85 | 14.54 | 2.53 | 8.30 | 4.12 | 25.37 | 11.80 | 94.53 | 2.05 | 6.20 | pth |
| `421-_64_4-11-1212-111112` | 224 | 76.41 | 17.44 | 1.69 | 6.52 | 3.27 | 4.98 | 30.92 | 4.12 | 21.48 | 10.73 | 82.45 | 36.27 | 304.81 | 5.19 | 17.47 | pth |
| `031-_32_1-1211-112111112-22` | 224 | 75.21 | 19.13 | 1.57 | 2.47 | 5.11 | 5.93 | 15.81 | 3.15 | 8.89 | 4.77 | 25.03 | 12.55 | 93.93 | 2.39 | 6.35 | pth |
| `031-_32_111-111121-1121111112-22` | 224 | 75.55 | 19.48 | 1.75 | 3.22 | 6.20 | 6.08 | 18.68 | 3.66 | 9.43 | 5.28 | 28.24 | 14.32 | 109.11 | 2.60 | 7.76 | pth |
| `031-_64_11-211-121-11111121` | 224 | 76.52 | 20.17 | 2.63 | 4.08 | 5.02 | 5.19 | 23.21 | 3.41 | 11.11 | 5.53 | 37.70 | 16.86 | 139.62 | 2.90 | 9.88 | pth |
| `011-_64_21-211-121-11111121` | 224 | 76.87 | 21.42 | 2.61 | 4.69 | 5.10 | 5.65 | 25.34 | 3.64 | 11.89 | 5.91 | 44.06 | 19.24 | 160.38 | 3.16 | 11.12 | pth |
| `031-_64_12-1111-11211112-2` | 224 | 78.16 | 24.42 | 4.56 | 6.10 | 4.81 | 6.93 | 38.58 | 3.49 | 14.01 | 8.75 | 66.67 | 29.43 | 248.20 | 3.77 | 15.15 | pth |
| `031-_64_1-1211-112111112-2` | 224 | 77.91 | 25.15 | 3.92 | 4.54 | 4.96 | 6.45 | 32.12 | 3.40 | 12.51 | 7.76 | 57.50 | 25.02 | 214.95 | 3.44 | 12.12 | pth |
| `031-_64_11-11211-112111112-2` | 224 | 78.24 | 25.31 | 4.24 | 5.29 | 4.98 | 6.54 | 35.19 | 3.56 | 13.27 | 8.37 | 62.53 | 27.06 | 232.30 | 3.65 | 13.63 | pth |
| `10001-_64_4-111-11122-1111111111111112` | 224 | 78.80 | 28.45 | 2.31 | 6.80 | 8.18 | 8.24 | 33.83 | 5.03 | 12.20 | 8.00 | 52.07 | 21.43 | 180.97 | 4.36 | 15.48 | pth |
| `031-_64_1-12112-111111112-2` | 224 | 78.96 | 28.47 | 5.01 | 5.47 | 4.78 | 7.41 | 39.57 | 3.60 | 14.71 | 9.04 | 70.00 | 30.32 | 263.07 | 3.96 | 14.76 | pth |
| `02031-a02_64_112-1-11111111121112-1` | 224 | 79.53 | 29.38 | 6.90 | 10.44 | 9.12 | 10.92 | 55.25 | 5.31 | 18.96 | 12.21 | 94.28 | 40.77 | 349.24 | 5.15 | 23.45 | pth |
| `011-128-111121111111-1211111112-11` | 224 | 79.21 | 29.39 | 6.23 | 7.45 | 7.22 | 9.66 | 48.53 | 4.20 | 17.58 | 11.28 | 86.41 | 38.45 | 327.18 | 4.69 | 20.20 | pth |
| `211-_64_4-11-1212-111112` | 224 | 78.28 | 30.76 | 2.72 | 5.52 | 4.50 | 6.50 | 30.58 | 4.31 | 20.67 | 10.58 | 78.84 | 34.65 | 283.53 | 4.28 | 13.36 | pth |
| `011-_128_111-21111-111112111121-111` | 224 | 79.76 | 36.17 | 6.73 | 7.88 | 6.90 | 10.41 | 52.64 | 4.70 | 20.58 | 12.35 | 96.16 | 46.64 | 395.22 | 5.55 | 21.88 | pth |
| `011-_128_1-1111211111121-111111111121-11` | 224 | 80.32 | 39.50 | 9.53 | 9.84 | 8.59 | 13.37 | 68.68 | 5.28 | 25.96 | 15.58 | 125.84 | 62.52 | 534.45 | 6.29 | 28.21 | pth |
| `201-a01_64_4-121-1121-11112111` | 224 | 79.37 | 41.83 | 3.64 | 7.18 | 7.29 | 9.55 | 42.98 | 5.54 | 28.78 | 15.05 | 120.06 | 66.54 | 561.49 | 5.86 | 18.67 | pth |
| `011-_128_1-11111211211-111111112111-1` | 224 | 80.39 | 42.21 | 10.88 | 9.61 | 6.61 | 14.08 | 72.45 | 5.18 | 26.55 | 17.07 | 141.86 | 80.55 | 676.20 | 6.52 | 29.56 | pth |
| `011-_128_1-111211111211-111111112111-1` | 224 | 80.61 | 43.03 | 11.52 | 10.21 | 7.03 | 14.89 | 76.31 | 5.34 | 27.80 | 17.85 | 148.11 | 83.31 | 699.94 | 6.76 | 31.42 | pth |
| `10001-_64_4-111111111-211112111112-11111` | 224 | 80.92 | 44.00 | 5.33 | 10.94 | 8.90 | 12.83 | 63.61 | 5.82 | 29.65 | 13.83 | 111.33 | 46.99 | 419.47 | 6.90 | 26.26 | pth |
| `201-a01_64_41-121-11121-111112111` | 224 | 79.76 | 44.07 | 4.14 | 8.61 | 8.33 | 10.79 | 49.85 | 6.19 | 31.42 | 16.93 | 135.30 | 76.74 | 650.75 | 6.51 | 22.18 | pth |
| `011-_128_1-1111121111211-11111111112111-1` | 224 | 80.63 | 45.49 | 11.90 | 10.51 | 8.14 | 15.39 | 79.23 | 5.78 | 28.85 | 18.57 | 152.05 | 84.91 | 717.28 | 7.07 | 32.50 | pth |
| `011-_64_211-2111-21111111111111111111111-211` | 224 | 80.23 | 46.10 | 8.30 | 8.46 | 8.58 | 12.26 | 61.03 | 5.55 | 22.94 | 13.34 | 101.35 | 42.82 | 399.46 | 6.58 | 25.46 | pth |
| `32341-a02c12_64_111-2111-21111111111111111111111-211` | 224 | 81.67 | 47.83 | 8.67 | 23.51 | 16.48 | 23.94 | 128.54 | 14.97 | 91.15 | 41.54 | 379.11 | 150.41 | 1366.67 | 9.50 | 43.72 | pth |
| `211-_64_41-211-121-11111121` | 224 | 80.41 | 49.29 | 5.68 | 8.38 | 5.43 | 10.82 | 53.08 | 6.17 | 36.73 | 18.62 | 154.45 | 87.25 | 714.54 | 7.02 | 23.36 | pth |
| `011-_128_1-111121111211111-1111111121111-1` | 224 | 81.27 | 51.96 | 15.44 | 12.17 | 7.78 | 18.77 | 95.30 | 6.28 | 34.92 | 22.48 | 194.34 | 112.36 | 944.00 | 8.20 | 39.98 | pth |
| `23311-a02c12_64_211-2111-211111-211` | 224 | 81.66 | 66.87 | 11.01 | 12.60 | 10.23 | 19.38 | 102.58 | 11.84 | 79.11 | 41.04 | 360.70 | 196.96 | 1713.64 | 10.42 | 39.89 | pth |
| `211-_64_41-121-11121-111112111` | 224 | 80.83 | 70.45 | 6.58 | 8.81 | 6.74 | 12.96 | 59.13 | 7.74 | 48.90 | 23.48 | 190.71 | 116.54 | 962.28 | 9.00 | 24.70 | pth |
| `02031-a02_64_111-2111-21111111111111111111111-211` | 224 | 81.38 | 72.44 | 13.13 | 14.58 | 15.52 | 19.53 | 94.60 | 8.20 | 33.68 | 20.03 | 150.70 | 63.20 | 580.97 | 10.13 | 38.02 | pth |
| `02031-a02_64_1121-11111111111111111111111111-211111121111-1` | 224 | 82.04 | 76.66 | 25.12 | 29.83 | 17.32 | 32.57 | 172.00 | 11.33 | 52.20 | 34.95 | 284.90 | 129.17 | 1120.81 | 13.90 | 73.14 | pth |
| `011-_128_2-111111111111111111-121111111111121111111-2` | 224 | 81.52 | 77.81 | 17.66 | 14.53 | 11.25 | 22.59 | 114.93 | 7.67 | 42.09 | 27.95 | 223.36 | 128.39 | 1080.82 | 11.35 | 48.58 | pth |
| `02031-a02_64_1-111111211111-111121121111-2` | 224 | 81.22 | 79.39 | 14.08 | 12.82 | 12.00 | 19.53 | 92.20 | 8.02 | 36.02 | 21.53 | 166.44 | 76.58 | 660.04 | 10.36 | 37.02 | pth |
| `02031-a02_64_1121-111111111111111111111111111-21111111211111-1` | 224 | 82.42 | 87.34 | 27.51 | 31.34 | 21.72 | 35.34 | 185.75 | 11.97 | 56.46 | 38.01 | 308.30 | 140.38 | 1219.08 | 15.25 | 78.39 | pth |
| `02031-a02_64_1121-111111111111111111111111111-21111112111111-1` | 224 | 82.38 | 93.44 | 28.70 | 31.64 | 18.79 | 36.44 | 191.20 | 12.21 | 58.64 | 39.37 | 319.61 | 146.69 | 1269.18 | 15.90 | 80.45 | pth |
| `211-_64_411-2111-21111111111111111111111-211` | 224 | 82.08 | 103.00 | 18.16 | 15.91 | 10.50 | 25.20 | 131.92 | 11.61 | 75.93 | 50.39 | 418.51 | 298.10 | 2543.09 | 14.29 | 52.45 | pth |
| `23311-a02c12_64_211-2111-21111111111111111111111-211` | 224 | 82.65 | 130.45 | 23.46 | 19.42 | 18.00 | 36.97 | 191.89 | 20.87 | 136.62 | 77.75 | 664.57 | 396.85 | 3414.92 | 18.66 | 68.88 | pth |

## Reference

The compressed package for each model contains the model and inference sample code. If you have any questions, please open an issue and we will respond as promptly as we can.
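
Since the downloads are tar archives, they can be unpacked with Python's standard `tarfile` module before running the bundled inference code. A minimal sketch (`extract_package` is an illustrative helper, not part of the repo, and the archive layout may differ per model):

```python
import tarfile
from pathlib import Path

def extract_package(archive, dest="models"):
    """Unpack a downloaded model package into dest and list its top-level entries."""
    dest = Path(dest)
    dest.mkdir(parents=True, exist_ok=True)
    with tarfile.open(archive) as tar:
        tar.extractall(dest)  # archive is expected to hold weights + sample code
    return sorted(p.name for p in dest.iterdir())
```

After extraction, look inside the package for the model file (e.g. a `.pth` checkpoint for the DNet Part 2 models) and the accompanying inference script.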