Thanks for your great work.
I would like to know which layers are more sensitive to quantization. I would be very grateful if you could share the (MP4) bit-width of each layer of MobileNet or ResNet.
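In case it is useful for the discussion, here is a minimal sketch of how one might probe per-layer sensitivity empirically. This is not the procedure from the ZeroQ paper, just an assumed approach: quantize one layer's weights at a time and measure the divergence of the network's outputs on a calibration batch, so the most sensitive layers stand out. The helper names `quantize_weight` and `layer_sensitivity` are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def quantize_weight(w, num_bits):
    """Symmetric uniform weight quantization (illustrative only)."""
    qmax = 2 ** (num_bits - 1) - 1
    scale = w.abs().max() / qmax
    return torch.clamp(torch.round(w / scale), -qmax - 1, qmax) * scale

@torch.no_grad()
def layer_sensitivity(model, calib_batch, num_bits=4):
    """Quantize one conv/linear layer at a time and score the KL divergence
    between full-precision and perturbed logits; larger = more sensitive."""
    model.eval()
    ref = torch.log_softmax(model(calib_batch), dim=1)
    scores = {}
    for name, module in model.named_modules():
        if isinstance(module, (nn.Conv2d, nn.Linear)):
            original = module.weight.data.clone()
            module.weight.data = quantize_weight(original, num_bits)
            out = torch.log_softmax(model(calib_batch), dim=1)
            scores[name] = F.kl_div(out, ref, log_target=True,
                                    reduction="batchmean").item()
            module.weight.data = original  # restore full precision
    return scores
```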
hustzxd changed the title from "bitwidth of each layer" to "bitwidth of each layer (discussion of MP)" on Dec 14, 2020.
I added mixed-precision experiments here: https://github.com/rhhc/ZeroQ-MP
However, in some cases there are relatively large accuracy gaps compared with the results reported in the paper.
So far, I have run several sets of experiments comparing the effect of quantization hyper-parameters (e.g., the percentile). In particular, I apply "percentile" quantization only to the activations (see the sketch below).
I would be very grateful if the authors or others could give me some suggestions for improving these results.
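For reference, this is roughly what I mean by percentile quantization of the activations. It is a minimal sketch, not the exact code in ZeroQ-MP; the helper name, the symmetric signed range, and the default percentile are assumptions for illustration.

```python
import torch

def percentile_quantize_activation(x, num_bits=8, percentile=99.9):
    """Uniformly quantize an activation tensor, clipping at a percentile of
    |x| instead of the raw max so rare outliers do not stretch the range."""
    clip_val = torch.quantile(x.abs().flatten().float(), percentile / 100.0)
    qmax = 2 ** (num_bits - 1) - 1          # e.g. 127 for 8-bit symmetric
    scale = clip_val / qmax
    # Clamp to the clipping range, round to integers, then de-quantize.
    return torch.clamp(torch.round(x / scale), -qmax - 1, qmax) * scale

# Example: quantize a batch of ReLU activations to 4 bits.
act = torch.relu(torch.randn(32, 256))
act_q = percentile_quantize_activation(act, num_bits=4, percentile=99.9)
```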