Our work implements novel L2-norm gradient (L2Grad) and weight-distribution variance (VarianceNorm) regularizers for quantization-aware training, so that the distribution of weights becomes more compatible with post-training quantization, especially at low bit-widths. We provide a theoretical basis that directly relates L2Grad with post quan…
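As a rough illustration of the idea, the two regularizers can be sketched as extra penalty terms added to the task loss: one on the squared L2 norm of the gradient with respect to the weights, and one on the variance of the weight distribution. The function names, `lam_g`, and `lam_v` below are illustrative assumptions, not this repository's API:

```python
import numpy as np

def l2_grad(grads):
    # L2Grad (sketch): squared L2 norm of the loss gradient w.r.t. the weights.
    # Penalizing it encourages a flatter loss surface around the learned weights,
    # which tends to reduce the accuracy drop from post-training quantization.
    return float(np.sum(np.square(grads)))

def variance_norm(weights):
    # VarianceNorm (sketch): variance of the weight distribution.
    # Penalizing it narrows the weight range, so a low bit-width uniform
    # quantizer wastes fewer levels on outliers.
    return float(np.var(weights))

def regularized_loss(task_loss, weights, grads, lam_g=1e-3, lam_v=1e-3):
    # Total objective: task loss plus the two (assumed) penalty terms,
    # weighted by hypothetical coefficients lam_g and lam_v.
    return task_loss + lam_g * l2_grad(grads) + lam_v * variance_norm(weights)
```

For example, with weights `[1.0, -1.0]` and gradients `[3.0, 4.0]`, the penalties are `l2_grad = 25.0` and `variance_norm = 1.0`; in an actual training loop these terms would be computed per layer and backpropagated alongside the task loss.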