Hi! I'm trying to quantize MobileNetV3 with TFLite, but the int8 model performs very poorly. I think this is because linear quantization is too simple a method and is not appropriate for every weight distribution. What else can I try? Are you planning to support a logarithmic scale for quantization in the future?
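For context, here is a minimal NumPy sketch of the standard affine (linear) quantization scheme that TFLite's int8 path is based on (q = round(x / scale) + zero_point). The function names and the synthetic weight tensor are illustrative, not TFLite internals; the example just shows how a few outlier weights stretch the scale so that most values collapse into a small fraction of the 256 available levels, which is the failure mode described above:

```python
import numpy as np

def quantize_affine(x, num_bits=8):
    # Affine (linear) quantization: q = round(x / scale) + zero_point,
    # clamped to the signed int8 range [-128, 127].
    qmin, qmax = -(2 ** (num_bits - 1)), 2 ** (num_bits - 1) - 1
    scale = (x.max() - x.min()) / (qmax - qmin)
    zero_point = int(round(qmin - x.min() / scale))
    q = np.clip(np.round(x / scale) + zero_point, qmin, qmax).astype(np.int8)
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    # Map int8 codes back to float for error measurement.
    return (q.astype(np.float32) - zero_point) * scale

rng = np.random.default_rng(0)
w = rng.standard_normal(10_000).astype(np.float32)
w[:10] *= 50.0  # a few outliers stretch the min/max range

q, scale, zp = quantize_affine(w)
# With outliers, the bulk of the weights occupy only a handful of levels.
levels_used = np.unique(q[10:]).size
mean_err = np.abs(dequantize(q, scale, zp) - w).mean()
```

Per-channel quantization (a separate scale per output channel, which TFLite already supports for weights) or clipping the range before computing the scale are the usual mitigations for exactly this distribution problem.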