Attention layers in TensorFlow: how to compress? #30
Unanswered
edge7 asked this question in Feedbacks-Model Compressor
Replies: 1 comment
-
Currently, the attention layer is not supported for TensorFlow models. If you don't have a PyTorch model, why don't you try the Trainer of PyNetsPresso? (Please check https://github.com/Nota-NetsPresso/PyNetsPresso or https://github.com/Nota-NetsPresso/netspresso-trainer.) Sorry for the inconvenience; we will let you know when your models become compatible!
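Not part of the original reply, but as a practical aside: one rough way to see which layers trip this limitation is to scan a Keras model for attention layers before submitting it to the compressor. The sketch below uses only standard tf.keras APIs; the helper name and the set of layer types it checks are assumptions, not an official compatibility list.

```python
# Sketch (not from the thread): list attention layers in a Keras model
# before sending it to the compressor. The layer types checked here are
# an assumption, not an official compatibility list.
import tensorflow as tf

ATTENTION_LAYER_TYPES = (
    tf.keras.layers.MultiHeadAttention,
    tf.keras.layers.Attention,
    tf.keras.layers.AdditiveAttention,
)

def find_attention_layers(model: tf.keras.Model):
    """Return the names of layers that are attention layers (including nested ones)."""
    found = []
    for layer in model.layers:
        if isinstance(layer, ATTENTION_LAYER_TYPES):
            found.append(layer.name)
        # Functional sub-models or wrapper layers expose their own `layers` list.
        if hasattr(layer, "layers"):
            for sub in layer.layers:
                if isinstance(sub, ATTENTION_LAYER_TYPES):
                    found.append(f"{layer.name}/{sub.name}")
    return found

# Example usage:
# offending = find_attention_layers(my_model)
# if offending:
#     print("Attention layers present (currently unsupported for compression):", offending)
```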
-
Hi,
I am trying to compress a TensorFlow model which contains attention layers. This is the implementation:

When trying to compress it, I get an error. Is it possible to get more details about what is going wrong?
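The original snippet was not preserved in this thread. Purely for illustration, a minimal Keras model containing an attention layer, built around tf.keras.layers.MultiHeadAttention, might look like the sketch below; it is an assumed stand-in, not the poster's actual implementation.

```python
# Illustrative sketch only: a small Keras model containing an attention layer,
# of the kind the question refers to. This is NOT the original implementation
# from the thread, which was not preserved.
import tensorflow as tf

def build_toy_attention_model(seq_len=32, d_model=64, num_heads=4, num_classes=10):
    inputs = tf.keras.Input(shape=(seq_len, d_model))
    # Self-attention: queries, keys and values all come from `inputs`.
    attn_out = tf.keras.layers.MultiHeadAttention(
        num_heads=num_heads, key_dim=d_model // num_heads
    )(inputs, inputs)
    x = tf.keras.layers.Add()([inputs, attn_out])   # residual connection
    x = tf.keras.layers.LayerNormalization()(x)
    x = tf.keras.layers.GlobalAveragePooling1D()(x)
    outputs = tf.keras.layers.Dense(num_classes, activation="softmax")(x)
    return tf.keras.Model(inputs, outputs)

model = build_toy_attention_model()
model.summary()  # compressing a model like this is the step the question reports as failing
```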