About the emb_name parameter in FGM adversarial training #5
Comments
This parameter refers to a parameter name as it appears in the model's named_parameters. For BERT it is BertModel -> embeddings -> word_embeddings, and that word_embeddings is the emb_name. Its full name is bert.embeddings.word_embeddings, but since the code does substring matching, word_embeddings alone is enough.
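The names described above can be listed directly with named_parameters(). A minimal sketch using a toy module whose names mirror BERT's (the real names come from iterating over your own model, e.g. a transformers BertModel):

```python
import torch.nn as nn

# Toy stand-in for BERT: the nested attribute names deliberately mirror
# bert.embeddings.word_embeddings so the substring match behaves the same.
class Embeddings(nn.Module):
    def __init__(self):
        super().__init__()
        self.word_embeddings = nn.Embedding(100, 16)

class ToyBert(nn.Module):
    def __init__(self):
        super().__init__()
        self.embeddings = Embeddings()

model = ToyBert()
# Print every parameter name; the substring 'word_embeddings' is enough
# to pick out the embedding matrix for FGM.
matches = [n for n, p in model.named_parameters() if 'word_embeddings' in n]
print(matches)  # ['embeddings.word_embeddings.weight']
```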
Thank you very much for your reply!
When I used FGM I ran into grad being None. From what I found, torch only retains gradients for leaf tensors? The repo code doesn't seem to handle this case either. Have you run into this problem? My model stacks an attention layer on top of BERT.
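For context on the grad-is-None question: model parameters are always leaf tensors, and their .grad is populated by backward(); it is None only before the first backward pass (or if the parameter did not participate in the loss). Intermediate activations are non-leaf tensors and need .retain_grad() if you want their gradients, but FGM only touches parameters. A quick check:

```python
import torch
import torch.nn as nn

emb = nn.Embedding(10, 4)
w = emb.weight
print(w.is_leaf)        # True: parameters are leaf tensors
print(w.grad)           # None: no backward pass has run yet

out = emb(torch.tensor([1, 2])).sum()
out.backward()
print(w.grad is None)   # False: after backward, the leaf holds a gradient
```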
def attack(self, epsilon=1., emb_name='word_embeddings'):
The emb_name parameter should be replaced with the name of the embedding parameter in your model.
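To make the attack signature above concrete, here is a minimal self-contained FGM sketch in the spirit of the widely-shared recipe; the epsilon scaling and norm handling are assumptions, not necessarily this project's exact code. It also guards against param.grad being None, which addresses the question above:

```python
import torch

class FGM:
    """Minimal FGM (Fast Gradient Method) adversarial-training sketch.

    emb_name is matched as a substring of the names returned by
    model.named_parameters(), so for BERT 'word_embeddings' suffices.
    """
    def __init__(self, model):
        self.model = model
        self.backup = {}

    def attack(self, epsilon=1., emb_name='word_embeddings'):
        for name, param in self.model.named_parameters():
            # param.grad can be None (e.g. before the first backward pass
            # or for parameters not involved in the loss), so guard it.
            if param.requires_grad and emb_name in name and param.grad is not None:
                self.backup[name] = param.data.clone()
                norm = torch.norm(param.grad)
                if norm != 0 and not torch.isnan(norm):
                    # Step along the gradient direction, scaled to epsilon.
                    param.data.add_(epsilon * param.grad / norm)

    def restore(self, emb_name='word_embeddings'):
        for name, param in self.model.named_parameters():
            if param.requires_grad and emb_name in name and name in self.backup:
                param.data = self.backup[name]
        self.backup = {}
```

Typical usage: after loss.backward(), call fgm.attack(), run a second forward/backward on the perturbed embeddings, then fgm.restore() before optimizer.step().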
May I ask where I can find this embedding parameter name? I've seen other implementations with self.xxx = nn.Embedding(), where xxx is the value of emb_name, but I couldn't find where this project's emb_name value lives. Thank you!