
Information from concatenating start and end #24

Open
336655asd opened this issue Feb 24, 2021 · 5 comments

Comments

@336655asd

Hello, while reading your paper and code I noticed that the formula in the paper uses the information from concatenating the start and end representations, but I couldn't find that in the code; it seems to go straight through the biaffine layer and stop there. Could you point me to where this happens?

@juntaoy
Owner

juntaoy commented Feb 24, 2021

Hi, could you tell me where in the paper you saw a concatenation? I had a look and couldn't find any concatenation :)

@336655asd
Author

The relevant formula:

It's the second half of it that I mean, please take a look.
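(For context, the formula under discussion is presumably the paper's biaffine span scorer; the notation below is reconstructed and may differ slightly from the paper: $r_m(i) = \mathbf{h}_s(i)^{\top} \mathbf{U}_m \, \mathbf{h}_e(i) + \mathbf{W}_m \, (\mathbf{h}_s(i) \oplus \mathbf{h}_e(i)) + b_m$, where $\mathbf{h}_s(i)$ and $\mathbf{h}_e(i)$ are the start and end representations of span $i$, and the $\mathbf{W}_m$ term is the one applied to their concatenation.)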

@juntaoy
Owner

juntaoy commented Feb 24, 2021

I see what you mean :)

The second term W_m gets folded into U_m by the code below, which makes the computation more efficient:
if add_bias_1:
    # append a constant 1 to every vector in set 1 (the start representations)
    vector_set_1 = tf.concat(
        [vector_set_1, tf.ones([batch_size, bucket_size, 1])], axis=2)
if add_bias_2:
    # append a constant 1 to every vector in set 2 (the end representations)
    vector_set_2 = tf.concat(
        [vector_set_2, tf.ones([batch_size, bucket_size, 1])], axis=2)
To keep the explanation simple, let's assume batch_size = 1 and C = 1:
We feed in two n x d matrices. Once you add the bias with the code above, each matrix becomes n x (d+1). You then build a (d+1) x 1 x (d+1) bilinear_map instead of the d x 1 x d one you would build for the original inputs, and the extra last row and last column of that map are exactly our W_m.
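To make the equivalence concrete, here is a minimal NumPy sketch (hypothetical variable names, not code from this repository) checking that the bias trick reproduces the explicit biaffine form with a separate W_m term and bias b_m:

    import numpy as np

    # Minimal numerical sketch: appending a constant 1 to the start/end vectors
    # lets a single (d+1) x (d+1) bilinear map cover all three terms
    #   x^T U_m y + W_m [x; y] + b_m.
    d = 4
    rng = np.random.default_rng(0)
    x = rng.normal(size=d)   # stands in for one start representation
    y = rng.normal(size=d)   # stands in for one end representation

    # One enlarged map, as built after the add_bias_1 / add_bias_2 concatenation.
    U_big = rng.normal(size=(d + 1, d + 1))

    # Read off the pieces: the top-left d x d block is U_m, the extra last
    # column and last row together act as W_m on the concatenation [x; y],
    # and the bottom-right corner is the scalar bias b_m.
    U_m = U_big[:d, :d]
    W_m = np.concatenate([U_big[:d, d], U_big[d, :d]])
    b_m = U_big[d, d]

    score_bias_trick = np.append(x, 1.0) @ U_big @ np.append(y, 1.0)
    score_explicit   = x @ U_m @ y + W_m @ np.concatenate([x, y]) + b_m
    assert np.isclose(score_bias_trick, score_explicit)

The same decomposition holds per example in the batched TensorFlow version; keeping C > 1 just adds a label axis to the bilinear map.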

@336655asd
Author

Oh, I see, so that's how it works. My fault for not going through the code carefully enough. Thank you very much for the pointer!

@juntaoy
Owner

juntaoy commented Feb 24, 2021

You're welcome :)
