Using the same weights in two different positions seems wrong #23

Open
pligor opened this issue Jul 6, 2017 · 0 comments

Comments


pligor commented Jul 6, 2017

Related to this Jupyter notebook: https://github.com/ematvey/tensorflow-seq2seq-tutorials/blob/master/2-seq2seq-advanced.ipynb

At In[17] you define the weights W (and the bias b) once, and then you use them in two different places: first inside the loop function at In[20], and then again at In[23].

Aren't you doing the same thing twice? Is it that at In[23] you want to compute what was already computed inside In[20], but have trouble extracting it from the loop?

I cannot understand exactly what is going on here. Could you help clarify it?
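For context, here is a minimal sketch of the pattern I am asking about. The sizes and helper names below are mine, not the notebook's; only W, b, and the overall structure (project one previous step inside the loop function, then project all decoder outputs afterwards) mirror what I see in In[17], In[20], and In[23].

```python
import tensorflow as tf  # written against the TF 1.x graph API used in the notebook

# Illustrative sizes; the notebook derives these from its own hyperparameters.
vocab_size = 10
decoder_hidden_units = 20

# Projection parameters defined once (the counterpart of In[17]).
W = tf.Variable(tf.random_uniform([decoder_hidden_units, vocab_size], -1, 1),
                dtype=tf.float32)
b = tf.Variable(tf.zeros([vocab_size]), dtype=tf.float32)

def project(hidden_state):
    """Map a batch of decoder hidden states to vocabulary logits."""
    return tf.add(tf.matmul(hidden_state, W), b)

# Use 1 (the counterpart of In[20]): inside the raw_rnn loop_fn, only the
# previous single step's output is projected, so its argmax token can be
# embedded and fed back as the next decoder input.
def next_input_ids(previous_output):
    prediction = tf.argmax(project(previous_output), axis=1)
    return prediction  # the notebook looks these ids up in the embedding matrix

# Use 2 (the counterpart of In[23]): once raw_rnn has emitted the full
# [max_time, batch, hidden] tensor of decoder outputs, the same W and b
# project every time step at once to build the logits used by the loss.
def logits_for_loss(decoder_outputs):
    flat = tf.reshape(decoder_outputs, (-1, decoder_hidden_units))
    flat_logits = project(flat)
    max_time = tf.shape(decoder_outputs)[0]
    batch_size = tf.shape(decoder_outputs)[1]
    return tf.reshape(flat_logits, (max_time, batch_size, vocab_size))
```

Is the intent simply that the same parameters serve both roles, or should the second projection be reusing values already computed inside the loop?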
