
Chapter 6: layer_2_delta calculation #37

Open · Vaticinator opened this issue Mar 16, 2020 · 2 comments

Comments

@Vaticinator

Section "Backpropagation in Code":
There is:
layer_2_delta = (walk_vs_stop[i:i+1] - layer_2)
I belive it should be:
layer_2_delta = (layer_2 - walk_vs_stop[i:i+1])

@RomanGhost

That is possible too, but then the weight updates need -= instead of +=.
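
To see why the two signs have to flip together: assuming a squared-error loss L = 0.5 * (layer_2 - target)**2, its gradient with respect to the prediction is (layer_2 - target), so gradient descent subtracts it; the book simply folds that minus sign into the delta and adds. A minimal scalar sketch (the numbers here are made up):

  pred, target, inp, alpha = 0.9, 1.0, 0.5, 0.1

  delta_book = target - pred               # the book's convention
  delta_grad = pred - target               # gradient-of-the-loss convention

  update_book = +alpha * inp * delta_book  # paired with +=
  update_grad = -alpha * inp * delta_grad  # paired with -=
  assert update_book == update_grad        # the same number either way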

@LittleTownStudio commented Oct 3, 2020

  layer_2_delta = (walk_vs_stop[i:i+1] - layer_2)
  layer_1_delta = layer_2_delta.dot(weights_1_2.T) * relu2deriv(layer_1)

  weights_1_2 += alpha * layer_1.T.dot(layer_2_delta)
  weights_0_1 += alpha * layer_0.T.dot(layer_1_delta)

  # ... is the same as ...

  layer_2_delta = (layer_2 - walk_vs_stop[i:i+1])
  layer_1_delta = layer_2_delta.dot(weights_1_2.T) * relu2deriv(layer_1)

  weights_1_2 -= alpha * layer_1.T.dot(layer_2_delta)
  weights_0_1 -= alpha * layer_0.T.dot(layer_1_delta)


The two versions produce the same result, but I think the second one is the logically correct form: the delta is prediction minus target, paired with gradient descent's subtraction.
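
For completeness, here is a self-contained script that trains the network both ways and checks that the final weights come out identical. This is a minimal sketch in the style of the chapter; the toy streetlights data, network size, and iteration count are assumptions, not necessarily the book's exact values:

  import numpy as np

  def relu(x):
      return (x > 0) * x

  def relu2deriv(x):
      return x > 0

  # Toy data in the style of the chapter (assumed values).
  streetlights = np.array([[1, 0, 1],
                           [0, 1, 1],
                           [0, 0, 1],
                           [1, 1, 1]])
  walk_vs_stop = np.array([[1, 1, 0, 0]]).T

  alpha, hidden_size = 0.2, 4

  def train(sign):
      # sign = +1: delta = target - prediction, updates use +=
      # sign = -1: delta = prediction - target, updates use -=
      np.random.seed(1)
      weights_0_1 = 2 * np.random.random((3, hidden_size)) - 1
      weights_1_2 = 2 * np.random.random((hidden_size, 1)) - 1
      for _ in range(60):
          for i in range(len(streetlights)):
              layer_0 = streetlights[i:i+1]
              layer_1 = relu(layer_0.dot(weights_0_1))
              layer_2 = layer_1.dot(weights_1_2)
              layer_2_delta = sign * (walk_vs_stop[i:i+1] - layer_2)
              layer_1_delta = layer_2_delta.dot(weights_1_2.T) * relu2deriv(layer_1)
              weights_1_2 += sign * alpha * layer_1.T.dot(layer_2_delta)
              weights_0_1 += sign * alpha * layer_0.T.dot(layer_1_delta)
      return weights_0_1, weights_1_2

  w01_a, w12_a = train(+1)  # the book's version
  w01_b, w12_b = train(-1)  # the sign-flipped version
  assert np.allclose(w01_a, w01_b) and np.allclose(w12_a, w12_b)
  print("both conventions give identical weights")

Because the sign flip appears in both the delta and the update, the two factors of -1 cancel and every weight update is numerically identical.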
