Backward mode differentiation #2
Comments
AraneusRota changed the title from "Komplexe Ausdrücke differenzieren" ("Differentiate complex expressions") to "Backward mode differentiation" on Jun 3, 2021
AraneusRota added a commit that referenced this issue on Jun 14, 2021
AraneusRota added a commit that referenced this issue on Jun 14, 2021
AraneusRota changed the title from "Backward mode differentiation" to "Runtime Backward mode differentiation" on Jun 17, 2021
AraneusRota changed the title from "Runtime Backward mode differentiation" back to "Backward mode differentiation" on Jun 17, 2021
AraneusRota added a commit that referenced this issue on Jul 22, 2021:
…ns in function declarations, which enables the user to write imperative code instead of passing the continuation manually
AraneusRota added a commit that referenced this issue on Jul 22, 2021
AraneusRota added a commit that referenced this issue on Jul 22, 2021
AraneusRota added a commit that referenced this issue on Jul 26, 2021:
…e of the derivation map (in the '+'- and '*'-methods) ignoring the first update. This led to wrong results if the left- and right-hand sides are the same (e.g. x * x)
AraneusRota added a commit that referenced this issue on Jul 26, 2021:
… is given a unique id for the derivation map, because you can't use the object you want to construct in a function you want to pass to its constructor
AraneusRota added a commit that referenced this issue on Sep 29, 2021:
…was added to the tape multiple times when a "node" was used in multiple places in a calculation
AraneusRota added a commit that referenced this issue on Sep 29, 2021:
…ix to mark a monad which will be rewritten into flatMap chains. '_yield' marks the end of forless expressions. Other statements are also allowed but sometimes lead to problems (an infinite compiler loop when defining functions)
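The rewrite into flatMap chains mentioned in that commit message targets the same shape as Scala's standard for-comprehension desugaring. As a plain-Scala illustration (using Option rather than the project's own monad, which is an assumption on my part), a for-comprehension and its hand-desugared flatMap chain are equivalent:

```scala
// A for-comprehension over Option...
val result =
  for {
    a <- Option(2)
    b <- Option(3)
  } yield a * b

// ...is rewritten by the compiler into a flatMap chain,
// with the final generator becoming a map:
val desugared =
  Option(2).flatMap(a => Option(3).map(b => a * b))

// Both evaluate to Some(6).
```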
AraneusRota added a commit that referenced this issue on Sep 29, 2021:
…es multiplication where only one parameter is allowed (like in the intro of the paper)
AraneusRota added a commit that referenced this issue on Sep 29, 2021
AraneusRota added a commit that referenced this issue on Sep 30, 2021:
…low constructs and variables (val, var) possible)
AraneusRota added a commit that referenced this issue on Oct 3, 2021
Machine learning tasks use backpropagation to train neural networks, so backward mode differentiation must be supported.
Motivation:
Usually a neural network has many inputs but only a few outputs, and backward mode computes the gradient with respect to all inputs in a single backward pass, whereas forward mode would need one pass per input.
Multiple implementation styles are being tried out (see the checklist at the top). Dual numbers are no longer a category of their own because they will probably be used in both styles: you often need access to both the normal evaluation result and the derivative of a term, which dual numbers provide.
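As a rough sketch of the tape-based backward mode style discussed in this issue (the names Tape, Node, and grad are illustrative, not this project's API): each operation records its local derivatives on a tape, and a reverse sweep accumulates adjoints with +=, which also handles a node that is used in several places, e.g. x * x as in the commit messages above.

```scala
import scala.collection.mutable.ArrayBuffer

final class Tape {
  // Each entry stores the local derivatives of a node with respect
  // to its (up to two) parents: (weight1, parent1, weight2, parent2).
  case class Entry(w1: Double, i1: Int, w2: Double, i2: Int)
  val entries = ArrayBuffer.empty[Entry]

  def push(w1: Double, i1: Int, w2: Double, i2: Int): Int = {
    entries += Entry(w1, i1, w2, i2)
    entries.length - 1
  }

  // An input variable: a self-referencing entry with zero weights.
  def variable(value: Double): Node =
    Node(this, push(0.0, entries.length, 0.0, entries.length), value)
}

// Each Node carries the primal value alongside its tape index,
// so the normal evaluation result stays available (the role dual
// numbers play in the issue body).
case class Node(tape: Tape, idx: Int, value: Double) {
  def +(that: Node): Node =
    Node(tape, tape.push(1.0, this.idx, 1.0, that.idx), this.value + that.value)
  def *(that: Node): Node =
    // local derivatives: d(x*y)/dx = y, d(x*y)/dy = x
    Node(tape, tape.push(that.value, this.idx, this.value, that.idx),
         this.value * that.value)

  // Reverse sweep: walk the tape backwards, accumulating adjoints.
  // The += is what makes reuse of a node (e.g. x * x) come out right.
  def grad(): Array[Double] = {
    val adjoints = Array.fill(tape.entries.length)(0.0)
    adjoints(idx) = 1.0
    for (i <- tape.entries.indices.reverse) {
      val e = tape.entries(i)
      adjoints(e.i1) += e.w1 * adjoints(i)
      adjoints(e.i2) += e.w2 * adjoints(i)
    }
    adjoints
  }
}
```

For example, with `val x = tape.variable(3.0)` and `val y = x * x`, `y.grad()(x.idx)` evaluates to 6.0 in one backward pass, no matter how many inputs the expression has.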