Understanding the performance of different neural network architectures on the MNIST handwritten digits dataset, implemented in TensorFlow.
Updated Dec 13, 2021 · Jupyter Notebook
A hands-on guide to automatic differentiation in TensorFlow using tf.GradientTape. Covers computing gradients for variables vs. constants, using tape.watch(), visualizing derivatives, and handling multiple parameters.
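The topics listed above can be sketched in a few lines. Below is a minimal example, assuming TensorFlow 2.x: a gradient taken with respect to a `tf.Variable` (watched automatically), a `tf.constant` (which must be registered via `tape.watch()`), and multiple parameters at once.

```python
import tensorflow as tf

# 1) Variables are watched automatically.
#    Gradient of y = x^2 at x = 3 is 2x = 6.
x = tf.Variable(3.0)
with tf.GradientTape() as tape:
    y = x ** 2
dy_dx = tape.gradient(y, x)

# 2) Constants are NOT watched by default; call tape.watch() explicitly.
#    Gradient of z = c^3 at c = 2 is 3c^2 = 12.
c = tf.constant(2.0)
with tf.GradientTape() as tape:
    tape.watch(c)
    z = c ** 3
dz_dc = tape.gradient(z, c)

# 3) Multiple parameters: pass a list of sources and the gradients
#    come back in the same order.
w = tf.Variable(5.0)
b = tf.Variable(1.0)
with tf.GradientTape() as tape:
    loss = w * 4.0 + b
dw, db = tape.gradient(loss, [w, b])
```

Each call to `tape.gradient()` releases the tape's resources; pass `persistent=True` to `tf.GradientTape()` if you need several gradient calls from one tape.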