
Deep Learning

Lesson 4

Introduction to Neural Networks

Artificial neural networks are networks of artificial neurons (perceptrons) that loosely mimic the neural networks of the human brain to produce a meaningful output. A neural network consists of an input layer, one or more hidden layers, and an output layer.

The neural network relies on a training dataset to learn to predict an outcome from the input and hidden layers. Each feature or input entering a layer has a weight attached to it; each input value is multiplied by its weight, the products are added together along with a bias term, and the result is passed through an activation function.
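
As a minimal sketch of that computation (the input values, weights, bias, and sigmoid activation here are illustrative choices, not values from the lesson), a single neuron can be written in plain Python with NumPy:

```python
import numpy as np

def sigmoid(z):
    # Squashes the weighted sum into the range (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative values only: three input features feeding one neuron.
x = np.array([0.5, -1.2, 3.0])   # input values (features)
w = np.array([0.4, 0.1, -0.6])   # weights, one per input
b = 0.2                          # bias term

z = np.dot(w, x) + b             # multiply each input by its weight, sum, add the bias
a = sigmoid(z)                   # pass the weighted sum through the activation function
print(z, a)
```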

Forward Propagation & Backpropagation in Neural Networks

Forward propagation refers to the computation that moves from the input layer toward the output layer, while backpropagation refers to the computation that moves from the output layer back toward the input layer.
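
To make the direction of that computation concrete, the following sketch (hypothetical layer sizes and random weights, continuing the NumPy style above) runs a forward pass through one hidden layer and one output layer; backpropagation would traverse the same layers in reverse:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative layer sizes: 3 inputs -> 4 hidden units -> 1 output.
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)

x = np.array([0.5, -1.2, 3.0])

# Forward propagation: input layer -> hidden layer -> output layer.
h = sigmoid(W1 @ x + b1)       # hidden-layer activations
y_hat = sigmoid(W2 @ h + b2)   # predicted output
print(y_hat)
```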

Why do we need Backpropagation in Neural Networks?

In a neural network, the predicted output depends upon:

  • Input values
  • Activation function
  • Beta coefficients of the inputs (the weights associated with the inputs)
  • Bias terms

Backpropagation in neural networks is the method used to reduce the loss function and hence improve prediction accuracy. In the backward pass, we compute the gradient of the loss function with respect to the weights (and biases) and use those gradients to adjust the weights so that the loss is minimized.
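
As a hedged sketch of that idea (the squared-error loss, sigmoid activations, and learning rate below are illustrative assumptions, not anything specified in the lesson), the snippet extends the forward pass above with one backpropagation step and a gradient-descent weight update:

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# Same illustrative network as before: 3 -> 4 -> 1 with sigmoid activations.
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)

x = np.array([0.5, -1.2, 3.0])
y = np.array([1.0])              # target output (made up for the example)
lr = 0.1                         # learning rate

# Forward pass.
h = sigmoid(W1 @ x + b1)
y_hat = sigmoid(W2 @ h + b2)
loss = 0.5 * np.sum((y_hat - y) ** 2)        # squared-error loss

# Backward pass: gradients of the loss with respect to the weights and biases.
delta2 = (y_hat - y) * y_hat * (1 - y_hat)   # error at the output layer
grad_W2 = np.outer(delta2, h)
grad_b2 = delta2
delta1 = (W2.T @ delta2) * h * (1 - h)       # error propagated back to the hidden layer
grad_W1 = np.outer(delta1, x)
grad_b1 = delta1

# Gradient-descent update: adjust the weights in the direction that reduces the loss.
W2 -= lr * grad_W2; b2 -= lr * grad_b2
W1 -= lr * grad_W1; b1 -= lr * grad_b1
print(loss)
```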