
Neural Network Implementation

Description

A neural network implemented with a choice of activation functions, optimizers, and loss functions.

Activation Functions

  • Sigmoid
  • ReLU
  • Leaky ReLU
  • Softmax
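The repository's source isn't shown here, so the following is a minimal NumPy sketch of the four listed activations; the function names and signatures are illustrative, not necessarily those used in the project.

```python
import numpy as np

def sigmoid(x):
    # Squashes inputs into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # Zeroes out negative inputs, passes positives through
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Like ReLU, but lets a small gradient (alpha) through for x < 0
    return np.where(x > 0, x, alpha * x)

def softmax(x):
    # Normalizes scores into a probability distribution along the last axis;
    # subtracting the max first avoids overflow in exp
    shifted = x - np.max(x, axis=-1, keepdims=True)
    e = np.exp(shifted)
    return e / np.sum(e, axis=-1, keepdims=True)
```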

Optimizers

  • Gradient Descent
  • AdaGrad
  • RMSProp
  • Adam
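As a rough guide to how the listed optimizers differ, here is a sketch of one parameter-update step for each, written as standalone functions with illustrative names (the project's actual interface may differ):

```python
import numpy as np

def sgd_step(w, grad, lr=0.01):
    # Plain gradient descent: step against the gradient
    return w - lr * grad

def adagrad_step(w, grad, cache, lr=0.01, eps=1e-8):
    # AdaGrad: accumulate squared gradients, scale step down per-parameter
    cache = cache + grad ** 2
    return w - lr * grad / (np.sqrt(cache) + eps), cache

def rmsprop_step(w, grad, cache, lr=0.001, decay=0.9, eps=1e-8):
    # RMSProp: exponential moving average of squared gradients
    cache = decay * cache + (1 - decay) * grad ** 2
    return w - lr * grad / (np.sqrt(cache) + eps), cache

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    # Adam: moving averages of the gradient (m) and its square (v),
    # with bias correction for the first steps (t starts at 1)
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v
```

Each function returns the updated weights plus any running state (cache or moment estimates) that must be carried between steps.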

Loss Functions

  • Cross-Entropy Loss
  • Hinge Loss
  • Mean Squared Error (MSE)
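For reference, the three listed losses can be sketched in NumPy as below; conventions (one-hot labels for cross-entropy, ±1 labels for hinge) are assumptions, since the repository's code is not reproduced here.

```python
import numpy as np

def cross_entropy(y_true, y_pred, eps=1e-12):
    # y_true: one-hot labels; y_pred: predicted probabilities.
    # Clipping avoids log(0).
    p = np.clip(y_pred, eps, 1.0)
    return -np.mean(np.sum(y_true * np.log(p), axis=-1))

def hinge_loss(y_true, scores):
    # y_true in {-1, +1}; penalizes margins smaller than 1
    return np.mean(np.maximum(0.0, 1.0 - y_true * scores))

def mse(y_true, y_pred):
    # Mean of squared differences
    return np.mean((y_true - y_pred) ** 2)
```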

Contributors