Neural Network Optimization Algorithms 🌐

Overview 📈

This repository implements a 3-layer neural network and uses it to evaluate three optimization algorithms: Gradient Descent, Momentum, and Adam. The objective is to compare their training accuracy and convergence behavior on a given dataset.
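
For quick reference, here is a minimal NumPy sketch of the three parameter-update rules being compared. This is an illustration of the standard formulations, not code taken from this repository; the function names, hyperparameter defaults, and the toy quadratic at the end are assumptions for demonstration:

```python
import numpy as np

def update_with_gd(w, grad, lr):
    # Plain gradient descent: step directly along the negative gradient.
    return w - lr * grad

def update_with_momentum(w, grad, v, lr, beta=0.9):
    # Momentum: an exponentially weighted average of past gradients (v)
    # smooths out oscillations in the descent direction.
    v = beta * v + (1 - beta) * grad
    return w - lr * v, v

def update_with_adam(w, grad, v, s, t, lr, beta1=0.9, beta2=0.999, eps=1e-8):
    # Adam: combines momentum on the gradient (v) with a per-parameter
    # scale from the squared gradient (s); both moments are bias-corrected
    # so early steps are not dragged toward zero by the zero initialization.
    v = beta1 * v + (1 - beta1) * grad
    s = beta2 * s + (1 - beta2) * grad ** 2
    v_hat = v / (1 - beta1 ** t)  # bias-corrected first moment
    s_hat = s / (1 - beta2 ** t)  # bias-corrected second moment
    return w - lr * v_hat / (np.sqrt(s_hat) + eps), v, s

# Toy check on f(w) = w**2 (gradient 2w): each rule should drive w toward 0.
w_gd = np.array([5.0])
w_mom, v_mom = np.array([5.0]), np.zeros(1)
w_adam, v_adam, s_adam = np.array([5.0]), np.zeros(1), np.zeros(1)
for t in range(1, 201):
    w_gd = update_with_gd(w_gd, 2 * w_gd, lr=0.1)
    w_mom, v_mom = update_with_momentum(w_mom, 2 * w_mom, v_mom, lr=0.1)
    w_adam, v_adam, s_adam = update_with_adam(w_adam, 2 * w_adam, v_adam, s_adam, t, lr=0.1)
print(w_gd, w_mom, w_adam)  # all three end near the minimum at 0
```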

Acknowledgments 🙏

  • Special thanks to the Deep Learning Specialization for providing the foundational knowledge and skills necessary for implementing this project.
