📈 Linear Regression from Scratch

This project implements Linear Regression from scratch in Python, without relying on scikit-learn's ready-made models.
It explores different optimization methods, compares their performance, and visualizes convergence and predictions.

👉 Open in Google Colab


🚀 Implemented Methods

  • Closed-form solution (Normal Equation)
  • Gradient Descent (GD)
  • Stochastic Gradient Descent (SGD)
  • Momentum-based Gradient Descent
  • Adam Optimizer
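
As a rough illustration of the first two methods above, here is a minimal NumPy sketch (not the notebook's exact code; the names `X`, `y`, `w`, and `lr` are assumptions made for this example):

```python
import numpy as np

def fit_normal_equation(X, y):
    """Closed-form solution: w = (X^T X)^(-1) X^T y, with a bias column prepended."""
    Xb = np.c_[np.ones(len(X)), X]                 # add intercept column
    return np.linalg.pinv(Xb.T @ Xb) @ Xb.T @ y

def fit_gradient_descent(X, y, lr=0.01, n_iters=1000):
    """Plain batch gradient descent on the MSE loss."""
    Xb = np.c_[np.ones(len(X)), X]
    w = np.zeros(Xb.shape[1])
    for _ in range(n_iters):
        grad = (2 / len(y)) * Xb.T @ (Xb @ w - y)  # gradient of MSE w.r.t. w
        w -= lr * grad
    return w
```

SGD, momentum, and Adam follow the same loop but change how the update step is computed: mini-batch gradients, an added velocity term, and adaptive per-parameter learning rates, respectively.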

📊 Features

  • Custom StandardScaler for data normalization
  • Metrics: MSE, RMSE, MAE, R² score
  • Visualization utilities:
    • Loss curve comparison
    • Prediction vs. actual plots
    • Residual analysis
    • Convergence rate (log-scale)
    • Step-by-step gradient descent visualization
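
A minimal sketch of what the custom scaler and metrics could look like (assuming NumPy arrays; the notebook's actual class and function signatures may differ):

```python
import numpy as np

class StandardScaler:
    """Minimal scaler: center features to zero mean and scale to unit variance."""
    def fit(self, X):
        self.mean_ = X.mean(axis=0)
        self.std_ = X.std(axis=0)
        return self

    def transform(self, X):
        return (X - self.mean_) / self.std_

def mse(y_true, y_pred):
    return np.mean((y_true - y_pred) ** 2)

def rmse(y_true, y_pred):
    return np.sqrt(mse(y_true, y_pred))

def mae(y_true, y_pred):
    return np.mean(np.abs(y_true - y_pred))

def r2_score(y_true, y_pred):
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return 1 - ss_res / ss_tot
```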

๐Ÿ–ผ๏ธ Example Outputs

  • Loss curves comparison
  • Predictions comparison
  • Gradient descent steps


โš™๏ธ Getting Started

Clone the repo and open the notebook:

```bash
git clone https://github.com/its-nott-me/Linear-Regression.git
cd Linear-Regression
jupyter notebook LinearRegression.ipynb
```

Or run directly on Colab: 👉 Colab Link
