This project implements Linear Regression from scratch in Python, without relying on scikit-learn's ready-made models.
It explores different optimization methods, compares their performance, and visualizes convergence and predictions.
Open in Google Colab
- Closed-form solution (Normal Equation)
- Gradient Descent (GD)
- Stochastic Gradient Descent (SGD)
- Momentum-based Gradient Descent
- Adam Optimizer
- Custom `StandardScaler` for data normalization
- Metrics: MSE, RMSE, MAE, R² score
- Visualization utilities:
- Loss curve comparison
- Prediction vs. actual plots
- Residual analysis
- Convergence rate (log-scale)
- Step-by-step gradient descent visualization
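As a quick illustration of the first two methods above, here is a minimal sketch comparing the closed-form Normal Equation with batch gradient descent on MSE loss. The function names and synthetic data are illustrative, not taken from the notebook:

```python
import numpy as np

def fit_normal_equation(X, y):
    """Closed-form solution: w = (X^T X)^+ X^T y, with a bias column."""
    Xb = np.c_[np.ones(len(X)), X]            # prepend bias column
    return np.linalg.pinv(Xb.T @ Xb) @ Xb.T @ y

def fit_gradient_descent(X, y, lr=0.1, n_iters=1000):
    """Batch gradient descent minimizing mean squared error."""
    Xb = np.c_[np.ones(len(X)), X]
    w = np.zeros(Xb.shape[1])
    for _ in range(n_iters):
        grad = (2 / len(y)) * Xb.T @ (Xb @ w - y)  # gradient of MSE w.r.t. w
        w -= lr * grad
    return w

# Synthetic data: y = 3x + 2 + small noise
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = 3 * X[:, 0] + 2 + rng.normal(0, 0.05, size=200)

w_closed = fit_normal_equation(X, y)
w_gd = fit_gradient_descent(X, y)
print(w_closed, w_gd)  # both should land near [intercept=2, slope=3]
```

With a well-conditioned problem like this, both methods recover nearly identical weights; the notebook's loss-curve plots show how quickly each iterative method closes that gap.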
Clone the repo and open the notebook:
```bash
git clone https://github.com/yourusername/linear-regression-scratch.git
cd linear-regression-scratch
jupyter notebook LinearRegression.ipynb
```

Or run directly on Colab: Colab Link