This repo contains tutorials that implement various ML algorithms, either from scratch or with pre-built libraries. It is a living repo and I will keep adding tutorials as I learn more. I hope it helps anyone who wants to understand these algorithms conceptually as well as learn how to implement them in Python.
- 01_Gradient_Boosting_Scratch.ipynb: Implementation of a basic gradient boosting algorithm with an intuitive example. Learn about decision trees and the intuition behind gradient boosting trees.
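This is not the notebook's exact code, but the core idea can be sketched in a few lines: with squared-error loss, each new shallow tree is fit to the residuals (the negative gradient) of the current prediction.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gradient_boost_fit(X, y, n_rounds=50, lr=0.1, max_depth=2):
    """Squared-error gradient boosting: each tree fits the current residuals."""
    pred = np.full(len(y), y.mean())    # start from the mean prediction
    trees = []
    for _ in range(n_rounds):
        residuals = y - pred            # negative gradient of squared error
        tree = DecisionTreeRegressor(max_depth=max_depth).fit(X, residuals)
        pred += lr * tree.predict(X)    # take a small step toward the targets
        trees.append(tree)
    return y.mean(), trees

def gradient_boost_predict(base, trees, X, lr=0.1):
    pred = np.full(len(X), base)
    for tree in trees:
        pred += lr * tree.predict(X)
    return pred
```

The learning rate shrinks each tree's contribution, so many weak trees combine into a strong model.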
- 02_Collaborative_Filtering.ipynb: Building a MovieLens recommendation system with collaborative filtering using PyTorch and fast.ai.
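A minimal PyTorch sketch of the underlying idea (matrix factorization with learned embeddings, not the notebook's fast.ai code): a rating is predicted as the dot product of a user embedding and a movie embedding.

```python
import torch
import torch.nn as nn

class MatrixFactorization(nn.Module):
    """Predict a rating as the dot product of user and movie embeddings."""
    def __init__(self, n_users, n_movies, n_factors=20):
        super().__init__()
        self.user_emb = nn.Embedding(n_users, n_factors)
        self.movie_emb = nn.Embedding(n_movies, n_factors)

    def forward(self, users, movies):
        return (self.user_emb(users) * self.movie_emb(movies)).sum(dim=1)

def fit(model, users, movies, ratings, epochs=200, lr=0.05):
    """Full-batch training on (user, movie, rating) triples."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(users, movies), ratings)
        loss.backward()
        opt.step()
    return loss.item()
```

fast.ai adds conveniences on top of this (bias terms, sigmoid range scaling), but the embedding dot product is the core.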
- 03_Random_Forest_Interpretetion.ipynb: How to interpret a seemingly black-box algorithm: feature importance, tree interpreter, and confidence intervals for predictions.
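Two of those interpretation tools are available directly from sklearn's forest object; a small sketch on synthetic data (not the notebook's dataset):

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

X, y = make_regression(n_samples=300, n_features=5, n_informative=2,
                       random_state=0)
rf = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# Impurity-based feature importance: one value per feature, summing to 1.
importances = rf.feature_importances_

# Spread of per-tree predictions as a rough confidence interval:
# a wide std means the trees disagree about that row.
per_tree = np.stack([t.predict(X[:5]) for t in rf.estimators_])
pred_mean, pred_std = per_tree.mean(axis=0), per_tree.std(axis=0)
```

Per-row contribution breakdowns (the "tree interpreter" part) need the separate `treeinterpreter` package.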
- 04_Neural_Net_Scratch.ipynb: Using MNIST data, this notebook implements a neural net from scratch using PyTorch.
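A stripped-down sketch of the "from scratch" approach (illustrative shapes, not the notebook's exact code): raw tensors with `requires_grad`, a hand-written forward pass, and a plain SGD update, with autograd supplying the gradients.

```python
import torch

def init_params(n_in, n_hidden, n_out):
    """Raw weight/bias tensors -- no nn.Module, no nn.Linear."""
    w1 = (torch.randn(n_in, n_hidden) * 0.1).requires_grad_()
    b1 = torch.zeros(n_hidden, requires_grad=True)
    w2 = (torch.randn(n_hidden, n_out) * 0.1).requires_grad_()
    b2 = torch.zeros(n_out, requires_grad=True)
    return [w1, b1, w2, b2]

def forward(params, x):
    w1, b1, w2, b2 = params
    return torch.relu(x @ w1 + b1) @ w2 + b2   # logits

def train_step(params, x, y, lr=0.1):
    loss = torch.nn.functional.cross_entropy(forward(params, x), y)
    loss.backward()
    with torch.no_grad():
        for p in params:
            p -= lr * p.grad        # plain SGD update
            p.grad.zero_()
    return loss.item()
```

For MNIST, `n_in` would be 784 (flattened 28x28 images) and `n_out` 10.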
- 05_Loss_Functions.ipynb: Exploring regression and classification loss functions.
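Three of the most common losses can be written directly in NumPy (a sketch, not necessarily the notebook's definitions):

```python
import numpy as np

def mse(y_true, y_pred):
    """Mean squared error: penalizes large errors quadratically."""
    return np.mean((y_true - y_pred) ** 2)

def mae(y_true, y_pred):
    """Mean absolute error: robust to outliers relative to MSE."""
    return np.mean(np.abs(y_true - y_pred))

def binary_cross_entropy(y_true, p, eps=1e-12):
    """Log loss for binary classification; p are predicted probabilities."""
    p = np.clip(p, eps, 1 - eps)   # avoid log(0)
    return -np.mean(y_true * np.log(p) + (1 - y_true) * np.log(1 - p))
```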
- 06_NLP_Fastai.ipynb: Naive Bayes, logistic regression, and bag of words on IMDB data.
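The bag-of-words plus Naive Bayes combination fits in a few lines of sklearn; a toy stand-in for the IMDB data (not the notebook's code):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Tiny stand-in corpus; 1 = positive review, 0 = negative.
docs = ["great movie, loved it", "terrible plot, awful acting",
        "loved the acting", "awful, terrible movie"]
labels = [1, 0, 1, 0]

# CountVectorizer builds the bag-of-words matrix; MultinomialNB
# classifies from per-class word counts with Laplace smoothing.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(docs, labels)
print(model.predict(["loved this great acting"]))   # prints [1]
```

Swapping `MultinomialNB` for `LogisticRegression` gives the other baseline the notebook covers.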
- 07_Eigenfaces.ipynb: Preprocessing face images and running PCA on the data to reconstruct faces and see similarities among different faces.
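The reconstruction step can be sketched with sklearn's PCA on toy data standing in for flattened face images (the real notebook uses actual face photos):

```python
import numpy as np
from sklearn.decomposition import PCA

# Toy stand-in: 100 flattened 8x8 "images" generated from 5 latent
# factors, so a 10-component PCA can reconstruct them almost exactly.
rng = np.random.default_rng(0)
faces = rng.normal(size=(100, 5)) @ rng.normal(size=(5, 64))

pca = PCA(n_components=10).fit(faces)
# Rows of pca.components_ are the "eigenfaces" for real image data.
codes = pca.transform(faces)                  # each face as 10 coefficients
reconstructed = pca.inverse_transform(codes)  # back to pixel space
error = np.mean((faces - reconstructed) ** 2)
```

Comparing faces in the low-dimensional `codes` space is what makes the similarity analysis cheap.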
- 08_kmeans_scratch.ipynb: Implementation and visualization of the k-means algorithm from scratch.
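The core loop (Lloyd's algorithm) is short enough to sketch here; this is a minimal version, not necessarily the notebook's exact implementation:

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    """Lloyd's algorithm: assign points to the nearest center, recompute means."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]  # random init
    for _ in range(n_iter):
        # Squared distance from every point to every center.
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = dists.argmin(axis=1)
        new_centers = np.array([
            X[labels == j].mean(axis=0) if (labels == j).any() else centers[j]
            for j in range(k)
        ])
        if np.allclose(new_centers, centers):   # converged
            break
        centers = new_centers
    return centers, labels
```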
- 09_Quantile_Regression.ipynb: Implementation of quantile regression using sklearn.
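One sklearn route to quantile regression (a sketch, which may differ from the notebook's choice of estimator) is `GradientBoostingRegressor` with the quantile loss, where `alpha` selects the quantile:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, (500, 1))
y = X[:, 0] + rng.normal(0, 1, 500)     # noisy linear signal

# One model per quantile; alpha picks the quantile of the pinball loss.
models = {q: GradientBoostingRegressor(loss="quantile", alpha=q,
                                       n_estimators=100).fit(X, y)
          for q in (0.1, 0.5, 0.9)}
preds = {q: m.predict(X) for q, m in models.items()}
```

The 0.1 and 0.9 models bracket the data, giving a prediction interval rather than a single point estimate.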
- 10_Transfer_Learn_MXNet.ipynb: Tutorial on how to perform transfer learning using MXNet. This notebook is used in this blog post.
- 11_Applications_of_different_parts_of_an_ROC_curve.ipynb: Understanding the importance of different parts of an ROC curve and exploring variants of AUC for ML applications.
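One such variant, partial AUC restricted to the low-false-positive region, is exposed directly by sklearn's `roc_auc_score` via `max_fpr`; a small sketch on synthetic scores (not the notebook's data):

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
y = rng.integers(0, 2, 1000)
scores = y + rng.normal(0, 1, 1000)     # informative but noisy scores

full_auc = roc_auc_score(y, scores)
# Partial AUC over FPR in [0, 0.1], McClish-standardized so that
# 0.5 still means chance-level discrimination in that region.
partial_auc = roc_auc_score(y, scores, max_fpr=0.1)
```

Restricting to low FPR matters when false positives are expensive, e.g. fraud alerts or medical screening.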