These programs can be run easily with either MATLAB or Octave; minimal sketches of each technique follow after the list below.
Implementing and visualizing linear regression using gradient descent as the optimizer. (Accuracy)
Implementing and visualizing logistic regression using fminunc as the optimizer. (Training set accuracy: 89.0 %)
Implementing one-vs-all logistic regression to classify handwritten digits. (Training set accuracy: 95.0 %)
Implementing a neural network with pre-trained weights on the same dataset as the previous problem, using the feedforward and backpropagation algorithms. (Visualization of the neural network)
Learning and tuning hyperparameters, such as the regularization parameter lambda, via cross-validation.
Implementing an SVM on a sample dataset, using linear and RBF (Radial Basis Function) kernels.
Implementing a basic K-means algorithm to cluster unlabeled data.
Learning and visualizing model evaluation metrics, such as F1 score, precision, and accuracy.
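
A minimal sketch of batch gradient descent for linear regression (first item); it assumes X already carries a column of ones for the intercept, and all names are illustrative rather than taken from the actual scripts.

    function [theta, J_history] = gradientDescentSketch(X, y, theta, alpha, num_iters)
      % Batch gradient descent on the squared-error cost.
      m = length(y);
      J_history = zeros(num_iters, 1);
      for iter = 1:num_iters
        theta = theta - (alpha / m) * (X' * (X * theta - y));          % simultaneous parameter update
        J_history(iter) = (1 / (2 * m)) * sum((X * theta - y) .^ 2);   % track cost per iteration
      end
    end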
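
For the logistic regression item, a sketch of handing a cost/gradient pair to fminunc; X is again assumed to include an intercept column, and the helper names are made up for illustration.

    function [theta, cost] = logisticRegressionSketch(X, y)
      % Minimize the cross-entropy cost with fminunc, supplying the gradient.
      initial_theta = zeros(size(X, 2), 1);
      options = optimset('GradObj', 'on', 'MaxIter', 400);
      [theta, cost] = fminunc(@(t) costFunctionSketch(t, X, y), initial_theta, options);
    end

    function [J, grad] = costFunctionSketch(theta, X, y)
      m = length(y);
      h = 1 ./ (1 + exp(-X * theta));                        % sigmoid hypothesis
      J = (1 / m) * (-y' * log(h) - (1 - y)' * log(1 - h));  % cross-entropy cost
      grad = (1 / m) * (X' * (h - y));                       % gradient w.r.t. theta
    end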
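
For one-vs-all classification, a sketch that trains one binary classifier per class and predicts with the highest-scoring one; it reuses costFunctionSketch from the previous sketch, and the names are illustrative.

    function all_theta = oneVsAllSketch(X, y, num_labels)
      % One row of parameters per class, each fit against a one-vs-rest labeling.
      % Reuses costFunctionSketch from the logistic regression sketch above.
      n = size(X, 2);
      all_theta = zeros(num_labels, n);
      options = optimset('GradObj', 'on', 'MaxIter', 100);
      for c = 1:num_labels
        theta_c = fminunc(@(t) costFunctionSketch(t, X, (y == c)), zeros(n, 1), options);
        all_theta(c, :) = theta_c';
      end
    end

    function p = predictOneVsAllSketch(all_theta, X)
      scores = 1 ./ (1 + exp(-X * all_theta'));  % per-class scores, m x num_labels
      [~, p] = max(scores, [], 2);               % predicted label = highest-scoring class
    end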
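
For the neural network item, a sketch of the feedforward prediction step through one hidden layer given pre-trained weights; Theta1 and Theta2 are assumed names for the weight matrices.

    function p = predictNNSketch(Theta1, Theta2, X)
      m = size(X, 1);
      a1 = [ones(m, 1) X];                 % input layer plus bias unit
      a2 = 1 ./ (1 + exp(-a1 * Theta1'));  % hidden layer activations
      a2 = [ones(m, 1) a2];                % add bias unit to the hidden layer
      a3 = 1 ./ (1 + exp(-a2 * Theta2'));  % output layer activations
      [~, p] = max(a3, [], 2);             % predicted class per example
    end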
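
For hyperparameter tuning, a sketch of picking lambda on a held-out cross-validation split; to stay self-contained it fits regularized linear regression with the closed-form normal equation rather than the repo's own training routine, and all names are illustrative.

    function best_lambda = selectLambdaSketch(Xtrain, ytrain, Xval, yval)
      lambdas = [0 0.001 0.003 0.01 0.03 0.1 0.3 1 3 10];  % candidate values
      cv_error = zeros(size(lambdas));
      n = size(Xtrain, 2);
      L = eye(n); L(1, 1) = 0;                             % leave the intercept unregularized
      for i = 1:length(lambdas)
        theta = (Xtrain' * Xtrain + lambdas(i) * L) \ (Xtrain' * ytrain);  % regularized fit
        cv_error(i) = mean((Xval * theta - yval) .^ 2) / 2;                % unregularized CV cost
      end
      [~, best] = min(cv_error);                           % lowest cross-validation error wins
      best_lambda = lambdas(best);
    end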
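
For the SVM item, a sketch of the RBF (Gaussian) kernel that scores the similarity between two examples; sigma is the kernel bandwidth and the function name is illustrative.

    function sim = gaussianKernelSketch(x1, x2, sigma)
      % Similarity decays with the squared Euclidean distance between the examples.
      sim = exp(-sum((x1(:) - x2(:)) .^ 2) / (2 * sigma ^ 2));
    end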
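
For K-means, a sketch of the two alternating steps (assign each point to its nearest centroid, then move each centroid to the mean of its points); the initialization and names are illustrative.

    function [centroids, idx] = kMeansSketch(X, K, max_iters)
      centroids = X(randperm(size(X, 1), K), :);   % initialize from K random examples
      m = size(X, 1);
      idx = zeros(m, 1);
      for iter = 1:max_iters
        % Assignment step: nearest centroid for every example.
        for i = 1:m
          dists = sum(bsxfun(@minus, centroids, X(i, :)) .^ 2, 2);
          [~, idx(i)] = min(dists);
        end
        % Update step: move each centroid to the mean of its assigned points.
        for k = 1:K
          if any(idx == k)
            centroids(k, :) = mean(X(idx == k, :), 1);
          end
        end
      end
    end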
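
For the evaluation-metrics item, a sketch of precision, recall, and F1 for binary predictions; the function and variable names are illustrative.

    function [precision, recall, F1] = metricsSketch(predictions, labels)
      tp = sum((predictions == 1) & (labels == 1));   % true positives
      fp = sum((predictions == 1) & (labels == 0));   % false positives
      fn = sum((predictions == 0) & (labels == 1));   % false negatives
      precision = tp / (tp + fp);
      recall    = tp / (tp + fn);
      F1 = 2 * precision * recall / (precision + recall);
    end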