This is an implementation of the gradient descent algorithm that finds the values of the parameters (m, b) which minimize the cost function, i.e. the line of best fit for the data.
Visualizing the data
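A minimal sketch of how the data might be loaded and scatter-plotted with pandas and matplotlib, assuming a two-column dataset (the file name `data.csv` and the header-less layout are assumptions, not details from this repository):

```python
import pandas as pd
import matplotlib.pyplot as plt

# Assumed layout: column 0 holds the feature x, column 1 holds the label y.
data = pd.read_csv("data.csv", header=None).values

plt.scatter(data[:, 0], data[:, 1], s=10)
plt.xlabel("feature (x)")
plt.ylabel("label (y)")
plt.title("Raw data before fitting")
plt.show()
```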
A simple sum-of-squared-errors (SSE) cost function, averaged over the number of examples, for measuring the error:
```python
def CostFunction(m, b, data):
    # Mean squared error of the line y = m*x + b over all examples
    num_examples = len(data)
    sumError = 0
    for itr in range(num_examples):
        feature = data[itr, 0]         # x value of the example
        label = data[itr, 1]           # observed y value
        predLabel = (m * feature) + b  # predicted y value
        sumError += (label - predLabel) ** 2
    return sumError / num_examples     # average squared error
```
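The update step itself is not shown in this section; below is a minimal sketch of how a batch gradient descent update for this cost could look, reusing `CostFunction` above. The function names `step_gradient` and `run_gradient_descent`, the learning rate, and the iteration count are illustrative assumptions, not values from this repository.

```python
def step_gradient(m, b, data, learning_rate):
    # One batch gradient descent update for the averaged SSE cost.
    # Gradients: dJ/dm = -(2/N) * sum(x * (y - (m*x + b)))
    #            dJ/db = -(2/N) * sum(y - (m*x + b))
    N = len(data)
    m_grad, b_grad = 0.0, 0.0
    for i in range(N):
        x, y = data[i, 0], data[i, 1]
        error = y - ((m * x) + b)
        m_grad += -(2 / N) * x * error
        b_grad += -(2 / N) * error
    return m - learning_rate * m_grad, b - learning_rate * b_grad


def run_gradient_descent(data, m=0.0, b=0.0, learning_rate=0.0001, iterations=1000):
    # Repeatedly step downhill and record the cost after each update.
    # The learning rate and iteration count here are illustrative defaults.
    errors = []
    for _ in range(iterations):
        m, b = step_gradient(m, b, data, learning_rate)
        errors.append(CostFunction(m, b, data))
    return m, b, errors
```

Starting from m = 0 and b = 0, repeated updates move the line toward the fit shown in the plots below.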
- Before fitting the parameters
- After running gradient descent
The error decreasing as we move toward the minimum.
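A sketch of how such an error curve can be plotted with matplotlib, assuming the `errors` list returned by the `run_gradient_descent` sketch above:

```python
import matplotlib.pyplot as plt

m, b, errors = run_gradient_descent(data)

plt.plot(range(len(errors)), errors)
plt.xlabel("iteration")
plt.ylabel("cost (averaged SSE)")
plt.title("Error decreasing toward the minimum")
plt.show()
```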
- numpy
- pandas (to read the dataset)
- matplotlib (for plotting)
Siraj Raval - "Intro - The Math of Intelligence" (YouTube)
Andrew Trask - "A Neural Network in 13 lines of Python (Part 2 - Gradient Descent)"