Closed form solution for linear regression

In the previous post on Batch Gradient Descent and Stochastic Gradient Descent, we looked at two iterative methods for finding the parameter vector $\theta$ which minimizes the square of the error between the predicted value $h_\theta(x^{(i)})$ and the actual output $y^{(i)}$ for all $m$ values in the training set. A closed form solution for finding the parameter vector $\theta$ is possible, and in this post…
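
For concreteness, the closed form referred to here is the normal equation, $\theta = (X^T X)^{-1} X^T y$. A minimal NumPy sketch (the function name and the sample data are illustrative assumptions, not taken from the post):

```python
import numpy as np

def normal_equation(X, y):
    """Closed form least squares fit: theta = (X^T X)^{-1} X^T y.

    X is the m x (n+1) design matrix (first column of ones for the
    intercept), y is the length-m vector of observed outputs.
    """
    # pinv handles a (near-)singular X^T X more gracefully than an
    # explicit matrix inverse would.
    return np.linalg.pinv(X.T @ X) @ X.T @ y

# Illustrative usage on made-up data.
X = np.column_stack([np.ones(5), np.arange(5.0)])
y = np.array([1.0, 3.1, 4.9, 7.2, 9.0])
print(normal_equation(X, y))  # -> [intercept, slope]
```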

Read More

Stochastic Gradient Descent

For curve fitting using linear regression, there exists a minor variant of the Batch Gradient Descent algorithm, called Stochastic Gradient Descent. In Batch Gradient Descent, the parameter vector $\theta$ is updated as $\theta_j := \theta_j + \alpha \sum_{i=1}^{m} \left( y^{(i)} - h_\theta(x^{(i)}) \right) x_j^{(i)}$ (loop over all elements of the training set in one iteration). For Stochastic Gradient Descent, the vector gets updated as, at each iteration the…
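
As a rough sketch of the difference between the two updates (plain NumPy, with function names of my own choosing rather than anything from the post): the batch step sums the error term over all $m$ training examples before moving $\theta$, while the stochastic step moves $\theta$ after looking at a single example.

```python
import numpy as np

def batch_gd_step(theta, X, y, alpha):
    # One batch iteration: accumulate (y^(i) - h_theta(x^(i))) * x^(i)
    # over all m examples, then take a single step of size alpha.
    errors = y - X @ theta
    return theta + alpha * (X.T @ errors)

def sgd_step(theta, x_i, y_i, alpha):
    # One stochastic update: same error term, but computed from a
    # single training example (x_i, y_i).
    return theta + alpha * (y_i - x_i @ theta) * x_i
```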

Read More

Batch Gradient Descent

I happened to stumble on Prof. Andrew Ng’s Machine Learning classes, which are available online as part of the Stanford Center for Professional Development. The first lecture in the series discusses the topic of fitting parameters for a given data set using linear regression. For understanding this concept, I chose to take data from the top…
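
For reference, the fitting criterion from those lectures is the least squares cost, which the two gradient descent posts above minimize iteratively. A small Python sketch, assuming the standard notation of the course:

```python
import numpy as np

def cost(theta, X, y):
    # Least squares cost J(theta) = 1/2 * sum_i (h_theta(x^(i)) - y^(i))^2,
    # where the hypothesis is h_theta(x) = theta^T x.
    residuals = X @ theta - y
    return 0.5 * np.dot(residuals, residuals)
```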

Read More