Stochastic Gradient Descent

For curve fitting using linear regression, there exists a minor variant of the Batch Gradient Descent algorithm, called Stochastic Gradient Descent. In Batch Gradient Descent, the parameter vector $\theta$ is updated as $\theta_j := \theta_j + \alpha\sum_{i=1}^{m}\left(y^{(i)} - h_\theta(x^{(i)})\right)x_j^{(i)}$ (loop over all elements of the training set in one iteration). For Stochastic Gradient Descent, the vector gets updated as, at each iteration the…
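The post carries the full derivation; as a rough sketch of the contrast, here is a minimal numpy implementation of both update rules, assuming the usual squared-error cost and a fixed learning rate alpha (function and variable names are illustrative, not from the post):

```python
import numpy as np

def batch_gradient_descent(X, y, alpha=0.005, n_iters=1000):
    """Batch GD: each update sums the error over the whole training set."""
    theta = np.zeros(X.shape[1])
    for _ in range(n_iters):
        error = y - X @ theta              # residuals for all m examples
        theta += alpha * (X.T @ error)     # one update per pass over the data
    return theta

def stochastic_gradient_descent(X, y, alpha=0.05, n_epochs=10):
    """SGD: the parameters are updated after every single training example."""
    theta = np.zeros(X.shape[1])
    for _ in range(n_epochs):
        for i in range(X.shape[0]):
            error = y[i] - X[i] @ theta    # residual for one example
            theta += alpha * error * X[i]  # cheap, noisy per-example update
    return theta

# Illustrative data: y ~ 2 + 3x plus noise; first column of ones is the intercept.
X = np.column_stack([np.ones(100), np.linspace(0, 1, 100)])
y = 2.0 + 3.0 * X[:, 1] + 0.1 * np.random.randn(100)
print(batch_gradient_descent(X, y))       # both should approach [2.0, 3.0]
print(stochastic_gradient_descent(X, y))
```

The trade-off the two rules embody: Batch GD makes one stable update per pass over all m examples, while SGD makes m noisy but cheap updates per pass, which is what makes it attractive for large training sets.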


Closed form solution for linear regression

In the previous post on Batch Gradient Descent and Stochastic Gradient Descent, we looked at two iterative methods for finding the parameter vector $\theta$ which minimizes the square of the error between the predicted value $h_\theta(x)$ and the actual output $y$ for all $m$ values in the training set. A closed form solution for finding the parameter vector $\theta$ is possible, and in this post…
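The closed form in question is presumably the normal equation, $\theta = (X^T X)^{-1} X^T y$; a minimal numpy sketch, assuming a design matrix X whose first column of ones supplies the intercept:

```python
import numpy as np

def normal_equation(X, y):
    """Least-squares fit in one step: theta = (X^T X)^{-1} X^T y.
    np.linalg.solve is used rather than forming the explicit inverse."""
    return np.linalg.solve(X.T @ X, X.T @ y)

# Illustrative data: y ~ 2 + 3x plus noise; first column of ones is the intercept.
X = np.column_stack([np.ones(100), np.linspace(0, 1, 100)])
y = 2.0 + 3.0 * X[:, 1] + 0.1 * np.random.randn(100)
print(normal_equation(X, y))   # ~ [2.0, 3.0], no iterations or learning rate
```

Unlike the two iterative methods, there is no learning rate to tune and no convergence to monitor, at the cost of solving a linear system in the number of features.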


Equal Gain Combining (EGC)

This is the second post in the series discussing receiver diversity in a wireless link. Receiver diversity is a form of space diversity, where there are multiple antennas at the receiver. The presence of receiver diversity poses an interesting problem: how do we effectively use the information from all the antennas to demodulate the…
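As a rough illustration of how EGC answers that question, here is a minimal numpy sketch for BPSK over flat Rayleigh fading, assuming perfect channel-phase estimates at the receiver (the antenna count, Eb/N0 value, and variable names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
N, n_bits = 2, 10_000                        # receive antennas, BPSK symbols
x = 2 * rng.integers(0, 2, n_bits) - 1       # BPSK symbols: +1 / -1

# Independent flat Rayleigh channel on each antenna, unit average power.
h = (rng.standard_normal((N, n_bits))
     + 1j * rng.standard_normal((N, n_bits))) / np.sqrt(2)

EbN0_dB = 10
noise_std = 10 ** (-EbN0_dB / 20)            # noise power 10^(-Eb/N0 / 10)
n = noise_std * (rng.standard_normal((N, n_bits))
                 + 1j * rng.standard_normal((N, n_bits))) / np.sqrt(2)
y = h * x + n                                # received signal on each antenna

# Equal Gain Combining: co-phase each branch (unit gain), then sum.
y_egc = np.sum(y * np.exp(-1j * np.angle(h)), axis=0)
x_hat = np.where(np.real(y_egc) > 0, 1, -1)  # hard decision on the combined signal
print("BER:", np.mean(x_hat != x))
```

Each branch is rotated by the negative of its channel phase so the branches add constructively; unlike maximal ratio combining, every branch gets equal weight regardless of its gain, which is what gives the scheme its name.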


Happy New Year 2010

Wishing all the readers of dsplog.com a great year 2010! It's been a mixed year for dsplog. Some key milestones: a) Crossing 1000 subscribers with 1100+ comments in March 2009, b) Crossing 100 posts with 2200 subscribers and 2600+ comments in October 2009, c) As I write this, we have 102 posts with 2603…
