Stochastic Gradient Descent

For curve fitting using linear regression, there exists a minor variant of the Batch Gradient Descent algorithm, called Stochastic Gradient Descent. In Batch Gradient Descent, the parameter vector $\theta$ is updated as $\theta_j := \theta_j + \alpha\sum_{i=1}^{m}\left(y^{(i)} - h_\theta(x^{(i)})\right)x_j^{(i)}$ (looping over all $m$ elements of the training set in one iteration). For Stochastic Gradient Descent, the vector gets updated as follows: at each iteration the…
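
As a minimal sketch of the difference (my own illustration, not code from the post; the function names, a design matrix X of shape (m, n), targets y of length m, and step size alpha are all assumptions):

```python
import numpy as np

def batch_gd_step(theta, X, y, alpha):
    """One Batch Gradient Descent step: accumulate the error over
    the entire training set before updating theta."""
    errors = y - X @ theta                 # residuals for all m examples
    return theta + alpha * (X.T @ errors)  # theta_j += alpha * sum_i (y_i - h(x_i)) * x_ij

def sgd_step(theta, x_i, y_i, alpha):
    """One Stochastic Gradient Descent step: update theta from a
    single training example (x_i, y_i)."""
    error = y_i - x_i @ theta
    return theta + alpha * error * x_i
```

The batch rule touches all m examples per parameter update, while the stochastic rule makes a cheap update per example, which is why it scales better to large training sets.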


Closed form solution for linear regression

In the previous posts on Batch Gradient Descent and Stochastic Gradient Descent, we looked at two iterative methods for finding the parameter vector $\theta$ which minimizes the square of the error between the predicted value $h_\theta(x)$ and the actual output $y$ for all $m$ values in the training set. A closed form solution for finding the parameter vector $\theta$ is possible, and in this post…
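
The standard closed form for this least-squares problem is the normal equation, $\theta = (X^TX)^{-1}X^Ty$. A minimal sketch (the function name and the X, y shapes from the sketch above are assumptions, not the post's own code):

```python
import numpy as np

def normal_equation(X, y):
    """Closed-form least-squares solution: theta = (X^T X)^{-1} X^T y.
    solve() is used rather than forming the inverse explicitly."""
    return np.linalg.solve(X.T @ X, X.T @ y)
```

This requires $X^TX$ to be invertible; for rank-deficient X, np.linalg.lstsq gives a minimum-norm solution instead.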


Selection Diversity

This is the first post in the series discussing receiver diversity in a wireless link. Receiver diversity is a form of space diversity, where there are multiple antennas at the receiver. The presence of receiver diversity poses an interesting problem: how do we effectively use the information from all the antennas to demodulate the…
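
As a rough illustration of the selection idea (a sketch under my own assumptions, not the post's derivation: the receiver is assumed to know each branch's channel gain and, per symbol, keeps only the branch with the largest gain):

```python
import numpy as np

def selection_combining(r, h):
    """Selection diversity: for each received symbol, keep only the
    antenna branch with the strongest channel gain, then equalize.
    r, h: arrays of shape (n_rx, n_sym) holding the received samples
    and the corresponding channel gains."""
    best = np.argmax(np.abs(h), axis=0)   # strongest branch per symbol
    cols = np.arange(r.shape[1])
    return r[best, cols] / h[best, cols]  # equalize the selected branch
```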


Support Vibha’s Dream Mile event

My friend Mr. Balaji volunteers for Vibha, a non-profit organization whose mission is to ensure that every underprivileged child attains his or her right to education, health and opportunity. Vibha, which was founded in 1991, has a volunteer network of 825 members spread across Atlanta, Austin, Bay Area, Boston, Chicago, Dallas, Houston, Jacksonville, Los Angeles,…


2nd order sigma delta modulator

In a previous post, the variance of the in-band quantization noise for a first order sigma delta modulator was derived. Taking it one step further, let us find the variance of the quantization noise filtered by a second order filter. With a first order filter, the quantization noise passes through a system with transfer function…
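
To make this concrete, a small numerical sketch (my own, not from the post): for an $L$-th order noise shaper, the noise spectrum is shaped by $|2\sin(\pi f)|^{2L}$ (frequency normalized to the sampling rate), and the in-band noise power can be integrated directly and checked against the standard high-oversampling approximation $\sigma^2\,\pi^{2L}/\left((2L+1)\,\mathrm{OSR}^{2L+1}\right)$:

```python
import numpy as np

def inband_noise_power(order, osr, sigma2=1.0, n=1 << 16):
    """Integrate the shaped quantization-noise spectrum
    |2 sin(pi f)|^(2*order) over the signal band [0, 1/(2*OSR)],
    modeling the raw quantization noise as white with total power sigma2."""
    f = np.linspace(0.0, 0.5 / osr, n)
    ntf2 = (2.0 * np.sin(np.pi * f)) ** (2 * order)
    df = f[1] - f[0]
    return 2.0 * sigma2 * ntf2.sum() * df  # factor 2 for negative frequencies

def approx(order, osr, sigma2=1.0):
    """High-OSR approximation of the same in-band noise power."""
    return sigma2 * np.pi ** (2 * order) / ((2 * order + 1) * osr ** (2 * order + 1))

print(inband_noise_power(2, 64), approx(2, 64))  # second order, OSR = 64
```

For the second order case ($L = 2$) this reduces to $\sigma^2\pi^4/(5\,\mathrm{OSR}^5)$, i.e. the in-band noise falls roughly 15 dB for every doubling of the oversampling ratio.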
