Transmit beamforming

In this post, let us discuss a closed-loop transmit diversity scheme, where the transmitter has knowledge of the channel. Since a feedback path is required from the receiver to communicate the channel it sees back to the transmitter, the scheme is called a closed-loop transmit diversity scheme. Recall that the transmit diversity using Space…
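As a rough sketch of the idea (not from the post itself): with the channel known at the transmitter, each antenna's signal can be pre-weighted by the conjugate of its channel so the copies add coherently at the receiver. The snippet below assumes a 2-antenna transmitter and maximal-ratio-transmission weights, both illustrative choices on my part.

```python
import numpy as np

# Minimal sketch of closed-loop transmit beamforming (maximal ratio
# transmission). The transmitter knows the channel h (fed back by the
# receiver) and weights each antenna so the signals add coherently.
rng = np.random.default_rng(0)
n_tx = 2                                       # illustrative antenna count
h = (rng.normal(size=n_tx) + 1j * rng.normal(size=n_tx)) / np.sqrt(2)

x = 1.0 + 0j                                   # one BPSK symbol
w = np.conj(h) / np.linalg.norm(h)             # unit-power beamforming weights
y = h @ (w * x)                                # received signal (noise omitted)
print(y)                                       # equals ||h|| * x: real, positive gain
```

With the conjugate weighting, the effective channel becomes $\|h\|$, so the receiver sees an array gain instead of a random phase.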


GATE-2012 ECE Q36 (math)

Question 36 on math from the GATE (Graduate Aptitude Test in Engineering) 2012 Electronics and Communication Engineering paper. Q36. A fair coin is tossed till a head appears for the first time. The probability that the number of required tosses is odd, is (A) 1/3 (B) 1/2 (C) 2/3 (D) 3/4 Solution: Let us start by…
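For reference, the truncated solution heads toward a geometric series: the first head appears on toss $k$ with probability $(1/2)^k$, so summing over the odd values of $k$,

```latex
P(\text{odd}) = \sum_{m=0}^{\infty} \left(\frac{1}{2}\right)^{2m+1}
              = \frac{1/2}{1 - 1/4}
              = \frac{2}{3}
```

which is option (C).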


GATE-2012 ECE Q7 (digital)

Question 7 on digital from the GATE (Graduate Aptitude Test in Engineering) 2012 Electronics and Communication Engineering paper. Q7. The output Y of a 2-bit comparator is logic 1 whenever the 2-bit input A is greater than the 2-bit input B. The number of combinations for which the output is logic 1 is (A) 4 (B)…
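A quick brute-force check of my own (not part of the paper): enumerate all 16 input pairs and count those with A > B.

```python
# Count 2-bit input pairs (A, B) with A > B; each input ranges over 0..3.
count = sum(1 for a in range(4) for b in range(4) if a > b)
print(count)  # 6
```

Equivalently, any two distinct values satisfy A > B in exactly one of their two orderings, giving C(4,2) = 6 combinations.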


Alamouti STBC

In the recent past, we have discussed three receive diversity schemes: Selection Combining, Equal Gain Combining, and Maximal Ratio Combining. All three approaches used an antenna array at the receiver to improve demodulation performance, albeit with different levels of complexity. Time to move on to a transmit diversity scheme where the information…
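As a preview of the scheme (a sketch under my own assumptions, not the post's code): the 2x1 Alamouti STBC sends two symbols over two symbol times across two transmit antennas, and a simple linear combiner at the single-antenna receiver separates them.

```python
import numpy as np

# Minimal sketch of 2x1 Alamouti STBC over flat fading, noise-free,
# with the channel assumed constant across the two symbol times.
rng = np.random.default_rng(0)
x1, x2 = 1.0 + 0j, -1.0 + 0j                   # two BPSK symbols
h1, h2 = (rng.normal(size=2) + 1j * rng.normal(size=2)) / np.sqrt(2)

# Time 1: antennas send [x1, x2]; time 2: they send [-conj(x2), conj(x1)].
y1 = h1 * x1 + h2 * x2
y2 = -h1 * np.conj(x2) + h2 * np.conj(x1)

# Linear combining recovers each symbol scaled by the channel energy.
gain = abs(h1) ** 2 + abs(h2) ** 2
x1_hat = (np.conj(h1) * y1 + h2 * np.conj(y2)) / gain
x2_hat = (np.conj(h2) * y1 - h1 * np.conj(y2)) / gain
print(x1_hat, x2_hat)                          # ≈ x1, x2
```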


OCW: Communication System Design

While browsing the web for material on wireless communication and implementation, I found this rich set of articles as part of the MIT OpenCourseWare program. The course is from Vladimir Stojanovic: course materials for 6.973 Communication System Design, Spring 2006. MIT OpenCourseWare (http://ocw.mit.edu/), Massachusetts Institute of Technology.


Stochastic Gradient Descent

For curve fitting using linear regression, there exists a minor variant of the Batch Gradient Descent algorithm, called Stochastic Gradient Descent. In Batch Gradient Descent, the parameter vector $\theta$ is updated as $\theta_j := \theta_j - \alpha \sum_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right)x_j^{(i)}$ (a loop over all elements of the training set in one iteration). For Stochastic Gradient Descent, the vector gets updated as, at each iteration the…
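A minimal sketch of the stochastic update in code (the data, model, and step size below are made-up illustrations, not the post's example):

```python
import numpy as np

# Stochastic gradient descent for y ≈ theta_0 + theta_1 * x:
# the parameters are updated from one training example at a time.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=200)
y = 3.0 + 2.0 * x + rng.normal(scale=0.1, size=200)

theta = np.zeros(2)
alpha = 0.1                                    # learning rate
for epoch in range(200):
    for i in rng.permutation(len(x)):          # one sample per update
        xi = np.array([1.0, x[i]])             # [bias, feature]
        error = theta @ xi - y[i]
        theta -= alpha * error * xi
print(theta)                                   # ≈ [3.0, 2.0]
```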


Batch Gradient Descent

I happened to stumble on Prof. Andrew Ng’s Machine Learning classes, which are available online as part of the Stanford Center for Professional Development. The first lecture in the series discusses the topic of fitting parameters for a given data set using linear regression. For understanding this concept, I chose to take data from the top…
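For contrast with the stochastic variant above, here is a minimal batch version (again with made-up data, not the data the post uses): every update sums the gradient over the entire training set before touching the parameters.

```python
import numpy as np

# Batch gradient descent for y ≈ theta_0 + theta_1 * x.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=200)
y = 3.0 + 2.0 * x + rng.normal(scale=0.1, size=200)

X = np.column_stack([np.ones_like(x), x])      # design matrix: [bias, feature]
theta = np.zeros(2)
alpha = 0.1
for iteration in range(2000):
    gradient = X.T @ (X @ theta - y) / len(y)  # averaged over all samples
    theta -= alpha * gradient
print(theta)                                   # ≈ [3.0, 2.0]
```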
