Stochastic Gradient Descent

For curve fitting using linear regression, there exists a minor variant of the Batch Gradient Descent algorithm, called Stochastic Gradient Descent. In Batch Gradient Descent, the parameter vector $\theta$ is updated as $\theta_j := \theta_j + \alpha\sum_{i=1}^{m}\big(y^{(i)} - h_\theta(x^{(i)})\big)x_j^{(i)}$ (looping over all elements of the training set in one iteration). For Stochastic Gradient Descent, the vector gets updated as, at each iteration the…
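As a quick illustrative sketch (not the code from the full post), the two update rules for a least-squares linear fit might look like this in Python; the learning rates, iteration counts and the toy data below are assumptions chosen only for illustration.

```python
import numpy as np

def batch_gradient_descent(X, y, alpha=0.5, n_iters=2000):
    """One theta update per pass over the *entire* training set."""
    theta = np.zeros(X.shape[1])
    for _ in range(n_iters):
        error = y - X @ theta                      # residuals for all m examples
        theta += alpha * X.T @ error / len(y)      # single update uses every example
    return theta

def stochastic_gradient_descent(X, y, alpha=0.1, n_epochs=20):
    """One theta update per training example."""
    theta = np.zeros(X.shape[1])
    for _ in range(n_epochs):
        for xi, yi in zip(X, y):
            error = yi - xi @ theta                # residual for a single example
            theta += alpha * error * xi            # update immediately, example by example
    return theta

# toy example: fit y = 2x + 1 (assumed data)
x = np.linspace(0, 1, 100)
X = np.column_stack([np.ones_like(x), x])
y = 2 * x + 1 + 0.01 * np.random.randn(100)
print(batch_gradient_descent(X, y))       # ≈ [1.0, 2.0]
print(stochastic_gradient_descent(X, y))  # ≈ [1.0, 2.0]
```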

Read More

Alamouti STBC

In the recent past, we have discussed three receive diversity schemes – Selection Combining, Equal Gain Combining and Maximal Ratio Combining. All three approaches use the antenna array at the receiver to improve the demodulation performance, albeit with different levels of complexity. Time to move on to a transmit diversity scheme where the information…
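For a flavour of what the post covers, here is a hedged sketch of the classical 2-transmit, 1-receive Alamouti encoding and combining. It is not the simulation from the full post; the flat Rayleigh channel, the two BPSK symbols and the omission of noise are assumptions for readability.

```python
import numpy as np

# two BPSK symbols to be sent over two symbol periods from two transmit antennas
x1, x2 = 1.0 + 0j, -1.0 + 0j

# Alamouti space-time block code:
#   time 1: antenna 1 sends x1,        antenna 2 sends x2
#   time 2: antenna 1 sends -conj(x2), antenna 2 sends conj(x1)
h1, h2 = (np.random.randn(2) + 1j * np.random.randn(2)) / np.sqrt(2)  # flat Rayleigh taps, constant over both periods
n1, n2 = 0.0, 0.0                                                     # noise omitted for clarity

y1 = h1 * x1 + h2 * x2 + n1                        # received in symbol period 1
y2 = -h1 * np.conj(x2) + h2 * np.conj(x1) + n2     # received in symbol period 2

# linear combining at the receiver separates the two symbols with diversity order 2
x1_hat = np.conj(h1) * y1 + h2 * np.conj(y2)
x2_hat = np.conj(h2) * y1 - h1 * np.conj(y2)
gain = abs(h1) ** 2 + abs(h2) ** 2
print(x1_hat / gain, x2_hat / gain)  # ≈ x1, x2
```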

Read More

Maximal Ratio Combining (MRC)

This is the third post in the series discussing receiver diversity in a wireless link. Receiver diversity is a form of space diversity, where there are multiple antennas at the receiver. The presence of receiver diversity poses an interesting problem – how do we effectively use the information from all the antennas to demodulate the…
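As a minimal sketch of the idea (the two-antenna toy data and noise level below are assumptions, not the post's simulation), Maximal Ratio Combining weights each receive branch by the conjugate of its channel gain, so the branches add coherently and the stronger branches count for more:

```python
import numpy as np

def mrc_combine(y, h):
    """Maximal Ratio Combining: weight each receive branch by the conjugate of
    its channel gain and normalise by the total channel power."""
    return np.sum(np.conj(h) * y) / np.sum(np.abs(h) ** 2)

# toy example: one BPSK symbol received on 2 antennas
x = 1.0 + 0j
h = (np.random.randn(2) + 1j * np.random.randn(2)) / np.sqrt(2)   # Rayleigh gain per antenna
n = 0.1 * (np.random.randn(2) + 1j * np.random.randn(2))          # additive noise per antenna
y = h * x + n                                                     # received signal on each antenna
print(mrc_combine(y, h))  # ≈ x, with output SNR equal to the sum of the branch SNRs
```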

Read More

BPSK BER with OFDM modulation

Of late, I am getting frequent requests for bit error rate simulations using OFDM (Orthogonal Frequency Division Multiplexing) modulation. In this post, we will discuss a simple OFDM transmitter and receiver, find the relation between Eb/No (Bit to Noise ratio) and Es/No (Symbol to Noise ratio) and compute the bit error rate with BPSK.
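The sketch below shows one common form of the Eb/No to Es/No relation for OFDM with BPSK, together with the theoretical AWGN bit error rate; the 802.11a-style numbers (64-point FFT, 52 used subcarriers, 16-sample cyclic prefix) are assumptions for illustration and may differ from the parameters used in the full post.

```python
from math import erfc, sqrt, log10

# Assumed 802.11a-style OFDM parameters (illustrative only)
nFFT = 64    # FFT size
nDSC = 52    # number of used data subcarriers
nCP  = 16    # cyclic prefix length in samples

for EbN0_dB in range(0, 11):
    # With BPSK (1 bit per subcarrier), transmit energy is also spent on unused
    # subcarriers and on the cyclic prefix, so the per-sample symbol SNR is lower
    # than the per-bit SNR:
    #   Es/N0 = Eb/N0 * (nDSC / nFFT) * (nFFT / (nFFT + nCP))
    EsN0_dB = EbN0_dB + 10 * log10(nDSC / nFFT) + 10 * log10(nFFT / (nFFT + nCP))
    # Theoretical BPSK bit error rate in AWGN, expressed against Eb/N0
    ber = 0.5 * erfc(sqrt(10 ** (EbN0_dB / 10)))
    print(f"Eb/N0 = {EbN0_dB:2d} dB -> Es/N0 = {EsN0_dB:5.2f} dB, BER = {ber:.2e}")
```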

Read More

GATE-2012 ECE Q16 (electromagnetics)

Question 16 on electromagnetics from the GATE (Graduate Aptitude Test in Engineering) 2012 Electronics and Communication Engineering paper. Q16. A coaxial cable with an inner diameter of 1 mm and outer diameter of 2.4 mm is filled with a dielectric of relative permittivity 10.89. Given , the characteristic impedance of the cable is (A) (B) (C) (D) Solution To…
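As a quick sketch of the standard coaxial-line impedance formula applied to the numbers in the question (this is my own back-of-the-envelope calculation, not a reproduction of the answer choices missing above):

```python
import numpy as np

# Characteristic impedance of a coaxial line: Z0 = eta0 / (2*pi*sqrt(eps_r)) * ln(D/d)
eta0  = 120 * np.pi   # intrinsic impedance of free space, ~376.73 ohm
eps_r = 10.89         # relative permittivity of the dielectric (sqrt is 3.3)
d, D  = 1.0, 2.4      # inner and outer diameters in mm (only the ratio matters)

Z0 = eta0 / (2 * np.pi * np.sqrt(eps_r)) * np.log(D / d)
print(f"Z0 = {Z0:.1f} ohm")   # ≈ 15.9 ohm
```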

Read More

Closed form solution for linear regression

In the previous post on Batch Gradient Descent and Stochastic Gradient Descent, we looked at two iterative methods for finding the parameter vector $\theta$ which minimizes the square of the error between the predicted value $h_\theta(x)$ and the actual output $y$ for all $m$ values in the training set. A closed form solution for finding the parameter vector $\theta$ is possible, and in this post…
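A minimal sketch of the closed-form (normal equation) solution, $\theta = (X^TX)^{-1}X^Ty$; the toy data is an assumption for illustration and the same fit used in the gradient-descent sketch above:

```python
import numpy as np

# toy data: fit y = 2x + 1 (assumed)
x = np.linspace(0, 1, 100)
y = 2 * x + 1 + 0.01 * np.random.randn(100)

X = np.column_stack([np.ones_like(x), x])    # design matrix with an intercept column
theta = np.linalg.solve(X.T @ X, X.T @ y)    # solves the normal equations directly
print(theta)                                 # ≈ [1.0, 2.0]
```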

Read More

Solved!

SOLVED the Rubik’s cube!!!   After 6 months, 2 cubes and countless twists and turns, extremely glad to reach here. Will enjoy the beauty of the solved cube for a couple of days before breaking it and going over the whole journey again…. (Thanks dear Kunju for introducing me to the cube) Disclosure: After solving…

Read More