Comparing BPSK, QPSK, 4PAM, 16QAM, 16PSK, 64QAM and 32PSK

I have written another article on DSPDesignLine.com. This article can be treated as the third post in the series aimed at understanding Shannon’s capacity equation. The first two posts in the series are: 1. Understanding Shannon’s capacity equation 2. Bounds on Communication based on Shannon’s capacity. The article summarizes the symbol error rate derivations…
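As a quick companion to that comparison, here is a minimal Python sketch (my illustration, not code from the article) evaluating the standard closed-form AWGN symbol error rate expressions for several of the listed modulations; the 16QAM and 16PSK expressions are the usual high-SNR approximations:

import numpy as np
from scipy.special import erfc

def ser_awgn(es_n0_db):
    # Es/N0 converted from dB to linear scale
    g = 10.0 ** (es_n0_db / 10.0)
    return {
        'BPSK':  0.5 * erfc(np.sqrt(g)),                                   # exact
        'QPSK':  erfc(np.sqrt(g / 2)) - 0.25 * erfc(np.sqrt(g / 2)) ** 2,  # exact
        '4PAM':  0.75 * erfc(np.sqrt(g / 5)),                              # exact
        '16QAM': 1.5 * erfc(np.sqrt(g / 10)),                              # approximation
        '16PSK': erfc(np.sqrt(g) * np.sin(np.pi / 16)),                    # approximation
    }

for name, ser in ser_awgn(12.0).items():
    print(f'{name}: {ser:.3e}')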

Read More

BER with Matched Filtering

In the post on the transmit pulse shaping filter, we discussed pulse shaping using rectangular and sinc pulses. In this post we will discuss the optimal receiver structure when pulse shaping is used at the transmitter. This receiver structure is also called a matched filter. For the discussion, we will assume rectangular pulse shaping, the channel…
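A rough sketch of the idea (my illustration, not the post’s code; the oversampling factor and Eb/N0 are assumptions): for a rectangular pulse, the matched filter is the same rectangular pulse time-reversed, which reduces to an integrate-and-dump over each symbol period.

import math
import numpy as np

rng = np.random.default_rng(0)
os_factor = 4                              # samples per symbol (assumed)
n_sym = 100_000
bits = rng.integers(0, 2, n_sym)
sym = 2.0 * bits - 1                       # BPSK mapping: 0 -> -1, 1 -> +1

# Rectangular pulse with unit energy; transmit at os_factor samples/symbol
pulse = np.ones(os_factor) / np.sqrt(os_factor)
tx = np.repeat(sym, os_factor) / np.sqrt(os_factor)

# Real AWGN at Eb/N0 = 4 dB (Eb = 1 by construction)
eb_n0_db = 4.0
n0 = 10 ** (-eb_n0_db / 10)
rx = tx + rng.normal(0.0, np.sqrt(n0 / 2), tx.size)

# Matched filter: convolve with the time-reversed pulse, then
# sample at the end of every symbol period (integrate-and-dump)
mf = np.convolve(rx, pulse[::-1])
decisions = (mf[os_factor - 1 :: os_factor][:n_sym] > 0).astype(int)

ber = np.mean(decisions != bits)
theory = 0.5 * math.erfc(math.sqrt(10 ** (eb_n0_db / 10)))
print(f'BER: {ber:.4f}, theory: {theory:.4f}')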

Read More

Equal Gain Combining (EGC)

This is the second post in the series discussing receiver diversity in a wireless link. Receiver diversity is a form of space diversity, where there are multiple antennas at the receiver. The presence of receiver diversity poses an interesting problem – how do we effectively use the information from all the antennas to demodulate the…
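A minimal sketch of the combiner itself (my illustration; the BPSK/Rayleigh setup and Eb/N0 are assumptions): each antenna’s signal is co-phased by multiplying with e^{-jθi}, where θi is the channel phase on that branch, and the branches are then summed with equal (unit) gain.

import numpy as np

rng = np.random.default_rng(1)
n_rx, n_sym = 2, 100_000
bits = rng.integers(0, 2, n_sym)
s = 2.0 * bits - 1                         # BPSK symbols

# Independent flat Rayleigh fading channel per receive antenna
h = (rng.normal(size=(n_rx, n_sym)) + 1j * rng.normal(size=(n_rx, n_sym))) / np.sqrt(2)
n0 = 10 ** (-10 / 10)                      # Eb/N0 = 10 dB (assumed)
noise = np.sqrt(n0 / 2) * (rng.normal(size=(n_rx, n_sym)) + 1j * rng.normal(size=(n_rx, n_sym)))
r = h * s + noise

# Equal gain combining: undo only the channel PHASE on each branch,
# then add the branches with equal weight
y = np.sum(np.exp(-1j * np.angle(h)) * r, axis=0)
ber = np.mean((y.real > 0).astype(int) != bits)
print(f'EGC BER with {n_rx} receive antennas: {ber:.5f}')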

Read More

GATE-2012 ECE Q3 (communication)

Question 3 on Communication from GATE (Graduate Aptitude Test in Engineering) 2012 Electronics and Communication Engineering paper. Q3. In a baseband communications link, frequencies up to 3500 Hz are used for signalling. Using a raised cosine pulse with 75% excess bandwidth and for no inter-symbol interference, the maximum possible signalling rate in symbols per second is, (A)…
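For reference, a baseband raised cosine pulse with roll-off (excess bandwidth) $\alpha$ occupies a bandwidth of $B = \frac{R_s(1+\alpha)}{2}$, so the maximum symbol rate follows directly:

$R_s = \frac{2B}{1+\alpha} = \frac{2 \times 3500}{1 + 0.75} = 4000$ symbols per second.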

Read More

GATE-2012 ECE Q13 (circuits)

Question 13 on analog electronics from GATE (Graduate Aptitude Test in Engineering) 2012 Electronics and Communication Engineering paper. Q13. The diodes and the capacitors in the circuit shown are ideal. The voltage … across the diode … is (A) … (B) … (C) … (D) … Solution: The first half of the circuit is a negative clamper circuit and the second half…

Read More

Stochastic Gradient Descent

For curve fitting using linear regression, there exists a minor variant of the Batch Gradient Descent algorithm, called Stochastic Gradient Descent. In Batch Gradient Descent, the parameter vector $\theta$ is updated as $\theta_j := \theta_j - \alpha\sum_{i=1}^{m}\big(h_\theta(x^{(i)}) - y^{(i)}\big)x_j^{(i)}$ (loop over all elements of the training set in one iteration). For Stochastic Gradient Descent, the vector gets updated as, at each iteration the…
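A minimal Python sketch of the per-example update (my illustration, not the post’s code; the data, learning rate, and epoch count are assumptions):

import numpy as np

rng = np.random.default_rng(42)
m = 200
x = np.column_stack([np.ones(m), rng.uniform(0, 10, m)])  # design matrix with intercept column
y = 3.0 + 2.0 * x[:, 1] + rng.normal(0, 0.5, m)           # noisy line: y = 3 + 2x

theta = np.zeros(2)
alpha = 0.01                                              # learning rate (assumed)

for epoch in range(20):
    for i in rng.permutation(m):                          # visit examples in random order
        err = x[i] @ theta - y[i]                         # h_theta(x_i) - y_i
        theta -= alpha * err * x[i]                       # update using ONE example, not the full sum

print(theta)                                              # should approach [3, 2]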

Read More