GATE-2012 ECE Q16 (electromagnetics)

Question 16 on electromagnetics from the GATE (Graduate Aptitude Test in Engineering) 2012 Electronics and Communication Engineering paper. Q16. A coaxial cable with an inner diameter of 1 mm and outer diameter of 2.4 mm is filled with a dielectric of relative permittivity 10.89. Given , the characteristic impedance of the cable is (A) (B) (C) (D) Solution To…
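As a quick numerical check, the characteristic impedance of a lossless coaxial line follows the standard formula Z0 = (η0 / (2π√εr)) · ln(D/d), where D/d is the outer-to-inner conductor ratio. A minimal Python sketch using the dimensions and permittivity quoted above (the function name and the use of 120π for the free-space impedance are illustrative assumptions, not part of the original solution):

```python
import math

def coax_char_impedance(d_inner, d_outer, eps_r):
    """Characteristic impedance of a lossless coaxial line (TEM mode)."""
    eta0 = 120 * math.pi   # idealized free-space wave impedance, ~377 ohm
    return eta0 / (2 * math.pi * math.sqrt(eps_r)) * math.log(d_outer / d_inner)

# Dimensions and permittivity quoted in the question: 1 mm, 2.4 mm, eps_r = 10.89
print(coax_char_impedance(1e-3, 2.4e-3, 10.89))   # ~15.9 ohm
```

With √10.89 = 3.3 and D/d = 2.4, this evaluates to roughly 15.9 Ω.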

Read More

Scaling factor in QAM

When QAM (Quadrature Amplitude Modulation) is used, one typically finds a scaling factor associated with the constellation-mapping operation. It may be reasonably obvious that this scaling factor normalizes the average energy to one. This post attempts to compute the average energy of the 16-QAM, 64-QAM and general M-QAM constellations (where M is a…
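For the usual square constellation with odd-integer amplitude levels ±1, ±3, …, ±(√M − 1) on each axis, the average symbol energy works out to 2(M − 1)/3, giving a normalizing scale factor of 1/√(2(M − 1)/3), for example 1/√10 for 16-QAM and 1/√42 for 64-QAM. A small NumPy sketch cross-checking that closed form (the level convention and helper name are assumptions made here for illustration, not code from the post):

```python
import numpy as np

def mqam_average_energy(M):
    """Average symbol energy of a square M-QAM constellation whose I and Q
    amplitudes take the odd-integer values +/-1, +/-3, ..., +/-(sqrt(M)-1)."""
    levels = np.arange(-(np.sqrt(M) - 1), np.sqrt(M), 2)   # e.g. [-3,-1,1,3] for 16-QAM
    points = levels[:, None] + 1j * levels[None, :]        # full sqrt(M) x sqrt(M) grid
    return np.mean(np.abs(points) ** 2)

for M in (16, 64, 256):
    print(M, mqam_average_energy(M), 2 * (M - 1) / 3)      # enumerated vs closed form
    # scaling factor for unit average energy: 1 / sqrt(2 * (M - 1) / 3)
```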

Read More

GATE-2012 ECE Q7 (digital)

Question 7 on digital logic from the GATE (Graduate Aptitude Test in Engineering) 2012 Electronics and Communication Engineering paper. Q7. The output Y of a 2-bit comparator is logic 1 whenever the 2-bit input A is greater than the 2-bit input B. The number of combinations for which the output is logic 1 is (A) 4 (B)…
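The count is easy to verify by brute force over all 4 × 4 = 16 input pairs; a short, purely illustrative Python check:

```python
# Count 2-bit pairs (A, B), each in 0..3, for which A > B.
count = sum(1 for a in range(4) for b in range(4) if a > b)
print(count)   # 6 of the 16 combinations give Y = 1
```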

Read More

Solved!

SOLVED the Rubik’s cube!!!   After 6 months, 2 cubes and countless twists and turns, extremely glad to reach here. Will enjoy the beauty of the solved cube for a couple of days before breaking it and going over the whole journey again…. (Thanks, dear Kunju, for introducing me to the cube) Disclosure: After solving…

Read More

Happy New Year 2010

Wishing all the readers of dsplog.com a great year 2010! It’s been a mixed year for dsplog. Some key milestones: a) Crossing 1000 subscribers with 1100+ comments in March 2009 b) Crossing 100 posts with 2200 subscribers and 2600+ comments in October 2009 c) As I write this, we have 102 posts with 2603…

Read More

Stochastic Gradient Descent

For curve fitting using linear regression, there exists a minor variant of the Batch Gradient Descent algorithm, called Stochastic Gradient Descent. In Batch Gradient Descent, the parameter vector $\theta$ is updated as $\theta_j := \theta_j - \frac{\alpha}{m}\sum_{i=1}^{m}\big(h_\theta(x^{(i)}) - y^{(i)}\big)x_j^{(i)}$ (loop over all $m$ elements of the training set in one iteration). For Stochastic Gradient Descent, the vector gets updated as, at each iteration, the…
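A minimal Python sketch of the per-example (stochastic) update for a straight-line fit; the synthetic data, learning rate and epoch count are illustrative assumptions, not values taken from the post:

```python
import numpy as np

# Synthetic straight-line data (an illustrative assumption): y = 2 + 3*x + noise
rng = np.random.default_rng(0)
x = np.c_[np.ones(100), rng.uniform(0, 10, 100)]   # design matrix [1, x]
y = 2.0 + 3.0 * x[:, 1] + rng.normal(0, 0.5, 100)

theta = np.zeros(2)   # parameter vector [theta_0, theta_1]
alpha = 0.01          # learning rate

# Batch gradient descent would accumulate (h_theta(x_i) - y_i) * x_i over ALL
# examples before touching theta; stochastic gradient descent updates after EACH example.
for epoch in range(50):
    for i in rng.permutation(len(y)):
        error = x[i] @ theta - y[i]        # h_theta(x_i) - y_i
        theta -= alpha * error * x[i]      # one stochastic update

print(theta)   # ends up close to the true parameters [2, 3]
```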

Read More