- DSP log - http://www.dsplog.com -

GATE-2012 ECE Q15 (communication)

Posted By Krishna Sankar On January 25, 2013 @ 6:49 am In GATE

Question 15 on communication from GATE (Graduate Aptitude Test in Engineering) 2012 Electronics and Communication Engineering paper.

## Solution

Entropy of a random variable $X$ is defined as

$H(X)=-\sum_{x\in X}p(x)\log_2 p(x)$.
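The definition can be sketched directly in code. This is a minimal Python illustration (the article's own code is Matlab/Octave); the function name `entropy` is mine, and the usual convention $0\log_2 0 = 0$ is applied by skipping zero-probability terms.

```python
import math

def entropy(pmf):
    """Shannon entropy in bits of a probability mass function,
    using the convention 0*log2(0) = 0."""
    return -sum(p * math.log2(p) for p in pmf if p > 0)

# Uniform distribution over 4 symbols: log2(4) = 2 bits
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0

# Deterministic variable: no uncertainty
print(entropy([1.0]))  # 0.0
```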

Refer to Chapter 2 of Elements of Information Theory, Thomas M. Cover, Joy A. Thomas [2].

Let us consider a simple case where $X$ can take two values, 1 and 0, with probabilities $p$ and $1-p$ respectively, i.e.

$X = \left\{\begin{array}{lll}1, && \mbox{probability, }p\\0,&&\mbox{probability, }1-p\end{array}\right.$

The entropy of $X$ is,

$H(X)=-p\log_2p - (1-p)\log_2(1-p)$.
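The binary entropy formula above can be checked numerically. This is a small Python sketch (the helper name `binary_entropy` is mine), confirming the maximum at $p=1/2$ and the value 0 at the deterministic endpoints.

```python
import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), with 0*log2(0) = 0."""
    h = 0.0
    for q in (p, 1.0 - p):
        if q > 0:
            h -= q * math.log2(q)
    return h

print(binary_entropy(0.5))  # 1.0 bit: the maximum, at p = 1/2
print(binary_entropy(0.0))  # 0.0: deterministic source
print(binary_entropy(0.9))  # about 0.469 bits, below the maximum
```

Note also the symmetry $H(p) = H(1-p)$, visible in the plot below.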

The plot of the entropy versus the probability $p$ is shown in the figure below.

```matlab
clear all; close all;
p = 0:.001:1;
% at p = 0 and p = 1 the terms p.*log2(p) evaluate to NaN in Matlab/Octave;
% plot() skips those points, and the limiting value of the entropy there is 0
hx = -p.*log2(p) - (1-p).*log2(1-p);
plot(p,hx);
xlabel('probability, p'); ylabel('H(X)');
title('entropy versus probability, p');
axis([0 1 0 1]); grid on;
```

Figure: Entropy versus probability for a binary symmetric source

It can be seen that the entropy (also termed uncertainty) is maximum at $p=1/2$ and is lower for all other values of $p$. The entropy becomes 0 at $p=0$ or $p=1$, i.e. when the value of $X$ becomes deterministic. Extending this to a source with more than two symbols: as the probability of one symbol grows higher than the others', the uncertainty decreases and hence the entropy also decreases.
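The multi-symbol behaviour can be illustrated numerically. In this Python sketch (my own construction, not from the original post), a three-symbol source starts uniform and one symbol's probability is progressively increased while the remaining mass is split evenly between the other two; the entropy falls monotonically.

```python
import math

def entropy(pmf):
    """Shannon entropy in bits, with 0*log2(0) = 0."""
    return -sum(p * math.log2(p) for p in pmf if p > 0)

# As p1 grows from 1/3 (uniform, maximum entropy log2(3) ~ 1.585 bits)
# towards 1, the source becomes more predictable and entropy decreases.
for p1 in (1/3, 0.5, 0.7, 0.9, 0.99):
    rest = (1 - p1) / 2
    print(f"p1 = {p1:.2f}  H = {entropy([p1, rest, rest]):.4f} bits")
```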

Based on the above, the right choice is (D) decreases.

## References

[1] GATE Examination Question Papers [Previous Years], Indian Institute of Technology Madras, http://gate.iitm.ac.in/gateqps/2012/ec.pdf

[2] Elements of Information Theory, Thomas M. Cover, Joy A. Thomas
