Course Name: Information Theory
Course Type: Elective
Instructor: 林茂昭
College: Electrical Engineering and Computer Science
Department: Graduate Institute of Communication Engineering
Exam Date (Y/M/D): 2015.4.21
Time Limit (minutes): 100 minutes
Exam Questions:
Midterm Exam of Information Theory
April 21, 2015
1. (10%) Let X be a discrete random variable. Let g(X) be a function of X.
Show that H(g(X)) <= H(X).
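A quick numerical illustration of this inequality (a sketch only, not a proof, and not part of the exam): applying a deterministic function g merges outcomes of X, which can only remove uncertainty. The distribution and the choice g(x) = x mod 2 below are arbitrary examples, not taken from the exam.

```python
from math import log2

# distribution of X (an arbitrary example for illustration)
p = {0: 0.2, 1: 0.3, 2: 0.4, 3: 0.1}

def entropy(dist):
    # entropy in bits, skipping zero-probability outcomes
    return -sum(q * log2(q) for q in dist.values() if q > 0)

g = lambda x: x % 2        # any deterministic function g works here
pg = {}
for x, q in p.items():
    # g merges outcomes of X, so their probabilities add up
    pg[g(x)] = pg.get(g(x), 0.0) + q

print(entropy(pg), entropy(p))   # H(g(X)) <= H(X)
```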
2. Consider the following joint distribution p(x, y) on (X, Y):

         y:   a      b      c
      x
      1      1/6    1/12   1/12
      2      1/12   1/6    1/12
      3      1/12   1/12   1/6

   Let X̂(Y) be an estimator for X (based on Y) and let Pe = Pr{X̂(Y) ≠ X}.
   (a)(5%) Find the minimum-probability-of-error estimator X̂(Y) and the
   associated Pe.
   (b)(5%) Evaluate Fano's inequality for this problem and compare.
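A numerical cross-check for this problem (a sketch, assuming nothing beyond the joint distribution in the table): it computes the MAP estimator's error probability and the conditional entropy H(X|Y), then evaluates the right-hand side of Fano's inequality H(X|Y) ≤ H(Pe) + Pe·log2(|X| − 1).

```python
from math import log2

# joint p(x, y): rows x = 1, 2, 3; columns y = 'a', 'b', 'c'
P = {
    (1, 'a'): 1/6,  (1, 'b'): 1/12, (1, 'c'): 1/12,
    (2, 'a'): 1/12, (2, 'b'): 1/6,  (2, 'c'): 1/12,
    (3, 'a'): 1/12, (3, 'b'): 1/12, (3, 'c'): 1/6,
}
xs, ys = (1, 2, 3), ('a', 'b', 'c')

# MAP rule: for each observed y, guess the x maximizing p(x, y);
# the error mass is everything in that column except the maximizer
pe = sum(P[(x, y)] for y in ys
         for x in xs if x != max(xs, key=lambda x2: P[(x2, y)]))

# conditional entropy H(X|Y) = -sum_{x,y} p(x,y) log2 p(x|y)
h_xy = 0.0
for y in ys:
    py = sum(P[(x, y)] for x in xs)
    h_xy -= sum(P[(x, y)] * log2(P[(x, y)] / py) for x in xs)

# Fano bound: H(X|Y) <= H(Pe) + Pe * log2(|X| - 1)
h_pe = -pe * log2(pe) - (1 - pe) * log2(1 - pe)
fano = h_pe + pe * log2(len(xs) - 1)
print(pe, h_xy, fano)
```

For this particular distribution the bound turns out to be tight, which is what makes part (b)'s comparison interesting.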
3. (10%) Let X be a random variable for which P(X = 1) = 1/2, P(X = 2) = 1/4,
   and P(X = 3) = 1/4. Let X1, X2, X3, ... be drawn i.i.d. according to this
   distribution. Find the limiting behavior of the product

      (X1 X2 ... Xn)^(1/n)
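A Monte Carlo sanity check for this limit (a sketch, not a proof): by the strong law of large numbers applied to log Xi, the geometric mean converges almost surely to 2^{E[log2 X]}, which for this distribution is 2^{1/4}·3^{1/4} = 6^{1/4}. The sample size and seed below are arbitrary choices.

```python
import random
from math import exp, log

random.seed(0)
n = 200_000
# draw i.i.d. X with P(X=1)=1/2, P(X=2)=1/4, P(X=3)=1/4
log_sum = 0.0
for _ in range(n):
    u = random.random()
    x = 1 if u < 0.5 else (2 if u < 0.75 else 3)
    log_sum += log(x)

# geometric mean = exp of the average log, by the SLLN -> 6**0.25
geom_mean = exp(log_sum / n)
print(geom_mean)
```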
4. Let X be a random variable for which P(X = 0) = 0.9 and P(X = 1) = 0.1.
(a)(7%) Please find a Huffman code for X^3.
(b)(8%) Please find a Shannon-Fano-Elias Code for X^3.
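For part (a), a hedged sketch of the standard heap-based Huffman construction applied to the 8 outcomes of X^3 (codeword assignments are not unique; only the length distribution and the expected length are):

```python
import heapq
from itertools import product

# the 8 outcomes of X^3 with P(0) = 0.9, P(1) = 0.1
probs = {''.join(s): 0.9 ** s.count('0') * 0.1 ** s.count('1')
         for s in product('01', repeat=3)}

# heap entries: (probability, unique tiebreak, {symbol: partial codeword});
# the counter keeps tuple comparison from ever reaching the dict
heap = [(p, i, {sym: ''}) for i, (sym, p) in enumerate(sorted(probs.items()))]
heapq.heapify(heap)
counter = len(heap)
while len(heap) > 1:
    # merge the two least probable subtrees, prepending one code bit
    p0, _, c0 = heapq.heappop(heap)
    p1, _, c1 = heapq.heappop(heap)
    merged = {s: '0' + w for s, w in c0.items()}
    merged.update({s: '1' + w for s, w in c1.items()})
    heapq.heappush(heap, (p0 + p1, counter, merged))
    counter += 1
code = heap[0][2]

# expected codeword length per block of 3 source symbols
avg_len = sum(probs[s] * len(w) for s, w in code.items())
print(sorted(code.items()), avg_len)
```

The expected length works out to 1.598 bits per block, i.e. about 0.533 bits per source symbol, which can be compared against H(X) ≈ 0.469 bits.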
5. (10%) Consider a three-state stationary Markov chain with state transition
probabilities Pij, i,j ∈ {1,2,3} given by
         j:   1     2     3
      i
      1       0    3/4   1/4
      2       0    1/2   1/2
      3       1     0     0
Please find the associated entropy rate.
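A numerical sketch for this problem (assuming only the transition matrix above): it finds the stationary distribution μ by power iteration and then evaluates the entropy rate of a stationary Markov chain, H = Σᵢ μᵢ H(Pᵢ·), where H(Pᵢ·) is the entropy of row i.

```python
from math import log2

P = [[0, 3/4, 1/4],
     [0, 1/2, 1/2],
     [1, 0,   0  ]]

# power iteration for mu = mu P; the chain is irreducible and
# aperiodic, so this converges to the stationary distribution
mu = [1/3, 1/3, 1/3]
for _ in range(1000):
    mu = [sum(mu[i] * P[i][j] for i in range(3)) for j in range(3)]

# entropy of each row (zero-probability transitions contribute 0)
row_entropy = [sum(-p * log2(p) for p in row if p > 0) for row in P]
H_rate = sum(mu[i] * row_entropy[i] for i in range(3))
print(mu, H_rate)
```

The stationary distribution comes out to μ = (2/7, 3/7, 2/7), giving an entropy rate of roughly 0.66 bits per step.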
6. (10%) Please show that the expected length L of an instantaneous D-ary
code for a random variable X is greater than or equal to the entropy H_D(X),
i.e. L >= H_D(X).
7. Define the conditional entropy

      H_{L|L}(U) = (1/L) H(U_{2L}, ..., U_{L+1} | U_L, ..., U_1)

   for a discrete stationary source of alphabet size K.
   (a)(8%) Prove that H_{L|L}(U) is nonincreasing with L.
   (b)(7%) Prove that H_{L|L}(U) → H(X).
8. (10%) Please describe the Kraft inequality and the McMillan inequality.
9. (10%) Please describe the method of using the typical set for data compression.
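An illustrative sketch of the idea behind typical-set compression (the source, block length, and ε below are my own example choices, not from the exam): for a Bernoulli(0.1) source it counts the ε-typical sequences of length n and checks them against the bound |A_ε^n| ≤ 2^{n(H+ε)}, which is what lets typical sequences be indexed with about n(H+ε) bits.

```python
from math import log2, comb

p, n, eps = 0.1, 25, 0.1                      # example parameters
H = -p * log2(p) - (1 - p) * log2(1 - p)      # entropy of the source

# a binary sequence with k ones has probability p^k (1-p)^(n-k);
# it is eps-typical iff |-(1/n) log2 prob - H| <= eps
size = 0
for k in range(n + 1):
    logprob = k * log2(p) + (n - k) * log2(1 - p)
    if abs(-logprob / n - H) <= eps:
        size += comb(n, k)                    # all C(n, k) such sequences

bound = 2 ** (n * (H + eps))
print(size, bound)                            # size <= bound
```

Here only sequences with 2 or 3 ones are typical, so 2600 of the 2^25 sequences carry nearly all the probability, and an index into this set is far shorter than the raw 25 bits.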