[Exam] 108-2 林茂昭 Information Theory Final Exam

作者: guangto666 (lookfortheredheart )   2020-06-22 19:20:29
Course name: Information Theory
Course type: Elective
Instructor: 林茂昭
College: College of Electrical Engineering and Computer Science
Department: Graduate Institute of Communication Engineering
Exam date (Y/M/D): 2020.6.16
Time limit (minutes): 100 min
Questions:
1.
Consider an additive Gaussian noise channel.
(a) What is the minimum signal-to-noise ratio needed for achieving transmission
rate R = 1/2?
(b) In the literature, it is reported that using a very long rate-1/2 binary
low-density parity-check (LDPC) code, a very low bit-error rate can be
achieved at an SNR only 0.04 dB away from the Shannon limit. Is this Shannon
limit the SNR derived in part (a)? Explain.
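A quick numerical check for part (a) (a sketch, assuming the real-valued AWGN capacity formula C = (1/2)log2(1 + SNR) in bits per channel use; the helper name is mine):

```python
import math

def min_snr_for_rate(R):
    """Smallest SNR for which the real AWGN capacity
    C = 0.5 * log2(1 + SNR) reaches rate R bits per channel use."""
    return 2 ** (2 * R) - 1

snr = min_snr_for_rate(0.5)      # SNR = 2^(2R) - 1
snr_db = 10 * math.log10(snr)
print(snr, snr_db)               # 1.0, i.e. 0 dB
```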
2.
Calculate the capacity of the following channels.
(a) The binary erasure channel with erasure probability α.
(b) The channel with input and output given by X = Y = {1,2,3}, and the
probability transition matrix is:
p(y|x):
          y = 1   y = 2   y = 3
  x = 1    0.5     0.5     0
  x = 2    0       0.5     0.5
  x = 3    0.5     0       0.5
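A sanity check for both parts (a sketch; part (b)'s channel is weakly symmetric, since every row is a permutation of (0.5, 0.5, 0) and the column sums are equal, so C = log|Y| - H(row)):

```python
import math

def bec_capacity(alpha):
    """Binary erasure channel: C = 1 - alpha bits per use."""
    return 1 - alpha

def symmetric_capacity(row, ny):
    """Weakly symmetric channel: C = log2|Y| - H(row of transition matrix)."""
    H = -sum(p * math.log2(p) for p in row if p > 0)
    return math.log2(ny) - H

print(bec_capacity(0.25))                      # 0.75
print(symmetric_capacity([0.5, 0.5, 0.0], 3))  # log2(3) - 1 ≈ 0.585 bits
```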
3.
(a)(5%) Find the differential entropy h(X) = -∫ f ln f for the random
variable with the exponential density f(x) = λexp(-λx), x ≥ 0.
(b)(5%) Show that the exponential distribution with mean 1/λ is
the maximum entropy distribution among all continuous distributions
supported on [0,∞) that have mean 1/λ.
(c)(5%) Let Yi = Xi + Zi, where the Zi are i.i.d. exponentially distributed
with mean μ. Assume a mean constraint on the signal (i.e.,
EXi ≤ λ). Show that the capacity of such a channel is C = log(1 + λ/μ).
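Part (a) has the closed form h(X) = 1 - ln λ nats; a midpoint-rule numerical check (the helper is my own, not part of the exam):

```python
import math

def exp_diff_entropy(lam, n=100000):
    """Midpoint-rule approximation of h(X) = -∫ f ln f dx in nats,
    for f(x) = lam * exp(-lam * x) on x >= 0.
    The tail beyond 40/lam is negligible and is truncated."""
    xmax = 40.0 / lam
    dx = xmax / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * dx
        f = lam * math.exp(-lam * x)
        total -= f * math.log(f) * dx
    return total

lam = 2.0
print(exp_diff_entropy(lam), 1 - math.log(lam))  # both ≈ 0.307 nats
```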
4.
Consider an additive noise fading channel, with input X and output
Y = XV + Z, where Z is the additive noise, V is a random variable
representing fading, and Z and V are independent of each other and of X.
Please show that:
I(X;Y|V) ≧ I(X;Y)
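A proof sketch (the standard chain-rule argument, using the independence of X and V; not necessarily the grader's expected write-up):

```latex
Since $X$ and $V$ are independent, $I(X;V)=0$. Expanding $I(X;Y,V)$ two ways:
\begin{align*}
I(X;Y,V) &= I(X;V) + I(X;Y\mid V) = I(X;Y\mid V),\\
I(X;Y,V) &= I(X;Y) + I(X;V\mid Y) \ge I(X;Y).
\end{align*}
Hence $I(X;Y\mid V) \ge I(X;Y)$.
```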
5.
For a continuous random variable X with mean zero, variance
σ^2, and squared-error distortion, show that
(1/2) log(σ^2/D) ≥ R(D) ≥ h(X) - (1/2) log(2πeD)
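A sketch of the two standard bounds (Shannon lower bound and Gaussian worst-case upper bound), again as an unofficial outline:

```latex
% Shannon lower bound:
R(D) = \min I(X;\hat X) = h(X) - \max h(X\mid\hat X)
     \ge h(X) - h(X - \hat X) \ge h(X) - \tfrac12\log(2\pi e D),
% since the Gaussian maximizes differential entropy subject to
% E(X-\hat X)^2 \le D.
% Upper bound: the Gaussian source of the same variance is hardest to
% compress, so
R(D) \le \tfrac12\log\frac{\sigma^2}{D}.
```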
6.
Consider a random variable with ternary alphabet {A,B,C}, with P(A) = P(B) = 0.4
and P(C) = 0.2. Use arithmetic coding to encode the sequence ACB, where we
assume the symbol order A < B < C.
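The encoding interval for ACB can be traced mechanically (a sketch with my own helper; with order A < B < C the first-level intervals are A: [0, 0.4), B: [0.4, 0.8), C: [0.8, 1)):

```python
def arithmetic_interval(seq, probs, order):
    """Track the arithmetic-coding interval [low, high) for `seq`,
    with symbols in `order` and source probabilities `probs`."""
    # cumulative probability mass before each symbol
    cum, c = {}, 0.0
    for s in order:
        cum[s] = c
        c += probs[s]
    low, high = 0.0, 1.0
    for s in seq:
        width = high - low
        low, high = low + width * cum[s], low + width * (cum[s] + probs[s])
    return low, high

probs = {'A': 0.4, 'B': 0.4, 'C': 0.2}
low, high = arithmetic_interval("ACB", probs, "ABC")
print(low, high)  # [0.352, 0.384)
```

Any number in the final interval (e.g. a shortest binary fraction inside [0.352, 0.384)) identifies the sequence.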
7.
Assume that we have a sender of power P and two distant receivers. The
model of the channel is Y1 = X + Z1 and Y2 = X + Z2, where Z1 and Z2 are
arbitrarily correlated Gaussian random variables with variances N1 and N2
respectively and N1 < N2.
(a) (7%) Describe the capacity region of this Gaussian broadcast channel and
the procedure of decoding.
(b) (8%) Show that the rate pairs in the capacity region can be better than
the simple time-sharing.
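A numerical illustration of part (b) (the parameters P, N1, N2 and the power split are my own choices, not from the exam): with superposition coding, receiver 2's signal is decoded first at receiver 1 and cancelled, and for a fixed power split the resulting pair beats time-sharing at the same R1:

```python
import math

def C(x):
    """Gaussian capacity, bits per real channel use."""
    return 0.5 * math.log2(1 + x)

# Assumed example parameters:
P, N1, N2 = 10.0, 1.0, 5.0
a = 0.5  # fraction of power devoted to the good receiver (N1 < N2)

# Superposition coding: receiver 2 treats user 1's signal as noise;
# receiver 1 decodes user 2's message, cancels it, then decodes its own.
R1 = C(a * P / N1)
R2 = C((1 - a) * P / (a * P + N2))

# Time-sharing that achieves the same R1: fraction t of time to user 1.
t = R1 / C(P / N1)
R2_ts = (1 - t) * C(P / N2)

print(R1, R2, R2_ts)   # superposition R2 exceeds the time-sharing R2
```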
8.
Use the sliding-window Lempel-Ziv algorithm to encode the sequence
0000001101010000011010111.
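A toy sliding-window encoder/decoder pair for tracing the triples (offset, length, next symbol); this is one common LZ77 convention and may differ in details from the one used in class:

```python
def lz77_encode(s, window=16):
    """Greedy sliding-window Lempel-Ziv: emit (offset, length, next_symbol)."""
    i, out = 0, []
    while i < len(s):
        start = max(0, i - window)
        best_len, best_off = 0, 0
        for j in range(start, i):
            l = 0
            # overlapping matches are allowed; keep one symbol for the literal
            while i + l < len(s) - 1 and s[j + l] == s[i + l]:
                l += 1
            if l > best_len:
                best_len, best_off = l, i - j
        out.append((best_off, best_len, s[i + best_len]))
        i += best_len + 1
    return out

def lz77_decode(tokens):
    """Invert lz77_encode by replaying each (offset, length, literal)."""
    s = ""
    for off, length, ch in tokens:
        for _ in range(length):
            s += s[-off]   # copying one symbol at a time handles overlap
        s += ch
    return s

seq = "0000001101010000011010111"
tokens = lz77_encode(seq)
print(tokens)
```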
9.
(a) Describe the capacity region of the multiple-access channel (X1 × X2, p(y|x1,x2), Y)
for a given product distribution p1(x1)p2(x2) on X1 × X2.
(b) Describe the capacity region in (a) when Y = X1 + X2 + Z, where Z is a zero-mean
Gaussian random variable with variance N and the power constraint on Xj is Pj.
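For part (b), the region is the usual pentagon; a small numerical check with assumed values P1 = 10, P2 = 5, N = 1 (not from the exam):

```python
import math

def C(x):
    """Gaussian capacity, bits per real channel use."""
    return 0.5 * math.log2(1 + x)

def gaussian_mac_region(P1, P2, N):
    """Constraints of the two-user Gaussian MAC pentagon:
    R1 <= C(P1/N), R2 <= C(P2/N), R1 + R2 <= C((P1 + P2)/N)."""
    return C(P1 / N), C(P2 / N), C((P1 + P2) / N)

r1, r2, rsum = gaussian_mac_region(10.0, 5.0, 1.0)
print(r1, r2, rsum)  # the sum-rate bound cuts the corner: rsum < r1 + r2
```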
10. Suppose that the encoder does not know the typical set
for X^n but does know the entropy rate H(X). How can we implement
source coding for sequences in X^n?
