Course: Digital Communication    Semester: 04-05-2    Score:
Department: Radio Engineering    Exam format: closed book    Duration: 120 minutes    Pages:
Section A: True or False (15%)
1. When the period is exactly 2^m, the PN sequence is called a maximal-length sequence or simply m-sequence.
2. For a period of the maximal-length sequence, the autocorrelation function is similar to that of a random binary wave.
3. For slow frequency hopping, the symbol rate R_s of the MFSK signal is an integer multiple of the hop rate R_h. That is, the carrier frequency will change or hop several times during the transmission of one symbol.
4. Frequency diversity can be done by choosing a frequency spacing equal to or less than the coherence bandwidth of the channel.
5. The mutual information of a channel therefore depends not only on the channel but also on the way in which the channel is used.
6. Shannon's second theorem specifies the channel capacity C as a fundamental limit on the rate at which the transmission of reliable error-free messages can take place over a discrete memoryless channel, and it specifies how to construct a good code.
7. The syndrome depends not only on the error pattern, but also on the transmitted code word.
8. Any pair of primitive polynomials of degree m whose corresponding shift registers generate m-sequences of period 2^m - 1 can be used to generate a Gold sequence.
9. Any source code that satisfies the Kraft-McMillan inequality can be a prefix code.
10. Let a discrete memoryless source with an alphabet ϕ have entropy H(ϕ) and produce symbols once every T_s seconds. Let a discrete memoryless channel have capacity C and be used once every T_c seconds. Then, if H(ϕ)/T_s ≥ C/T_c, there exists a coding scheme for which the source output can be transmitted over the channel and be reconstructed with an arbitrarily small probability of error.

Section B: Fill in the blanks (35%)
1. The two commonly used types of spread-spectrum modulation: ________ and ________.
2. A pseudo-noise (PN) sequence is a periodic binary sequence with a ________ waveform that is usually generated by means of a ________.
3. Due to ________, a wireless communication channel can no longer be modeled as an idealized AWGN channel.
4. There are the following diversity techniques in our discussion: ________ diversity, ________ diversity, and ________ diversity.
5. Three major sources of degradation in wireless communications are ________, ________, and ________; the latter two are byproducts of multipath.
6. The information capacity of a continuous channel of bandwidth B hertz, perturbed by additive white Gaussian noise of power spectral density N_0/2 and limited in bandwidth to B, is given by ________.
7. The ________ (or syndrome) is defined as: ________.
8. For linear block codes, correct all error patterns of Hamming weight w(e) ≤ t if and only if ________.
9. TCM combines ________ and ________ as a single entity to attain a more effective utilization of the available ________ and ________.
10. In a DS/BPSK system, the feedback shift register used to generate the PN sequence has length m = 19; the processing gain is then ________.
11. Let X represent the outcome of a single roll of a fair die. The entropy of X is ________.
12. A voice-grade channel of the telephone network has a bandwidth of 3.4 kHz. The information capacity of the telephone channel for a signal-to-noise ratio of 30 dB is ________; the minimum signal-to-noise ratio required to support information transmission through the telephone channel at the rate of 9,600 b/s is ________.
13. For an m-sequence generated by a linear feedback shift register of length 5, the total number of runs is ________, the number of length-two runs is ________, and the autocorrelation R(j) = ________ (j ≠ 0).
14. If the coherence bandwidth of the channel is small compared to the message bandwidth, the fading is said to be ________. If the coherence time of the channel is large compared to the duration of the signal, the fading is said to be ________.
15. A source emits one of five symbols s_0, s_1, s_2, s_3, and s_4 with probabilities 1/2, 1/4, 1/8, 1/16, 1/16, respectively. The successive symbols emitted by the source are statistically independent. The entropy of the source is ________. The average code-word length for any distortionless source encoding scheme for this source is bounded as ________.
16. For a finite variance σ^2, the ________ random variable has the largest differential entropy attainable by any random variable, and the entropy is uniquely determined by the ________.
17. Set-partitioning design partitions the M-ary constellation of interest successively into subsets with progressively increasing ________ between their respective signal points.
18. ________ codes and ________ codes have an error performance within a hair's breadth of Shannon's theoretical limit on channel capacity in a physically realizable fashion.
19. When an infinite number of decoding errors are caused by a finite number of transmission errors, the convolutional code is called a ________.
Section C: Problems (50%)
1. A radio link uses a pair of 2 m dish antennas, each with an efficiency of 70 percent, as transmitting and receiving antennas. Other specifications of the link are:
Transmitted power = 2 dBW (not including the antenna power gain)
Carrier frequency = 12 GHz
Distance of the receiver from the transmitter = 200 m
Calculate (a) the free-space loss, (b) the power gain of each antenna, (c) the received power in dBW.
2. A computer executes four instructions that are designated by the code words (00, 01, 10, 11). Assume that the instructions are used independently with probabilities (1/2, 1/8, 1/8, 1/4).
(a) Construct a Huffman code for the instructions.
(b) Calculate the percentage by which the number of bits used for the instructions may be reduced by the use of a Huffman code.
3. Consider the (15, 8) cyclic code defined by the generator polynomial g(X) = 1 + X + X^3 + X^7.
(a) Develop the encoder for this code.
(b) Obtain the generator matrix and the parity-check matrix.
(c) Construct a systematic code word for the message sequence 10110011.
(d) The received word is 110001000000001; determine the syndrome polynomial s(X) for this received word.
4. Consider the rate r = 1/3, constraint length K = 3 convolutional encoder. The generator sequences of the encoder are as follows: g^(1) = (1, 0, 0), g^(2) = (1, 0, 1), g^(3) = (1, 1, 1).
(a) Draw the block diagram of the encoder.
(b) Construct the code tree.
(c) Construct the signal-flow graph and obtain the input-output state equations.
(d) Determine the encoder output produced by the message sequence 10111….
(e) The received sequence is 110, 001, 101, 110, 000, 011. Use the Viterbi algorithm to compute the decoded sequence.
Answers
Section A: True or False (1.5 points each, 15 points in total)
1. When the period is exactly 2^m, the PN sequence is called a maximal-length sequence or simply m-sequence. (F)
2. For a period of the maximal-length sequence, the autocorrelation function is similar to that of a random binary wave. (T)
3. For slow frequency hopping, the symbol rate R_s of the MFSK signal is an integer multiple of the hop rate R_h. That is, the carrier frequency will change or hop several times during the transmission of one symbol. (F)
4. Frequency diversity can be done by choosing a frequency spacing equal to or less than the coherence bandwidth of the channel. (F)
5. The mutual information of a channel therefore depends not only on the channel but also on the way in which the channel is used. (T)
6. Shannon's second theorem specifies the channel capacity C as a fundamental limit on the rate at which the transmission of reliable error-free messages can take place over a discrete memoryless channel, and it specifies how to construct a good code. (F)
7. The syndrome depends not only on the error pattern, but also on the transmitted code word. (F)
8. Any pair of primitive polynomials of degree m whose corresponding shift registers generate m-sequences of period 2^m - 1 can be used to generate a Gold sequence. (F)
9. Any source code that satisfies the Kraft-McMillan inequality can be a prefix code. (F)
10. Let a discrete memoryless source with an alphabet ϕ have entropy H(ϕ) and produce symbols once every T_s seconds. Let a discrete memoryless channel have capacity C and be used once every T_c seconds. Then, if H(ϕ)/T_s ≥ C/T_c, there exists a coding scheme for which the source output can be transmitted over the channel and be reconstructed with an arbitrarily small probability of error. (F)
Section B: Fill in the blanks (1 point per blank, 35 points in total)
1. The two commonly used types of spread-spectrum modulation: direct sequence and frequency hopping.
2. A pseudo-noise (PN) sequence is a periodic binary sequence with a noiselike waveform that is usually generated by means of a feedback shift register.
3. Due to multipath, a wireless communication channel can no longer be modeled as an idealized AWGN channel.
4. There are the following diversity techniques in our discussion: frequency diversity, time diversity, and space diversity.
5. Three major sources of degradation in wireless communications are co-channel interference, fading, and delay spread; the latter two are byproducts of multipath.
6. The information capacity of a continuous channel of bandwidth B hertz, perturbed by additive white Gaussian noise of power spectral density N_0/2 and limited in bandwidth to B, is given by C = B log2(1 + P/(N_0 B)) bits per second, where P is the average transmitted signal power.
7. The error-syndrome vector (or syndrome) is defined as: s = rH^T.
8. For linear block codes, correct all error patterns of Hamming weight w(e) ≤ t if and only if d_min ≥ 2t + 1.
9. TCM combines coding and modulation as a single entity to attain a more effective utilization of the available bandwidth and power.
10. In a DS/BPSK system, the feedback shift register used to generate the PN sequence has length m = 19; the processing gain is then 2^19 - 1 = 524287, i.e., about 57 dB.
11. Let X represent the outcome of a single roll of a fair die. The entropy of X is log2(6) ≈ 2.585 bits/symbol.
12. A voice-grade channel of the telephone network has a bandwidth of 3.4 kHz. The information capacity of the telephone channel for a signal-to-noise ratio of 30 dB is about 33.9 kbits/second; the minimum signal-to-noise ratio required to support information transmission through the telephone channel at the rate of 9,600 b/s is about 7.8 dB. (The numeric answers of items 10-13 and 15 are verified in the short script shown after this list.)
13. For an m-sequence generated by a linear feedback shift register of length 5, the total number of runs is 16, the number of length-two runs is 4, and the autocorrelation R(j) = -1/31 (j ≠ 0).
14. If the coherence bandwidth of the channel is small compared to the message bandwidth, the fading is said to be frequency selective. If the coherence time of the channel is large compared to the duration of the signal, the fading is said to be time nonselective or time flat.
15. A source emits one of five symbols s_0, s_1, s_2, s_3, and s_4 with probabilities 1/2, 1/4, 1/8, 1/16, 1/16, respectively. The successive symbols emitted by the source are statistically independent. The entropy of the source is 15/8 = 1.875 bits/symbol. The average code-word length L for any distortionless source encoding scheme for this source is bounded as L ≥ H(ϕ).
16. For a finite variance σ^2, the Gaussian random variable has the largest differential entropy attainable by any random variable, and the entropy is uniquely determined by the variance of X.
17. Set-partitioning design partitions the M-ary constellation of interest successively into subsets with progressively increasing minimum Euclidean distance between their respective signal points.
18. Turbo codes and low-density parity-check (LDPC) codes have an error performance within a hair's breadth of Shannon's theoretical limit on channel capacity in a physically realizable fashion.
19. When an infinite number of decoding errors are caused by a finite number of transmission errors, the convolutional code is called a catastrophic code.
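For reference, the numerical answers in items 10-13 and 15 can be checked with the following short Python sketch. This script is an addition for verification only and is not part of the original answer key.

import math

# Item 10: processing gain of a DS/BPSK system with a length-19 PN generator
pn_period = 2**19 - 1
print("processing gain:", 10 * math.log10(pn_period), "dB")       # ~57.2 dB

# Item 11: entropy of a fair six-sided die
print("die entropy:", math.log2(6), "bits/symbol")                 # ~2.585

# Item 12: telephone channel, B = 3.4 kHz, SNR = 30 dB
B = 3400.0
snr = 10**(30 / 10)
print("capacity:", B * math.log2(1 + snr), "b/s")                  # ~33.9 kb/s
snr_min = 2**(9600 / B) - 1
print("min SNR for 9600 b/s:", 10 * math.log10(snr_min), "dB")     # ~7.8 dB

# Item 13: off-peak normalized autocorrelation of a period-31 m-sequence
print("R(j), j != 0:", -1 / (2**5 - 1))                            # -1/31

# Item 15: entropy of the five-symbol source
p = [1/2, 1/4, 1/8, 1/16, 1/16]
print("source entropy:", -sum(x * math.log2(x) for x in p), "bits/symbol")  # 1.875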
Section C: Problems
1. A radio link uses a pair of 2 m dish antennas, each with an efficiency of 70 percent, as transmitting and receiving antennas. Other specifications of the link are:
Transmitted power = 2 dBW (not including the antenna power gain)
Carrier frequency = 12 GHz
Distance of the receiver from the transmitter = 200 m
Calculate (a) the free-space loss, (b) the power gain of each antenna, (c) the received power in dBW. (10 points)
Solution:
(a) Free-space loss:
L_freespace = 10 log10 [λ/(4πd)]^2 = 20 log10 [ (3×10^8 / (12×10^9)) / (4π × 200) ] ≈ -100 dB
(b) The power gain of each antenna, with effective aperture A = 0.7 × π × (2/2)^2 m^2 and λ = 3×10^8 / (12×10^9) = 0.025 m:
G_t = G_r = 10 log10 (4πA/λ^2) = 10 log10 [ 4π × 0.7π / (0.025)^2 ] ≈ 46.46 dB
(c) The received power = transmitted power + G_t + G_r + free-space loss = 2 + 46.46 + 46.46 + (-100) = -5.08 dBW
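The link-budget figures above can be cross-checked numerically. The following Python sketch simply evaluates the same Friis-type expressions; the variable names are mine, not the exam's.

import math

c = 3e8                 # speed of light, m/s
f = 12e9                # carrier frequency, Hz
d = 200.0               # link distance, m
D = 2.0                 # dish diameter, m
eta = 0.7               # aperture efficiency
Pt_dBW = 2.0            # transmitted power, dBW

lam = c / f                                              # wavelength, ~0.025 m

# (a) free-space loss expressed as a (negative) gain in dB
Lfs_dB = 20 * math.log10(lam / (4 * math.pi * d))        # ~ -100 dB

# (b) gain of each dish: G = 4*pi*A_eff / lambda^2, A_eff = eta*pi*(D/2)^2
A_eff = eta * math.pi * (D / 2) ** 2
G_dB = 10 * math.log10(4 * math.pi * A_eff / lam ** 2)   # ~ 46.46 dB

# (c) received power from the link budget
Pr_dBW = Pt_dBW + 2 * G_dB + Lfs_dB                      # ~ -5.08 dBW

print(Lfs_dB, G_dB, Pr_dBW)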
2. A computer executes four instructions that are designated by the code words (00, 01, 10, 11). Assume that the instructions are used independently with probabilities (1/2, 1/8, 1/8, 1/4).
(a) Construct a Huffman code for the instructions.
(b) Calculate the percentage by which the number of bits used for the instructions may be reduced by the use of a Huffman code.
(10 points)
Solution:
(a) The combined symbols may be placed as high as possible or as low as possible at each step of the construction; one resulting Huffman code is:

Computer code   Probability   Huffman code
00              1/2           1
11              1/4           01
01              1/8           000
10              1/8           001
(b) The number of bits used for the instructions based on the computer code, in a probabilistic sense, is equal to 2 bits per instruction. The average code-word length of the Huffman code is
(1/2)(1) + (1/4)(2) + (1/8)(3) + (1/8)(3) = 1.75 bits per instruction.
The number of bits may therefore be reduced by (2 - 1.75)/2 = 12.5 percent.
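As a cross-check, a minimal Huffman-coding sketch in Python (the helper huffman_lengths is my own, not part of the solution) reproduces the code-word lengths (1, 3, 3, 2), the 1.75-bit average, and the 12.5 percent reduction.

import heapq

def huffman_lengths(probs):
    """Return the Huffman code-word length for each symbol probability."""
    # heap entries: (probability, tie-breaking counter, symbol indices in this subtree)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:          # every symbol in a merged subtree gains one bit
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

probs = [1/2, 1/8, 1/8, 1/4]        # instructions 00, 01, 10, 11
lengths = huffman_lengths(probs)
avg = sum(p * l for p, l in zip(probs, lengths))
print(lengths, avg, (2 - avg) / 2)  # lengths [1, 3, 3, 2], avg 1.75, reduction 0.125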
3. Consider the (15, 8) cyclic code defined by the generator polynomial
g(X) = 1 + X + X^3 + X^7, for which h(X) = (X^15 + 1)/g(X) = 1 + X + X^2 + X^4 + X^8.
(a) Develop the encoder for this code.
(b) Obtain the generator matrix and the parity-check matrix.
(c) Construct a systematic code word for the message sequence 10110011.
(d) The received word is 110001000000001; determine the syndrome polynomial s(X) for this received word.
(15 points)
Solution:
(a) The encoder is a syndrome-type encoding circuit built around an (n - k) = 7-stage feedback shift register whose feedback connections correspond to the nonzero coefficients of g(X) = 1 + X + X^3 + X^7. (The circuit diagram from the original answer key is omitted here.)
(b) Generator matrix. The rows of the (non-systematic) generator matrix are the shifted polynomials X^i g(X), i = 0, 1, …, 7:

g(X)     = 1 + X + X^3 + X^7
X g(X)   = X + X^2 + X^4 + X^8
X^2 g(X) = X^2 + X^3 + X^5 + X^9
X^3 g(X) = X^3 + X^4 + X^6 + X^10
X^4 g(X) = X^4 + X^5 + X^7 + X^11
X^5 g(X) = X^5 + X^6 + X^8 + X^12
X^6 g(X) = X^6 + X^7 + X^9 + X^13
X^7 g(X) = X^7 + X^8 + X^10 + X^14

G' =
[1 1 0 1 0 0 0 1 0 0 0 0 0 0 0]
[0 1 1 0 1 0 0 0 1 0 0 0 0 0 0]
[0 0 1 1 0 1 0 0 0 1 0 0 0 0 0]
[0 0 0 1 1 0 1 0 0 0 1 0 0 0 0]
[0 0 0 0 1 1 0 1 0 0 0 1 0 0 0]
[0 0 0 0 0 1 1 0 1 0 0 0 1 0 0]
[0 0 0 0 0 0 1 1 0 1 0 0 0 1 0]
[0 0 0 0 0 0 0 1 1 0 1 0 0 0 1]

Adding suitable rows to obtain the systematic form G = [P | I_8] (parity bits in the first 7 positions, message bits in the last 8) gives

G =
[1 1 0 1 0 0 0 | 1 0 0 0 0 0 0 0]
[0 1 1 0 1 0 0 | 0 1 0 0 0 0 0 0]
[0 0 1 1 0 1 0 | 0 0 1 0 0 0 0 0]
[0 0 0 1 1 0 1 | 0 0 0 1 0 0 0 0]
[1 1 0 1 1 1 0 | 0 0 0 0 1 0 0 0]
[0 1 1 0 1 1 1 | 0 0 0 0 0 1 0 0]
[1 1 1 0 0 1 1 | 0 0 0 0 0 0 1 0]
[1 0 1 0 0 0 1 | 0 0 0 0 0 0 0 1]

Parity-check matrix. The rows of the (non-systematic) parity-check matrix are the shifts X^i · X^8 h(X^-1), i = 0, 1, …, 6, where

X^8 h(X^-1)  = 1 + X^4 + X^6 + X^7 + X^8
X^9 h(X^-1)  = X + X^5 + X^7 + X^8 + X^9
X^10 h(X^-1) = X^2 + X^6 + X^8 + X^9 + X^10
X^11 h(X^-1) = X^3 + X^7 + X^9 + X^10 + X^11
X^12 h(X^-1) = X^4 + X^8 + X^10 + X^11 + X^12
X^13 h(X^-1) = X^5 + X^9 + X^11 + X^12 + X^13
X^14 h(X^-1) = X^6 + X^10 + X^12 + X^13 + X^14

H' =
[1 0 0 0 1 0 1 1 1 0 0 0 0 0 0]
[0 1 0 0 0 1 0 1 1 1 0 0 0 0 0]
[0 0 1 0 0 0 1 0 1 1 1 0 0 0 0]
[0 0 0 1 0 0 0 1 0 1 1 1 0 0 0]
[0 0 0 0 1 0 0 0 1 0 1 1 1 0 0]
[0 0 0 0 0 1 0 0 0 1 0 1 1 1 0]
[0 0 0 0 0 0 1 0 0 0 1 0 1 1 1]

In systematic form, H = [I_7 | P^T]:

H =
[1 0 0 0 0 0 0 | 1 0 0 0 1 0 1 1]
[0 1 0 0 0 0 0 | 1 1 0 0 1 1 1 0]
[0 0 1 0 0 0 0 | 0 1 1 0 0 1 1 1]
[0 0 0 1 0 0 0 | 1 0 1 1 1 0 0 0]
[0 0 0 0 1 0 0 | 0 1 0 1 1 1 0 0]
[0 0 0 0 0 1 0 | 0 0 1 0 1 1 1 0]
[0 0 0 0 0 0 1 | 0 0 0 1 0 1 1 1]

(c) For the message sequence 10110011, the corresponding message polynomial is
m(X) = 1 + X^2 + X^3 + X^6 + X^7.
First, multiply by X^(n-k):
X^(n-k) m(X) = X^7 m(X) = X^7 + X^9 + X^10 + X^13 + X^14.
Second, divide X^(n-k) m(X) by g(X):
X^7 + X^9 + X^10 + X^13 + X^14 = (1 + X + X^6 + X^7) g(X) + (1 + X^2 + X^3 + X^4 + X^6).
The remainder is b(X) = 1 + X^2 + X^3 + X^4 + X^6. Hence, the desired code polynomial is
c(X) = b(X) + X^(n-k) m(X) = 1 + X^2 + X^3 + X^4 + X^6 + X^7 + X^9 + X^10 + X^13 + X^14,
and the systematic code word is 101110110110011 (parity bits 1011101 followed by the message bits 10110011).
(d) The polynomial corresponding to the received word 110001000000001 is
r(X) = 1 + X + X^5 + X^14.
Dividing r(X) by g(X) leaves the remainder X + X^2 + X^5 + X^6. Hence, the syndrome polynomial for this received word is
s(X) = X + X^2 + X^5 + X^6, i.e., the syndrome vector is 0110011.
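The parity bits of part (c) and the syndrome of part (d) can be reproduced with a short GF(2) polynomial-division sketch in Python. The bit-mask representation and the helper gf2_mod below are my own choices, not the exam's.

def gf2_mod(dividend, divisor):
    """Remainder of GF(2) polynomial division; polynomials are bit-masks (bit i = X^i)."""
    dlen = divisor.bit_length()
    while dividend.bit_length() >= dlen:
        dividend ^= divisor << (dividend.bit_length() - dlen)
    return dividend

g = 0b10001011                 # g(X) = 1 + X + X^3 + X^7

# (c) message 10110011 -> m(X) = 1 + X^2 + X^3 + X^6 + X^7
m = 0b11001101
b = gf2_mod(m << 7, g)          # remainder of X^7 m(X) divided by g(X)
print(format(b, "07b")[::-1])   # parity bits b0..b6 -> 1011101

# (d) received word 110001000000001 -> r(X) = 1 + X + X^5 + X^14
r = 0b100000000100011
s = gf2_mod(r, g)
print(format(s, "07b")[::-1])   # syndrome s0..s6 -> 0110011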
4. Consider the rate r = 1/3, constraint length K = 3 convolutional encoder. The generator sequences of the encoder are as follows: g^(1) = (1, 0, 0), g^(2) = (1, 0, 1), g^(3) = (1, 1, 1).
(a) Draw the block diagram of the encoder.
(b) Construct the code tree.
(c) Construct the signal-flow graph and obtain the input-output state equations.
(d) Determine the encoder output produced by the message sequence 10111….
(e) The received sequence is 110, 001, 101, 110, 000, 011. Use the Viterbi algorithm to compute the decoded sequence.
(15 points)
Solution:
(a)-(c) Encoder block diagram, code tree, and signal-flow graph over the states a, b, c, d: the diagrams from the original answer key are omitted here.
(d) The encoder output produced by the message sequence 10111 is 111, 001, 100, 110, 101, 010, 011, 000, ….
(e) The received sequence is 110, 001, 101, 110, 000, 011. The surviving (decoded) code sequence is 111, 001, 100, 110, 010, 011, and the decoded message sequence is 101100.
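As a cross-check of parts (d) and (e), the following Python sketch implements this rate-1/3, K = 3 encoder and a hard-decision Viterbi decoder. The implementation details are mine, not the exam's; it reproduces the encoder output 111, 001, 100, 110, 101, 010, 011, 000 for the zero-padded message 10111 and the decoded sequence 101100.

from itertools import product

G = [(1, 0, 0), (1, 0, 1), (1, 1, 1)]   # g(1), g(2), g(3); taps on (m_i, m_{i-1}, m_{i-2})

def encode(bits):
    """Rate-1/3, K=3 convolutional encoder; returns a list of 3-bit output tuples."""
    s1 = s2 = 0                          # shift-register contents m_{i-1}, m_{i-2}
    out = []
    for b in bits:
        window = (b, s1, s2)
        out.append(tuple(sum(g[j] * window[j] for j in range(3)) % 2 for g in G))
        s1, s2 = b, s1
    return out

def viterbi(received):
    """Hard-decision Viterbi decoding over the four states (m_{i-1}, m_{i-2})."""
    states = list(product((0, 1), repeat=2))
    metric = {s: (0 if s == (0, 0) else float("inf")) for s in states}
    paths = {s: [] for s in states}
    for r in received:
        new_metric, new_paths = {}, {}
        for s1, s2 in states:                 # previous state
            for b in (0, 1):                  # hypothesized input bit
                window = (b, s1, s2)
                o = tuple(sum(g[j] * window[j] for j in range(3)) % 2 for g in G)
                branch = sum(x != y for x, y in zip(o, r))   # Hamming distance
                nxt = (b, s1)
                cand = metric[(s1, s2)] + branch
                if nxt not in new_metric or cand < new_metric[nxt]:
                    new_metric[nxt] = cand
                    new_paths[nxt] = paths[(s1, s2)] + [b]
        metric, paths = new_metric, new_paths
    best = min(metric, key=metric.get)
    return paths[best]

msg = [1, 0, 1, 1, 1, 0, 0, 0]
print(encode(msg))        # -> 111, 001, 100, 110, 101, 010, 011, 000

rx = [(1, 1, 0), (0, 0, 1), (1, 0, 1), (1, 1, 0), (0, 0, 0), (0, 1, 1)]
print(viterbi(rx))        # -> [1, 0, 1, 1, 0, 0]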