DEPARTMENT OF INFORMATION TECHNOLOGY
QUESTION BANK

Subject Name: Information Coding Techniques
Year / Sem: II / IV

UNIT I - INFORMATION ENTROPY FUNDAMENTALS

PART – A (2 MARKS)

1. What is uncertainty?
2. What is prefix coding?
3. State the channel coding theorem for a discrete memoryless channel.
4. Define a discrete memoryless channel.
5. State the channel capacity theorem.
6. For the symbol probabilities 0.25, 0.20, 0.15, 0.15, 0.10, 0.05, encode the source using the Shannon-Fano technique and calculate the entropy.
7. Compare Huffman coding and Shannon-Fano coding.
8. What are the advantages of the Lempel-Ziv encoding algorithm over Huffman coding?
9. Find the entropy of a source emitting symbols X, Y, Z with probabilities 1/5, 1/2, 1/3 respectively.
10. State the source coding theorem.
11. Define capacity.
12. Define information.
13. Define entropy.
14. List the properties of information.
15. Define information rate.
16. Give the upper bound and lower bound for entropy.
17. Define a discrete memoryless source.
18. What is the entropy of an extended discrete memoryless source?
19. Calculate the primary source entropy and the entropy of the third extension of a binary source with probabilities P0 = 1/4, P1 = 3/4.
20. What are the two functional requirements needed in the development of an efficient source encoder?
21. What is Lmin? How is it determined?
22. Find the entropy of the second-order extension of a source whose alphabet is X = {x0, x1, x2} with the given probabilities.
23. What are uniquely decipherable codes?
24. What is data compaction?
25. State Lempel-Ziv coding.
26. If the probability of getting a head when tossing a coin is 1/2, find the information associated with it.

PART – B (16 MARKS)

1. (i) How will you calculate channel capacity? (2)
   (ii) Write the channel coding theorem and the channel capacity theorem. (5)
   (iii) Calculate the entropy for the given sample data AAABBBCCD. (3)
   (iv) Prove the Shannon information capacity theorem. (6)

2. (i) Use differential entropy to compare the randomness of random variables. (4)
   (ii) A four-symbol alphabet has the probabilities Pr(a0) = 1/2, Pr(a1) = 1/4, Pr(a2) = 1/8, Pr(a3) = 1/8 and an entropy of 1.75 bits. Find a codebook for this four-letter alphabet that satisfies the source coding theorem. (4)
   (iii) Write the entropy for a binary symmetric source. (4)
   (iv) Write down the channel capacity for a binary channel. (4)

3. (a) A discrete memoryless source has an alphabet of five symbols whose probabilities of occurrence are as described here:
       Symbol:      X1   X2   X3   X4   X5
       Probability: 0.2  0.2  0.1  0.1  0.4
   Compute the Huffman code for this source. Also calculate the efficiency of the source encoder. (8)
   (b) A voice-grade channel of the telephone network has a bandwidth of 3.4 kHz. Calculate (i) the information capacity of the telephone channel for a signal-to-noise ratio of 30 dB, and (ii) the minimum signal-to-noise ratio required to support information transmission through the telephone channel at the rate of 9.6 kb/s. (8)

4. A discrete memoryless source has an alphabet of seven symbols whose probabilities of occurrence are as described below:
       Symbol:      s0    s1    s2     s3      s4      s5     s6
       Probability: 0.25  0.25  0.125  0.0625  0.0625  0.125  0.125
   (i) Compute the Huffman code for this source, moving a combined symbol as high as possible. (10)
   (ii) Calculate the coding efficiency. (4)
   (iii) Why does the computed code have an efficiency of 100%? (2)
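For the Huffman coding questions above, the following is a minimal Python sketch (an illustration, not part of the original question bank) that builds a binary Huffman code with a heap and reports the source entropy, the average codeword length and the coding efficiency. The five-symbol source of question 3(a) is used as the test input; for that source it reproduces the standard answer (H ≈ 2.122 bits, L = 2.2, efficiency ≈ 96.5%).

    import heapq
    from math import log2

    def huffman_code(probs):
        # Each heap entry: (probability, tie-breaker, {symbol: partial codeword}).
        heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
        heapq.heapify(heap)
        count = len(heap)
        while len(heap) > 1:
            p0, _, c0 = heapq.heappop(heap)   # two least probable nodes
            p1, _, c1 = heapq.heappop(heap)
            merged = {s: "0" + w for s, w in c0.items()}
            merged.update({s: "1" + w for s, w in c1.items()})
            heapq.heappush(heap, (p0 + p1, count, merged))
            count += 1
        return heap[0][2]

    probs = {"x1": 0.2, "x2": 0.2, "x3": 0.1, "x4": 0.1, "x5": 0.4}   # question 3(a)
    code = huffman_code(probs)
    H = -sum(p * log2(p) for p in probs.values())         # source entropy, bits/symbol
    L = sum(probs[s] * len(w) for s, w in code.items())   # average codeword length
    print(code)
    print(f"H = {H:.3f} bits, L = {L:.2f}, efficiency = {100 * H / L:.1f}%")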
5. (i) Consider the following binary sequence: 111010011000101110100. Use the Lempel-Ziv algorithm to encode this sequence. Assume that the binary symbols 1 and 0 are already in the codebook. (12)
   (ii) What are the advantages of the Lempel-Ziv encoding algorithm over Huffman coding? (4)

6. A discrete memoryless source has an alphabet of five symbols with the probabilities for its output as given here:
       [X] = [x1 x2 x3 x4 x5]
       P[X] = [0.45 0.15 0.15 0.10 0.15]
   Compute two different Huffman codes for this source. For these two codes, find (i) the average codeword length and (ii) the variance of the average codeword length over the ensemble of source symbols. (16)

7. A discrete memoryless source X has five symbols x1, x2, x3, x4 and x5 with probabilities p(x1) = 0.4, p(x2) = 0.19, p(x3) = 0.16, p(x4) = 0.15 and p(x5) = 0.1.
   (i) Construct a Shannon-Fano code for X and calculate the efficiency of the code. (7)
   (ii) Repeat for the Huffman code and compare the results. (9)

8. Two sources S1 and S2 emit messages x1, x2, x3 and y1, y2, y3 with the joint probability P(X, Y) shown in matrix form:
       P(X, Y) = [ 3/40  1/40  1/40
                   1/20  3/20  1/20
                   3/8   1/8   1/8  ]
   Calculate the entropies H(X), H(Y), H(X/Y) and H(Y/X). (16)

9. Apply the Huffman coding procedure to the following message ensemble and determine the average length of the encoded message and the coding efficiency. Use the coding alphabet D = 4. There are 10 symbols.
       X = [x1, x2, x3, ..., x10]
       P[X] = [0.18, 0.17, 0.16, 0.15, 0.10, 0.08, 0.05, 0.05, 0.04, 0.02] (16)

UNIT II - DATA AND VOICE CODING

PART – A (2 MARKS)

1. Define pulse code modulation (PCM).
2. Give the basic operations performed in the transmitter and receiver of a PCM system.
3. List the basic elements of a PCM system.
4. Define sampling.
5. List the two different types of quantization.
6. List the different types of line codes.
7. Give the block diagram of a regenerative repeater.
8. List the three basic functions performed by a regenerative repeater.
9. A television signal with a bandwidth of 4.2 MHz is transmitted using binary PCM. The number of quantization levels is 512. Calculate the code word length.
10. Briefly explain slope overloading.
11. What are the advantages of coding speech at low bit rates?
12. Briefly explain sub-band coding for speech signals.
13. List the differences between delta modulation and adaptive delta modulation.
14. Draw the block diagram of a differential pulse code modulator.
15. Draw the block diagram of a DPCM signal encoder.
16. Give the various pulse modulation and pulse code modulation techniques available. How do they differ from each other?
17. Define delta modulation.
18. List the disadvantages of delta modulation.
19. Define quantization.
20. Define quantization error.

PART – B (16 MARKS)

1. (i) Compare and contrast DPCM and ADPCM. (6)
   (ii) Define pitch, period and loudness. (6)
   (iii) What is a decibel? (2)
   (iv) What is the purpose of the DFT? (2)

2. (i) Explain delta modulation with examples. (6)
   (ii) Explain sub-band adaptive differential pulse code modulation. (6)
   (iii) What will happen if speech is coded at low bit rates? (4)

3. With a block diagram, explain the DPCM system. Compare DPCM with PCM and DM systems. (16)

4. (i) Explain DM systems with a block diagram. (8)
   (ii) Consider a sine wave of frequency fm and amplitude Am applied to a delta modulator of step size ∆. Show that slope overload distortion will occur if Am > ∆ / (2π fm Ts), where Ts is the sampling period. What is the maximum power that may be transmitted without slope overload distortion? (8)

5. Explain adaptive quantization and prediction with backward estimation in an ADPCM system with a block diagram. (16)

6. (i) Explain delta modulation systems with block diagrams. (8)
   (ii) What are slope overload distortion and granular noise, and how are they overcome in adaptive delta modulation? (8)

7. What is modulation? Explain how the adaptive delta modulator works with different algorithms. Compare delta modulation with adaptive delta modulation. (16)

8. Explain pulse code modulation and differential pulse code modulation. (16)
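The slope overload condition of question 4(ii) can be checked numerically. Below is a small Python sketch (illustrative only; the signal and sampling parameters are arbitrary assumptions, not values from the question bank) of a linear delta modulator driven by a sine wave: the staircase approximation can move only one step ∆ per sample, so it stops tracking the input once the maximum input slope 2π·fm·Am exceeds ∆/Ts, i.e. once Am > ∆ / (2π fm Ts).

    import math

    def delta_modulate(x, step):
        # Linear delta modulator: 1-bit comparator plus accumulator (staircase).
        approx, bits, staircase = 0.0, [], []
        for sample in x:
            bit = 1 if sample >= approx else 0    # comparator decision
            approx += step if bit else -step      # staircase moves one step per sample
            bits.append(bit)
            staircase.append(approx)
        return bits, staircase

    fm, fs = 100.0, 8000.0              # signal and sampling frequencies in Hz (assumed)
    Ts, step, Am = 1 / fs, 0.05, 1.0    # sampling period, step size, amplitude (assumed)
    x = [Am * math.sin(2 * math.pi * fm * n * Ts) for n in range(200)]

    bits, y = delta_modulate(x, step)
    overload = Am > step / (2 * math.pi * fm * Ts)    # condition from question 4(ii)
    max_err = max(abs(a - b) for a, b in zip(x, y))
    print(f"{len(bits)} bits; slope overload expected: {overload}; max tracking error: {max_err:.3f}")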
UNIT III - ERROR CONTROL CODING

PART – A (2 MARKS)

1. What is a generator polynomial? Give some standard generator polynomials.
2. What is Hamming distance in error control coding?
3. Why are cyclic codes extremely well suited for error detection?
4. What is a syndrome?
5. Define dual code.
6. What is a Hamming code?
7. List the properties of the generator polynomial of cyclic codes.
8. Write the syndrome properties of linear block codes.
9. Give the steps in encoding (n, k) cyclic codes.
10. What are cyclic codes? Why are they called a subclass of block codes?
11. List the conditions satisfied by Hamming codes.
12. Define convolutional codes.
13. Define linear block codes.
14. Define the constraint length of a convolutional code.
15. What is the difference between systematic codes and non-systematic codes?
16. What is the use of syndromes? Explain syndrome decoding.
17. Draw the block diagram of a syndrome calculator.
18. Draw the block diagram of an encoder for the (7,4) Hamming code.
19. Define the minimum distance between code vectors.
20. Define Hamming weight.
21. Define code rate.
22. List the types of errors.
23. List the types of codes.
24. What is error control coding? Which functional blocks of a communication system accomplish this?
25. List the properties of syndromes.

PART – B (16 MARKS)

1. Consider a Hamming code C which is determined by the parity check matrix
       H = [ 1 1 0 1 1 0 0
             1 0 1 1 0 1 0
             0 1 1 1 0 0 1 ]
   (i) What is the length and the dimension k of the code? Why can the minimum Hamming distance dmin not be larger than three? (4)
   (ii) Calculate the syndromes for all possible error vectors e with Hamming weight

2. (i) Define linear block codes. (2)
   (ii) How do you find the parity check matrix? (4)
   (iii) Give the syndrome decoding algorithm. (4)
   (iv) Design a linear block code with dmin ≥ 3 for some block length n = 2^m - 1. (6)

3. (a) Consider the generation of a (7,4) cyclic code by the generator polynomial g(x) = 1 + x + x^3. Calculate the code word for the message sequence 1001 and construct the systematic generator matrix G. (8)
   (b) Draw the diagram of the encoder and the syndrome calculator generated by the polynomial g(x). (8)
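For question 1 above, a short Python sketch (illustrative, not part of the original bank) that computes the syndrome s = e·H^T over GF(2) for every error vector of Hamming weight 1, using the parity check matrix given in the question. Each single-bit error yields a distinct nonzero syndrome, namely the corresponding column of H, which is the basis of syndrome decoding.

    # Parity check matrix H of the Hamming code from question 1 (3 rows x 7 columns).
    H = [
        [1, 1, 0, 1, 1, 0, 0],
        [1, 0, 1, 1, 0, 1, 0],
        [0, 1, 1, 1, 0, 0, 1],
    ]

    def syndrome(e):
        # Syndrome s = e * H^T over GF(2) for an error vector e of length 7.
        return tuple(sum(e[j] * H[i][j] for j in range(7)) % 2 for i in range(3))

    # Every error vector of Hamming weight 1 maps to a distinct column of H.
    for pos in range(7):
        e = [0] * 7
        e[pos] = 1
        print(f"error in bit {pos}: syndrome = {syndrome(e)}")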

