UNIT V – INFORMATION THEORY
PART-A
1. Define information rate. NOV/DEC 2007
If the time rate at which source X emits symbols is r symbols per second, the information rate R of the source is R = r H(X) bits/second, where H(X) is the entropy of the source.
2. Define entropy. NOV/DEC 2006, 2010
Entropy is the measure of the average information content per source symbol. It is given by the expression H(X) = -∑ P(xi) log2 P(xi) bits/symbol.
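As an illustrative sketch (not part of the syllabus answer), the entropy formula above can be evaluated in Python; the probability list used here is an assumed example.

import math

def entropy(probs):
    # H(X) = -sum P(xi) * log2 P(xi); zero-probability terms contribute nothing
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.25, 0.25]))   # 1.5 bits/symbol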
3. What is a prefix code? NOV/DEC 2003
In a prefix code, no codeword is the prefix of any other codeword. It is a variable length code: the binary digits are assigned to the messages according to their probabilities of occurrence.
4. Define mutual information. NOV/DEC 2010
Mutual information I(X, Y) of a channel is defined by I(X, Y) = H(X) - H(X/Y) bits/symbol, where H(X) is the entropy of the source and H(X/Y) is the conditional entropy of X given Y.
5. State the Shannon-Hartley theorem. NOV/DEC 2010
The capacity C of an additive Gaussian noise channel is C = B log2(1 + S/N) bits/second, where B is the channel bandwidth and S/N is the signal-to-noise ratio.
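A small Python sketch of the Shannon-Hartley formula; the 3 kHz bandwidth and 30 dB SNR figures are assumed example values, not taken from the question.

import math

def channel_capacity(bandwidth_hz, snr_linear):
    # Shannon-Hartley: C = B * log2(1 + S/N), with S/N as a linear ratio (not dB)
    return bandwidth_hz * math.log2(1 + snr_linear)

# Example: 3 kHz channel with S/N = 1000 (30 dB)
print(channel_capacity(3000, 1000))   # about 29,900 bits/second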
6. Write the expression for code efficiency in terms of entropy. APRIL/MAY 2004
Code efficiency = H(X) / L, the ratio of the source entropy to the average codeword length. Redundancy = 1 - code efficiency; redundancy should be as low as possible.
7. How is the efficiency of the coding technique measured? NOV/DEC 2005
Efficiency of the code = H(X) / L, where L = ∑ p(xi) li is the average codeword length and li is the length of the i-th codeword.
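This ratio can be checked numerically with a short Python sketch; the probabilities and codeword lengths below are an assumed example in which the code happens to be 100 % efficient.

import math

def code_efficiency(probs, lengths):
    # Efficiency = H(X) / L, where L = sum p(xi) * li is the average codeword length
    h = -sum(p * math.log2(p) for p in probs if p > 0)
    avg_len = sum(p * l for p, l in zip(probs, lengths))
    return h / avg_len

print(code_efficiency([0.5, 0.25, 0.125, 0.125], [1, 2, 3, 3]))   # 1.0, i.e. 100 %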
8. Name the two source coding techniques. NOV/DEC 2004
1. Shannon-Fano coding
2. Huffman coding
(Both are prefix coding techniques.)
9. An event has six possible outcomes with probabilities 1/2, 1/4, 1/8, 1/16, 1/32, 1/32. Find the entropy of the system. APRIL/MAY 2005
H = ∑ Pk log2(1/Pk)
= (1/2) log2 2 + (1/4) log2 4 + (1/8) log2 8 + (1/16) log2 16 + (1/32) log2 32 + (1/32) log2 32
= 0.5 + 0.5 + 0.375 + 0.25 + 0.15625 + 0.15625
= 1.9375 bits/message.
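The arithmetic above can be verified with a short Python check (a sketch, not required in the answer):

import math

probs = [1/2, 1/4, 1/8, 1/16, 1/32, 1/32]
H = sum(p * math.log2(1 / p) for p in probs)   # sum of Pk * log2(1/Pk)
print(H)   # 1.9375 bits/message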
10. When is the average information delivered by a source of alphabet size 2 maximum? NOV/DEC 2004
Average information is maximum when the two messages are equally likely, i.e., p1 = p2 = 1/2. The maximum average information is then
Hmax = (1/2) log2 2 + (1/2) log2 2 = 1 bit/message.
11. Define bandwidth efficiency. NOV/DEC 2010
The ratio of the channel capacity to the bandwidth is called bandwidth efficiency: BW efficiency = C / B.
12. Write down the formula for mutual information. APRIL/MAY 2005
The mutual information is the amount of information transferred when Xi is transmitted and Yj is received. It is represented by I(Xi, Yj) and is given by
I(Xi, Yj) = log2 (P(Xi/Yj) / P(Xi)) bits.
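As an assumed numerical illustration of this formula: if the prior probability of Xi is 0.5 and receiving Yj raises it to 0.8, the information gained is

import math

p_xi = 0.5            # assumed prior probability P(Xi)
p_xi_given_yj = 0.8   # assumed posterior probability P(Xi/Yj) after Yj is received
print(math.log2(p_xi_given_yj / p_xi))   # about 0.678 bits of information about Xi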
13. Is the information of a continuous system non-negative? If so, why? NOV/DEC 2005
Yes, the mutual information of a continuous system is non-negative, because I(X; Y) >= 0 is one of its properties.
14. What is channel redundancy? NOV/DEC 2005
Redundancy = 1 - code efficiency.
15. Write the expression for code efficiency in terms of entropy. NOV/DEC 2005
Code efficiency = Entropy / Average codeword length = H(X) / L.
16. Define the significance of the entropy H(X/Y) of a communication system where X is the transmitter and Y is the receiver. MAY/JUNE 2006
· H(X/Y) is called the conditional entropy. It represents the uncertainty of X, on average, when Y is known.
· In other words, H(X/Y) is an average measure of the uncertainty in X after Y is received.
· H(X/Y) represents the information lost in the noisy channel.
17. How does Shannon-Fano coding differ from lossy source coding? MAY 2011
· Shannon-Fano coding is a lossless source coding technique.
· For each message symbol, Shannon-Fano coding allots a unique code according to its probability of occurrence.
· Because of the unique code, the message symbol is recovered without any error; thus there is no loss of information in Shannon-Fano coding.
18. How can the information capacity of a communication channel be increased? MAY 2011
· By increasing the bandwidth B of the channel
· By increasing the signal-to-noise ratio of the channel
· By maximizing the average mutual information
19. What are the types of channel? MAY 2011
1. Discrete memoryless channels
· Binary symmetric channel
· Erasure channel
· Binary communication channel
2. Continuous channels
· Gaussian channel
20. Differentiate lossy source coding from lossless source coding. MAY 2011
Lossy source coding: some information of the source is lost during encoding. PCM, DM, ADM and DPCM are lossy source coding techniques.
Lossless source coding: no information is lost during encoding. Huffman coding, instantaneous coding and Shannon-Fano coding are lossless source coding techniques.
21. State any four properties of mutual information.
1. I(X, Y) = I(Y, X)
2. I(X, Y) >= 0
3. I(X, Y) = H(Y) - H(Y/X)
4. I(X, Y) = H(X) + H(Y) - H(X, Y)
22. Give the expression for the channel capacity of a Gaussian channel.
The channel capacity of a Gaussian channel is C = B log2(1 + S/N).
23. Define the entropy of a discrete memoryless source.
For a binary memoryless source, the entropy is H(X) = -p0 log2 p0 - (1 - p0) log2(1 - p0), where p0 is the probability of transmitting symbol ‘0’ and p1 = 1 - p0 is the probability of transmitting symbol ‘1’.
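A small sketch of this binary entropy function; the two probabilities printed at the end are assumed examples.

import math

def binary_entropy(p0):
    # H(X) = -p0*log2(p0) - (1 - p0)*log2(1 - p0) for a binary memoryless source
    if p0 in (0.0, 1.0):
        return 0.0
    return -p0 * math.log2(p0) - (1 - p0) * math.log2(1 - p0)

print(binary_entropy(0.5))   # 1.0 bit (maximum, symbols equally likely)
print(binary_entropy(0.1))   # about 0.469 bits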
24. Define lossless channel.
A channel described by a channel matrix with only one nonzero element in each column is called a lossless channel. In a lossless channel, no source information is lost in transmission.
25. Define deterministic channel.
A channel described by a channel matrix with only one nonzero element in each row is called a deterministic channel; this element must be unity.
26. Prove that I(xi xj) = I(xi) + I(xj) if xi and xj are independent.
If xi and xj are independent, P(xi xj) = P(xi) P(xj). Then
I(xi xj) = log2 (1 / P(xi xj))
= log2 (1 / (P(xi) P(xj)))
= log2 (1 / P(xi)) + log2 (1 / P(xj))
= I(xi) + I(xj).
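The additivity proved above can be checked numerically for assumed probabilities P(xi) = 1/4 and P(xj) = 1/8:

import math

p_xi, p_xj = 1/4, 1/8                               # assumed independent probabilities
print(math.log2(1 / (p_xi * p_xj)))                 # I(xi xj) = 5.0 bits
print(math.log2(1 / p_xi) + math.log2(1 / p_xj))    # I(xi) + I(xj) = 5.0 bits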
27. Explain Shannon-Fano coding.
An efficient code can be obtained by the following simple procedure, known as the Shannon-Fano algorithm (a sketch of this partitioning appears below):
1. List the source symbols in order of decreasing probability.
2. Partition the set into two subsets that are as close to equiprobable as possible, and assign 0 to the upper set and 1 to the lower set.
3. Continue this process, each time partitioning the sets with as nearly equal probabilities as possible, until further partitioning is not possible.
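A minimal Python sketch of the partitioning procedure described above; the helper name shannon_fano and the four-symbol source are assumptions for illustration, not a standard library routine.

def shannon_fano(symbols):
    # symbols: list of (symbol, probability) pairs; returns a dict symbol -> codeword.
    symbols = sorted(symbols, key=lambda sp: sp[1], reverse=True)   # step 1: decreasing probability
    codes = {s: "" for s, _ in symbols}

    def split(group):
        if len(group) < 2:
            return
        total = sum(p for _, p in group)
        running, best_i, best_diff = 0.0, 1, float("inf")
        for i in range(1, len(group)):                 # step 2: nearly equiprobable partition
            running += group[i - 1][1]
            diff = abs(total - 2 * running)
            if diff < best_diff:
                best_diff, best_i = diff, i
        upper, lower = group[:best_i], group[best_i:]
        for s, _ in upper:
            codes[s] += "0"                            # assign 0 to the upper set
        for s, _ in lower:
            codes[s] += "1"                            # assign 1 to the lower set
        split(upper)                                   # step 3: repeat on each subset
        split(lower)

    split(symbols)
    return codes

print(shannon_fano([("A", 0.4), ("B", 0.3), ("C", 0.2), ("D", 0.1)]))
# {'A': '0', 'B': '10', 'C': '110', 'D': '111'}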
28. What is data compaction?
For efficient signal transmission, the redundant information must be removed from the signal prior to transmission. This operation, which causes no loss of information, is ordinarily performed on a signal in digital form and is referred to as data compaction or lossless data compression.
29. State the property of entropy.
0 ≤ H(X) ≤ log2 K, where K is the radix (number of symbols) of the alphabet X of the source.
30. What is source coding and entropy coding?
The conversion of the output of a DMS into a sequence of binary symbols is called source coding. The design of a variable length code such that its average codeword length approaches the entropy of the DMS is often referred to as entropy coding.
31. What is meant by the capacity of the channel?
It is defined as the ability of a channel to convey information, and it is related to the noise characteristics of the channel.
32. Define discrete messages.
The output emitted by a source during every unit of time, i.e., at unit time intervals, is known as a discrete message.
33. Define source coding.
Source encoding or source coding is the process used for the efficient representation of data generated by a source.
34. What is meant by a source encoder?
· The device which performs source coding or encoding is called a source encoder.
· An efficient source encoder can be designed by using the statistical properties of the source.
35. What is a variable length code?
In variable length coding:
· a short codeword is used to represent frequently occurring messages or symbols, and
· a longer codeword is used to represent rarely occurring symbols.
36. Define prefix code.
It is also called instantaneous coding: no codeword is a prefix of any other codeword. Such a code is uniquely decodable.
37. Define the information capacity theorem.
The information capacity is defined as the maximum of the mutual information between the channel input Xk and the channel output Yk over all distributions on the input Xk that satisfy the power constraint.
38. What is the goal of channel coding?
The goal is to increase the resistance of a digital communication system to channel noise.
39. What is channel efficiency?
The transmission efficiency or channel efficiency is defined as the ratio of the actual transmission to the maximum possible transmission.
40. What is meant by source coding with a fidelity criterion?
The information source may have a continuous amplitude, as in the case of speech, and the requirement is to quantize the amplitude of each sample generated by the source to permit its representation by a codeword of finite length, as in pulse code modulation. This problem is referred to as source coding with a fidelity criterion.
41. Define the rate distortion function.
The rate distortion function R(D) is defined as the smallest coding rate possible for which the average distortion is guaranteed not to exceed D.
42. Express the channel capacity for a noise-free channel and a symmetric channel.
Noise-free channel: C = log2 K bits/message, where K is the number of symbols.
Symmetric channel: C = log2 K - A bits/message, where A is the entropy of any one row of the channel matrix.
43. What are the drawbacks of the source coding theorem?
For a perfect representation of a discrete memoryless source, the source coding theorem requires that the average codeword length be at least as large as the entropy of the source; the codeword length cannot be reduced below this bound.
44. What happens when the number of coding alphabets increases?
When the number of coding alphabets increases, the efficiency of the coding technique decreases.
45. What are the channel matrix and the channel diagram?
The transition probability diagram of the channel is called the channel diagram, and its matrix representation is called the channel matrix.
46. Why is Huffman coding said to be optimum?
Huffman coding is said to be optimum because no other uniquely decodable set of codewords has a smaller average codeword length for a given discrete memoryless source.
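A minimal Huffman coding sketch using Python's heapq, shown only to illustrate how the codeword lengths come out smallest on average; the symbol probabilities are assumed example values.

import heapq

def huffman_codes(probs):
    # probs: dict symbol -> probability; returns dict symbol -> binary codeword.
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)                     # counter breaks ties between equal probabilities
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)   # merge the two least probable nodes
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1
    return heap[0][2]

print(huffman_codes({"A": 0.4, "B": 0.3, "C": 0.2, "D": 0.1}))
# codeword lengths 1, 2, 3, 3 -> average length 1.9, close to the source entropy of about 1.846 bits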
47. Define the bit of information.
The bit is the basic unit of information. It is defined as the quantity of information required to permit a correct selection of one out of a pair of equiprobable events.
48. Define Lempel-Ziv coding.
Encoding is done by parsing the source data stream into segments that are the shortest subsequences not encountered previously (see the sketch below).
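A minimal sketch of this parsing rule in Python; the function name lz_parse and the input string are assumptions for illustration (only the parsing step is shown, not the index/bit encoding of the phrases).

def lz_parse(data):
    # Split the stream into the shortest substrings not encountered previously.
    phrases, seen, current = [], set(), ""
    for ch in data:
        current += ch
        if current not in seen:            # shortest new subsequence found
            seen.add(current)
            phrases.append(current)
            current = ""
    if current:                            # leftover suffix at the end of the stream
        phrases.append(current)
    return phrases

print(lz_parse("ABAABABAABAB"))
# ['A', 'B', 'AA', 'BA', 'BAA', 'BAB']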
49. What are the drawbacks of Huffman code?
· It requires knowledge of the probabilistic model of the source, but knowing the source statistics in advance is not possible at all times.
· When modeling text, the storage requirements prevent the Huffman code from capturing the higher-order relationships between words.
50. Define information content.
Total information content = entropy + redundant information content.
51. Define rate bandwidth.
Let the system transmit at a rate Rb equal to the channel capacity, and let B be the bandwidth. Then the rate bandwidth is given by Rb / B.
52. Define lossy source coding.
· Some information of the source is lost during encoding.
· PCM, DM, ADM and DPCM are lossy source coding techniques.
53. Define lossless source coding.
· No information is lost during encoding.
· Huffman coding, instantaneous coding and Shannon-Fano coding are lossless source coding techniques.
PART – B
8 MARKS:
1. What is entropy? Explain the important properties of entropy. (NOV/DEC 2006)
2. Discuss source coding or the Shannon-Fano coding theorem. (MAY/JUNE 2007)
3. Discuss data compaction. (MAY/JUNE 2007)
4. List the properties of prefix codes and give an example of prefix codes. (MAY/JUNE 2008)
5. Write a note on the binary symmetric channel. (APRIL/MAY 2004)
6. Discuss the different conditional entropies. (MAY/JUNE 2008)
7. Explain three properties of mutual information. (APRIL/MAY 2005)
8. Derive the capacity of a Gaussian channel. (NOV/DEC 2005)
9. Explain the information capacity theorem. (APRIL/MAY 2005)
10. Write a note on rate distortion theory. (MAY/JUNE 2008)
11. Write a short note on Huffman coding. (APRIL/MAY 2011)
12. Write a short note on Lempel-Ziv, Huffman and Shannon-Fano coding.
13. Comparison between Huffman coding and Shannon-Fano coding.
16 MARKS:
1. Derive the expression for the channel capacity of a continuous channel. Also find the expression for the channel capacity of a continuous channel of infinite bandwidth. (MAY/JUNE 2006)
2. Define mutual information. Find the relation between the mutual information and the joint entropy of the channel input and channel output. (NOV/DEC 2006)
3. Discuss the various techniques used for compression of information. (MAY/JUNE 2009)
4. Problems: refer class notes.