information theory-ppt
TRANSCRIPT
-
8/20/2019 Information Theory-ppt
1/30
Information Theory
Prepared by:
Amit Degada, Teaching Assistant,
ECED, NIT Surat
-
Goal of Today’s Lecture
Information Theory: some introduction
Information Measure
Function determination for Information
Average Information per Symbol
Information Rate
Coding
Shannon-Fano Coding
-
Information Theory
It is the study of Communication Engineering plus Maths.
A Communication Engineer has to fight with:
Limited Power
Inevitable Background Noise
Limited Bandwidth
-
Information Theory deals with
The Measure of Source Information
The Information Capacity of the Channel
Coding
If the rate of information from a source does not exceed the capacity of the channel, then there exists a coding scheme such that information can be transmitted over the communication channel with an arbitrarily small amount of error, despite the presence of noise.
-
Information Measure
This is utilized to determine the information rate of discrete sources.
Consider two messages:
"A Dog Bites a Man" → High probability → Less information
"A Man Bites a Dog" → Low probability → High information
So we can say that
Information ∝ 1/(Probability of Occurrence)
-
Information Measure
Also, we can state three rules from intuition.
Rule 1: The information I(mk) approaches 0 as Pk approaches unity.
Mathematically, I(mk) → 0 as Pk → 1
e.g. "Sun rises in the East."
-
Information Measure
Rule 2: The information content I(mk) must be a non-negative quantity; it may be zero.
Mathematically, I(mk) ≥ 0 for 0 ≤ Pk ≤ 1
e.g. "Sun rises in the West."
-
Information Measure
Rule 3: The information content of a message having higher probability is less than the information content of a message having lower probability.
Mathematically, I(mk) < I(mj) if Pk > Pj
-
Information Measure
Also, for two messages we can state that the information content of the combined messages is the same as the sum of the information content of each message, provided their occurrences are mutually independent.
e.g. "There will be sunny weather today."
"There will be cloudy weather tomorrow."
Mathematically,
I(mk and mj) = I(mk, mj) = I(mk) + I(mj)
-
Information Measure
So the question is: which function can we use to measure information?
Information = F(1/Probability)
Requirements the function must satisfy:
1. Its output must be a non-negative quantity.
2. Its minimum value is 0.
3. It should turn products into summations.
Information: I(mk) = log_b(1/Pk)
Here b may be 2, e or 10.
If b = 2 then the unit is bits
b = e then the unit is nats
b = 10 then the unit is decits (Hartleys)
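As a quick check that the logarithmic measure meets the three rules, here is a minimal sketch (plain Python; the function name and the probability values are illustrative, not from the slides):

```python
import math

def information(p, base=2):
    """Information content I = log_b(1/p) of a symbol with probability p."""
    if not 0 < p <= 1:
        raise ValueError("probability must be in (0, 1]")
    return math.log(1 / p, base)

# Rule 1: a certain event carries no information.
print(information(1.0))           # 0.0 bits ("Sun rises in the East")
# Rarer events carry more information (Rule 3).
print(information(0.5))           # 1.0 bit
print(information(0.5, math.e))   # ~0.693 nats
print(information(0.5, 10))       # ~0.301 decits
```

The log also turns products into sums (requirement 3): for independent messages, information(p1 * p2) equals information(p1) + information(p2).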
-
Conversion Between Units
log2(v) = ln(v) / ln(2) = log10(v) / log10(2)
-
Example
A source generates one of four symbols during each interval with probabilities P1 = 1/2, P2 = 1/4, P3 = P4 = 1/8. Find the information content of each message.
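A small sketch of this example in Python (base-2 logarithms assumed, so the answers come out in bits):

```python
import math

# Probabilities from the example above
probabilities = {"m1": 1/2, "m2": 1/4, "m3": 1/8, "m4": 1/8}

for name, p in probabilities.items():
    print(f"I({name}) = {math.log2(1 / p):.0f} bits")
# I(m1) = 1, I(m2) = 2, I(m3) = I(m4) = 3 bits
```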
-
Average Information Content
It is necessary to define the information content per symbol, as a communication channel deals with symbols.
Here we make the following assumptions:
1. The source is stationary, so the probabilities remain constant with time.
2. Successive symbols are statistically independent and come out at an average rate of r symbols per second.
-
Average Information Content
Suppose a source emits M possible symbols s1, s2, ..., sM having probabilities of occurrence p1, p2, ..., pM, where
Σ_{i=1}^{M} pi = 1
For a long message having N symbols (N >> 1):
s1 will occur p1·N times, likewise
s2 will occur p2·N times, and so on.
-
Average Information Content
Since s1 occurs p1·N times, the information contributed by s1 is p1·N·log(1/p1).
Similarly, the information contributed by s2 is p2·N·log(1/p2), and so on.
Hence the total information content is
I_total = Σ_{i=1}^{M} N·pi·log2(1/pi)
and the average information is obtained by dividing by N:
H = I_total / N = Σ_{i=1}^{M} pi·log2(1/pi)   bits/symbol
It means that in a long message we can expect H bits of information per symbol. Another name for H is entropy.
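The average-information formula can be sketched directly (plain Python; the function name is my own, and the four-symbol source from the earlier example is reused):

```python
import math

def entropy(probs):
    """H = sum of pi * log2(1/pi), in bits/symbol. Terms with pi == 0 contribute 0."""
    assert abs(sum(probs) - 1) < 1e-9, "probabilities must sum to 1"
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# Earlier example: H = (1/2)*1 + (1/4)*2 + 2*(1/8)*3 = 1.75 bits/symbol
print(entropy([1/2, 1/4, 1/8, 1/8]))  # 1.75
```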
-
Information Rate
Information rate = Total information / time taken
Here, n symbols are transmitted at r symbols per second, so the time taken is
Tb = n/r
and the total information in n symbols is n·H.
Information rate:
R = n·H / (n/r) = r·H   bits/sec
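A one-line consequence of R = r·H, sketched with assumed numbers (the symbol rate of 2000 symbols/sec and the probabilities are illustrative only):

```python
import math

probs = [1/2, 1/4, 1/8, 1/8]                 # example source from earlier
H = sum(p * math.log2(1 / p) for p in probs)  # 1.75 bits/symbol
r = 2000                                      # assumed: 2000 symbols per second
R = r * H                                     # information rate in bits/sec
print(R)  # 3500.0
```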
-
Some Maths
H satisfies the following inequality:
0 ≤ H ≤ log2(M)
The maximum occurs when all the messages have equal probability.
Hence H also shows the uncertainty about which symbol will occur: as H approaches its maximum value, we cannot determine which message will occur.
Consider a system transmitting only 2 messages with equal probability of occurrence 0.5; then H = 1, and at every instant we cannot say which of the two messages will occur.
So what would happen for a source with more than two symbols?
-
Variation of H v/s p
Let's consider a binary source, meaning M = 2.
Let the two symbols occur with probabilities p and 1 − p respectively, where 0 < p < 1.
So the entropy is
H = p·log2(1/p) + (1 − p)·log2(1/(1 − p)) = Ω(p)
Ω(p) is called the Horseshoe Function.
-
Variation of H v/s p (continued)
Now we want to obtain the shape of the curve. Setting the first derivative to zero:
dH/dp = dΩ(p)/dp = log2((1 − p)/p) = 0
which gives p = 1/2. The second derivative
d²H/dp² = −1 / (p·(1 − p)·ln 2) < 0
so p = 1/2 is a maximum. Verify it by double differentiation.
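The calculus above can also be checked numerically (a sketch: a coarse grid search rather than a symbolic proof):

```python
import math

def omega(p):
    """Binary entropy Omega(p) = p*log2(1/p) + (1-p)*log2(1/(1-p))."""
    return p * math.log2(1 / p) + (1 - p) * math.log2(1 / (1 - p))

# Scan p over (0, 1): the maximum sits at p = 0.5, where Omega = 1 bit.
grid = [i / 1000 for i in range(1, 1000)]
best = max(grid, key=omega)
print(best, omega(best))  # 0.5 1.0
```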
-
Example
-
Maximum Information Rate
We know that
R = r·H
Also
H_max = log2(M)
Hence
R_max = r·log2(M)
-
Coding for Discrete Memoryless Source
Here Discrete means the source emits a fixed set of distinct symbols.
Memoryless means the occurrence of the present symbol is independent of the previous symbols.
Average code length:
N̄ = Σ_{i=1}^{M} pi·Ni
where Ni = code length in binary digits (binits).
-
Coding for Discrete Memoryless Source
Efficiency:
η = R / r_b = H / N̄ ≤ 1
where r_b is the transmitted rate in binits per second.
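A sketch combining average code length and efficiency (Python; assumes the probabilities 1/2, 1/4, 1/8, 1/8 and a length-1, 2, 3, 3 code, as in the example that follows):

```python
import math

def avg_code_length(probs, lengths):
    """N_bar = sum of pi * Ni, in binits per symbol."""
    return sum(p * n for p, n in zip(probs, lengths))

def efficiency(probs, lengths):
    """eta = H / N_bar, always <= 1 for a decodable code."""
    H = sum(p * math.log2(1 / p) for p in probs if p > 0)
    return H / avg_code_length(probs, lengths)

# Lengths match -log2(pi) exactly, so the efficiency is 100%.
print(efficiency([1/2, 1/4, 1/8, 1/8], [1, 2, 3, 3]))  # 1.0
```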
-
Coding for Discrete Memoryless Source
Kraft's inequality:
K = Σ_{i=1}^{M} 2^(−Ni) ≤ 1
Only if this is satisfied can the coding be uniquely decipherable or separable.
-
Example
Find the efficiency and Kraft's inequality.

mi   pi    Code I   Code II   Code III   Code IV
A    1/2   00       0         0          0
B    1/4   01       1         01         10
C    1/8   10       10        011        110
D    1/8   11       11        0111       111

Code II is not uniquely decipherable.
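A quick Kraft check on the four codes of this example (a sketch; the codeword lists are transcribed from the table):

```python
codes = {
    "I":   ["00", "01", "10", "11"],
    "II":  ["0", "1", "10", "11"],
    "III": ["0", "01", "011", "0111"],
    "IV":  ["0", "10", "110", "111"],
}

# K = sum over codewords of 2^(-length)
kraft = {name: sum(2 ** -len(w) for w in words) for name, words in codes.items()}

for name, K in kraft.items():
    verdict = "<= 1" if K <= 1 else "> 1 (fails Kraft)"
    print(f"Code {name}: K = {K}  {verdict}")
# Code II gives K = 1.5 > 1, consistent with it not being uniquely decipherable.
```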
-
Shannon-Fano Coding Technique
Algorithm:
Step 1: Arrange all messages in descending order of probability.
Step 2: Divide the sequence into two groups in such a way that the sum of the probabilities in each group is (nearly) the same.
Step 3: Assign 0 to the upper group and 1 to the lower group.
Step 4: Repeat Steps 2 and 3 within each group, and so on.
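The four steps above can be sketched as a recursive routine (plain Python; illustrative only — the split point is chosen where the two halves' probability sums are closest):

```python
def shannon_fano(symbols):
    """symbols: list of (name, probability) pairs. Returns {name: codeword}."""
    symbols = sorted(symbols, key=lambda s: s[1], reverse=True)  # Step 1
    codes = {name: "" for name, _ in symbols}

    def split(group):
        if len(group) <= 1:
            return
        total = sum(p for _, p in group)
        # Step 2: find the split where the two groups' sums are closest
        run, best_i, best_gap = 0.0, 1, float("inf")
        for i, (_, p) in enumerate(group[:-1], start=1):
            run += p
            gap = abs(total - 2 * run)
            if gap < best_gap:
                best_i, best_gap = i, gap
        upper, lower = group[:best_i], group[best_i:]
        for name, _ in upper:   # Step 3: 0 to the upper group
            codes[name] += "0"
        for name, _ in lower:   # ... and 1 to the lower group
            codes[name] += "1"
        split(upper)            # Step 4: repeat within each group
        split(lower)

    split(symbols)
    return codes

print(shannon_fano([("A", 1/2), ("B", 1/4), ("C", 1/8), ("D", 1/8)]))
# {'A': '0', 'B': '10', 'C': '110', 'D': '111'}
```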
-
Example

Message   Pi      No. of bits   Code
m1        1/2     1             0
m2        1/8     3             100
m3        1/8     3             101
m4        1/16    4             1100
m5        1/16    4             1101
m6        1/16    4             1110
m7        1/32    5             11110
m8        1/32    5             11111

Coding procedure: at each split, 0 is assigned to the upper group and 1 to the lower group; the columns of intermediate 0/1 assignments are omitted here.
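A consistency sketch for this example (Python; probabilities and codewords as reconstructed from the slide): since the probabilities are dyadic, each Ni equals log2(1/Pi), so N̄ equals H, the Kraft sum is exactly 1, and the efficiency is 100%.

```python
import math

probs = [1/2, 1/8, 1/8, 1/16, 1/16, 1/16, 1/32, 1/32]
codes = ["0", "100", "101", "1100", "1101", "1110", "11110", "11111"]

H     = sum(p * math.log2(1 / p) for p in probs)      # entropy, bits/symbol
N_bar = sum(p * len(c) for p, c in zip(probs, codes))  # average code length
kraft = sum(2 ** -len(c) for c in codes)               # Kraft sum
print(H, N_bar, kraft)  # H == N_bar and kraft == 1.0 -> efficiency 100%
```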
-
This can be downloaded from
www.amitdegada.weebly.com/download
After :4 Today
-
Questions
-
Thank You