1
Digital Communication Systems ECS 452
Asst. Prof. Dr. Prapun Suksompong (ผศ.ดร.ประพันธ์ สุขสมปอง)
prapun@siit.tu.ac.th
1. Intro to Digital Communication Systems
Office Hours: BKD, 6th floor of Sirindhralai building
Monday 10:00-10:40, Tuesday 12:00-12:40, Thursday 14:20-15:30
2
“The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point.”
Shannon, Claude, “A Mathematical Theory of Communication” (1948)
Shannon: Father of the Info. Age
3 [http://www.uctv.tv/shows/Claude-Shannon-Father-of-the-Information-Age-6090][http://www.youtube.com/watch?v=z2Whj_nL-x8]
Documentary Co-produced by the
Jacobs School, UCSD-TV, and the California Institute for Telecommunications and Information Technology
Won a Gold award in the Biography category in the 2002 Aurora Awards.
C. E. Shannon (1916-2001)
4
1938 MIT master's thesis: A Symbolic Analysis of Relay and Switching Circuits
Insight: The binary nature of Boolean logic is analogous to the ones and zeros used by digital circuits.
The thesis became the foundation of practical digital circuit design.
The first known use of the term bit to refer to a “binary digit.”
Possibly the most important, and also the most famous, master’s thesis of the century.
It was simple, elegant, and important.
C. E. Shannon: Master's Thesis
5
Boole/Shannon Celebration
6
Events in 2015 and 2016 centered around the work of George Boole, who was born 200 years ago, and Claude E. Shannon, born 100 years ago.
Events were scheduled both at University College Cork (UCC), Ireland, and the Massachusetts Institute of Technology (MIT).
http://www.rle.mit.edu/booleshannon/
An Interesting Book
7
The Logician and the Engineer: How George Boole and Claude Shannon Created the Information Age
by Paul J. Nahin
ISBN: 9780691151007
http://press.princeton.edu/titles/9819.html
C. E. Shannon (Con’t)
8
1948: “A Mathematical Theory of Communication,” Bell System Technical Journal, vol. 27, pp. 379-423, July-October 1948.
September 1949: Book published. It included a new section by Warren Weaver that applied Shannon's theory to human communication.
Created the architecture and concepts governing digital communication.
Invented information theory: simultaneously founded the subject, introduced all of the major concepts, and stated and proved all the fundamental theorems.
A Mathematical Theory of Communication
9
Link posted in the “references” section of the website.
[An offprint from the Bell System Technical Journal]
C. E. Shannon
10 …with some remarks by Toby Berger.
Claude E. Shannon Award
11
Claude E. Shannon (1972)
David S. Slepian (1974)
Robert M. Fano (1976)
Peter Elias (1977)
Mark S. Pinsker (1978)
Jacob Wolfowitz (1979)
W. Wesley Peterson (1981)
Irving S. Reed (1982)
Robert G. Gallager (1983)
Solomon W. Golomb (1985)
William L. Root (1986)
James L. Massey (1988)
Thomas M. Cover (1990)
Andrew J. Viterbi (1991)
Elwyn R. Berlekamp (1993)
Aaron D. Wyner (1994)
G. David Forney, Jr. (1995)
Imre Csiszár (1996)
Jacob Ziv (1997)
Neil J. A. Sloane (1998)
Tadao Kasami (1999)
Thomas Kailath (2000)
Jack Keil Wolf (2001)
Toby Berger (2002)
Lloyd R. Welch (2003)
Robert J. McEliece (2004)
Richard Blahut (2005)
Rudolf Ahlswede (2006)
Sergio Verdu (2007)
Robert M. Gray (2008)
Jorma Rissanen (2009)
Te Sun Han (2010)
Shlomo Shamai (Shitz) (2011)
Abbas El Gamal (2012)
Katalin Marton (2013)
János Körner (2014)
Arthur Robert Calderbank (2015)
Alexander S. Holevo (2016)
David Tse (2017)
[ http://www.itsoc.org/honors/claude-e-shannon-award ]
IEEE Richard W. Hamming Medal
12
1988 - Richard W. Hamming
1989 - Irving S. Reed
1990 - Dennis M. Ritchie and Kenneth L. Thompson
1991 - Elwyn R. Berlekamp
1992 - Lotfi A. Zadeh
1993 - Jorma J. Rissanen
1994 - Gottfried Ungerboeck
1995 - Jacob Ziv
1996 - Mark S. Pinsker
1997 - Thomas M. Cover
1998 - David D. Clark
1999 - David A. Huffman
2000 - Solomon W. Golomb
2001 - A. G. Fraser
2002 - Peter Elias
2003 - Claude Berrou and Alain Glavieux
2004 - Jack K. Wolf
2005 - Neil J. A. Sloane
2006 - Vladimir I. Levenshtein
2007 - Abraham Lempel
2008 - Sergio Verdú
2009 - Peter Franaszek
2010 - Whitfield Diffie, Martin Hellman, and Ralph Merkle
2011 - Toby Berger
2012 - Michael Luby, Amin Shokrollahi
2013 - Arthur Robert Calderbank
2014 - Thomas Richardson and Rüdiger L. Urbanke
2015 - Imre Csiszar
2016 - Abbas El Gamal
2017 - Shlomo Shamai
2018 - Erdal Arikan
“For contributions to Information Theory, including source coding and its applications.”
[http://www.ieee.org/documents/hamming_rl.pdf]
[http://www.cvaieee.org/html/toby_berger.html]
Information Theory
13
The science of information theory tackles the following questions [Berger]
1. What is information, i.e., how do we measure it quantitatively?
2. What factors limit the reliability with which information generated at one point can be reproduced at another, and what are the resulting limits?
3. How should communication systems be designed in order to achieve or at least to approach these limits?
Elements of communication sys.
14
[Block diagram: Information Source → Transmitter → Channel → Receiver → Destination. The message becomes the transmitted signal; the received signal is corrupted by noise, interference, distortion, and transmission loss (attenuation). The transmitter performs coding and modulation; the receiver performs amplification, filtering, demodulation, and decoding. Messages and signals may be analog (continuous) or digital (discrete).]
(ECS 332)
[Shannon, 1948]
The Switch to Digital TV
15
Japan: Starting July 24, 2011, analog broadcasting ceased and only digital broadcasting is available.
US: Since June 12, 2009, full-power television stations nationwide have been broadcasting exclusively in a digital format.
Thailand: Uses DVB-T2. Launched in 2014.
[https://upload.wikimedia.org/wikipedia/commons/thumb/b/bd/Digital_broadcast_standards.svg/800px-Digital_broadcast_standards.svg.png]
News: The Switch to Digital Radio
16
Norway (a mountainous nation of 5 million) is the first country to shut down its national FM radio network in favor of digital radio.
The switchover started on January 11, 2017, at which point 99.5% of the population had access to DAB reception, almost three million receivers had been sold, and 70% of Norwegian households regularly tuned in digitally.
It took place over a 12-month period, with changes conducted region by region.
December 13, 2017: all national networks became DAB-only. Local broadcasters have five years to phase out their FM stations.
New format: Digital Audio Broadcasting (DAB)
http://gizmodo.com/norway-is-killing-fm-radio-tomorrow-1791019824
http://www.worlddab.org/country-information/norway
http://www.smithsonianmag.com/smart-news/norway-killed-radio-star-180961761/
http://www.latimes.com/world/la-fg-norway-radio-20170114-story.html
https://www.newscientist.com/article/2117569-norway-is-first-country-to-turn-off-fm-radio-and-go-digital-only/
http://fortune.com/2017/12/18/norway-fm-radio-digital-audio-broadcasting/
Digital Audio Broadcasting
17
Initiated as a European research project in the 1980s.
The Norwegian Broadcasting Corporation (NRK) launched the first DAB channel in the world on 1 June 1995 (NRK Klassisk)
The BBC and Swedish Radio (SR) launched their first DAB digital radio broadcasts in September 1995.
Audio quality varies depending on the bitrate used.
The Switch to DAB in Norway
18
DAB has co-existed with FM since 1995.
It provides a clearer and more reliable network that can better cut through the country's sparsely populated rocky terrain. FM has always been problematic in Norway, since the nation's mountains and fjords make getting clear FM signals difficult.
It offers more channels at a fraction of the cost, allowing roughly 8 times as many radio stations: Norway currently has five national FM radio stations; with DAB, it will be able to have around 40.
The FM radio infrastructure was coming to the end of its life, so Norway needed to either replace it or fully commit to DAB anyway.
DAB can run at lower power levels, so the infrastructure electricity bills are lower.
The Switch to Digital Radio
19
Switzerland and Denmark are also interested in phasing out FM.
Great Britain says it will look at making the switch once 50 percent of listeners use digital formats (currently at 35 percent) and when the DAB signal reaches 90 percent of the population. This is unlikely to happen before 2020.
Germany had set a 2015 date for dumping FM many years ago, but lawmakers reversed that decision in 2011.
In North America, FM radio, which has been active since the 1940s, shows no sign of being replaced any time soon, either in the United States or Canada. There are around 4,000 stations using HD Radio technology in the United States, and HD Radio receivers are now common fixtures in new cars.
In Thailand, the NBTC planned to start a digital radio trial within 2018.
http://thaidigitalradio.com/ความคบืหน้าล่าสดุ-วิทยุ/
20
HD Radio was selected by the U.S. FCC in 2002 as a digital audio broadcasting method for the United States.
It embeds a digital signal “on-frequency” immediately above and below a station's standard analog signal.
It provides the means to listen to the same program in either HD (digital radio with less noise) or as a standard broadcast (analog radio with standard sound quality).
Spectrum of FM broadcast station
[Figure: spectrum of an FM broadcast station, without HD Radio and with HD Radio]
[https://en.wikipedia.org/wiki/HD_Radio]
Countries using DAB/DMB
21 https://en.wikipedia.org/wiki/Digital_audio_broadcasting
2017 hurricane season in the US
22
http://edition.cnn.com/2017/10/10/weather/hurricane-nate-maria-irma-harvey-impact-look-back-trnd/index.html
https://www.vox.com/energy-and-environment/2017/9/28/16362522/hurricane-maria-2017-irma-harvey-rain-flooding-climate-change
Radio broadcasts are critical during a disaster
23
In areas hit hardest by things like hurricanes, earthquakes, fires, or even shootings:
Vulnerabilities of mobile phone infrastructure: cell phone infrastructure is often knocked out, or overwhelmed by everyone trying to access information. Three weeks after Hurricane Maria pummeled Puerto Rico, more than 76 percent of cell sites still weren't functioning.
Radio broadcast signals, which use low frequencies and can travel much longer distances and penetrate through obstacles, usually remain up.
FM capability in modern cellphone
24
FM capability is baked into the Qualcomm LTE modem inside nearly every cellphone. You can easily turn your phone into an FM radio if it has an embedded chipset and the proper circuitry to connect that chip to an FM antenna. You need an app like NextRadio, and something to act as an antenna, such as headphones or non-wireless speakers.
Until a few years ago, device manufacturers disabled the function: wireless carriers wanted customers to stream music and podcasts, and consume more data.
Broadcasters and public safety officials have long urged handset manufacturers and wireless carriers to universally activate the FM chip. The ITU (International Telecommunication Union) issued an opinion in March 2017 urging all mobile phone makers to include and turn on FM radios on their devices.
Major US carriers now allow FM chips to be turned on. Manufacturers like Samsung, LG, HTC and Motorola have activated FM radio on their phones.
September 28, 2017: The FCC blasted Apple for not activating FM receivers built into iPhones. Apple responded that the iPhone 7, iPhone 8, and iPhone X don't use a chipset with an embedded FM radio.
https://www.cnet.com/news/everything-you-need-to-know-about-fm-radio-on-your-phone/
https://spectrum.ieee.org/tech-talk/consumer-electronics/gadgets/fcc-wants-apple-to-turn-on-iphone-fm-receivers-that-may-not-exist
https://www.wired.com/2016/07/phones-fm-chips-radio-smartphone/
Pokémon Communications
25
Pikachu's language
26
Some of Pikachu's speech is consistent enough that it seems that some phrases actually mean something.
Pikachu always uses "Pikapi" when referring to Ash (notice that it sounds somewhat similar to "Satoshi").
Pi-Kachu: He says this during the sponsor spots in the original Japanese; also used for Pochama (Piplup)
Pikachu-Pi: Kasumi (Misty)
Pika-Chu: Takeshi (Brock), Kibago (Axew)
Pikaka: Hikari (Dawn)
PiPiPi: Togepy (Togepi)
PikakaPika: Fushigidane (Bulbasaur)
PikaPika: Zenigame (Squirtle), Mukuhawk (Staraptor), Goukazaru (Infernape) or Gamagaru (Palpitoad)
PiPi-kachu: Rocket-dan (Team Rocket)
Pi-Pikachu: Get da ze! (He says this after Ash wins a Badge, catches a new Pokémon or anything similar.)
Pikachu may not be the only one to use this phrase, as other Pokémon do this as well. For example, when Iris caught Emolga, Axew said Ax-Axew (Ki-Kibago in the Japanese).
Pika-Pikachu: He says this when referring to himself.
Four-symbol variable-length code?
[https://www.youtube.com/watch?v=XumQrRkGXck]
Rate-Distortion Theory
27
The theory of lossy source coding
1
Digital Communication Systems ECS 452
2. Source Coding
Elements of digital commu. sys.
2
Noise & Interference
[Block diagram: Information Source → Source Encoder (remove redundancy) → Channel Encoder (add systematic redundancy) → Digital Modulator → Channel → Digital Demodulator → Channel Decoder → Source Decoder → Destination. The transmitter comprises the source encoder, channel encoder, and digital modulator; the receiver comprises the digital demodulator, channel decoder, and source decoder. The message becomes the transmitted signal; the received signal yields the recovered message.]
System Under Consideration
3
Noise & Interference
[Block diagram: Information Source → Source Encoder (remove redundancy) → Channel Encoder (add systematic redundancy) → Digital Modulator → Channel → Digital Demodulator → Channel Decoder → Source Decoder → Destination, with transmitter and receiver grouped as on the previous slide.]
Main Reference
4
Elements of Information Theory
by Thomas M. Cover and Joy A. Thomas
2006, 2nd Edition
Chapters 2, 4 and 5
‘the jewel in Stanford's crown’
One of the greatest information theorists since Claude Shannon (and the one most like Shannon in approach, clarity, and taste).
English Alphabet (Non-Technical Use)
5
The ASCII Coded Character Set
6 [The ARRL Handbook for Radio Communications 2013]
[Table: ASCII character codes (American Standard Code for Information Interchange), 128 characters arranged in columns starting at 0, 16, 32, 48, 64, 80, 96, 112]
Example: ASCII Encoder
7
Codebook:
Character x : Codeword c(x)
E : 1000101
L : 1001100
O : 1001111
V : 1010110

Information Source → Source Encoder:
“LOVE” → “1001100100111110101101000101”

MATLAB:
>> M = 'LOVE';
>> X = dec2bin(M,7);
>> X = reshape(X',1,numel(X))
X =
1001100100111110101101000101

Remark: numel(A) = prod(size(A)) (the number of elements in matrix A)
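The same encoding can be sketched in Python instead of the MATLAB `dec2bin`/`reshape` steps above (the helper name `ascii_encode` is mine, not course code): each character becomes a fixed-width 7-bit codeword and the codewords are concatenated.

```python
# Encode each character of a message as a fixed-width binary codeword
# and concatenate the codewords, mirroring dec2bin + reshape in MATLAB.
def ascii_encode(message, width=7):
    return "".join(format(ord(ch), "0{}b".format(width)) for ch in message)

encoded = ascii_encode("LOVE")
# c("L") c("O") c("V") c("E") = 1001100 1001111 1010110 1000101
```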
English Redundancy: Ex. 1
8
J-st tr- t- r--d th-s s-nt-nc-.
English Redundancy: Ex. 2
9
yxx cxn xndxrstxndwhxt x xm wrxtxngxvxn xf x rxplxcx xllthx vxwxls wxth xn 'x' (t gts lttl hrdr f y dn'tvn kn whr th vwls r).
English Redundancy: Ex. 3
10
To be, or xxx xx xx, xxxx xx xxx xxxxxxxx
Entropy Rate of Thai Text
11
Introduction to Data Compression
12 [ https://www.khanacademy.org/computing/computer-science/informationtheory/moderninfotheory/v/compressioncodes ]
Introduction to Data Compression
13 [ https://www.khanacademy.org/computing/computer-science/informationtheory/moderninfotheory/v/compressioncodes ]
ASCII: Source Alphabet of Size = 128
14 [The ARRL Handbook for Radio Communications 2013]
[Table: ASCII character codes (American Standard Code for Information Interchange), 128 characters arranged in columns starting at 0, 16, 32, 48, 64, 80, 96, 112]
Ex. Source alphabet of size = 4
15
Ex. DMS (1)
16
Source alphabet: X ∈ {a, b, c, d, e}

pmf: p_X(x) = 1/5 for x ∈ {a, b, c, d, e}; 0 otherwise.

Sample output from the information source (100 symbols, shown as a 10×10 block):
a c a c e c d b c e
d a e e d a b b b d
b b a a b e b e d c
c e d b c e c a a c
a a e a c c a a d c
d e e a a c a a a b
b c a e b b e d b c
d e b c a e e d d c
d a b c a b c d d e
d c e a b a a c a d

Approximately 20% are letter ‘a’s. [GenRV_Discrete_datasample_Ex1.m]
Ex. DMS (1)
17 [GenRV_Discrete_datasample_Ex1.m]
clear all; close all;

S_X = 'abcde'; p_X = [1/5 1/5 1/5 1/5 1/5];

n = 100;
MessageSequence = datasample(S_X,n,'Weights',p_X)
MessageSequence = reshape(MessageSequence,10,10)

>> GenRV_Discrete_datasample_Ex1
MessageSequence =
eebbedddeceacdbcbedeecacaecedcaedabecccabbcccebdbbbeccbadeaaaecceccdaccedadabceddaceadacdaededcdcade
MessageSequence =
eeeabbacdeeacebeeeadbcadcccdcebdcacccaedebabcbedacdceeeacadddbccbdcbacdeecdedccaeddcbaaeddcecabacdae
Ex. DMS (2)
18
Source alphabet: X ∈ {1, 2, 3, 4}

pmf: p_X(x) = 1/2 for x = 1; 1/4 for x = 2; 1/8 for x = 3, 4; 0 otherwise.

Approximately 50% are number ‘1’s.

Sample output from the information source (100 symbols, shown as a 10×10 block):
2 1 1 2 1 4 1 1 1 1
1 1 4 1 1 2 4 2 2 1
3 1 1 2 3 2 4 1 2 4
2 1 1 2 1 1 3 3 1 1
1 3 4 1 4 1 1 2 4 1
4 1 4 1 2 2 1 4 2 1
4 1 1 1 1 2 1 4 2 4
2 1 1 1 2 1 2 1 3 2
2 1 1 1 1 1 1 2 3 2
2 1 1 2 1 4 2 1 2 1

[GenRV_Discrete_datasample_Ex2.m]
Ex. DMS (2)
19 [GenRV_Discrete_datasample_Ex2.m]
clear all; close all;

S_X = [1 2 3 4]; p_X = [1/2 1/4 1/8 1/8];

n = 20;

MessageSequence = randsrc(1,n,[S_X;p_X]);
%MessageSequence = datasample(S_X,n,'Weights',p_X);

rf = hist(MessageSequence,S_X)/n;  % Rel. freq. calc.
stem(S_X,rf,'rx','LineWidth',2)    % Plot rel. freq.
hold on
stem(S_X,p_X,'bo','LineWidth',2)   % Plot pmf
xlim([min(S_X)-1,max(S_X)+1])
legend('Rel. freq. from sim.','pmf p_X(x)')
xlabel('x')
grid on
[Plot: relative frequency from simulation (×) vs. pmf p_X(x) (○) for x = 1, 2, 3, 4]
20
DMS in MATLAB

clear all; close all;

S_X = [1 2 3 4]; p_X = [1/2 1/4 1/8 1/8]; n = 1e6;

SourceString = randsrc(1,n,[S_X;p_X]);

rf = hist(SourceString,S_X)/n;  % Rel. freq. calc.
stem(S_X,rf,'rx','LineWidth',2) % Plot rel. freq.
hold on
stem(S_X,p_X,'bo','LineWidth',2) % Plot pmf
xlim([min(S_X)-1,max(S_X)+1])
legend('Rel. freq. from sim.','pmf p_X(x)')
xlabel('x')
grid on

Alternatively, we can also use
SourceString = datasample(S_X,n,'Weights',p_X);

[GenRV_Discrete_datasample_Ex.m]
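A Python analogue of the `randsrc`/`datasample` demo above (variable names mirror the MATLAB code; the use of `random.choices` is my substitution): draw n i.i.d. symbols from the pmf and compare the relative frequencies with the pmf.

```python
import random
from collections import Counter

# DMS sampling: n i.i.d. draws from p_X = [1/2, 1/4, 1/8, 1/8] on {1,2,3,4}.
S_X = [1, 2, 3, 4]
p_X = [1/2, 1/4, 1/8, 1/8]

random.seed(452)                        # fixed seed for a reproducible demo
n = 10**5
SourceString = random.choices(S_X, weights=p_X, k=n)

counts = Counter(SourceString)
rf = {x: counts[x] / n for x in S_X}    # relative frequencies
# each rf[x] should be close to the corresponding p_X entry
```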
A more realistic example of pmf:
21 [http://en.wikipedia.org/wiki/Letter_frequency]
Relative freq. of letters in the English language
A more realistic example of pmf:
22
Relative freq. of letters in the English languageordered by frequency
[http://en.wikipedia.org/wiki/Letter_frequency]
Example: ASCII Encoder
23
Codebook:
Character x : Codeword c(x)
E : 1000101
L : 1001100
O : 1001111
V : 1010110

Information Source → Source Encoder:
“LOVE” → “1001100100111110101101000101”
i.e., c(“L”) c(“O”) c(“V”) c(“E”)

MATLAB:
>> M = 'LOVE';
>> X = dec2bin(M,7);
>> X = reshape(X',1,numel(X))
X =
1001100100111110101101000101

Remark: numel(A) = prod(size(A)) (the number of elements in matrix A)
The ASCII Coded Character Set
24[The ARRL Handbook for Radio Communications 2013]
0 16 32 48 64 80 96 112
A Byte (8 bits) vs. 7 bits
25
>> dec2bin('I Love ECS452',7)
ans =
1001001010000010011001101111111011011001010100000100010110000111010011011010001101010110010

>> dec2bin('I Love ECS452',8)
ans =
01001001001000000100110001101111011101100110010100100000010001010100001101010011001101000011010100110010

>> dec2bin('I Love You',8)
ans =
01001001001000000100110001101111011101100110010100100000010110010110111101110101
Geeky ways to express your love
26
>> dec2bin('i love you',8)
ans =
01101001001000000110110001101111011101100110010100100000011110010110111101110101

https://www.etsy.com/listing/91473057/binary-i-love-you-printable-for-your?ref=sr_gallery_9&ga_search_query=binary&ga_filters=holidays+-supplies+valentine&ga_search_type=all&ga_view_type=gallery
http://mentalfloss.com/article/29979/14-geeky-valentines-day-cards
https://www.etsy.com/listing/174002615/binary-love-geeky-romantic-pdf-cross?ref=sr_gallery_26&ga_search_query=binary&ga_filters=holidays+-supplies+valentine&ga_search_type=all&ga_view_type=gallery
https://www.etsy.com/listing/185919057/i-love-you-binary-925-silver-dog-tag-can?ref=sc_3&plkey=cdf3741cf5c63291bbc127f1fa7fb03e641daafd%3A185919057&ga_search_query=binary&ga_filters=holidays+-supplies+valentine&ga_search_type=all&ga_view_type=gallery
http://www.cafepress.com/+binary-code+long_sleeve_tees
Summary: Source Encoder
27
Information Source → Source Encoder:
source string “LOVE” → encoded string “1001100100111110101101000101” = c(“L”) c(“O”) c(“V”) c(“E”)  (w/o extension)

• An encoder c(⋅) is a function that maps each of the symbols in the source alphabet into a corresponding (binary) codeword.
• The list of such mappings is called the codebook:

Source symbol x : Codeword c(x)
E : 1000101
L : 1001100
O : 1001111
V : 1010110

Discrete Memoryless Source (DMS)
• The source alphabet is the collection of all possible source symbols.
• Each symbol that the source generates is assumed to be randomly selected from the source alphabet.
• The codeword corresponding to a source symbol x is denoted by c(x).
• ℓ(x) = the length of the codeword c(x).
• Each codeword is constructed from a code alphabet; for binary codewords, the code alphabet is {0, 1}.
Morse code
28
Telegraph network
Samuel Morse, 1838
A sequence of on-off tones (or lights, or clicks)
(wired and wireless)
Example
29 [http://www.wolframalpha.com/input/?i=%22I+love+you.%22+in+Morse+code]
Example
30
Morse code: Key Idea
31
Frequently-used characters are mapped to short codewords.
Relative frequencies of letters in the English language
Morse code: Key Idea
32
Frequently-used characters (e,t) are mapped to short codewords.
Relative frequencies of letters in the English language
Morse code: Key Idea
33
Frequently-used characters (e,t) are mapped to short codewords.
Basic form of compression.
Thai Morse code (รหัสมอร์สภาษาไทย)
34
Example: ASCII Encoder
35
Codebook:
Character : Codeword
E : 1000101
L : 1001100
O : 1001111
V : 1010110

Information Source → Source Encoder:
“LOVE” → “1001100100111110101101000101”

MATLAB:
>> M = 'LOVE';
>> X = dec2bin(M,7);
>> X = reshape(X',1,numel(X))
X =
1001100100111110101101000101
Another Example of non-UD code
36
Suppose we want to convey the sequence of outcomes from rolling a die.
x c(x)
1 1
2 10
3 11
4 100
5 101
6 110
A sequence of throws such as 53214 is encoded as 10111101100
Another Example of non-UD code
37
Suppose we want to convey the sequence of outcomes from rolling a die.
x c(x)
1 1
2 10
3 11
4 100
5 101
6 110
The encoded string 11 could be interpreted as “1 1” (1, 1) or as “3” (11).
The encoded string 110 could be interpreted as “1 2” (1, 10) or as “6” (110).
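The ambiguity above can be made explicit with a brute-force parser (my helper, not course code) that enumerates every way to split an encoded string into codewords of the dice code.

```python
# The dice-outcome code is not uniquely decodable: some encoded strings
# have more than one valid parse into codewords.
code = {'1': '1', '2': '10', '3': '11', '4': '100', '5': '101', '6': '110'}

def parses(s):
    """Return every sequence of source symbols whose encoding is s."""
    if s == "":
        return [[]]
    out = []
    for sym, cw in code.items():
        if s.startswith(cw):
            out.extend([sym] + rest for rest in parses(s[len(cw):]))
    return out

amb1 = parses("11")    # [['1', '1'], ['3']]
amb2 = parses("110")   # [['1', '2'], ['6']]
```

Even the example string 10111101100 (the encoding of 53214) admits more than one parse, e.g. 101 11 101 100 as well.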
Another Example of non-UD code
38
x c(x)
A 1
B 011
C 01110
D 1110
E 10011
Another Example of non-UD code
39
x c(x)
A 1
B 011
C 01110
D 1110
E 10011
Consider the encoded string 011101110011. It can be interpreted as CDB (01110 1110 011) or as BABE (011 1 011 10011).
[ https://en.wikipedia.org/wiki/Sardinas%E2%80%93Patterson_algorithm ]
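The linked Sardinas–Patterson algorithm decides unique decodability mechanically. A compact sketch (function name mine): a code is uniquely decodable iff no "dangling suffix" generated from the codewords is itself a codeword.

```python
# Sardinas–Patterson test for unique decodability.
def is_uniquely_decodable(codewords):
    C = set(codewords)

    def dangling(a, b):
        # suffixes w such that x + w = y for some x in a, y in b
        return {y[len(x):] for x in a for y in b if y != x and y.startswith(x)}

    suffixes = dangling(C, C)     # initial dangling suffixes
    seen = set()
    while suffixes:
        if suffixes & C:          # a dangling suffix is a codeword: not UD
            return False
        seen |= suffixes
        suffixes = (dangling(C, suffixes) | dangling(suffixes, C)) - seen
    return True

# The code {A:1, B:011, C:01110, D:1110, E:10011} above is not UD:
not_ud = is_uniquely_decodable(['1', '011', '01110', '1110', '10011'])
```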
Game: 20 Questions
40
20 Questions is a classic game that has been played since the 19th century.
One person thinks of something (an object, a person, an animal, etc.)
The others playing can ask 20 questions in an effort to guess what it is.
20 Questions: Example
41
Shannon–Fano coding
42
Proposed in Shannon’s “A Mathematical Theory of Communication” in 1948
The method was attributed to Fano, who later published it as a technical report: Fano, R.M. (1949), “The Transmission of Information,” Technical Report No. 65, Research Laboratory of Electronics at MIT, Cambridge, Mass., USA.
It should not be confused with Shannon coding, the coding method used to prove Shannon's noiseless coding theorem, or with Shannon–Fano–Elias coding (also known as Elias coding), the precursor to arithmetic coding.
Prof. Robert Fano (1917-2016), Shannon Award (1976)
Huffman Code
43
MIT, 1951: information theory class taught by Professor Fano. Huffman and his classmates were given the choice of a term paper on the problem of finding the most efficient binary code, or a final exam.
Huffman, unable to prove any codes were the most efficient, was about to give up and start studying for the final when he hit upon the idea of using a frequency-sorted binary tree, and quickly proved this method the most efficient.
Huffman avoided the major flaw of the suboptimal Shannon–Fano coding by building the tree from the bottom up instead of from the top down.
David Huffman (1925–1999)Hamming Medal (1999)
Huffman’s paper (1952)
44[D. A. Huffman, "A Method for the Construction of Minimum-Redundancy Codes," in Proceedings of the IRE, vol. 40, no. 9, pp. 1098-1101, Sept. 1952.][ http://ieeexplore.ieee.org/document/4051119/ ]
Summary:
45
[2.16-17] A good code must be uniquely decodable (UD). This is difficult to check directly.

[2.24] Consider a special family of codes: prefix(-free) codes. They are always UD, and being prefix-free is the same as being instantaneous.

[Venn diagram: All codes ⊃ Nonsingular codes ⊃ UD codes ⊃ Prefix-free codes ⊃ Huffman codes]

[Defn 2.30] Huffman's recipe: repeatedly combine the two least-likely (combined) symbols. This automatically gives a prefix-free code.

[2.37] For a given source's pmf, Huffman codes are optimal among all UD codes for that source. [Defn 2.36]

[Defn 2.18] Prefix-free: no codeword is a prefix of any other codeword. Equivalently, each source symbol can be decoded as soon as we come to the end of the codeword corresponding to it.
Huffman coding
46 [ https://www.khanacademy.org/computing/computer-science/informationtheory/moderninfotheory/v/compressioncodes ]
Ex. Huffman Coding in MATLAB
47 [Huffman_Demo_Ex1]
Observe that MATLAB automatically gives the expected length of the codewords.

pX = [0.5 0.25 0.125 0.125]; % pmf of X
SX = [1:length(pX)]; % Source alphabet
[dict,EL] = huffmandict(SX,pX); % Create codebook

%% Pretty print the codebook.
codebook = dict;
for i = 1:length(codebook)
    codebook{i,2} = num2str(codebook{i,2});
end
codebook

%% Try to encode some random source string
n = 10; % Number of source symbols to be generated
sourceString = randsrc(1,n,[SX; pX]) % Create data using pX
encodedString = huffmanenco(sourceString,dict) % Encode the data

[Ex. 2.31]
Ex. Huffman Coding in MATLAB
48
codebook =
    [1]    '0'
    [2]    '1 0'
    [3]    '1 1 1'
    [4]    '1 1 0'

sourceString =
    1 4 4 1 3 1 1 4 3 4

encodedString =
    0 1 1 0 1 1 0 0 1 1 1 0 0 1 1 0 1 1 1 1 1 0
[Huffman_Demo_Ex1]
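The `huffmandict` demo above can be sketched without MATLAB. Here is a minimal heap-based Huffman construction in Python (function name mine); it follows Huffman's recipe of repeatedly merging the two least likely subtrees, and reproduces the expected length 1.75 of Ex. 2.31.

```python
import heapq
from itertools import count

# Bottom-up Huffman code construction for a pmf given as {symbol: prob}.
def huffman(pmf):
    tiebreak = count()  # keeps heap entries comparable when probabilities tie
    heap = [(prob, next(tiebreak), {sym: ""}) for sym, prob in pmf.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)   # two least likely subtrees
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}   # prepend a bit
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, next(tiebreak), merged))
    return heap[0][2]

p_X = {1: 0.5, 2: 0.25, 3: 0.125, 4: 0.125}
code = huffman(p_X)
EL = sum(p_X[s] * len(code[s]) for s in p_X)   # expected length = 1.75
```

As the slides note for Ex. 2.32, the individual codewords may differ between runs or implementations (ties can be broken either way), but the expected length is the same.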
Ex. Huffman Coding in MATLAB
49 [Huffman_Demo_Ex2]
pX = [0.4 0.3 0.1 0.1 0.06 0.04]; % pmf of X
SX = [1:length(pX)]; % Source alphabet
[dict,EL] = huffmandict(SX,pX); % Create codebook

%% Pretty print the codebook.
codebook = dict;
for i = 1:length(codebook)
    codebook{i,2} = num2str(codebook{i,2});
end
codebook
EL

[Ex. 2.32]

The codewords can be different from our answers found earlier. The expected length is the same.

>> Huffman_Demo_Ex2
codebook =
    [1]    '1'
    [2]    '0 1'
    [3]    '0 0 0 0'
    [4]    '0 0 1'
    [5]    '0 0 0 1 0'
    [6]    '0 0 0 1 1'

EL =
    2.2000
Ex. Huffman Coding in MATLAB
50
pX = [1/8, 5/24, 7/24, 3/8]; % pmf of X
SX = [1:length(pX)]; % Source alphabet
[dict,EL] = huffmandict(SX,pX); % Create codebook

%% Pretty print the codebook.
codebook = dict;
for i = 1:length(codebook)
    codebook{i,2} = num2str(codebook{i,2});
end
codebook
EL

[Exercise]

codebook =
    [1]    '0 0 1'
    [2]    '0 0 0'
    [3]    '0 1'
    [4]    '1'

EL =
    1.9583

>> -pX*(log2(pX)).'
ans =
    1.8956
Entropy and Description of RV
57 [ https://www.khanacademy.org/computing/computer-science/informationtheory/moderninfotheory/v/information-entropy ]
Entropy and Description of RV
58 [ https://www.khanacademy.org/computing/computer-science/informationtheory/moderninfotheory/v/information-entropy ]
Summary: Optimality of Huffman Codes
59
Consider a given DMS with known pmf. [Defn 2.36] A code is optimal if it is UD and its corresponding expected length is the shortest among all possible UD codes for that source.

[2.37] Huffman codes are optimal.

[Venn diagram: All codes ⊃ Nonsingular codes ⊃ UD codes ⊃ Prefix-free codes ⊃ Huffman codes]

The expected length (per source symbol) of an optimal code equals the expected length (per source symbol) of a Huffman code.

[2.49-2.54] Bounds on expected lengths: H(X) ≤ L ≤ H(X) + 1.
Summary: Entropy
60
Entropy measures the amount of uncertainty (randomness) in a RV.

Three formulas for calculating entropy:
[Defn 2.41] Given the pmf p_X of a RV X, H(X) ≡ −∑_x p_X(x) log₂ p_X(x).
[2.44] Given a probability vector p, H(p) ≡ −∑_i p_i log₂ p_i.
[Defn 2.47] Given a number p ∈ [0, 1], the binary entropy function is h(p) ≡ −p log₂ p − (1−p) log₂(1−p).
Convention: set 0 log 0 = 0.

[2.56] Operational meaning: the entropy of a random variable is the average length of its shortest description.
Examples
61
Example 2.31: H(X) = E[ℓ(X)] = 1.75 (Huffman). Efficiency = 100%.
Example 2.32: H(X) ≈ 2.14, E[ℓ(X)] = 2.2 (Huffman). Efficiency ≈ 97%.
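The entropy and efficiency figures quoted for Examples 2.31 and 2.32 can be checked directly from [Defn 2.41]. A quick sketch in Python (the course demos use MATLAB; the helper name `H` is mine):

```python
from math import log2

# Entropy of a pmf, with the convention 0 * log 0 = 0.
def H(p):
    return -sum(pi * log2(pi) for pi in p if pi > 0)

H1 = H([0.5, 0.25, 0.125, 0.125])          # Example 2.31: 1.75 bits
H2 = H([0.4, 0.3, 0.1, 0.1, 0.06, 0.04])   # Example 2.32: about 2.14 bits
eff2 = H2 / 2.2                            # Huffman expected length is 2.2
```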
Examples
62
Example 2.33: H(X) ≈ 2.29, E[ℓ(X)] = 2.3 (Huffman). Efficiency ≈ 99%.
Example 2.34 (ABCD): H(X) ≈ 1.86, E[ℓ(X)] = 2 (Huffman). Efficiency ≈ 93%.
Summary: Entropy
63
Important bounds: 0 ≤ H(X) ≤ log₂|S_X|, with the minimum attained by a deterministic RV and the maximum by a uniform RV.
The entropy of a uniform (discrete) random variable on n outcomes: H(X) = log₂ n.
The entropy of a Bernoulli random variable with parameter p: H(X) = h(p), the binary entropy function.
Huffman Coding: Source Extension
66
[Plot: expected codeword length per source symbol, L̄_n, vs. the order of extension n = 1, …, 8, for an i.i.d. Bernoulli(p = 0.1) source X_k; L̄_1 = 1, L̄_2 = 0.645, L̄_3 ≈ 0.533, decreasing with n]

[Ex. 2.40]
Huffman Coding: Source Extension
67
[Plot: L̄_n vs. the order of source extension n = 1, …, 8 for an i.i.d. Bernoulli(p = 0.1) source X_k, together with the bounds H(X) ≤ L̄_n < H(X) + 1/n]

[Ex. 2.40]
Summary: Source Extension
68
The encoder operates on blocks rather than on individual symbols.

[Defn 2.39] n-th extension coding: 1 block = n successive source symbols.

L̄_n = expected (average) codeword length per source symbol when Huffman coding is used with the n-th extension.

L̄_n → H(X) as n → ∞.

[Plot: L̄_n vs. the order of source extension n = 1, …, 8, with bounds H(X) ≤ L̄_n < H(X) + 1/n]
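The values plotted for Ex. 2.40 can be reproduced in code. The sketch below (function name mine) computes the expected Huffman length per source symbol for the n-th extension of an i.i.d. Bernoulli(p = 0.1) source; it tracks only (probability, accumulated expected length), using the fact that each merge in Huffman's recipe adds one bit to every codeword beneath it.

```python
import heapq
from math import log2

# Expected Huffman codeword length (in bits) for a pmf given as a list.
def huffman_expected_length(pmf):
    heap = [(p, 0.0) for p in pmf]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, l1 = heapq.heappop(heap)   # two least likely subtrees
        p2, l2 = heapq.heappop(heap)
        # merging adds one expected bit weighted by the merged probability
        heapq.heappush(heap, (p1 + p2, l1 + l2 + p1 + p2))
    return heap[0][1]

p = 0.1
H = -(p * log2(p) + (1 - p) * log2(1 - p))   # entropy, about 0.469 bit/symbol

L = {}
for n in (1, 2, 3):
    pmf = [1.0]
    for _ in range(n):                        # pmf of n successive symbols
        pmf = [q * b for q in pmf for b in (1 - p, p)]
    L[n] = huffman_expected_length(pmf) / n   # per source symbol
```

This gives L̄_1 = 1, L̄_2 = 0.645, and L̄_3 ≈ 0.533, approaching H(X) from above as the slide's plot shows.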
1
Digital Communication Systems ECS 452
3. Discrete Memoryless Channel (DMC)
2
3.1 DMC Models
Summary: DMC
11
Channel input X ∈ {x₁, …, x_m} (channel input alphabet); channel output Y ∈ {y₁, …, y_n} (channel output alphabet).

Transition probabilities Q(y|x) = P[Y = y | X = x], collected into the m×n matrix Q with entries Q_ij = Q(y_j | x_i).

Input pmf as a row vector p = [p_X(x₁), …, p_X(x_m)]; output pmf as a row vector q = [q_Y(y₁), …, q_Y(y_n)], with q = p Q.

Joint pmf (“AND”): p_{X,Y}(x_i, y_j) = p_X(x_i) Q(y_j | x_i). Put together, these are all the values at corresponding positions in the m×n matrix P = diag(p) Q.
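The matrix bookkeeping above can be written out directly (plain Python lists here; the numbers are the binary-input, ternary-output channel used in the later examples): scaling each row of Q by the prior gives the joint matrix P = diag(p) Q, and summing its columns gives the output pmf q = p Q.

```python
# DMC bookkeeping: joint pmf matrix and output pmf from prior p and Q.
Q = [[0.5, 0.2, 0.3],    # rows: x in {0, 1}; columns: y in {1, 2, 3}
     [0.3, 0.4, 0.3]]
p = [0.2, 0.8]           # input (prior) pmf

m, n = len(Q), len(Q[0])
P = [[p[i] * Q[i][j] for j in range(n)] for i in range(m)]   # P = diag(p) Q
q = [sum(P[i][j] for i in range(m)) for j in range(n)]       # q = p Q
```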
Review: Evaluation of Probability from the Joint PMF Matrix
12
Consider two random variables X and Y. Suppose their joint pmf matrix P_{X,Y} is (rows indexed by y ∈ {1, 3, 4, 6}, columns by x ∈ {2, 3, 4, 5, 6}):

0.1 0.1 0   0   0
0.1 0   0   0.1 0
0   0.1 0.2 0   0
0   0   0   0   0.3

Find P[X + Y < 7].

Step 1: Find the pairs (x, y) that satisfy the condition “x + y < 7”. One way to do this is to first construct the matrix of x + y (same indexing):

3 4 5  6  7
5 6 7  8  9
6 7 8  9  10
8 9 10 11 12
Review: Evaluation of Probability from the Joint PMF Matrix
13
Consider the same two random variables X and Y with the joint pmf matrix P_{X,Y} above. Find P[X + Y < 7].

Step 2: Add the corresponding probabilities from the joint pmf matrix:

P[X + Y < 7] = 0.1 + 0.1 + 0.1 = 0.3
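The two steps of the review above translate directly into code (axis labels and layout as in the slide):

```python
# P[X + Y < 7] from the joint pmf matrix:
# rows indexed by y in {1, 3, 4, 6}, columns by x in {2, 3, 4, 5, 6}.
xs = [2, 3, 4, 5, 6]
ys = [1, 3, 4, 6]
P_XY = [[0.1, 0.1, 0.0, 0.0, 0.0],
        [0.1, 0.0, 0.0, 0.1, 0.0],
        [0.0, 0.1, 0.2, 0.0, 0.0],
        [0.0, 0.0, 0.0, 0.0, 0.3]]

# Step 1 (find qualifying pairs) and Step 2 (add their probabilities):
prob = sum(P_XY[i][j]
           for i, y in enumerate(ys)
           for j, x in enumerate(xs)
           if x + y < 7)          # 0.1 + 0.1 + 0.1
```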
Review: Evaluation of Probability from the Joint PMF Matrix
14
Consider two random variables X and Y. Suppose their joint pmf matrix P_{X,Y} is (rows indexed by y ∈ {1, 3, 4, 6}, columns by x ∈ {2, 3, 4, 5, 6}):

0.1 0.1 0   0   0
0.1 0   0   0.1 0
0   0.1 0.2 0   0
0   0   0   0   0.3

Find [the requested probability; the condition was shown on the slide as an image].
Review: Sum of two discrete RVs
15
Consider two random variables X and Y. Suppose their joint pmf matrix P_{X,Y} is (rows indexed by y ∈ {1, 3, 4, 6}, columns by x ∈ {2, 3, 4, 5, 6}):

0.1 0.1 0   0   0
0.1 0   0   0.1 0
0   0.1 0.2 0   0
0   0   0   0   0.3

Find [the requested probability for the sum X + Y], using the matrix of x + y values:

3 4 5  6  7
5 6 7  8  9
6 7 8  9  10
8 9 10 11 12
17
3.2 Decoder and
Recipe for finding P(error) of any decoder

25

Use the P = P_{X,Y} matrix. If unavailable, it can be found by scaling each row of the Q matrix by its corresponding p_X(x).
Write the x̂ values on top of the y values for the P matrix.
For each column y of the P matrix, circle the element whose corresponding x value is the same as x̂(y).
P(correct) = the sum of the circled probabilities; P(error) = 1 − P(correct).

Example: X ∈ {0, 1} with prior p = [0.2, 0.8]; Y ∈ {1, 2, 3};

Q = [0.5 0.2 0.3; 0.3 0.4 0.3],  P = [0.10 0.04 0.06; 0.24 0.32 0.24].

For the decoder x̂(y) = (1, 1, 0):
P(correct) = 0.24 + 0.32 + 0.06 = 0.62, so P(error) = 0.38.
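The recipe in code (names mine): scale each row of Q by p(x) to get P, pick the "circled" entry P[x̂(y)][y] in every column, sum, and subtract from 1. Decoder and numbers follow the example above.

```python
# P(error) of a given decoder for the example channel.
Q = [[0.5, 0.2, 0.3],    # rows: x in {0, 1}; columns: y in {1, 2, 3}
     [0.3, 0.4, 0.3]]
p = [0.2, 0.8]
P = [[p[x] * Q[x][y] for y in range(3)] for x in range(2)]  # joint pmf

x_hat = [1, 1, 0]        # decoder: x_hat(1)=1, x_hat(2)=1, x_hat(3)=0
P_correct = sum(P[x_hat[y]][y] for y in range(3))   # 0.24 + 0.32 + 0.06
P_error = 1 - P_correct
```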
27
3.3 Optimal Decoder
Review: ECS315 (2017)
29
Guessing Game 1
30
There are 15 cards. Each has a number on it. Here are the 15 cards:
1 2 2 3 3 3 4 4 4 4 5 5 5 5 5
One card is randomly selected from the 15 cards.
You need to guess the number on the card.
You have to pay 1 Baht for each incorrect guess.
The game is to be repeated n = 10,000 times.
What should be your guess value?
31
close all; clear all;

n = 5; % number of times to play this game

D = [1 2 2 3 3 3 4 4 4 4 5 5 5 5 5];
X = D(randi(length(D),1,n));

if n <= 10
    X
end

g = 1
cost = sum(X ~= g)

if n > 1
    averageCostPerGame = cost/n
end

>> GuessingGame_4_1_1
X =
     3 5 1 2 5
g =
     1
cost =
     4
averageCostPerGame =
     0.8000
32
close all; clear all;

n = 5; % number of times to play this game

D = [1 2 2 3 3 3 4 4 4 4 5 5 5 5 5];
X = D(randi(length(D),1,n));

if n <= 10
    X
end

g = 3.3
cost = sum(X ~= g)

if n > 1
    averageCostPerGame = cost/n
end

>> GuessingGame_4_1_1
X =
     5 3 2 4 1
g =
     3.3000
cost =
     5
averageCostPerGame =
     1
33

close all; clear all;
n = 1e4; % number of times to play this game
D = [1 2 2 3 3 3 4 4 4 4 5 5 5 5 5];
X = D(randi(length(D),1,n));
if n <= 10
    X
end
g = ?
cost = sum(X ~= g)
if n > 1
    averageCostPerGame = cost/n
end
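The MATLAB experiment can be mirrored in a few lines of Python (a sketch; the seed value is arbitrary) to compare all five guesses at once:

```python
import random

# Same game: draw a card, pay 1 Baht whenever the guess g misses.
random.seed(452)  # arbitrary seed, for repeatability
D = [1, 2, 2, 3, 3, 3, 4, 4, 4, 4, 5, 5, 5, 5, 5]
n = 10_000
draws = [random.choice(D) for _ in range(n)]

for g in (1, 2, 3, 4, 5):
    avg_cost = sum(x != g for x in draws) / n
    print(g, avg_cost)

# The smallest average cost occurs at g = 5, the most-likely value,
# near its theoretical value P[X != 5] = 10/15.
```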
Guessing Game 1

34

[Figure: average cost per game vs. guess value (two plots, for guess values from 1 to 5 and from 1 to 6); the average cost ranges from about 0.65 to 1 and is smallest at the most-likely value.]
Guessing Game 1

35

Optimal guess: the most-likely value.
Summary: MAP vs. ML Decoders

36

MAP decoder (optimal):
- The decoder is derived from the P matrix: select (by circling) the maximum value in each column (for each value of y) of the P matrix.
- The corresponding x value is the value of x̂_MAP(y):
  x̂_MAP(y) = arg max_x p_{X,Y}(x, y) = arg max_x P[X = x | Y = y] = arg max_x Q(y|x) p(x),
  where P[X = x | Y = y] is the a posteriori probability and p(x) is the prior probability.
- Once the decoder (the decoding table) is derived, P(C) and P(E) are calculated by adding the corresponding probabilities in the P matrix.

ML decoder:
- The decoder is derived from the Q matrix: select (by circling) the maximum value in each column (for each value of y) of the Q matrix.
- The corresponding x value is the value of x̂_ML(y):
  x̂_ML(y) = arg max_x Q(y|x),
  where Q(y|x) is the likelihood function.
- Can be derived without knowing the channel input probabilities.
- Optimal at least when p(x) is uniform (the channel inputs are equally likely).
Example: MAP Decoder [Ex. 3.36]

37

X ∈ {0, 1} with prior p = [0.2, 0.8]; Y ∈ {1, 2, 3}.

Q = [0.5 0.2 0.3; 0.3 0.4 0.3]
P = diag(p)Q = [0.10 0.04 0.06; 0.24 0.32 0.24]
(rows: x = 0, 1; columns: y = 1, 2, 3)

Circling the maximum value in each column of P (0.24, 0.32, and 0.24, all in the row x = 1) gives the decoding table x̂_MAP(y) = 1 for every y ∈ {1, 2, 3}, so

P(C) = 0.24 + 0.32 + 0.24 = 0.8 and P(E) = 1 - 0.8 = 0.2.
Example: ML Decoder [Ex. 3.47]

38

Same channel as before: p = [0.2, 0.8], Q = [0.5 0.2 0.3; 0.3 0.4 0.3], P = diag(p)Q = [0.10 0.04 0.06; 0.24 0.32 0.24] (rows: x = 0, 1; columns: y = 1, 2, 3).

Circle the maximum value in each column of Q. Column y = 1 gives x̂ = 0 and column y = 2 gives x̂ = 1; in column y = 3 the two entries tie (0.3 and 0.3), so either x may be chosen, giving two solutions.

Sol 1: x̂_ML(y) = 0 for y = 1, 1 for y = 2, 0 for y = 3. [Same as Ex. 3.27]
P(C) = 0.10 + 0.32 + 0.06 = 0.48, so P(E) = 1 - 0.48 = 0.52.

Sol 2: x̂_ML(y) = 0 for y = 1, 1 for y = 2, 1 for y = 3. [Agrees with Ex. 3.33]
P(C) = 0.10 + 0.32 + 0.24 = 0.66, so P(E) = 1 - 0.66 = 0.34.
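Both decoding tables can be generated mechanically from the matrices; here is a Python sketch (ties broken toward the smaller x, matching MATLAB's max):

```python
import numpy as np

# Channel of Ex. 3.36 / Ex. 3.47: x in {0, 1}, y in {1, 2, 3}.
p = np.array([0.2, 0.8])
Q = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.4, 0.3]])
P = np.diag(p) @ Q

x_hat_MAP = P.argmax(axis=0)   # maximize the joint pmf in each column
x_hat_ML = Q.argmax(axis=0)    # maximize the likelihood in each column

PE_MAP = 1 - sum(P[x_hat_MAP[j], j] for j in range(3))
PE_ML = 1 - sum(P[x_hat_ML[j], j] for j in range(3))
print(x_hat_MAP, round(PE_MAP, 4))  # [1 1 1] 0.2
print(x_hat_ML, round(PE_ML, 4))    # [0 1 0] 0.52
```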
MAP Decoder

39

%% MAP Decoder
P = diag(p_X)*Q; % Weight the channel transition probability by the
                 % corresponding prior probability.
[V I] = max(P);  % For I, the default MATLAB behavior is that when there are
                 % multiple max, the index of the first one is returned.
Decoder_Table = S_X(I) % The decoded values corresponding to the received Y
%% Decode according to the decoder table
x_hat = y; % preallocation
for k = 1:length(S_Y)
    I = (y==S_Y(k));
    x_hat(I) = Decoder_Table(k);
end
PE_sim = 1-sum(x==x_hat)/n % Error probability from the simulation
%% Calculation of the theoretical error probability
PC = 0;
for k = 1:length(S_X)
    I = (Decoder_Table == S_X(k));
    Q_row = Q(k,:);
    PC = PC + p_X(k)*sum(Q_row(I));
end
PE_theoretical = 1-PC

[DMC_decoder_MAP_demo.m]
ML Decoder

40

%% ML Decoder
[V I] = max(Q);  % For I, the default MATLAB behavior is that when there are
                 % multiple max, the index of the first one is returned.
Decoder_Table = S_X(I) % The decoded values corresponding to the received Y
%% Decode according to the decoder table
x_hat = y; % preallocation
for k = 1:length(S_Y)
    I = (y==S_Y(k));
    x_hat(I) = Decoder_Table(k);
end
PE_sim = 1-sum(x==x_hat)/n % Error probability from the simulation
%% Calculation of the theoretical error probability
PC = 0;
for k = 1:length(S_X)
    I = (Decoder_Table == S_X(k));
    Q_row = Q(k,:);
    PC = PC + p_X(k)*sum(Q_row(I));
end
PE_theoretical = 1-PC

[DMC_decoder_ML_demo.m]
47

3.5 Repetition Code in Communications Over BSC
Hypercube
48 https://www.youtube.com/watch?v=Q_B5GpsbSQw
n-bit space
50
n = 1
n = 2
n = 3
n = 4
Channel Encoder and Decoder

51

[Block diagram: Information Source → (Message) → Source Encoder → Channel Encoder (adds systematic redundancy) → Digital Modulator → (Transmitted Signal) → Channel (Noise & Interference) → (Received Signal) → Digital Demodulator → Channel Decoder → Source Decoder → (Recovered Message) → Destination. X: channel input; Y: channel output. Between X and Y the channel behaves as a BSC with crossover probability p.]
Better Equivalent Channel

52

[Same block diagram: the channel encoder adds systematic redundancy and the channel decoder removes it, so that the encoder, the original channel, and the decoder together form a better equivalent channel.]
Example: Repetition Code

53

Original equivalent channel: BSC with crossover probability p = 0.01.

New (and better) equivalent channel:
- Use repetition code with n = 5 at the transmitter.
- Use majority vote at the receiver.
- New BSC with crossover probability 10p^3(1-p)^2 + 5p^4(1-p) + p^5.
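Plugging p = 0.01 into the new crossover probability shows how large the gain is (a Python sketch):

```python
# Majority vote over n = 5 copies fails only if 3 or more bits flip.
p = 0.01
p_new = 10*p**3*(1 - p)**2 + 5*p**4*(1 - p) + p**5
print(p_new)  # about 9.85e-06, roughly a thousand times smaller than p
```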
Example: ASCII Encoder and BSC

54

[Same block diagram; the source encoder is an ASCII encoder mapping the message “LOVE” to the bit string “1001100100111110101101000101”, which passes through a BSC with crossover probability p. X: channel input; Y: channel output.]

The ASCII Coded Character Set

55 [The ARRL Handbook for Radio Communications 2013]

[7-bit ASCII code chart (columns headed 0, 16, 32, 48, 64, 80, 96, 112) omitted.]
Example: ASCII Encoder

56

Character  Codeword
E          1000101
L          1001100
O          1001111
V          1010110

Information Source → “LOVE” → Source Encoder → “1001100100111110101101000101”
(= c(“L”) c(“O”) c(“V”) c(“E”))

MATLAB:
>> M = 'LOVE';
>> X = dec2bin(M,7);
>> X = reshape(X',1,numel(X))
X =
1001100100111110101101000101
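The same 7-bit encoding in Python (a sketch of what dec2bin plus reshape accomplishes):

```python
# Concatenate the 7-bit ASCII codeword of each character.
def ascii_encode(message, width=7):
    return ''.join(format(ord(ch), f'0{width}b') for ch in message)

bits = ascii_encode('LOVE')
print(bits)  # 1001100100111110101101000101
```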
System considered

57

[Same block diagram; the source encoder is a 7-bit ASCII encoder, and the channel is a BSC with crossover probability p. X: channel input; Y: channel output.]
THE WIZARD OF OZ (1900)written by L. Frank Baum
Introduction
Folklore, legends, myths and fairy tales have followed childhoodthrough the ages, for every healthy youngster has a wholesome andinstinctive love for stories fantastic, marvelous and manifestlyunreal. The winged fairies of Grimm and Andersen have brought morehappiness to childish hearts than all other human creations.Yet the old time fairy tale, having served for generations, may
The ASCII-encoded bit string for this text begins 1010100 1001000 1000101 … (the codewords for “T”, “H”, “E”); the full bit string is omitted here.
BSC crossover probability p = 0.01.
[ErrorProbabilityoverBSC.m]
Results
58
[Received bit string (the transmitted one with roughly 1% of the bits flipped by the BSC) and transmitted bit string omitted here.]
THE WIZARD OF OZ (1900)written by L. Frank Baum
Introduction
Folklore, legends, myths and fairy tales have followed childhoodthrough the ages, for every healthy youngster has a wholesome andinstinctive love for stories fantastic, marvelous and manifestlyunreal. The winged fairies of Grimm and Andersen have brought morehappiness to childish hearts than all other human creations.Yet the old time fairy tale, having served for generations, may
THE WIZARD _F OZ (19009 written by L. Frank0Baum
Introduction
0Folklore. legendS myths and faiby talgs have fmllowed childhoodthrough the ages, for$every nealthy youngster has a wholesome andilspynctire love for storieq fa.tastic, marvelou3 end manifestlyunreal. The winged fairies of Grimm and*Andersen havE brought morehappiness to chihdish hearts than all odhur human creations/Yet the0old"timm fai2y tale, having qerved for generationq, may
Results
59
The whole book, which is saved in the file “OZ.txt”, has 207760 characters (symbols).
The ASCII encoded string has 207760×7 = 1454320 bits.
The channel corrupts 14545 bits.
This corresponds to 14108 erroneous characters.
Results

60

>> ErrorProbabilityoverBSC
biterror =
       14545
BER =
   0.010001237691842
theoretical_BER =
   0.010000000000000
characterError =
       14108
CER =
   0.067905275317674
theoretical_CER =
   0.067934652093010
Results

61

CER: a character (symbol) is successfully recovered if and only if none of its 7 bits are corrupted, so with p denoting the BSC’s crossover probability, CER = 1 - (1 - p)^7.
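This gives a one-line check of the simulated character error rate (a Python sketch):

```python
# A 7-bit character survives only if all 7 bits cross the BSC uncorrupted.
p = 0.01
CER = 1 - (1 - p)**7
print(CER)  # about 0.0679, matching theoretical_CER above
```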
Crossover probability and readability
62
When the first novel of the series, Harry Potter and the Philosopher's Stone (published in some countries as Harry Potter and the Sorcerer's Stone), opens, it is apparent that some significant event has taken place in the wizarding world--an event so very remarkable, even the Muggles notice signs of it. The full background to this event and to the person of Harry Potter is only revealed gradually through the series. After the introductory chapter, the book leaps forward to a time shortly before Harry Potter's eleventh birthday, and it is at this point that his magical background begins to be revealed.
When the first novel of the series, Harry Pottez and the Philosopher's Stone (p5blished in some countries as Harry Potter cnd the Sorcerep's Stone), opens, it i3 apparent that soMecignifacant event!haS taken0place in the wi~arding 7orld--ao event so`very!bemark!blu, even the Mufgles nodice signs"of it. The fuld background to this event and to the person of Harry P/tTer is only revealed gradually through th series. After the introfuctory chapter, the boo+ leaps forward to a time shortly before Harpy Potteb7s eleventh`birthday, and )t is at this poi~t that his -agikal bac{ground begins to be revealed.
p = 0.01, CER ≈ 0.07
Original
Crossover probability and readability
63
When the first novel of the series, Harry Pottez and the Philosopher's Stone (p5blished in some countries as Harry Potter cnd the Sorcerep's Stone), opens, it i3 apparent that soMecignifacant event!haS taken0place in the wi~arding 7orld--ao event so`very!bemark!blu, even the Mufgles nodice signs"of it. The fuld background to this event and to the person of Harry P/tTer is only revealed gradually through th series. After the introfuctory chapter, the boo+ leaps forward to a time shortly before Harpy Potteb7s eleventh`birthday, and )t is at this poi~t that his -agikal bac{ground begins to be revealed.
p = 0.01, CER ≈ 0.07

Humans may be able to correct some (or even all) of these errors.
Crossover probability and readability
64
w(en th% birst .ovo,`of the serieq, Hcrry Potter(and the Phidosop`er's Suone (xub|ishe$(in some!Countries as @arry Potter and t`e Snr#erer's S|ong)- opens, it is apparent thatsoeesmgfificant erent ha3 taieN place in the wizardino world-,an event!so very remarkable, even thEEugglds notiae qigns of it. Tledfull back'tound to this event ane to the perron of Harry Popter is onl{ reveqned gsadeally thro}gH th%$serias. After the int2oducpory chcptur, the0jook deapsforward to"a!tmme shmrtly befosE Harry"Potter's eleventh jirthdiy cnd ht is a| thi3 po{nt tHat@is mAgiial background begijs to rm rerealed.
Whethe first nOwgl nf the susi-Q-@Harr} PoutEr(and |he PhilosoxhEr's Ctonepuclyshed in som% coultries as Harrx @ tter and the S_rcermr7s Spone), opdns, id is apparent that {omg`signifikant evmnt ias taKen!Placa in tHe 7ijardIng world--an event so Very remaroqble, eve.!thE MugglaC fotice signc of"it. Uhe full backf2ound`to thas even| ant`0o the pEssoj of @arry Qotteb iw only revealed gradu!lly vhvoug` the rerier. Afte2 the IndRoductori chaptar,t`ebook leats ForwaRf tc a 4imE shostl= before!Hcssy potter's u|Eveoth$firthdA{, and iT is ad this pomNt uhav `ir magica, back'bound cegins to bE 2evealed.
p = 0.02, CER ≈ 0.13

p = 0.03, CER ≈ 0.19
Crossover probability and readability
65
When phe fir v okval ov"th% serie3, @`rry0Pntter efdxtxeThil soph%rs Stone0(p}blisjed!in{ooe c un|pye{ agav0y Potter aj`(the sorcerer"s S4o|e)< opdns- mt"Is!apParEnt 4hat somusiwnidiga.v evant iAs take."plhge in(uhe w)zard)ng wo{|d--An event so very Rumar{ableleteN0Dhe %ugcles$n t)ce signs of$At. Tje!&ul|!backep/und Dk thkw`event ajt(to vhd per{On of8Ikxry P_Pter is oN,y rereAeud gredualli 4hroufh5ie qeriesn Af|ir the )~trofUctkry!ciapter,$tler%ok lE`ps for erd8to!a d)hg 3Hostly redobd HArry(Potter/r elaventI(birpl%ay,))nd(iD i3 1t tlishohlt vhat iis$iagical bac+gropnd bedans to bg rEve!ied/
Whef th% &i2sv nkvdl"On(txE"serm-s< HaRtY Qo|p%R$anlthe Phi$)qop8gb'r YtoNe (puclirhedin23/ee c uNpr9es aZ Harby!PovDdZ qnd0THA!Uorojev's Qpof'), pegsL iT is0aqazenP Tiet`{nlesau*!fICQ~t eve.t`xA# raken pOqb%%)D }Hm`wizprdYjv"wOrnd--a~%W%Jv s' tury2maskABdd$`eden(tl| LuxGxec`nOtike c)gzq of ktTiu!f5mm"cackG@ Ud(to"vhhQ a~aNd alt tn0vid veRckn of HaRvq$Xntter#isxohk{ regea,ed@&saduadLy u(2otGh"tau griEs."AfTex0T`g mntr DUCt ry kh `ter,$thd(fomN0j`apv ngrwarTt-0c t,me"1xortly bEemsL |ar2q Pnfter'3 aMen-n5i@Fipth$`q, aoh It i3d1t piac0pmhnP d*if Zas mafibin"je#k7poUndpb%dins tk`be qe6e!lgd.
p = 0.05, CER ≈ 0.30

p = 0.10, CER ≈ 0.52
BER vs. CER

66

[Figure: Character Error Rate (CER) versus Bit Error Rate (BER) on linear axes from 0 to 1.] [BERCER.m]

BER vs. CER

67

[The same curve on log-log axes from 10^-2 to 10^0.] [BERCER.m]
1

4. Mutual Information and Channel Capacity
Reference for this chapter

2

Elements of Information Theory by Thomas M. Cover and Joy A. Thomas, 2nd Edition (Wiley), Chapters 2, 7, and 8.

1st Edition available at SIIT library: Q360 C68 1991
Information-Theoretic Quantities

3
Conditional Entropies

4

H(X) ≡ -∑_x p(x) log₂ p(x): the amount of randomness in X.

H(X|Y = y) ≡ -∑_x p_{X|Y}(x|y) log₂ p_{X|Y}(x|y): the amount of randomness still remaining in X when we know that Y = y, for a particular value y. (Apply the entropy calculation to the corresponding row of the conditional-pmf matrix.)

H(X|Y) ≡ ∑_y p_Y(y) H(X|Y = y): the average amount of randomness still remaining in X when we know Y.
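As a numerical illustration, here is a Python sketch of the same bookkeeping (the channel values are taken from the BAC example later in the slides, and the uniform input is an arbitrary choice; the roles of X and Y are swapped to mirror the informations MATLAB function shown later):

```python
import numpy as np

p = np.array([0.5, 0.5])            # input pmf p(x), chosen uniform here
Q = np.array([[0.1, 0.9],           # Q(y|x), one row per value of x
              [0.4, 0.6]])

def H(pmf):
    pmf = pmf[pmf > 0]              # 0 log 0 = 0 convention
    return -np.sum(pmf * np.log2(pmf))

q = p @ Q                           # output pmf p_Y
H_Y = H(q)                          # H(Y)
H_YgX = np.sum(p * np.array([H(row) for row in Q]))  # H(Y|X): averaged row entropies
I_XY = H_Y - H_YgX                  # mutual information I(X;Y)
print(round(H_Y, 4), round(H_YgX, 4), round(I_XY, 4))  # 0.8113 0.72 0.0913
```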
Diagrams [Figure 16]

5

[Three analogous diagrams. Venn diagram: sets A and B with regions A∩B, A\B, B\A and union A∪B. Probability diagram: the same regions measured by P. Information diagram: H(X|Y) and H(Y|X) play the roles of the two set differences, I(X;Y) the role of the intersection, and H(X,Y) the role of the union.]
Operational Channel Capacity

8
Example: Repetition Code [Figure 14]

9

Original equivalent channel: BSC with crossover probability p = 0.01.

New (and better) equivalent channel:
- Use repetition code with n = 5 at the transmitter.
- Use majority vote at the receiver.
- New BSC with crossover probability 10p^3(1-p)^2 + 5p^4(1-p) + p^5.
Example: Repetition Code

10

Original equivalent channel: BSC with crossover probability p = 0.1.

New (and better) equivalent channel:
- Use a repetition code at the transmitter.
- Use majority vote at the receiver.
- New BSC with a new (smaller) crossover probability.
Reminder [From ECS315]
11
[From ECS315]
12
Example: Repetition Code

13

For the BSC with p = 0.1, the new crossover probability after an n-repetition code with majority voting:

n = 1: p = 0.1
n = 3: 3p^2(1-p) + p^3 = 0.0280
n = 5: 10p^3(1-p)^2 + 5p^4(1-p) + p^5 = 0.0086
n = 7: 0.0027
n = 9: 8.9092×10^-4
n = 11: 2.9571×10^-4
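The whole table follows from one binomial-tail formula; a Python sketch:

```python
from math import comb

# The majority vote fails when more than half of the n repeated bits flip.
def majority_error(n, p):
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

for n in (1, 3, 5, 7, 9, 11):
    print(n, majority_error(n, 0.1))
```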
Information Channel Capacity

14

Channel Capacity

15

- “Operational”: the maximum rate at which reliable communication is possible (arbitrarily small error probability can be achieved).
- “Information”: C = max_{p_X} I(X;Y) [bpcu].

Shannon [1948] shows that these two quantities are actually the same.
MATLAB

16

function H = entropy2s(p)
% ENTROPY2S accepts a probability mass function
% as a row vector, calculates the corresponding
% entropy in bits.
p = p(find(abs(sort(p)-1)>1e-8)); % Eliminate 1
p = p(find(abs(p)>1e-8));         % Eliminate 0
if length(p)==0
    H = 0;
else
    H = simplify(-sum(p.*log(p))/log(sym(2)));
end

function I = informations(p,Q)
X = length(p);
q = p*Q;
HY = entropy2s(q);
temp = [];
for i = 1:X
    temp = [temp entropy2s(Q(i,:))];
end
HYgX = sum(p.*temp);
I = HY - HYgX;
Capacity calculation for BAC

17

[Figure: I(X;Y) in bits vs. p0 = P[X = 0] for the binary asymmetric channel with Q = [0.1 0.9; 0.4 0.6]; the curve rises from 0 to a peak of about 0.09 bits and falls back to 0.]

Capacity of 0.0918 bits is achieved by p = [0.5380, 0.4620].
Capacity calculation for BAC

18

close all; clear all;
syms p0
p = [p0 1-p0];
Q = [1 9; 4 6]/sym(10);

I = simplify(informations(p,Q))
p0o = simplify(solve(diff(I)==0))
po = eval([p0o 1-p0o])
C = simplify(subs(I,p0,p0o))
eval(C)

>> Capacity_Ex_BAC
I =
(log(2/5 - (3*p0)/10)*((3*p0)/10 - 2/5) - log((3*p0)/10 + 3/5)*((3*p0)/10 + 3/5))/log(2) + (log((5*2^(3/5)*3^(2/5))/6)*(p0 - 1))/log(2) + (p0*log((3*3^(4/5))/10))/log(2)
p0o =
(27648*2^(1/3))/109565 - (69984*2^(2/3))/109565 + 135164/109565
po =
    0.5376    0.4624
C =
(log((3*3^(4/5))/10)*((27648*2^(1/3))/109565 - (69984*2^(2/3))/109565 + 135164/109565))/log(2) - (log((104976*2^(2/3))/547825 - (41472*2^(1/3))/547825 + 16384/547825)*((104976*2^(2/3))/547825 - (41472*2^(1/3))/547825 + 16384/547825) + log((41472*2^(1/3))/547825 - (104976*2^(2/3))/547825 + 531441/547825)*((41472*2^(1/3))/547825 - (104976*2^(2/3))/547825 + 531441/547825))/log(2) + (log((5*2^(3/5)*3^(2/5))/6)*((27648*2^(1/3))/109565 - (69984*2^(2/3))/109565 + 25599/109565))/log(2)
ans =
    0.0918

(BAC: Q = [0.1 0.9; 0.4 0.6].)
Same procedure applied to BSC

19

close all; clear all;
syms p0
p = [p0 1-p0];
Q = [6 4; 4 6]/sym(10);

I = simplify(informations(p,Q))
p0o = simplify(solve(diff(I)==0))
po = eval([p0o 1-p0o])
C = simplify(subs(I,p0,p0o))
eval(C)

>> Capacity_Ex_BSC
I =
(log((5*2^(3/5)*3^(2/5))/6)*(p0 - 1))/log(2) - (p0*log((5*2^(3/5)*3^(2/5))/6))/log(2) - (log(p0/5 + 2/5)*(p0/5 + 2/5) - log(3/5 - p0/5)*(p0/5 - 3/5))/log(2)
p0o =
1/2
po =
    0.5000    0.5000
C =
log((2*2^(2/5)*3^(3/5))/5)/log(2)
ans =
    0.0290

(BSC with crossover probability 0.4: Q = [0.6 0.4; 0.4 0.6].)
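The BSC answer has the familiar closed form C = 1 - H2(0.4), which equals the symbolic result above; a quick numerical check in Python:

```python
from math import log2

# Capacity of a BSC with crossover probability 0.4: C = 1 - H2(0.4),
# achieved by the uniform input (p0 = 1/2), as found symbolically above.
def H2(p):
    return -p*log2(p) - (1 - p)*log2(1 - p)

C = 1 - H2(0.4)
print(round(C, 4))  # 0.029
```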
Blahut–Arimoto algorithm

20

function [ps C] = capacity_blahut(Q)
% Input:  Q = channel transition probability matrix
% Output: C = channel capacity
%         ps = row vector containing pmf that achieves capacity
tl = 1e-8; % tolerance (for the stopping condition)
n = 1000;  % max number of iterations (in case the stopping condition
           % is "never" reached)
nx = size(Q,1);
pT = ones(1,nx)/nx; % First, guess uniform X.
for k = 1:n
    qT = pT*Q;
    % Eliminate the case with 0
    % Column-division by qT
    temp = Q.*(ones(nx,1)*(1./qT));
    % Eliminate the case of 0/0
    l2 = log2(temp);
    l2(find(isnan(l2) | (l2==-inf) | (l2==inf))) = 0;
    logc = (sum(Q.*(l2),2))';
    CT = 2.^(logc);
    A = log2(sum(pT.*CT));
    B = log2(max(CT));
    if ((B-A) < tl)
        break
    end
    % For the next loop
    pT = pT.*CT;     % un-normalized
    pT = pT/sum(pT); % normalized
    if (k == n)
        fprintf('\nNot converge within n loops\n')
    end
end
ps = pT;
C = (A+B)/2;

[capacity_blahut.m]
Capacity calculation for BAC: a revisit

21

close all; clear all;

Q = [1 9; 4 6]/10;

[ps C] = capacity_blahut(Q)

>> Capacity_Ex_BAC_blahut
ps =
    0.5376    0.4624
C =
    0.0918

(BAC: Q = [0.1 0.9; 0.4 0.6].)
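For comparison, here is a Python port of capacity_blahut (a sketch using the same uniform start and the same stopping rule), which reproduces the same numbers:

```python
import numpy as np

def capacity_blahut(Q, tol=1e-8, max_iter=1000):
    # Q: channel transition matrix, rows = inputs, columns = outputs.
    nx = Q.shape[0]
    p = np.ones(nx) / nx                 # first guess: uniform input pmf
    for _ in range(max_iter):
        q = p @ Q                        # current output pmf
        with np.errstate(divide='ignore', invalid='ignore'):
            l2 = np.log2(Q / q)          # column-wise division by q
        l2[~np.isfinite(l2)] = 0.0       # 0*log(0) and 0/0 contribute nothing
        c = 2.0 ** np.sum(Q * l2, axis=1)
        A, B = np.log2(np.sum(p * c)), np.log2(np.max(c))
        if B - A < tol:                  # A <= C <= B: stop when the gap is tight
            break
        p = p * c / np.sum(p * c)        # multiplicative update, renormalized
    return p, (A + B) / 2

ps, C = capacity_blahut(np.array([[0.1, 0.9], [0.4, 0.6]]))
print(ps.round(4), round(C, 4))  # [0.5376 0.4624] 0.0918
```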
Richard Blahut

22

Former chair of the Electrical and Computer Engineering Department at the University of Illinois at Urbana-Champaign.

Best known for the Blahut–Arimoto algorithm (iterative calculation of C).
23
Claude E. Shannon (1972)
David S. Slepian (1974)
Robert M. Fano (1976)
Peter Elias (1977)
Mark S. Pinsker (1978)
Jacob Wolfowitz (1979)
W. Wesley Peterson (1981)
Irving S. Reed (1982)
Robert G. Gallager (1983)
Solomon W. Golomb (1985)
William L. Root (1986)
James L. Massey (1988)
Thomas M. Cover (1990)
Andrew J. Viterbi (1991)
Elwyn R. Berlekamp (1993)
Aaron D. Wyner (1994)
G. David Forney, Jr. (1995)
Imre Csiszár (1996)
Jacob Ziv (1997)
Neil J. A. Sloane (1998)
Tadao Kasami (1999)
Thomas Kailath (2000)
Jack Keil Wolf (2001)
Toby Berger (2002)
Lloyd R. Welch (2003)
Robert J. McEliece (2004)
Richard Blahut (2005)
Rudolf Ahlswede (2006)
Sergio Verdu (2007)
Robert M. Gray (2008)
Jorma Rissanen (2009)
Te Sun Han (2010)
Shlomo Shamai (Shitz) (2011)
Abbas El Gamal (2012)
Katalin Marton (2013)
János Körner (2014)
Arthur Robert Calderbank (2015)
Alexander S. Holevo (2016)

David Tse (2017)
[ http://www.itsoc.org/honors/claude-e-shannon-award ]
Berger plaque
24
Raymond Yeung
25
BS, MEng and PhD degrees in electrical engineering from Cornell University in 1984, 1985, and 1988, respectively.