ENSC327 Communications Systems 16: Probability (Chap. 8)
TRANSCRIPT
ENSC327
Communications Systems
16: Probability (Chap. 8)

Jie Liang
School of Engineering Science
Simon Fraser University
Probability

• Most signals of interest are random.
• The performance of communication systems is affected by random noise.
• Probability theory and random processes are needed.
• History of Probability Theory:
  • Created in 1654 by Pascal and Fermat after a question from a gambler.
  • 1812: Laplace applied probabilistic ideas to many scientific and practical problems.
  • Several definitions have been developed.
  • 1933: Kolmogorov outlined an axiomatic definition of probability that formed the basis of the modern theory.
  • Probability theory is now part of a more general discipline known as measure theory.
1 Classical Definition of Probability

• Also called the a priori definition of probability.
  • Allows probabilities to be computed in special cases without experimentation.
  • Most notably, probabilities can be computed in games of chance.
  • A priori probabilities are most commonly computed for equally likely outcomes.
• A random experiment is performed.
• Outcome: the result of a random experiment.
• Event: a collection of outcomes.
• The probability of event A is defined as:

  P[A] = n_A / n

  • n: total number of outcomes.
  • n_A: number of favorable outcomes belonging to event A.
1 Classical Definition of Probability

• If we randomly draw a card from a deck of 52 cards:
  • Number of possible outcomes: n = 52.
  • If the event of interest is drawing a Heart: P = 13/52 = 1/4.
  • If the event of interest is drawing a King: P = 4/52 = 1/13.
  • If the event of interest is drawing a King OR a Heart: P = (13 + 4 - 1)/52 = 16/52 = 4/13.
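The card probabilities can be checked by direct enumeration; a minimal Python sketch of the classical definition P[A] = n_A / n (the helper `classical_prob` is illustrative, not from the lecture):

```python
from fractions import Fraction

# Build a 52-card deck as (rank, suit) pairs.
ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
suits = ["Hearts", "Diamonds", "Clubs", "Spades"]
deck = [(r, s) for r in ranks for s in suits]

def classical_prob(event):
    """Classical definition: P[A] = n_A / n for equally likely outcomes."""
    favorable = [card for card in deck if event(card)]
    return Fraction(len(favorable), len(deck))

p_heart = classical_prob(lambda c: c[1] == "Hearts")                          # 13/52 = 1/4
p_king = classical_prob(lambda c: c[0] == "K")                                # 4/52 = 1/13
p_king_or_heart = classical_prob(lambda c: c[0] == "K" or c[1] == "Hearts")   # 16/52 = 4/13
print(p_heart, p_king, p_king_or_heart)
```

Note that the union event counts the King of Hearts only once, which is why the count is 13 + 4 - 1 = 16.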
2 Relative Frequency Definition of Probability

• Limitation of the classical definition: it implicitly defines all outcomes to be equiprobable, which is not always true.
• The relative frequency approach to probability is well-suited to a wide range of scientific disciplines.
• Assumption: the probability of an event can be measured by repeated trials.
• The probability of event A is defined as

  P[A] = lim (n→∞) n_A / n
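The relative frequency definition can be illustrated by simulation: as the number of trials n grows, n_A / n settles near the true probability. A small sketch for a fair coin (the function name and trial counts are illustrative, not from the lecture):

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def relative_frequency(n_trials):
    """Estimate P[heads] for a fair coin as n_A / n over n_trials tosses."""
    n_A = sum(1 for _ in range(n_trials) if random.random() < 0.5)
    return n_A / n_trials

# The estimate should approach 0.5 as n grows.
for n in (100, 10_000, 1_000_000):
    print(n, relative_frequency(n))
```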
Some Notations from Set Theory

• Union of A and B: A ∪ B
  • Given two sets A and B, their union is the set consisting of all objects which are elements of A or of B or of both.
• Intersection of A and B: A ∩ B
  • The intersection of A and B is the set of all objects which are both in A and in B.
Some Notations from Set Theory

• Complement of A: Ā, the set of all objects in S that are not in A.
• Null set ∅: the null or impossible event.
3 Axiomatic Definition of Probability

• Most general approach to probability.
• Treat a random experiment and its outcomes as a sample space S and its points.
  • Each possible outcome is mapped to a sample point s_k.
  • An event corresponds to a single sample point or a set of sample points.
  • A single sample point is called an elementary event.
  • The entire sample space S is called the sure event.
Axiomatic Definition of Probability

• A probability system consists of the triple:
  1. A sample space S of elementary events (outcomes).
  2. A class of events that are subsets of S.
  3. A probability measure P[A] assigned to each event A, satisfying the following axioms:
     (i) P[S] = 1.
     (ii) 0 ≤ P[A] ≤ 1.
     (iii) P(A ∪ B) = P(A) + P(B) if A and B are mutually exclusive.
The Venn Diagrams

• Event: a collection of outcomes.

[Venn diagram: events shown as regions within the sample space S]
Some Properties of Probability

Axioms:
(i) P[S] = 1.
(ii) 0 ≤ P[A] ≤ 1.
(iii) P(A ∪ B) = P(A) + P(B) for mutually exclusive A and B.

Property 1: P(Ā) = 1 - P(A)
Some Properties of Probability

Property 2: P(A ∪ B) = P(A) + P(B) - P(A ∩ B)

Proof: Write A ∪ B as the union of the disjoint events A and (Ā ∩ B), and B as the union of the disjoint events (A ∩ B) and (Ā ∩ B). Applying axiom (iii) to both decompositions and eliminating P(Ā ∩ B) gives the result.
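Property 2 can be verified by enumeration, for instance on the 52-card deck from the earlier example (a small Python check, not code from the lecture):

```python
from fractions import Fraction

# 52-card deck as (rank, suit) pairs.
ranks = "A 2 3 4 5 6 7 8 9 10 J Q K".split()
suits = ["Hearts", "Diamonds", "Clubs", "Spades"]
deck = [(r, s) for r in ranks for s in suits]
n = len(deck)

A = {c for c in deck if c[0] == "K"}        # event: draw a King
B = {c for c in deck if c[1] == "Hearts"}   # event: draw a Heart

P = lambda E: Fraction(len(E), n)           # equally likely outcomes

# Property 2: P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
lhs = P(A | B)
rhs = P(A) + P(B) - P(A & B)
print(lhs, rhs)  # both 4/13
```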
Some Properties of Probability

• Joint probability P(A ∩ B):
  • Prob. that both events A and B occur.
• Marginal probability:
  • The probability of one event, regardless of the other event.
  • Obtained by summing the joint probability over the un-required event.
Some Properties of Probability

• Conditional probability of event A given that event B occurred:

  P(A|B) = P(A ∩ B) / P(B)

• Conditional probability of B given A:

  P(B|A) = P(A ∩ B) / P(A)
Some Properties of Probability

• Bayes' rule:

  P(B|A) = P(A|B) P(B) / P(A)

• Bayes' rule is useful when P(A|B), P(A) and P(B) can be easily determined, but P(B|A) is desired.
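Bayes' rule is a one-line computation; a minimal sketch with illustrative numbers (the values 0.9, 0.2, 0.3 are assumptions for demonstration, not from the lecture):

```python
def bayes(p_a_given_b, p_b, p_a):
    """Bayes' rule: P(B|A) = P(A|B) * P(B) / P(A)."""
    return p_a_given_b * p_b / p_a

# Illustrative numbers: P(A|B) = 0.9, P(B) = 0.2, P(A) = 0.3
p_b_given_a = bayes(0.9, 0.2, 0.3)
print(p_b_given_a)  # ≈ 0.6
```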
Some Properties of Probability

• Independence: Events A and B are independent if

  P(A|B) = P(A)   or   P(B|A) = P(B)

  Equivalently, P(A ∩ B) = P(A) P(B).
Law of Total Probability

• If {B1, B2, …, Bn} are pairwise disjoint and their union is the entire sample space, then for any event A of the same sample space:

  P(A) = Σ_i P(A ∩ B_i) = Σ_i P(A|B_i) P(B_i)
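The law of total probability sums the conditional probabilities weighted by the partition probabilities. A short numeric sketch (the partition sizes and conditional values are assumptions for illustration, not from the lecture):

```python
# Illustrative partition {B1, B2, B3} of the sample space.
p_B = [0.5, 0.3, 0.2]            # P(B_i); disjoint events summing to 1
p_A_given_B = [0.1, 0.4, 0.9]    # P(A | B_i)

# Law of total probability: P(A) = sum_i P(A | B_i) * P(B_i)
p_A = sum(pa * pb for pa, pb in zip(p_A_given_B, p_B))
print(p_A)  # ≈ 0.05 + 0.12 + 0.18 = 0.35
```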
Example

• Tossing two fair coins simultaneously:
  • Possible outcomes: HH, HT, TH, TT
  • Event A: at least one head (HH, HT, TH)
  • Event B: a match (HH, TT)

  P(A) = 3/4, P(B) = 1/2

• Are A and B independent? P(A ∩ B) = P(HH) = 1/4, but P(A) P(B) = 3/8 ≠ 1/4, so A and B are not independent.
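The independence check for the two-coin example can be done by enumerating the four equally likely outcomes (a small Python sketch, not code from the lecture):

```python
from fractions import Fraction
from itertools import product

outcomes = list(product("HT", repeat=2))   # HH, HT, TH, TT, equally likely
n = len(outcomes)

A = {o for o in outcomes if "H" in o}      # at least one head
B = {o for o in outcomes if o[0] == o[1]}  # a match

P = lambda E: Fraction(len(E), n)

print(P(A), P(B), P(A & B))                # 3/4, 1/2, 1/4
print(P(A & B) == P(A) * P(B))             # False: 1/4 != 3/8, not independent
```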
Example

• Consider a Binary Symmetric Channel (BSC) with input s and output r:
  • P(0r | 0s) = 1 - ε
  • P(1r | 1s) = 1 - ε
  • P(1r | 0s) = ε
  • P(0r | 1s) = ε
  • P(0s) = p
• Questions:
  • Prob. of getting j errors in k bits?
  • Most probable input given that a "1" is received?
1st Question

• The prob. of j errors in k bits:
  • Assume the transmissions of different bits are independent.
  • Then the number of errors is binomial: P(j errors in k bits) = C(k, j) ε^j (1 - ε)^(k-j).
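The binomial error probability for the BSC can be computed directly (a sketch of the formula above; the function name and example values are illustrative):

```python
from math import comb

def p_j_errors(k, j, eps):
    """P(j errors in k bits) over a BSC with crossover probability eps,
    assuming independent bit transmissions: C(k, j) * eps^j * (1-eps)^(k-j)."""
    return comb(k, j) * eps**j * (1 - eps) ** (k - j)

# e.g. probability of exactly 1 error in 3 bits with eps = 0.1
print(p_j_errors(3, 1, 0.1))  # ≈ 3 * 0.1 * 0.81 = 0.243
```

As a sanity check, summing over j = 0, …, k gives 1, since some number of errors must occur.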
2nd Question

• Most probable input given we receive a "1"?
  • Need to compare P(0s | 1r) and P(1s | 1r).
• The most probable input is 1 if P(1s | 1r) > P(0s | 1r).
• The most probable input is 0 if P(0s | 1r) > P(1s | 1r).
• ⇒ Maximum a posteriori (MAP) criterion.
2nd Question

By Bayes' rule and the law of total probability:

  P(1r) = P(1r | 0s) P(0s) + P(1r | 1s) P(1s) = εp + (1 - ε)(1 - p)

  P(0s | 1r) = εp / [εp + (1 - ε)(1 - p)]
  P(1s | 1r) = (1 - ε)(1 - p) / [εp + (1 - ε)(1 - p)]

So the MAP decision is 1 if (1 - ε)(1 - p) > εp, and 0 otherwise.
Example

• Assume p = 0.8, ε = 0.1.
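Plugging p = 0.8 and ε = 0.1 into the MAP comparison can be sketched in Python (a numeric check under the BSC model above, not code from the lecture):

```python
p, eps = 0.8, 0.1   # P(0 sent) = 0.8, crossover probability eps = 0.1

# P(1 received) by the law of total probability
p_1r = eps * p + (1 - eps) * (1 - p)          # 0.08 + 0.18 ≈ 0.26

# Posteriors by Bayes' rule
p_0s_given_1r = eps * p / p_1r                # ≈ 0.31
p_1s_given_1r = (1 - eps) * (1 - p) / p_1r    # ≈ 0.69

# MAP criterion: pick the input with the larger posterior
map_decision = 1 if p_1s_given_1r > p_0s_given_1r else 0
print(p_0s_given_1r, p_1s_given_1r, map_decision)
```

With these numbers the posterior of "1 sent" dominates, so the MAP decision on receiving a "1" is 1.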