Syllabus
Statistics & Probability : Elementary theory of probability, Baye’s theorem with simple applications, Expected value. Theoretical probability distributions – Binomial, Poisson and Normal distributions.
UNIT 4
STATISTICS AND PROBABILITY
10/4/2017
Submitted by: Ms. Prachi (Electrical Branch)
Submitted to : Mr. Amit Sharma (Department of Mathematics)
• Experiment: An action or an operation which produces well defined results.
• Outcomes: All possible results of an experiment are called the outcomes of the experiment.
• Types of Experiment: 1. Deterministic Experiment 2. Random Experiment
• Deterministic Experiment: An experiment whose outcome can be determined in advance is called a deterministic (or predictable) experiment. For example, drawing a ball from a bag consisting of 7 red balls only: every draw yields a red ball.
• Random Experiment: If in each trial of an experiment conducted under identical conditions, the outcome is not unique but may be any one of the possible outcomes, then such an experiment is called a random experiment. For example: a) tossing a coin b) throwing a die c) selecting a card from a pack of playing cards.
• Sample Space: The sample space is the set of all possible outcomes of a random experiment, and is denoted by the letter S. For example: a) in tossing 2 coins, the sample space of the experiment is S = {HH, HT, TH, TT}; b) in throwing an ordinary die, the sample space is S = {1, 2, 3, 4, 5, 6}.
• In general, the sample space serves as the universal set for all questions about the experiment, and an event is any subset of the sample space S.
• Trial and Event: Any particular performance of a random experiment is called a trial, and an outcome or combination of outcomes is termed an event. For example: a) If a coin is tossed repeatedly, the result is not unique; we may get either of the 2 faces, head or tail. Thus tossing the coin is a trial, and getting a head (or a tail) is an event. b) An experiment may consist of throwing a six-faced die and observing the number of points that appear; the possible outcomes are 1, 2, 3, 4, 5 and 6.
• Note: A pack of cards consists of 4 suits called spades, diamonds, clubs and hearts. Each suit consists of 13 cards, of which 9 are numbered from 2 to 10 and the remaining 4 are an ace, a king, a queen and a jack (or knave). Spades and clubs are black cards, while hearts and diamonds are red cards.
• Exhaustive events or cases: The total number of possible outcomes of a random experiment is known as the exhaustive number of events or cases. For example: a) In tossing a coin there are 2 exhaustive cases, head and tail (the possibility of the coin standing on its edge being ignored). b) In drawing 2 cards from a pack of cards, the exhaustive number of cases is 52C2, since 2 cards can be drawn out of 52 in 52C2 ways.
• Favourable events: The number of cases favourable to an event in a trial is the number of outcomes which entail the happening of that event. For example: a) In drawing a card from a pack, the number of cases favourable to drawing an ace is 4, to drawing a spade is 13, and to drawing a red card is 26. b) In throwing 2 dice, the cases favourable to getting the sum 5 are (1, 4), (4, 1), (2, 3) and (3, 2), i.e. 4 cases.
• Mutually Exclusive Events: Two or more events are called mutually exclusive if the occurrence of one of them prevents the happening of the others, i.e. the events cannot occur together at the same time in the same experiment. For example: a) In tossing a coin, the events head and tail are mutually exclusive. b) In throwing a die, all 6 faces numbered 1 to 6 are mutually exclusive, since if any one of these faces comes up, the possibility of the others in the same trial is ruled out.
• Equally Likely Events: Two events are called equally likely when there is no reason to expect one rather than the other. For example: a) When a card is drawn from a well shuffled pack, any card may appear in the draw, so the 52 cases are equally likely. b) In rolling a single die, each number that can show up represents an equally likely event.
• Independent Events: Events are said to be independent if the occurrence of one does not affect the occurrence of the other. For example: a) When a die is thrown thrice, the result of the first throw does not affect the result of the second throw. b) If we draw a card from a pack of 52 cards and replace it before drawing the second card, the result of the second draw is independent of the first draw.
• Dependent Events: Events are said to be dependent if the occurrence of one does affect the occurrence of the other. For example, in drawing 2 cards one after the other from a pack of 52 cards without replacement, the result of the second draw is affected by the first draw.
• Algebra of Events:
Union: E1 ∪ E2 = {s ∈ S : s ∈ E1 or s ∈ E2}
Intersection: E1 ∩ E2 = {s ∈ S : s ∈ E1 and s ∈ E2}
Complement: Ē = {s ∈ S : s ∉ E}
As events are subsets of the sample space S, all the laws of set theory hold for the algebra of events:
a) Commutative law: E1 ∪ E2 = E2 ∪ E1; E1 ∩ E2 = E2 ∩ E1
b) Associative law: E1 ∪ (E2 ∪ E3) = (E1 ∪ E2) ∪ E3; E1 ∩ (E2 ∩ E3) = (E1 ∩ E2) ∩ E3
c) Distributive law: E1 ∪ (E2 ∩ E3) = (E1 ∪ E2) ∩ (E1 ∪ E3); E1 ∩ (E2 ∪ E3) = (E1 ∩ E2) ∪ (E1 ∩ E3)
d) Identity law: E ∪ φ = E; E ∩ S = E
e) Complementation law: E ∪ Ē = S; E ∩ Ē = φ
f) Idempotent law: E ∪ E = E; E ∩ E = E
g) Domination law: E ∪ S = S; E ∩ φ = φ
h) Absorption law: E1 ∩ (E1 ∪ E2) = E1; E1 ∪ (E1 ∩ E2) = E1
i) De Morgan's laws: the complement of E1 ∪ E2 is Ē1 ∩ Ē2; the complement of E1 ∩ E2 is Ē1 ∪ Ē2
j) The complement of Ē is E
SOME NOTATIONS:
Let A and B be two events. Then,
a) Neither A nor B occurs: Ā ∩ B̄
b) Exactly one of the events A and B occurs: (A ∩ B̄) ∪ (Ā ∩ B)
c) Not more than one of the events A and B occurs: (A ∩ B̄) ∪ (Ā ∩ B) ∪ (Ā ∩ B̄)
d) If A occurs so does B: A ⊂ B
• Partition of events: A collectively exhaustive and mutually exclusive set of events A1, A2, ..., An forms a partition of the sample space S, and the sample space S may then be called an event space. For example, consider the rolling of a die:
S = {1, 2, 3, 4, 5, 6} and E1 = {2, 4, 6}, E2 = {1, 3, 5}. Then E1 and E2 are mutually exclusive and collectively exhaustive events which form a partition of the sample space S.
• Probability (Classical or Mathematical Probability): If a random experiment or a trial results in n exhaustive, mutually exclusive and equally likely outcomes or cases, out of which m are favourable to the occurrence of an event E, then the probability p of occurrence (or happening) of E, usually denoted by P(E), is given by
p = P(E) = (number of favourable cases) / (total number of exhaustive cases) = m/n
• Remark:
a) Since m ≥ 0, n > 0 and m ≤ n, we get from the above equation P(E) ≥ 0 and P(E) ≤ 1, i.e. 0 ≤ P(E) ≤ 1.
b) The non-happening of the event E is called the complementary event of E and is denoted by Ē or E^c.
c) P(Ē) = 1 − P(E)
Question 1: Find the probability of getting 2 heads when throwing 2 coins.
Solution: The sample space S is given as
S = {HH, HT, TH, TT}
Let E be the event of getting 2 heads in the experiment of throwing 2 coins.
E = {HH}
n(E) = 1 and n(S) = 4
so, P(E) = n(E)/n(S) = 1/4
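A minimal sketch (not part of the notes) showing how the classical definition P(E) = n(E)/n(S) can be checked by enumerating the sample space:

```python
# Enumerate the sample space of tossing 2 coins and count favourable cases
from itertools import product

S = list(product("HT", repeat=2))         # sample space: HH, HT, TH, TT
E = [s for s in S if s == ("H", "H")]     # event: both heads
p = len(E) / len(S)                       # P(E) = n(E) / n(S)
print(p)  # 0.25
```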
Question 2: Find the probability of having 53 Sundays in an ordinary (non-leap) year.
Solution: One year has 365 days, i.e. 52 complete weeks and 1 day over, which may be any one of Sunday, Monday, Tuesday, Wednesday, Thursday, Friday or Saturday.
Total possibilities = 7
Number of cases favourable for the day being Sunday = 1
P(53 Sundays) = 1/7
Question 3: What is the chance that a leap year selected at random will contain 53 Sundays?
Solution: In a leap year (which consists of 366 days), there are 52 complete weeks and 2 extra days. The possible combinations for the 2 extra days are:
a) Sunday and Monday b) Monday and Tuesday c) Tuesday and Wednesday d) Wednesday and Thursday e) Thursday and Friday f) Friday and Saturday g) Saturday and Sunday
In order that a leap year selected at random should contain 53 Sundays, one of the 2 extra days must be a Sunday. Out of the 7 possibilities above, 2 [a) and g)] are favourable.
Required probability = 2/7
• Remark: 1) If P(A) = m/(m + n), we say that the odds in favour of A are m : n, or that the odds against the event A are n : m. 2) When an event is certain to occur, its probability is 1. 3) Since any event imaginable is either certain, impossible or somewhere in between, it is reasonable to conclude that for any event A, 0 ≤ P(A) ≤ 1.
• Addition rule of probability: If E1 and E2 are 2 events of a sample space S, then the probability of occurrence of at least one of E1 and E2 is
P(E1 ∪ E2) = P(E1) + P(E2) − P(E1 ∩ E2)
PROOF: Counting each outcome once, n(E1 ∪ E2) = n(E1) + n(E2) − n(E1 ∩ E2), since outcomes in E1 ∩ E2 would otherwise be counted twice. Dividing both sides by n(S), we get
n(E1 ∪ E2)/n(S) = n(E1)/n(S) + n(E2)/n(S) − n(E1 ∩ E2)/n(S)
P(E1 ∪ E2) = P(E1) + P(E2) − P(E1 ∩ E2)
If E1 and E2 are mutually exclusive events, then P(E1 ∩ E2) = 0 and
P(E1 ∪ E2) = P(E1) + P(E2)
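The addition rule can be verified by direct counting on a small sample space; a sketch (the choice of events is illustrative, not from the notes):

```python
# Verify P(E1 ∪ E2) = P(E1) + P(E2) - P(E1 ∩ E2) on a single die roll
S = set(range(1, 7))
E1 = {2, 4, 6}                 # even number
E2 = {4, 5, 6}                 # number greater than 3
P = lambda E: len(E) / len(S)  # classical probability by counting

lhs = P(E1 | E2)
rhs = P(E1) + P(E2) - P(E1 & E2)
print(lhs, rhs)  # both 0.666...
```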
Question 4: Let E1 and E2 be any 2 events. Show that P(Ē1 ∩ E2) = P(E2) − P(E1 ∩ E2).
Solution: E2 = (E1 ∩ E2) ∪ (Ē1 ∩ E2), and the two sets on the right are mutually exclusive, so
P(E2) = P(E1 ∩ E2) + P(Ē1 ∩ E2)
P(Ē1 ∩ E2) = P(E2) − P(E1 ∩ E2)
• Probability of at least one event: If E1, E2, ..., En are events, then (E1 ∪ E2 ∪ ... ∪ En) together with its complement (Ē1 ∩ Ē2 ∩ ... ∩ Ēn) makes up S, so
P(at least one of E1, E2, ..., En) = 1 − P(Ē1 ∩ Ē2 ∩ ... ∩ Ēn)
Question: Two cards are drawn at random from a pack of 52 cards. Find the probability that both cards are red or both are queens.
Solution: Let E1 be the event that both cards are red and E2 the event that both cards are queens. Then,
P(E1) = 26C2 / 52C2; P(E2) = 4C2 / 52C2
And, since there are exactly 2 red queens, P(E1 ∩ E2) = 2C2 / 52C2
∴ P(E1 ∪ E2) = P(E1) + P(E2) − P(E1 ∩ E2)
= 325/1326 + 6/1326 − 1/1326
= 330/1326
P(E1 ∪ E2) = 55/221
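The combinatorial computation above can be checked with exact rational arithmetic; a sketch (not part of the notes):

```python
# Two cards from 52: P(both red or both queens)
from fractions import Fraction
from math import comb

p_red   = Fraction(comb(26, 2), comb(52, 2))  # both cards red
p_queen = Fraction(comb(4, 2), comb(52, 2))   # both cards queens
p_both  = Fraction(comb(2, 2), comb(52, 2))   # both red queens (2 exist)
p = p_red + p_queen - p_both                  # addition rule
print(p)  # 55/221
```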
• Conditional Probability: The probability of the happening of event E1 when it is known that E2 has already happened is called the conditional probability of E1 given E2, is denoted by P(E1|E2), and is defined as
P(E1|E2) = P(E1 ∩ E2) / P(E2); provided P(E2) ≠ 0
Question: If a die is thrown, what is the probability of occurrence of a number greater than 1, if it is known that only an odd number can come?
Solution: Let E1 be the event of getting a number greater than 1 and E2 the event of getting an odd number.
E1 = {2, 3, 4, 5, 6}; E2 = {1, 3, 5}; E1 ∩ E2 = {3, 5}
P(E2) = 3/6 = 1/2; P(E1 ∩ E2) = 2/6 = 1/3
∴ P(E1|E2) = P(E1 ∩ E2) / P(E2) = (1/3) / (1/2) = 2/3
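Since the outcomes are equally likely, the conditional probability reduces to a ratio of counts; a minimal sketch of the computation above:

```python
# P(E1 | E2) by counting: restrict the sample space to E2
S = {1, 2, 3, 4, 5, 6}
E1 = {x for x in S if x > 1}    # number greater than 1
E2 = {1, 3, 5}                  # odd number (the given condition)
p = len(E1 & E2) / len(E2)      # = n(E1 ∩ E2) / n(E2)
print(p)  # 0.666...
```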
• Multiplication Theorem of Probability: If E1 and E2 are any two events in the sample space S, then
P(E1 ∩ E2) = P(E1) · P(E2|E1) if P(E1) ≠ 0,
or P(E1 ∩ E2) = P(E2) · P(E1|E2) if P(E2) ≠ 0.
Question: An urn contains 10 black and 5 white balls. Two balls are drawn from the urn one after the other without replacement. What is the probability that both balls drawn are black?
Solution: Let E1 be the event of drawing a black ball first and E2 the event of drawing a black ball second.
P(E1) = 10/15
Given that the first ball drawn is black, 9 black balls remain among the 14 left in the urn, so P(E2|E1) = 9/14.
∴ The probability that both balls drawn are black is
P(E1 ∩ E2) = P(E1) · P(E2|E1) = (10/15) × (9/14) = 3/7
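The multiplication theorem applied to the urn can be checked exactly; a sketch (not part of the notes):

```python
# P(both black) without replacement = P(E1) * P(E2 | E1)
from fractions import Fraction as F

p_first = F(10, 15)              # 10 black out of 15 balls
p_second_given_first = F(9, 14)  # 9 black left among 14 balls
p_both = p_first * p_second_given_first
print(p_both)  # 3/7
```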
• Theorem of Total Probability: Let E1, E2, E3, ..., En be a partition of the sample space S such that P(Ei) ≠ 0 ∀ i. Then for any event E in S,
P(E) = Σᵢ₌₁ⁿ P(Ei) · P(E|Ei)
= P(E1)·P(E|E1) + P(E2)·P(E|E2) + ... + P(En)·P(E|En)
• Bayes' Theorem:
If E1, E2, E3, ..., En are mutually exclusive events of a random experiment with P(Ei) ≠ 0 (i = 1, 2, 3, ..., n) whose union is the sample space S, then for any event E associated with (or caused by) the Ei with P(E) > 0, we have
P(Ei|E) = P(Ei) · P(E|Ei) / Σⱼ₌₁ⁿ P(Ej) · P(E|Ej); for each i = 1, 2, 3, ..., n
Question: Bag 1 contains 3 red and 4 black balls while Bag 2 contains 5 red and 6 black balls. One ball is drawn at random from one of the bags and it is found to be red. Find the probability that it was drawn from Bag 2.
Solution: Let E1 be the event that the ball is drawn from Bag 1 and E2 the event that it is drawn from Bag 2.
P(E1) = 1/2; P(E2) = 1/2
Let E be the event of drawing a red ball. Then the probability of drawing a red ball given that it is drawn from Bag 1 is
P(E|E1) = 3/7
Similarly,
P(E|E2) = 5/11
Then by Bayes' theorem,
P(E2|E) = P(E2)·P(E|E2) / [P(E1)·P(E|E1) + P(E2)·P(E|E2)]
= (1/2 × 5/11) / (1/2 × 3/7 + 1/2 × 5/11)
= (5/11) / (68/77)
= 35/68
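Bayes' theorem for the two bags can be verified with exact fractions; a sketch (not part of the notes):

```python
# Posterior P(Bag 2 | red) via Bayes' theorem
from fractions import Fraction as F

prior = [F(1, 2), F(1, 2)]   # P(Bag 1), P(Bag 2)
like  = [F(3, 7), F(5, 11)]  # P(red | Bag 1), P(red | Bag 2)
evid  = sum(p * l for p, l in zip(prior, like))  # total probability of red
post2 = prior[1] * like[1] / evid
print(post2)  # 35/68
```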
• Random Variable: A random variable X is a function X : S → R that assigns a real number X(s) to each s ∈ S (the sample space), corresponding to a random experiment E. Hence the domain of a random variable is S. Random variables are generally denoted by capital letters such as X, Y, Z, and the values they take by lowercase letters x, y, z. For example: a) A drug is given to two sick patients. Let the random variable X represent the number of cures that occur; then X = 0, 1, 2. b) A single fair die is rolled and the random variable X represents the number that turns up; X can take the values 1, 2, 3, 4, 5 and 6.
• Discrete Random Variable: has either a finite or a countably infinite number of values. Examples: a) the number shown when a die is thrown; b) the number of complaints received at the office of an airline on a given day.
• Continuous Random Variable: a random variable which can take an infinite number of values in an interval is known as a continuous random variable. Examples: a) the height of a person; b) the weight of a fish.
• Probability Distribution of a Discrete Random Variable: The probability distribution of a discrete random variable lists all the possible values that the random variable can assume together with their probabilities. If a random variable assumes values x0, x1, x2, ..., xn with probabilities p0, p1, p2, ..., pn, then the probability distribution is:
X = x:  x0   x1   ...  xn
P(x):   p0   p1   ...  pn
For example, when rolling a die the probability distribution is:
X = x:  1    2    3    4    5    6
P(x):  1/6  1/6  1/6  1/6  1/6  1/6
• Probability Mass Function: Let X be a discrete random variable such that P(X = xi) = pi. Then pi is called a probability mass function (PMF) if it satisfies the following conditions: a) pi ≥ 0 ∀ i, b) Σᵢ pi = 1.
Question: Check whether the following functions serve as PMFs:
a) P(X = x) = (x − 2)/2 ∀ x = 1, 2, 3, 4
b) P(X = x) = x²/30 ∀ x = 1, 2, 3, 4
Solution: a) P(X = x) = (x − 2)/2 ∀ x = 1, 2, 3, 4
P(X = 1) = (1 − 2)/2 = −1/2
P(X = 2) = (2 − 2)/2 = 0
P(X = 3) = (3 − 2)/2 = 1/2
P(X = 4) = (4 − 2)/2 = 1
∴ Σᵢ P(X = xi) = P(X = 1) + P(X = 2) + P(X = 3) + P(X = 4)
= −1/2 + 0 + 1/2 + 1
= 1
Nevertheless it is not a PMF, since P(X = 1) = −1/2 < 0.
b) P(X = x) = x²/30 ≥ 0 for x = 1, 2, 3, 4, and Σ P(X = x) = (1 + 4 + 9 + 16)/30 = 1, so it is a PMF.
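Both PMF conditions are easy to check mechanically; a minimal sketch (the helper name is mine, not from the notes):

```python
# Check the two PMF conditions: p_i >= 0 for all i, and sum(p_i) = 1
def is_pmf(probs, tol=1e-12):
    return all(p >= 0 for p in probs) and abs(sum(probs) - 1) < tol

a_probs = [(x - 2) / 2 for x in range(1, 5)]  # -0.5, 0.0, 0.5, 1.0
b_probs = [x**2 / 30 for x in range(1, 5)]    # 1/30, 4/30, 9/30, 16/30
print(is_pmf(a_probs), is_pmf(b_probs))  # False True
```

Note that a_probs sums to 1 but still fails, because one of its values is negative.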
• Probability Density Function: A function f(x) of a continuous random variable X is said to be a PDF if
a) f(x) ≥ 0
b) ∫_{−∞}^{∞} f(x) dx = 1
• Remark: 1) P(a ≤ X ≤ b) = ∫_a^b f(x) dx
2) P(a ≤ X ≤ b) = P(a < X ≤ b) = P(a ≤ X < b) = P(a < X < b)
Question: The diameter of an electric cable is assumed to be a continuous random variable with PDF f(x) = 6x(1 − x), 0 < x < 1.
a) Check that the given f(x) is a PDF. b) Determine the number b such that P(X < b) = P(X > b).
Solution: f(x) = 6x(1 − x); 0 < x < 1
a) For 0 < x < 1, f(x) ≥ 0, and
∫₀¹ f(x) dx = ∫₀¹ 6x(1 − x) dx = [3x² − 2x³]₀¹ = 3 − 2 = 1
So it is a PDF.
b) P(X < b) = P(X > b)
∫₀^b 6x(1 − x) dx = ∫_b^1 6x(1 − x) dx
[3x² − 2x³]₀^b = [3x² − 2x³]_b^1
3b² − 2b³ = (3 − 2) − (3b² − 2b³)
2(3b² − 2b³) = 1
4b³ − 6b² + 1 = 0
(2b − 1)(2b² − 2b − 1) = 0
b = 1/2 or b = (2 ± √(4 + 8))/4 = (1 ± √3)/2
(1 + √3)/2 > 1 and (1 − √3)/2 < 0, so these values are not acceptable since 0 < x < 1.
Hence b = 1/2.
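Both parts can be verified from the antiderivative; a sketch (the function name `cdf` is mine, not from the notes):

```python
# CDF of f(x) = 6x(1-x) on (0,1): integral from 0 to b is 3b^2 - 2b^3
def cdf(b):
    return 3 * b**2 - 2 * b**3

print(cdf(1.0))   # 1.0  -> total area is 1, so f is a PDF
print(cdf(0.5))   # 0.5  -> b = 1/2 splits the area equally
```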
• Expectation: The expectation of a random variable X is defined as
E(X) = Σᵢ xᵢ pᵢ if X is a discrete random variable,
E(X) = ∫_{−∞}^{∞} x f(x) dx if X is a continuous random variable.
Question: Find the expectation of the number on a die when thrown.
Solution: We have xi = 1, 2, 3, 4, 5, 6 with pi = 1/6 for each value.
∴ E(X) = 1×(1/6) + 2×(1/6) + 3×(1/6) + 4×(1/6) + 5×(1/6) + 6×(1/6)
E(X) = 21/6 = 7/2
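The discrete expectation formula E(X) = Σ xᵢpᵢ translates directly to code; a minimal sketch:

```python
# Expectation of a fair die: sum of value * probability
xs = range(1, 7)
E = sum(x * (1 / 6) for x in xs)
print(E)  # 3.5
```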
• Bernoulli Trials: Trials of a random experiment are called Bernoulli trials if they satisfy the following conditions: 1) there is a finite number of trials; 2) the trials are independent; 3) each trial has two outcomes, success or failure; 4) the probability of success remains the same in each trial.
• Binomial Distribution: Let an experiment be repeated n times. Each trial is independent and has two outcomes, success (S) or failure (F). The probability of success p and of failure q = 1 − p is constant for each trial. A random variable X (denoting the number of successes) associated with this experiment is said to follow the binomial distribution if its probability mass function is given by
P(r) = P(X = r) = nCr p^r q^(n−r); r = 0, 1, 2, ..., n
The two independent constants n and p in the distribution are known as the parameters of the distribution. Hence X is a binomial variate and we use the notation X ~ B(n, p). Also, here p + q = 1.
Proof: We consider n independent Bernoulli trials. If r successes occur, there are (n − r) failures. One such event may be SSFSFFFS...FSF, where S denotes success and F denotes failure. Then
P(SSFSFFFS...FSF) = P(S)·P(S)·P(F)·P(S)·P(F)·P(F)·P(F)·P(S)·...·P(F)
= (p·p·...·p)(q·q·...·q)   [p taken r times, q taken (n − r) times]
= p^r q^(n−r)
But these r successes in n trials can occur in nCr mutually exclusive ways. Hence,
P(r successes) = nCr p^r q^(n−r); r = 0, 1, 2, ..., n
Result:
1) The above P(X = r) = nCr p^r q^(n−r) is a PMF:
a) Clearly P(X = r) ≥ 0 for all r = 0, 1, 2, ..., n, since p, q ≥ 0.
b) Σ_{r=0}^{n} P(X = r) = Σ_{r=0}^{n} nCr p^r q^(n−r)
= nC0 p⁰ qⁿ + nC1 p¹ qⁿ⁻¹ + ... + nCn pⁿ qⁿ⁻ⁿ
= qⁿ + npqⁿ⁻¹ + ... + pⁿ
= (q + p)ⁿ
= 1ⁿ   [since p + q = 1]
= 1
c) If we assume that n trials constitute a set, and we consider N such sets, then the expected number of sets in which we get exactly r successes is N·nCr p^r q^(n−r); r = 0, 1, 2, ..., n.
• Mean: Mean (μ) = E(X) = Σ_{r=0}^{n} r P(r)
= Σ_{r=1}^{n} r · nCr p^r q^(n−r)
= Σ_{r=1}^{n} r · n!/(r!(n−r)!) · p^r q^(n−r)
= Σ_{r=1}^{n} r · n!/(r·(r−1)!(n−r)!) · p^r q^(n−r)
= Σ_{r=1}^{n} n·(n−1)!/((r−1)!·((n−1)−(r−1))!) · p · p^(r−1) q^((n−1)−(r−1))
= np Σ_{r=1}^{n} (n−1)C(r−1) p^(r−1) q^((n−1)−(r−1))
= np [(n−1)C0 p⁰ qⁿ⁻¹ + (n−1)C1 p¹ qⁿ⁻² + ... + (n−1)C(n−1) pⁿ⁻¹ q⁰]
= np (p + q)^(n−1)
= np
• Variance: Var(X) = E(X²) − [E(X)]² = Σ_{r=0}^{n} r² P(r) − (np)²
E(X²) = Σ_{r=0}^{n} r² P(r) = Σ_{r=0}^{n} [r(r−1) + r] P(r)
= Σ_{r=0}^{n} r(r−1) P(r) + Σ_{r=0}^{n} r P(r)
= Σ_{r=2}^{n} r(r−1) P(r) + np
Now,
Σ_{r=2}^{n} r(r−1) P(r) = Σ_{r=2}^{n} r(r−1) · nCr p^r q^(n−r)
= Σ_{r=2}^{n} r(r−1) · n!/(r!(n−r)!) · p^r q^(n−r)
= Σ_{r=2}^{n} n(n−1)·(n−2)!/((r−2)!·((n−2)−(r−2))!) · p² · p^(r−2) q^((n−2)−(r−2))
= n(n−1)p² Σ_{r=2}^{n} (n−2)C(r−2) p^(r−2) q^((n−2)−(r−2))
= n(n−1)p² (p + q)^(n−2)
= n(n−1)p²
∴ Var(X) = n(n−1)p² + np − n²p² = n²p² − np² + np − n²p²
= np(1 − p) = npq   [∵ p + q = 1]
Question: Out of 800 families with 4 children each, how many families would be expected to have
a) 2 boys and 2 girls b) at least 1 boy c) at most 2 girls d) children of both sexes?
Assume equal probability for boys and girls.
Solution: The probabilities for a boy and a girl are equal, i.e.
p = q = 1/2
N = 800; n = 4
Let the random variable X denote the number of boys in a family, so P(X = r) = 4Cr p^r q^(4−r).
a) Expected number of families with 2 boys and 2 girls:
N·P(X = 2) = 800 × 4C2 × (0.5)² × (0.5)² = 800 × 6 × (0.5)⁴ = 300
b) At least 1 boy:
N·P(X ≥ 1) = 800[4C1 p¹q³ + 4C2 p²q² + 4C3 p³q¹ + 4C4 p⁴q⁰]
= 800[4 + 6 + 4 + 1] × (0.5)⁴ = 800 × 15/16 = 750
Number of families having at least 1 boy = 750
c) At most 2 girls means at least 2 boys:
N·P(X ≥ 2) = 800[4C2 p²q² + 4C3 p³q¹ + 4C4 p⁴q⁰]
= 800[6 + 4 + 1] × (0.5)⁴ = 800 × 11/16 = 550
Number of families having at most 2 girls = 550
d) Children of both sexes (at least 1 boy and at least 1 girl):
N·P(1 ≤ X ≤ 3) = 800[4C1 p¹q³ + 4C2 p²q² + 4C3 p³q¹]
= 800[4 + 6 + 4] × (0.5)⁴ = 800 × 14/16 = 700
Number of families having children of both sexes = 700
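All four expected counts follow from the same binomial PMF; a sketch (not part of the notes):

```python
# Expected family counts from the binomial PMF P(r) = nCr p^r q^(n-r)
from math import comb

N, n, p = 800, 4, 0.5
P = lambda r: comb(n, r) * p**r * (1 - p)**(n - r)

print(round(N * P(2)))                            # 300: 2 boys and 2 girls
print(round(N * sum(P(r) for r in range(1, 5))))  # 750: at least 1 boy
print(round(N * sum(P(r) for r in range(2, 5))))  # 550: at most 2 girls
print(round(N * sum(P(r) for r in range(1, 4))))  # 700: children of both sexes
```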
• Fitting of a Binomial Distribution (recurrence relation for the probabilities of the binomial distribution):
P(r+1)/P(r) = [nC(r+1) p^(r+1) q^(n−(r+1))] / [nCr p^r q^(n−r)]
= n!/((r+1)!(n−r−1)!) × r!(n−r)!/n! × p/q
= (n−r)/(r+1) × p/q
∴ P(r+1) = (n−r)/(r+1) × (p/q) × P(r)
Question: The following data give the number of seeds germinating out of 10 on damp filter paper for 80 sets of seeds. Fit a binomial distribution to these data.

No. of seeds (x): 0   1   2   3   4   5   6 and above
No. of sets (f):  6  20  28  12   8   6   0

Solution: Mean = Σ fᵢxᵢ / Σ fᵢ
fᵢxᵢ: 0, 20, 56, 36, 32, 30, 0; total = 174
Mean = 174/80 = 2.175
∴ np = 2.175
Also, n = 10, so p = 0.2175 and q = 1 − p = 1 − 0.2175 = 0.7825.

r   P(r)                                          f(r) = N·P(r) = 80·P(r)
0   P(0) = 10C0 (0.2175)⁰(0.7825)¹⁰ = 0.08607     f(0) = 6.89 ≅ 6.9
1   P(1) = 0.2392                                 f(1) = 19.14 ≅ 19.1
2   P(2) = 0.2992                                 f(2) = 23.94 ≅ 24
3   P(3) = 0.2217                                 f(3) = 17.74
4   P(4) = 0.1078                                 f(4) = 8.6
5   P(5) = 0.03598                                f(5) = 2.88
6   P(6) = 0.008335                               f(6) = 0.67 ≅ 0.7

Hence, the theoretical frequency distribution obtained by fitting the binomial distribution is:
x:  0    1     2   3      4    5     6
f:  6.9  19.1  24  17.74  8.6  2.88  0.7
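The fitted frequencies can be generated from P(0) alone using the recurrence relation; a sketch under the fitted parameters above:

```python
# Fit via the recurrence P(r+1) = (n-r)/(r+1) * (p/q) * P(r)
n, p, N = 10, 0.2175, 80
q = 1 - p

P = [q**n]  # P(0) = q^n, about 0.08607
for r in range(6):
    P.append((n - r) / (r + 1) * (p / q) * P[r])

freq = [round(N * pr, 2) for pr in P]  # theoretical frequencies
print(freq)
```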
• Poisson Distribution: It was discovered by the French mathematician S. D. Poisson in 1837. It is a distribution related to the probabilities of events which are extremely rare but which have a large number of independent opportunities for occurrence. A discrete random variable X, which can take the values 0, 1, 2, ..., is said to follow the Poisson distribution if its PMF is given by
P(X = r) = e^(−m) m^r / r!; r = 0, 1, 2, 3, ...
m > 0 is said to be the parameter of the distribution, and we say X ~ P(m) is a Poisson variate. The above expression is a PMF since:
a) P(X = r) = e^(−m) m^r / r! ≥ 0 ∀ r = 0, 1, 2, ..., where m > 0
b) Σ_{r=0}^{∞} P(X = r) = Σ_{r=0}^{∞} e^(−m) m^r / r!
= e^(−m) Σ_{r=0}^{∞} m^r / r!
= e^(−m) [1 + m + m²/2! + m³/3! + ...]
= e^(−m) e^m = 1
Proof: The Poisson distribution can be derived as a limiting case of the binomial distribution under the conditions n → ∞, p → 0 and np = m (finite). According to the binomial distribution, the probability of r successes in n independent trials is given by
P(X_B = r) = nCr p^r q^(n−r), r = 0, 1, 2, 3, ..., n
= n!/(r!(n−r)!) · p^r (1 − p)^(n−r)
= [n(n−1)(n−2)...(n−(r−1))·(n−r)!] / (r!(n−r)!) · p^r (1 − p)^(n−r)
= [np(np − p)...(np − (r−1)p)] / r! · (1 − p)^n / (1 − p)^r
Now applying the limits n → ∞, p → 0 with np = m (finite), each factor in the numerator tends to m and (1 − p)^r → 1, so we get
P(X_P = r) = m^r / r! · lim_{n→∞} (1 − m/n)^n __________ (1)
Now let u = (1 − m/n)^n. Taking logs on both sides,
log u = n log(1 − m/n)
= n [−m/n − m²/(2n²) − m³/(3n³) − ...]   [∵ log(1 − x) = −x − x²/2 − x³/3 − ...]
= −m − m²/(2n) − m³/(3n²) − ...
∴ lim_{n→∞, p→0} u = e^(−m)
Now from equation (1) we get
P(X_P = r) = e^(−m) m^r / r!, r = 0, 1, 2, ...
• Mean: μ = E(X) = Σ_{r=0}^{∞} r P(r)
= Σ_{r=0}^{∞} r · e^(−m) m^r / r!
= e^(−m) Σ_{r=1}^{∞} m^r / (r−1)!
= m e^(−m) Σ_{r=1}^{∞} m^(r−1) / (r−1)!
= m e^(−m) [1 + m + m²/2! + m³/3! + ...]
= m e^(−m) e^m
= m
• Variance: Var(X) = E(X²) − [E(X)]² = E(X²) − m²
E(X²) = Σ_{r=0}^{∞} r² P(r) = Σ_{r=0}^{∞} [r(r−1) + r] · e^(−m) m^r / r!
= Σ_{r=2}^{∞} r(r−1) · e^(−m) m^r / r! + Σ_{r=0}^{∞} r · e^(−m) m^r / r!
= e^(−m) m² Σ_{r=2}^{∞} m^(r−2) / (r−2)! + m
= e^(−m) m² [1 + m + m²/2! + ...] + m
= e^(−m) m² e^m + m
= m² + m
∴ Var(X) = m² + m − m²
Var(X) = m
Question: Find the probability that at most 5 defective fuses will be found in a box of 200 fuses, if experience shows that 2% of such fuses are defective.
Solution: Let X denote the number of defective fuses.
n = 200
p = 2% = 0.02
m = np = 200 × 0.02 = 4
∴ P(X = r) = e^(−4) 4^r / r!
P(X ≤ 5) = e^(−4) [4⁰/0! + 4¹/1! + 4²/2! + 4³/3! + 4⁴/4! + 4⁵/5!]
= e^(−4) [1 + 4 + 8 + 64/6 + 256/24 + 1024/120]
= e^(−4) × 42.87
= 0.7851
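The Poisson tail sum above is easy to verify; a minimal sketch (not part of the notes):

```python
# P(X <= 5) for a Poisson variate with m = np = 200 * 0.02 = 4
from math import exp, factorial

m = 200 * 0.02
p = sum(exp(-m) * m**r / factorial(r) for r in range(6))
print(round(p, 4))  # 0.7851
```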
• Normal Distribution:
It is a continuous distribution. It was discovered by the French mathematician De Moivre in 1733 as the limiting case of the binomial distribution under the following conditions:
1) n, the number of independent trials, is infinitely large;
2) neither p nor q is very small.
Definition:
A continuous random variable X is said to follow the normal distribution with parameters μ (mean) and σ (standard deviation) if its PDF is given by
f(x) = 1/(σ√(2π)) · e^(−(x−μ)²/(2σ²))
where −∞ < x < ∞; −∞ < μ < ∞; σ > 0
To prove f(x) is a PDF:
a) f(x) ≥ 0 ∀ −∞ < x < ∞; σ > 0
b) ∫_{−∞}^{∞} f(x) dx = ∫_{−∞}^{∞} 1/(σ√(2π)) · e^(−(x−μ)²/(2σ²)) dx
Put z = (x − μ)/σ, so dx = σ dz:
= 1/√(2π) ∫_{−∞}^{∞} e^(−z²/2) dz
= 2/√(2π) ∫_{0}^{∞} e^(−z²/2) dz   [∵ the integrand is an even function]
Put z²/2 = t, so z = √(2t) and dz = dt/√(2t):
= 2/√(2π) ∫_{0}^{∞} e^(−t) · dt/√(2t)
= 1/√π ∫_{0}^{∞} e^(−t) t^(1/2 − 1) dt
= 1/√π · Γ(1/2)
= 1/√π · √π
= 1
• Properties of the Normal Probability Curve:
[Figure: the bell-shaped normal probability curve N(μ, σ²), symmetric about x = μ]
The above is the bell-shaped curve, called the normal probability curve. Its properties are:
a) The bell-shaped curve is symmetrical about the mean x = μ. As x increases numerically, f(x) decreases rapidly, and the maximum value of f(x) occurs at x = μ.
b) Mean, median and mode of the distribution coincide.
c) The x-axis is an asymptote to the curve.
d) The points of inflexion are at x = μ ± σ. If σ is relatively large, the curve tends to be flat; if σ is small, the curve tends to be peaked.
e) As f(x) ≥ 0, no portion of the curve lies below the x-axis, and ∫_{−∞}^{∞} f(x) dx = 1 implies that the area lying under the normal probability curve is unity.
• Standard Form of the Normal Distribution: z = (x − μ)/σ ⇒ x = σz + μ
The random variable z has mean 0 and standard deviation 1, and is known as the standard normal variate: z ~ N(0, 1).
To find P(x1 < X < x2):
1) Obtain z1 = (x1 − μ)/σ and z2 = (x2 − μ)/σ.
2) Then P(x1 < X < x2) = P(z1 < z < z2).
3) Convert P(z1 < z < z2) into the form P(0 < z < z1), P(0 < z < z2), or a combination of both.
4) Use the standard normal table to find P(0 < z < z0).
• Mean: Mean = E(X) = ∫_{−∞}^{∞} x f(x) dx = 1/(σ√(2π)) ∫_{−∞}^{∞} x e^(−(x−μ)²/(2σ²)) dx
Put z = (x − μ)/σ, so x = μ + σz and dx = σ dz:
= 1/√(2π) ∫_{−∞}^{∞} (μ + σz) e^(−z²/2) dz
= μ · 1/√(2π) ∫_{−∞}^{∞} e^(−z²/2) dz + σ/√(2π) ∫_{−∞}^{∞} z e^(−z²/2) dz
The first integral equals 1 (as shown while proving f(x) is a PDF), and the second vanishes since its integrand is an odd function.
∴ Mean = μ
• Variance:
Var(X) = E[(X − μ)²] = 1/(σ√(2π)) ∫_{−∞}^{∞} (x − μ)² e^(−(x−μ)²/(2σ²)) dx
Put z = (x − μ)/σ, so dx = σ dz:
= σ²/√(2π) ∫_{−∞}^{∞} z² e^(−z²/2) dz
= 2σ²/√(2π) ∫_{0}^{∞} z² e^(−z²/2) dz   [∵ the integrand is an even function]
Put z²/2 = t, so z = √(2t) and dz = dt/√(2t):
= 2σ²/√(2π) ∫_{0}^{∞} 2t e^(−t) · dt/√(2t)
= 2σ²/√π ∫_{0}^{∞} t^(3/2 − 1) e^(−t) dt
= 2σ²/√π · Γ(3/2)
= 2σ²/√π · (√π/2)
= σ²
Question: The distribution of weekly wages for 500 workers in a factory is approximately normal with mean Rs. 75 and standard deviation Rs. 15. Find the number of workers who receive weekly wages
a) more than Rs. 90 b) less than Rs. 45
Solution: Let X denote the weekly wage of a worker.
N = 500; μ = 75; σ = 15
And z = (x − μ)/σ
a) For x = 90, z = (90 − 75)/15 = 1
P(X > 90) = P(z > 1)
= P(z > 0) − P(0 < z < 1)
= 0.5 − 0.3413
= 0.1587
Number of workers with weekly wages greater than Rs. 90 = 500 × 0.1587 ≈ 79
b) For x = 45, z = (45 − 75)/15 = −2
P(X < 45) = P(z < −2) = P(z > 2)
= P(0 < z < ∞) − P(0 < z < 2)
= 0.5 − 0.4772
= 0.0228
Number of workers with weekly wages less than Rs. 45 = 500 × 0.0228 ≈ 11
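Instead of reading P(0 < z < z0) from a table, the standard normal CDF can be computed from the error function; a sketch (not part of the notes):

```python
# Worker counts via the standard normal CDF Phi(z) = (1 + erf(z/sqrt(2)))/2
from math import erf, sqrt

mu, sigma, N = 75, 15, 500
Phi = lambda z: 0.5 * (1 + erf(z / sqrt(2)))

more_than_90 = N * (1 - Phi((90 - mu) / sigma))  # P(X > 90) scaled by N
less_than_45 = N * Phi((45 - mu) / sigma)        # P(X < 45) scaled by N
print(round(more_than_90), round(less_than_45))  # 79 11
```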
Thanks