Chernoff Bounds, etc. Presented by Kwak, Nam-ju
Posted on 17-Dec-2015
TRANSCRIPT
Topics
- A General Form of Chernoff Bounds
- Brief Idea of Proof for the General Form of Chernoff Bounds
- Tighter Forms of Chernoff Bounds
- Application of Chernoff Bounds: Amplification Lemma of Randomized Algorithm Studies
- Chebyshev's Inequality
- Application of Chebyshev's Inequality
- Other Considerations
A General Form of Chernoff Bounds
Assumptions:
- Xi's: random variables where Xi ∈ {0, 1} and 1 ≤ i ≤ n.
- P(Xi = 1) = pi and therefore E[Xi] = pi.
- X: the sum of the n independent random variables, that is, X = X1 + X2 + ... + Xn.
- μ: the mean of X, that is, μ = E[X] = p1 + p2 + ... + pn.
A General Form of Chernoff Bounds
When δ > 0:
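The bound itself was an image in the original slides and did not survive the transcript; the standard general form it refers to is:

$$\Pr[X \ge (1+\delta)\mu] \le \left( \frac{e^{\delta}}{(1+\delta)^{1+\delta}} \right)^{\mu}$$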
Brief Idea of Proof for General Form of Chernoff Bounds
Necessary Background: Markov's Inequality
For any random variable X ≥ 0 and any a > 0, P(X ≥ a) ≤ E[X]/a.
When f is a non-negative, non-decreasing function, P(X ≥ a) ≤ P(f(X) ≥ f(a)) ≤ E[f(X)]/f(a).
Necessary Background: Upper Bound of the M.G.F.
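The bound is missing from the transcript; using E[e^{tX_i}] = 1 + p_i(e^t − 1) for the 0/1 variables defined earlier, together with 1 + x ≤ e^x, the standard derivation is:

$$E[e^{tX}] = \prod_{i=1}^{n} E[e^{tX_i}] = \prod_{i=1}^{n} \left( 1 + p_i(e^t - 1) \right) \le \prod_{i=1}^{n} e^{p_i(e^t - 1)} = e^{(e^t - 1)\mu}$$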
Proof of One General Case
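The proof steps were images in the original; a reconstruction from Markov's inequality applied with the non-decreasing function f(x) = e^{tx} (t > 0), together with the M.G.F. bound E[e^{tX}] ≤ e^{(e^t − 1)μ}:

$$\Pr[X \ge (1+\delta)\mu] = \Pr[e^{tX} \ge e^{t(1+\delta)\mu}] \le \frac{E[e^{tX}]}{e^{t(1+\delta)\mu}} \le \frac{e^{(e^t - 1)\mu}}{e^{t(1+\delta)\mu}}$$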
Here, substitute the value of t that minimizes the above expression, as follows:
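The chosen value of t was not preserved; setting the derivative of the exponent (e^t − 1)μ − t(1+δ)μ to zero gives the minimizer:

$$\mu e^{t} - (1+\delta)\mu = 0 \quad\Longrightarrow\quad t = \ln(1+\delta) > 0$$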
As a result,
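The final expression was an image in the original; substituting t = ln(1+δ) yields the general Chernoff bound:

$$\Pr[X \ge (1+\delta)\mu] \le \frac{e^{\delta\mu}}{(1+\delta)^{(1+\delta)\mu}} = \left( \frac{e^{\delta}}{(1+\delta)^{1+\delta}} \right)^{\mu}$$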
Tighter Forms of Chernoff Bounds
The form just introduced places no restriction on δ other than that it be positive. When we restrict the range of values δ can take, we obtain tighter versions of the Chernoff bounds.
Tighter Forms of Chernoff Bounds
When 0 < δ < 1, compare the resulting bounds with the upper bound of the general case:
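The bounds themselves did not survive the transcript; the standard tight forms for 0 < δ < 1 are

$$\Pr[X \ge (1+\delta)\mu] \le e^{-\mu\delta^{2}/3}, \qquad \Pr[X \le (1-\delta)\mu] \le e^{-\mu\delta^{2}/2}$$

while the general case gives only $\left( e^{\delta} / (1+\delta)^{1+\delta} \right)^{\mu}$.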
Application of Chernoff Bounds: Amplification Lemma of Randomized Algorithm Studies
A probabilistic Turing machine is a nondeterministic Turing machine in which each nondeterministic step has exactly two choices (a coin-flip step).
Error probability: the probability that the probabilistic TM produces a wrong answer on a given trial.
Class BPP: the set of languages that can be recognized by polynomial-time probabilistic Turing machines with an error probability of at most 1/3.
However, even if the error probability is greater than 1/3, as long as it lies strictly between 0 and 1/2, the language still belongs to BPP.
By the amplification lemma, we can construct an alternative probabilistic Turing machine recognizing the same language with an error probability of 2^(-a), where a is any desired value. By adjusting the value of a, the error probability can be made less than or equal to 1/3.
How to construct the alternative TM? (For a given input x)
1. Select the value of k.
2. Simulate the original TM 2k times.
3. If more than k simulations result in accept, accept; otherwise, reject.
Now we prove that this works.
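The three steps above can be sketched as follows. Here `base_tm` is a hypothetical stand-in for the original probabilistic TM (a function that returns True for accept and answers wrongly with probability ε < 1/2); it is not part of the original slides.

```python
import random

def amplified_tm(base_tm, x, k):
    """Simulate the base machine 2k times on input x and take a majority vote."""
    accepts = sum(1 for _ in range(2 * k) if base_tm(x))
    # Accept iff more than k of the 2k simulations accepted.
    return accepts > k
```

Because the wrong answers among the 2k runs are independent, the analysis below bounds the probability that they form a majority.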
Xi's: 1 if the i-th simulation produces a wrong answer; otherwise, 0.
X: the sum of the 2k Xi's, which is the number of wrongly answered simulations among the 2k.
ε: the error probability. Then X ~ B(2k, ε) and μ = E[X] = 2kε.
P(X > k): the probability that more than half of the 2k simulations give a wrong answer.
We will show that P(X > k) can be made less than 2^(-a) for any a, by choosing k appropriately.
Here we set δ as follows:
Therefore, by the Chernoff Bounds,
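The chosen δ and the resulting bound were images in the original; a reconstruction consistent with μ = 2kε is to pick δ so that (1+δ)μ = k:

$$\delta = \frac{1}{2\varepsilon} - 1 = \frac{1 - 2\varepsilon}{2\varepsilon}$$

so that, by the general Chernoff bound,

$$\Pr[X > k] \le \Pr[X \ge (1+\delta)\mu] \le \left( \frac{e^{\delta}}{(1+\delta)^{1+\delta}} \right)^{2k\varepsilon}$$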
To make the upper bound less than or equal to 2^(-a),
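The condition on k was not preserved in the transcript; taking base-2 logarithms of the bound gives:

$$\left( \frac{e^{\delta}}{(1+\delta)^{1+\delta}} \right)^{2k\varepsilon} \le 2^{-a} \iff k \ge \frac{a}{2\varepsilon \log_{2} \dfrac{(1+\delta)^{1+\delta}}{e^{\delta}}}$$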
Here, we can guarantee that the right-hand term is positive when 0 < ε < 1/2, since δ > 0 then holds and (1+δ)^(1+δ) > e^δ for every δ > 0.
Chebyshev's Inequality
For a random variable X of any probability distribution with mean μ and standard deviation σ, P(|X − μ| ≥ kσ) ≤ 1/k².
To derive the inequality, apply Markov's inequality.
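The derivation was not preserved; applying Markov's inequality to the non-negative random variable (X − μ)²:

$$\Pr[|X - \mu| \ge k\sigma] = \Pr[(X - \mu)^{2} \ge k^{2}\sigma^{2}] \le \frac{E[(X - \mu)^{2}]}{k^{2}\sigma^{2}} = \frac{\sigma^{2}}{k^{2}\sigma^{2}} = \frac{1}{k^{2}}$$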
Application of Chebyshev's Inequality
- Use of the Chebyshev Inequality to Calculate 95% Upper Confidence Limits for DDT-Contaminated Soil Concentrations at a Superfund Site.
- Using Chebyshev's Inequality to Determine Sample Size in Biometric Evaluation of Fingerprint Data.
Application of Chebyshev's Inequality
For illustration, assume we have a large body of text, for example articles from a publication. Assume we know that the articles are on average 1000 characters long with a standard deviation of 200 characters. From Chebyshev's inequality we can then deduce that at least 75% of the articles have a length between 600 and 1400 characters (k = 2).
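A numerical check of this example. The bound is Chebyshev's guarantee; the simulated lengths use an assumed normal distribution (the actual article-length distribution is unknown), which illustrates that the guarantee is conservative.

```python
import random

def chebyshev_lower_bound(k):
    """Fraction of mass guaranteed within k standard deviations of the mean."""
    return 1 - 1 / k**2

mean, sd = 1000, 200
k = 2
low, high = mean - k * sd, mean + k * sd  # the interval [600, 1400]
guarantee = chebyshev_lower_bound(k)      # at least 0.75 of articles
```

Any distribution with this mean and standard deviation must put at least 75% of its mass in [600, 1400]; a normal distribution puts roughly 95% there.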
Other Considerations
The only restriction Markov's Inequality imposes is that X be non-negative. It does not even matter whether the standard deviation is finite.
e.g., a random variable X whose P.D.F. gives a finite mean but an infinite standard deviation.
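The slide's density was not preserved; a standard example with this property (an assumed stand-in, not necessarily the presenter's) is the Pareto-type density

$$f(x) = \frac{2}{x^{3}} \text{ for } x \ge 1, \qquad 0 \text{ otherwise}$$

for which $E[X] = \int_{1}^{\infty} x \cdot \frac{2}{x^{3}} \, dx = 2$ is finite, but $E[X^{2}] = \int_{1}^{\infty} x^{2} \cdot \frac{2}{x^{3}} \, dx$ diverges, so the variance and standard deviation are infinite.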
Conclusion
Chernoff bounds provide relatively good upper bounds without too many restrictions.
With known mean and standard deviation, Chebyshev's Inequality gives an upper bound on the probability that a random variable deviates from its mean by at least a fixed distance.
Any questions?