Vapnik-Chervonenkis Dimension, Part II: Lower and Upper Bounds
Post on 15-Jan-2016
Vapnik-Chervonenkis Dimension
Part II: Lower and Upper bounds
PAC Learning model
• There exists a distribution D over domain X
• Examples: <x, c(x)>
• Goal:
– With high probability (1 − δ)
– find h in H such that
– error(h, c) < ε
Definitions: Projection
• Given a concept c over X
– associate it with a set (its positive examples)
• Projection (sets)
– For a concept class C and subset S
– C(S) = { c ∩ S | c ∈ C }
• Projection (vectors)
– For a concept class C and S = {x1, …, xm}
– C(S) = { <c(x1), …, c(xm)> | c ∈ C }
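As a concrete illustration (not from the slides; the class and names are hypothetical), the vector projection can be computed directly for a small finite class — here threshold concepts c_t(x) = 1 iff x ≥ t on the domain {0, …, 9}:

```python
# Hypothetical example: threshold concepts on the domain {0, ..., 9}.
# Concept c_t labels x positive iff x >= t; one concept per threshold t.
concepts = [lambda x, t=t: int(x >= t) for t in range(11)]

def projection(concepts, S):
    """C(S): the set of distinct label vectors <c(x1), ..., c(xm)>, c in C."""
    return {tuple(c(x) for x in S) for c in concepts}

# Thresholds realize only |S| + 1 of the 2^|S| possible label patterns.
print(sorted(projection(concepts, [2, 5, 7])))
```

Only four of the eight possible label vectors appear, which is exactly the gap the next definition (shattering) measures.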
Definition: VC-dim
• Clearly |C(S)| ≤ 2^m
• C shatters S if |C(S)| = 2^m
• VC dimension of a class C:
– The size d of the largest set S that C shatters.
– Can be infinite.
• For a finite class C
– VC-dim(C) ≤ log |C|
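These definitions can be checked by brute force for a tiny finite class (a sketch; the helper names are illustrative, and the search is exponential, so it only serves as an illustration):

```python
from itertools import combinations

def shatters(concepts, S):
    """C shatters S iff the projection C(S) realizes all 2^|S| label vectors."""
    return len({tuple(c(x) for x in S) for c in concepts}) == 2 ** len(S)

def vc_dim(concepts, domain):
    """Brute force: size of the largest subset of domain that C shatters.
    Exponential in |domain| -- only for tiny illustrative classes."""
    best = 0
    for size in range(1, len(domain) + 1):
        if any(shatters(concepts, S) for S in combinations(domain, size)):
            best = size
    return best

# Thresholds shatter any single point but never a pair (labels are
# monotone, so the pattern <1, 0> on x1 < x2 is unrealizable): VC-dim = 1.
thresholds = [lambda x, t=t: int(x >= t) for t in range(6)]
print(vc_dim(thresholds, range(5)))   # 1, consistent with log2 |C| = log2 6
```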
Lower bounds: Setting
• Static learning algorithm:
– asks for a sample S of size m(ε, δ)
– based on S, selects a hypothesis
Lower bounds: Setting
• Theorem:
– If VC-dim(C) = ∞ then C is not learnable.
• Proof:
– Let m = m(0.1, 0.1)
– Find 2m points which are shattered (call this set T)
– Let D be the uniform distribution on T
– Set ct(xi) = 1 independently with probability ½.
• Expected error ≥ ¼
• Finish proof!
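One way to finish the argument (a sketch under the slides' setup, not their exact wording): any point the learner never sampled is misclassified with probability ½ over the random choice of ct, and a fixed point of T escapes an i.i.d. sample of size m with probability (1 − 1/(2m))^m ≥ ½:

```python
# Sketch: D is uniform over the 2m shattered points of T, and c_t labels
# each point by an independent fair coin. On any point absent from the
# sample, every hypothesis errs with probability 1/2 over the draw of c_t.
def expected_error_lower_bound(m):
    # Pr[a fixed point of T never appears in m i.i.d. draws from D]
    unseen = (1 - 1 / (2 * m)) ** m   # >= 1/2 for every m >= 1
    return 0.5 * unseen               # coin-flip labels on the unseen mass

# The bound stays at least 1/4 no matter how large m is, so accuracy
# 0.1 with confidence 0.9 is unattainable and C is not learnable.
for m in (1, 10, 100, 1000):
    print(m, expected_error_lower_bound(m))
```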
Lower Bound: Feasible
• Theorem
– If VC-dim(C) = d + 1, then m(ε, δ) = Ω(d/ε)
• Proof:– Let T be a set of d+1 points which is shattered.– Let the distribution D be:
• z0 with prob. 1 − 8ε
• zi with prob. 8ε/d
Continue
– Set ct(z0) = 1 and ct(zi) = 1 with probability ½
• Expected error 2ε
• Bound the confidence δ
– and ε for the accuracy
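The 2ε figure can be reproduced numerically (a sketch following the slides' distribution; the helper name and the specific m are illustrative). Each zi the learner never draws is misclassified with probability ½ and carries mass 8ε/d:

```python
# Sketch of the slides' calculation: mass 8*eps/d on each of z1..zd, and a
# coin-flip label on each z_i. A z_i absent from the sample is misclassified
# with probability 1/2, contributing (1/2) * (8 * eps / d) to the error.
def expected_error(m, d, eps):
    unseen = (1 - 8 * eps / d) ** m   # Pr[a fixed z_i is never drawn]
    return 0.5 * 8 * eps * unseen     # d points, each of mass 8*eps/d

# With only m = d / (16 * eps) samples the expected error is still about
# 2 * eps, so reaching accuracy eps needs m = Omega(d / eps) samples.
eps, d = 0.01, 100
m = int(d / (16 * eps))               # 625 samples in this instance
print(expected_error(m, d, eps))
```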
Lower Bound: Non-Feasible
• Theorem
– For two hypotheses, m(ε, δ) = Ω((1/ε²) log(1/δ))
• Proof:
– Let H = {h0, h1}, where hb(x) = b
– Two distributions:
– D0: Pr[<x, 1>] = ½ − ε and Pr[<y, 0>] = ½ + ε
– D1: Pr[<x, 1>] = ½ + ε and Pr[<y, 0>] = ½ − ε
Epsilon net
• Epsilon-bad concepts
– B(c) = { h | error(h, c) > ε }
• A set of points S is an ε-net w.r.t. D if
– for every h in B(c)
– there exists a point x in S
– such that h(x) ≠ c(x)
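The definition can be checked directly in a small hypothetical finite setting (the distribution is given as a point → mass dictionary; all names are illustrative):

```python
def error(h, c, D):
    """error(h, c): probability mass of the points where h and c disagree."""
    return sum(p for x, p in D.items() if h(x) != c(x))

def is_epsilon_net(S, H, c, D, eps):
    """S is an eps-net iff every eps-bad h disagrees with c on some x in S."""
    bad = [h for h in H if error(h, c, D) > eps]
    return all(any(h(x) != c(x) for x in S) for h in bad)

D = {0: 0.25, 1: 0.25, 2: 0.25, 3: 0.25}            # uniform on four points
c = lambda x: 0                                      # target: all-negative
H = [lambda x, t=t: int(x >= t) for t in range(5)]   # threshold hypotheses

print(is_epsilon_net([2], H, c, D, eps=0.3))  # True: {2} hits every bad h
print(is_epsilon_net([0], H, c, D, eps=0.3))  # False: misses the t=1 threshold
```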
Sample size
• Event A:
– The sample S1 is not an ε-net, |S1| = m.
• Assume A holds
– Let h be an ε-bad hypothesis consistent with S1.
• Sample an additional sample S2
– with probability at least 1/2
– the number of errors of h on S2 is at least εm/2
– for m = |S2| = O(1/ε)
continues
• Event B
– There exists h in B(c)
– h is consistent with S1
– and h has at least εm/2 errors on S2
• Pr[B | A] ≥ 1/2
– hence 2 Pr[B] ≥ Pr[A]
• Let F be the projection of C to S1 ∪ S2
– F = C(S1 ∪ S2)
Error set
• ER(h) = { x : x ∈ S1 ∪ S2 and c(x) ≠ h(x) }
• |ER(h)| ≥ εm/2
• Event A:
– ER(h) ∩ S1 = ∅
• Event B:
– ER(h) ∩ S1 = ∅
– ER(h) ∩ S2 = ER(h)
Combinatorial problem
• 2m balls, black and white
– exactly l black balls
• Consider a uniformly random partition into S1 and S2, each of size m
• The probability that all l black balls fall in S2:
– Pr = ∏_{i=0}^{l−1} (m − i) / (2m − i) ≤ (1/2)^l = 2^{−l}
– since each factor (m − i)/(2m − i) ≤ 1/2
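The partition probability and the 2^{−l} bound can be verified exactly with rational arithmetic (a sketch; the function name is illustrative):

```python
from fractions import Fraction
from math import comb

def prob_all_black_in_S2(m, l):
    """Pr[all l black balls land in S2] when 2m balls are split uniformly
    into S1 and S2 of size m each: prod_{i=0}^{l-1} (m - i) / (2m - i)."""
    p = Fraction(1)
    for i in range(l):
        p *= Fraction(m - i, 2 * m - i)
    return p

for m in range(1, 8):
    for l in range(m + 1):
        p = prob_all_black_in_S2(m, l)
        assert p == Fraction(comb(m, l), comb(2 * m, l))  # counting check
        assert p <= Fraction(1, 2 ** l)                   # each factor <= 1/2
print(prob_all_black_in_S2(4, 2))   # 3/14, below the bound 1/4
```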
Completing the proof
• Probability of B
– Pr[B] ≤ |F| · 2^{−l} ≤ |F| · 2^{−εm/2}
• Probability of A
– Pr[A] ≤ 2 Pr[B] ≤ 2 |F| · 2^{−εm/2}
• Confidence: require Pr[A] ≤ δ
• Sample
– m = O( (1/ε) log(1/δ) + (1/ε) log |F| )
• Need to bound |F| !!!
Bounding |F|
• Define:
– J(m, d) = J(m−1, d) + J(m−1, d−1)
– J(m, 0) = 1 and J(0, d) = 1
• Solving the recursion:
– J(m, d) = Σ_{i=0}^{d} C(m, i) = O(m^d)
• Claim (Sauer's lemma):
– Let VC-dim(C) = d and |S| = m,
– then |C(S)| ≤ J(m, d)
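The recursion and its binomial-sum solution can be checked directly (a sketch; `J` follows the definition above):

```python
from functools import lru_cache
from math import comb

@lru_cache(maxsize=None)
def J(m, d):
    """J(m, d) = J(m-1, d) + J(m-1, d-1), with J(m, 0) = J(0, d) = 1."""
    if m == 0 or d == 0:
        return 1
    return J(m - 1, d) + J(m - 1, d - 1)

# The recursion solves to the binomial sum  sum_{i=0}^{d} C(m, i),
# which is O(m^d) for fixed d -- the bound on |F| the proof needs.
for m in range(12):
    for d in range(6):
        assert J(m, d) == sum(comb(m, i) for i in range(d + 1))
print(J(10, 2))   # 1 + 10 + 45 = 56
```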