The small subgraph conditioning method and hypergraphs

Catherine Greenhill
School of Mathematics and Statistics, UNSW Sydney
The small subgraph conditioning method:
An analysis of variance technique introduced by Robinson &
Wormald (1992).
Technique for analysing a random variable Y = Yn,
particularly to show that Pr(Y > 0) → 1, where the
second moment method does not apply.
Also establishes the asymptotic distribution of Y and a
property called contiguity of two probability spaces.
See Wormald’s 1999 survey of random regular graphs, and
Janson’s 1995 paper with “contiguity” in the title.
Suppose we have a sequence of probability spaces Gn indexed
by n, and a random variable Y = Yn defined on Gn.
Want to show Y > 0 asymptotically almost surely (a.a.s.); that
is, Pr(Y > 0) → 1 as n→ ∞.
If E(Y²) = (1 + o(1)) E(Y)² then, by Chebyshev’s inequality,
Pr(Y = 0) ≤ Pr(|Y − E Y| ≥ E Y) ≤ Var(Y)/(E Y)² = E(Y²)/(E Y)² − 1 = o(1).
Second moment method works.
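The inequality chain above is easy to sanity-check numerically. A minimal sketch, using an arbitrary toy distribution for Y (my own choice, not from the talk) whose second moment is close to (E Y)²:

```python
# Toy distribution for Y (hypothetical, for illustration only).
support = [0, 8, 10, 12]
probs = [0.05, 0.30, 0.40, 0.25]

EY = sum(y * p for y, p in zip(support, probs))       # E(Y)
EY2 = sum(y * y * p for y, p in zip(support, probs))  # E(Y^2)
p_zero = probs[0]                                     # Pr(Y = 0)

# Chebyshev: Pr(Y = 0) <= Var(Y)/(EY)^2 = E(Y^2)/(EY)^2 - 1
bound = EY2 / EY**2 - 1
print(f"Pr(Y=0) = {p_zero}, Chebyshev bound = {bound:.4f}")
assert p_zero <= bound
```

When E(Y²)/(E Y)² = 1 + o(1) the bound tends to 0, which is exactly the second moment method.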
What if E(Y²)/(E Y)² → C for some constant C > 1?
The second moment method is not strong enough.
Robinson & Wormald faced exactly this problem when studying
Hamilton cycles in random 3-regular graphs.
Write [n] = {1, 2, . . . , n} and let Gn,d denote a uniformly random
d-regular graph on [n], where d is fixed.
In 1984, Robinson & Wormald proved E(Y²)/(E Y)² → 3/e,
where Y is the number of Hamilton cycles in Gn,3.
This implied that Pr(Y > 0) ≥ 2 − 3/e ≈ 0.896.
In the same paper, Robinson & Wormald improved this to
Pr(Y > 0) ≥ 2 − 3/e^(13/12) ≈ 0.985
by studying triangle-free 3-regular graphs.
⇒ Small cycles can have a big effect!
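The two numerical bounds quoted above check out directly:

```python
import math

b1 = 2 - 3 / math.e               # from E(Y^2)/(EY)^2 -> 3/e
b2 = 2 - 3 / math.e ** (13 / 12)  # improved via triangle-free 3-regular graphs
print(round(b1, 3), round(b2, 3))  # 0.896 0.985
```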
Robinson & Wormald (1992): Proved that Pr(Y > 0) → 1,
so almost all cubic graphs are Hamiltonian. They wrote that
this result “has been suspected for some time”.
This paper introduced the small subgraph conditioning method.
Janson (1995) observed that R & W’s proof technique also
• gives the asymptotic distribution of Y, and
• establishes a property called “contiguity” between Gn,3 and
a probability space, denoted G^(Y)n,3, where each 3-regular graph
G on [n] has probability proportional to Y(G).
Le Cam (1960):
Suppose (An) and (Bn) are two sequences of probability spaces
on the same sequence of underlying sets (Ωn).
Say (An) and (Bn) are (mutually) contiguous if
Pr_An(En) → 1 if and only if Pr_Bn(En) → 1
for all En ⊆ Ωn.
Write An ≈ Bn when (An) and (Bn) are contiguous.
Janson: contiguity is “qualitative asymptotic equivalence”.
Recall, G^(Y)n,3 gives each 3-regular graph G on [n] probability
proportional to Y(G).
Robinson & Wormald (1992), restated: G^(Y)n,3 ≈ Gn,3.
Observe, a graph in G^(Y)n,3 is Hamiltonian with probability 1.
Then contiguity immediately implies that
Pr(Gn,3 is Hamiltonian) → 1.
Can also rephrase R & W result to say: Gn,3 is contiguous
with the superposition of a uniformly random Hamilton cycle
and a uniformly random perfect matching, both on [n].
[Figure: Hamilton cycle + perfect matching = 3-regular graph]
Leads to “contiguity arithmetic”. See Wormald’s 1999 survey.
(Warning: 1 + 1 ≠ 2.)
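A minimal sketch of sampling from this superposition model (my own illustration, not code from the talk): overlay a uniformly random Hamilton cycle and an independent uniformly random perfect matching on the same even vertex set. The union may contain repeated edges, i.e. it is a cubic multigraph, which is one reason the contiguity arithmetic is delicate:

```python
import random

def superposition(n, rng):
    """Union of a uniform random Hamilton cycle and a uniform random
    perfect matching on vertices 0..n-1 (n even); returns an edge list,
    possibly with repeats (a cubic multigraph)."""
    perm = list(range(n))
    rng.shuffle(perm)  # a random cyclic order gives a uniform Hamilton cycle
    cycle = [(perm[i], perm[(i + 1) % n]) for i in range(n)]
    perm2 = list(range(n))
    rng.shuffle(perm2)  # pairing consecutive entries gives a uniform matching
    matching = [(perm2[2 * i], perm2[2 * i + 1]) for i in range(n // 2)]
    return cycle + matching

edges = superposition(10, random.Random(1))
deg = [0] * 10
for u, v in edges:
    deg[u] += 1
    deg[v] += 1
print(deg)  # every vertex has degree 3
```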
Small subgraph conditioning method (SSCM)
Robinson & Wormald (1992,1994), Janson (1995)
Let λi > 0 and δi ≥ −1 be constants, for i ≥ 1.
Suppose that for all n you have random variables Xin and
Yn, defined on the same probability space Gn, where the Xin are
nonnegative integer-valued and E Yn ≠ 0.
Further suppose that:
(A1) Xin →d Zi as n → ∞, jointly for all i ≥ 1, where Zi ∼ Po(λi)
are independent Poisson.
(A2) For any sequence x1, . . . , xm of nonnegative integers,
E(Yn | X1n = x1, . . . , Xmn = xm) / E Yn → ∏_{i=1}^{m} (1 + δi)^(xi) e^(−λi δi)
as n → ∞.
(A3) ∑_{i≥1} λi δi² < ∞.
(A4) E(Yn²)/(E Yn)² → exp(∑_{i≥1} λi δi²) as n → ∞.
Then (distribution version, Janson 1995):
Yn / E Yn →d W = ∏_{i=1}^{∞} (1 + δi)^(Zi) e^(−λi δi) as n → ∞;
moreover, this and the convergence in (A1) hold jointly.
Also, W > 0 almost surely if and only if δi > −1 for all i.
Then (contiguity version, Wormald 1999):
Pr(Yn > 0) = exp(−∑_{i : δi = −1} λi) + o(1)
and G^(Yn)n ≈ Gn if δi > −1 for all i.
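The shape of these conclusions can be checked by simulating the limit W directly, with the infinite product truncated and an illustrative choice of parameters (mine, not from the talk): λi = 1/i², δi = (−1)^i/i. Then E W = 1 and E W² = exp(∑ λi δi²), matching (A4); and since δ1 = −1, W has an atom at 0 of mass 1 − e^(−λ1), matching the probability in the contiguity version:

```python
import math
import random

rng = random.Random(42)

M = 8  # truncate the product at M factors
lam = [1 / i**2 for i in range(1, M + 1)]       # illustrative lambda_i
dlt = [(-1) ** i / i for i in range(1, M + 1)]  # illustrative delta_i (delta_1 = -1)

def poisson(l):
    """Knuth's Poisson sampler."""
    L, k, p = math.exp(-l), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def sample_W():
    w = 1.0
    for l, d in zip(lam, dlt):
        w *= (1 + d) ** poisson(l) * math.exp(-l * d)
    return w

N = 100_000
ws = [sample_W() for _ in range(N)]
m1 = sum(ws) / N                       # should be near E W = 1
m2 = sum(w * w for w in ws) / N        # should be near exp(sum lam_i d_i^2)
frac_pos = sum(w > 0 for w in ws) / N  # should be near exp(-lam_1)
target = math.exp(sum(l * d * d for l, d in zip(lam, dlt)))
print(m1, m2, target, frac_pos, math.exp(-lam[0]))
```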
Many structural results about regular graphs, regular bipartite
graphs, proved using SSCM by various authors:
Delcourt, Frieze, Greenhill, Janson, Jerrum, Kim, Kwan, Molloy, Postle,
Prałat, Robalewska, Robinson, Ruciński, Shi, Wind, Wormald.
(Apologies to any I’ve missed!)
A couple of examples:
Kim & Wormald (2001)
A.a.s. Gn,4 is the union of two Hamilton cycles.
Prałat & Wormald (2019)
Almost all 5-regular graphs have a 3-flow.
More recent applications use SSCM to study thresholds in
random constraint satisfaction problems, e.g.
Bapst, Coja-Oghlan, Efthymiou (2017),
random colourings in G(n,m)
Coja-Oghlan & Wormald (2018), random k-SAT formulae
Coja-Oghlan, Kapetanopoulos, Müller (2020+),
random constraint satisfaction problems
Idea: the planted model is easier to study, so prove that this
model is contiguous with respect to the standard model.
Also: Banks, Moore, Neeman, Netrapalli (2016),
community detection in sparse networks.
Rest of talk: structural results for regular graphs, or regular
uniform hypergraphs. Firstly, regular graphs.
Usually, calculations are performed in the configuration model
where Xin is the number of cycles of length i.
(These are the “small subgraphs” in the name of the method.)
Bollobas (1980): the Xin are asymptotically independent
Poisson with mean (d− 1)i/(2i).
![Page 62: The small subgraph conditioning method and hypergraphs ...people.maths.ox.ac.uk/scott/dmpfiles/catherine.pdf · The small subgraph conditioning method: An analysis of variance technique](https://reader030.vdocuments.site/reader030/viewer/2022041119/5f328fe8b9381c224b2f2479/html5/thumbnails/62.jpg)
Rest of talk: structural results for regular graphs, or regular uniform hypergraphs. Firstly, regular graphs.

Usually, calculations are performed in the configuration model, where $X_{in}$ is the number of cycles of length $i$. (These are the "small subgraphs" in the name of the method.)

Bollobás (1980): the $X_{in}$ are asymptotically independent Poisson with mean $(d-1)^i/(2i)$.

The SSCM works when the variance of Y is well-controlled by the short cycle counts.
Configuration model (Bollobás, 1980)

Start with n cells, each containing d points. Take a uniformly random perfect matching of the dn points into dn/2 pairs.

Shrink each cell to a vertex to get a d-regular multigraph. Every simple graph is equally likely.

Bender & Canfield (1978): $\Pr(\text{simple}) \sim e^{-(d^2-1)/4}$.
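This construction is straightforward to simulate. The Monte Carlo sketch below (mine, not from the talk; parameters and helper names are illustrative) samples configurations and checks the Bender–Canfield probability together with the Poisson means $\lambda_1 = (d-1)/2$ for loops and $\lambda_2 = (d-1)^2/4$ for double edges:

```python
import math
import random
from collections import Counter

def random_pairing(n, d, rng):
    """Configuration model: cell i holds points i*d,...,i*d+d-1;
    pair up all d*n points uniformly at random."""
    pts = list(range(n * d))
    rng.shuffle(pts)
    return [(pts[2*k], pts[2*k + 1]) for k in range(n * d // 2)]

def loop_double_counts(pairing, d):
    """Count loops (1-cycles) and pairs of parallel edges (2-cycles)."""
    loops = 0
    edge_mult = Counter()
    for p, q in pairing:
        u, v = p // d, q // d
        if u == v:
            loops += 1
        else:
            edge_mult[(min(u, v), max(u, v))] += 1
    doubles = sum(m * (m - 1) // 2 for m in edge_mult.values())
    return loops, doubles

rng = random.Random(2024)
n, d, trials = 30, 3, 20000
tot_loops = tot_doubles = tot_simple = 0
for _ in range(trials):
    loops, doubles = loop_double_counts(random_pairing(n, d, rng), d)
    tot_loops += loops
    tot_doubles += doubles
    tot_simple += (loops == 0 and doubles == 0)
mean_loops = tot_loops / trials      # should approach (d-1)/2 = 1
mean_doubles = tot_doubles / trials  # should approach (d-1)^2/4 = 1
est_simple = tot_simple / trials     # should approach e^{-(d^2-1)/4} ≈ 0.135
print(mean_loops, mean_doubles, est_simple)
print((d - 1) / 2, (d - 1)**2 / 4, math.exp(-(d*d - 1) / 4))
```

At n = 30 the finite-size corrections are of order 1/n, so the estimates land close to, but not exactly on, the limits.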
Suppose that Y is a random variable of interest in a random configuration, and $Y_G$ is the corresponding variable in $G_{n,d}$.

Assume we have proved (A1)–(A4) hold for Y. If $\Pr(Y = 0) = o(1)$ then

$$\Pr(Y_G = 0) \le \frac{\Pr(Y = 0)}{\Pr(\text{simple})} = o(1).$$

Also, if the uniform and Y-weighted configuration models are contiguous then $G^{(Y)}_{n,d} \approx G_{n,d}$.
Assume we have proved (A1)–(A4) hold for Y, and hence

$$\frac{Y}{\mathbb{E}Y} \xrightarrow{d} W = \prod_{i=1}^{\infty} (1+\delta_i)^{Z_i}\, e^{-\lambda_i\delta_i}$$

for some constants $\lambda_i, \delta_i$, and where the random variables $Z_i \sim \mathrm{Po}(\lambda_i)$ are independent.

Now (A2) implies that

$$\frac{\mathbb{E}Y_G}{\mathbb{E}Y} = \frac{\mathbb{E}(Y \mid X_{1n} = X_{2n} = 0)}{\mathbb{E}Y} \to e^{-\lambda_1\delta_1 - \lambda_2\delta_2}.$$

(A configuration gives a simple graph iff $X_{1n} = X_{2n} = 0$.)
By the joint convergence of $(Y/\mathbb{E}Y,\, X_{1n},\, X_{2n})$ to $(W, Z_1, Z_2)$, we conclude that

$$\mathcal{L}\Big(\frac{Y_G}{\mathbb{E}Y}\Big) = \mathcal{L}\Big(\frac{Y}{\mathbb{E}Y}\,\Big|\, X_{1n} = X_{2n} = 0\Big) \xrightarrow{d} \mathcal{L}(W \mid Z_1 = Z_2 = 0) = \mathcal{L}\Big(e^{-\lambda_1\delta_1 - \lambda_2\delta_2} \prod_{i=3}^{\infty} (1+\delta_i)^{Z_i}\, e^{-\lambda_i\delta_i}\Big).$$

Hence

$$\frac{Y_G}{\mathbb{E}Y_G} \sim e^{\lambda_1\delta_1 + \lambda_2\delta_2}\, \frac{Y_G}{\mathbb{E}Y} \xrightarrow{d} \prod_{i=3}^{\infty} (1+\delta_i)^{Z_i}\, e^{-\lambda_i\delta_i}.$$

TL;DR: Delete the i = 1, 2 factors to get the result for regular graphs!
What about hypergraphs?

Let $G_{n,r,s}$ denote a uniformly random r-regular s-uniform hypergraph on [n]. Here r, s are fixed constants. Assume $s \mid rn$.

Calculations are performed in the configuration model. (The picture showed a loop and a repeated edge: the two ways a configuration can fail to be simple.)

Cooper, Frieze, Molloy & Reed (1996): $\Pr(\text{simple}) \sim e^{-(r-1)(s-1)/2}$.
Let Xin be the number of loose i-cycles in Gn,r,s, for i ≥ 2,
and let X1n be the number of 1-cycles.
Cooper et al. (1996) proved that the $X_{in}$ are asymptotically independent Poisson random variables, with

$$\mathbb{E}X_{in} \to \lambda_i = \frac{((r-1)(s-1))^i}{2i}.$$

So (A1) holds.
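The hypergraph configuration model is just as easy to simulate. The sketch below (my code; parameters illustrative) estimates the mean number of 1-cycles against $\lambda_1 = (r-1)(s-1)/2$, and the fraction of configurations with no 1-cycles, which up to the $o(1)$ probability of a repeated edge matches the Cooper–Frieze–Molloy–Reed estimate $e^{-\lambda_1}$:

```python
import random
from collections import Counter

def random_parts(n, r, s, rng):
    """Hypergraph configuration model: cell i holds points i*r,...,i*r+r-1;
    group all r*n points into parts of size s uniformly at random."""
    pts = list(range(n * r))
    rng.shuffle(pts)
    return [pts[k*s:(k + 1)*s] for k in range(n * r // s)]

def one_cycles(parts, r):
    """1-cycles: pairs of points from the same cell landing in one part."""
    total = 0
    for part in parts:
        for c in Counter(p // r for p in part).values():
            total += c * (c - 1) // 2
    return total

rng = random.Random(7)
n, r, s, trials = 60, 3, 3, 4000          # s divides r*n = 180
samples = [one_cycles(random_parts(n, r, s, rng), r) for _ in range(trials)]
mean_x1 = sum(samples) / trials                          # → λ1 = (r-1)(s-1)/2 = 2
frac_no_1cycles = sum(x == 0 for x in samples) / trials  # → e^{-2} ≈ 0.135
print(mean_x1, frac_no_1cycles)
```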
PROBLEM: In the configuration model, when s ≥ 3, the event "is simple" is NOT captured by conditioning on the event $X_{1n} = X_{2n} = 0$, or on the event $X_{1n} = 0$.

Since Pr(repeated edge) = o(1), conditional probabilities are no problem, but we must be careful with the expected value.

⇒ The SSCM can't tell us $\mathbb{E}Y_G/\mathbb{E}Y$.
Cooper, Frieze, Molloy & Reed (1996): existence threshold for perfect matchings, which a.a.s. exist in $G_{n,r,s}$ when $s < \sigma_r$, where

$$\sigma_r = \frac{\log r}{(r-1)\log\big(r/(r-1)\big)} + 1.$$

Altman, Greenhill, Isaev, Ramadurai (2020): existence threshold for loose Hamilton cycles, which a.a.s. exist in $G_{n,r,s}$ when $r > \rho(s)$, where

$$\rho(s) \approx \frac{e^{s-1}}{s-1} - \frac{s-2}{2} + o_s(1).$$

(The $o_s(1)$ term tends to zero exponentially fast as $s \to \infty$.)
Greenhill, Isaev, Liang (arXiv:2005.07350): existence threshold for spanning trees in $G_{n,r,s}$.

If s ≥ 5 then spanning trees a.a.s. exist in $G_{n,r,s}$ when $r > \rho(s)$, where

$$\rho(s) \approx \frac{e^{s-2}}{s-1} - \frac{s^2-3s+1}{2(s-1)} + o_s(1).$$

If s = 2, 3, 4 then any r ≥ 3 gives a.a.s. existence, except (r, s) = (2, 2).

We build on earlier work by Greenhill, Kwan, Wind (2014) for graphs, which
• found the expected number of spanning trees in $G_{n,d}$ for d ≥ 3,
• gave the asymptotic distribution for cubic graphs.
A tree is connected and acyclic, where these terms are defined using Berge cycles and Berge paths. No 2-cycles means that edges overlap in at most 1 vertex (linear).

A necessary condition for an s-uniform hypergraph on [n] to contain a spanning tree is that

$$n = (s-1)t + 1,$$

where $t = \frac{n-1}{s-1} \in \mathbb{Z}^+$ is the number of edges in the spanning tree.
Trees in uniform hypergraphs

Suppose that $n = (s-1)t + 1$ for some $t \in \mathbb{Z}^+$.

Bolian (1988): The number of s-uniform trees on [n] is

$$n^{t-1}\, \frac{(n-1)!}{t!\, ((s-1)!)^t}.$$

When s = 2 we recover Cayley's formula (here t = n - 1).
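For small n this count can be checked by brute force. The sketch below (my code, not from the talk) enumerates edge sets and uses the incidence count: t distinct s-edges on n = (s-1)t + 1 vertices give st = n + t - 1 incidences, so the bipartite incidence graph is a tree exactly when the hypergraph is connected.

```python
from itertools import combinations
from math import factorial

def count_trees(n, s):
    """Brute-force count of s-uniform trees on {0,...,n-1}.
    Requires n = (s-1)t + 1; connectivity alone certifies a tree here."""
    t, rem = divmod(n - 1, s - 1)
    assert rem == 0
    total = 0
    for edges in combinations(combinations(range(n), s), t):
        parent = list(range(n))          # union-find over the vertices
        def find(v):
            while parent[v] != v:
                parent[v] = parent[parent[v]]
                v = parent[v]
            return v
        parts = n
        for e in edges:
            root = find(e[0])
            for v in e[1:]:
                rv = find(v)
                if rv != root:
                    parent[rv] = root
                    parts -= 1
        if parts == 1:                   # connected and spanning
            total += 1
    return total

def formula(n, s):
    t = (n - 1) // (s - 1)
    return n**(t - 1) * factorial(n - 1) // (factorial(t) * factorial(s - 1)**t)

print(count_trees(5, 3), formula(5, 3))  # 15 15
print(count_trees(4, 2), formula(4, 2))  # 16 16 (Cayley: 4^{4-2})
```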
A tree degree sequence is a sequence $x = (x_1, \ldots, x_n)$ of positive integers which sum to $st$.

Bacher (2011): The number of s-uniform trees on [n] with degree sequence x is

$$(s-1)\, \frac{(n-2)!}{((s-1)!)^t} \prod_{j=1}^{n} \frac{1}{(x_j - 1)!}.$$

This generalises the result of Moon (1970) in the graph case. These results can be proved using a hypergraph analogue of Prüfer codes.
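Bacher's formula can be checked the same way, filtering a brute-force enumeration by degree sequence (illustrative code, names mine). For n = 5, s = 3 and x = (2, 1, 1, 1, 1) the formula gives $2 \cdot 3!/2^2 = 3$, matching the three ways to split the remaining four vertices into two pairs around vertex 0:

```python
from itertools import combinations
from math import factorial

def count_trees_with_degrees(n, s, x):
    """Brute-force count of s-uniform trees on {0,...,n-1} in which
    vertex i lies in exactly x[i] edges."""
    t = (n - 1) // (s - 1)
    total = 0
    for edges in combinations(combinations(range(n), s), t):
        deg = [0] * n
        for e in edges:
            for v in e:
                deg[v] += 1
        if tuple(deg) != tuple(x):
            continue
        parent = list(range(n))          # connectivity ⇒ tree (st = n+t-1)
        def find(v):
            while parent[v] != v:
                parent[v] = parent[parent[v]]
                v = parent[v]
            return v
        parts = n
        for e in edges:
            root = find(e[0])
            for v in e[1:]:
                rv = find(v)
                if rv != root:
                    parent[rv] = root
                    parts -= 1
        if parts == 1:
            total += 1
    return total

def bacher(n, s, x):
    t = (n - 1) // (s - 1)
    prod = 1
    for xi in x:
        prod *= factorial(xi - 1)
    return (s - 1) * factorial(n - 2) // (factorial(s - 1)**t * prod)

x = (2, 1, 1, 1, 1)
print(count_trees_with_degrees(5, 3, x), bacher(5, 3, x))  # 3 3
```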
Expected number

By summing over all tree degree sequences x, we showed that the expected number of spanning trees in the configuration model is

$$\mathbb{E}Y = \frac{(s-1)\,(n-2)!}{((s-1)!)^t} \sum_{x} \prod_{j=1}^{n} \frac{(r)_{x_j}}{(x_j-1)!} \cdot \frac{p(rn-st)}{p(rn)} = r^n\, \frac{(s-1)\,(n-2)!}{((s-1)!)^t} \binom{(r-1)n}{t-1}\, \frac{p(rn-st)}{p(rn)},$$

where $(r)_x = r(r-1)\cdots(r-x+1)$ is the falling factorial and p(sN) is the number of ways to partition sN points into N subsets (parts) of s points.
Apply Stirling's formula:

$$\mathbb{E}Y \sim \frac{(r-1)^{1/2}(s-1)}{n\,(rs-r-s)^{\frac{s+1}{2(s-1)}}} \left( \frac{(s-1)^{r}\,(r-1)^{s(r-1)}}{r^{\,rs-r-s}\,(rs-r-s)^{(rs-r-s)/(s-1)}} \right)^{n/s}.$$

The behaviour is dominated by the base of the exponential: taking the logarithm, let

$$L_s(r) = \frac{r}{s}\log(s-1) + (r-1)\log(r-1) - \frac{rs-r-s}{s}\log r - \frac{rs-r-s}{s(s-1)}\log(rs-r-s).$$
We proved that if s ∈ {2, 3, 4} then $L_s(r) > 0$ for all r ≥ 2, except (r, s) = (2, 2).

For s ≥ 5 there is a unique threshold ρ(s) ∈ (2, ∞) so that $L_s(r)$ is

• < 0 for r ∈ [2, ρ(s)),
• > 0 for r ∈ (ρ(s), ∞).

s      5     6     7      8      9       10      11
ρ(s)   3.03  8.71  22.14  54.61  133.59  327.25  805.84
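The table is reproducible by bisection on $L_s$ (a sketch, assuming the sign pattern stated above; the bracket [2, 10⁴] is my choice):

```python
import math

def L(s, r):
    """L_s(r) as on the previous slide."""
    m = r*s - r - s
    return ((r/s) * math.log(s - 1) + (r - 1) * math.log(r - 1)
            - (m/s) * math.log(r) - (m / (s*(s - 1))) * math.log(m))

def rho(s, lo=2.0, hi=1e4):
    """Root of L_s on (2, ∞): L_s < 0 just above 2, > 0 for large r."""
    for _ in range(100):
        mid = (lo + hi) / 2
        if L(s, mid) < 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

print([round(rho(s), 2) for s in range(5, 12)])
# reproduces the table: 3.03, 8.71, 22.14, 54.61, 133.59, 327.25, 805.84
```

The roots also track the approximation $e^{s-2}/(s-1) - (s^2-3s+1)/(2(s-1))$ stated earlier (e.g. for s = 11 it gives ≈ 805.85).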
Short cycles

We calculated that in the configuration model,

$$\frac{\mathbb{E}(Y X_{jn})}{\mathbb{E}Y} \longrightarrow \lambda_j(1+\delta_j) \qquad\text{where}\qquad \delta_j = \frac{\left(\frac{r}{r-1} - s + 1\right)^j - 2}{\big((r-1)(s-1)\big)^j}.$$

(Similar calculations for more than one cycle.)

Then we showed that (A2) and (A3) hold, and

$$\exp\left(\sum_{k=1}^{\infty} \lambda_k \delta_k^2\right) = \frac{r^2\sqrt{s-1}}{\sqrt{(r^2 - rs + r + s - 1)(rs - r - s)(r-1)}}.$$
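Both the $\delta_j$ formula and the closed form can be sanity-checked numerically: the series $\sum_k \lambda_k \delta_k^2$ converges geometrically, so exponentiating a truncated sum should reproduce the right-hand side (my code; truncation at 80 terms is arbitrary but ample here):

```python
import math

def lam(k, r, s):              # λ_k = ((r-1)(s-1))^k / (2k)
    return ((r - 1) * (s - 1))**k / (2 * k)

def delta(k, r, s):            # δ_k from the short-cycle calculation
    return ((r / (r - 1) - s + 1)**k - 2) / ((r - 1) * (s - 1))**k

def closed_form(r, s):
    return (r**2 * math.sqrt(s - 1)
            / math.sqrt((r**2 - r*s + r + s - 1) * (r*s - r - s) * (r - 1)))

results = {}
for r, s in [(3, 3), (3, 4), (4, 3)]:
    series = math.exp(sum(lam(k, r, s) * delta(k, r, s)**2 for k in range(1, 80)))
    results[(r, s)] = (series, closed_form(r, s))
    print(r, s, series, closed_form(r, s))
```

For (r, s) = (3, 3) both sides come out to about 2.3238.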
Second moment

We must prove a certain 2-variable real function has a unique global maximum in the interior of a given bounded domain.

We express the second moment as, up to a (1 + o(1)) factor,

$$\sum_{(k,b)\in D} \psi(k/n,\, b/n)\, \exp\big(n\,\phi(k/n,\, b/n)\big)$$

where k, b are two parameters arising from the combinatorics and D is the natural domain of these parameters. The function ψ(α, β) is relatively unimportant . . .
. . . and

$$\phi(\alpha,\beta) = (\alpha+\beta)\log(r-1) + g(\alpha+\beta) + g(r-1-\alpha-\beta) - \tfrac{2}{s-1}\,g(\beta) - g(\alpha) - \tfrac{1}{s(s-1)}\,g(rs-r-s-s\beta) - \tfrac{1}{s-1}\,g(1-(s-1)\alpha-\beta)$$

where $g(x) = x\log x$ for x > 0, and g(0) = 0.

Lemma: Assume that r, s ≥ 2 with r > ρ(s) when s ≥ 5, or r ≥ 3 when s ∈ {2, 3, 4}. Then φ has a unique maximum in the relevant domain at the point

$$\alpha_0 = \frac{1}{r(s-1)}, \qquad \beta_0 = \frac{rs-r-s}{r(s-1)}.$$

This implies that (A4) holds ⇒ we can apply the SSCM to Y.
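One can at least confirm numerically that $(\alpha_0, \beta_0)$ is a critical point of φ (a sketch with an illustrative choice r = 7, s = 5, so r > ρ(5) ≈ 3.03; the step size h is mine):

```python
import math

def g(x):
    return 0.0 if x <= 0 else x * math.log(x)

def phi(a, b, r, s):
    return ((a + b) * math.log(r - 1) + g(a + b) + g(r - 1 - a - b)
            - 2 / (s - 1) * g(b) - g(a)
            - 1 / (s * (s - 1)) * g(r*s - r - s - s*b)
            - 1 / (s - 1) * g(1 - (s - 1)*a - b))

r, s = 7, 5
a0 = 1 / (r * (s - 1))                  # = 1/28
b0 = (r*s - r - s) / (r * (s - 1))      # = 23/28
h = 1e-6
grad_a = (phi(a0 + h, b0, r, s) - phi(a0 - h, b0, r, s)) / (2 * h)
grad_b = (phi(a0, b0 + h, r, s) - phi(a0, b0 - h, r, s)) / (2 * h)
print(grad_a, grad_b)  # both ≈ 0: (α0, β0) is a critical point
```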
What about that PROBLEM going from EY to EYG?
Happily, Aldosari & Greenhill (arXiv:1907.04493) used
asymptotic enumeration, in a more general setting that covers
constant r, s, to show that
EYG ∼ e−λ1δ1 EY .
This leads to the existence threshold result, and gives us the
asymptotic distribution: if $EY_G \to \infty$ then
$$\frac{Y_G}{EY_G} \xrightarrow{d} \prod_{j=2}^{\infty} (1+\delta_j)^{Z_j} e^{-\lambda_j \delta_j} \qquad \text{as } n \to \infty.$$
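One hedged way to see why a limit of this shape is plausible: with independent $Z_j \sim \mathrm{Po}(\lambda_j)$, each factor $(1+\delta_j)^{Z_j} e^{-\lambda_j\delta_j}$ has mean exactly 1, since $E[(1+\delta)^Z] = e^{\lambda\delta}$ for $Z \sim \mathrm{Po}(\lambda)$, matching $E[Y_G/EY_G] = 1$. The sketch below checks this numerically for made-up parameters λ_j = 1/j, δ_j = 2^{−j} (illustrative only, not the actual short-cycle parameters of the talk):

```python
import math
import random

def poisson(lam, rng):
    """Knuth's Poisson sampler (adequate for small lam)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def limit_sample(rng, jmax=12):
    """One draw of the truncated product, with illustrative lambda_j, delta_j."""
    prod = 1.0
    for j in range(2, jmax + 1):
        lam, delta = 1.0 / j, 2.0 ** (-j)   # assumed parameters, for illustration
        z = poisson(lam, rng)
        prod *= (1 + delta) ** z * math.exp(-lam * delta)
    return prod

rng = random.Random(2024)
n = 100_000
mean = sum(limit_sample(rng) for _ in range(n)) / n
assert abs(mean - 1) < 0.02   # sample mean of the limit variable is close to 1
```

The product converges because $\sum_j \lambda_j \delta_j^2 < \infty$ for these parameters; the Monte Carlo mean stays near 1 as the truncation grows.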
Some ingredients in the proof
We used Generalised Jensen's identity: for b ≥ 2,
$$\sum_{\substack{k_1+\cdots+k_b=m \\ k_j \ge 0}} \; \prod_{i=1}^{b} \binom{x_i + ck_i}{k_i} = \sum_{k=0}^{m} \binom{k+b-2}{k} \binom{x_1+\cdots+x_b+cm-k}{m-k}\, c^k.$$
This led to a more tractable expression for the second
moment, and enabled us to extend Greenhill, Kwan,
Wind (2014).
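As a quick numerical sanity check (mine, not from the talk), the identity can be verified exactly in rational arithmetic, using the generalised binomial coefficient $\binom{x}{k} = x(x-1)\cdots(x-k+1)/k!$, valid for any rational x:

```python
import math
from fractions import Fraction

def binom(x, k):
    """Generalised binomial coefficient x(x-1)...(x-k+1)/k!, any rational x."""
    out = Fraction(1)
    for i in range(k):
        out *= x - i
    return out / math.factorial(k)

def compositions(m, b):
    """All b-tuples of non-negative integers summing to m."""
    if b == 1:
        yield (m,)
    else:
        for first in range(m + 1):
            for rest in compositions(m - first, b - 1):
                yield (first,) + rest

def lhs(xs, c, m):
    """Sum over compositions of m of the product of binomials."""
    return sum(
        math.prod(binom(x + c * k, k) for x, k in zip(xs, ks))
        for ks in compositions(m, len(xs))
    )

def rhs(xs, c, m):
    """Single-sum side of the Generalised Jensen identity."""
    b, total = len(xs), sum(xs)
    return sum(
        binom(k + b - 2, k) * binom(total + c * m - k, m - k) * c**k
        for k in range(m + 1)
    )

# Check with fractional and negative parameters (b = 3, small m).
xs = (Fraction(1, 2), Fraction(3), Fraction(-2, 3))
for c in (Fraction(2), Fraction(-1, 2)):
    for m in range(4):
        assert lhs(xs, c, m) == rhs(xs, c, m)
```

With b = 2 the binomial factor $\binom{k}{k} = 1$ drops out and this reduces to the classical Jensen identity.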
We also used generating functions (for short cycles) and a Laplace
summation theorem from Greenhill, Janson and Rucinski (2010)
to help with the second moment calculations.
Greenhill, Janson, Rucinski (2010), Laplace summation tool.
Say you want to evaluate
$$\sum_{\ell \in (L + \ell_n) \cap nK} a_n(\ell)$$
where
L ⊆ ℝ^m is a lattice with full rank,
ℓ_n is a shift vector,
K ⊂ ℝ^m is a compact convex set with non-empty interior,
a_n(ℓ) is a product of factorials and powers.
If (away from the boundary)
$$a_n(\ell) \sim b_n\, \psi(\ell/n) \exp\bigl(n\,\phi(\ell/n)\bigr),$$
and ϕ(x) has a unique maximum in the interior of K, at x_0,
and a couple of other mild conditions hold, then
$$\sum_{\ell \in (L + \ell_n) \cap nK} a_n(\ell) \sim \frac{b_n\, (2\pi n)^{m/2}\, \psi(x_0)}{\det(L)}\, \det(-H_0)^{-1/2} \exp\bigl(n\,\phi(x_0)\bigr)$$
where
det(L) is the determinant of the lattice L,
and H_0 is the Hessian of ϕ at x_0.
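To illustrate the shape of this formula, here is a toy 1-dimensional example of my own, not from the talk: take m = 1, L = ℤ (so det(L) = 1), K = [0, 1], and a_n(ℓ) = C(n, ℓ)², whose exact sum is ∑_ℓ C(n, ℓ)² = C(2n, n). Stirling gives a_n(ℓ) ∼ b_n ψ(ℓ/n) exp(nϕ(ℓ/n)) with b_n = 1/n, ψ(x) = 1/(2πx(1−x)), and ϕ(x) = 2H(x) for the entropy H(x) = −x log x − (1−x) log(1−x); the maximum is at x_0 = 1/2 with ϕ''(x_0) = −8. The sketch checks the formula's prediction against the true sum:

```python
import math

def exact_sum(n):
    """Sum of C(n, l)^2 over l, which equals C(2n, n)."""
    return sum(math.comb(n, l) ** 2 for l in range(n + 1))

def laplace_estimate(n):
    """1-D instance of the Laplace summation formula: m = 1, det(L) = 1."""
    b_n = 1.0 / n
    x0 = 0.5
    psi = 1.0 / (2 * math.pi * x0 * (1 - x0))   # psi(1/2) = 2/pi
    phi0 = 2 * math.log(2)                       # phi(1/2) = 2 H(1/2) = 2 log 2
    hess = -8.0                                  # phi''(1/2)
    return (b_n * math.sqrt(2 * math.pi * n) * psi
            * (-hess) ** -0.5 * math.exp(n * phi0))

n = 200
ratio = laplace_estimate(n) / exact_sum(n)
assert abs(ratio - 1) < 0.01   # first-order agreement; the error is O(1/n)
```

Here the estimate simplifies to 4^n/√(πn), the familiar central binomial asymptotic, so the ratio tends to 1 as n grows.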