KITP, May 2015
TRANSCRIPT

Glen Evenbly
Disentangling Tensor Networks
Guifre Vidal, "Tensor Network Renormalization", arXiv:1412.0732
Overview
Proper consideration of entanglement is important in the study of quantum many-body physics.
Tensor network ansatz: wavefunctions designed to reproduce ground-state entanglement scaling (MPS, PEPS, MERA).
Today: consideration of entanglement in designing a real-space renormalization transformation.
Outline: Tensor Network Renormalization
The set-up: representation of partition functions and path integrals as tensor networks
Previous approaches: Levin and Nave's Tensor Renormalization Group (LN-TRG), conceptual and computational problems
New approach: Tensor Network Renormalization (TNR): proper removal of all short-ranged degrees of freedom via disentanglers
Benchmark results
Extensions
Overview: express a many-body system as a tensor network.

Partition function of a 2D classical statistical model (space × space):
• tensors encode Boltzmann weights
• contraction of the tensor network equals a weighted sum over all microstates

Euclidean path integral of a 1D quantum model (space × Euclidean time):
• a row of tensors encodes a small evolution in imaginary time
• contraction of the tensor network equals a weighted sum over all trajectories
![Page 5: KITP, May 2015](https://reader031.vdocuments.site/reader031/viewer/2022021916/620fa78dede9af23a04f7c27/html5/thumbnails/5.jpg)
scalar
Overview Goal: to contract the tensor network to a scalar:
⟨𝜓|𝑜|𝜓⟩
• Monte-carlo sampling
• Transfer matrix methods
• Real-space renormalization (coarse-graining)
Approaches:
i.e. could represent an expectation value in the quantum system
Real-space renormalization proceeds by blocking + truncation, reducing the network step by step to a scalar.
![Page 7: KITP, May 2015](https://reader031.vdocuments.site/reader031/viewer/2022021916/620fa78dede9af23a04f7c27/html5/thumbnails/7.jpg)
description in terms of very many microscopic
degrees of freedom
Iterative RG transformations description in terms of a few
effective (low-energy, long distance) degrees of freedom
each transformation removes short-range (high energy)
degrees of freedom
Basic idea of RG:
Overview
initial description coarser description
Early real-space RG: Kadanoff's "spin blocking" (1966).
• initial description: lattice of classical spins, H(T, J)
• majority-vote blocking yields a coarser lattice with renormalized parameters H(T′, J′)
• ...successful only for certain systems
Levin, Nave (2006): "Tensor renormalization group (LN-TRG)", a spiritual successor to L. P. Kadanoff's spin blocking (1966). Key change: a more general prescription, via truncated SVD and contraction, for deciding which degrees of freedom can safely be removed at each RG step.
Many improvements and generalizations of LN-TRG followed:
• Xie, Jiang, Weng, Xiang (2008): "Second Renormalization Group (SRG)"
• Xie, Chen, Qin, Zhu, Yang, Xiang (2012): "Higher-Order Tensor Renormalization Group (HOTRG)"
• Gu, Levin, Wen (2008): "Tensor Entanglement Renormalization Group (TERG)"
• Gu, Wen (2009): "Tensor Entanglement Filtering Renormalization (TEFR)"
Today: introduce a new method of tensor RG (for partition functions and path integrals) that resolves significant computational and conceptual problems of previous approaches.
Flaw with previous tensor RG methods (such as LN-TRG): each RG step removes some, but not all, of the short-ranged degrees of freedom. Some short-ranged correlation is removed, but some is propagated, so the coarser effective descriptions still contain microscopic "baggage".

Consequences:
• Accumulation of short-ranged detail can cause computational breakdown; cost scales exponentially in the RG step!
• The effective theory still contains unwanted microscopic detail; one does not recover the proper structure of RG fixed points.
New approach: "Tensor Network Renormalization (TNR)", arXiv:1412.0732. A way of implementing real-space RG that addresses all short-ranged degrees of freedom at each RG step: all short-ranged correlation is removed at every step from the original microscopic description to the coarser effective descriptions.

Advantages:
• Prevents harmful accumulation of short-ranged detail, allowing for a sustainable RG flow: constant cost per iteration (new approach, TNR) versus exponential cost scaling (previous tensor RG).
• A proper RG flow is achieved: TNR reproduces the correct structure of RG fixed points.
Next: the set-up, representing partition functions and path integrals as tensor networks.
Overview: Tensor Networks

Let A_ijkl be a four-index tensor with i, j, k, l ∈ {1, 2, ..., χ}, i.e. such that the tensor is a χ × χ × χ × χ array of numbers; χ is called the bond dimension.

Diagrammatic notation: a tensor is drawn as a shape with one leg per index. Contraction of two tensors sums over their shared index, e.g.

  Σ_j A_ijkl A_mnoj

Square lattice network (PBC): summing every index of the network (each shared between two tensors) contracts it to a scalar:

  Z ≡ tTr( ⊗_{x=1}^{N} A ),

where the tensor trace tTr denotes the contraction of the whole network.
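The diagrammatic contractions above can be checked directly in code. The following is a minimal sketch (my own illustration, not from the talk) using NumPy's einsum, with an arbitrary bond dimension χ = 3:

```python
import numpy as np

chi = 3                                  # bond dimension
rng = np.random.default_rng(0)
A = rng.random((chi, chi, chi, chi))     # four-index tensor A_ijkl
B = rng.random((chi, chi, chi, chi))

# contraction over one shared index j: C_iklmno = sum_j A_ijkl B_mnoj
C = np.einsum('ijkl,mnoj->iklmno', A, B)

# check one entry of C against the explicit sum over j
manual = sum(A[1, j, 0, 2] * B[0, 1, 2, j] for j in range(chi))
assert np.isclose(C[1, 0, 2, 0, 1, 2], manual)
```

Contracting an entire closed network in the same way, by summing every index that appears twice, reduces it to the scalar tTr.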
Partition Functions as Tensor Networks

Square lattice of Ising spins, σ_m ∈ {+1, −1}. Hamiltonian functional for the Ising ferromagnet:

  H(σ) = −Σ_{⟨i,j⟩} σ_i σ_j

Partition function:

  Z = Σ_σ e^{−H(σ)/T}

Encode the Boltzmann weights of a plaquette of spins in a four-index tensor:

  A_ijkl = e^{(σ_i σ_j + σ_j σ_k + σ_k σ_l + σ_l σ_i)/T}
Tiling the lattice with these plaquette tensors, the partition function Z becomes a square-lattice tensor network of A tensors, with each contracted index representing the state of a shared spin.
The partition function is then given by contraction of the tensor network:

  Z = tTr( ⊗_{x=1}^{N} A )
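As a concrete check that the contraction equals the weighted sum over microstates, here is a small sketch (my own illustration): the plaquette tensor defined above is built explicitly, a 2 × 2 periodic network of it is contracted, and the result is compared against a brute-force sum over all spin configurations on the shared indices.

```python
import itertools
import numpy as np

T = 2.0                                 # temperature (arbitrary choice)
s = np.array([1.0, -1.0])               # index 0 -> spin +1, index 1 -> spin -1

def plaq(i, j, k, l):
    """Boltzmann weight of one plaquette of four Ising spins."""
    return np.exp((s[i]*s[j] + s[j]*s[k] + s[k]*s[l] + s[l]*s[i]) / T)

A = np.array([[[[plaq(i, j, k, l) for l in range(2)] for k in range(2)]
               for j in range(2)] for i in range(2)])

# contract a 2 x 2 periodic network of A tensors to a scalar
Z_net = np.einsum('apbq,bras,cqdp,dscr->', A, A, A, A)

# the same number as an explicit weighted sum over all 2^8 microstates
Z_sum = sum(plaq(a, p, b, q) * plaq(b, r, a, s_) * plaq(c, q, d, p)
            * plaq(d, s_, c, r)
            for a, p, b, q, c, d, r, s_ in itertools.product(range(2), repeat=8))
assert np.isclose(Z_net, Z_sum)
```

Every index letter in the einsum appears on exactly two tensors, so the contraction is literally the sum over the shared spin states.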
Path Integrals as Tensor Networks

Nearest-neighbour Hamiltonian for a 1D quantum system:

  H = Σ_r h(r, r+1) = H_even + H_odd = Σ_{r even} h(r, r+1) + Σ_{r odd} h(r, r+1)

Evolution in imaginary time yields a projector onto the ground state:

  |ψ_GS⟩⟨ψ_GS| = lim_{β→∞} e^{−βH}

Expand in small time steps:

  lim_{β→∞} e^{−βH} = e^{−τH} e^{−τH} e^{−τH} e^{−τH} ⋯

Suzuki-Trotter expansion:

  e^{−τH} = e^{−τH_even} e^{−τH_odd} + O(τ²)
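The projector property lim_{β→∞} e^{−βH} ∝ |ψ_GS⟩⟨ψ_GS| can be illustrated on a toy Hamiltonian with a known spectrum (the 4 × 4 matrix and its spectrum below are invented for the example):

```python
import numpy as np

rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.random((4, 4)))   # random orthogonal eigenbasis
w = np.array([0.0, 1.0, 2.0, 3.0])        # chosen spectrum; ground energy 0
H = (Q * w) @ Q.T                         # toy Hermitian "Hamiltonian"
gs = Q[:, 0]                              # exact ground state

def expmh(M, beta):
    """Matrix exponential e^{-beta M} for a symmetric matrix M."""
    vals, vecs = np.linalg.eigh(M)
    return (vecs * np.exp(-beta * vals)) @ vecs.T

# for large beta, e^{-beta H} approaches |gs><gs|: since the ground
# energy is 0, excited contributions are suppressed as e^{-beta * gap}
P = expmh(H, 30.0)
assert np.allclose(P, np.outer(gs, gs), atol=1e-10)
```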
Separate the Hamiltonian into even and odd terms, and exponentiate each separately: e^{−τH_even} and e^{−τH_odd} are each a product of commuting two-site gates e^{−τh}, arranged in alternating rows over sites r = 0, 1, 2, ..., so that a pair of rows approximates e^{−τH}.
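The O(τ²) error of the first-order Suzuki-Trotter splitting can be verified numerically. The sketch below (with arbitrary 8 × 8 symmetric matrices standing in for H_even and H_odd) checks that halving τ reduces the splitting error by roughly a factor of four:

```python
import numpy as np

def expmh(M, tau):
    """e^{-tau M} for a symmetric matrix M, via eigendecomposition."""
    w, V = np.linalg.eigh(M)
    return (V * np.exp(-tau * w)) @ V.T

rng = np.random.default_rng(2)
def rand_sym(n):
    X = rng.random((n, n))
    return (X + X.T) / 2

He, Ho = rand_sym(8), rand_sym(8)      # stand-ins for H_even, H_odd
H = He + Ho

errs = []
for tau in (0.02, 0.01):
    trotter = expmh(He, tau) @ expmh(Ho, tau)
    errs.append(np.linalg.norm(expmh(H, tau) - trotter))

# leading Trotter error ~ (tau^2 / 2) * ||[He, Ho]||, so halving tau
# should reduce the error by roughly a factor of four
ratio = errs[0] / errs[1]
assert 3.5 < ratio < 4.5
```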
Stacking many such rows of e^{−τh} gates yields a 2D tensor network (space × imaginary time) whose contraction represents lim_{β→∞} e^{−βH} = |ψ_GS⟩⟨ψ_GS|.
Summary of the set-up: a many-body system is encoded as a tensor network. For the partition function of a 2D classical statistical model, tensors encode Boltzmann weights and the contraction equals a weighted sum over all microstates; for the Euclidean path integral of a 1D quantum model, a row of tensors encodes a small evolution in imaginary time and the contraction equals a weighted sum over all trajectories.
Next: previous approaches, Levin and Nave's Tensor Renormalization Group (LN-TRG) and its conceptual and computational problems.
Tensor renormalization group (LN-TRG) [Levin, Nave (2006)] is a method for coarse-graining tensor networks based upon blocking and truncation steps.

Example of blocking + truncation: 2D classical Ising at the critical temperature.
• take a (4 × 4) block of tensors from the partition function
• contract to a single tensor; each (16-dim) index describes the state of four classical spins, so grouping pairs of indices gives a matrix of dimension χ = 256 for the singular value decomposition A = W S V
• can the block tensor be truncated?
[Figure: spectrum of singular weights of the blocked tensor, spanning roughly ten orders of magnitude.]

Only keeping the largest 30 singular values yields a truncation error of ~10⁻³, so yes, the block tensor can be truncated!
LN-TRG works through alternating truncated SVD and contraction steps. Starting from the initial network with bond dimension χ, a truncated SVD A ≈ W S V discards singular values smaller than the desired truncation error δ, keeping χ̃ ≪ χ² values; contraction then yields a coarser network with bond dimension χ̃.
Alternative view of the truncation: implement it through a projector of the form W W† for isometric W (W† W = I). The projector acts as an (approximate) resolution of the identity, and the isometry is chosen to minimise the truncation error

  δ ≡ ‖ (pair of A tensors) − W W† (pair of A tensors) ‖.
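The equivalence between truncated SVD and an isometric projector W W† can be sketched in a few lines (illustrative matrix with a fast-decaying spectrum; not data from the talk):

```python
import numpy as np

rng = np.random.default_rng(3)
chi, chi_t = 16, 4                         # original and truncated dimension
# build a matrix with rapidly decaying singular values, mimicking the
# spectrum of a reshaped block tensor away from criticality
U, _ = np.linalg.qr(rng.random((chi, chi)))
V, _ = np.linalg.qr(rng.random((chi, chi)))
s = 2.0 ** -np.arange(chi)
M = (U * s) @ V.T

u, sv, vt = np.linalg.svd(M)
M_trunc = (u[:, :chi_t] * sv[:chi_t]) @ vt[:chi_t]

# the truncation error delta equals the norm of the discarded weights
delta = np.linalg.norm(M - M_trunc)
assert np.isclose(delta, np.linalg.norm(sv[chi_t:]))

# the same truncation via a projector w w^T built from an isometry w
w = u[:, :chi_t]
assert np.allclose(w.T @ w, np.eye(chi_t))   # w is isometric: w^T w = I
assert np.allclose(w @ (w.T @ M), M_trunc)   # w w^T acts as the truncation
```

Keeping the top singular vectors is exactly the isometry that minimises the truncation error in this norm.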
One full LN-TRG step [Levin, Nave (2006)]: a truncated SVD (equivalently, inserting projectors w w†) followed by contraction maps the initial network to a coarser network.
Repeating the step generates an RG flow in the space of tensors: A⁽⁰⁾ → A⁽¹⁾ → ⋯ → A⁽ˢ⁾ → ⋯
• is this RG flow sustainable?
• does it converge to the expected fixed points?
RG flow at criticality (2D Ising) with LN-TRG:

[Figure: singular value spectrum of A⁽ˢ⁾ across several RG steps s; the spectrum broadens with each step.]

Bond dimension χ required for truncation error < 10⁻³, and cost of iteration, O(χ⁶):
• χ ~ 10: cost 1 × 10⁶
• χ ~ 20: cost 6 × 10⁷
• χ ~ 40: cost 4 × 10⁹
• χ > 100: cost > 10¹²

The cost of LN-TRG scales exponentially with the RG iteration!
Consider the 2D classical Ising ferromagnet at temperature T, and encode the partition function as a tensor network of tensors A_T⁽⁰⁾. Phases:
• T < T_C: ordered phase
• T = T_C: critical point (correlations at all length scales)
• T > T_C: disordered phase

A proper RG flow A⁽⁰⁾ → A⁽¹⁾ → A⁽²⁾ → ⋯ should converge to one of three fixed-point tensors: A_order (T < T_C), A_crit (T = T_C), or A_disorder (T > T_C).
Numerical results with LN-TRG: the flows |A⁽¹⁾|, |A⁽²⁾|, |A⁽³⁾|, |A⁽⁴⁾| at T = 1.1 T_c and T = 2.0 T_c, both in the disordered phase, converge to different fixed-point tensors rather than to a unique disordered fixed point. LN-TRG does not give a proper RG flow: the flow does not converge as expected!
LN-TRG generates an RG flow in the space of tensors and can be very powerful and useful numerically, but:
• it does not reproduce a proper RG flow
• it suffers computational breakdown when near or at criticality

Can we understand this?
The basic step of LN-TRG: the isometries remove some, but not all, of the short-range correlated degrees of freedom; the short-ranged correlations that are not removed propagate to the next length scale. Example: corner double line (CDL) tensors.
Fixed points of LN-TRG

Imagine A is a special tensor such that each index can be decomposed as a product of smaller indices,

  A_ijkl = A_(i₁i₂)(j₁j₂)(k₁k₂)(l₁l₂),

such that certain pairs of indices are perfectly correlated:

  A_(i₁i₂)(j₁j₂)(k₁k₂)(l₁l₂) ≡ δ_{i₂j₁} δ_{j₂k₁} δ_{k₂l₁} δ_{l₂i₁}

These are called corner double line (CDL) tensors, and they are fixed points of TRG. A partition function built from CDL tensors represents a state with only short-ranged correlations.
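The CDL structure can be made concrete in a few lines. The sketch below (my own check, not from the talk) builds a CDL tensor from the definition and verifies the property behind its fixed-point behaviour: reshaped into a matrix, a CDL tensor with combined bond dimension χ = d² has rank only χ rather than χ², so a truncated SVD keeping χ values reproduces it exactly.

```python
import numpy as np

d = 2                                    # dimension of each "double line"
delta = np.eye(d)

# CDL tensor: A_{(i1 i2)(j1 j2)(k1 k2)(l1 l2)}
#           = delta_{i2 j1} delta_{j2 k1} delta_{k2 l1} delta_{l2 i1}
A = np.einsum('bc,de,fg,ha->abcdefgh', delta, delta, delta, delta)

chi = d * d                              # combined bond dimension
M = A.reshape(chi * chi, chi * chi)      # group indices as (ij) vs (kl)

# the reshaped CDL tensor has rank chi, not chi^2: a truncated SVD
# keeping chi singular values is exact on it, so the CDL structure
# passes through the truncation unharmed
assert np.linalg.matrix_rank(M) == chi
```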
A single iteration of LN-TRG maps a CDL tensor to a new CDL tensor: some short-ranged correlations always remain under LN-TRG!
LN-TRG removes some short-ranged correlations from a short-range correlated network, but others are artificially promoted to the next length scale. It therefore:
• always retains some of the microscopic (short-ranged) details
• can suffer computational breakdown when near criticality

Is there some way to 'fix' tensor renormalization such that all short-ranged correlations are addressed?
Next: the new approach, Tensor Network Renormalization (TNR), with proper removal of all short-ranged degrees of freedom via disentanglers.
Tensor Network Renormalization, arXiv:1412.0732. Previous RG schemes for tensor networks were based upon blocking, i.e. isometries responsible for combining and truncating indices, analogous to the tree tensor network (TTN). But blocking alone fails to remove short-ranged degrees of freedom. Can one incorporate some form of unitary disentangling, as in the multi-scale entanglement renormalization ansatz (MERA), into a tensor RG scheme?
The blocking part proceeds as before: insert isometries w w† and contract.
The TNR coarse-graining step, applied to the initial square-lattice tensor network:
• exact step: insert conjugate pairs of unitaries ("disentanglers"), u†u = I
• approximate step: insert conjugate pairs of isometries, w w† (with w†w = I)
• exact step: contract, yielding the coarser network
Is it possible that the additional disentangling step is enough to remove all short-ranged degrees of freedom?
Corner double line tensors revisited. Isometries only (LN-TRG): they can remove some short-range correlated degrees of freedom, but fail to remove others.
Example, corner double line (CDL) tensors: with isometries only (LN-TRG), a CDL tensor maps to a new CDL tensor; with isometries and disentanglers (TNR), a CDL tensor maps to an uncorrelated (product) tensor.
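A toy calculation (my own illustration) of what a disentangler does: take a tensor with a perfectly correlated pair of legs, a caricature of one "double line" of a CDL tensor, and remove the correlation with an explicitly constructed unitary.

```python
import numpy as np

d = 2
rng = np.random.default_rng(4)
x, y = rng.random(d), rng.random(d)

# toy tensor T_{abcd} = x_a delta_{bc} y_d: legs b and c are perfectly
# correlated, a caricature of the double lines inside a CDL tensor
T = np.einsum('a,bc,d->abcd', x, np.eye(d), y)

# across the cut (a,b)|(c,d), the correlated pair gives Schmidt rank d
sv0 = np.linalg.svd(T.reshape(d * d, d * d), compute_uv=False)
assert np.sum(sv0 > 1e-12) == d

# disentangler: a unitary on the combined (b,c) space that rotates the
# correlated pair state vec(I)/sqrt(d) onto the product state |00>
phi = np.eye(d).reshape(-1) / np.sqrt(d)
e0 = np.zeros(d * d); e0[0] = 1.0
v = (phi - e0) / np.linalg.norm(phi - e0)
u = np.eye(d * d) - 2.0 * np.outer(v, v)      # Householder reflection
assert np.allclose(u @ phi, e0)

# apply u across legs (b, c): the correlation is removed entirely and
# the tensor becomes a product, Schmidt rank 1 across the same cut
Tu = np.einsum('pq,aqd->apd', u, T.reshape(d, d * d, d)).reshape(d, d, d, d)
sv1 = np.linalg.svd(Tu.reshape(d * d, d * d), compute_uv=False)
assert np.sum(sv1 > 1e-12) == 1
```

After the unitary, a truncation to bond dimension 1 across that cut is exact, which is the mechanism the slides describe.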
Under TNR, a network of CDL (short-range correlated) tensors coarse-grains into a trivial (product) network, as desired; previous tensor RG leaves it short-range correlated.
𝑢†
= 𝑢
𝑤 𝑤†
≈
Tensor Network Renormalization arXiv:1412.0732
initial square lattice tensor network:
coarser network:
slight modification: we want to include the minimal amount of
disentangling (sufficient to address all short-range degrees of freedom)
=
The full TNR iteration on the initial network: insert disentanglers u and isometries w, perform a singular value decomposition, then contract to obtain the coarser tensor A⁽ˢ⁺¹⁾ from A⁽ˢ⁾.
Optimization: choose the unitary u and the isometry w to minimize the truncation error δ, defined as the norm of the difference between the original patch of A tensors and the patch with the inserted u, w, w† tensors.
Tensor network renormalization (TNR): RG for tensor networks designed to address all short-ranged degrees of freedom at each step. It works in simple examples (networks with only short-ranged correlations); does it work in more challenging and interesting cases, such as critical systems, which possess correlations at all length scales?
Next: benchmark results.
Benchmark numerics: 2D classical Ising model on a lattice of size 2¹² × 2¹².

[Figure: error in free energy density δf versus temperature T across T_C, comparing χ = 4 LN-TRG, χ = 4 TNR, χ = 8 LN-TRG, and χ = 8 TNR; TNR achieves a smaller error at equal bond dimension, especially near T_C.]

[Figure: spontaneous magnetization M versus temperature T; the χ = 4 TNR result closely tracks the exact curve through the transition at T_C.]
Old approach (LN-TRG) vs new approach (TNR): does tensor RG give a sustainable RG flow?

[Figure: singular value spectrum of A⁽ˢ⁾ along the RG flow at criticality, for LN-TRG and for TNR; the TNR spectra remain stable across RG steps.]

Bond dimension χ required to maintain fixed truncation error (~10⁻³):
• LN-TRG: ~10, ~20, ~40, >100 (exponential growth)
• TNR: ~10, ~10, ~10, ~10 (constant)

Computational costs per iteration:
• LN-TRG, O(χ⁶): 1 × 10⁶, 6 × 10⁷, 4 × 10⁹, > 10¹² (exponential)
• TNR, O(χ⁶) up to a constant factor from the optimization: 5 × 10⁷ at every step (constant)
Recall the proper RG flow expected for the 2D classical Ising ferromagnet: convergence to A_order for T < T_C, A_crit at T = T_C, and A_disorder for T > T_C.
Proper RG flow, 2D classical Ising. Old approach (LN-TRG): at T = 1.1 T_c and T = 2.0 T_c, the CDL-tensor fixed points still contain microscopic "baggage". New approach (TNR): at the same temperatures, the fixed points contain only universal information about the phase.
With TNR, the flow |A⁽¹⁾|, |A⁽²⁾|, |A⁽³⁾|, |A⁽⁴⁾| converges to one of three RG fixed points, consistent with a proper RG flow:
• T = 0.9 T_c (sub-critical): ordered (Z₂) fixed point
• T = T_c (critical): critical (scale-invariant) fixed point
• T = 1.1 T_c (super-critical): disordered (trivial) fixed point
A more difficult case, T = 1.002 T_c (very close to criticality), with TNR bond dimension χ = 4: the flow is tracked through twelve RG steps, |A⁽¹⁾| through |A⁽¹²⁾|.
At the critical point T = T_c, with TNR bond dimension χ = 4: the flow |A⁽¹⁾| through |A⁽¹²⁾| at the critical (scale-invariant) fixed point.
Summary
• We have discussed the implementation of real-space RG for tensor networks.
• Demonstrated that previous methods (e.g. Levin and Nave's TRG) do not generate a proper RG flow; the cause is a failure to address all short-ranged degrees of freedom at each RG step.
• Tensor Network Renormalization (arXiv:1412.0732) uses disentangling (u, u†) to address all short-ranged degrees of freedom at each RG step, giving a proper RG flow (the correct structure of RG fixed points) and a computationally sustainable RG flow.
• Future work: implementation in higher dimensions, for contraction of PEPS, for the study of impurity CFTs...
Next: extensions.
Tensor Renormalization Group (LN-TRG) vs Tensor Network Renormalization (TNR) is analogous to the tree tensor network (TTN) vs the multi-scale entanglement renormalization ansatz (MERA). Indeed, TNR yields the MERA (Evenbly, Vidal, arXiv:1502.05385).
TNR yields the MERA (Evenbly, Vidal, arXiv:1502.05385). Consider the path integral for |ψ_GS⟩: a 1D lattice in space, with evolution in imaginary time, as a network with PBC. What about open boundaries?
• Coarse-grain while leaving the boundary indices untouched.
• Disentanglers and isometries are inserted in conjugate pairs, eventually becoming part of the coarse-grained tensors.
• But a row of unpaired disentanglers and isometries remains on the open boundary...
TNR applied to a tensor network with an open boundary generates a MERA on that boundary!
TNR maps the exact representation of the ground state |ψ_GS⟩ as a path integral (with open boundary) to an approximate representation of the ground state as a MERA.
Applied to e^{−βH} with open boundaries, TNR likewise yields a MERA for the finite-temperature thermal state.
Tensor network with an open 'hole': iterate TNR, coarse-graining as much as possible subject to leaving the indices around the hole untouched.
After many iterations, the coarse-grained network around the hole can be redrawn in a layered form.
TNR for a network with a hole realizes, in the tensor network setting, the logarithmic transformation of CFT mapping the punctured plane to the cylinder.
Scaling dimensions from the partition function of critical Ising: diagonalizing the resulting transfer operator at χ = 4 reproduces the exact scaling dimensions 0, 1/8, 1, 1+1/8, 2, 2+1/8, 3, 3+1/8, 4, 4+1/8, ... (Evenbly, Vidal, arXiv:1502.05385).