TRANSCRIPT
Quantitative Modeling of Complex Systems
Machine learning
Many computer programs and apps already have machine learning algorithms built in.
In computer science, machine learning is concerned with algorithms that allow for data analytics, most prominently dimensionality reduction and feature extraction.
Examples: spam filters, face recognition, voice recognition.
Machine learning
digital assistants, self-driving cars
Applications of machine learning techniques are booming and poised to enter our daily lives.
Machines at play
1996: G. Kasparov vs. IBM’s Deep Blue · 2016: L. Sedol vs. Google’s AlphaGo
Machine learning techniques can make computers play.
A computer at play is probably one of the most striking realizations of artificial intelligence.
How do machines learn?
What is it that they can learn? Can we control what they learn?
How can we benefit from machine learning?
[figure: biological neural network vs. artificial neural network with inputs x1 … x5]
Machine Learning Day 2017
Organizers
Prof. Dr. Simon Trebst, Theoretical Physics
Prof. Dr. Achim Tresch, Bioinformatics and Computational Biology
Dr. Michael von Papen, Competence Area 3: Quantitative Modeling of Complex Systems
Time  | Title                                                                    | Speaker                                | Institute
09:00 | Intro                                                                    | Simon Trebst                           |
09:15 | Quantum Machine Learning                                                 | Simon Trebst                           | Theoretical Physics
09:45 | kernDeepStackNet: An R package for tuning kernel deep stacking networks | Thomas Welchowski and Matthias Schmid  | IMBIE, Bonn
10:15 | coffee break
10:45 | Applying Machine Learning to Text Mining                                 | Jürgen Hermes                          | Department of Linguistics
11:15 | Machine learning and the social sciences                                 | Jörn Grahl                             | Digital Transformation and Value Creation
11:45 | lunch break
13:15 | Compressed Sensing for Sparse and Low-Rank Models                        | David Gross                            | Theoretical Physics
13:45 | Sparse PCA and convex optimization                                       | Frank Vallentin                        | Mathematical Institute
14:15 | coffee break
14:45 | Reconstruction of Biological Networks                                    | Achim Tresch                           | Bioinformatics and Computational Biology
15:15 | Illuminating Genetic Networks with Random Forest                         | Andreas Beyer                          | CECAD
15:45 | End
Machine learning quantum phases of matter
Simon Trebst, Institute for Theoretical Physics
Machine Learning Day, Cologne, April 2017
25.–27. April 2017, University of Cologne, Institute for Theoretical Physics
Quantum Machine Learning
State-of-the-art machine learning techniques have recently attracted avid interest from the statistical physics community for their capacity to distinguish different phases of quantum matter in an automated way. This two-day mini-workshop will provide an informal setting for discussions amongst numerical practitioners who have recently started to apply machine learning techniques to quantum statistical physics problems. Besides short talks there will be ample time for interactions amongst the attendees.
Speakers
Peter Bröcker, Cologne
Giuseppe Carleo, Zurich
Maciej Koch-Janusz, Zurich
Roger Melko, Waterloo
Titus Neupert, Zurich
Evert van Nieuwenburg, Caltech
Yi (Frank) Zhang, Cornell
artificial neural networks
Artificial neural networks mimic biological neural networks (albeit at a much smaller scale).
They allow for an implicit knowledge representation, which is built up in supervised or unsupervised learning settings.
artificial neural networks – artificial neurons
example – Should I skip the first talk (or sleep in)?
[figure: artificial neuron with three binary inputs x1 = “quantum stuff?”, x2 = “free coffee?”, x3 = “9am – really?”, weights w1, w2, w3 (values shown: +3, −2, +2), a bias b, and output “sleep in”]
binary inputs, output $= \Theta(\vec w \cdot \vec x - b)$
[plot: step function $\Theta(z)$, jumping from 0 to 1 at $z = 0$]
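A minimal sketch of this threshold neuron (my own illustration; the weight-to-input assignment is not spelled out on the slide, so the values below are only indicative):

```python
import numpy as np

def neuron(x, w, b):
    """Binary threshold neuron: output Theta(w.x - b)."""
    return int(np.dot(w, x) - b > 0)

# binary inputs: [quantum stuff?, free coffee?, 9am - really?]
w = np.array([-2.0, -2.0, 3.0])   # indicative weights: an interesting talk and free
                                  # coffee argue against sleeping in, a 9am start for it
b = 0.0

print(neuron(np.array([1, 1, 1]), w, b))   # 0 -> go to the talk
print(neuron(np.array([0, 0, 1]), w, b))   # 1 -> sleep in
```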
artificial neural networks – Artificial neural networks are pretty powerful.
[figure: perceptron with inputs x1, x2, weights −2, −2 and bias 3, i.e. output $= \Theta(3 - 2x_1 - 2x_2)$ – a NAND gate]
NAND gate truth table:
x1  x2 | output
 0   0 |   1
 0   1 |   1
 1   0 |   1
 1   1 |   0
Like circuits of NAND gates, artificial neural networks can encode arbitrarily complex logic functions, thus allowing for universal computation.
But the power of neural networks really comes about by varying the weights such that one obtains some desired functionality.
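The NAND example above can be checked directly; a minimal sketch using the weights −2, −2 and the threshold 3 that appear on the slide:

```python
def nand_neuron(x1, x2):
    """Perceptron realizing a NAND gate: output = Theta(3 - 2*x1 - 2*x2)."""
    return int(3 - 2 * x1 - 2 * x2 > 0)

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "->", nand_neuron(x1, x2))   # reproduces the truth table 1, 1, 1, 0
```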
neural network architectures
[figure: feedforward network with an input layer (x1 … x5), hidden layers, and an output layer]
feedforward network
Neural networks with multiple hidden layers have been popularized as “deep learning” networks.
How to train a neural network?
• quadratic cost function
$C(\vec w, \vec b) = \frac{1}{2n} \sum_x \| y(x) - a(x) \|^2$
with $y(x)$ the desired output and $a(x)$ the actual network output for input $x$
• perceptrons: step activation $\Theta(z)$, jumping from 0 to 1 at $z = 0$
• sigmoid neurons: smooth activation interpolating between 0 and 1
Small adjustments on the level of a single neuron should result in small changes of the cost function.
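A small numerical illustration of this point (my own example, not from the talk): nudging a single weight barely changes a sigmoid neuron’s output, while a perceptron’s output can jump from 0 to 1.

```python
import numpy as np

def step(z):
    return np.heaviside(z, 0.0)          # perceptron (step) activation

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))      # smooth sigmoid activation

x, b = 1.0, 0.5
for w in (0.499, 0.501):                 # tiny adjustment of a single weight
    z = w * x - b
    print(f"w = {w}: step -> {step(z):.0f}, sigmoid -> {sigmoid(z):.4f}")
```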
How to train a neural network?
gradient descent
• quadratic cost function
$C(\vec w, \vec b) = \frac{1}{2n} \sum_x \| y(x) - a(x) \|^2$
with $y(x)$ the desired output and $a(x)$ the actual network output
• back propagation algorithm — Rumelhart, Hinton & Williams, Nature (1986)
an extremely efficient way to calculate all partial derivatives $\partial C / \partial w$ and $\partial C / \partial b$ needed for a gradient descent optimization
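A minimal sketch of gradient descent with analytically computed derivatives for a single sigmoid neuron and the quadratic cost above (full back propagation applies the same chain rule layer by layer); the toy data set is an assumption for illustration:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# toy data: 100 samples with 3 binary features, target = "majority of features on"
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(100, 3)).astype(float)
y = (X.sum(axis=1) >= 2).astype(float)

w, b, eta = np.zeros(3), 0.0, 0.5        # weights, bias, learning rate

for epoch in range(2000):
    a = sigmoid(X @ w + b)               # actual outputs a(x)
    # gradients of C(w, b) = 1/(2n) sum_x ||y(x) - a(x)||^2 via the chain rule
    delta = (a - y) * a * (1 - a)
    grad_w = X.T @ delta / len(X)        # dC/dw
    grad_b = delta.mean()                # dC/db
    w, b = w - eta * grad_w, b - eta * grad_b

print("weights:", w, "bias:", b)
print("accuracy:", ((sigmoid(X @ w + b) > 0.5) == y).mean())
```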
pattern recognition
[figure: handwritten digits 9 3 1 0 6 fed as input to the neural network]
Some 60 lines of code (Python/Julia) will do this for you with >95% accuracy.
[architecture: conv – pool – conv – pool – full – dropout – full; conventional matrix reductions]
Much higher accuracy possible for networks with additional convolutional layers.
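As an illustration of the “~60 lines of code” claim, a much shorter sketch using Keras (a choice of library on my part; the talk does not specify one). A plain fully connected network of this kind typically reaches the quoted >95% test accuracy on the MNIST handwritten digits:

```python
from tensorflow import keras

# MNIST handwritten digits: 60,000 training and 10,000 test images (28x28 pixels)
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0   # scale pixel values to [0, 1]

# simple feedforward network: one hidden layer, softmax output over the 10 digits
model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(100, activation="relu"),
    keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(x_train, y_train, epochs=5, batch_size=32)
print("test accuracy:", model.evaluate(x_test, y_test)[1])
```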
convolutional neural networks
Convolutional neural networks look for recurring patterns using small filters.
filter → activation map
Slide filters across the image and create a new image based on how well they fit (element-wise matrix product of filter and image patch).
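A minimal sketch of this filter-sliding operation, assuming a grayscale image stored as a 2D numpy array:

```python
import numpy as np

def activation_map(image, filt):
    """Slide a small filter across the image; at each position, take the
    element-wise product with the image patch and sum it up."""
    H, W = image.shape
    h, w = filt.shape
    out = np.zeros((H - h + 1, W - w + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = image[i:i + h, j:j + w]
            out[i, j] = np.sum(patch * filt)    # element-wise matrix product
    return out

# example: a 3x3 filter that responds strongly to vertical edges
image = np.random.rand(8, 8)
filt = np.array([[1.0, 0.0, -1.0],
                 [1.0, 0.0, -1.0],
                 [1.0, 0.0, -1.0]])
print(activation_map(image, filt))
```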
GPUs & open-source codes
Machine learning quantum phases of matter
Quantum matter
water ice
superconductor
Bose-Einstein condensate
Model systems
interacting many-body system
pair-wise interactionsfavor alignment of “spins”
Ising model
paradigmatic model for a phase transition
[figure: spin configurations at high temperature, at the “critical” temperature, and at low temperature]
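To make this concrete, a minimal sketch (my own illustration, not code from the talk) of Metropolis Monte Carlo sampling of two-dimensional Ising configurations — the kind of “images” fed to the neural network below; the square-lattice transition sits at Tc ≈ 2.27 J:

```python
import numpy as np

def ising_snapshot(L=30, T=2.0, J=1.0, sweeps=200, seed=0):
    """Sample one L x L Ising spin configuration at temperature T using
    Metropolis single-spin-flip updates with periodic boundaries."""
    rng = np.random.default_rng(seed)
    spins = rng.choice([-1, 1], size=(L, L))
    for _ in range(sweeps * L * L):
        i, j = rng.integers(0, L, size=2)
        # sum over the four nearest neighbours (pair-wise interactions favor alignment)
        nn = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2.0 * J * spins[i, j] * nn          # energy cost of flipping spin (i, j)
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[i, j] *= -1
    return spins

# ordered snapshot well below Tc, disordered snapshot well above
print(abs(ising_snapshot(T=1.5).mean()), abs(ising_snapshot(T=3.5).mean()))
```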
Supervised learning approach
General setup: Consider some system which, as a function of some parameter λ, exhibits a phase transition between two phases.
1) train a convolutional neural network on representative “images” deep within the two phases
2) apply the trained network to “images” sampled elsewhere to predict phases + transition
What are the right images to feed into the neural network?
[figure: schematic phase diagram along λ — phase A, phase transition, phase B; step 1: train deep inside each phase (“train here”), step 2: predict phases by applying the neural network in between]
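A schematic sketch of the two-step protocol for the Ising example, reusing the ising_snapshot function from the sketch above and a simple fully connected classifier from scikit-learn (my choice of tooling; the talk uses a convolutional network): train on snapshots deep inside the ordered and disordered phases, then scan the temperature axis and watch where the averaged network output switches over.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# step 1: labelled training "images" deep within the two phases
#         (0 = ordered / low T, 1 = disordered / high T)
X_train = np.array([ising_snapshot(T=T, seed=s).ravel()
                    for T in (1.5, 3.5) for s in range(20)])
y_train = np.array([0] * 20 + [1] * 20)

clf = MLPClassifier(hidden_layer_sizes=(100,), max_iter=500)
clf.fit(X_train, y_train)

# step 2: apply the trained network elsewhere to locate the transition
#         (a real study would use many more, better-thermalized samples)
for T in np.arange(1.8, 3.0, 0.2):
    snaps = np.array([ising_snapshot(T=T, seed=100 + s).ravel() for s in range(5)])
    p = clf.predict_proba(snaps).mean(axis=0)
    print(f"T = {T:.1f}:  P(ordered) = {p[0]:.2f}  P(disordered) = {p[1]:.2f}")
```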
classical phases of matter
Machine learning of the thermal phase transition in the classical Ising model
[figure: output layer values vs. temperature T for L = 30, with one curve for the “T < Tc” output neuron and one for the “T > Tc” neuron]
FIG. 2. Detecting the critical temperature of the triangular Ising model through the crossing of the values of the output layer vs. T. The orange line signals the triangular Ising model Tc/J = 4/ln 3, while the blue dashed line represents our estimate Tc/J = 3.63581.
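A tiny helper (my own illustration, with made-up numbers) for extracting this crossing point from the two output-neuron curves by linear interpolation:

```python
import numpy as np

def crossing_temperature(T, out_low, out_high):
    """Temperature at which the 'T < Tc' and 'T > Tc' output curves cross."""
    diff = np.asarray(out_low) - np.asarray(out_high)
    k = np.where(np.diff(np.sign(diff)) != 0)[0][0]     # first sign change
    return T[k] - diff[k] * (T[k + 1] - T[k]) / (diff[k + 1] - diff[k])

T = np.array([3.0, 3.3, 3.6, 3.9, 4.2])                 # hypothetical scan
out_low = np.array([0.95, 0.85, 0.55, 0.25, 0.10])      # "T < Tc" neuron
out_high = 1.0 - out_low                                 # "T > Tc" neuron
print(crossing_temperature(T, out_low, out_high))        # ~3.65 for these numbers
```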
shown in Figure 3. In a conventional condensed-matter approach, the ground states and the high-temperature states are distinguished by their spin-spin correlation functions: power-law decay in the Coulomb phase at T = 0, and exponential decay at high temperature. Instead we use supervised learning, feeding raw Monte Carlo configurations to train a fully-connected neural network (Figure 1(A)) to distinguish ground states from high-temperature states. Figure 3(A) and Figure 3(B) display high- and low-temperature snapshots of the configurations used in the training of the model. For a square ice system with N = 2 × 16 × 16 spins, we find that a standard fully-connected neural network with 100 hidden units successfully distinguishes the states with a 99% accuracy. The network does so solely based on spin configurations, with no information about the underlying lattice – a feat difficult for the human eye, even if supplemented with a layout of the underlying Hamiltonian locality. These results indicate that the learning capabilities of neural networks go beyond the simple ability to encode order parameters, extending to the detection of subtle differences in higher-order correlation functions. As a final demonstration of this, we examine an Ising
Nature Physics (2017)
quantum phases of matter – Hubbard model on the honeycomb lattice
[figure: the two phases — semi-metal and SDW (spin density wave)]
quantum phase transition
Green’s functions are ideal objects/images for machine learning based discrimination of quantum phases.
[figure: network prediction (0 to 1) vs. interaction U (1 to 8) for system sizes L = 6, 9, 12, 15; reference points U = 1.0 (Dirac) and U = 16.0 (SDW); inset: zoom on the transition region 3.8 ≤ U ≤ 5.2 for L = 9 and L = 15]
unsupervised learning
Unsupervised learning
Employ the ability to “blindly” distinguish phases to map out an entire phase diagram with no prior knowledge about the phases.
Example: hardcore bosons / XXZ model on a square lattice
$H = -\sum_{\langle i,j \rangle} \left( S^+_i S^-_j + S^-_i S^+_j \right) + \Delta \sum_{\langle i,j \rangle} S^z_i S^z_j + h \sum_i S^z_i$
VOLUME 88, NUMBER 16  PHYSICAL REVIEW LETTERS  22 APRIL 2002
Finite-Temperature Phase Diagram of Hard-Core Bosons in Two Dimensions
Guido Schmid,1 Synge Todo,1,2 Matthias Troyer,1 and Ansgar Dorneich3
1Theoretische Physik, Eidgenössische Technische Hochschule Zürich, CH-8093 Zürich, Switzerland
2Institute for Solid State Physics, University of Tokyo, Kashiwa 277-8581, Japan
3Institut für Theoretische Physik, Universität Würzburg, 97074 Würzburg, Germany
(Received 4 October 2001; published 9 April 2002)
We determine the finite-temperature phase diagram of the square lattice hard-core boson Hubbard model with nearest neighbor repulsion using quantum Monte Carlo simulations. This model is equivalent to an anisotropic spin-1/2 XXZ model in a magnetic field. We present the rich phase diagram with a first order transition between a solid and superfluid phase, instead of a previously conjectured supersolid and a tricritical end point to phase separation. Unusual reentrant behavior with ordering upon increasing the temperature is found, similar to the Pomeranchuk effect in 3He.
DOI: 10.1103/PhysRevLett.88.167208  PACS numbers: 75.10.Jm, 05.30.Jp, 67.40.Kh, 74.25.Dw
A nearly universal feature of strongly correlated systems is a phase transition between a correlation-induced insulating phase, with localized charge carriers, and an itinerant phase. High-temperature superconductors [1], manganites [2], and the controversial two-dimensional (2D) “metal-insulator transition” [3] are just a few examples of this phenomenon. The 2D hard-core boson Hubbard model provides the simplest example of such a transition from a correlation-induced charge density wave insulator near half filling to a superfluid (SF). It is a prototypical model for preformed Cooper pairs [4], spin flops in anisotropic quantum magnets [5,6], SF helium films [7], and supersolids [8,9].
In simulations of this model, which does not suffer from the negative sign problem of fermionic simulations, we can investigate some of the pertinent questions about such phase transitions: what is the order of the quantum phase transitions in the ground state and the finite-temperature phase transitions? Are there special points with dynamically enhanced symmetry [10]? Can there be coexistence of two types of order (such as a supersolid – coexisting solid and superfluid order)? Answers to these questions also provide insight into the other problems alluded to above.
The Hamiltonian of the hard-core boson Hubbard model we study is
$H = -t \sum_{\langle i,j \rangle} \left( a^\dagger_i a_j + a^\dagger_j a_i \right) + V \sum_{\langle i,j \rangle} n_i n_j - \mu \sum_i n_i ,$  (1)
where $a^\dagger_i$ ($a_i$) is the creation (annihilation) operator for hard-core bosons, $n_i = a^\dagger_i a_i$ is the number operator, V is the nearest neighbor Coulomb repulsion, and µ is the chemical potential. This model is equivalent to an anisotropic spin-1/2 XXZ model with Jz = V and |Jxy| = 2t in a magnetic field h = 2V − µ. The zero field (and zero magnetization mz = 0) case of the spin model corresponds to the half filled bosonic model (density ρ = 1/2) at µ = 2V. Throughout this Letter we will use the bosonic language, and refer to the corresponding quantities in the spin model where appropriate. Because of the absence of efficient Monte Carlo algorithms for classical magnets in a magnetic field there are still many open questions even in the classical version of this model, which was only studied by a local update method [11].
In Fig. 1 we show the ground-state phase diagram [6,9,12]. For dominating chemical potential µ the system is in a band insulating state (ρ = 0 and ρ = 1, respectively), while it shows staggered checkerboard charge order (ρ = 1/2) for dominating repulsion V. These solid phases are separated from each other by a SF. Earlier indications for a region of supersolid phase between the checkerboard solid and SF phase turned out to be due to phase separation at this transition, which is of first order at T = 0 [6,9,12].
All of these phases extend to finite temperatures. On the strong repulsion side the hard-core boson Hubbard model is equivalent to an antiferromagnetic Ising model at t = 0, and the insulating behavior extends up to a finite-temperature phase transition of the Ising universality class
[FIG. 1. Ground-state phase diagram of the hard-core boson Hubbard model in the (µ − 2V)/t vs. V/t plane, showing the empty (ρ = 0), full (ρ = 1), checkerboard solid (ρ = 1/2), and superfluid phases and the Heisenberg point. The dashed line indicates the cut along which the finite-temperature phase diagram shown in Fig. 2 was calculated.]
PRL 88, 167208 (2002)
[figure: network input $\langle S^+_i S^-_j \rangle + \langle S^-_i S^+_j \rangle$ correlations in real space, window moved in h direction; learning rate 0.0001, window width 0.5, fully connected network with 1024 neurons]
[figure: the same analysis, now using $\langle S^z_i S^z_j \rangle$ correlations in real space as the network input, window moved in h direction; learning rate 0.0001, window width 0.5, fully connected network with 1024 neurons]
Summary
A Monte Carlo + machine learning approach can be used to distinguish phases of interacting many-body systems in (quantum) statistical physics.
The approach can be adapted to an unsupervised setting for quickly and semi-automatically mapping out phase diagrams of (quantum) many-body systems.
Can we identify even some of the more subtle quantum mechanical properties such as long-range entanglement and (non-local) topological order?
Thanks!