
A journey through time with the Higgs particle

Gordon J. Aubrecht, II

Columbus Science Pub, 2 October 2012

It was sometime in the fall of 1967 that I attended a seminar by a brash young physicist from MIT. From the way the senior physicists at Princeton (where I was a graduate student at the time) treated Steven Weinberg, it was clear that this was someone who had a future in theoretical physics.

It wasn’t clear they thought he would go on to win the Nobel Prize (in 1979, with Abdus Salam) for his contributions to building what is now called the Standard Model, but I and the other graduate students there knew he was considered special.

That afternoon, I learned about spontaneous symmetry breaking. I recall Weinberg talking about a scalar particle and how this scalar particle could give mass to the vector bosons. I remember him doing calculations on the blackboard and coming out with a mass parameter μ from the symmetry breaking. The idea is that the ground-state equilibrium is unstable, and any perturbation results in the field falling toward lower potential energy. It was my first introduction to what has been called the “wine-bottle” or “Mexican-hat” potential.

First, I’d like to look ahead to the Standard Model. This is the chart from CPEP. I am the former chair of the Contemporary Physics Education Project (CPEP). We created this chart in the late 1980s, as it became clear to all of us that there was such a thing as a “Standard Model.”

The frantic 1970s

Going back to the 1930s, there was a weak interaction theory that was originally developed by Enrico Fermi for beta decay. In fact, it is still known to physics students as Fermi’s Golden Rule: the probability of a transition from initial to final state depends on the density of states and the square of the interaction matrix element between initial and final states.
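In symbols (the standard textbook form; the talk stated the rule in words), Fermi’s Golden Rule reads

Γ(i → f) = (2π/ħ) |⟨f|H′|i⟩|² ρ(E_f),

where ⟨f|H′|i⟩ is the interaction matrix element between initial and final states and ρ(E_f) is the density of final states.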

Of course, in 1934, people didn’t know the matrix element; it would take until the 1950s to develop the ideas that led to the proper beta-decay spectrum. The Fermi model of weak interactions had problems when applied beyond relatively low energies: extrapolated more generally, the prediction led to a transition probability that grew without bound, because of the constant matrix element and the growing density of states. The matrix element, as we have since learned, is simply a low-energy approximation in the bigger model.

Theorists showed that one could calculate finite values through a process of renormalization, reminiscent of the work of Richard Feynman, Julian Schwinger, Sin-Itiro Tomonaga, and many others on quantum electrodynamics (QED). The development of QED during the 1940s showed how renormalization could work to eliminate the infinities in electromagnetism.

Electromagnetism can be explained in terms of exchange of photons. Of course, photons are massless, and their exchange, through matrix elements involving propagators that go as the inverse square of the four-momentum (1/p²), led to calculations that found infinite values for physical parameters.

The general idea of exchange of particles came from what is known as gauge invariance in electromagnetism, and the particles exchanged could be referred to generally as gauge particles. Photons are the gauge particles for electromagnetism.

A way to get rid of the pesky infinities in theories of interactions is to realize that interactions are mediated by gauge particles. This is an old idea in particle physics. Nuclear physicists considered exchange of virtual (massive) pions and other virtual particles as the way the strong interaction worked inside nuclei.

a) A neutron and a proton at time t₁. b) At time t₂ > t₁, the proton emits a virtual positively charged pion and becomes a neutron. c) At time t₃ > t₂, the neutron absorbs the pion, becoming a proton. d) At time t₄ > t₃, there is again a neutron and a proton.

Feynman developed a graphical method of calculation for QED. The exchange on the last slide could be calculated from this diagram (with appropriate rules).

The idea that interactions could proceed by exchange of particles that were massive was applied to the weak interactions. Simplifying the situation immensely: if there were massive gauge particles similar to the photon, the infinities would go away.

If this idea were to work, for example in the case of the 1930s “poster child” for the weak interaction, nuclear beta decay, this would mean that nuclear beta decay and the scattering of a neutrino from a neutron to produce a proton and an electron would be related.

a) The Feynman diagram describing the process in which a neutron (symbolized by d) and a neutrino scatter through the weak interaction, producing a proton and an electron. b) The Feynman diagram for nucleon beta decay, in which a constituent of the neutron (d) is changed by the weak interaction into a constituent of a proton (u) and produces a W⁻, which then decays into an electron and an antineutrino (note that the antineutrino line points to the right). The u and d are quark constituents of the nucleons. There is a propagator for the W⁻, and there are two vertices (u-d-W⁻ and e⁻-νₑ-W⁻) included in each of these Feynman diagrams.

In the 1970s, such an interaction would be labeled a charged-current interaction (the W⁻ being charged and being exchanged). We could instead have drawn the diagrams for electron capture, in which an electron plus a proton becomes a neutron plus an electron neutrino, and for positron emission, which proceed through exchange of the W⁺.

The exchanged virtual particles’ propagators would be of the form 1/(p² − m²). The m² term would mean approximately constant matrix elements at low energy (p² small compared to m²), while at high energy the p² part of the term dominates, driving the propagator toward zero and making the matrix elements vanishingly small.
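As a rough numerical illustration of that behavior (this little script is mine, not part of the talk; the W mass of about 80.4 GeV used here is the modern value, unknown in the period described):

```python
# Sketch: behavior of a massive propagator 1/(p^2 - m^2) (units GeV, c = 1).
# m = 80.4 GeV is the modern W mass, used purely for illustration.
m = 80.4
for p2 in (1.0, 10.0**2, 1000.0**2):  # p^2 = 1 GeV^2, (10 GeV)^2, (1 TeV)^2
    print(f"p^2 = {p2:>9.0f} GeV^2  ->  1/(p^2 - m^2) = {1.0 / (p2 - m**2):+.3e}")
```

At low p² the result is essentially the constant −1/m² (the Fermi limit); at high p² it falls off as 1/p².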

There must also be indications of something beyond exchange of a charged gauge boson, as in nuclear beta decay. In other instances of the weak interaction, there could possibly be exchange of a neutral gauge boson, similar to the photon in its lack of electric charge, but having a nonzero mass. In the parlance of the 1970s, the former would be called a charged-current interaction and the latter a neutral-current interaction (the currents form the matrix element).

Such a particle could be produced by colliding electrons and positrons together and seeing, for example, μ⁺μ⁻ pairs emerge, or by scattering a neutrino from an electron and producing the same thing, or producing a μ⁻ and a neutrino. The first experimental evidence for the electroweak theory was the discovery of weak neutral currents, first seen in 1973 by the Gargamelle collaboration at CERN in neutrino–nucleon scattering and antimuon-neutrino–electron scattering, and immediately thereafter by the Harvard-Penn-Wisconsin collaboration at Fermilab. The exchanged gauge particle is known as the Z.

Thus, the experimental result supported electroweak theory, in which the photon, the W±, and the Z are the gauge bosons. The mystery was why the photon was massless, while the W± and Z had masses.

This is where the spontaneous symmetry breaking comes in. Massless particles have two states of polarization, which we usually label clockwise and counterclockwise. Massive particles also have a longitudinal polarization, for a total of three states of polarization. The “extra” state is supplied through the acquisition of mass by the Higgs mechanism. This gives mass to the W± and Z particles.

To trust the model, the W± and Z would have to be found experimentally. They were found in experiments at the CERN SPS (Super Proton Synchrotron) in the early 1980s, and the 1984 Nobel Prize was given for their discovery.

The “normal” massive particles’ masses arise largely through the kinetic energy of the bound constituent quarks. This is very different from the idea of the Higgs mechanism.

Consider a field for a particle whose original value puts it on an extremum of the potential. We named particles like these Higgs particles. The fields that represent them spontaneously move away from their original value to a new value at ϕ = μ, which has a lower potential energy.

A three-dimensional vision of the potential: V(ϕ) = λ(ϕ*ϕ − μ²)².
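To make the shape concrete, here is a minimal numerical sketch (mine, not from the talk) of a real slice through that potential, with illustrative values λ = μ = 1:

```python
import numpy as np

# Mexican-hat / wine-bottle potential V(phi) = lam*(phi^2 - mu^2)^2,
# evaluated along a real slice through the complex phi plane.
lam, mu = 1.0, 1.0                    # illustrative values, not physical ones
phi = np.linspace(-2.0, 2.0, 401)
V = lam * (phi**2 - mu**2)**2
print("V at phi = 0:", V[200])                 # local maximum: unstable
print("minimum at phi =", phi[np.argmin(V)])   # sits at |phi| = mu, not 0
```

The field starts on the unstable hump at ϕ = 0 and “falls” to the ring of minima at |ϕ| = μ.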

In this process, known as the Higgs mechanism, the fields disappear when they “fall” into the lower potential and through their disappearance become responsible for creating the masses, and thus the longitudinal polarizations, of the gauge bosons. Thus, a key part of verification of electroweak unification is the appearance of Higgs bosons in experiments.

A new field is made from the Higgs:

ϕ = H + μ

There is an interaction term ϕ*ϕA² = μ²A² + …, which acts like a mass term (½m² = μ²). The Higgs field itself also has a mass, which comes from the potential V(ϕ).
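As a sketch of where those terms come from (my algebra, treating ϕ as a single real field for simplicity; the talk did not show the expansion): substituting ϕ = H + μ into V(ϕ) = λ(ϕ² − μ²)² gives

V = λ(2μH + H²)² = 4λμ²H² + 4λμH³ + λH⁴,

and the quadratic piece 4λμ²H² plays the role of a mass term for the Higgs field H, while the shift by μ in ϕ*ϕA² is what produces the μ²A² gauge-boson mass term above.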

From the 1970s to now, the Higgs was a “holy grail” of experimental searches. Up until Independence Day 2012, no such scalar particle had been found.

A particle accelerator (the LHC) was built to try to find it.

LHC stands for Large Hadron Collider.

What do we mean by the “hadron” in the Large Hadron Collider?

There are two sorts of particles shown on the chart I hope I gave you: leptons and hadrons. They are completely different from one another in their properties, but all leptons have spin n + 1/2 and do not interact strongly. All hadrons interact strongly and can have either integer spin or spin n + 1/2. Leptons interact gravitationally, electromagnetically, and via the weak interaction. Hadrons are the only particles that interact via the strong interaction; quarks, the constituents of hadrons, also interact strongly.

This is important: the hadrons act over really short distances, distances of a femtometer (10⁻¹⁵ m).

The Large Hadron Collider (LHC) is a place where interactions can occur through particle collisions. According to Wikipedia, “The Large Hadron Collider (LHC) is the world’s largest and highest-energy particle accelerator, intended to collide opposing particle beams, protons at an energy of 7 TeV/particle or lead nuclei at 574 TeV/particle.”

The LHC is a circular accelerator ring 27 km around. Particles are steered in both directions using superconducting magnets and made to collide in several regions loaded with detectors like the ATLAS detector.

Because the ring is so big, the particles’ energies are immense (10 TeV), and the particles are traveling at essentially the speed of light. A proton’s rest energy is E = mc² ≈ 1 GeV, so the Lorentz factor is γ = 10 TeV/(1 GeV) = 10,000, giving v ≈ c − 1.5 m/s.
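A quick check of that arithmetic (a sketch of mine, not shown in the talk):

```python
from math import sqrt

# For E = gamma * m * c^2 with E = 10 TeV and m*c^2 = 1 GeV, gamma = 10^4.
c = 2.998e8          # speed of light, m/s
gamma = 1.0e4
v = c * sqrt(1.0 - 1.0 / gamma**2)
print(f"c - v = {c - v:.2f} m/s")   # ~1.5 m/s below c, as claimed
```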

Let’s think a bit.

The resolution of objects depends on the wavelength of the probing object. A wave of wavelength λ bends around objects of size d. Waves and particles are no more than different evocations of some underlying reality. Particles have momentum p that is related to the wavelength λ: p = h/λ.

Because p = h/λ, because λ must be comparable in size to the object (d), and because the energy of a particle is given by E = √(p²c² + m²c⁴) = γmc², we see that to “see” a small object (d very small), p must be very large, and so in turn E must be very large.
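Putting numbers in (again a sketch of mine; for an ultrarelativistic particle pc ≈ E, so λ = h/p ≈ hc/E, with hc ≈ 1.24 GeV·fm):

```python
# de Broglie wavelength lambda = h/p; for pc ≈ E, lambda ≈ hc/E.
hc = 1.24                      # h*c in GeV·femtometers (1 fm = 1e-15 m)
for E_GeV in (1.0, 7000.0):    # ~1 GeV vs a 7 TeV LHC proton
    print(f"E = {E_GeV:>6.0f} GeV  ->  lambda ≈ {hc / E_GeV:.2e} fm")
```

A 7 TeV proton probes distances some ten thousand times smaller than the femtometer scale of the proton itself.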

This means that particle physicists are always searching to increase the energy of collisions. They do this by accelerating the particles in an accelerator.

The first accelerators were designed in the 1920s: Cockcroft and Walton designed a linear accelerator (linac), and E. O. Lawrence designed a circular accelerator (cyclotron).

Lawrence’s machine was called a cyclotron (no prefix), and today particle physicists use both linacs and synchrocyclotrons to study particle physics.

The synchronization is necessary due to the effects of special relativity.

LHC preaccelerators:
- p and Pb: linear accelerators for protons (Linac 2) and lead (Linac 3) (not marked)
- Proton Synchrotron Booster
- PS: Proton Synchrotron
- SPS: Super Proton Synchrotron

LHC experiments:
- ATLAS: A Toroidal LHC Apparatus
- CMS: Compact Muon Solenoid
- LHCb: LHC-beauty
- ALICE: A Large Ion Collider Experiment
- TOTEM: Total Cross Section, Elastic Scattering and Diffraction Dissociation
- LHCf: LHC-forward

ATLAS is about 45 meters long, more than 25 meters high, and has a mass of about 7,000 tonnes. The Compact Muon Solenoid (CMS) is 21 meters long and 15 meters wide and high. It has a mass of 12,500 tonnes.

The Standard Model (the chart I showed) has been the most successful model ever in describing the actions of particles. The Standard Model explains all the particle physics of the past 30 years. Explorations of the Standard Model have been responsible for 32 Nobel Prizes over the last 30 years.

The Higgs mechanism involves the four colorless gauge bosons (γ, W⁺, W⁻, Z) of the Standard Model, giving mass to the W± and the Z while leaving the photon massless.

In the Standard Model chart, we now have the Higgs.

There are other bosons besides the gauge bosons. In the Standard Model, they are made of quark-antiquark pairs.

As already noted, constituent mesons get their masses mainly from their enclosed constituents’ kinetic energy. (Note the smallness of the quark masses.)

The protons, neutrons, and other ordinary constituent fermions in matter are made up of three quarks. They get their masses through a mechanism similar to that of the constituent bosons: their masses also come mainly from the enclosed constituents’ kinetic energy.

The quarks, basic fermions, get their masses through still a different mechanism. Their masses come mainly from entrained kinetic energy, perhaps vibrating strings.

So the Higgs particle is central to the Standard Model in that it makes the gauge bosons massive.

Did the LHC experiments see the Higgs particle?

Two experiments, ATLAS and CMS, reported “5σ” results.

What did ATLAS and CMS see? Here are some ATLAS data from December 2011.

Here is where they both see the possibility of the Higgs: between 120 and 130 GeV/c².

The combined upper limit on the Standard Model Higgs boson production cross section, divided by the Standard Model expectation, as a function of mH is indicated by the solid line. This is a 95% CL limit using the CLs method in the low mass range. The dotted line shows the median expected limit in the absence of a signal, and the green and yellow bands reflect the corresponding 68% and 95% expected regions. At that time it was over 2σ.

5σ as explained by BBC News

Statistics of a ‘discovery’:
- Particle physics has an accepted definition for a “discovery”: a five-sigma level of certainty.
- The number of standard deviations, or sigmas, is a measure of how unlikely it is that an experimental result is simply down to chance rather than a real effect.
- Similarly, tossing a coin and getting a number of heads in a row may just be chance, rather than a sign of a “loaded” coin.
- The “three sigma” level represents about the same likelihood as tossing more than eight heads in a row.
- Five sigma, on the other hand, would correspond to tossing more than 20 in a row.
- Unlikely results can occur if several experiments are being carried out at once, equivalent to several people flipping coins at the same time.
- With independent confirmation by other experiments, five-sigma findings become accepted discoveries.

More CMS results.

World Conference on Physics Education, Bahçeşehir Üniversitesi, İstanbul, July 2012

Fast forward to Wednesday, 4 July 2012. I was sitting in sessions of the World Conference on Physics Education in Istanbul. Because I was listening to talks, I couldn’t watch the seminars at CERN describing the discovery of “a Higgslike particle,” but I could surreptitiously keep following the live blog at the Guardian newspaper website.

At around 9:30 Istanbul time, I read a tweet from Brian Cox posted on the blog: “And combined - 5 sigma. Round of applause. That’s a discovery of a Higgs - like particle at CMS. They thank LHC for the data!”

“9.44am: Rolf Heuer, Director General of CERN, offers this verdict: ‘As a layman I would say: I think we have it. You agree?’ The audience claps. I think that’s a yes.”

ATLAS: The observed (full line) and expected (dashed line) 95% CL combined upper limits on the SM Higgs boson production cross section divided by the Standard Model expectation as a function of mH in the full mass range considered in this analysis (a) and in the low mass range (b). The dashed curves show the median expected limit in the absence of a signal, and the green and yellow bands indicate the corresponding 68% and 95% intervals.

One standard deviation from the center encloses about 68% of all data (the chance of falling outside is about 1 in 3). About 95.5% of the data lie inside two standard deviations (about 1 in 22 outside); about 99.7% lie within three standard deviations (1 in 370); four-standard-deviation events occur 1 in 15,787 times; and five-standard-deviation events occur 1 in every 1,744,278 times.

So a five-sigma effect, which they both now have, means that such a thing would be observed by chance with a probability of 1/1,744,278 = 5.7 × 10⁻⁷. This is so unlikely that it is the criterion for accepting an effect as real in particle physics, when it is corroborated by another experiment, as in this case.
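Those tail probabilities are easy to reproduce (a sketch of mine, not shown in the talk; the two-sided Gaussian tail probability for an n-sigma fluctuation is erfc(n/√2)):

```python
from math import erfc, sqrt

# Two-sided Gaussian tail probability for an n-sigma fluctuation.
for n in range(1, 6):
    p = erfc(n / sqrt(2.0))
    print(f"{n} sigma: p = {p:.2e}  (~1 in {round(1.0 / p):,})")
```

For n = 5 this gives 5.7 × 10⁻⁷, about 1 in 1,744,278, matching the number above; note that (1/2)²¹ ≈ 4.8 × 10⁻⁷, the basis of the “more than 20 heads in a row” comparison in the BBC list.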

There you have it. There is a Higgs (or something very like it).

It took from the 1960s to the 2010s, but theory was vindicated.

There are more problems lurking—beyond the Standard Model.