
First Results from MiniBooNE

Eric Prebys, FNAL/BooNE Collaboration

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 2

The MiniBooNE Collaboration

University of Alabama, Bucknell University, University of Cincinnati, University of Colorado, Columbia University, Embry Riddle University, Fermi National Accelerator Laboratory, Indiana University, Los Alamos National Laboratory, Louisiana State University, University of Michigan, Princeton University, Saint Mary's University of Minnesota, Virginia Polytechnic Institute, Western Illinois University, Yale University

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 3

Outline

State of neutrino mixing measurements: history and background, with and without LSND; LSND and Karmen

The experiment: beam, detector, calibration and cross checks, analysis

Results

Future plans and outlook

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 4

The Neutrino “Problem”

1968: An experiment in the Homestake Mine first observes neutrinos from the Sun, but there are far fewer than predicted. Possibilities: the experiment is wrong? The solar model is wrong? (believed by most not involved) Enough are created, but perhaps they oscillated (or decayed to something else) along the way.

~1987: There also appear to be too few atmospheric muon neutrinos, with less uncertainty in the prediction. Similar explanation.

Both results confirmed by numerous experiments over the years.

1998: SuperKamiokande observes clear oscillatory behavior in signals from atmospheric neutrinos. For most, this establishes neutrino oscillations “beyond a reasonable doubt”

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 5

Theory of Neutrino Oscillations

Neutrinos are produced and detected as weak eigenstates (νe, νμ, or ντ). These can be represented as linear combinations of mass eigenstates. If the mixing matrix is not diagonal and the masses are not equal, then the net weak flavor content will oscillate as the neutrinos propagate.

Example: if there is mixing between νe and νμ,

\[
\begin{pmatrix} \nu_e \\ \nu_\mu \end{pmatrix} =
\begin{pmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{pmatrix}
\begin{pmatrix} \nu_1 \\ \nu_2 \end{pmatrix},
\]

then the probability that a νe will be detected as a νμ after a distance L is

\[
P(\nu_e \to \nu_\mu) = \sin^2 2\theta \, \sin^2\!\left( \frac{1.27\,\Delta m^2_{21}\, L}{E} \right),
\qquad \Delta m^2_{21} = m_2^2 - m_1^2 \ \text{in eV}^2,
\]

with the distance L in km and the energy E in GeV. Only the magnitude of the difference of the squares of the masses is measured.
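As a quick numerical illustration of this formula (not part of the original slides), here is a minimal Python sketch; the parameter values in the example call are hypothetical, not fit results.

```python
import numpy as np

def p_oscillation(sin2_2theta, dm2_ev2, L_km, E_gev):
    """Two-flavor appearance probability P(nu_e -> nu_mu).

    sin2_2theta : mixing strength, sin^2(2 theta)
    dm2_ev2     : |Delta m^2| in eV^2
    L_km        : baseline in km
    E_gev       : neutrino energy in GeV
    """
    return sin2_2theta * np.sin(1.27 * dm2_ev2 * L_km / E_gev) ** 2

# Example: a MiniBooNE-like L/E (0.5 km, 0.5 GeV) with illustrative parameters
print(p_oscillation(sin2_2theta=0.004, dm2_ev2=1.0, L_km=0.5, E_gev=0.5))
```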

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 6

Probing Neutrino Mass Differences

Accelerators use π/μ decay to directly probe νμ → νe. Reactors use ν̄e disappearance. Cerenkov detectors directly measure the νμ and νe content of atmospheric neutrinos and fit to mixing hypotheses. Solar neutrino experiments typically measure the disappearance of νe.

Different experiments probe different ranges of Δm², set by L/E (path length over energy).

These regions are also probed by a new generation of “long baseline” accelerator and reactor experiments: MINOS, T2K, etc.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 7

Best Three Generation Picture (all experiments but LSND)

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 8

The LSND Experiment (1993-1998)

ν̄μ → ν̄e mixing, detected via ν̄e + p → e+ + n at a baseline of ~30 m and energies of 20-50 MeV.

Signature: Cerenkov ring from the e+, plus a delayed γ from neutron capture.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 9

LSND Result

[Figure: LSND allowed region compared with the atmospheric region (Soudan, Kamiokande, MACRO, Super-K) and the solar region (Homestake, SAGE, GALLEX, Super-K, SNO, KamLAND); the excess signal and best-fit values are given on the original plot.]

Only exclusive appearance result to date. Problem: Δm² ~ 1 eV² is not consistent with the other results under simple three-generation mixing.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 10

Possibilities

4 neutrinos? We know from the Z lineshape that there are only 3 active flavors; a fourth would have to be sterile.

CP or CPT violation? More exotic scenarios? LSND simply wrong? It can’t be thrown out just because people don’t like it.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 11

Accommodating a Positive Signal

We know from LEP that there are only 3 active, light neutrino flavors.

If MiniBooNE confirms the LSND result, it might be evidence for the existence of sterile neutrinos.

[Figure: mass spectra of 3+2 sterile-neutrino models, hep-ph/0305255]

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 12

Karmen II Experiment: not quite enough

Pulsed 800 MeV proton beam (ISIS), 17.6 m baseline, 56 tons of liquid scintillator. A factor of 7 less statistical reach than LSND: NO SIGNAL. A combined analysis still leaves an allowed region.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 13

Role of MiniBooNE

Boo(ster) N(eutrino) E(xperiment); the full “BooNE” would have two detectors.

Primary motivation: absolutely confirm or refute the LSND result. Optimized for L/E ~ 1, but with a higher-energy beam and therefore different systematics than LSND: E ~30 MeV -> ~500 MeV; L ~30 m -> ~500 m.

Timeline: proposed 12/97; approved 1998; began construction 10/99; completed 5/02; first beam 8/02; began to run concurrently with NuMI 3/05; oscillation results 4/07.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 14

MiniBooNE Neutrino Beam (not to scale)

8 GeV protons from the FNAL Booster (~8E16 p/hr max) strike a Be target inside a focusing horn; mesons decay in a 50 m decay region, followed by ~500 m of dirt before the detector (~1 detected neutrino/minute, L/E ~ 1). A “Little Muon Counter” (LMC) is used to understand the K flux.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 15

Detector

950,000 l of pure mineral oil; 1280 PMTs in the inner region; 240 PMTs in the outer veto region, separated by a light barrier. Light is produced by Cerenkov radiation and scintillation.

Triggers: all beam spills, cosmic ray triggers, laser/pulser triggers, supernova trigger.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 16

Neutrino Detection/Particle ID

[Diagrams: charged-current quasi-elastic scattering, νe + n → e− + p and νμ + n → μ− + p (W exchange), and neutral-current π0 production, ν + N → ν + N + π0 (Z exchange). The NC π0 channel is an important background!]

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 17

Selecting Neutrino Events

Collect data from -5 to +15 μs around each beam spill trigger (the spill itself is 1600 ns long).

Identify individual “events” within this window based on PMT hits clustered in time.

[Plots: event times with no cuts, with veto hits < 6, and with veto hits < 6 and tank hits > 200.]

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 18

Delivering Protons

The requirements of MiniBooNE greatly exceed the historical performance of the 30+ year old 8 GeV Booster, pushing the average repetition rate, above-ground radiation, and radiation damage and activation of accelerator components.

An intense program to improve the Booster: shielding, loss monitoring and analysis, lattice improvements (a result of Beam Physics involvement), and a collimation system.

It is very challenging to continue to operate the 8 GeV line during NuMI/MINOS operation; this was once believed impossible. It is an element of the lab’s “Proton Plan”, whose goal is to continue to deliver roughly 2E20 protons per year to the 8 GeV program even as NuMI intensities ramp up.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 19

Small but Hungry

MiniBooNE began taking data in the fall of 2002 and has now received more protons than all other users in the history of Fermilab combined (including NuMI and pBar).

Total protons (10^19): MiniBooNE 86.2; all others (1974-present) 49.2; NuMI 30.1.

63E19 protons on target in ν mode (this analysis); running in anti-ν mode since ~1/06.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 20

Modeling neutrino flux

Production: GEANT4 model of the target, horn, and beamline; MARS for protons and neutrons; Sanford-Wang fits to production data for π and K. Mesons are allowed to decay in a model of the decay pipe, and neutrinos which point at the detector are retained. Production data are constrained by the HARP experiment.

Antineutrino content: 6%. νe/νμ = 0.5%.

“Intrinsic” νe + ν̄e sources: μ+ → e+ νe ν̄μ (52%); K+ → π0 e+ νe (29%); K0 → π e νe (14%); other (~5%).

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 21

Interactions

Cross sections are based on the NUANCE v3 Monte Carlo, with NEUT and NEUGEN used as cross checks.

Theoretical input: Llewellyn-Smith free-nucleon cross sections; Rein-Sehgal resonant and coherent cross sections; Bodek-Yang DIS at low Q²; a standard DIS parametrization at high Q²; a Fermi-gas model; a final-state interaction model.

Constraining NUANCE: from the observed MiniBooNE νμ data, MA_eff (effective axial mass) and Elo_SF (Pauli blocking parameter); from electron scattering data, Eb (binding energy) and pf (Fermi momentum).

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 22

Predicted Event Rates (NUANCE)

D. Casper, Nucl. Phys. Proc. Suppl. 112 (2002) 161

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 23

Characterizing the Detector

Full GEANT 3.21 model of the detector, including a detailed (39-parameter!) optical model of the oil and PMT response. MC events are produced at the raw data level and reconstructed in the same way as real events.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 24

Calibrating the Detector

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 25

Events producing pions

NC π0 (ν N → ν N π0, ~8% of events): the π0 decays to 2 photons, which can look “electron-like”, mimicking the signal. <1% of π0s contribute to the background. (The Δ produced in these events also decays to a single photon with 0.56% probability.)

CC π+ (ν N → μ N π+, ~25% of events): easy to tag due to the 3 subevents. Not a substantial background to the oscillation analysis.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 26

Bottom line: signal and background

If the LSND best fit is accurate, only about a third of our observed rate would come from oscillations.

Backgrounds come from both intrinsic νe and misidentified νμ events; the energy distribution can help separate them. [Plot legend: signal, mis-ID, intrinsic νe.]

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 27

Analysis Philosophy

The “golden signal” in the detector is the Charged Current Quasi-Elastic (CCQE) event, νe (or νμ) + n → e (or μ) + p. With a particular mass hypothesis, the neutrino energy can be reconstructed from the energy and angle (cos θ) of the observed lepton (see the sketch below).

If the LSND signal were due to neutrino oscillations, we would expect an excess in the energy range of 300-1500 MeV.

The “signal” is therefore an event with a high probability of being a νe CCQE event and a reconstructed energy in the range 300 to 1500 MeV.
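For reference, a commonly used simplified form of the CCQE energy reconstruction (two-body kinematics on a nucleon at rest, neglecting binding energy, Fermi motion, and the neutron-proton mass difference) is sketched below; the exact expression used in the analysis may include additional nuclear corrections.

\[
E_\nu^{QE} \simeq \frac{M_n E_\ell - m_\ell^2/2}{M_n - E_\ell + p_\ell\cos\theta_\ell},
\]

where \(E_\ell\), \(p_\ell\), and \(\theta_\ell\) are the energy, momentum, and scattering angle of the observed lepton (e or μ), \(m_\ell\) its mass, and \(M_n\) the neutron mass.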

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 28

Two Analyses

“Track Based” analysis: use detailed tracking information to form a particle ID likelihood. The “box” is defined by the νe likelihood and the energy. Backgrounds are weighted by observed events outside of the box.

“Boosting”: particle ID based on feeding a large amount of event information into a “boosted decision tree” (details to follow). The “box” is defined by the boosted “score” and the mass range. Backgrounds are determined by models which are largely constrained by the data.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 29

Common Event Cuts

>200 tank hits; <6 veto hits; R < 500 cm (algorithm dependent). [Plots compare data and MC for these variables; a simple selection sketch follows.]
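As an illustration of how such precuts might be applied to a table of reconstructed events (the dictionary keys below are hypothetical, not the actual MiniBooNE data format):

```python
# Illustrative precut selection; the field names are hypothetical placeholders.
events = [
    {"tank_hits": 350, "veto_hits": 2, "r_cm": 310.0},
    {"tank_hits": 150, "veto_hits": 1, "r_cm": 120.0},  # fails tank-hit cut
    {"tank_hits": 600, "veto_hits": 9, "r_cm": 450.0},  # fails veto cut
]

def passes_precuts(ev, max_radius_cm=500.0):
    """Common event cuts: >200 tank hits, <6 veto hits, R < 500 cm."""
    return (ev["tank_hits"] > 200
            and ev["veto_hits"] < 6
            and ev["r_cm"] < max_radius_cm)

selected = [ev for ev in events if passes_precuts(ev)]
print(len(selected), "of", len(events), "events pass the precuts")
```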

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 30

Blindness

In general, the two analyses sequestered (hid) data which had a reconstructed neutrino energy in the range 300 to 1500 MeV and a high probability of being an electron. This leaves the vast majority (99%) of the data available for analysis.

Individual open boxes were allowed, provided it could be established that an oscillation signal would not contribute significantly.

In addition, detector-level data (hit times, PMT spectra, etc.) were available for all events.

The determination of the “primary” analysis was based on the final sensitivity before opening the box.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 31

The goal of both analyses: minimize background and maximize signal efficiency.

The “signal range” is approximately 300 MeV < EQE < 1500 MeV.

One can then either look for a total excess (“counting experiment”) or fit for both an excess and its energy dependence (“energy fit”).

[Plot: example MiniBooNE signal spectra for Δm² = 0.4, 0.7, and 1.0 eV², arbitrary units vs. neutrino energy (GeV).]

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 32

MiniBooNE is searching for a small but distinctive event signature.

In order to maintain blindness, electron-like events were sequestered, leaving ~99% of the in-beam events available for study. Rule for cuts to sequester events: <1 signal outside of the box.

Low-level information which did not allow particle ID was available for all events.

Open data for studies:

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 33

Analysis 1: “Track-Based” (TB) Analysis

Philosophy: use detailed, direct reconstruction of particle tracks, and the ratio of fit likelihoods, to identify particles.

This algorithm was found to have the better sensitivity to νe appearance. Therefore, before unblinding, it was chosen as the algorithm for the “primary result”.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 34

Track-based analysis

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 35

Each event is characterized by 7 reconstructed variables: vertex (x, y, z), time, energy, and direction (θ, φ) -> (Ux, Uy, Uz).

Resolutions (for νμ CCQE events: 2 subevents, veto hits < 6, tank hits > 200): vertex 22 cm; direction 2.8°; energy 11%.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 36

Rejecting “muon-like” events using log(Le/Lμ): log(Le/Lμ) > 0 favors the electron-like hypothesis.

Note: photon conversions are electron-like, so this does not separate e/π0.

The separation is clean at high energies, where muon-like events are long. The analysis cut was chosen to maximize the νe sensitivity. [MC plots: νe CCQE vs. νμ CCQE.]

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 37

Rejecting “π0-like” events, using a mass cut and log(Le/Lπ0). The cuts were chosen to maximize the νe sensitivity. [MC plots: νe CCQE vs. NC π0.]

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 38

Testing e-π0 separation using data

Selection: 1 subevent; log(Le/Lμ) > 0 (e-like) or log(Le/Lμ) < 0 (μ-like); mass > 50 MeV (high mass).

[Plots: log(Le/Lπ0) and π0 invariant mass for data and Monte Carlo (π0 only); the signal region is blind.]

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 39

χ² probability for mass < 50 MeV (“most signal-like”): 69%.

Next: look at the low-mass region. Selection: 1 subevent; log(Le/Lμ) > 0 (e-like) or log(Le/Lμ) < 0 (μ-like); mass < 200 MeV (low mass). [Plots: data and Monte Carlo (π0 only); the signal region is blind.]

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 40

Summary of Track Based cuts: “precuts” + log(Le/Lμ) + log(Le/Lπ0) + invariant mass. [Plots: efficiency and backgrounds after cuts.]

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 41

Analysis 2: Boosted Decision Trees (BDT)

Philosophy: construct a set of low-level analysis variables which are used to make a series of cuts to classify the events.

This algorithm represents an independent cross check of the Track Based analysis.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 42

Step 1: convert the “fundamental information” into “analysis variables”.

Fundamental information from the PMTs: hit position, charge, and hit timing.

Analysis variables: energy, time sequence, event shape, and physics quantities (“physics” = π0 mass, EQE, etc.).

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 43

Examples of “analysis variables”: reconstructed quantities which are inputs to EQE, e.g. UZ = cos θz and Evisible (shown for νμ CCQE events).

Resolutions: vertex 24 cm; direction 3.8°; energy 14%.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 44

Step 2: reduce the analysis variables to a single PID variable.

Boosted decision trees: “A procedure that combines many weak classifiers to form a powerful committee” (Byron P. Roe et al., NIM A543 (2005) 577).

Flow: hit-level information (charge, time, position) -> analysis variables -> one single PID “score”.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 45

A Decision Tree: a sequential series of cuts based on MC study. Each node splits on an analysis variable (Variable 1, Variable 2, Variable 3, ...), and each node is labeled signal-like or background-like according to its (Nsignal/Nbkgd) ratio, e.g. 30,245/16,305 at one node splitting into 20,455/3,417 (signal-like) and 9,790/12,888 (background-like), and so on. This tree is one of many possibilities...

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 46

A set of decision trees can be developed, each re-weighting the events to enhance identification of backgrounds misidentified by earlier trees (“boosting”).

For each tree, a data event is assigned +1 if it is identified as signal and -1 if it is identified as background. The total over all trees is combined into a “score”: negative is background-like, positive is signal-like (a small illustration follows).
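A minimal sketch of this kind of boosting using scikit-learn's AdaBoost implementation; the MiniBooNE analysis used its own software (Roe et al., NIM A543 (2005) 577), and the features and labels below are randomly generated placeholders, not real event variables.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
# Placeholder "analysis variables": 1000 events x 20 features, with binary labels
X = rng.normal(size=(1000, 20))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

# Many shallow trees, each re-weighted toward previously misclassified events
bdt = AdaBoostClassifier(DecisionTreeClassifier(max_depth=3), n_estimators=200)
bdt.fit(X, y)

# decision_function plays the role of the PID "score":
# negative = background-like, positive = signal-like
scores = bdt.decision_function(X)
print(scores[:5])
```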

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 47

The BDT cuts on the PID score as a function of energy. We can define a “sideband” just outside of the signal region.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 48

The BDT cuts on the PID score as a function of energy. We can define a “sideband” just outside of the signal region.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 49

BDT efficiency and backgrounds after cuts. [Plots: the analysis cuts on PID score as a function of energy for signal and background, and the efficiency after precuts.]

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 50

Errors, Constraints and Sensitivity

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 51

We have two categories of backgrounds (TB analysis): misidentified νμ events and intrinsic νe.

Predictions of these backgrounds are among the nine sources of significant error in the analysis.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 52

Source of uncertainty on the νe background; error in % (Track Based / Boosted Decision Tree); √ = checked or constrained by MB data; second √ = further reduced by tying νe to νμ:

Flux from π+/μ+ decay: 6.2 / 4.3, √, √
Flux from K+ decay: 3.3 / 1.0, √, √
Flux from K0 decay: 1.5 / 0.4, √, √
Target and beam models: 2.8 / 1.3, √
ν cross section: 12.3 / 10.5, √, √
NC π0 yield: 1.8 / 1.5, √
External interactions (“Dirt”): 0.8 / 3.4, √
Optical model: 6.1 / 10.5, √, √
DAQ electronics model: 7.5 / 10.8, √

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 53

Tying the νe background and signal prediction to the νμ flux constrains this analysis to a strict νe appearance-only search.

From the νμ CCQE events we predict the normalization and energy dependence of both the background and the signal.

Data/MC ratios: Boosted Decision Tree 1.22 ± 0.29; Track Based 1.32 ± 0.26.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 54

νμ constraint on the intrinsic νe from π+ -> μ+ decay chains: measure the νμ flux; the decay kinematics connect it to the π flux (Eν ≈ 0.43 Eπ, as sketched below); and once the π flux is known, the μ flux, and hence the intrinsic νe flux, is determined.
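As context for the 0.43 factor, a short kinematic check (standard two-body π+ -> μ+ νμ decay kinematics, not taken from the slides):

\[
\frac{E_\nu}{E_\pi} \;\lesssim\; 1 - \frac{m_\mu^2}{m_\pi^2}
 = 1 - \left(\frac{105.7}{139.6}\right)^{2} \approx 0.43,
\]

i.e. a forward-going neutrino from a relativistic pion carries at most about 43% of the pion energy.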

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 55

K+ and K0 decay backgrounds

At high energies, above the “signal range”, the νμ and “νe-like” events are largely due to kaon decay.

[Plot: example signal spectra for Δm² = 0.4, 0.7, and 1.0 eV² (arbitrary units vs. neutrino energy in GeV), with the predicted range of significant oscillation signal, 300 < EQE < 1500 MeV, indicated.]

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 56

Use of low-signal/high-background energy bins

In the Boosted Decision Tree analysis, the low-energy bin (200 < EQE < 300 MeV) constrains the mis-IDs: π0, Δ -> Nγ, dirt, ...

In both analyses, the high-energy bins (up to 3000 MeV) constrain the νe background. [Plots: TB and BDT spectra with the signal range indicated.]

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 57

We constrain π0 production using data from our own detector. Because this constrains the Δ resonance rate, it also constrains the rate of Δ -> Nγ. Reweighting improves the agreement in other variables as well, and reduces the error on the predicted misidentified π0s.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 58

Other Single Photon Sources

Neutral current: ν + N -> ν + N + γ. From Efrosinin, hep-ph/0609169 (calculation checked by Goldman, LANL): negligible.

Charged current: ν + N -> μ + N' + γ, where the presence of the γ leads to misidentification. Using events where the μ is tagged by the Michel electron, the misidentification was studied with the BDT algorithm: < 6 events @ 95% CL.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 59

External Sources of Background

“Dirt” events: ν interactions outside of the detector. From a background-enhanced sample after PID cuts, Ndata/NMC = 0.99 ± 0.15.

Cosmic rays: measured from out-of-beam data, 2.1 ± 0.5 events.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 60

Summary of predicted backgrounds for the final MiniBooNE result (Track Based analysis), with an example signal shown for comparison:

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 61

Handling uncertainties in the analyses

What we begin with: for a given source of uncertainty, errors on a wide range of parameters in the underlying model.

What we need: for a given source of uncertainty, errors in bins of EQE and information on the correlations between bins.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 62

How the constraints enter: two approaches.

TB: reweight the MC prediction to match the measured νμ result (accounting for systematic error correlations).

BDT: include the correlations of νμ to νe in the error matrix,

\[
\chi^2 = \sum_{i,j} (D_i - P_i)\,(M^{-1})_{ij}\,(D_j - P_j),
\]

where i, j are bins of EQE and the systematic (and statistical) uncertainties are included in (Mij)⁻¹ (a numerical sketch follows).
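A sketch of a covariance-matrix χ² of this form in numpy; the bin contents and matrix below are made-up placeholders, not MiniBooNE inputs.

```python
import numpy as np

def chi2_with_covariance(data, prediction, cov):
    """chi^2 = (D - P)^T M^{-1} (D - P), with systematics and statistics in M."""
    diff = np.asarray(data, float) - np.asarray(prediction, float)
    return float(diff @ np.linalg.solve(cov, diff))

# Placeholder 3-bin example with correlated uncertainties
data = [120.0, 95.0, 60.0]
pred = [110.0, 100.0, 55.0]
cov = np.array([[100.0, 30.0, 10.0],
                [ 30.0, 90.0, 20.0],
                [ 10.0, 20.0, 70.0]])
print(chi2_with_covariance(data, pred, cov))
```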

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 63

Example: cross section uncertainties

Determined from MiniBooNE νμ QE data: MA_QE and Elo_SF, 6% and 2% (stat + bkg only); QE σ normalization, 10%; QE σ shape, a function of Eν; νe/νμ QE σ, a function of Eν.

Determined from MiniBooNE NC π0 data: NC π0 rate, a function of π0 momentum; MA_coh and coherent σ, ±25%; Δ -> Nγ rate, a function of γ momentum (+7% BF).

Determined from other experiments: EB and pF, 9 MeV and 30 MeV; Δs, 10%; MA_1π, 25%; MA_Nπ, 40%; DIS σ, 25%.

(Many of these are common to νμ and νe and cancel in the fit.)

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 64

Example: Optical Model uncertainties

39 parameters must be varied; the allowed variations are set by the Michel calibration sample. To understand the allowed variations, we ran 70 hit-level simulations with differing parameters: “multisims”.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 65

Using multisims to convert from errors on parameters to errors in EQE bins: for each error source, “multisims” are generated within the allowed variations by reweighting the standard Monte Carlo; in the case of the optical model, hit-level simulations are used (1000 multisims for K+ production, 70 multisims for the optical model).

[Plots: number of multisims vs. number of events passing cuts in the bin 500 < EQE < 600 MeV, with the standard MC indicated.]

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 66

Correlations between EQE bins from the optical model.

Error matrix elements:

\[
E_{ij} = \frac{1}{M}\sum_{\alpha=1}^{M}\left(N_i^{\alpha} - N_i^{MC}\right)\left(N_j^{\alpha} - N_j^{MC}\right),
\]

where N is the number of events passing cuts, MC denotes the standard Monte Carlo, α labels a given multisim, M is the total number of multisims, and i, j are EQE bins (see the numerical sketch below).

The total error matrix is the sum over all sources. TB: νe-only total error matrix; BDT: νμ-νe total error matrix.
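A numpy sketch of this error-matrix construction from multisims; the arrays below are randomly generated placeholders, not MiniBooNE outputs.

```python
import numpy as np

def multisim_error_matrix(n_multisim, n_central):
    """E_ij = (1/M) * sum_alpha (N_i^alpha - N_i^MC)(N_j^alpha - N_j^MC)."""
    diffs = np.asarray(n_multisim, float) - np.asarray(n_central, float)
    return diffs.T @ diffs / diffs.shape[0]

# Placeholder: 70 multisims, 8 EQE bins
rng = np.random.default_rng(1)
n_central = rng.uniform(50, 200, size=8)               # central-value MC per bin
n_multisim = n_central + rng.normal(scale=10, size=(70, 8))
E = multisim_error_matrix(n_multisim, n_central)
print(E.shape, np.sqrt(np.diag(E))[:3])                # bin errors = sqrt of diagonal
```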

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 67

As we show distributions in EQE, keep in mind that the error bars are the diagonals of the error matrix. The effect of correlations between EQE bins is not shown; however, the bin-to-bin correlations improve the sensitivity of the oscillation fits, which are based on an energy-dependent fit.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 68

Sensitivity of the two analyses (set using Δχ² = 1.64 @ 90% CL): the Track Based sensitivity is better, so it becomes the pre-determined default algorithm.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 69

Comparison to the sensitivity goal for 5E20 POT determined by the Fermilab PAC in 2003.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 70

The Initial Results

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 71

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 72

Box Opening Procedure

Progress cautiously, in a step-wise fashion. After applying all analysis cuts:

1. Fit the sequestered data to an oscillation hypothesis, returning no fit parameters. Return the χ² of the data/MC comparison for a set of diagnostic variables.

2. Open up the plots from step 1. The Monte Carlo has an unreported signal added; the plots were chosen to be useful diagnostics without indicating whether a signal was added.

3. Report the χ² for a fit to EQE, without returning the fit parameters.

4. Compare EQE in data and Monte Carlo, returning the fit parameters. At this point the box is open (March 26, 2007).

5. Present the results two weeks later.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 73

Step 1

Return the χ² of the data/MC comparison for a set of diagnostic variables (12 variables are tested for TB, 46 for BDT).

All analysis variables were returned with good probability except one: the Track Based analysis χ² probability of the Evisible fit was 1%. This probability was sufficiently low to merit further consideration.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 74

In the Track Based analysis:

We re-examined our background estimates using sideband studies and found no evidence of a problem.

However, knowing that the backgrounds rise at low energy, we tightened the cuts for the oscillation fit to EQE > 475 MeV. We agreed to report events over the original full range, EQE > 300 MeV.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 75

Step 1: again!

Return the χ² of the data/MC comparison for a set of diagnostic variables; the parameters of the oscillation fit were not returned.

χ² probabilities were returned for the TB analysis (EQE > 475 MeV, 12 variables) and the BDT analysis (46 variables).

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 76

Step 2

Open up the plots from step 1 for approval. The MC contains the fitted signal at an unknown level.

Examples of what we saw, for Evisible and the fitted energy (MeV): TB (EQE > 475 MeV), χ² prob = 28%; BDT, χ² prob = 59%.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 77

Step 3

Report the χ² for a fit to EQE across the full energy range: TB (EQE > 475 MeV), χ² probability of the fit 99%; BDT analysis, χ² probability of the fit 52%.

Leading to...

Step 4: open the box...

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 78

The Track-based νe appearance-only result

Counting experiment, 475 < EQE < 1250 MeV: data, 380 events; expectation, 358 ± 19 (stat) ± 35 (sys) events; significance, 0.55σ (a quick check follows).
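As a quick cross-check of that significance (assuming the statistical and systematic errors simply add in quadrature, which may not be exactly how the collaboration combined them):

```python
import math

observed, expected = 380.0, 358.0
stat_err, sys_err = 19.0, 35.0
total_err = math.hypot(stat_err, sys_err)   # quadrature sum, ~39.8 events
print((observed - expected) / total_err)    # ~0.55 sigma
```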

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 79

Track Based energy-dependent fit results: the data are in good agreement with the background prediction. The error bars are the diagonals of the error matrix. Fit errors for >475 MeV: normalization 9.6%; energy scale 2.3%.

Best fit (dashed): (sin²2θ, Δm²) = (0.001, 4 eV²).

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 80

The result of the νe appearance-only analysis is a limit on oscillations.

Energy fit, 475 < EQE < 3000 MeV: χ² probability of the null hypothesis, 93%.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 81

As planned before opening the box, we also report the full range, 300 < EQE < 3000 MeV: there are 96 ± 17 ± 20 events above background for 300 < EQE < 475 MeV, a deviation of 3.7σ.

[Plot: background-subtracted excess, compared with the E > 475 MeV region.]

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 82

Fit to the >300 MeV range: best fit (dashed), (sin²2θ, Δm²) = (1.0, 0.03 eV²); χ² probability 18%. [Curves show examples in the LSND allowed range.]

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 83

The low-energy excess is interesting, but requires further investigation: a two-neutrino appearance-only model systematically disagrees with the shape as a function of energy.

We need to investigate non-oscillation explanations, including unexpected behavior of low-energy cross sections. This will be relevant to future νe searches and will be addressed by MiniBooNE and SciBooNE.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 84

Boosted Decision Tree Analysis

Counting experiment, 300 < EQE < 1600 MeV: data, 971 events; expectation, 1070 ± 33 (stat) ± 225 (sys) events; significance, 0.38σ.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 85

Boosted Decision Tree EQE data/MC comparison: the error bars are statistical and systematic (diagonals of the error matrix). [Plot: (data − prediction with no oscillations)/error; the sidebands used for the constraint are not shown.]

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 86

The Boosted Decision Tree analysis shows no evidence for νe appearance-only oscillations.

Energy-fit limits (solid: TB; dashed: BDT): the independent analyses are in good agreement.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 87

Two points on interpreting our limit

1) There are various ways to present limits: a single-sided raster scan (historically used, presented here); a global scan; or the unified approach (the most recent method).

2) This result must be folded into an LSND-Karmen joint analysis (Church et al., PRD 66, 013001). We will present a full joint analysis soon.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 88

A MiniBooNE-LSND Compatibility Test

For each Δm², determine the MB and LSND measurements, zMB ± δzMB and zLSND ± δzLSND, where z = sin²(2θ) and δz is the 1σ error.

For each Δm², form the χ² between the MB and LSND measurements. Find the z0 that minimizes the χ² (the weighted average of the two measurements); this gives χ²min.

Find the probability of χ²min for 1 dof; this is the joint compatibility probability for this Δm² (a sketch follows).
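A sketch of this compatibility test for a single Δm² value; the measurement values in the example call are hypothetical placeholders, not the actual MB or LSND numbers.

```python
from scipy.stats import chi2

def compatibility(z_mb, err_mb, z_lsnd, err_lsnd):
    """Weighted-average chi^2_min between two measurements of z = sin^2(2 theta),
    converted to a probability for 1 degree of freedom."""
    w_mb, w_lsnd = 1.0 / err_mb**2, 1.0 / err_lsnd**2
    z0 = (w_mb * z_mb + w_lsnd * z_lsnd) / (w_mb + w_lsnd)   # minimizes chi^2
    chi2_min = w_mb * (z_mb - z0)**2 + w_lsnd * (z_lsnd - z0)**2
    return chi2.sf(chi2_min, df=1)

# Hypothetical example at one Delta m^2 point
print(compatibility(z_mb=0.0005, err_mb=0.0010, z_lsnd=0.0030, err_lsnd=0.0008))
```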

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 89

[Plot: maximum joint MB-LSND probability (1 dof) vs. Δm² (eV²), reaching at most ~2% compatibility.]

MiniBooNE is incompatible with a νμ -> νe appearance-only interpretation of LSND at 98% CL.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 90

Plans:

A paper on this analysis will be posted to the arXiv and to the MiniBooNE webpage after 5:00 CT today. Many more papers supporting this analysis will follow in the very near future: νμ CCQE production, π0 production, and a MiniBooNE-LSND-Karmen joint analysis.

We are pursuing further analyses of the neutrino data, including an analysis which combines TB and BDT, and more exotic models for the LSND effect.

MiniBooNE is presently taking data in antineutrino mode.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 91

Conclusions

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 92

Our goals for this first analysis were:

• A generic search for a νe excess in our beam

• An analysis of the data within a νe appearance-only context

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 93

Within the energy range defined by this oscillation analysis, the event rate is consistent with background.

The observed low energy deviation is under investigation.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 94

The observed reconstructed energy distribution is inconsistent with a νμ -> νe appearance-only model; therefore we set a limit on νμ -> νe appearance.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 95

We Thank:

DOE and NSF

The Fermilab Divisions and Staff

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 96

Conclusions and Outlook

MiniBooNE has been running for over three years and continues to run well in the NuMI era.

The analysis tools are well developed and are being refined to achieve the quality necessary to release the result of our blind analysis.

Recent results for CCQE and CC π+ give us confidence in our understanding of the detector and data.

We look forward to many interesting results in 2006.

Backup Slides

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 98

Sterile Neutrinos: Astrophysics Constraints (courtesy M. Shaevitz)

Constraints on the number of neutrinos from BBN and the CMB: the standard model analysis gives Nν = 2.6 ± 0.4; if the 4He systematics are larger, then Nν = 4.0 ± 2.5. If there is a neutrino lepton asymmetry or non-equilibrium, the BBN limit can be evaded (K. Abazajian, hep-ph/0307266; G. Steigman, hep-ph/0309347).

“One result of this is that the LSND result is not yet ruled out by cosmological observations.” (Hannestad, astro-ph/0303076)

Bounds on the neutrino masses also depend on the number of neutrinos (active and sterile): the allowed Σmi is 1.4 (2.5) eV for 4 (5) neutrinos. [Plot: Nν = 3 solid, 4 dotted, 5 dashed; Hannestad, astro-ph/0303076.]

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 99

Reactor Experimental Results

Single-reactor experiments (Chooz, Bugey, etc.) look for ν̄e disappearance: all negative.

KamLAND (a single scintillator detector looking at ALL Japanese reactors): ν̄e disappearance consistent with mixing.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 100

K2K

First “long baseline” accelerator experiment: a beam from the KEK PS to Kamiokande, 250 km away, looking for νμ disappearance (the atmospheric “problem”).

Results are consistent with mixing. [Plot: best fit, no mixing, and the allowed mixing region.]

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 101

Modeling Production of Secondary Pions

HARP (CERN): 5% interaction-length beryllium target, 8.9 GeV proton beam momentum. The data are fit to a Sanford-Wang parameterization. (HARP collaboration, hep-ex/0702024)

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 102

Everybody Loves a Mystery

3+2 sterile neutrinos: Sorel, Conrad, and Shaevitz (hep-ph/0305255)

MaVaN & 3+1: Hung (hep-ph/0010126)

Sterile neutrinos: Kaplan, Nelson, and Weiner (hep-ph/0401099) (explain dark energy?)

CPT violation and 3+1 neutrinos: Barger, Marfatia & Whisnant (hep-ph/0308299) (explain the matter/antimatter asymmetry?)

Lorentz violation: Kostelecky & Mewes (hep-ph/0406035)

Extra dimensions: Pas, Pakvasa, & Weiler (hep-ph/0504096)

Sterile neutrino decay: Palomares-Ruiz, Pascoli & Schwetz (hep-ph/0505216)

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 103

Three Generation Mixing (Driven by experiments listed)

General Mixing Parameterization

\[
U =
\begin{pmatrix}
c_{12}c_{13} & s_{12}c_{13} & s_{13}e^{-i\delta} \\
-s_{12}c_{23}-c_{12}s_{23}s_{13}e^{i\delta} & c_{12}c_{23}-s_{12}s_{23}s_{13}e^{i\delta} & s_{23}c_{13} \\
s_{12}s_{23}-c_{12}c_{23}s_{13}e^{i\delta} & -c_{12}s_{23}-s_{12}c_{23}s_{13}e^{i\delta} & c_{23}c_{13}
\end{pmatrix},
\qquad c_{ij}=\cos\theta_{ij},\ s_{ij}=\sin\theta_{ij},
\]

with δ the CP-violating phase.

Quarks: the mixing matrix is almost diagonal, with the third generation weakly coupled to the first two (“Wolfenstein parameterization”).

Neutrinos: the mixing is large, with no easy simplification; think of the mass and weak eigenstates as totally separate.

HEP Seminar, UT Austin, April 30th, 2007 – E. Prebys 104

SNO Solar Neutrino Result

Looked for Cerenkov signals in a large detector filled with heavy water, focusing on 8B neutrinos. Used 3 reactions:

νe + d -> p + p + e-: only sensitive to νe

νx + d -> p + n + νx: equally sensitive to νe, νμ, ντ

νx + e- -> νx + e-: ~6 times more sensitive to νe than νμ,τ

Consistent with the initial full SSM flux of νe mixing to νμ, ντ, and completely consistent with three-generation mixing. [Plots: just SNO; SNO + others.]

Favored: Δm² ≈ 5×10⁻⁵ eV²; tan²θ ≈ 0.34.
