TRANSCRIPT
NOAA Air Quality Prediction Status and Research Needs
Ivanka Stajner NOAA/National Weather Service
AQAST Meeting, Harvard University, Cambridge, MA June 17, 2014
National Air Quality Forecast Capability
Capabilities as of 6/2014
• Improving the basis for air quality alerts
• Providing air quality information for people at risk
Prediction Capabilities:
• Operations: ozone nationwide; smoke nationwide; dust over CONUS
• Experimental testing: ozone predictions
• Developmental testing: components for particulate matter (PM) predictions
[Map annotations, domain rollout timeline: 2004 ozone; 2005 ozone; 2007 ozone and smoke; 2009 smoke; 2010 ozone; 2010 ozone and smoke; 2012 dust]
Model: linked numerical prediction system, operationally integrated on NCEP's supercomputer
• NOAA NCEP mesoscale numerical weather prediction
• NOAA/EPA community model for air quality: CMAQ
• NOAA HYSPLIT model for smoke and dust prediction
Observational Input:
• NWS weather observations; NESDIS fire locations; climatology of regions with dust emission potential
• EPA emissions inventory
National Air Quality Forecast Capability End-to-End Operational Capability
Gridded forecast guidance products:
• On NWS servers: airquality.weather.gov and ftp servers (12 km resolution, hourly, for 48 hours)
• On EPA servers
• Updated 2x daily
Verification basis, near-real time:
• Ground-level AIRNow observations of surface ozone
• Satellite observations of smoke and dust
Customer outreach/feedback:
• State & local AQ forecasters, coordinated with EPA
• Public and private sector AQ constituents
Ozone predictions
Operational predictions at http://airquality.weather.gov over expanding domains since 2004
[Maps: 1-hr and 8-hr average ozone predictions]
CONUS, wrt 75 ppb threshold (operational predictions)
Maintaining prediction accuracy as the warning threshold was lowered and emissions of pollutants change
[Chart: fraction correct of the daily maximum of the 8-h average ozone wrt the 75 ppb threshold, April through August 2013; monthly values between 0.98 and 0.99]
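The categorical scores quoted in this deck (fraction correct here, POD and FAR in the forecaster feedback later) come from a 2x2 exceedance contingency table. A minimal sketch in Python, with made-up values rather than NAQFC data:

```python
# Categorical verification of daily-max 8-h ozone against a 75 ppb threshold.
# A sketch, not the operational NAQFC verification code.

def categorical_scores(forecast, observed, threshold=75.0):
    """Return fraction correct, POD, and FAR for exceedance forecasts."""
    hits = misses = false_alarms = correct_negatives = 0
    for f, o in zip(forecast, observed):
        f_ex, o_ex = f >= threshold, o >= threshold
        if f_ex and o_ex:
            hits += 1
        elif not f_ex and o_ex:
            misses += 1
        elif f_ex and not o_ex:
            false_alarms += 1
        else:
            correct_negatives += 1
    n = hits + misses + false_alarms + correct_negatives
    fc = (hits + correct_negatives) / n
    pod = hits / (hits + misses) if hits + misses else float("nan")
    far = false_alarms / (hits + false_alarms) if hits + false_alarms else float("nan")
    return fc, pod, far

# Four hypothetical forecast/observation pairs (ppb).
fc, pod, far = categorical_scores([80, 70, 60, 78], [82, 74, 58, 72])
# fc = 0.75, pod = 1.0, far = 0.5
```

A lower FAR with some loss of POD, as reported for Maryland, corresponds to fewer false alarms at the cost of a few missed exceedances in this table.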
Evaluation of experimental NAQFC ozone predictions for 2010, prior to emissions update
• T. Chai et al., Geosci. Model Dev., 2013 (http://www.geosci-model-dev.net/6/1831/2013/gmd-6-1831-2013.html)
• Ozone overestimation in August is larger in rural areas, during morning hours, and in the southeast US
• NO2 overestimation in August is larger at night
• Ozone biases are higher on weekends, but NO2 biases are higher on weekdays
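The hour-of-day bias stratification behind findings like these can be sketched as follows; the model/observation pairing below is hypothetical, not the Chai et al. analysis code:

```python
# Group paired model/observation values by hour of day to expose diurnal
# bias structure (e.g. larger morning ozone overestimation). Sketch only.
from collections import defaultdict

def mean_bias_by_hour(pairs):
    """pairs: iterable of (hour, model_value, observed_value) tuples."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for hour, model, obs in pairs:
        sums[hour] += model - obs   # bias convention: model minus observed
        counts[hour] += 1
    return {h: sums[h] / counts[h] for h in sums}

# Hypothetical ozone pairs (hour, model ppb, observed ppb).
pairs = [(6, 50, 40), (6, 48, 42), (14, 70, 72)]
bias = mean_bias_by_hour(pairs)  # {6: 8.0, 14: -2.0}
```

The same grouping by weekday vs. weekend, or by rural vs. urban site, yields the other stratifications mentioned above.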
Emissions data for NAQFC predictions since May 2014
• Area sources: US EPA projected 2012 Nonroad + 2005 NEI for other sectors; Canada 2006 Emission Inventory; Mexico 1999 EI for six border states
• Mobile sources (onroad): 2005 NEI with Cross-State Air Pollution Rule (CSAPR) projection for US sources; Canada 2006 Emission Inventory
• Point sources (EGUs and non-EGUs): NEI 2005 for base year, updated with 2012 Continuous Emission Monitoring (CEM) data for EGUs, projected into the forecast year using DOE Annual Energy Outlook (2014) factors
• Natural sources: terrestrial biogenic emissions (BEIS model v3.14); sea-salt emissions (CMAQ online sea-salt emission model)
NOx emissions
[Charts for Atlanta and Philadelphia: comparison to 2005 values of OMI NO2, AQS NOx, and NAQFC NOx emissions]
OMI = Ozone Monitoring Instrument on NASA's Aura satellite; AQS = Air Quality System
• Relying on projections rather than inventories for mobile sources
• Comparison of projected emissions with surface and satellite observations
(Tong et al. Long-term NOx trends over large cities in the United States: Intercomparison of satellite retrievals, ground observations, and emission inventories, submitted)
Partnering with AQ Forecasters
Focus group, State/local AQ forecasters:
• Participate in real-time developmental testing of new capabilities, e.g. aerosol predictions
• Provide feedback on reliability, utility of test products
• Local episodes/case studies emphasis
• Regular meetings; working together with EPA’s AIRNow and NOAA
• Feedback is essential for refining/improving coordination
Examples of AQ forecaster feedback after the emissions update in 2012:
• Good performance by the NAQFC ozone forecast in 2012 in the Philadelphia metropolitan area. (William Ryan, Penn State)
• In Connecticut, NOAA model outperformed [human] forecasts: 73% vs. 54%. The NOAA model past record of over-predicting during July-August didn’t occur this year. (Michael Geigert, Connecticut Dept. of Energy and Environmental Protection)
• In Maryland, NOAA ozone predictions have improved since 2011: significant improvement in false alarm ratio (FAR) with some decrease in probability of detection (POD). (Laura Landry, Maryland Department of the Environment)
• Bias and accuracy statistics for NAQFC ozone predictions improved in 2012 compared to 2011. (Cary Gentry, Forsyth County Office of Environmental Assistance and Protection, Winston-Salem, NC)
Smoke Predictions
• Smoke predictions for CONUS (continental US), Alaska and Hawaii
• NESDIS provides satellite based detections of wildfire locations
• Bluesky used to estimate emissions
• HYSPLIT model for transport, dispersion and deposition (Rolph et al., W&F, 2009)
• Recent updates include increased plume rise, decreased wet deposition, changes in daily emissions cycling
• Developed satellite product for verification (Zeng and Kondragunta)
Rim Fire in California
The largest wildfire ever recorded in Yosemite National Park; the fire started on August 17, 2013. Transport of smoke toward Reno, NV on 8/22/2013 was confirmed by GOES-14 satellite imagery, and the NWS office in Reno included smoke and haze in its forecast. Observed PM2.5 concentrations peaked around 2 pm LST; predicted surface concentrations peaked at 1 pm, and the highest predicted concentration was lower than observed.
[Chart: hourly PM2.5 (ug/m3, 0 to 200 scale) on August 22, 2013 (LST) at the Galletti, Reno, and Sparks monitors, with the NAQFC prediction]
Predictions at http://airquality.weather.gov; the prediction model was updated on July 26, 2013.
CONUS Dust Predictions
Operational predictions at http://airquality.weather.gov/
Standalone prediction of airborne dust from dust storms:
• Wind-driven dust emitted where surface winds exceed thresholds over source regions
• Source regions with emission potential estimated from MODIS deep blue climatology (2003-2006)
• Emissions modulated by real-time soil moisture
• HYSPLIT model for transport, dispersion and deposition (Draxler et al., JGR, 2010)
• Wet deposition updates in July 2013
• Developed satellite product for verification (Ciren et al., JGR, 2014)
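The wind-threshold dust emission rule above can be sketched as below. The cubic wind dependence, the constant k, and the linear soil-moisture damping are illustrative assumptions, not the operational HYSPLIT emission scheme:

```python
# Illustrative dust emission rule: emit only where surface wind exceeds a
# threshold over a source region, damped by real-time soil moisture.
# Functional form and constants are assumptions for the sketch.

def dust_emission_flux(wind_speed, threshold, source_potential, soil_moisture,
                       k=1.0e-9):
    """Return an illustrative dust flux (arbitrary units).

    wind_speed, threshold in m/s; source_potential in 0..1 (e.g. from a
    deep-blue climatology); soil_moisture as a volumetric fraction 0..1.
    """
    if wind_speed <= threshold or source_potential <= 0.0:
        return 0.0  # no emission below the wind threshold or off-source
    # Cubic wind dependence is a common choice for saltation-driven flux.
    flux = k * source_potential * wind_speed ** 3
    # Wet soil suppresses emission; linear damping chosen for simplicity.
    return flux * max(0.0, 1.0 - soil_moisture / 0.3)

assert dust_emission_flux(5.0, 8.0, 1.0, 0.1) == 0.0  # below threshold
```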
Testing of PM2.5 Predictions
Aerosols over CONUS, from NEI sources only:
• CMAQ: CB05 gases, AERO-4 aerosols
• Sea salt emissions
• No wildfire smoke and dust storm emissions
AQ Forecaster Focus group access, real-time as resources permit
Forecast challenges:
• Including sources for wildfire smoke and dust
• Chemical mechanism (e.g. SOA)
• Meteorology (e.g. PBL height)
• Chemical boundary conditions/trans-boundary inputs
Seasonal bias for PM2.5: winter, overprediction; summer, underprediction
NAQFC PM2.5 test predictions
Seasonal bias in PM2.5 predictions
The bias in the total mass of PM2.5 is dominated by overprediction of unspeciated PM in the winter and by underprediction of carbon aerosols in the summer. Saylor et al. found similar seasonal biases in speciation for CMAQ v4.6 vs IMPROVE data.
Mean (star), median (triangle), and inter-quartile ranges of model bias (model value minus observed value) for multiple fine-particle species measured at CSN sites in the 12 km domain. The number of model/observation pairs for each species is shown above the x-axis. [Base model is CMAQ 4.6] (Foley et al., Incremental testing of the Community Multiscale Air Quality (CMAQ) modeling system version 4.7, Geosci. Model Dev., 3, 205-226, 2010)
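The per-species bias statistics shown in the figure (mean, median, and interquartile range of model minus observed) can be computed as in this sketch; the values are hypothetical, not CSN data:

```python
# Summary statistics of model bias with the Python standard library.
from statistics import mean, median, quantiles

def bias_summary(model, observed):
    """Mean, median, and interquartile range of model - observed."""
    biases = sorted(m - o for m, o in zip(model, observed))
    q1, q2, q3 = quantiles(biases, n=4)  # quartiles of the bias distribution
    return {"mean": mean(biases), "median": median(biases),
            "iqr": (q1, q3)}

# Hypothetical paired concentrations (ug/m3) for one species.
summary = bias_summary([12, 15, 9, 20, 11], [10, 12, 10, 14, 12])
```

Running this per species (sulfate, nitrate, carbon, unspeciated PM) reproduces the kind of per-species comparison in the figure.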
Removal of bias in PM2.5 predictions
Unsystematic component of the RMSE (top panel) and systematic component of RMSE (bottom panel) using hourly values for the month of November 2010 evaluated at the 518 AIRNow PM2.5 sites.
Raw: CMAQ real-time test prediction
PERS: Persistence forecast
7-day: 7-day running mean subtracted
KF: Kalman-filter approach
ANKF: Analog forecast technique followed by Kalman filter approach
AN: Analog Forecast technique
KF-AN: Kalman-filter approach followed by Analog forecast technique
• Quality control of the observations is essential
• Five different post-processing techniques were tested
I. Djalalova, L. Delle Monache, and J. Wilczak: PM2.5 analog forecast and Kalman filter post-processing for the Community Multiscale Air Quality (CMAQ) model, manuscript in preparation
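A one-dimensional Kalman-filter bias correction in the spirit of the KF post-processing listed above can be sketched as follows; the noise variances q and r are illustrative choices, not the values used in the study:

```python
# Kalman-filter bias correction sketch: treat the running forecast bias as
# a hidden state, updated from each new forecast/observation pair.

def kalman_bias_correct(forecasts, observations, q=0.01, r=1.0):
    """Return bias-corrected forecasts; q = process noise, r = obs noise."""
    bias, p = 0.0, 1.0                    # bias estimate and its variance
    corrected = []
    for f, o in zip(forecasts, observations):
        corrected.append(f - bias)        # correct with the prior bias
        p += q                            # predict step: variance grows
        k = p / (p + r)                   # Kalman gain
        bias += k * ((f - o) - bias)      # update with the observed error
        p *= (1.0 - k)
    return corrected

# Hypothetical PM2.5 values (ug/m3): model runs consistently ~2 high.
out = kalman_bias_correct([12.0, 13.0, 12.5], [10.0, 11.0, 10.5])
```

The ANKF and KF-AN variants above chain this filter with the analog technique, in the two possible orders.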
Testing of incremental upgrades
Experiment labels:
• 2012 emissions: old recycled emissions from 2012
• 2014: 2014 emissions, with new point, area and mobile sources
• snow: PM emissions modulated by ice and snow cover
• NTR: NTR_fast (kinematic rate x10), i.e. faster sink as HNO3
• smoke: HMS-Bluesky, plume rise for wildfires
[Charts: RMSE for the 48 contiguous states (CONUS) and mean for the Northeastern US, comparing observations with real-time PM2.5 test predictions for the cumulative configurations: 2012 emissions; 2014; snow + 2014; NTR + snow + 2014; smoke + NTR + snow + 2014]
Long-range transport example
Use of lateral boundary conditions from a global model (left) in a CMAQ run reproduces the observed enhancement in fine particulate matter around July 21 in Miami, FL (right, red curve) due to trans-Atlantic transport of Saharan dust; the enhancement is not captured with static boundary conditions (right, green curve).
NAQFC developmental prediction of PM2.5 and observations in Miami, FL
Operational AQ forecast guidance airquality.weather.gov
Further information: www.nws.noaa.gov/ost/air_quality
Ozone products: nationwide since 2010
Smoke products: nationwide since 2010
Dust products: implemented in 2012
Summary and plans
US national AQ forecasting capability and recent updates:
• Operational ozone prediction nationwide; substantial emission update in 2012
• Operational smoke prediction nationwide; recent plume rise update
• Operational dust prediction for CONUS sources; emissions moderated by soil moisture
• Experimental ozone predictions for CONUS; updated emissions, lateral boundary conditions, deposition
• Prototype CMAQ aerosol predictions with NEI sources
If/when resources allow we plan to:
• Maintain operational AQ predictions
• Improve and transition currently experimental ozone predictions into operations
• Use lateral boundary conditions from global dust predictions in testing of PM2.5 predictions
• Test smoke predictions with 4 km meteorology and emission updates
Research needs for operational AQ predictions
Emissions inputs reflecting recent information:
• Emission inventories for base year 2011 are being evaluated (information lag of 4-6 years at best)
• Point source emissions use more recent information: two-year-old CEM data and current energy use projections
• Potential for improving emissions information for predictions using projections and/or recent observations
Improvements in quantitative prediction and speciation of PM2.5
• Chemistry (e.g. SOAs)
• Meteorology (e.g. planetary boundary layer representation)
• Lateral boundary conditions (from global predictions)
• Inclusion of observations (e.g. bias correction, data assimilation)
Acknowledgments: AQF Implementation Team Members
Special thanks to Paula Davidson, OST chief scientist and former NAQFC Manager, and to Jim Meager, former NOAA AQ Matrix Manager
NOAA/NWS/OST: Ivanka Stajner (NAQFC Manager)
NWS/OCWWS: Jannie Ferrell (Outreach, Feedback)
NWS/OPS/TOC: Cynthia Jones (Data Communications)
NWS/OST/MDL: Jerry Gorline, Marc Saccucci, Dave Ruth (Dev. Verification, NDGD Product Development)
NWS/OST: Sikchya Upadhayay (Program Support)
NESDIS/NCDC: Alan Hall (Product Archiving)
NWS/NCEP: Jeff McQueen, Jianping Huang, Perry Shafran (AQF model interface development, testing, & integration); *Sarah Lu (Global dust aerosol and feedback testing); *Brad Ferrier, *Eric Rogers, *Hui-Ya Chuang (NAM coordination); Geoff Manikin (Smoke and dust product testing and integration); Allan Darling, Chris Magee (NCO transition and systems testing); Mike Bodner, Andrew Orrison (HPC coordination and AQF webdrawer)
NOAA/OAR/ARL: Pius Lee, Daniel Tong, Tianfeng Chai, Hyun-Cheol Kim (CMAQ development, adaptation of AQ simulations for AQF); Roland Draxler, Glenn Rolph, Ariel Stein (HYSPLIT adaptations)
NESDIS/STAR: Shobha Kondragunta (Smoke and dust verification product development)
NESDIS/OSDPD: Liqun Ma, Mark Ruminski (Production of smoke and dust verification products; HMS product integration with smoke forecast tool)
EPA/OAQPS partners:
Chet Wayland, Phil Dickerson, Brad Johns, John White AIRNow development, coordination with NAQFC
* Guest Contributors
Recent progress and updates
The North American Mesoscale model, currently the Non-hydrostatic Multiscale Model (NMMB), was updated and migrated to new supercomputers
• These meteorological predictions are used for all air quality predictions (July 2013)
Ozone: substantial emission updates for 2012; these emissions are reused in 2013:
• Mobile6 used for mobile emissions, with emissions scaled by growth/reduction rates from 2005 to 2012
• Non-road area sources use the Cross-State Rule inventory
• Canadian emissions use the 2006 inventory
Dust predictions implemented operationally in March 2012:
• Dust emissions are modulated by real-time soil moisture
• Longer time step to speed up dust predictions implemented in October 2012
Smoke updates implemented in July 2013:
• Increased maximum plume rise limit from 0.75 to 1.25 of the PBL depth
• Decreased wet removal; changes in daily emissions cycling
• Made horizontal puff dispersion rate more consistent with particle dispersion
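The projection-based emission scaling described above, a base-year inventory scaled by growth/reduction factors, can be sketched as follows; the sector names and factors below are placeholders, not actual CSAPR projection values:

```python
# Scale base-year (2005 NEI) emissions by per-sector projection factors
# to approximate a more recent year. Sketch with placeholder numbers.

def project_emissions(base_emissions, factors):
    """Scale base-year emissions by per-sector projection factors.

    Sectors missing from `factors` are carried over unchanged.
    """
    return {sector: base_emissions[sector] * factors.get(sector, 1.0)
            for sector in base_emissions}

base_2005 = {"onroad_nox": 100.0, "onroad_voc": 80.0}   # arbitrary units
proj_2012 = {"onroad_nox": 0.6, "onroad_voc": 0.7}      # placeholder factors
emis_2012 = project_emissions(base_2005, proj_2012)
```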
All AQ predictions have migrated to new NCEP computers
Operational and experimental predictions: fraction correct
2011: operational predictions were substantially better than experimental predictions
2012: performance of experimental predictions is closer to that of operational predictions
Configuration changes between 2011 and 2012:
• Operational (CB-IV chemistry): 2011 used WRF-NMM with the 2005 NEI; 2012 uses NMM-B with 2012 emission projections
• Experimental (CB05 chemistry): 2011 used WRF-NMM with the 2005 NEI; 2012 uses NMM-B with 2012 emission projections, plus updates to LBC, dry deposition, and minimum PBL height
Smoke predictions
[Maps: surface smoke and vertical smoke predictions]
Operational predictions at http://airquality.weather.gov
Smoke Verification: July 13, 2009
7/13/09, 17-18Z: prediction vs. observation
GOES smoke product confirms the areal extent of peak concentrations
FMS = 30% for column-averaged smoke > 1 ug/m3
A manuscript about the smoke verification product is in preparation
Verification of smoke predictions against GOES smoke retrievals
Comparison for July 2012: model operational in July 2012 vs. model operational since 7/25/13
Updates: increased plume rise, decreased wet deposition, changes in daily emissions cycling
Verification of smoke predictions for CONUS
Daily time series of FMS for smoke concentrations larger than 1 ug/m3
Figure of merit in space (FMS) is the fraction of overlap between the predicted and observed smoke plumes; a threshold of 0.08 is marked by a red line
The NESDIS GOES Aerosol/Smoke Product is used for verification. Since July 26, 2013, the model includes the following updates: increased plume rise, decreased deposition, changes in daily emissions cycling
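The figure of merit in space, the fractional overlap (intersection over union) of predicted and observed plume areas above a concentration threshold, can be computed as in this sketch on small boolean grids with made-up concentrations:

```python
# Figure of merit in space (FMS): intersection area / union area of the
# predicted and observed plume masks above a threshold. Sketch only.

def figure_of_merit(predicted, observed, threshold=1.0):
    """predicted, observed: 2-D lists of concentrations (e.g. ug/m3)."""
    inter = union = 0
    for p_row, o_row in zip(predicted, observed):
        for p, o in zip(p_row, o_row):
            p_in, o_in = p > threshold, o > threshold
            inter += p_in and o_in   # cell inside both plumes
            union += p_in or o_in    # cell inside either plume
    return inter / union if union else float("nan")

# Hypothetical 2x2 grids of smoke concentration (ug/m3).
pred = [[2.0, 0.5], [3.0, 0.0]]
obs = [[1.5, 2.0], [0.2, 0.0]]
fms = figure_of_merit(pred, obs)  # 1 overlap cell / 3 union cells
```

An FMS of 1 would mean perfectly collocated plumes; the 0.08 line in the time series marks the skill threshold used here.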
[Charts: daily time series of FMS, on a 0 to 0.5 scale, for July 2013 and August 2013]
Phoenix, AZ dust event on July 5, 2011
Source: http://www.wrh.noaa.gov/psr/pns/2011/July/DustStorm.php
• A massive dust storm hit Phoenix, AZ in the evening on July 5, 2011
• The dust cloud was reported to be 5,000 feet tall when it hit; radar showed heights of 8,000-10,000 feet and a width of 50 miles
Source: http://www.huffingtonpost.com/2011/07/06/phoenix-dust-storm-photos-video_n_891157.html
• Originated from convection near Tucson
• Stopped air traffic for over an hour
• Arizona DEQ reported a PM10 concentration of 6,348 ug/m3 during the peak of the storm at a site in downtown Phoenix
• Storm moved through Phoenix at 30-40 mph
PM2.5 observations in Phoenix
• Based on observations, the period of greatest impact on Phoenix was between 8 PM and 10 PM LST
Note: Arizona is three hours behind Eastern time in summer
Predicted surface dust concentrations: 158 ug/m3 at 8 PM and 631 ug/m3 at 10 PM
Peak observed values: 996 ug/m3 at the South Phoenix station and 506 ug/m3 at the West Phoenix station, both at 8 PM (AIRNow Tech)
• Based on comparing predictions to observations, the timing of the storm looks accurate (albeit perhaps early: 63 ug/m3 predicted at 7 PM for Phoenix); however, the predictions keep the high levels seen at 10 PM LST for the next four to five hours, which is not seen in the observations
Long-range transport of key air pollutants
Transport of e.g. wildfire smoke, dust, ozone, and ozone precursors from global sources impacts U.S. air quality:
• Average impacts are small (e.g. 1 ppb of ozone from a 20% reduction of pollutant emissions on another continent), but
• Variability of impacts is large
• Some episodic impacts are large (example on the next slide)
Major atmospheric transport pathways affecting North America and their timescales
Global Sources of Local Pollution: An Assessment of Long-Range Transport of Key Air Pollutants to and from the United States, National Academies Press, 2009
Bias Correction Technique - Analog Forecast
The black curve is the NWP model prediction, while red indicates observed values. The top panel is the original time series; in the bottom panel the model and observation pairs are re-ordered so that the closest analog from the time series immediately precedes the forecast. (Courtesy of Luca Delle Monache, NCAR.)
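The analog (AN) idea illustrated above, predict using the observations that verified the most similar past forecasts, can be sketched as follows; the number of analogs and the similarity measure are illustrative choices, not the NCAR configuration:

```python
# Analog forecast sketch: rank past forecasts by similarity to the current
# forecast, then average the observations paired with the closest ones.

def analog_forecast(current, past_forecasts, past_observations, n_analogs=3):
    """Average the observations paired with the closest past forecasts."""
    ranked = sorted(range(len(past_forecasts)),
                    key=lambda i: abs(past_forecasts[i] - current))
    best = ranked[:n_analogs]
    return sum(past_observations[i] for i in best) / len(best)

# Hypothetical PM2.5 history (ug/m3): past forecasts and verifying obs.
past_f = [10.0, 20.0, 12.0, 30.0, 11.0]
past_o = [12.0, 18.0, 14.0, 25.0, 13.0]
pred = analog_forecast(11.5, past_f, past_o)  # mean of obs for 12.0, 11.0, 10.0
```

In practice the similarity measure typically uses a multivariate window of predictors rather than a single scalar difference.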
Diurnal Variation of PM2.5
The diurnal variation of PM2.5 averaged over all 518 AIRNow sites for the month of November, for the raw CMAQ model (red), AIRNow observations (black) and KFAN post-processed forecasts (blue).