
1 of 20

Overview: Scope and Goals

• Current operational NOAA instruments: AVHRR, HIRS, SBUV, GOES Imagers/Sounders; non-NOAA satellites/sensors; future: MetOp and NPP/NPOESS; GOES-R

• Assure Functionality of Current/Near-Future Operational Systems
  – Develop & maintain on-orbit calibration approach, algorithms, & databases
  – Oversee pre-launch sensor calibration
  – Perform post-launch checkout, monitoring, and trouble-shooting

• Apply Experience to Design of Future Systems
  – Define future measurement requirements & support development of future systems (NPP, NPOESS, GOES-R, active sensors)

• Assure Required Accuracy, Stability, Inter-comparability

• Serve Community of Data Providers and Users
  – NOAA customers
  – Other US Government Agencies
  – Academia and Industry
  – International partners

2 of 20

Examples

• Sensors, observations, and products
• Calibration or validation method & results
• Emphasis on topics not covered at IVOS 9
  – Broader ORA, NOAA, and external perspective than previously reported

3 of 20

Rapid Scan Winds and Hurricane Isabel

September 15, 2003 (wind vectors at 100-399 mb, 400-699 mb, and 700-950 mb)

4 of 20

GOES-10 IR Cloud Drift Winds

(wind vectors at 100-400 mb, 400-700 mb, and 700-1000 mb)

5 of 20

GOES-10 Visible Cloud-Drift Winds

6 of 20

GOES-12 Water Vapor Winds

(wind vectors at 100-250 mb, 250-350 mb, and 350-550 mb)

7 of 20

Winds from MODIS: An Arctic Example

Cloud-track winds from MODIS for a case in the western Arctic. The wind vectors were derived from a sequence of three images, each separated by 100 minutes. They are plotted on the first 11 µm image in the sequence.
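The vectors come from tracking cloud targets between successive images. As a hedged illustration only (not the operational NESDIS/CIMSS tracker), the sketch below locates a target box from one image within a search box in the next image by maximizing normalized cross-correlation, then converts the pixel displacement over the 100-minute interval into a velocity; the box sizes and pixel spacing are assumptions.

```python
# Illustrative sketch of cross-correlation feature tracking for one motion vector.
import numpy as np

def track_target(img1, img2, row, col, box=15, search=40,
                 dt_s=100 * 60.0, pixel_km=4.0):
    """Return (u, v) in m/s for the feature centered at (row, col) of img1.

    Assumes the target and search boxes lie well inside both images.
    """
    half = box // 2
    target = img1[row - half:row + half + 1, col - half:col + half + 1]
    target = (target - target.mean()) / (target.std() + 1e-9)

    best_score, best_dr, best_dc = -np.inf, 0, 0
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            cand = img2[row + dr - half:row + dr + half + 1,
                        col + dc - half:col + dc + half + 1]
            if cand.shape != target.shape:
                continue
            cand = (cand - cand.mean()) / (cand.std() + 1e-9)
            score = float((target * cand).mean())   # normalized cross-correlation
            if score > best_score:
                best_score, best_dr, best_dc = score, dr, dc

    # Convert the pixel displacement over the image interval to a velocity.
    u = best_dc * pixel_km * 1000.0 / dt_s    # eastward component (m/s)
    v = -best_dr * pixel_km * 1000.0 / dt_s   # northward (row index increases southward)
    return u, v
```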

8 of 20

An Arctic Example, cont.

Water vapor winds from MODIS for a case in the western Arctic. The wind vectors were derived from a sequence of three images, each separated by 100 minutes. They are plotted on the first 6.7 µm image in the sequence.

9 of 20

Validation of Satellite Derived Winds

Daily Comparisons With …

• Radiosonde Observations
  – Statistics reported quarterly to the World Meteorological Organization (WMO)/Coordination Group for Meteorological Satellites (CGMS)
• NOAA Wind Profiler Observations
  – Network of 34 profilers (central U.S.)
• National Centers for Environmental Prediction (NCEP) Global Forecast System (GFS) Analyses
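The statistics behind these comparisons are conventional wind-verification measures. A minimal sketch, with hypothetical input arrays, of the speed bias and vector RMS difference between satellite winds and a reference (radiosonde, profiler, or GFS analysis):

```python
# Minimal sketch of daily wind-comparison statistics against a reference dataset.
import numpy as np

def wind_validation_stats(u_sat, v_sat, u_ref, v_ref):
    u_sat, v_sat, u_ref, v_ref = map(np.asarray, (u_sat, v_sat, u_ref, v_ref))
    spd_sat = np.hypot(u_sat, v_sat)
    spd_ref = np.hypot(u_ref, v_ref)
    vec_diff = np.hypot(u_sat - u_ref, v_sat - v_ref)
    return {
        "n": int(u_sat.size),
        "speed_bias": float(np.mean(spd_sat - spd_ref)),                 # m/s
        "mean_vector_difference": float(np.mean(vec_diff)),              # m/s
        "rms_vector_difference": float(np.sqrt(np.mean(vec_diff ** 2))), # m/s
    }
```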

10 of 20

Monitoring the Quality of GOES Wind Products

http://www.orbit.nesdis.noaa.gov/smcd/opdb/goes/winds/html/tseries.html

11 of 20

Monitoring the Quality of GOES Wind Products

http://www.orbit.nesdis.noaa.gov/smcd/opdb/goes/winds/html/tseries.html

12 of 20

Merriman, Nebraska

13 of 20

Height Assignment
Level of Best-Fit Analysis

• Objective
  – Characterize heights assigned to Atmospheric Motion Vectors (AMVs)
• Data Source
  – Database of collocated GOES-8 AMVs and radiosonde wind profiles (Jan-Dec 2002)
• Level of Best-Fit
  – Defined to be the level at which the vector difference between the satellite wind and the radiosonde wind is a minimum
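The definition translates directly into a simple search over the collocated sonde profile. A minimal sketch, with hypothetical inputs:

```python
# Level of best fit: the sonde level minimizing the AMV-minus-sonde vector difference.
import numpy as np

def level_of_best_fit(u_amv, v_amv, p_raob, u_raob, v_raob):
    """p_raob (hPa) and u_raob, v_raob (m/s) describe the radiosonde profile."""
    vec_diff = np.hypot(np.asarray(u_raob) - u_amv, np.asarray(v_raob) - v_amv)
    i_best = int(np.argmin(vec_diff))
    return float(np.asarray(p_raob)[i_best]), float(vec_diff[i_best])

# Example: an AMV assigned at 300 hPa may best fit the sonde wind at another level.
p_best, d_min = level_of_best_fit(22.0, 5.0,
                                  [250, 300, 400, 500],
                                  [30, 26, 23, 15],
                                  [2, 4, 5, 3])
# -> p_best = 400.0 hPa
```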

14 of 20

Jan-Dec 2002 GOES-8 cloud-drift winds assigned at 300 mb
Original height assignment (OPW); no speed bias correction
All height-assignment methods (H2O-intercept, window)

(Histogram of vector difference, m/s)

15 of 20

Ocean Color - MOBY Vicarious Calibration

• Vicarious calibration is required for ocean color science.
• Pre-flight laboratory and on-board sensor calibrations (TOA 4-5%) cannot meet the accuracy requirements for ocean color science applications.
• A minimum of an order-of-magnitude improvement in calibration accuracy is required.
• MOBY presently provides water-leaving radiances at the 5% (surface) uncertainty level, which meets the minimum accuracy requirement. MOBY has demonstrated excellent long-term stability over the past 6.5 yrs.
• MOBY has provided a NIST-traceable scale for all of the major ocean color missions since the mid 1990s, a necessary requirement for producing climate-quality data records.

Courtesy D. Clark
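The vicarious gain concept can be summarized in one line: propagate the MOBY water-leaving radiance to the top of the atmosphere, add the path radiance predicted by the processing system, and ratio that expected TOA radiance against the sensor's measured TOA radiance. The sketch below is schematic only; the numbers are invented and the transmittance/path-radiance terms are assumed to come from the processing system's radiative transfer, not from this snippet.

```python
# Schematic of a MOBY-based vicarious gain for one ocean color band.
def vicarious_gain(lw_moby, diffuse_transmittance, path_radiance, lt_measured):
    """All radiances in the same units; returns the multiplicative gain."""
    lt_expected = path_radiance + diffuse_transmittance * lw_moby
    return lt_expected / lt_measured

# e.g. a ~1% gain adjustment, well below the 4-5% prelaunch TOA uncertainty
g = vicarious_gain(lw_moby=1.20, diffuse_transmittance=0.90,
                   path_radiance=7.50, lt_measured=8.49)
```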

16 of 20

MOBY instrument and spectral time series of MODIS ocean color bands (accuracy ~4-5%)

Courtesy D. Clark

17 of 20

NOAA Ocean Color Validation System and Related Activities

• Needed to assess accuracy of NOAA-generated ocean color products

• Maintained and enhanced NOAA Ocean Color Validation system and WWW homepage

• Initiated development of automated NRT ocean color product QC procedure

• Evaluated difference between near-real-time and climatological ancillary data in operational ocean color processing

http://wwwo2c.nesdis.noaa.gov/ocolor/validation/

18 of 20

NOAA Ocean Color Validation System

• Contains ≈ 9842 in-situ measurements of chlorophyll concentration collected by numerous sources in U.S. coastal waters

• Regression analysis performed on estimates derived from SeaWiFS imagery for evaluation purposes (n ≈ 2130)

Comparison between in-situ and SeaWiFS-derived (OC4v4) chlorophyll concentrations, generated for all CoastWatch regions.
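As a hedged sketch of the kind of regression evaluation described here, the snippet below compares in-situ and SeaWiFS-derived chlorophyll in log10 space (standard practice, since chlorophyll is roughly log-normally distributed); the array names are placeholders.

```python
# Regression of satellite-derived against in-situ chlorophyll in log10 space.
import numpy as np

def chl_regression(chl_insitu, chl_seawifs):
    x = np.log10(np.asarray(chl_insitu, dtype=float))
    y = np.log10(np.asarray(chl_seawifs, dtype=float))
    slope, intercept = np.polyfit(x, y, 1)
    r = np.corrcoef(x, y)[0, 1]
    rms = np.sqrt(np.mean((y - x) ** 2))   # RMS difference in log10 units
    bias = np.mean(y - x)                  # mean log10 ratio (satellite / in-situ)
    return {"n": int(x.size), "slope": float(slope), "intercept": float(intercept),
            "r2": float(r * r), "rms_log10": float(rms), "bias_log10": float(bias)}
```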

19 of 20

SST Product Validation

• Periodic global GAC AVHRR SST validation has been conducted continuously since 1983

• SST retrieval precision has gradually improved to within 0.5 K rms of buoy matchups

• GOES SST became operational in Dec. 2000

Global (GAC) AVHRR, 1983 - 2003

Courtesy ORA/ORAD/SST Team
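A minimal sketch of the buoy-matchup statistics behind the "within 0.5 K rms" statement, with hypothetical matchup arrays:

```python
# Bias and RMS of satellite-minus-buoy SST differences from a matchup dataset.
import numpy as np

def sst_matchup_stats(sst_sat_K, sst_buoy_K):
    diff = np.asarray(sst_sat_K, dtype=float) - np.asarray(sst_buoy_K, dtype=float)
    bias = float(diff.mean())
    rms = float(np.sqrt(np.mean(diff ** 2)))
    std = float(diff.std(ddof=1)) if diff.size > 1 else float("nan")
    return {"n": int(diff.size), "bias_K": bias, "rms_K": rms, "std_K": std,
            "meets_0p5K_rms": rms <= 0.5}
```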

20 of 20

SST Analysis with Imager Radiances

Summary of Accomplishments
• Differences between Navy NOAA-16 brightness temperatures (BTs) and simulated BTs w/o bias correction
• Sensitivities to changes in SST, Ta (atmospheric temperature), and Qa (atmospheric moisture)
• Bias correction of simulated observations
• Retrieved SSTs compared to Navy retrievals, NCEP SST analyses, and buoy data
• This analysis is important for future JCSDA activities involving direct use of window-channel radiances

Contributors: EMC: John Derber (PI), Xu Li; ORA/CIRA: Alexander Ignatov, Nick Nalli

(Panels: NCEP Analysis, NCEP First Guess, Navy retrieval, NCEP Retrieval)

21 of 20

GOES Aerosol and Smoke Product (GASP)

• Ongoing validation work
  – Monthly mean aerosol optical depth (AOD) comparisons with sunphotometer
  – Comparisons with MODIS AODs
  – Correlation studies with surface measurements of particulate concentrations

• Future work
  – Use aircraft and lidar aerosol profile data over the northeastern US from July 2004 to assess aerosol model assumptions and to evaluate retrievals

Transport of smoke from forest fires in Canada/Alaska, July 2004
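As an illustrative sketch of the monthly-mean AOD comparison, the snippet below averages collocated GASP and sunphotometer optical depths by calendar month and correlates the monthly means; the input structure is an assumption.

```python
# Monthly-mean AOD comparison between collocated GASP and sunphotometer data.
import numpy as np
from collections import defaultdict

def monthly_mean_aod(dates, aod):
    """dates: list of datetime.date; aod: matching optical depths."""
    buckets = defaultdict(list)
    for d, a in zip(dates, aod):
        buckets[(d.year, d.month)].append(a)
    months = sorted(buckets)
    return months, np.array([np.mean(buckets[m]) for m in months])

def compare_monthly(dates, aod_gasp, aod_sunphot):
    months, m_gasp = monthly_mean_aod(dates, aod_gasp)
    _, m_sun = monthly_mean_aod(dates, aod_sunphot)
    r = np.corrcoef(m_gasp, m_sun)[0, 1]
    return months, m_gasp, m_sun, float(r)
```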

22 of 20

Temperature Bias and RMS (Land and Sea Samples) With Cloud Test

(Profile plot: bias and RMS (deg K) versus pressure (mb), NSAMP = 8238; AIRS-F258+AQ:AMSU (192 P) and NOAA-16 (ATOVS) compared against collocated radiosondes)

Courtesy Mitch Goldberg

23 of 20

GPS IPW (Integrated Precipitable Water) stations operated by the NOAA Forecast Systems Lab.

Courtesy NOAA/FSL (S. Gutman)

24 of 20

Figure 3 (panels a-d). GPS-measured vs. AIRS-derived IPW for the CONUS (all match-ups within 0.25 deg lat/lon and within 30 min).

Red line: linear fit for the match-up data.
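A minimal sketch of the match-up and fit used for this figure: pair AIRS retrievals with GPS observations that meet the 0.25 deg / 30 min criteria, then fit a line to the pairs. All input arrays are hypothetical.

```python
# Match GPS and AIRS IPW within 0.25 deg and 30 min, then fit a line.
import numpy as np

def matchup_ipw(gps_lat, gps_lon, gps_time, gps_ipw,
                airs_lat, airs_lon, airs_time, airs_ipw,
                max_deg=0.25, max_minutes=30.0):
    pairs = []
    for i in range(len(gps_ipw)):
        for j in range(len(airs_ipw)):
            if (abs(gps_lat[i] - airs_lat[j]) <= max_deg and
                    abs(gps_lon[i] - airs_lon[j]) <= max_deg and
                    abs((gps_time[i] - airs_time[j]).total_seconds()) <= max_minutes * 60):
                pairs.append((gps_ipw[i], airs_ipw[j]))
    return np.array(pairs)

def linear_fit(pairs):
    x, y = pairs[:, 0], pairs[:, 1]            # GPS on x, AIRS on y
    slope, intercept = np.polyfit(x, y, 1)     # the "red line"
    r = np.corrcoef(x, y)[0, 1]
    return float(slope), float(intercept), float(r * r)
```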

25 of 20

NDVI Calibration/Validation

MODIS and AVHRR NDVIs compared for a site in South Dakota (16-day composites in 2001)

Effect of AVHRR spectral response on long-term NDVI time series studies

Courtesy ORA/SMCD/EMB

Courtesy X. Wu
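NDVI is computed the same way from either sensor's red and near-infrared reflectances, so differences in the two instruments' spectral response functions show up directly in the comparison above. A small sketch with placeholder reflectance values:

```python
# NDVI from red and NIR reflectances; band differences shift the result.
import numpy as np

def ndvi(red_reflectance, nir_reflectance):
    red = np.asarray(red_reflectance, dtype=float)
    nir = np.asarray(nir_reflectance, dtype=float)
    return (nir - red) / (nir + red)

# e.g. a band-dependent shift in red reflectance changes NDVI noticeably
print(ndvi(0.05, 0.40), ndvi(0.06, 0.40))   # ~0.778 vs ~0.739
```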

26 of 20

SBUV/2 Ozone Cal/Val: Level 2 Ground-based Comparisons

In press, Miller et al., JGR Atmospheres

27 of 20

NIST verification of HIRS spectral response functions

(HIRS Ch 1, CO2 Q-branch, 25 mb, worst case)

(Curves: Vendor; NIST at 30, 25, 20, and 15 C)

28 of 20

Intersatellite Calibration with Simultaneous Nadir Overpass (SNO) Observations

• Two satellites pass the same place at their nadirs within a few seconds

• Occurs for all satellites with different altitudes (typically once every few days); in the polar regions for sun-synchronous satellites

• SNO time series very useful for intersatellite calibration of IR/VIS/NIR and microwave sensors

• Website: www.orbit.nesdis.noaa.gov/smcd/spb/calibration/intercal/

• Several publications: Cao et al., J. Atmospheric & Oceanic Tech., 2004
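In essence an SNO event is a near-coincidence of the two sub-satellite points in space and time. A simplified sketch (thresholds and track format are assumptions, and operational SNO prediction is done from orbital elements rather than brute-force search):

```python
# Find SNO events from two satellites' nadir ground tracks.
import numpy as np

def _great_circle_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = (np.sin((lat2 - lat1) / 2) ** 2 +
         np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
    return 2 * radius_km * np.arcsin(np.sqrt(a))

def find_snos(track_a, track_b, max_seconds=30.0, max_km=50.0):
    """Each track: iterable of (time_seconds, lat_deg, lon_deg) nadir points."""
    events = []
    for t_a, lat_a, lon_a in track_a:
        for t_b, lat_b, lon_b in track_b:
            if abs(t_a - t_b) <= max_seconds:
                d = _great_circle_km(lat_a, lon_a, lat_b, lon_b)
                if d <= max_km:
                    events.append((t_a, t_b, float(d)))
    return events
```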

29 of 20

Applications of SNO method

• Used for postlaunch checkout, long-term monitoring of instrument performance, and re-analysis of historical data for time-series analysis to achieve intersatellite calibration consistency

• Procedure is being implemented for NOAA's AVHRR, HIRS, and AMSU instruments; also used for Terra/Aqua MODIS/AVHRR by MCST

• Future work for in-flight spectral cal with hyperspectral thermal sensors

• Complement to GOES/POES intercal

• Examples (to the right)
  – SNO method reveals seasonal biases caused by spectral response differences for HIRS (upper right)
  – SNO method shows excellent agreement between satellites for AMSU (lower right)

HIRS stratosphere channel (longwave infrared)

AMSU mid-troposphere channel (microwave)

30 of 20

AVHRR Reprocessing Project

• Re-processing of AVHRR data for better product quality for climate studies

• Standardize calibration coefficients and radiance calculation procedure, such as non-linearity correction and band correction coefficients (see the sketch after this list)

• Use the SNO method to establish the calibration link among satellites

• Provide consistent calibration coefficients with improved calibration accuracy
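The sketch below illustrates the radiance-calculation steps the reprocessing standardizes: counts to radiance with a quadratic non-linearity term, then radiance to brightness temperature via the inverse Planck function with a linear band correction. The calibration and band-correction coefficients shown are placeholders, not the operational values for any particular AVHRR.

```python
# Counts -> radiance -> brightness temperature for an AVHRR thermal channel (schematic).
import math

C1 = 1.1910427e-5   # mW/(m^2 sr cm^-4), first radiation constant
C2 = 1.4387752      # cm K, second radiation constant

def counts_to_radiance(counts, a0, a1, a2):
    """Quadratic calibration; radiance in mW/(m^2 sr cm^-1)."""
    return a0 + a1 * counts + a2 * counts ** 2

def radiance_to_tb(radiance, wavenumber_cm1, band_A, band_B):
    """Inverse Planck plus a linear band correction T = A + B * T*."""
    t_star = C2 * wavenumber_cm1 / math.log(1.0 + C1 * wavenumber_cm1 ** 3 / radiance)
    return band_A + band_B * t_star

# Example with made-up coefficients for an ~11 micron (~930 cm^-1) channel
rad = counts_to_radiance(420, a0=170.0, a1=-0.18, a2=1.0e-6)
tb = radiance_to_tb(rad, wavenumber_cm1=930.0, band_A=0.3, band_B=0.9985)
```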

(Time series, 1990-2004, of AVHRR Channel 4 intersatellite brightness temperature differences, ΔT (K), from SNO matchups: NOAA-10 vs. NOAA-11, NOAA-11 vs. NOAA-12, NOAA-14 vs. NOAA-15, NOAA-15 vs. NOAA-16, NOAA-16 vs. NOAA-17)

31 of 20

AVHRR VIS/NIR Vicarious Calibration using the Libyan Desert Target

Courtesy X. Wu

(Time series of NOAA-16 and NOAA-17 AVHRR albedo over the Libyan Desert, channels CH1, CH2, CH3)

32 of 20

GOES Star Based Calibration

Courtesy X. Wu, 2004

GOES-8 stars (520, 820, 934, 1191): normalized signal vs. date, with exponential trend fits

Star 520:  y = 9.6691 e^(-0.0494x), R² = 0.728
Star 820:  y = 9.6693 e^(-0.0451x), R² = 0.7783
Star 934:  y = 9.6691 e^(-0.0486x), R² = 0.6126
Star 1191: y = 9.6691 e^(-0.0437x), R² = 0.6971
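The trend lines above have the form y = a·exp(b·x). A hedged sketch of how such fits are commonly produced, via linear least squares on log(y); the data values in the example are invented:

```python
# Fit y = a * exp(b * x) to a star-signal time series via log-linear least squares.
import numpy as np

def fit_exponential(x, y):
    x = np.asarray(x, dtype=float)
    ln_y = np.log(np.asarray(y, dtype=float))
    b, ln_a = np.polyfit(x, ln_y, 1)            # ln(y) = ln(a) + b*x
    ln_fit = ln_a + b * x
    ss_res = np.sum((ln_y - ln_fit) ** 2)
    ss_tot = np.sum((ln_y - ln_y.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot                  # R^2 in log space
    return float(np.exp(ln_a)), float(b), float(r2)

# e.g. a slowly decaying normalized star signal over ~8 "date" units
a, b, r2 = fit_exponential([0.5, 2.5, 4.5, 6.5, 8.5],
                           [9.4, 8.6, 7.9, 7.2, 6.4])
```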

33 of 20

Status of NESDIS Uniform Instrument Monitoring System

• User-friendly (web-based) system with high-level info, documentation, meta-data, engineering & radiometric data archived; undertaken at NRC recommendation

• System requirements document completed December 2003

• System specification document in final stage of preparation

• Government team met 14 June 2004 to decide which architecture to employ

• Funding in place to develop prototype system in FY04-05.

Courtesy T. Kleespies

34 of 20

NPP/NPOESS Cal/Val Activities

• VIIRS, CrIS, ATMS, and OMPS will replace AVHRR, HIRS, AMSU, and SBUV

• NPP = NPOESS Preparatory Project (IPO/NASA)

• NPOESS Data Exploitation Team:
  – Oversight of and support to NPOESS cal/val
  – Independent verification and monitoring
  – Liaison between IPO and NOAA central/civilian users on cal/val issues
  – Cal/val support to NOAA-unique products
  – Ensure continuity from POES to NPOESS

• NPOESS prelaunch/postlaunch cal/val meetings

• Current focus on prelaunch tests, instrument performance issues, and potential science impacts

35 of 20

Summary

• NOAA/NESDIS has extensive calibration/validation experience, both prelaunch and postlaunch, covering atmosphere, sea surface temperature, ocean color, ozone, and land; validation uses a variety of sources, including other satellites, in-situ observations, and ground-based remote sensors

• In partnership w/ other NOAA offices, NASA, USGS, academia, and international partners

• SNO method very useful for on-orbit cal/val of all radiometers, and will be used for calibrating future instruments

• Continued support to NOAA operational instruments, including AVHRR, HIRS, AMSU, SBUV/2, and reprocessing for climate studies

• Preparing for the transition to MetOp, NPOESS, and GOES-R