Data Processing and Archive


Page 1: Data Processing and Archive

WFC3 PIPELINE PDR
Daryl Swade
February 13, 2001
Space Telescope Science Institute
1 of 32

Data Processing and Archive
Data Systems Team
www.stsci.edu/software/OPUS/
www.dpt.stsci.edu/
www.at.stsci.edu/

Page 2: Data Processing and Archive


General Requirements for OPUS Science Data Processing

- OPUS will develop the WFC3 pipeline based on the existing pipeline model for ACS and NICMOS, including OTFR.
- Level 0 data is packetized and Reed-Solomon corrected by PACOR at GSFC.
- Receive science telemetry (level 1a data) at STScI as science-instrument-specific 'pod files'.
  – Engineering snapshot included
  – No on-board compression for WFC3 data
- STScI processing on Compaq Alpha / Tru64 UNIX platform

Page 3: Data Processing and Archive


General OPUS Requirements for Science Data Processing (cont.)

- OPUS must account for all scheduled exposures.
- Convert telemetry to FITS format
  – Structure tables or data array
  – Populate header keywords
    > Keywords provide metadata for the archive catalog
- Associate groups of exposures that must be processed further as a single unit
- Execute calibration tasks in pipeline mode
- Pass level 1b science data (pod files and uncalibrated science datasets) and jitter files to the Hubble Data Archive
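The "convert telemetry to FITS format" step above ultimately emits 80-character header cards packed into 2880-byte FITS blocks. A minimal stdlib sketch of that card/block mechanics, with purely illustrative keyword values (the real WFC3 keyword set comes from ICD-19):

```python
# Sketch of FITS header packaging as done in Generic Conversion: each
# keyword becomes one 80-character card, and the header is padded to a
# 2880-byte FITS block. Keyword values here are illustrative.
def card(keyword, value, comment=""):
    """Format one FITS header card: keyword, '= ', value, ' / ', comment."""
    if isinstance(value, str):
        v = f"'{value:<8}'"
    elif isinstance(value, bool):
        v = "T" if value else "F"
    else:
        v = str(value)
    text = f"{keyword:<8}= {v:>20} / {comment}"
    return text[:80].ljust(80)

cards = [
    card("SIMPLE", True, "conforms to FITS standard"),
    card("BITPIX", 16, "bits per data value"),
    card("NAXIS", 0, "null data array in primary HDU"),
    card("TELESCOP", "HST"),
    card("INSTRUME", "WFC3"),
    "END".ljust(80),
]
# pad the joined cards out to one 2880-byte FITS block
header = "".join(cards).ljust(2880)
```

In the real pipeline the primary header carries only keywords (a null data array), with the science pixels in image extensions, as described on the Generic Conversion slides.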

Page 4: Data Processing and Archive


OPUS Requirements for Thermal Vac

- OPUS will develop a WFC3 science data processing pipeline capable of supporting Thermal Vac testing
- No support schedule available in PMDB format
  – Necessary support schedule information to be read in from an ASCII file
- No associations
- No calibration
- Data will be archived to the HDA
- Processing on Sun / Solaris UNIX platform

Page 5: Data Processing and Archive


OPUS Requirements for Thermal Vac (cont.)

- Current Thermal Vac OPUS pipeline delivery schedule
  – Earliest possible WFC3 Thermal Vac currently scheduled for October 7, 2002
  – OPUS Thermal Vac pipeline due about two months prior to Thermal Vac: August 5, 2002
  – Beta version of the OPUS pipeline due about five months prior to Thermal Vac: May 6, 2002

Page 6: Data Processing and Archive


OPUS science data processing pipeline for WFC3

[Pipeline flow diagram: a pod file (level 1 data) enters Data Partitioning and moves as an EDT dataset through Data Validation and Generic Conversion. The Support Schedule (fed by the PMDB and PDB) and the keyword database (with its keyword rules and keyword exceptions tables) feed Generic Conversion, which emits an uncalibrated FITS science dataset. The Data Collector groups uncalibrated FITS science datasets; the World Coordinate System process and the calibration reference files and tables feed Calibration, which produces the calibrated FITS science dataset passed to the Archive Interface.]

Page 7: Data Processing and Archive


OPUS Processes

- Data Partitioning
  – segments the telemetry stream into a standard EDT dataset
  – fill data inserted if telemetry drop-outs exist
    > constructs a data quality image to ensure that subsequent science processing does not interpret fill data as valid science data
- Support Schedule
  – gathers proposal information from the PMDB
  – test proposals required for development
    > test version of the PMDB must be populated by TRANS
    > Thermal Vac support schedule to be input from an ASCII file
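The fill-data bookkeeping done by Data Partitioning can be sketched in a few lines: wherever telemetry dropped out, fill values go into the science array and the matching pixels are flagged in a data quality (DQ) image, so downstream code masks on the DQ array rather than on sentinel values. The sentinel and flag values below are illustrative, not the actual WFC3 DQ convention.

```python
# Sketch of fill-data flagging in Data Partitioning. FILL_VALUE and
# DQ_FILL_FLAG are illustrative stand-ins for the real conventions.
import numpy as np

FILL_VALUE = -1        # sentinel written where telemetry dropped out
DQ_FILL_FLAG = 1       # DQ bit meaning "fill data, not valid science"

sci = np.array([[10, 20, FILL_VALUE],
                [30, FILL_VALUE, 60]], dtype=np.int32)

dq = np.zeros_like(sci)
dq[sci == FILL_VALUE] = DQ_FILL_FLAG   # flag every fill pixel

# Subsequent science processing selects on the DQ image, never on the
# sentinel value itself.
valid = sci[dq == 0]
```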

Page 8: Data Processing and Archive


OPUS Processes (cont.)

- Data Validation
  – decodes the exposure and engineering parameters in the telemetry and compares them to the planned values
  – internal header specification (from Ball)
    > PDB (EUDL.DAT, TDFD.DAT) must be fully populated and defined in DM-06
  – flags and indicators need to be verified by the Instrument Scientists, but will likely be the same as ACS for the UVIS channel and NICMOS for the IR channel
- World Coordinate System
  – implements a translation from telescope coordinates through the instrument light-path to an astronomically valid pointing
  – aperture positions must be defined
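The final product of the WCS step is a pixel-to-sky mapping written into the header. As a rough illustration of that endpoint only (the real process also folds in the light-path translation and aperture definitions the slide calls for), here is a small-angle tangent-plane sketch with invented reference values:

```python
# Illustrative pixel-to-sky mapping from WCS reference quantities: a
# reference pixel (CRPIX), its sky position (CRVAL), and a CD matrix
# encoding plate scale and orientation. All numbers are made up; roll
# is assumed zero and the tangent plane is treated as flat.
import math

CRPIX = (512.0, 512.0)          # reference pixel (x, y)
CRVAL = (150.0, 2.0)            # its sky position (RA, Dec), degrees
CD = [[-1.1e-5, 0.0],           # degrees per pixel
      [0.0, 1.1e-5]]

def pixel_to_sky(x, y):
    dx, dy = x - CRPIX[0], y - CRPIX[1]
    dra = CD[0][0] * dx + CD[0][1] * dy
    ddec = CD[1][0] * dx + CD[1][1] * dy
    # RA offsets on the sky shrink by cos(Dec)
    return (CRVAL[0] + dra / math.cos(math.radians(CRVAL[1])),
            CRVAL[1] + ddec)

ra, dec = pixel_to_sky(512.0, 512.0)   # the reference pixel maps to CRVAL
```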

Page 9: Data Processing and Archive


OPUS Processes (cont.)

- Generic Conversion
  – Generic Conversion outputs uncalibrated data
  – data will be output in standard FITS format with image or table extensions
  – primary header will contain keywords inherited by all extensions, and a null data array
  – separate headers and data formats for UVIS and IR channel data
    > UVIS channel keywords based on ACS
    > IR channel keywords based on NICMOS

Page 10: Data Processing and Archive


OPUS Processes (cont.)

- Generic Conversion (cont.)
  – Data formats
    > UVIS channel follows the ACS data format
      • each file contains two imsets, one for each chip
      • imset contains science, error, and data quality arrays
    > IR channel follows the NICMOS data format
      • each file contains an imset for each readout
      • imset contains science, error, data quality, data samples, and effective integration time arrays
    > data quality array will be null if there are no telemetry dropouts
      • calibration generates a full data quality array with all other DQ flags
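The two imset layouts above can be sketched with plain dictionaries standing in for FITS extensions. Array shapes and the readout count here are deliberately small and illustrative, not the real detector dimensions:

```python
# Sketch of the UVIS-style and IR-style imset layouts. Dictionaries stand
# in for FITS image extensions; shapes are illustrative only.
import numpy as np

def uvis_imset(shape=(64, 64)):
    """One UVIS imset: science, error, and data quality arrays."""
    return {"SCI": np.zeros(shape, np.float32),
            "ERR": np.zeros(shape, np.float32),
            "DQ":  np.zeros(shape, np.int16)}

def ir_imset(shape=(64, 64)):
    """One IR imset: SCI/ERR/DQ plus data samples and integration time."""
    return {"SCI":  np.zeros(shape, np.float32),
            "ERR":  np.zeros(shape, np.float32),
            "DQ":   np.zeros(shape, np.int16),
            "SAMP": np.zeros(shape, np.int16),    # data samples
            "TIME": np.zeros(shape, np.float32)}  # effective integration time

uvis_file = [uvis_imset(), uvis_imset()]     # two imsets, one per chip
ir_file = [ir_imset() for _ in range(4)]     # one imset per readout
```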

Page 11: Data Processing and Archive


OPUS Processes (cont.)

- Generic Conversion (cont.)
  – Required for development
    > DM-06 to develop algorithms for data formatting
    > keyword definitions (ICD-19) must be provided by the Instrument Scientists
      • world coordinate definitions
      • exposure time calculations
      • calibration switches and selection criteria
      • calibration file name keywords

Page 12: Data Processing and Archive


Keyword specification

- The following information must be provided by the STScI Science Instrument team for all WFC3-specific keywords, using a standard form for keyword database input:
  – keyword name
  – default value
  – possible values
  – units
  – datatype
  – short comment for header
  – long description
  – header position
  – DADS table
  – keyword source
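One entry on that standard form can be pictured as a record with one field per item above. The field types and the example values below are assumptions for illustration; "DETECTOR" and the table name are hypothetical stand-ins, not entries from the actual keyword database:

```python
# Sketch of one keyword-database entry. Field names follow the slide;
# types and the example values are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class KeywordSpec:
    name: str                # keyword name (8 characters or fewer in FITS)
    default: object          # default value
    possible_values: list    # allowed values; empty if unconstrained
    units: str
    datatype: str            # e.g. "string", "integer", "real", "logical"
    comment: str             # short comment for header
    description: str         # long description
    header_position: str     # which header the keyword lives in
    dads_table: str          # DADS / archive catalog table it populates
    source: str              # keyword source (telemetry, PMDB, computed, ...)

spec = KeywordSpec(
    name="DETECTOR", default="UVIS",
    possible_values=["UVIS", "IR"], units="", datatype="string",
    comment="detector in use", description="WFC3 channel for this exposure",
    header_position="primary", dads_table="wfc3_science",  # hypothetical
    source="telemetry",
)
```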

Page 13: Data Processing and Archive


OPUS Processes (cont.)

- Data Collector
  – OPUS will ensure all necessary component exposures are present before processing further
  – association table contains all information about the product dataset
    > dataset is self-documenting
  – only associations required for data processing will be constructed in the OPUS pipeline
    > association created from a single proposal logsheet line
  – error condition actions to be defined by the Instrument Scientists
    > rules for processing incomplete associations
    > association time-out rules
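The Data Collector's completeness check amounts to holding exposures until every member listed in the association table has arrived, then releasing the group as one unit. A sketch with invented exposure identifiers:

```python
# Sketch of the Data Collector: exposures accumulate until the membership
# named in the association table is complete. Identifiers are illustrative.
expected_members = {"ipppssoot1", "ipppssoot2", "ipppssoot3"}  # from the
                                                               # association table
received = set()

def receive(exposure_id):
    """Record an arriving exposure; return the members once complete."""
    received.add(exposure_id)
    if expected_members <= received:
        return sorted(received)   # ready: process further as a single unit
    return None                   # keep waiting (subject to time-out rules)

assert receive("ipppssoot1") is None
assert receive("ipppssoot3") is None
result = receive("ipppssoot2")    # last member arrives; association complete
```

What happens when the set never completes (the time-out and incomplete-association rules) is exactly what the slide leaves to the Instrument Scientists.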

Page 14: Data Processing and Archive


Calibration

- OPUS will use STSDAS calibration software
- Run on Alpha / Tru64 UNIX platform in operations
- Expands size of dataset
  – converts integer raw data to real
  – populates error and data quality arrays
- Need calibration reference files for testing (at least dummies)

Page 15: Data Processing and Archive


Other Science Data Modes

- Requirements for the data content of each of these other science data modes must be defined by the Instrument Scientists
  – microprocessor memory dump
  – engineering diagnostic data
- No target acquisition observations for WFC3

Page 16: Data Processing and Archive


Engineering data processing

- Receive engineering telemetry data from CCS at GSFC
- Process engineering data through the FGS Data Pipeline
- Generate data products to characterize jitter and pointing control information in support of science observations
- WFC3 jitter file association packaging will mimic science data associations
- No other WFC3-specific requirements

Page 17: Data Processing and Archive


OPUS / Archive Interface

- OPUS will present to the archive:
  – original data received from PACOR (binary pod files)
  – pod file data packaged by observation in FITS format
  – output of Generic Conversion (uncalibrated science dataset) in FITS format
  – output of STSDAS calibration (calibrated science dataset) in FITS format
  – jitter files from the engineering telemetry in FITS format
  – data from other science modes (target acquisition, memory dump, engineering diagnostic data) in FITS format

Page 18: Data Processing and Archive


[Archive system diagram: OPUS and OTFR feed the archive ingest process with keyword metadata, level 1 data, raw science data, and ancillary data. The archive engine (jukebox control, optical storage, optical retrieval, data migration) writes pipeline data and reprocessed science data to the Hubble Data Archive on tape, CD, and DVD jukeboxes. The distribution process (request processor, Internet data distribution, hard media data distribution, remote site data distribution) returns science data to archive users by FTP delivery or hard media via the US Postal Service, driven by e-mailed science data requests against the catalog on ZEPPO. The catalog is replicated to the remote archive sites: CADC (Victoria, B.C., Canada), ECF (Garching, Germany), and NAOJ (Mitaka, Tokyo, Japan).]

Page 19: Data Processing and Archive


Archive Ingest – Catalog Population

- Archive catalog populated from FITS calibrated and uncalibrated science datasets
- Header keyword values used to populate archive catalog database fields
  – since keywords are based on the ACS and NICMOS design, archive catalog tables will also correspond to the ACS and NICMOS design
  – engineering snapshot keywords from the spt file used to populate instrument tables for trending analyses
- Associations will also follow the ACS/NICMOS design
  – current database schema will work for WFC3 associations

Page 20: Data Processing and Archive


Archive Ingest – Data Store

- Both binary and FITS versions of pod files written to archive media
  – FITS pod files planned to be the input for OTFR
- Currently, Generic Conversion output (uncalibrated FITS dataset) is written to archive media
  – may cease to write this dataset to archive media if FITS pod files and OTFR prove sufficient for the archive
- gzip file compression performed on all files prior to writing to archive media
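The compression step before an archive write is plain gzip. A self-contained sketch with an invented file name and stand-in file contents:

```python
# Sketch of gzip compression before writing a file to archive media.
# The file name and contents are stand-ins, not real pipeline products.
import gzip
import os
import shutil
import tempfile

workdir = tempfile.mkdtemp()
src = os.path.join(workdir, "example_raw.fits")
with open(src, "wb") as f:
    f.write(b"\x00" * 2880 * 10)        # stand-in for a FITS file

dst = src + ".gz"
with open(src, "rb") as f_in, gzip.open(dst, "wb") as f_out:
    shutil.copyfileobj(f_in, f_out)     # compress before the media write

ratio = os.path.getsize(dst) / os.path.getsize(src)
```

The same compression pays off twice: smaller archive media footprint here, and reduced outbound network load at distribution time (slide 25).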

Page 21: Data Processing and Archive


StarView 6

Page 22: Data Processing and Archive


OTFR

- In OTFR (On-The-Fly Reprocessing), data retrieved from the archive are reprocessed from the pod file
  – provides the HST data user with the optimal product at the time of retrieval
    > calibration updates, bug fixes, and new software features and algorithms available to archive users
  – the OTFR pipeline uses exactly the same code as current pre-archive science data processing
    > reduces software development and maintenance costs
    > no science-instrument-specific code developed for OTFR beyond what is necessary for pre-archive data processing
  – adds negligible time to retrievals

Page 23: Data Processing and Archive


OTFR

[OTFR data-flow diagram: level 0 science telemetry from HST passes through PACOR/DDF, which delivers level 1 science telemetry (pod files) to OPUS science data receipt. Initial processing sends pod files, initially calibrated data, and metadata to DADS ingest and the archive catalog, with uncalibrated data and pod files landing on archive storage media. When an archive user's StarView query produces a data request, DADS retrieval pulls the pod file(s) through a pod-file filter into the OTFR pipeline, which regenerates optimally calibrated science data for DADS distribution to the user.]

Page 24: Data Processing and Archive


Data Repair

- Problems from bad telemetry must be repaired in order for the data to process automatically through the OTFR system.
- There are two methods for handling data with bad telemetry values:
  – use a binary editor to 'fix' the pod file
  – OTFR has a built-in mechanism to process from the EDT set, a partially processed version of the data that contains ASCII files that can be edited
    > the EDT set can be archived for problematic exposures

Page 25: Data Processing and Archive


Data Distribution

- gzip compression will reduce outbound network load
- Alternative media if the Internet becomes the bottleneck:
  – tape (current)
  – CD (future)
  – DVD (future)

Page 26: Data Processing and Archive


Code Reuse

- Data processing and archive systems are designed for multi-mission / multi-instrument use.
  – OPUS has a core system that consists of blackboard and pipeline control.
    > Instrument-specific applications plug in to the core system.
  – DADS is being redesigned to be less HST-specific.
    > Archive catalog contains both general and instrument-specific tables.

Page 27: Data Processing and Archive


OPUS Code Reuse

- Core OPUS system (OPUS 12.1)
  – ~236,000 lines of code
  – 100% reuse
- WFC3-specific processes
  – based on the FUSE study (Rose et al. 1998, "OPUS: The FUSE Data Pipeline", www.stsci.edu/software/OPUS/kona2.html)
    > 5076 lines of code
    > 71% reuse of existing OPUS modules
- Expect > 99% reuse of existing data processing software for WFC3, based on lines of code.
  – All SI complexity contained in relatively few lines of code.
- Efficient use of existing system!
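The "> 99%" figure follows directly from the numbers above, assuming the WFC3-specific layer matches the FUSE study's size and reuse fraction: only the non-reused 29% of those 5076 lines is new code against the whole system.

```python
# Quick check of the "> 99% reuse" claim from the slide's own numbers.
core_lines = 236_000          # core OPUS system, reused wholesale
si_lines = 5_076              # instrument-specific layer (FUSE figure)
new_lines = si_lines * (1 - 0.71)   # only 29% of the SI layer is new

reuse_fraction = 1 - new_lines / (core_lines + si_lines)
# comes out just above 0.99, consistent with the slide's estimate
```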

Page 28: Data Processing and Archive


Archive Systems Code Reuse

- Archive changes:
  – add WFC3-specific tables to the archive catalog
  – add WFC3 data to PI paper products
  – add WFC3 to HST remote archive site distribution
  – define default file types for CAL, UNCAL, and DQ flags on StarView retrievals
  – add WFC3-specific screens to StarView
- Estimate ~98% reuse

Page 29: Data Processing and Archive


Operational Considerations

- Data processing and archive operational system sizing is based on SSR and TDRSS capacity
  – current downlink limit is ~16 Gbits/day (20 minutes of TDRSS contact per orbit)*
    > January 2001 downlink average was ~4.2 Gbits/day
  – post-SM3b downlink limit expected to be 29 Gbits/day (18 – 35 minute TDRSS contacts per day)*
    > average daily science data downlink expected to be ~16 Gbits/day**
  – under consideration: post-SM4 downlink limit of 48 Gbits/day (2 – 35 minute TDRSS contacts per orbit)*

* WFC3 ISR 2001-xxx, "Data Volume Estimates for WFC3 Spacecraft and Ground System Operations," C.M. Lisse and R. Henry, 18-Jan-2001 version
** ACS ISR-97-01, "HST Reference Mission for Cycle 9 and Ground System Requirements," M. Stiavelli, R. Kutina, and M. Clampin, July 1997.

Page 30: Data Processing and Archive


Operational Considerations (cont.)

- Processing power: no problem
  – ACS IPT results show processing of 33 Gbits of downlink data in 150 exposures in under 2 hours on ODOcluster1 (no dither)
- Processor memory
  – ODOcluster1 memory of 2 GB per ES-40 is sufficient for ACS WFC calibration, which requires about 200 MB per dataset at any one time
- Disk space
  – needs to be re-evaluated for WFC3 CDR
  – consider pre-archive and OTFR pipeline processing work space as well as calibration reference file space requirements

Page 31: Data Processing and Archive


Major Science Data Processing Requirements Summary

- Internal header specification (from Ball)
  – DM-06 to document content and format of the science internal header
  – PDB (EUDL.DAT, TDFD.DAT) defined and fully populated
- Keyword definitions (from the STScI Science Instrument team)
- Flags and indicators for Data Validation (from the STScI Science Instrument team)
- Aperture definitions (from the STScI Science Instrument team)

Page 32: Data Processing and Archive


Test Data Requirements

- Test data from detectors on the optical bench expected on March 4, 2002, and from the integrated instrument on August 5, 2002
- Test data to be provided by IPT/Instrument Scientists and Engineers should include all science modes
- Test data must include
  – PMDB population and PDB definition
  – list of possible error conditions to simulate
  – data that simulate error conditions
  – enough data for a throughput test
  – engineering data to test jitter file production