
Project Documentation Document PMCS-0019

Revision A

Advanced Technology Solar Telescope 950 N. Cherry Avenue Tucson, AZ 85719 Phone 520-318-8108 [email protected] http://dkist.nso.edu Fax 520-318-8500

Data Handling System Project Management

Bret Goodrich Software

September 1, 2015


REVISION SUMMARY:

1. Date: Aug 5, 2011; Revision: DRAFT1; Changes: Created
2. Date: Sep 29, 2011; Revision: DRAFT2; Changes: Updates from internal review
3. Date: Sep 1, 2015; Revision: Rev A; Changes: Updated for baseline


Table of Contents

1. INTRODUCTION
2. CONSTRUCTION
2.1 FABRICATION
2.2 PURCHASE
2.3 INTEGRATION, TESTING, AND COMMISSIONING
3. QUALITY CONTROL
3.1 VERIFICATION
3.1.1 Requirements Verification
3.1.2 Design Verification
3.1.3 As-Built Verification
3.1.4 Documentation Verification
3.2 QUALITY ASSURANCE TASKS
3.2.1 Unit Testing
3.2.2 Integration Testing
3.2.3 User Acceptance (Verification) Testing
3.3 VERIFICATION TEST PLAN
3.3.1 Unit Tests
3.3.2 Integration Tests
4. RISK
4.1 RISK REGISTER
4.2 RISK MITIGATION
4.2.1 HLSC-007: Expanded scope for NSO data handling integration
4.2.2 HLSC-073: Scope refinement; specification documents in transition
4.2.3 DHS-04: More bandwidth required to base facility
5. COST ESTIMATES
5.1 COST ESTIMATE
5.1.1 Construction Labor
5.1.2 Integration, Testing, and Commissioning Labor
5.1.3 Construction Non-labor
5.1.4 Contingency
6. SCHEDULE
7. CONFIGURATION ITEM DATA LIST


1. INTRODUCTION

This document discusses the project management of the DKIST Data Handling System (DHS). It is concerned with construction and development plans, quality control and assurance, risks, costs, and schedule. Each of these areas is covered in sections of this document.


2. CONSTRUCTION

Construction phase planning involves the fabrication, purchase, and integration of the DHS. Planning for each of these areas is described here.

2.1 FABRICATION

Fabrication of the DHS will take place at NSO/Tucson. Fabrication involves the development of the DHS software, the construction of a mini-DHS, and the construction of one DHS camera line. All DHS equipment resides in the Tucson DKIST Computer Room in a dedicated rack with independent power and network management. Fabrication of the other camera lines will be performed during integration, testing, and commissioning (IT&C) activities at the DKIST Haleakalā site and is further described in the IT&C section of this document.

The development of the DHS software requires three distinct hardware configurations. Since it is highly likely that available commodity data storage volume and transfer rates will increase over the 5 years of DHS construction, the project has chosen to purchase only minimal hardware in each construction phase. The first phase of construction develops the individual DHS components and tests their suitability for the required volumes and rates. The second phase of construction integrates the DHS components into a single camera line, useful for both DHS tests and development and testing of DKIST cameras and instruments. The third phase of construction fabricates a single camera line using the technology that will be deployed at the site.

During the first two years of construction, the DHS is implemented upon three fast, multi-core computers, each with 10 GBit Ethernet support. These are used generally as a data source, data sink, and resource server, although they may be configured in other ways for specific tests (e.g., multiple sources or multiple sinks). Additional DKIST computers can also be employed to test requirements where lower bandwidths are acceptable. During this period the DHS will construct the bulk data transport, the quality assurance display, the data storage system, and the data processing pipeline.

During the next two years of construction, the DHS adds a dedicated computer to receive high-speed bulk data, display it, and store it. This is the mini-DHS computer, which should be capable of testing the DKIST cameras and instruments concurrently under development. A total of five mini-DHS systems will be constructed and delivered to the DKIST instrument program. The mini-DHS will also be capable of interfacing with the external data processing components of the VBI (currently envisioned as a set of GPUs).

The final year of construction upgrades the NSO/Tucson mini-DHS to the latest hardware available within the allowed budget. It may be possible at this time to affordably purchase any or all of: solid state disks (SSDs), 10 TByte disk capacity, 40 or 100 Gbit networks, and commodity storage management hardware and software. The camera line will be extensively tested with the new DHS management computers. These computers will provide the final DHS management servers shipped and installed at the site, and will include DHS software for header databases, experiment and pipeline management, data processing sandboxes, and data export.

2.2 PURCHASE

The five mini-DHS systems will be purchased as required by the instrument developers. The mini-DHS computer will be purchased as a commodity item from the least expensive source. There is some relative advantage in purchasing all systems at once from a single vendor, but this may be impractical due to the potential of widely spaced instrument development schedules.

The final developmental camera line and the site array of camera lines will be purchased through a single vendor. The vendor will deliver one hardware camera line to NSO/Tucson during the final year of construction, and then deliver four more lines at the site during IT&C. Although camera lines are considered discrete components of the DHS for every camera data stream, the actual hardware implementation of multiple lines should combine the resources and manage them as a single unit. This allows the DHS during operation to reconfigure the volume, number, and sources and sinks for each camera line on an experiment-by-experiment basis.

The request for proposal for the DHS purchase will require a bid for the following materials:

• Five (5) high-speed, simultaneous read/write disk storage units (DHS data store and calibration store);
• Five (5) multi-core computers (DHS transfer);
• Five (5) multi-core computers (DHS DSD client node);
• Five (5) multi-core computers (DHS processing node);
• Three (3) multi-core computers (DHS controller);
• One (1) InfiniBand switch with 24-32 fiber optic ports (DHS bulk data transfer switch); and
• Five (5) removable storage units (DHS export system).

2.3 INTEGRATION, TESTING, AND COMMISSIONING

DHS site integration begins the third quarter of 2016. At this time, the Observatory Control System (OCS), Instrument Control System (ICS), and Telescope Control System (TCS) should either be installed or in the process of installation at the site. Shortly after the DHS is installed, the Visible Broadband Imager, the first-light DKIST instrument, should be installed. At this time the DKIST will be ready for the first instrument verification tests, per the VBI IT&C plan.

The DKIST site computer room is provisioned by the Information Technology (IT) group. The computer room contains 16 racks, each 48U in height. The racks are fully cooled through the site HVAC system and hot/cold aisles. Each rack is connected to 10 kVA of 208V power from the observatory uninterruptible power supply (UPS). Each rack contains a facility network switch (10 or 40 Gbit backbone). The DHS will utilize three of these racks and reserve another rack for future expansion.

Two camera lines are installed in each rack and share a 1 Gbit network switch. All camera lines share the one InfiniBand bulk data transport switch. Two of the racks will be capable of holding the VBI real-time processing units. The third camera line rack will contain only one camera line but may be upgraded if a sixth camera line is ever required. The fourth rack is for general DHS administration and contains the DHS calibration and export stores, DHS server, and access portal.

One hardware camera line and the DHS administration equipment are shipped from NSO/Tucson, while the remaining four camera lines are shipped directly from the vendor to DKIST/Haleakala. Loss of any component or shipment is not critical; replacement parts can be obtained within a few weeks. Spares will not be included during IT&C since there is no plan to run all instruments simultaneously. Additionally, only the VBI camera lines fully load a hardware camera line; other camera lines consume less than the maximum data rates. Reconfiguration of the hardware is envisioned in case of component failure.

Testing of the DHS will repeat the unit and assembly verification tests performed on the single camera line in Tucson. Since each camera line is independent of the others, there should be no additional tests needed. Further testing of the DHS will occur when the VBI integration tests begin. The VBI will have been tested at NSO/Boulder using one of the mini-DHS systems; its transition to a full-up camera line will test the data rates, volume, and processing requirements. As other instruments are commissioned, the DHS will be used to test their data systems.


3. QUALITY CONTROL

Quality assurance and quality control are essential elements in the DKIST construction plan. Many aspects of the QA/QC plan are based upon the history of past telescope development, along with modern systems engineering process control. The following definitions apply to the DKIST QA/QC.

Quality control, also known as verification, is a process used to evaluate whether or not a product, service, or system complies with regulations, specifications, or conditions imposed at the start of the development phase. Another way to think of this process is “Are you building it right?”

Quality assurance, also known as validation, is a process used to establish evidence that provides a high level of confidence that the product, service, or system accomplishes its intended requirements. Another way to think of this process is “Are you building the right thing?”

Change Control Board (CCB) is the name of the DKIST group that meets regularly to approve or reject requested changes in the DKIST specifications and interfaces.

Specification and Interface Control Documents define the specifications for each Work Breakdown Structure element and the public interfaces between each element. Both sets of documents are held under change control and changes must be approved by the CCB.

Design Document is the living document that defines the baseline design of the WBS element. This document is updated through the conceptual, preliminary, and final design phases to reflect the current design approach. During construction this document is updated to reflect the changes in scope or interfaces approved by the CCB. At the completion of construction the document should be a completed theory of operation manual for IT&C and operations.

3.1 VERIFICATION

The following tasks will be performed as part of the QA process. The work package manager and the DKIST QC/QA personnel will perform auditing of these tasks.

3.1.1 Requirements Verification

All new and changed requirements of the system will be verified to ensure the following:

• They are directly related to an approved item from the CCB;
• They are consistent, feasible, and testable;
• They have been appropriately allocated to the correct mechanical, hardware, software, and operational processes; and
• Those that are related to safety, security, and criticality have been verified by the rigorous processes governing those areas.

The requirements verification process will be conducted by formal review. It will require participation and sign-off by the following team members:

• Work package primary investigator;
• Work package manager; and
• Work package engineer responsible for the change.

3.1.2 Design Verification

All new and changed design elements will be verified to ensure the following:

• Design is traceable to requirements;
• Design provides details describing how requirements will be met; and
• Design implements safety, security, and other critical requirements correctly, as shown by suitably rigorous methods.

The design verification process will be conducted by formal review and requires participation of the following team members:

• Work package manager;
• Work package engineer responsible for the change; and
• At least one engineering representative from a different DKIST work package.

3.1.3 As-Built Verification

All new and changed components will be verified once construction is complete to ensure the following:

• Applicable standards are being followed;
• Applicable best practices standards are being followed;
• DKIST software coding and commenting standards are met; and
• DKIST software best practices are met per SPEC-0005.

The as-built verification process will be conducted by an informal review process such as email. The review process must be performed and signed off by at least one engineer from a different DKIST work package.

3.1.4 Documentation Verification

In conjunction with the release of a new or changed component, the following documentation must be provided or updated:

• Detailed design documentation, specifications, ICDs, etc.;
• Performance benchmarks (for performance-critical modules);
• Test documentation (unit, component, integration, and user acceptance); and
• Operations document.

The documentation verification process will be conducted by informal review, such as email. The review process must be performed and signed off by the following team members:

• DKIST QA/QC representative; and
• DKIST release manager.

3.2 QUALITY ASSURANCE TASKS

The following tasks shall be performed as part of the quality assurance process. The results of these tasks will be documented and reviewed as part of the “Document Verification” task of the QC process.

3.2.1 Unit Testing

Unit testing involves the testing of individual units of work to ensure they are fit for use. A new or changed component shall be unit tested. This testing shall be performed before proceeding to component level testing. The tasks that must be completed as part of unit testing are as follows:

• Prepare new and/or changed unit tests and related documentation;
• Ensure traceability of new and/or changed tests to requirements;
• Execute new, changed, and existing unit tests upon build of the component; and
• Document unit test results.

Unit testing will be performed by the work package engineer responsible for the change or his/her designee.
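Real DHS unit tests are written with the Test Automation Framework (TAF) per PROC-0015; as a plain-Python illustration of the tasks above (tests that are traceable, executable on each build, and documented), the sketch below uses a hypothetical unit under test and an illustrative requirement ID:

```python
import unittest

# Illustrative only: the unit under test, the required header keys, and the
# requirement ID are assumptions for this sketch, not defined in PROC-0015.
REQUIRED_KEYS = {"camera_id", "frame_number", "timestamp"}

def validate_frame_header(header):
    """Return True if the header carries every required metadata key."""
    return REQUIRED_KEYS.issubset(header)

class TestFrameHeader(unittest.TestCase):
    """Unit test kept traceable to a (hypothetical) compliance-matrix entry."""
    requirement = "CMX-0016 / DHS-HDR-001 (illustrative ID)"

    def test_complete_header_passes(self):
        hdr = {"camera_id": "VBI-RED", "frame_number": 1, "timestamp": 0.0}
        self.assertTrue(validate_frame_header(hdr))

    def test_missing_key_fails(self):
        self.assertFalse(validate_frame_header({"camera_id": "VBI-RED"}))
```

Running the suite with `python -m unittest -v` prints one line per test, which can be captured as the documented unit test results.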

3.2.2 Integration Testing

Integration testing involves testing an intended release to ensure it integrates correctly with all other DKIST systems with which it interfaces. Integration testing shall be performed in a qualified DKIST test environment that uses mechanical, hardware, and software systems equivalent to the production systems. Integration testing shall be performed before proceeding to user acceptance level testing. The tasks that must be completed as part of integration testing are as follows:

• Prepare new and/or changed integration tests and related documentation;
• Ensure traceability of new and/or changed tests to requirements;
• Coordinate the integration test schedule with the test engineer of interfacing systems;
• Execute new, changed, and existing integration tests; and
• Document integration test results.

The work package engineer responsible for the change and the designated test engineer for each interfacing DKIST system will perform integration testing. Results shall be reviewed and signed off by the following team members before proceeding to user acceptance testing:

• Work package manager;
• Work package manager(s) for all systems that interface with the released component;
• Work package engineer responsible for release; and
• Test engineer for all systems that interface with the released component.

3.2.2.1 Software Specific Tasks

The following tasks are specific to integration testing for software releases.

• Any software defects (bugs) identified in testing will be logged in the JIRA tracking system;
• All test cases impacted by the defect must be re-tested once the defect is resolved;
• Any unresolved software defects must be approved by the review team before proceeding to User Acceptance Testing; and
• Upon successful completion of integration testing, the software source code will be tagged in CVS to indicate it is part of a release. Test documentation will include a reference to this release number.

3.2.3 User Acceptance (Verification) Testing

User acceptance testing involves performing tests for which the user will validate the output of the system to determine pass/fail status. User acceptance testing shall be performed in a production environment, or a qualified test environment that is approved by the user. User acceptance testing shall be performed before a release can be made operational for production use. The tasks that must be completed as part of user acceptance testing are as follows:

• User to prepare new and/or changed user acceptance tests and related documentation;
• Ensure traceability of new and/or changed tests to requirements;
• Coordinate the user acceptance test schedule with production or test environments and systems;
• Execute new, changed, and existing user acceptance tests; and
• Document test results.

User acceptance testing will be performed by the user, with the support of the work package engineer responsible for the release, and test engineers from other interfacing systems. Before proceeding to production, the release must be approved by the following team members:

• Work package user (i.e., owner or primary investigator);
• Work package manager(s) for all systems that interface with the released component;
• Work package engineer responsible for release;
• Test engineer for all systems that interface with the released component; and
• DKIST release manager.

3.2.3.1 Software Specific Tasks

The following tasks are specific to user acceptance testing for software releases.

• Only software source code from the CVS tag that matches the release number identified in the integration test documentation may be used for user acceptance testing;
• Any software defects identified during testing will be logged in the JIRA tracking system;
• All test cases impacted by the defect must be re-tested once the defect is resolved;
• Any unresolved software defects must be approved by the review team before proceeding to production release; and
• Test documentation should include a reference to the CVS tag for this release.

3.3 VERIFICATION TEST PLAN

3.3.1 Unit Tests

Unit tests will be written with the Test Automation Framework (TAF), as described in PROC-0015. All unit tests will test against the DHS compliance matrix (CMX-0016). The major DHS system tests are described below.

3.3.1.1 Bulk Data Transfer

Test procedures: A number of BDT data sources and sinks are attached to the InfiniBand switch. Data files of various typical transfer sizes are sent repeatedly from the sources to their assigned sinks. The average transfer rates and latencies are measured for both individual transfers and the aggregate of multiple simultaneous transfers.

Test results: The tests should show that the required data transfer rates are sustained. Tests should also show the effects and correction of dropped connections.
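The rate bookkeeping for this test can be sketched as follows. The transfer sizes and timings are canned illustrative numbers; a real run would time actual sends across the InfiniBand switch, and the aggregate figure below assumes serial transfers (simultaneous transfers would divide total bytes by wall-clock time instead).

```python
# Sketch of the bulk-data-transfer rate measurement: per-transfer and
# aggregate throughput from (bytes, seconds) records of one test run.

def transfer_stats(transfers):
    """transfers: list of (bytes_sent, seconds) tuples.
    Returns (per-transfer rates in MB/s, aggregate rate in MB/s)."""
    rates = [(b / s) / 1e6 for b, s in transfers]
    total_bytes = sum(b for b, _ in transfers)
    total_time = sum(s for _, s in transfers)  # serial transfers assumed
    return rates, (total_bytes / total_time) / 1e6

# Example: three 960 MB frames (an illustrative large frame size).
rates, aggregate = transfer_stats([(960e6, 1.0), (960e6, 1.2), (960e6, 0.9)])
```

The pass criterion is then a simple comparison of `aggregate` (and each entry of `rates`) against the required sustained rate.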

3.3.1.2 Quality Assurance Display

Test procedures: Data probes are inserted into the DHS data transfer to send images to the detailed displays. Images should be viewed on the display and manipulated by an operator.

Test results: Display rates should meet the DHS requirements. The set of display capabilities should meet the requirements (i.e., zoom, pan, histogram, color adjustment, etc.).

3.3.1.3 Quality Assurance Plug-in

Test procedures: A plug-in that manipulates the data is inserted into the data transfer stream before the quality assurance display. Typical test plug-ins would sum data, reject bad images, or correct bad pixels. The displayed data should be reviewed by the operator.


Test results: The plug-in should affect transfer performance and latency only to the level of processing required to manipulate the image; the plug-in infrastructure itself should have minimal latency effects.
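A minimal sketch of one of the typical plug-ins named above (bad-pixel correction), assuming a simple call-based plug-in interface; the real DHS plug-in interface is defined elsewhere, and the bad-pixel list stands in for calibration data:

```python
# Hypothetical QA plug-in: replace known bad pixels with the mean of the
# good pixels in their row, then hand the frame on toward the display.

BAD_PIXELS = [(0, 1)]  # (row, col) coordinates; illustrative calibration data

def correct_bad_pixels(frame):
    """frame: list of rows of pixel values. Returns a corrected copy."""
    out = [row[:] for row in frame]
    for r, c in BAD_PIXELS:
        good = [v for i, v in enumerate(out[r]) if (r, i) not in BAD_PIXELS]
        out[r][c] = sum(good) / len(good)
    return out

def run_stream(frames, plugin):
    """Apply the plug-in to each frame in the transfer stream."""
    return [plugin(f) for f in frames]

frames = [[[10, 999, 12], [11, 10, 12]]]  # one 2x3 frame, hot pixel at (0, 1)
corrected = run_stream(frames, correct_bad_pixels)
```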

3.3.1.4 Data Storage

Test procedures: Data should be transferred from a camera to the data storage system at the required data rates. Subsequently, data should be transferred from the data storage to another sink at the required data rates.

Test results: The data storage system should maintain read and write throughputs.

3.3.1.5 Data Processing Pipeline

Test procedures: Several data processing nodes are connected in series, with the first fed artificial data from a simple camera simulator. A data-processing plug-in node should be inserted into the data processing pipeline. Typical operations would be identical to those used in 3.3.1.3. The final data image should be reviewed by the data handling scientist for correct behavior of the plug-in.

Test result: The meta-data tags should show that pipeline performance and latency were affected only to the level of processing required to manipulate the image. The final image should illustrate the correct operation of the plug-in's actions on the data.
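The latency bookkeeping this test relies on can be sketched as each node stamping the frame's meta-data with its own processing time. The node names and meta-data layout are assumptions for illustration, not the DHS meta-data schema:

```python
import time

# Each pipeline node wraps a processing function and records its own
# per-frame processing time into the meta-data, so the test can check
# that the node infrastructure adds negligible overhead.

def make_node(name, func):
    def node(frame, meta):
        start = time.perf_counter()
        result = func(frame)
        meta.setdefault("latency", {})[name] = time.perf_counter() - start
        return result, meta
    return node

def run_pipeline(frame, nodes):
    meta = {}
    for node in nodes:
        frame, meta = node(frame, meta)
    return frame, meta

nodes = [
    make_node("sum", lambda f: sum(f)),    # e.g., frame summation
    make_node("scale", lambda v: v / 4),   # e.g., normalization
]
result, meta = run_pipeline([1, 2, 3, 4], nodes)
```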

3.3.1.6 Data Archive

Test procedures: The last node in the data processing pipeline used in 3.3.1.5 is replaced by one that records the image and meta-data into the data archive. The archive is examined to verify that the data and meta-data are properly recorded. A data transfer node simulator then retrieves the image from the archive and produces a FITS version. The resulting FITS image is examined by the data handling scientist.

Test results: The data and meta-data should be properly recorded in the data archive. The execution of the data transfer node simulator should produce no errors. The data handling scientist should not find any problems with the resulting FITS image.
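The archive round-trip check can be sketched as follows, with an in-memory dict standing in for the real data archive; the final FITS-export step is omitted here (a real test would write FITS with a standard FITS library):

```python
# Record an image with its meta-data, retrieve it, and verify nothing was
# lost. The frame ID, image, and header values are illustrative.

ARCHIVE = {}

def archive_put(frame_id, image, meta):
    ARCHIVE[frame_id] = {"image": image, "meta": dict(meta)}

def archive_get(frame_id):
    rec = ARCHIVE[frame_id]
    return rec["image"], rec["meta"]

def verify_roundtrip(frame_id, image, meta):
    archive_put(frame_id, image, meta)
    got_image, got_meta = archive_get(frame_id)
    return got_image == image and got_meta == meta

ok = verify_roundtrip("exp001/frame42",
                      [[1, 2], [3, 4]],
                      {"camera_id": "VBI-RED", "exptime": 0.025})
```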

3.3.2 Integration Tests

Integration tests are performed both on the Tucson end-to-end test bed and at the summit.

3.3.2.1 Camera Software Systems

Test procedure: The camera is configured to send data at various rates and sizes that conform to the requirements. The DHS accepts the input data stream, displays it, and stores it.

Test results: Data rates should meet requirements. Multiple cameras and data streams should not impact the performance of each other.

3.3.2.2 Observatory Control System

Test procedure: This test must be performed with the cooperation of the OCS and the DKIST End-to-End simulator. The OCS Experiment Editor is used to construct a simple experiment using the DHS control display plug-in 'view'. The DHS is then installed in the DKIST End-to-End test bed and the experiment is executed. The DHS simulator displays the configuration(s) it is given during execution.

Test result: The displayed configurations should contain the expected attributes and their values.

3.3.2.3 Visible Broadband Imager

Test procedure: This test should support the VBI integration tests as defined in SPEC-0107, VBI Critical Design Document, section 10.3.


Test results: The VBI requirements for data handling should be met.


4. RISK

The DKIST Risk Management Plan defines the project approach to risk analysis and mitigation. The project maintains a risk register containing all identified risks at the beginning of the project and throughout the life of the project. Plans are provided for mitigating each high-level risk and the subsequent results of the actions taken.

The risk register exists to accomplish the following risk management goals:

• provide a tool for managing and reducing the risks identified before and during the project;
• document risk mitigation strategies being pursued in response to the identified risks and their grading in terms of likelihood and seriousness;
• provide project management with a document from which status can be reported to oversight;
• ensure the communication of risk management issues to key stakeholders;
• provide a mechanism for seeking and acting on feedback to encourage the involvement of the key stakeholders; and
• identify the mitigation actions required for implementation of the risk management plan and associated costs.

4.1 RISK REGISTER

Risk Item | Probability | Non-labor Cost | Labor Cost
HLSC-007 Expanded scope for NSO data handling integration | 25% | $340K | $576K
HLSC-073 Scope refinements; specification documents in transition | 80% | $216K | $173K
DHS-04 More bandwidth required to the base facility | 25% | $100K | $50K
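As a worked reading of the register, the probability-weighted (expected) cost of each risk can be computed as below. Expected-value weighting is a common contingency heuristic, not the DKIST project's stated method:

```python
# Risk register entries as (probability, non-labor cost, labor cost) in dollars.
risks = {
    "HLSC-007": (0.25, 340_000, 576_000),
    "HLSC-073": (0.80, 216_000, 173_000),
    "DHS-04":   (0.25, 100_000,  50_000),
}

def expected_exposure(register):
    """Probability-weighted total cost of each risk, in dollars."""
    return {k: p * (nonlabor + labor)
            for k, (p, nonlabor, labor) in register.items()}

exposure = expected_exposure(risks)
total = sum(exposure.values())  # ~$578K across the three register items
```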

4.2 RISK MITIGATION

General risk mitigation strategies are applied to all risks in the DHS. These strategies include:

Early Development: The DHS will finish construction in 2017. The IT&C period will be used to address issues arising from later instrument development or unplanned integration needs.

Instrument support: 15% of the DHS developer’s time is allocated to general purpose instrument development support. It is expected that instrument developers will need assistance in learning about the DHS, using the mini-DHS for development, building data plug-ins, and interfacing with the DKIST facility cameras.

First camera line in Tucson: The first camera line will be available to test the DHS with the OCS end-to-end test bed. It will also be used to integrate the VBI simulator, quick look displays, and data processing. It will also be used to test and validate other DKIST instruments’ simulators as these instruments are developed.

Mini-DHS: One mini-DHS is delivered to each instrument developer. Although it may not perform at the full data rate nor be able to hold more than a few hours of data, it will be sufficient to develop the DKIST instruments. Further expansion of the mini-DHS (more cores, more storage) may be performed by the instrument developer if required. The mini-DHS will test that the interfaces to the camera, plug-ins, and quality assurance displays are correct.

The following risk mitigation strategies are envisioned for each of the major risks defined in the risk register.

4.2.1 HLSC-007: Expanded scope for NSO data handling integration

Based on HLS PDRs and subsequent discussions, there is some risk that additional implementation details/design changes will be required to integrate into the developing NSO model for both ingestion of scripts (Phase II Proposal preparation) and the export of data products.

The DHS-to-Operations interface has not been finalized. At this time, the DHS transfers all data and associated products (e.g., header databases, calibration files, AO information) to an undetermined facility on the island. The list of these data products has not yet been confirmed as complete. It is also possible that additional scope will be placed on the DHS, including new features such as remote early fetching of data, prioritized retrievals, quick-look transferals, thumbnails, and remote user access to data.

Mitigation: Early interaction with the operations group. The HLS developers have met with Operations during preliminary development to ascertain the currently known scope of the DHS to Operations interface. Current operations development plans in this area are still uncertain, but the HLS group will continue to work towards finalizing the interface.

4.2.2 HLSC-073: Scope refinement; specification documents in transition

There may be additional work or delays in development due to formal specification changes.

The DHS Design Requirements Document (DRD, SPEC-0165) has not been finalized. In addition, the DHS Specification (SPEC-0016) has incorrect data rates and volumes. Several of the DHS specifications are vague.

Data rates may increase if cameras are run at higher rates than specified by the DHS. Data volumes may go up with any of: increased rates, larger cameras, more simultaneous observations, more required calibrations, longer observing days, or retention of raw VBI data. Each of these would be a scope increase from the DHS specifications, but may feasibly occur.
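The scaling of the listed factors is multiplicative, so any one of them raises daily volume proportionally. A short illustration with hypothetical camera parameters (the 4k x 4k geometry, 16-bit depth, 30 fps rate, and 10-hour day are assumptions, not DHS specification values):

```python
# Illustrative sketch of how daily data volume scales with the factors
# listed above. All camera parameters here are hypothetical.

def daily_volume_tb(width, height, bytes_per_pixel, fps, cameras, hours):
    """Decimal terabytes produced per observing day."""
    bytes_per_day = width * height * bytes_per_pixel * fps * cameras * hours * 3600
    return bytes_per_day / 1e12

base = daily_volume_tb(4096, 4096, 2, 30, 1, 10)        # one camera, 10 h day
longer_day = daily_volume_tb(4096, 4096, 2, 30, 1, 12)  # longer observing day
two_cams = daily_volume_tb(4096, 4096, 2, 30, 2, 10)    # simultaneous observation

print(f"base: {base:.1f} TB, 12 h day: {longer_day:.1f} TB, "
      f"two cameras: {two_cams:.1f} TB")
```

Each factor scales the total linearly, so a longer day and a second camera together would more than double the daily volume.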

Mitigation: Complete the specification review and update the specification document. Update the DRD for the new data rates and volumes. Work with operations and management to determine the best DHS solutions for the new rates using the existing budget and/or contingency.

Mitigation: Enforce current requirements. Initial operations at DKIST will need to be within the DHS specification and OCD use cases. Multiple instrument operations and length of day will be curtailed to the existing bandwidth, number of camera lines, and data storage size. The DKIST Change Control Board will need to deny scope increases.

4.2.3 DHS-04: More bandwidth required to base facility

More data may need to be sent to the base facility to support remote operations. This may include raw VBI data, larger VTF data frame size and rate, additional diagnostics data, or extended calibration data.

Mitigation: Reduce scope of transmitted materials. Levels of acceptable data transmission volume or latency must be set by a remote operations plan. Automation of data transfer products would help enforce this.
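The automation suggested above could take the form of a simple budgeted, prioritized transfer plan: send products in priority order until the nightly link budget is exhausted, and defer the rest to physical transport. A hedged sketch (the product names, sizes, and priorities are invented for illustration):

```python
# Hypothetical sketch of automated enforcement of a transmission budget.
# Product names, sizes, and priorities are invented, not from the DHS design.

def plan_transfers(products, budget_gb):
    """products: list of (name, size_gb, priority); lower priority value
    is sent first. Returns (send, defer) lists of product names."""
    send, defer = [], []
    remaining = budget_gb
    for name, size, _prio in sorted(products, key=lambda p: p[2]):
        if size <= remaining:
            send.append(name)       # fits in tonight's link budget
            remaining -= size
        else:
            defer.append(name)      # goes out on disk packs instead

    return send, defer

products = [
    ("quick-look frames", 50, 0),
    ("header database", 5, 1),
    ("calibrated science data", 900, 2),
    ("raw VBI data", 2000, 3),
]
send, defer = plan_transfers(products, budget_gb=1000)
print(send)   # ['quick-look frames', 'header database', 'calibrated science data']
print(defer)  # ['raw VBI data']
```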


Mitigation: Rely on physical transport. Many general-purpose data products can be delivered through physical transport of disk packs. The delivery may be one or two days after the data are collected. More data transport packs will need to be purchased.

Mitigation: Increase number of data lines to summit. There are currently no additional fiber optic network lines available to the summit. DKIST may rent more of the older copper data lines but these have limited bandwidth. Currently this is not seen as a viable mitigation strategy.


5. COST ESTIMATES

The DKIST budget for the DHS is $1,544,000.48. This escalated cost was developed from a bottom-up cost analysis performed in August 2011 and assumed hardware technology available at that time. The definition of the DKIST instrument suite was not then complete; it was uncertain how many cameras would be in operation and what their final data rates would be. Because of this and other technology factors, a project contingency of 23% was set aside.

5.1 COST ESTIMATE

The August 2011 DKIST budget created separate work packages for the labor and materials of each DHS development area. New quotes were obtained for all materials required by the DHS design.

Work Package                                                Amount
S-WASW3-300  DHS - Management (ARRA)                        $52,378.77
S-WASW3-310  DHS - Mini-DHS Design & Development (ARRA)     $504,942.07
S-WMSW3-310  DHS - Mini-DHS Design & Development            $19,159.77
S-WASW3-311  DHS - Mini-DHS Equipment (ARRA)                $107,137.29
S-WMSW3-311  DHS - Mini-DHS Equipment                       $97,462.99
S-WASW3-315  DHS - Development & Testing (ARRA)             $227,399.65
S-WMSW3-315  DHS - Development & Testing                    $112,320.69
S-WMSW3-316  DHS - Equipment - System                       $120,181.71
S-WMSW3-317  DHS - Equipment - Camera Line 1                $56,652.07
S-WMSW3-318  DHS - Equipment - Camera Lines 2-5             $209,261.28
S-WASW3-320  DHS - Instrument Support (ARRA)                $55,468.84
S-WMSW3-320  DHS - Instrument Support                       $124,321.15
Total                                                       $1,544,000.48

5.1.1 Construction Labor

Labor costs for the DHS were recalculated in August 2011 for the individual schedule items. The first three years of construction included a half-time position for design of the DHS and development of the mini-DHS. This up-front effort is required to keep the DHS development ahead of its users, the instruments. Beginning in mid FY2012, the DHS also began supporting instrument developers at 15% effort.

Resource  Work                FY2011  FY2012  FY2013  FY2014  FY2015
SW.1      Mini-DHS            100%    89%     43%
          DHS                                 42%     85%     85%
          Instrument Support          11%     15%     15%     15%
SW.2      Mini-DHS            50%     50%     25%

Construction labor for the DHS ceases in July 2016, although instrument support activities continue through the end of construction.


5.1.2 Integration, Testing, and Commissioning Labor

After construction, the DHS labor remains in the project to perform IT&C activities. The IT&C planning for these activities is currently under development. The activities include:

- Construction of the five DHS camera lines at the summit computer room;

- Cabling the InfiniBand network from the instruments to the computer room;

- Site acceptance testing of the DHS;

- Integration of the VBI (data transfer, processing, plug-ins, displays, export);

- Integration of the remaining instruments.

IT&C labor begins August 2016 and completes in July 2019.

5.1.3 Construction Non-labor

Hardware costs for the DHS were recalculated in July 2011 for the latest DHS design. Except for the real-time processing nodes, these are commodity items we expect to purchase shortly before integration; the configurations and prices are therefore expected to change over the next three to four years. In some areas, such as network switches and disk drives, new technology may supplant our baseline hardware. In all cases except the real-time processing nodes, the current technology meets the DHS requirements.

Total cost for all materials is $1.019M, comprising both the mini-DHS materials and the Haleakala materials.

Item                   Model                #  Price    Total
Tucson System
  Development systems  various              1  $31,000  $31,000
  Data store           Winchester VX3464R   1  $10,000  $10,000
  NDDS license/maint.  RTI                  1  $60,000  $60,000
  Transfer node        Pogo Iris 1252       1  $12,000  $12,000
Mini-DHS
  Transfer node        Pogo Iris 1252       4  $12,000  $48,000
Haleakala System
  DHS controller       Pogo Iris 1252       3  $4,000   $12,000
  Access portal        Pogo Iris 1252       1  $5,000   $5,000
  Calibration store    Winchester VX3464R   2  $10,000  $20,000
  Export storage       MaxVision TeraPac 3  6  $10,000  $60,000
  BSD switch, 10 Gb    Fujitsu XG2600       1  $11,000  $11,000
  Net switch, 1 Gb     HP E4500-24          1  $1,000   $1,000
Camera Lines
  InfiniBand switch    Mellanox SX6005      2  $7,000   $14,000
  Net switch, 1 Gb     HP E4500-24          2  $1,000   $2,000
  Data store           Winchester VX3464R   5  $10,000  $50,000
  Transfer node        Pogo Iris 1252       5  $5,000   $25,000
  DSD client           Pogo Iris 1252       5  $4,000   $20,000
  Process node, non-RT Pogo Iris 1252       5  $8,000   $40,000
Total                                                   $421,000
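The line items above can be cross-checked against the $421,000 total. A quick sketch (item labels are shortened for readability; the $11,000 unit price for the 10 Gb BSD switch is assumed from its line total):

```python
# Cross-check of the materials table: quantity x unit price per line item.
# Labels are abbreviated; $11,000 is used for the BSD switch as implied
# by its line total.

items = {  # name: (qty, unit_price)
    "development systems": (1, 31_000),  "Tucson data store": (1, 10_000),
    "NDDS license/maint.": (1, 60_000),  "Tucson transfer node": (1, 12_000),
    "mini-DHS transfer nodes": (4, 12_000),
    "DHS controllers": (3, 4_000),       "access portal": (1, 5_000),
    "calibration stores": (2, 10_000),   "export storage": (6, 10_000),
    "BSD switch, 10 Gb": (1, 11_000),    "net switch, 1 Gb": (1, 1_000),
    "InfiniBand switches": (2, 7_000),   "camera-line net switches": (2, 1_000),
    "camera-line data stores": (5, 10_000),
    "camera-line transfer nodes": (5, 5_000),
    "DSD clients": (5, 4_000),           "process nodes, non-RT": (5, 8_000),
}
total = sum(qty * price for qty, price in items.values())
print(f"${total:,}")  # $421,000
```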


5.1.3.1 InfiniBand Switch

Configuration:

12-port FDR, 56 Gb/s
ConnectIB support
Cables and rackmount

Vendor: Mellanox SX6005

Price: $6,789

5.1.3.2 1 Gb network switch

Configuration:

24-port 1 Gb switch
Layer 3 management

Vendor: HP E4500-24

Price: $980.00

5.1.3.3 Data Store and Calibration Store

Configuration:

32 TB disk array
2000 GB 7200 RPM SAS disks
4 x 10 Gb iSCSI ports
RAID 0

Vendor: Winchester Systems VX3464R

Price: $19,122

5.1.3.4 Transfer Node

Configuration:

CPU: 1x Xeon X5690 3.46 GHz, 12 MB cache, 6.4 GT/s
Memory: 1x 12 GB DDR3 1333 MHz ECC Reg (3 x 4 GB)
SAS/SATA HDD 1: 1x Western Digital RE4 500 GB SATA, 64 MB cache, 3 Gb/s
PCI Expansion: 1x Intel X520-DA2 10G dual-port SFP+ network adapter
Network Options: 1x dual Gigabit Ethernet
Operating System: 1x CentOS 7, 64-bit

Vendor: Pogo Iris 1252

Price: $4,974.33

5.1.3.5 DSD client

Configuration:

CPU: 1x Xeon X5650 2.66 GHz, 12 MB cache, 6.4 GT/s


Memory: 1x 12 GB DDR3 1333 MHz ECC Reg (3 x 4 GB)
SAS/SATA HDD 1: 1x Western Digital RE4 500 GB SATA, 64 MB cache, 3 Gb/s
Network Options: 1x dual Gigabit and dual 10G SFP+ Ethernet
Operating System: 1x CentOS 7, 64-bit

Vendor: Pogo Iris 1252

Price: $3,472.20

5.1.3.6 External store

Configuration:

24 TB removable storage
8x 3.5" 3000 GB SAS drives
Ruggedized

Vendor: MaxVision TeraPac 3

Price: $9,554.00

5.1.3.7 Processing node (non-realtime)

Configuration:

CPU: 2x Xeon X5690 3.46 GHz, 12 MB cache, 6.4 GT/s
Memory: 1x 24 GB DDR3 1333 MHz ECC Reg (6 x 4 GB)
SAS/SATA HDD 1: 1x Western Digital RE4 500 GB SATA, 64 MB cache, 3 Gb/s
PCI Expansion: 1x Intel X520-DA2 10G dual-port SFP+ network adapter
Network Options: 1x dual Gigabit Ethernet
Operating System: 1x CentOS 7, 64-bit

Vendor: Pogo Iris 1252

Price: $7,233.78

5.1.4 Contingency

DKIST contingency is held by project management and is not part of any specific work package. However, to determine the project contingency, the DHS non-labor activities were assessed for possible risk factors and assigned a contingency percentage based upon the criteria of the DKIST Contingency Management Plan. The calculated DHS contingency was 23% of non-labor activities, or a total of 11% of the DHS budget. The DHS contingency was then rolled up into the project-wide contingency.


6. SCHEDULE

The DHS schedule starts construction in FY2010 with the development of the Mini-DHS and supporting infrastructure. This work package includes the purchase of the development hardware servers and the development and testing of the bulk data transfer system, the quality assurance displays, and the data storage systems. It leads to the first Mini-DHS system, available in March 2013. Another system can be purchased and assembled for the first DKIST instrument, the Visible Broadband Imager. Three subsequent Mini-DHS systems, similar to the original, can be purchased as required by instrument development. Additional Mini-DHS systems may be purchased by instruments (as part of the instrument budget) if additional camera lines are required.

Beginning in April 2013, the production DHS development begins. The schedule repeats the development cycle for the bulk data transfer, quality assurance, and data storage systems. It also begins developing the data processing pipeline and data archive and export systems. The DKIST Mini-DHS is replaced with the first physical camera line. Development of the DHS continues until March 2016. During the final few months, the DHS systems and first camera line are relocated to Haleakala and the additional four camera lines are purchased and installed at the summit. DHS acceptance tests are performed and the system is prepared for integration testing when the first instrument is delivered.

Integration of the DHS with the instruments is a part of the IT&C process and not covered by the DHS construction.


7. CONFIGURATION ITEM DATA LIST

The Configuration Item Data List (CIDL) describes the status of all major deliverable software and documentation items. At present, the CIDL for the DHS contains the following:

Source Code Packages

DHS Final Release 1.0
Mini-DHS Final Release 1.0
BDT Final Release 1.0

Documentation

SPEC-0016, DHS Specification Document
SPEC-0165, DHS Design Requirements Document
TN-0065, DHS Critical Design Document
CMX-0016, DHS Compliance Matrix
MAN-0011, DHS Operations Manual
PROC-0017, HLS Software Test Plan
SPEC-0151, BDT Application Programmatic Interface
ICD-4.2/4.3, OCS to DHS Interface
ICD-4.3/7.0, DHS to Operations Interface
DHS Factory Assembly Test Plan