
Slide 1

Overheads from Parnas’ Presentation

• The next slides are transcribed versions of (most of) the transparencies in Parnas’ presentation.

Slide 2

Why is it important that the software can never be trusted?

• “We” will make decisions as if it was not there.

• “They” will make decisions as if it might work.

Slide 3

A necessary condition for trustworthy engineering products is validation by:

• Mathematical analysis, or

• Exhaustive case analysis, or

• Prolonged, realistic testing

or a combination of the above

Slide 4

Why software is always the unreliable glue in engineering systems:

• The best mathematical tools require that a system be described by continuous functions

• Exhaustive case analysis can only be used when the number of states is small or the design exhibits a repetitive structure
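The second point can be made concrete with a small sketch (illustrative code, not from the slides; the saturating-add function is a made-up example). Exhaustive case analysis is feasible here only because the input space is tiny; for the state spaces of large real-time systems, no such enumeration is possible.

```python
# Illustrative only: exhaustive case analysis over a deliberately small
# input space (2^16 pairs of unsigned 8-bit values).

def saturating_add_u8(a: int, b: int) -> int:
    """Add two unsigned 8-bit values, clamping the result to 255."""
    return min(a + b, 255)

# Check every possible input pair against the specification.
cases_checked = 0
for a in range(256):
    for b in range(256):
        expected = min(a + b, 255)
        assert saturating_add_u8(a, b) == expected
        cases_checked += 1

print(f"verified {cases_checked} cases")
```

The enumeration finishes instantly for 65,536 cases; it becomes hopeless as soon as the state space grows combinatorially, which is Parnas's point.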

Slide 5

Why do we have some usable software?

• Sometimes the requirements allow untrustworthy software

• There has been extensive use under actual conditions

• Operating conditions are controlled or predictable

• “Backup” manual system available when needed

Slide 6

What makes the SDI software much more difficult than other projects?

• Lack of reliable information on target and decoy characteristics

• Distributed computing with unreliable nodes and unreliable channels

• Distributed computing with hard real-time deadlines

• Physical distribution of redundant real-time data

• Hardware failures will not be statistically independent
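The last point can be illustrated with a toy reliability calculation (not from the slides; the beta-factor model and all numbers here are illustrative assumptions). When a fraction of failures are common-cause, replicating a node buys far less than an independence assumption would suggest.

```python
# Illustrative only: failure probability of a triplicated node under an
# independence assumption vs. a simple beta-factor model, in which a
# fraction `beta` of failures are common-cause and defeat all copies.

p = 0.01      # assumed failure probability of a single node
n = 3         # redundant copies; the group fails only if all copies fail
beta = 0.1    # assumed fraction of failures that are common-cause

independent = p ** n
correlated = beta * p + ((1 - beta) * p) ** n

print(f"independent failures:        {independent:.2e}")
print(f"with common-cause fraction:  {correlated:.2e}")
```

With these assumed numbers the correlated estimate is roughly a thousand times worse than the independent one: the common-cause term dominates, so redundancy alone cannot deliver the reliability the independence calculation promises.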

Slide 7

What makes the SDI software much more difficult than other projects?

• Redundancy is unusually expensive

• Information essential for real-time scheduling will not be reliable

• Very limited opportunities for realistic testing

• No opportunities for repairing software during use

• Expected to be the largest real-time system ever attempted; frequent changes are anticipated

Slide 8

Software Espionage and Nuclear Blackmail

• Fact: Software systems, because of their rigid predetermined behaviors, are easily defeated by people who understand the programs

• Fact: Changes in large software systems must be made slowly and carefully with extensive review and testing

Slide 9

What about new Soft. Eng. techniques?

• Precise requirement documents

• Abstraction/information hiding

• Formal specifications

The use of these techniques requires previous experience with similar systems

• Co-operating sequential processes requires detailed information for real-time scheduling

• Structured programming reduces but does not eliminate errors

Slide 10

What about Artificial Intelligence?

• AI-1 - Defined as solving hard problems
  – Study the problem, not the problem solver
  – No magic techniques, just good solid program design

• AI-2 - Heuristic or Rule-Based Programming/Expert Systems
  – Study the problem solver, not the problem
  – Ad hoc, “cut and try” programming
  – Little basis for confidence

Slide 11

What about new programming languages?

• No magic

• They help if they are simple and well understood

• No breakthroughs

The fault lies not in our tools but in ourselves and in the nature of our product.

Slide 12

What about automatic programming?

• Since 1948 a euphemism for programming in a new language?

Slide 13

What about program verification?

• The right problem, but do we have a solution?

• What’s a big program?

• Wrong kind of program? How do you verify a model of the earth’s gravitational field?

• Implicit assumption of perfect arithmetic

• What about language semantics?
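The “perfect arithmetic” objection is easy to demonstrate (a standard floating-point example, not from the slides). A verifier that reasons over ideal real numbers can certify code that behaves differently under machine arithmetic, because IEEE 754 floating-point addition is not even associative.

```python
# Over the real numbers, (a + b) + c == a + (b + c) always holds.
# In IEEE 754 double precision it does not, because each addition rounds.

a, b, c = 0.1, 0.2, 0.3

left = (a + b) + c
right = a + (b + c)

print(left == right)   # False
print(left, right)     # the two groupings round to different values
```

Any verification argument that silently assumes real arithmetic therefore proves properties of a program other than the one the machine runs.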

Slide 14

Is there a meaningful concept of tolerance for software?

• The engineering notion of “tolerance” depends on an assumption of continuity.

• Statistical measures of program quality are limited in their application to situations where individual failures are not important.
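The contrast with engineering tolerance can be sketched in a few lines (an illustrative example, not from the slides; the function names, the linear beam model, and the threshold are all made up). A continuous physical response admits tolerances, because small input errors produce small behavioral errors; a software branch can map an arbitrarily small input change to a completely different behavior.

```python
def beam_deflection(load_kg: float) -> float:
    # Continuous physical response (toy linear model): a small error in
    # the load produces a proportionally small error in the deflection.
    return 0.002 * load_kg

def braking_command(speed_kph: float) -> str:
    # Discrete software decision: behavior changes abruptly at the threshold.
    return "brake" if speed_kph > 100.0 else "coast"

# A 1-gram load error barely changes the deflection...
print(abs(beam_deflection(50.0) - beam_deflection(50.001)))

# ...but a speed difference of a ten-millionth of a km/h selects a
# completely different behavior. No "tolerance" bounds the effect.
print(braking_command(100.0), braking_command(100.0000001))
```

This is why the notion of tolerance does not transfer: without continuity, nearness of inputs tells you nothing about nearness of behaviors.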

Slide 15

Overheads from Seitz’ Presentation

• The next slides are transcribed versions of (most of) the transparencies in Seitz’ presentation.

Slide 16

From “The Strategic Defense Initiative,” a White House pamphlet dated January 1985:

“SDI’s purpose is to identify ways to exploit recent advances in ballistic missile defense technologies that have potential for strengthening our security and that of our Allies. The program is designed to answer a number of fundamental scientific and engineering questions that must be addressed before the promise of these new technologies can be fully assessed. The SDI program will provide to a future president and a future congress the technical knowledge necessary to support a decision in the early 1990’s on whether to develop and deploy advanced defensive systems.”

Slide 17

From the 1985 “Report to the Congress on the Strategic Defense Initiative” (Section III):

“The goal of the SDI is to conduct a program of rigorous research focused on advanced defensive technologies.”

“The SDI seeks, therefore, to exploit emerging technologies that may provide options for a broader-based deterrence by turning to a greater reliance on defensive systems”

Slide 18

From the 1985 “Report to the Congress on the Strategic Defense Initiative” (Section III):

“It should be stressed that the SDI is a research program that seeks to provide the technical knowledge required to support a decision on whether to develop and later deploy these systems. All research efforts will be fully compliant with U.S. treaty obligations.”

Slide 19

• Weapons
  – Incapable of causing damage at Earth’s surface
  – Range 1000 km
  – Partial deployment ineffective in boost phase

• Sensors
  – Some located in high orbits
  – Can be passive
  – Useful in early deployments

• Battle Management System
  – Computers and communication

Slide 20

Coordination

• Lowest Level - stereo and “sensor fusion”

• Middle Levels - target discrimination, attack and coordination

• High Levels - assignment of priorities to targets in midcourse, in order to prevent particular areas from being overwhelmed in terminal defense, or to prevent any single area from accepting too high a concentration for terminal defense

• Top Level - command and control decisions

Slide 21

Conclusions of the Panel:

“The feasibility of the battle management software and our ability to test, simulate, and modify the system are very sensitive to the choice of system architecture. In particular, the feasibility of the BMS software is much more sensitive to the system architecture than it is to the choice of software engineering technique”

Slide 22

Conclusions of the Panel:

“Software technology is developing against what appears today to be relatively inflexible limits in the complexity of systems. The tradeoffs necessary to make the software tractable are in the system architecture.”

Slide 23

Conclusions of the Panel:

“We must prefer an unconventional system architecture whose programming is within the anticipated limits of software engineering over reliance on radical software development approaches and the risk that we could not develop reliable software at any cost…”

Slide 24

Conclusions of the Panel:

“One promising class of system architecture for a strategic defense system are those that are less dependent on tight coordination… [because of]… the ability to infer the performance of full-scale deployment by evaluating the performance of small parts of the system.”