
Page 1: Software Security Testing

Delivering Results that Endure

Software Security and the Software Development Lifecycle

Stan Wisseman, [email protected]

Booz Allen Hamilton

8251 Greensboro Drive

McLean VA 22102

Page 2: Software Security Testing


Software security: Why care?

Software is ubiquitous.

We rely on software to handle the sensitive and high-value data on which our livelihoods, privacy, and very lives depend.

Many critical business functions in government and industry depend completely on software.

Software—even high-consequence software—is increasingly exposed to the Internet.

– Increased exposure makes software (and the data it handles) visible to people who never even knew it existed before.

– Not all of those people are well-intentioned (to say the least!).

Page 3: Software Security Testing


Security as a property of software

Secure software is software that can’t be intentionally forced to perform any unintended function.

Secure software continues to operate correctly even under attack.

Secure software can recognize attack patterns and avoid or withstand recognized attacks.

At the whole-system level, after an attack, secure software recovers rapidly and sustains only minimal damage.

Page 4: Software Security Testing


Exploitable defects in software lead to vulnerabilities

Inherent deficiencies in the software’s processing model (e.g., Web, SOA, Email) and the model’s associated protocols/technologies

– Example: Trust establishment in web applications is only one-way (the client authenticates the server)

Shortcomings in the software’s security architecture

– Example: Exclusive reliance on infrastructure components to filter/block dangerous input, malicious code, etc.

Defects in execution environment components (middleware, frameworks, operating system, etc.)

– Example: Known vulnerabilities in WebLogic, J2EE, Windows XP, etc.

Page 5: Software Security Testing


Exploitable defects cont’d

Defects in the design or implementation of software’s interfaces with environment- and application-level components

– Example: Reliance on known-to-be-insecure API, RPC, or communications protocol implementations

Defects in the design or implementation of the software’s interfaces with its users (human or software process)

– Example: Web application fails to establish user trustworthiness before accepting user input.

Defects in the design or implementation of the software’s processing of input

– Example: C++ application does not do bounds checking on user-submitted input data before writing that data to a memory buffer.

Page 6: Software Security Testing


So what do you do with these exploitable defects? Exploit them!

Session hijacking – A hacker will claim the identity of another user in the system

Command Injection (e.g., SQL Injection) – A hacker will modify input, causing a database to return other users’ data, drop tables, or shut down the database (see the sketch after this list)

Cross Site Scripting (XSS) – A hacker will reflect malicious scripts off a web server to be executed in another user’s browser to steal their session, redirect them to a malicious site, steal sensitive user data, or deface the webpage

Buffer Overflows – A hacker will overflow a memory buffer or the stack, causing the system to crash or to load and execute malicious code, thereby taking over the machine

Denial of Service – A hacker will render individual users, or the entire system, unable to operate
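To make the command injection bullet above concrete, here is a minimal sketch in Java/JDBC. The users table, username column, and AccountDao class are hypothetical, not from this briefing; the point is the contrast between concatenating attacker-controlled input into SQL text and using a parameterized query.

  import java.sql.Connection;
  import java.sql.PreparedStatement;
  import java.sql.ResultSet;
  import java.sql.SQLException;

  public class AccountDao {

      // VULNERABLE: attacker-controlled input is concatenated into the SQL text,
      // so input such as  ' OR '1'='1  changes the meaning of the query.
      public ResultSet findUserUnsafe(Connection conn, String username) throws SQLException {
          String sql = "SELECT * FROM users WHERE username = '" + username + "'";
          return conn.createStatement().executeQuery(sql);
      }

      // SAFER: a parameterized query keeps the SQL text fixed; the driver treats
      // the supplied value strictly as data, never as SQL syntax.
      public ResultSet findUserSafe(Connection conn, String username) throws SQLException {
          PreparedStatement ps = conn.prepareStatement(
                  "SELECT * FROM users WHERE username = ?");
          ps.setString(1, username);
          return ps.executeQuery();
      }
  }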

Page 7: Software Security Testing


Topology of an Application Attack

[Diagram: an application attack travels as ordinary application traffic from the end-user interface, through the network layer and OS layer, to the application layer, reaching the custom application and its back-end database.]

Page 8: Software Security Testing


Software Security Vulnerabilities Reported

1995-1999

Year:            1995   1996   1997   1998   1999
Vulnerabilities:  171    345    311    262    417

2000-2005

Year:            2000    2001    2002    2003    2004    1Q-2Q 2005
Vulnerabilities: 1,090   2,437   4,129   3,784   3,780   2,874

Total vulnerabilities reported (1995 through 2Q 2005): 19,600 (source: CERT/CC)

Page 9: Software Security Testing


Cost of Software Security Vulnerabilities

NIST estimates costs of $60 Billion a year due to software vulnerabilities

Security fixes for implementation flaws typically cost $2,000-$10,000 when made during the testing phase; they may cost 5 to 10 times more when made after the application has shipped.

The cost of fixing architectural flaws is significantly higher than fixing implementation flaws.

Gartner Group says system downtime caused by software vulnerabilities will triple from 5% to 15% by 2008 for firms that don't take proactive security steps

Page 10: Software Security Testing


What to Do About These Serious Vulnerabilities?

Integrate Software Security into the Software Development Lifecycle

Page 11: Software Security Testing


When to Address Software Security?

As early as possible AND throughout the development lifecycle

Page 12: Software Security Testing


The Software Project Triangle

Software assurance affects every side of the triangle, and any changes you make to any side of the triangle are likely to affect software assurance

Page 13: Software Security Testing


Challenges to Developing Secure Software

SDLC often does not have security as a primary objective

SDLC often is not robust enough to handle complex development needs. For example, how does your SDLC handle:

– Inherent vulnerabilities in the technologies you’re using

– Use of code from untrusted (and open) sources

– Increases in features and complexity that make security harder

– Time-to-market pressure that pushes security out

– Vendors that don’t warrant the trustworthiness of their software

– Software developers that aren’t trained in secure development

– Component assemblies, COTS integration, etc.

– Time, money constraints

– COTS upgrades and patches

Page 14: Software Security Testing


Security Enhancing the Software Development Lifecycle

Page 15: Software Security Testing


Software Security Problems are Complicated

IMPLEMENTATION BUGS

Buffer overflow
– String format
– One-stage attacks

Race conditions
– TOCTOU (time of check to time of use)

Unsafe environment variables

Unsafe system calls – System()

Untrusted input problems

ARCHITECTURAL FLAWS

Misuse of cryptography

Compartmentalization problems in design

Privileged block protection failure (DoPrivilege())

Catastrophic security failure (fragility)

Type safety confusion errors

Broken or illogical access control (RBAC over tiers)

Method overriding problems (subclass issues)

Signing too much code

Page 16: Software Security Testing


The Challenge: Find Security Problems Before Deployment

Page 17: Software Security Testing


Software Security SDLC Touchpoints

[Diagram: security touchpoints mapped onto software artifacts across the lifecycle. Artifacts: requirements and use cases, design, test plans, code, test results, field feedback. Touchpoints: abuse cases, security requirements, risk analysis, external review, risk-based security tests, static analysis (tools), penetration testing, security breaks.]

Source: Gary McGraw

Page 18: Software Security Testing


Security Throughout the Application Lifecycle

Page 19: Software Security Testing


Requirements Phase

Page 20: Software Security Testing


Requirements Phase

You may have built a perfectly functional car, but that doesn’t mean its gas tank won’t blow up.

System requirements usually include functional requirements

But omit security requirements!

Page 21: Software Security Testing


Principles of the Requirements Phase

You can’t assume security will be addressed by the developers

To adequately identify and specify security requirements, a threat-based risk assessment must be performed to understand the threats the system may face when deployed. The development team needs to understand that the threats to the system may change both while the system is under development and after it is deployed.

If it’s not a requirement, it doesn’t get implemented and doesn’t get tested

Page 22: Software Security Testing


Security Requirements

Reuse common requirements
– Most IT systems have a common set of security requirements

– Some examples:

Username/password

Access control checks

Input validation

Audit

– Dozens of common security requirements have been collected and perfected by security professionals…use these to get your requirements right

Security Requirements should include negative requirements

Requirements tools should include misuse and abuse cases, as well as use cases, to capture what the system isn’t supposed to do

Page 23: Software Security Testing


Requirements Phase: Misuse and Abuse Cases

Use cases formalize normative behavior (and assume correct usage)

Describing non-normative behavior is a good idea
– Prepare for abnormal behavior (attack)

– Misuse or abuse cases do this

– Uncover exceptional cases

Leverage the fact that designers know more about their system than potential attackers do

Document explicitly what the software will do in the face of illegitimate use (an illustrative abuse case follows below).
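As a hedged illustration of capturing an abuse case, the Java sketch below turns "attacker repeatedly guesses a victim's password" into explicit expected behavior: lock the account after a fixed number of failures. The AuthService class, the five-attempt threshold, and the account names are assumptions made for this example, not requirements from the briefing.

  // Abuse case: "Attacker submits repeated bad passwords for a victim account."
  // Expected behavior: after MAX_ATTEMPTS failures the account is locked, even
  // if the correct password is then supplied.
  import java.util.HashMap;
  import java.util.Map;

  public class AbuseCaseLockoutDemo {

      /** Minimal in-memory authenticator used only to illustrate the abuse case. */
      static class AuthService {
          static final int MAX_ATTEMPTS = 5;                       // assumed threshold
          private final Map<String, String> passwords = new HashMap<>();
          private final Map<String, Integer> failures = new HashMap<>();

          void register(String user, String password) { passwords.put(user, password); }

          boolean login(String user, String password) {
              int failed = failures.getOrDefault(user, 0);
              if (failed >= MAX_ATTEMPTS) return false;            // account locked
              if (password.equals(passwords.get(user))) {
                  failures.put(user, 0);
                  return true;
              }
              failures.put(user, failed + 1);
              return false;
          }
      }

      public static void main(String[] args) {
          AuthService auth = new AuthService();
          auth.register("victim", "correct-horse");
          for (int i = 0; i < AuthService.MAX_ATTEMPTS; i++) {
              auth.login("victim", "guess-" + i);                  // attacker guesses
          }
          // The abuse case demands this print false: the account is now locked.
          System.out.println("login with correct password after lockout: "
                  + auth.login("victim", "correct-horse"));
      }
  }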

Page 24: Software Security Testing


Design Phase

Page 25: Software Security Testing


Principles of Secure Design

Based on the premise that correctness is NOT the same as security

Defense-in-depth: layering defenses to provide added protection. Defense in depth increases security by raising the cost of an attack by placing multiple barriers between an attacker and critical information resources.

Secure by design, secure by default, secure in deployment

Avoid High Risk Technologies

Page 26: Software Security Testing


Principles of Secure Design (cont.)

Isolate and constrain less trustworthy functions

Implement least privilege

Security through obscurity is wrong, except as a way to make reverse engineering more difficult

Using good software engineering practices doesn’t mean the software is secure

Page 27: Software Security Testing


Security in the Design Phase

Have a security expert involved when designing the system

Design should be specific enough to identify all security mechanisms
– Flow charts, sequence diagrams

– Use cases, misuse cases, and abuse cases

– Threat models

Sometimes an independent security review of the design is appropriate
– Very sensitive systems

– Inexperienced development team

– New technologies being used

Design your security mechanisms to be modular (see the filter sketch below)
– Allows reuse!

– Allows for a centralized mechanism
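One way to realize a modular, centralized mechanism, assuming a Java servlet environment (javax.servlet API), is a filter that every request passes through. The SecurityFilter name, the "/login" exclusion, and the session attribute checked are hypothetical; the design point is that the check lives in one reusable place rather than being repeated in every page or handler.

  import java.io.IOException;
  import javax.servlet.Filter;
  import javax.servlet.FilterChain;
  import javax.servlet.FilterConfig;
  import javax.servlet.ServletException;
  import javax.servlet.ServletRequest;
  import javax.servlet.ServletResponse;
  import javax.servlet.http.HttpServletRequest;
  import javax.servlet.http.HttpServletResponse;
  import javax.servlet.http.HttpSession;

  // Centralized security mechanism: every request passes through this filter, so
  // session checks (and input screening, logging, etc.) live in one reusable place.
  public class SecurityFilter implements Filter {

      @Override public void init(FilterConfig config) {}
      @Override public void destroy() {}

      @Override
      public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
              throws IOException, ServletException {
          HttpServletRequest request = (HttpServletRequest) req;
          HttpServletResponse response = (HttpServletResponse) res;

          // Reject requests that carry no authenticated session (login page excluded).
          HttpSession session = request.getSession(false);
          boolean loggedIn = session != null && session.getAttribute("user") != null;
          if (!loggedIn && !request.getRequestURI().endsWith("/login")) {
              response.sendError(HttpServletResponse.SC_UNAUTHORIZED);
              return;
          }
          chain.doFilter(req, res);
      }
  }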

Page 28: Software Security Testing


Threat Analysis

You cannot build secure applications unless you understand threats
– Adding security features does not mean you have secure software
– “We use SSL!”

Find issues before the code is created

Find different bugs than code review and testing do
– Implementation bugs vs. higher-level design issues

Approximately 50% of issues come from threat models

Page 29: Software Security Testing


Threat Modeling Process

Create a model of the app (DFD, UML, etc.)
– Build a list of assets that require protection

Categorize threats to each attack target node
– Spoofing, Tampering, Repudiation, Information Disclosure, Denial of Service, Elevation of Privilege (STRIDE)

Build a threat tree for each threat
– Derived from hardware fault trees

Rank threats by risk (Risk = Potential * Damage); a scoring sketch follows below
– Factors: Damage potential, Reproducibility, Exploitability, Affected Users, Discoverability (DREAD)
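A minimal sketch of the ranking step, assuming the common DREAD convention of scoring each factor from 1 (low) to 10 (high) and averaging them (the slide's Risk = Potential * Damage formulation is one alternative way to combine scores). The threat names and scores below are invented for illustration.

  import java.util.LinkedHashMap;
  import java.util.Map;

  public class DreadRanking {

      /** Average of the five DREAD factors, each scored 1 (low) to 10 (high). */
      static double dreadScore(int damage, int reproducibility, int exploitability,
                               int affectedUsers, int discoverability) {
          return (damage + reproducibility + exploitability
                  + affectedUsers + discoverability) / 5.0;
      }

      public static void main(String[] args) {
          // Hypothetical threats from a threat model, scored by the review team.
          Map<String, Double> risks = new LinkedHashMap<>();
          risks.put("SQL injection in login form", dreadScore(9, 9, 7, 9, 8));
          risks.put("Verbose error page leaks stack trace", dreadScore(3, 10, 8, 2, 9));

          // Higher score = address first.
          risks.entrySet().stream()
               .sorted((a, b) -> Double.compare(b.getValue(), a.getValue()))
               .forEach(e -> System.out.println(e.getValue() + "  " + e.getKey()));
      }
  }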

Page 30: Software Security Testing


Design Phase: Architectural Risk Analysis

The system designers should not perform the assessment

Build a one-page whiteboard design model

Use hypothesis testing to categorize risks
– Threat modeling / attack patterns

Rank risks

Tie to business context

Suggest fixes

Multiple iterations

Page 31: Software Security Testing


Risk Analysis Must be External to the Development Team

Having outside eyes look at your system is essential

– Designers and developers naturally have blinders on

– External just means outside of the project

– This is knowledge intensive

Outside eyes make it easier to “assume nothing”

– Find assumptions, make them go away

Red teaming is a weak form of external review

– Penetration testing is too often driven by an outside-in perspective

– External review must include architecture analysis

Security expertise and experience really helps

Page 32: Software Security Testing


Risk Assessment Methodologies

These methods attempt to identify and quantify risks, then discuss risk mitigation in the context of a wider organization

A common theme among these approaches is tying technical risks to business impact

Commercial

STRIDE from Microsoft

ACSM/SAR from Sun

Standards-Based

ASSET from NIST

OCTAVE from SEI

Page 33: Software Security Testing


Implementation Phase

Page 34: Software Security Testing


Secure Implementation Concepts

Developer training
– Essential that developers learn how to implement code securely

– Subtleties and pitfalls that can only be addressed with security training

Reuse of previously certified code that performs well for common capabilities, especially:
– Authentication

– Input Validation

– Logging

– Much of “custom” software uses previously developed code

Coding standards, style guides

Peer review or peer development

Page 35: Software Security Testing


Validating Inputs

Cleanse data

Perform bounds checking

Check (a validation sketch follows this list):
– Configuration files

– Command-line parameters

– URLs

– Web content

– Cookies

– Environment variables

– Filename references
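A minimal Java sketch of the whitelist-plus-bounds-checking idea referenced above. The regular expression, length limit, and method names are assumptions for illustration; the same pattern applies to the configuration files, command-line parameters, URLs, cookies, environment variables, and filename references listed.

  import java.util.regex.Pattern;

  public final class InputValidator {

      // Whitelist: accept only what is expected, rather than trying to enumerate attacks.
      private static final Pattern USERNAME = Pattern.compile("^[A-Za-z0-9_]{1,32}$");
      private static final int MAX_COMMENT_LENGTH = 2000;   // assumed limit

      /** Reject anything that is not a short alphanumeric identifier. */
      public static String requireUsername(String input) {
          if (input == null || !USERNAME.matcher(input).matches()) {
              throw new IllegalArgumentException("invalid username");
          }
          return input;
      }

      /** Bounds-check free-form text before it is stored or echoed back. */
      public static String requireComment(String input) {
          if (input == null || input.length() > MAX_COMMENT_LENGTH) {
              throw new IllegalArgumentException("comment missing or too long");
          }
          return input;
      }
  }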

Page 36: Software Security Testing


Secure Coding Guides

Provide guidance on code-level security
– Thread safety

– Attack patterns

– Technology-specific pitfalls

Booz Allen has developed internal secure coding guides
– Java (J2EE general)

– C/C++

– Software Security Testing

Page 37: Software Security Testing


Code Review

Code review is a necessary evil

Better coding practices make the job easier

Automated tools help catch common implementation errors (see the credential sketch below)

Implementation errors do matter

– Buffer overflows can be uncovered with static analysis

– Static analysis rule sets exist for C/C++, Java, and .NET

Tracing back from vulnerable location to input is critical

– Software exploits

– Attacking code
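As an example of the kind of dangerous construct code review and static analysis tools flag (hard-coded passwords are called out later under the testing phase), here is a hedged Java sketch; the JDBC URL, account name, and environment variable names are hypothetical.

  import java.sql.Connection;
  import java.sql.DriverManager;
  import java.sql.SQLException;

  public class DbConnections {

      // Dangerous construct a reviewer or static analysis tool should flag:
      // credentials embedded in source code end up in every copy of the binary.
      public static Connection openUnsafe() throws SQLException {
          return DriverManager.getConnection(
                  "jdbc:postgresql://db.example.com/orders", "app", "P@ssw0rd!");
      }

      // Better: pull credentials from the deployment environment (or a vault),
      // so code review can trace where secrets come from.
      public static Connection openSafer() throws SQLException {
          String user = System.getenv("DB_USER");          // assumed variable names
          String password = System.getenv("DB_PASSWORD");
          return DriverManager.getConnection(
                  "jdbc:postgresql://db.example.com/orders", user, password);
      }
  }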

Page 38: Software Security Testing


Code Review (cont’d)

Pros of code review
– Demonstrates that all appropriate security mechanisms exist (e.g., logging cannot be verified by pen testing)

– Can be performed throughout development

– Provides complete traceability to show that security mechanisms are implemented correctly

– Able to find risks that are not evident in the live application (explicit comments, race conditions, missing audit, class-level security, etc.)

Cons of code review
– Labor intensive (static analysis tools reduce labor and expand completeness)

– Requires an expert

– Using only automated tools isn’t sufficient

Page 39: Software Security Testing


Testing Phase

Page 40: Software Security Testing


Testing Phase

The objective of software security testing is to determine that the software:
– Contains no defects that can be exploited to force the software to operate incorrectly or to fail

– Does not perform any unexpected functions

– Contains no dangerous constructs in its source code (e.g., hard-coded passwords)

The methodology for achieving these objectives includes:
– Subjecting the software to the types of intentional faults associated with attack patterns

Question to be answered: Is the software’s exception handling adequate?

– Subjecting the software to the types of inputs associated with attack patterns

Question to be answered: Is the software’s error handling adequate? (A test sketch follows below.)
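A sketch of a risk-based security test that feeds attack-pattern inputs and checks for safe error handling. The handleSearch stand-in, the particular inputs, and the pass/fail heuristic are assumptions for illustration; a real test would drive the actual application and look for stack traces, unencoded echoes of input, and crashes.

  public class AttackInputTest {

      // Stand-in for the component under test; a real test would call the application itself.
      static String handleSearch(String query) {
          if (query == null || query.length() > 100) {
              return "ERROR: invalid input";   // controlled failure: no stack trace, no echo
          }
          return "RESULTS for " + query;
      }

      public static void main(String[] args) {
          String overlong = new String(new char[100_000]).replace('\0', 'A');
          // Inputs drawn from common attack patterns: SQL injection, XSS,
          // path traversal, oversized input, and missing input.
          String[] attackInputs = {
              "' OR '1'='1",
              "<script>alert(1)</script>",
              "../../etc/passwd",
              overlong,
              null
          };
          for (String input : attackInputs) {
              String response = handleSearch(input);
              // Error handling is adequate only if the response is controlled:
              // no exception text and no unencoded echo of the attack payload.
              boolean safe = !response.contains("Exception") && !response.contains("<script>");
              String shown = String.valueOf(input);
              if (shown.length() > 20) {
                  shown = shown.substring(0, 20) + "...";
              }
              System.out.println((safe ? "PASS  " : "FAIL  ") + shown);
          }
      }
  }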

Page 41: Software Security Testing


Software Security Testing is Different than ST&E

ST&E testing is functional in nature

– Goal of ST&E is to verify correct behavior, not to reveal defects or cause unintended behavior

– Only 3 NIST 800-53 controls refer to software security

ST&E testing is not targeted toward vulnerabilities

Software Security testing is purely technical (no managerial or operational testing)

Software Security testing seeks out defects and vulnerabilities and attempts to exploit or reveal them.

– Defects and vulnerabilities are in context of software platform or architecture

Software security testing goes into detail where ST&E leaves off.

Page 42: Software Security Testing


How Software Security Testing is Different

Software security testing is focused, in-depth security testing of the software
– Minimizes the gap between testing depth and the potential sophistication of attackers

[Chart: testing approaches plotted by level of detail and coverage. ST&E, recommended by NIST guidance, provides broad coverage at a lower level of detail; software security testing provides focused coverage at a higher level of detail, closer to the potential sophistication of attackers.]

Page 43: Software Security Testing


Testing strategy

1. Think like an attacker and a defender.
– Seek out, probe, and explore unused functions and features.
– Submit unexpected input.
– Enter obscure command-line options.
– Inspect call stacks and interfaces.
– Observe behavior when process flow is interrupted.

2. Verify all properties, attributes, and behaviors that are expected to be there.

3. Verify use of secure standards and technologies and secure implementations of same.

4. Be imaginative, creative, and persistent.

5. Include independent testing by someone who isn’t familiar with the software.

Page 44: Software Security Testing


What parts of software to test

The parts that implement:

The interfaces/interactions between the software system’s components (modules, processes)

The interfaces/interactions between the software system and its execution environment

The interfaces/interactions between the software system and its users

The software system’s trusted and high-consequence functions, such as the software’s exception handling logic and input validation routines

Page 45: Software Security Testing


Lifecycle timing of security reviews and tests

Page 46: Software Security Testing


Software security testing tools

Categories of testing tools (purpose: example tools)

– Code security review (source code, bytecode): PREfast (in Microsoft Visual Studio 2005 Enterprise Edition), CodeAssure Workbench (Secure Software), inSpect (Klocwork), Source Code Analysis Engine & Audit Workbench (Fortify), Prexis (Ounce Labs)

– Run-time binary analysis: AppVerifier (in Visual Studio 2005)

– Application vulnerability scanning: WebInspect (SPI Dynamics), AppScan (Watchfire), ScanDo (KaVaDo), WebScarab (OWASP)

– Security fault injection: Holodeck (Security Innovation), Icebox (HBGary)

– Software penetration testing: Red Team Workbench / Red Team Intercept (Fortify), SPI Toolkit (SPI Dynamics), SOAtest penetration testing tool (Parasoft)

– Reverse engineering (disassembly, decompilation): FxCop (in Visual Studio 2005), Logiscan & Bugscan (LogicLibrary)

– Other (fuzzing, brute force testing, buffer overrun detection, input validation checking, etc.): Codenomicon (Codenomicon), Peach Fuzzer Framework (open source), BFBTester (open source), Stinger (Aspect Security)

Page 47: Software Security Testing


Software Security Testing - Conclusions

Code review combined with application-level vulnerability scanning or penetration testing is the most effective approach

– Analysts have the ability to more clearly see and understand the interfaces of the application

– Analysts have the ability to verify or dismiss suspected vulnerabilities

– Security testers can “look under the hood” to fully understand application behavior

May lead to recommendations that change the requirements specification

Not guaranteed to find all problems: addressing security throughout the SDLC is more likely to reduce vulnerabilities

The changing threat environment can still introduce vulnerabilities, so some level of testing should be performed periodically

Page 48: Software Security Testing


Deployment Phase

Page 49: Software Security Testing


Deployment Phase

Pre-deployment activities depend on the application but may include:
– Remove developer hooks

– Remove debugging code

– Remove sensitive information in comments, e.g. “FIXME”

– Harden deployment OS, web server, app server, db server, etc.

– Remove default and test accounts

– Change all security credentials for the deployed system (e.g., database passwords) to reduce the number of insiders with direct access to the operational system

Page 50: Software Security Testing


Post-Deployment Validation

Security of deployed software should be investigated regularly

Requires observing and analyzing its usage in the field

Requires automated support

Page 51: Software Security Testing


Maintenance Phase

Page 52: Software Security Testing


Maintenance Phase Security Activities

Monitor for, and install, patches for COTS components in your system

Individually consider security implications for each bug fix

Security analysis review for every major release

Changes to the system should not be ad hoc; they should be reflected in the requirements specification, design specification, etc.

Monitoring and intrusion detection at the application level (see the sketch below)
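A hedged sketch of application-level monitoring: security-relevant events are written to an audit log and counted, with a simple threshold raising an alert that an operator or intrusion detection pipeline can act on. The logger name, threshold, and failed-login event are assumptions for illustration.

  import java.util.Map;
  import java.util.concurrent.ConcurrentHashMap;
  import java.util.concurrent.atomic.AtomicInteger;
  import java.util.logging.Logger;

  public class SecurityEventMonitor {

      private static final Logger LOG = Logger.getLogger("security.audit");   // assumed name
      private static final int FAILED_LOGIN_ALERT_THRESHOLD = 10;             // assumed value

      private final Map<String, AtomicInteger> failedLogins = new ConcurrentHashMap<>();

      // Called by the application whenever authentication fails; logs the event,
      // counts failures per source, and raises an alert past the threshold.
      public void recordFailedLogin(String sourceIp, String username) {
          LOG.warning("failed login for " + username + " from " + sourceIp);
          int count = failedLogins
                  .computeIfAbsent(sourceIp, k -> new AtomicInteger())
                  .incrementAndGet();
          if (count == FAILED_LOGIN_ALERT_THRESHOLD) {
              LOG.severe("possible brute-force attack from " + sourceIp);
          }
      }
  }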

Page 53: Software Security Testing


Relevant Process Models

Capability models are designed to improve a process
– CMMI

– SSE-CMM

Capability models define a system by:
– Analyzing, quantifying, and enhancing the efficiency and quality of generic processes

– Improving efficiency and adaptability

– Providing a framework to apply a process multiple times with consistency

Page 54: Software Security Testing


A Final Caution

It is inevitable that unknown security vulnerabilities will be present in deployed software
– Software, users, and environments are too complex to fully comprehend

– Environment and usage are subject to change

Page 55: Software Security Testing


References

Application security public cases:
– http://informationweek.com/story/showArticle.jhtml?articleID=164900859

Build Security In portal:
– https://buildsecurityin.us-cert.gov/portal/

Open Web Application Security Project (OWASP):
– http://www.owasp.org/

Computer Security: Art and Science by M. Bishop

Secure Coding: Principles and Practices by M. G. Graff and K. R. van Wyk

Exploiting Software: How to Break Code by G. Hoglund and G. McGraw

Writing Secure Code by M. Howard and D. LeBlanc

Attack Modeling for Information Security and Survivability by A.P. Moore, R.J. Ellison, and R.C. Linger

Building Secure Software by J. Viega and G. McGraw
