Software Architectural Risk Analysis (SARA) — cradpdf.drdc-rddc.gc.ca/pdfs/unc108/p534648_a1b.pdf
TRANSCRIPT
Defence Research and Development Canada
Recherche et développement pour la défense Canada
Software Architectural Risk Analysis (SARA)
Frédéric Painchaud
Robustness and Software Analysis Group
Defence R&D Canada • R & D pour la défense Canada
PAGE 2
Agenda
• Introduction
– Break
• Architectural Risk Analysis
– Break
• Risk Mitigation
• Continual Evaluation and Assessment
• Conclusion
PAGE 3
Introduction
• Risk?
– “The net negative impact of the exercise of a vulnerability, considering both the probability and the impact of occurrence”.
– “A function of the likelihood of a given threat-source’s exercising a particular potential vulnerability, and the resulting impact of that adverse event on the organization”.
• Risk Management?
– “The process of identifying risk, assessing risk, and taking steps to reduce risk to an acceptable level”.
– Gary Stoneburner, Alice Goguen and Alexis Feringa, Risk Management Guide for Information Technology Systems, NIST, Special Publication 800-30.
Introduction
• The three processes of Risk Management:
– Risk assessment (aka Risk analysis)
– Risk mitigation
– Evaluation and assessment
• Why?
– To forecast potential problems;
– To develop and implement appropriate controls to avoid identified problems;
– To plan the actions to be taken if the controls go wrong, if uncontrolled identified problems arise (residual risk) and if unforeseen problems happen.
PAGE 4
Risk assessment → Risk mitigation → Evaluation and assessment
Introduction
• Architectural Risk Analysis (ARA)?
– An adaptation of general Risk Assessment (Analysis), the first step of Risk Management.
• ARA and Risk Management in general are more an art than a science. Their processes are defined, but practice and experience tailor them to the particular organization.
PAGE 5
Introduction
• Going back to Risk Management, for IT systems, it must be integrated into the Software Development Life Cycle (SDLC) to be effective.
• The risk management methodology is (roughly) the same regardless of the SDLC phase for which the assessment is being conducted.
PAGE 6
Introduction
SDLC Phases — Support from Risk Management activities:
Phase 1: Initiation
• System risk identification
• System requirements incl. security
• System CONOPS
Phase 2: Development or acquisition
• Architecture design
• Coding practices (e.g., security: CERT C, ISO C)
• Testing (e.g., security testing)
Phase 3: Implementation (in the environment)
• Validation w.r.t. the requirements and operational environment
Phase 4: Operation and maintenance
• Continual evaluation and assessment
Phase 5: Disposal
• Proper disposal and system migration
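The phase-to-activity mapping above can be captured as a simple lookup, e.g. to build a per-phase review checklist. A minimal sketch (the dictionary and function names are illustrative, not part of the SARA material):

```python
# Illustrative mapping of SDLC phases to the supporting risk-management
# activities listed in the table above.
SDLC_RM_ACTIVITIES = {
    "Initiation": [
        "System risk identification",
        "System requirements incl. security",
        "System CONOPS",
    ],
    "Development or acquisition": [
        "Architecture design",
        "Coding practices (e.g., CERT C, ISO C)",
        "Testing (e.g., security testing)",
    ],
    "Implementation": [
        "Validation w.r.t. the requirements and operational environment",
    ],
    "Operation and maintenance": [
        "Continual evaluation and assessment",
    ],
    "Disposal": [
        "Proper disposal and system migration",
    ],
}

def checklist(phase: str) -> list:
    """Return the risk-management activities to cover in a given phase."""
    return SDLC_RM_ACTIVITIES[phase]
```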
PAGE 7
Introduction
• Participation in the risk management process is the business of many key players in organizations, including senior managers, business operations and IT procurement managers, IT security program managers and computer security officers, IT administrators and trainers.
PAGE 8
Architectural Risk Analysis

The nine-step process (inputs → step → outputs):
Step 1. System characterization: Architecture documentation → One-page overview
Step 2. Threat identification: One-page overview, History of system attacks, Sources of intelligence → Threat statement
Step 3. Vulnerability identification: One-page overview, History of system attacks, Sources of intelligence, Security testing results → Vulnerability statement
Step 4. Control analysis: One-page overview, Security testing results, Planned controls → List of controls
Step 5. Attack likelihood determination: Threat statement, Vulnerability statement, List of controls → Attack likelihood rating
Step 6. Impact analysis: One-page overview → Impact rating
Step 7. Risk determination: Attack likelihood rating, Impact rating → Rated risks
Step 8. Control recommendations: Rated risks → Recommended controls or modifications
Step 9. Results documentation: One-page overview, Threat statement, Rated risks, Recommended controls or modifications → Architectural Risk Analysis Report
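The artifact flow between the nine steps can be made explicit as data, which also lets you check that every step's inputs exist before it runs. A hypothetical sketch (class and function names are illustrative, not part of the SARA methodology):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Step:
    number: int
    name: str
    inputs: tuple
    outputs: tuple

# Artifacts that come from outside the nine steps themselves.
EXTERNAL = {"Architecture documentation", "History of system attacks",
            "Sources of intelligence", "Security testing results",
            "Planned controls"}

ARA_STEPS = (
    Step(1, "System characterization",
         ("Architecture documentation",), ("One-page overview",)),
    Step(2, "Threat identification",
         ("One-page overview", "History of system attacks",
          "Sources of intelligence"), ("Threat statement",)),
    Step(3, "Vulnerability identification",
         ("One-page overview", "History of system attacks",
          "Sources of intelligence", "Security testing results"),
         ("Vulnerability statement",)),
    Step(4, "Control analysis",
         ("One-page overview", "Security testing results",
          "Planned controls"), ("List of controls",)),
    Step(5, "Attack likelihood determination",
         ("Threat statement", "Vulnerability statement",
          "List of controls"), ("Attack likelihood rating",)),
    Step(6, "Impact analysis", ("One-page overview",), ("Impact rating",)),
    Step(7, "Risk determination",
         ("Attack likelihood rating", "Impact rating"), ("Rated risks",)),
    Step(8, "Control recommendations", ("Rated risks",),
         ("Recommended controls or modifications",)),
    Step(9, "Results documentation",
         ("One-page overview", "Threat statement", "Rated risks",
          "Recommended controls or modifications"),
         ("Architectural Risk Analysis Report",)),
)

def missing_inputs(step: Step) -> list:
    """Inputs of a step that are neither external nor produced earlier."""
    produced = {a for s in ARA_STEPS if s.number < step.number
                for a in s.outputs}
    return [a for a in step.inputs if a not in produced | EXTERNAL]
```

Running `missing_inputs` over all nine steps returns empty lists, confirming the flow in the diagram is well-ordered.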
PAGE 10
Step 1. System Characterization
Architecture documentation → Step 1. System characterization → One-page overview
• Mostly design diagrams
• Questionnaires, on-site interviews, tools, …
• Update or create design diagrams (validate against the operational system)
• Merge the views and abstract the design levels to produce a one-page overview (ambiguity analysis can help)
The basis for the entire architectural risk analysis.
PAGE 11
Software Architecture Recovery Tools
• When your architectural documentation is incomplete, out of date or nonexistent, and you have the source code, there are robust tools available to help.
• Refer to Philippe Charland’s presentation.
PAGE 12
Software Architecture Recovery Tools
• These tools are useful when you have the source code. When you only have the binary, there are currently two choices:
1. use decompilers to generate source code of varying quality from the binary and then apply these tools, or
2. manually analyze the binary, with the help of specialized tools to accelerate the process.
• These tools help accomplish Step 1 of the Architectural Risk Analysis process. When it comes to managing the whole process, a solution like KDM Workbench (seen during the tutorial) is more appropriate.
PAGE 13
Step 2. Threat Identification
One-page overview, History of system attacks, Sources of intelligence → Step 2. Threat identification → Threat statement
• Sources: OPS, ASIC, Darknets/Blacknets, CAPEC, security design patterns, STRIDE, SANS Top Cyber Security Risks, …
• Misuse and abuse cases (add the time you take for use cases)
• Attack resistance analysis
• Distributed architectures, dynamic code generation and interpretation, APIs across stateless protocols, rich internet applications, service-oriented architectures, …
• Identifies threats, their level of motivation, their capabilities and their likelihood
PAGE 14
OPS: Operations
ASIC: All-Source Intelligence Centre
CAPEC: Common Attack Pattern Enumeration and Classification, http://capec.mitre.org
STRIDE: Spoofing identity, Tampering with data, Repudiation, Information disclosure, Denial of service, Elevation of privilege, http://msdn.microsoft.com/en-us/library/ee823878(CS.20).aspx
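Entries in a threat statement can be tagged with their STRIDE categories to make the threat model explicit. A minimal sketch (the enum name and the example threats are illustrative, not from the SARA material):

```python
from enum import Enum

class Stride(Enum):
    """Microsoft's STRIDE threat categories, one per letter."""
    SPOOFING_IDENTITY = "S"
    TAMPERING_WITH_DATA = "T"
    REPUDIATION = "R"
    INFORMATION_DISCLOSURE = "I"
    DENIAL_OF_SERVICE = "D"
    ELEVATION_OF_PRIVILEGE = "E"

# Hypothetical threat-statement entries, each tagged with the STRIDE
# categories it falls under (one threat can fall under several).
threat_statement = {
    "Forged client credentials": [Stride.SPOOFING_IDENTITY],
    "Log deletion after intrusion": [Stride.TAMPERING_WITH_DATA,
                                     Stride.REPUDIATION],
}
```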
Step 3. Vulnerability Identification
One-page overview, History of system attacks, Sources of intelligence, Security testing results → Step 3. Vulnerability identification → Vulnerability statement
• Sources: OPS, ASIC, Darknets/Blacknets, seven pernicious kingdoms, 19 deadly sins, OWASP Top 10, CWE, Open Source Vulnerability Database, CVE, questionnaires, on-site interviews, tools, …
• Assess evidence that controls are working properly
• Underlying framework weakness analysis
• System's dependencies
• Ambiguity analysis (in requirements and design)
• Trust modeling (security zones)
• Data sensitivity modeling (privacy and integrity)
• Identifies vulnerabilities and their exploitability
PAGE 16
Seven pernicious kingdoms: Brian Chess and Gary McGraw's taxonomy, http://www.cigital.com/papers/download/bsi11-taxonomy.pdf
19 deadly sins: Michael Howard's list, http://blogs.msdn.com/michael_howard/archive/2005/07/11/437875.aspx
OWASP: Open Web Application Security Project, http://www.owasp.org/index.php/Category:OWASP_Top_Ten_Project
CWE: Common Weakness Enumeration, http://cwe.mitre.org/
CVE: Common Vulnerabilities and Exposures, http://cve.mitre.org/
Step 4. Control Analysis
One-page overview, Security testing results, Planned controls → Step 4. Control analysis → List of controls
• Identifies current and future controls and their effectiveness
• Planned controls must be considered for future assessments
• Assess evidence that controls are working properly
• Controls are technical or non-technical
• Controls are preventive or detective
• Ambiguity analysis (in requirements and design)
• Trust modeling (security zones)
• Data sensitivity modeling (privacy and integrity)
PAGE 18
Step 5. Attack Likelihood Determination
Threat statement, Vulnerability statement, List of controls → Step 5. Attack likelihood determination → Attack likelihood rating
• Identifies the likelihood of threats exercising vulnerabilities, that is, the likelihood of attack scenarios
• No black magic: a function of threat and vulnerability likelihood, the latter being dependent on controls
• Ambiguity analysis (in requirements and design)
• Threat modeling (attack surface)
PAGE 19
Step 5. Attack Likelihood Determination
                          Threat likelihood
Vulnerability likelihood  High    Medium  Low
High                      High    High    Medium
Medium                    High    Medium  Low
Low                       Medium  Low     Low
Step 6. Impact Analysis
One-page overview → Step 6. Impact analysis → Impact rating
• Identifies the magnitude of impacts
• Needs key players: senior management, business operations managers and IT security program managers
• Interviews
• Impacts can be measured quantitatively or qualitatively
• Examples: lost revenue, maintenance cost, loss of public confidence, …
PAGE 21
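For the quantitative route, one common form (not prescribed by these slides; shown purely as an illustration) is an annualized loss expectancy: the cost of one occurrence times the expected occurrences per year.

```python
def annualized_loss_expectancy(single_loss: float, annual_rate: float) -> float:
    """ALE = single-loss expectancy (cost of one occurrence)
           x annualized rate of occurrence (expected occurrences/year)."""
    return single_loss * annual_rate

# e.g., an outage costing $50,000 expected twice a year:
# annualized_loss_expectancy(50_000, 2.0) -> 100000.0
```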
Step 7. Risk Determination
Attack likelihood rating, Impact rating → Step 7. Risk determination → Rated risks
• Identifies the risks with their associated levels
• Again, no black magic: a function of attack likelihood and impact
• Associates attacks with impacts
PAGE 22
Step 7. Risk Determination
         Attack likelihood
Impact   High    Medium  Low
High     High    High    Medium
Medium   High    Medium  Low
Low      Medium  Low     Low
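Steps 5 and 7 use the same qualitative combination table: pick the row and column levels and read off the cell. A sketch of the two lookups (function names are illustrative):

```python
# The qualitative combination table from the Step 5 and Step 7 slides:
# row = vulnerability likelihood (Step 5) or impact (Step 7),
# column = threat likelihood (Step 5) or attack likelihood (Step 7).
COMBINE = {
    ("High", "High"): "High",     ("High", "Medium"): "High",
    ("High", "Low"): "Medium",
    ("Medium", "High"): "High",   ("Medium", "Medium"): "Medium",
    ("Medium", "Low"): "Low",
    ("Low", "High"): "Medium",    ("Low", "Medium"): "Low",
    ("Low", "Low"): "Low",
}

def attack_likelihood(vulnerability: str, threat: str) -> str:
    """Step 5: combine vulnerability and threat likelihood."""
    return COMBINE[(vulnerability, threat)]

def risk_level(impact: str, attack: str) -> str:
    """Step 7: combine impact and attack likelihood."""
    return COMBINE[(impact, attack)]
```

For example, a high-impact scenario whose attack likelihood came out Medium in Step 5 rates as High risk in Step 7.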
Step 8. Control Recommendations
Rated risks → Step 8. Control recommendations → Recommended controls or modifications
• Risk prioritization (cost-benefit analysis)
• For each risk, recommend one or more new controls or system modifications that will eliminate or mitigate that risk and are appropriate to the organization's operations
PAGE 24
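The cost-benefit prioritization above can be sketched as an ordering over rated risks: highest risk level first and, within a level, the cheapest control first (a simplistic stand-in for a real cost-benefit analysis; risk names and costs are invented examples):

```python
RANK = {"High": 2, "Medium": 1, "Low": 0}

def prioritize(rated_risks):
    """Sort (risk_name, level, control_cost) triples: highest risk
    level first; within a level, the cheapest control first."""
    return sorted(rated_risks, key=lambda r: (-RANK[r[1]], r[2]))

risks = [
    ("SQL injection in login", "High", 20_000),
    ("Verbose error pages", "Low", 1_000),
    ("Unpatched framework", "High", 5_000),
]
# prioritize(risks)[0][0] -> "Unpatched framework"
```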
Step 9. Results Documentation
One-page overview, Threat statement, Rated risks, Recommended controls or modifications → Step 9. Results documentation → Architectural Risk Analysis Report
• A report that describes the architecture, the identified threats, the risks with their associated levels, exploited vulnerabilities and impacts, and the final recommendations on controls and modifications to implement
PAGE 25
Architectural Risk Analysis and Risk Management in Practice
• Cigital’s ARA Methodology (seen during the tutorial and mapped on the processes presented here)
• Microsoft's
– STRIDE Threat Model (Spoofing identity, Tampering with data, Repudiation, Information disclosure, Denial of service, Elevation of privilege)
• http://msdn.microsoft.com/en-us/library/ee823878(CS.20).aspx
– Security Risk Management Guide
• http://technet.microsoft.com/en-us/library/cc163143.aspx
• Sun’s Adaptive Countermeasure Selection Mechanism/Security Adequacy Review (ACSM/SAR)
– Discussed in Secure Coding: Principles and Practices, Mark G. Graff & Kenneth R. van Wyk, 2003, ISBN 0-596-00242-4.
PAGE 26
Architectural Risk Analysis and Risk Management in Practice
• Department of Homeland Security “Build Security In” website, https://buildsecurityin.us-cert.gov/
• NIST's
– Recommended Security Controls for Federal Information Systems and Organizations, Special Publication 800-53 Rev. 3, August 2009.
– DRAFT Managing Risk from Information Systems: An Organizational Perspective, Special Publication 800-39, April 2008.
– DRAFT Guide for Applying the Risk Management Framework to Federal Information Systems: A Security Life Cycle Approach, Special Publication 800-37 Rev. 1, November 2009.
– Risk Management Guide for Information Technology Systems, Special Publication 800-30, July 2002.
– http://csrc.nist.gov/publications/PubsSPs.html
PAGE 27
Architectural Risk Analysis and Risk Management in Practice
• SEI’s Operationally Critical Threat, Asset, and Vulnerability Evaluation (OCTAVE)
– http://www.cert.org/octave/
• ISACA’s Control Objectives for Information and Related Technology (COBIT)
– http://www.isaca.org/Template.cfm?Section=COBIT6&Template=/TaggedPage/TaggedPageDisplay.cfm&TPLID=55&ContentID=7981
PAGE 28
SEI: Software Engineering Institute
ISACA: Information Systems Audit and Control Association
Risk Mitigation
• A few options:
– Acknowledgment and research: acknowledge threats and vulnerabilities and research controls to lower risk
– Risk limitation: lower or minimize risk by implementing controls
– Risk planning: prioritize (cost-benefit analysis, operational impact and feasibility), implement and maintain controls, and plan for residual risk
– Risk avoidance: eliminate the risk cause and/or consequence, if possible
– Risk transference: transfer the risk to another party, e.g., purchase insurance
– Risk assumption: accept the potential risk (if at an acceptable level)
PAGE 30
Continual Evaluation and Assessment
• Do not forget:
– Threats and the system change over time: risk management is ongoing and evolving!
– Plan for re-evaluation and re-assessment!
PAGE 31
Conclusion
• Keys for success in ARA:
– Senior management’s commitment
– Full support and participation of the IT team
– Competence of the risk analysis team:
• Architectural risk analysis process
• Threats and vulnerabilities (the attacker’s perspective)
• System and controls (the defender’s perspective)
– Support of the user community:
• Participation in risk analysis
• Compliance with controls
– Ongoing evaluation and assessment
PAGE 32