OWASP Secure Code Review Metrics

Secure Code Review:
Enterprise Metrics
Richard Tychansky
Lockheed Martin Corporation
Email: [email protected]
OWASP
11/10/2010
Copyright © The OWASP Foundation
Permission is granted to copy, distribute and/or modify this document
under the terms of the OWASP License.
The OWASP Foundation
http://www.owasp.org
Agenda
 Software Development Life Cycles
 Enterprise Elements for Secure Development
 Application Security Standards
 Secure Code Review Metrics
 Metrics by SDLC Phase (General Model)
 Implementing the Framework
Software Development Life Cycles
 Each SDLC model has its own benefits depending upon your organizational needs.
 Agile
 Waterfall
 Iterative
 Vee Model
 Incremental and Iterative Development
 Microsoft Security Development Lifecycle (SDL)
 Ultimately, to develop secure software you need to follow a repeatable and continuously improving software engineering process.
 Security needs to be integrated and measured at each SDLC phase.
Enterprise Elements for Secure Development
 Management Support
 Developer Training
 Professional Career Development
 Technical Vulnerability Management Program
 Integration of Security Standards into the SDLC
 Security Metrics Program
Application Security Standards
 OWASP Application Security Verification Standard
 Level 1 Automated Verification
 Level 1A – Dynamic Scan
 Level 1B – Source Code Scan
 Level 2 Manual Verification
 Level 2A – Penetration Test
 Level 2B – Code Review
 Level 3 Design Verification – threat modeling
 Level 4 Internal Verification – examine how security works – look for malicious code
 NSA Guidance for Addressing Malicious Code Risk
 DISA STIG Application Security and Development
 Use a standard which is development life cycle independent and vendor agnostic.
 Maintain the principles of the SDLC model you are using and integrate security at each phase.
Secure Code Review Metrics
 Decide what to measure.
 Set the minimum benchmark.
 Define reporting requirements for management and customers.
 Use a hybrid approach to integrating standards into your SDLC model of choice.
 Map metrics to ASVS level completion and to security testing and monitoring programs.
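The "set the minimum benchmark" step can be sketched as a simple automated check. This is a minimal illustration only; the metric names and threshold values below are assumptions for the example, not values prescribed by ASVS or OWASP:

```python
# Hypothetical minimum benchmarks per metric (percentages).
BENCHMARKS = {
    "requirements_with_security_pct": 90.0,  # assumed threshold
    "components_code_reviewed_pct": 80.0,    # assumed threshold
}

def check_benchmarks(measured):
    """Return the names of metrics that fall below their minimum benchmark."""
    return [name for name, minimum in BENCHMARKS.items()
            if measured.get(name, 0.0) < minimum]

# One review cycle's measurements: code review coverage misses its benchmark.
failures = check_benchmarks({
    "requirements_with_security_pct": 95.0,
    "components_code_reviewed_pct": 60.0,
})
# failures == ["components_code_reviewed_pct"]
```

A check like this gives management reporting a concrete pass/fail signal per metric rather than a raw number.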
Metrics by SDLC Phase (General Model)

Requirements
• Percentage of security requirements given in project specifications.
• Percentage of security requirements subject to cost/benefit and risk analysis.
• Percentage of security requirements considered in threat models.

Design
• Percentage of design components subjected to attack surface analysis.
• Percentage of security controls covered by security design patterns.
• Percentage of security controls which pose an architectural risk.

Implementation (Coding)
• Percentage of application components subject to manual and/or automated source code review.
• Percentage of code deficiencies detected during peer reviews.
• Percentage of application components subject to code integrity/signing procedures.

Verification (Testing)
• Percentage of common weaknesses and exposures detected per requirement specification.
• Percentage of security controls within the application that met the required specification for software assurance.

Reference: Allen, J. (2009)
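Every metric in the model above is a ratio over a count of phase artifacts. A minimal sketch of computing one implementation-phase metric (the component names here are hypothetical):

```python
def percentage(meeting_criterion, total):
    """Percentage of items meeting a criterion; 0.0 when there are no items."""
    if total == 0:
        return 0.0
    return 100.0 * meeting_criterion / total

# Implementation-phase example: components subject to source code review.
components = ["auth", "billing", "reporting", "admin-ui"]  # hypothetical names
reviewed = {"auth", "billing", "admin-ui"}
pct_reviewed = percentage(len(reviewed), len(components))  # 75.0
```

The same helper covers each row of the table by substituting the relevant numerator and denominator (requirements, controls, components, deficiencies).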
Implementing the Framework
 Distinguish COTS from internally developed web application security metrics.
 Security in source code begins with requirements and continues through design and test (i.e., throughout the SDLC).
 Align source code vulnerability metrics to the OWASP Top Ten "design flaw categories".
 Security design flaw metrics captured for:
 Source code design
 Insecure field scope
 Insecure method scope
 Insecure class modifiers
 Unused external references
 Redundant code
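Capturing these design flaw metrics can start as a simple tally of review findings by category. The sketch below assumes a hypothetical finding record format; the category labels come from the list above, but nothing here is a prescribed OWASP data model:

```python
from collections import Counter

# Hypothetical code review findings tagged with the categories above.
findings = [
    {"file": "Account.java", "category": "insecure_field_scope"},
    {"file": "Account.java", "category": "insecure_method_scope"},
    {"file": "Util.java", "category": "redundant_code"},
    {"file": "Login.java", "category": "insecure_field_scope"},
]

def flaw_metrics(findings):
    """Count review findings per design flaw category for trend reporting."""
    return Counter(f["category"] for f in findings)

metrics = flaw_metrics(findings)  # e.g. metrics["insecure_field_scope"] == 2
```

Tallies like these, tracked per release, give the trend data that a centralized source code analysis program needs.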
Conclusions
 Consensus building across multiple business areas is not easy.
 Training all developers is elusive.
 Centralizing source code analysis is problematic.
 Finding the right reporting metrics for Senior Management is critical to project success.
References
 Allen, J. (2009). Measuring Software Security. Software Engineering Institute, Carnegie Mellon University. http://www.cert.org/archive/pdf/research-rpt-2009/allen-meas-soft-sec.pdf
 Application Security and Development STIG, Version 3 Release 2. DISA. http://iase.disa.mil/stigs/downloads/zip/u_application_security_and_development_stig_v3r2_20101029.zip
 Fundamental Practices for Secure Software Development. Software Assurance Forum for Excellence in Code (SAFECode). http://www.safecode.org/publications/SAFECode_Dev_Practices1108.pdf
 NSA Guidance for Addressing Malicious Code Risk. http://www.nsa.gov/ia/_files/Guidance_For_Addressing_Malicious_Code_Risk.pdf
 OWASP Application Security Verification Standard Project. http://www.owasp.org/index.php/Category:OWASP_Application_Security_Verification_Standard_Project
 OWASP Security Code Review in the SDLC. http://www.owasp.org/index.php/Security_Code_Review_in_the_SDLC
 OWASP Top Ten Project. http://www.owasp.org/index.php/OWASP_Top_Ten_Project
 Resources for Developers. Software Engineering Institute, Carnegie Mellon University. http://www.cert.org/cert/information/developers.html
 Software Assurance: An Overview of Current Industry Best Practices. Software Assurance Forum for Excellence in Code (SAFECode). http://www.safecode.org/publications/SAFECode_BestPractices0208.pdf
Discussion Question
 Tell me about security metrics in your organization.