The Use and Implementation of Coding Standards for High-Confidence Software

Safety-Critical Coding and Testing Trends
MAE UK
November 10th, 2009
Presented by:
Paul Anderson
GrammaTech, Inc.
317 N Aurora St.
Ithaca, NY 14850
Tel: 607-273-7340
E-mail: [email protected]
Outline
 Introduction and Motivation
 Coding Standards for Safety-Critical Development
› Rationale
› Critiques of Coding Standards
› Detection of Standards Violations
 Advanced Static Analysis for Safety-Critical Code
 Case Studies
Introduction and Motivation
 High-confidence software is difficult to validate
› Risk of failure is very high
› Most software contains flaws
 Two approaches
› Use coding standards to streamline development
› Use advanced static-analysis tools to find flaws early
Microsoft Zune Bug
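This slide presumably showed the clock-driver code behind the December 31, 2008 freeze of 30GB Zune devices. A minimal reconstruction based on public accounts of the bug (the names ORIGINYEAR, is_leap_year, and year_from_days are illustrative, not taken from the slide):

    #include <stdio.h>

    #define ORIGINYEAR 1980  /* day counts start at Jan 1, 1980 */

    static int is_leap_year(int year)
    {
        return (year % 4 == 0 && year % 100 != 0) || (year % 400 == 0);
    }

    /* Convert a day count since ORIGINYEAR into a year. */
    static int year_from_days(int days)
    {
        int year = ORIGINYEAR;
        while (days > 365) {
            if (is_leap_year(year)) {
                if (days > 366) {
                    days -= 366;
                    year += 1;
                }
                /* BUG: when days == 366 (December 31 of a leap year),
                   neither branch changes days, so the loop never exits.
                   A fixed loop upper bound (JPL rule 2) would have
                   prevented the hang. */
            } else {
                days -= 365;
                year += 1;
            }
        }
        return year;
    }

    int main(void)
    {
        printf("%d\n", year_from_days(366)); /* hangs on this input */
        return 0;
    }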
Coding Standards
 Address several practices
› Layout
• “Spaces will not be used around ‘.’ or ‘->’, nor between unary operators
and operands.” (JSF 63).
› Naming
• “Different identifiers should be typographically unambiguous” (Misra
C++ 2-10-1).
› Syntactic restrictions
• “Brackets and parentheses in macros must be balanced” (JPL 8.2).
› Semantic guidelines
• “Do not use dynamic memory allocation after initialization.” (JPL 3); see the sketch after this list.
› Process
• “Compile with all warnings enabled, and use source code analyzers”
(JPL 10).
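To make the semantic-guideline category concrete, here is a small sketch of my own (not from the deck) showing a violation of JPL 3 and a compliant rewrite:

    #include <stdlib.h>
    #include <string.h>

    /* Violation of "Do not use dynamic memory allocation after
       initialization" (JPL 3): malloc() during normal operation can
       fail or fragment the heap unpredictably. */
    char *copy_message(const char *msg)
    {
        char *buf = malloc(strlen(msg) + 1);
        if (buf != NULL)
            strcpy(buf, msg);
        return buf;
    }

    /* Compliant alternative: a statically allocated buffer, sized at
       compile time and reused. */
    #define MSG_MAX 128
    static char g_msg_buf[MSG_MAX];

    const char *copy_message_static(const char *msg)
    {
        strncpy(g_msg_buf, msg, MSG_MAX - 1);
        g_msg_buf[MSG_MAX - 1] = '\0';
        return g_msg_buf;
    }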
Safety-critical Coding Standards
 Misra C/C++ (Motor Industry Software Reliability Association)
› Rules first introduced in 1998
› Revised in 2004: 141 rules for C
› Revised to cover C++ in 2008 (mostly derived from JSF rules): 228 rules
› Widely used in the motor vehicle industry
› Some support in popular embedded compilers
› Closed standard
 JSF
› Joint Strike Fighter Air Vehicle standards, introduced in 2005
› 232 rules for C and C++, some based on Misra C
› Not yet widely used, due to the low uptake of C++ for safety-critical systems
› Open standard
 EC
› A subset of ISO C, introduced in 2003 by Les Hatton
› Designed to be “measurement based”
› Open standard
Safety-critical Coding Standards
 Netrino Embedded C Coding Standard
› Much overlap with Misra C, but more liberal
› Rules biased towards enforceability
› Closed standard
 JPL “Power of 10”
› Written by Gerard Holzmann of the NASA Jet Propulsion Laboratory; introduced in 2006
› 23 distinct rules
› Intended as a minimum set of the most important rules
› Designed to leverage automation
› Adopted by JPL for NASA codes
› Open standard
Rationale for Coding Rules
 Clarity
› Reduce programmer confusion
› A construct may be perfectly unambiguous and well-defined, but
may make code difficult to read
› “Do not use goto statements” (JPL 1.1)
 Predictability
› Eliminate sources of ambiguity
› Helps with portability
› “Do not use union types” (Misra C++ 9-5-1)
 Simplicity
› Keep the program simple
› May help reduce the cost of testing
› Keeps programs amenable to analysis
› “No recursion” (JPL 1.2)
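For instance (my illustration, not from the deck), a recursive function violates JPL 1.2, while the iterative form keeps stack depth constant and also satisfies the fixed-loop-bound rule:

    #include <assert.h>
    #include <stdint.h>

    /* Violates "No recursion" (JPL 1.2): stack depth depends on the
       input and is hard to bound statically. */
    uint64_t fact_recursive(unsigned n)
    {
        return (n <= 1) ? 1u : n * fact_recursive(n - 1);
    }

    /* Compliant: an iterative loop with a fixed upper bound, which also
       satisfies "Give all loops a fixed upper-bound" (JPL rule 2). */
    #define FACT_MAX 20u  /* 20! is the largest factorial fitting in 64 bits */

    uint64_t fact_iterative(unsigned n)
    {
        uint64_t result = 1u;
        assert(n <= FACT_MAX);
        for (unsigned i = 2u; i <= n && i <= FACT_MAX; i++)
            result *= i;
        return result;
    }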
Rationale for Coding Rules
 Defense
› Encourage defensive programming
› Fosters maintainability
› “A switch statement shall be a well-formed switch statement” (Misra C++ 6-4-3); see the sketch after this list
 Compliance
› Standards compliance
› Aids portability
› “Use IEEE floating point formats” (Misra C++ 0-4-3)
 Process
› How code is developed, not about the code itself
› “Compile with all warnings enabled, and use source code analyzers” (JPL 10)
 Performance
› Nothing to do with safety
› “Trivial forwarding functions should be inlined” (JSF 124)
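As an illustration of the defense rationale (my sketch; the cited rule is from Misra C++, but the shape is the same in C): a well-formed switch has a break in every clause and a default clause for unexpected values.

    #include <stdio.h>

    enum mode { MODE_IDLE, MODE_RUN, MODE_HALT };

    void handle_mode(enum mode m)
    {
        switch (m) {
        case MODE_IDLE:
            printf("idle\n");
            break;               /* every clause ends in a break: no fall-through */
        case MODE_RUN:
            printf("running\n");
            break;
        case MODE_HALT:
            printf("halted\n");
            break;
        default:                 /* defensive: handle corrupted or future values */
            printf("unexpected mode %d\n", (int)m);
            break;
        }
    }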
Critiques of Coding Standards
 Too many rules
 Rules are ambiguous or otherwise poorly defined
 Enforcement may not improve quality
› Hatton argues that rules should only be specified where there is
empirical evidence that violations are risky
› Fixing a violation may itself inject a fault, with probability p
• For Misra C, p must be below roughly 0.02 for enforcement to yield a net quality gain, but
• reported fault-injection rates are around p = 0.15
› Conclusion based on analyzing code after completion
 Poor support for automation
 Standards are closed
Critique of Critiques of Coding Standards
 Too many rules
› “Power of 10” is designed to be easy to recall
 Rules are ambiguous or otherwise poorly defined
› For many rules, consistency of interpretation is most important
› Dogmatic interpretation may be counterproductive
 Enforcement may not improve quality
› Enforcement helps quality by fostering good programming practices
› Advanced static-analysis tools do help eliminate bugs
 Poor support for automation
› Tool support is improving
 Standards are closed
› “Power of 10” is open
JPL “Power of 10” rules
 Easy for programmers to remember all rules
 Open standard
 Keeps code simple
› Easy for humans to understand
› Easy for machines to manipulate and analyze
• Not just static analysis, but test-case generation etc.
 Leverage automation as much as possible
› Manual scanning for violations would otherwise be infeasible
› Bias code towards analyzability
 Rule 10: “Compile with all warnings enabled, and use
source code analyzers”
› i.e., Advanced Static Analysis Tools
Advanced Static Analysis Testing
 Finds serious flaws by examining the program source code
› Leaks, Null pointer dereferences, Buffer overruns, Race conditions,
etc.
› No execution, so no test cases are required
 Uses symbolic execution to explore many paths through the
program
 Not guaranteed to find all bugs
 Low false positive rate
 Highly scalable (multiple MLOC)
 Extensible
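A sketch of my own (not from the slides) of the kind of path-sensitive defect these tools report without any test case:

    #include <stdio.h>
    #include <stddef.h>

    int sum_values(const int *values, size_t n)
    {
        int total = 0;

        if (values == NULL)
            fprintf(stderr, "warning: no values supplied\n");
            /* BUG: missing early return; control falls through to the loop */

        for (size_t i = 0; i < n; i++)
            total += values[i];  /* null dereference on the values == NULL path */
        return total;
    }

Symbolic execution explores both outcomes of the NULL test and reports the dereference on the path where the warning was printed; no crashing input needs to be constructed.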
JPL 10: “Compile with all warnings enabled, and use source code analyzers”
 By “source-code analyzers”, read “advanced static analysis
tools”
 Fix all issues raised, even if they are technically false positives
› Leaving them unaddressed may mask real problems
› Train programmers to “write to the tool”: eliminate false positives by rewriting the code
 Helps the tool, simplifies the code
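A hedged sketch of “writing to the tool” (my example, not GrammaTech's): restructure the code so that an invariant the programmer knows becomes visible to the analysis.

    int table[16];

    /* Before: a tool may warn that the index can be negative, because
       in C the % operator can yield a negative result for a negative
       operand. The programmer "knows" slot_hint is never negative here,
       so the warning is, technically, a false positive. */
    void store_before(int slot_hint, int value)
    {
        int i = slot_hint % 16;
        table[i] = value;        /* reported: possible out-of-bounds write */
    }

    /* After: the rewrite makes the invariant explicit. The warning
       disappears, and the code is clearer for human readers too. */
    void store_after(int slot_hint, int value)
    {
        unsigned i = (unsigned)slot_hint % 16u;  /* always in [0, 15] */
        table[i] = value;
    }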
Properties of a Violation Detection Tool
 Wide range of ability
› Should be able to find violations of simple rules, as well as deep semantic violations
• Easier to “dumb down” than “smarten up”
› Many different technologies appropriate
• Preprocessor, parser, control-flow analysis, data-flow, symbolic execution
 Customizability
› Easy mechanism for minor changes (e.g., N=20 to N=25); see the toy checker after this list
› Users should be able to adapt rules
› Entirely new rules should be possible
 Should cope with “extreme” environments
› Language variants (there is no ANSI C in practice)
› Unusual build systems
› Huge code bases
› Pathological constructs
 False positive management
› Unavoidable in practice
› Dismiss once, never be bothered again
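As a toy illustration of the customizability point above (entirely my sketch; real tools expose this through their own rule languages or extension APIs): a standalone checker for a line-length rule, where changing the parameter N is a one-line edit.

    #include <stdio.h>
    #include <string.h>

    /* Toy standalone checker: flag source lines longer than
       MAX_LINE_LEN characters. The "minor change" customization
       (e.g., N=20 to N=25 in a rule) is a one-line edit here. */
    #define MAX_LINE_LEN 80   /* the adjustable rule parameter N */

    int main(int argc, char **argv)
    {
        char line[4096];
        int lineno = 0, violations = 0;
        FILE *f = (argc > 1) ? fopen(argv[1], "r") : stdin;

        if (f == NULL) {
            perror("fopen");
            return 2;
        }
        while (fgets(line, sizeof line, f) != NULL) {
            size_t len = strcspn(line, "\n");  /* length without newline */
            lineno++;
            if (len > MAX_LINE_LEN) {
                printf("%s:%d: line is %zu chars (limit %d)\n",
                       (argc > 1) ? argv[1] : "<stdin>", lineno,
                       len, MAX_LINE_LEN);
                violations++;
            }
        }
        if (f != stdin)
            fclose(f);
        return violations > 0 ? 1 : 0;
    }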
Recommendations for Adoption of Coding Standards
 Pick a coding standard and adopt it
› Consider using JPL rules as the base
 Use a wide-spectrum violation detection tool
› Deploy this for regular scans and practice JPL rule 10
 Be careful about interpreting rules
› Use the rationale as a guide for rules with a subjective element
 Allow exceptions
› Discourage these for objective rules
 Be consistent about enforcement
 Extend the tools for new rules
› As new flaws are found, derive a rule for how the code should have
been written, and write a checker
 Don’t neglect manual reviews, thorough testing, etc.
Conclusions
 Safety-critical coding standards are becoming more popular
› Despite doubts about the efficacy of requiring enforcement for
existing code
› Wide consensus on use of such standards for new code
› Keys to success are non-dogmatic consistency, and leverage of
automation
 Advanced static analysis testing tools now best practice
› For finding serious defects early in development process
› For detecting violations of coding standards
 Use of both yields maximum benefit
› Symbiotic relationship developing between coding rules and tools
Case Study: LLNL
 Lawrence Livermore National Laboratory
› DoE-funded government research lab
 Detailed study of 200 KLOC C/C++ program
› 285 defects found using Klocwork
• 56 critical (null pointer dereferences, buffer overruns)
• 14 severe (use of freed memory)
• 193 errors (leaks, uninitialized variables)
• 22 other warnings
› Estimated cost savings of $200,000
• Based on a $4,000 cost to fix a defect found in deployment vs. $100 to fix it early
 Used advanced static-analysis tool on 1.9 MLOC
› Over 3,000 defects found and repaired
 Reference:
› “Give Your Defects Some Static - Using Static Analyzers to Debug Your Code”, Greg Pope, Kim Ferrari, and Bill Oliver, Better Software, July 2008, Volume 10, Number 6, pp. 36-42, StickyMinds.com, http://www.stickyminds.com/BetterSoftware/Magazine.asp
Case Study: FDA
 Software Forensics Lab at FDA
› Studies software for medical devices, usually after a harmful incident is reported
 Case Study:
› 200 KLOC C program for old device
• Used discontinued compiler. Build system had to be reconstructed.
› Analyzed with CodeSonar
 Results
› 127 serious problems (736 reported in 16 classes)
• 29 unsafe casts
• 28 null pointer dereferences
• 36 uninitialized variables
• 20 instances of unreachable code
• 14 other
› Manufacturer was previously aware of 82 of these
› 45 were not previously known
 References
› “Flaws in medical coding can kill”, Jonathan D. Rockoff, Baltimore Sun, June 30, 2008, http://www.baltimoresun.com/news/health/bal-te.fda30jun30,0,912831.story
› “Using Static Analysis to Evaluate Software in Medical Devices”, Raoul Jetley and Paul Anderson, Embedded Systems Design, April 2008, Volume 21, Number 4, pp. 40-44, TechInsights, http://www.embedded.com/design/207000574
References
 Coding rules
› Misra C: http://www.misra.org.uk/
› JSF: http://www.research.att.com/~bs/JSF-AV-rules.pdf
› JPL 10: http://www.spinroot.com/p10/
› EC: http://www.leshatton.org/ISOC_subset1103.html
› Netrino: http://www.netrino.com
 Critiques
› Les Hatton Misra C Critique:
http://www.leshatton.org/Documents/MISRA_comp_1105.pdf
 Tools
› CodeSonar: http://www.grammatech.com/products/codesonar
 Me
› Paul Anderson, [email protected]
Backup slides
JPL “Power of 10”
1. Restrict to simple control flow constructs.
2. Give all loops a fixed upper-bound.
3. Do not use dynamic memory allocation after initialization.
4. Limit functions to no more than 60 lines of text.
5. Use minimally two assertions per function on average.
6. Declare data objects at the smallest possible level of scope.
7. Check the return value of non-void functions, and check the validity of function parameters.
8. Limit the use of the preprocessor to file inclusion and simple macros.
9. Limit the use of pointers. Use no more than one level of dereferencing.
10. Compile with all warnings enabled, and use source code analyzers.
See http://www.spinroot.com/p10/
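A small compliant sketch of my own (assuming nothing beyond the rules above), showing several rules working together: a fixed loop bound (rule 2), two assertions (rule 5), parameter and return-value checks (rule 7), and single-level pointer dereferencing (rule 9).

    #include <assert.h>
    #include <stdio.h>

    #define MAX_SAMPLES 64   /* fixed upper bound for all loops (rule 2) */

    int average_samples(const int *samples, int count, int *result)
    {
        long sum = 0;

        /* Rule 7: validate parameters; rule 5: assertions. */
        assert(samples != NULL);
        assert(result != NULL);
        if (samples == NULL || result == NULL ||
            count <= 0 || count > MAX_SAMPLES)
            return -1;

        /* Rule 2: the loop bound is fixed at compile time. */
        for (int i = 0; i < count && i < MAX_SAMPLES; i++)
            sum += samples[i];   /* rule 9: one level of dereference */

        *result = (int)(sum / count);
        return 0;
    }

    int main(void)
    {
        int data[4] = { 2, 4, 6, 8 };
        int avg = 0;

        /* Rule 7: check the return value of a non-void function. */
        if (average_samples(data, 4, &avg) != 0) {
            fprintf(stderr, "average_samples failed\n");
            return 1;
        }
        printf("average = %d\n", avg);
        return 0;
    }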
Rule Interpretation: Use the Rationale
 Clarity
› Be flexible, as clarity is a very subjective issue. Foster consistency among developers.
 Compliance
› Be flexible, but bear in mind future transition costs.
 Defense
› Be moderately flexible, but have good evidence that exceptions are justified.
 Performance
› Be very flexible, except where performance is required to enhance safety properties.
 Predictability
› Strongly avoid compromising on predictability.
 Process
› Avoid violations of these rules, as following them offers the best return on investment.
 Simplicity
› Be flexible, but bear in mind that compromise on these rules may impact testing
costs.