Common Criteria for IT Security Evaluation - SPI Analogies

Prof. Dr. Miklos Biro
Budapest University of Economic Sciences and Public Administration
[email protected]

Abstract

Analogies between complex systems are useful for helping those familiar with one of them to better understand the other, and for establishing cost- and resource-effective unified systems. This principle is applied in this paper to the ISO/IEC 15408 (Common Criteria) IT security evaluation standard, software quality evaluation standards, and the Capability Maturity Model Integration (CMMI).

Keywords

Security, Quality, Assessment, Evaluation, Capability, Maturity, Classification, Categorization

Published in: Biró, M. Common Criteria for IT Security Evaluation - SPI Analogies. In: Proceedings of the EuroSPI'2003 Conference (ed. by R. Messnarz). Verlag der Technischen Universität Graz. ISBN 3-901351-84-1, pp. IV.13-IV.21.

CMMI is registered in the U.S. Patent & Trademark Office by Carnegie Mellon University.

1 Introduction

Security is naturally present in all systems of software product quality criteria, and plays a significant role in the appropriate implementation of many software and systems engineering process areas. The development of the Information Society has lent this criterion even higher significance, attracting the attention of international standardization bodies and resulting, for example, in the ISO/IEC 15408 (Common Criteria) standard. Certification needs and the constraints of the standardization process led to flexibility in both the product standards (ISO/IEC 9126, ISO/IEC 14598) and the process methodologies (CMMI, ISO/IEC 15504), which allows for evaluation modules based on a more elaborate background (ISO/IEC 15408, ISO/IEC 12207) as well as other modules based on simpler measurements.
Even if some of the underlying standards evolved independently of each other, the discovery of analogies between their structures can contribute to the establishment of a cost- and resource-effective multiple certification process [Taylor, Alves-Foss, Rinker, 2002]. The combination of software process and product quality standards has already been studied in [Boegh, Rêgo, 2000]. In this paper we examine the analogies between the ISO/IEC 15408 (Common Criteria) standard and software quality and process capability evaluation methodologies.

2 The Common Criteria

The history of the ISO/IEC 15408 (Common Criteria, CC) standard goes back to the 1980s, with the following non-exhaustive list of milestones:

1980- TCSEC: Trusted Computer System Evaluation Criteria (USA)
1991  ITSEC: Information Technology Security Evaluation Criteria v 1.2 (France, Germany, the Netherlands, U.K.)
1993  CTCPEC: Canadian Trusted Computer Product Evaluation Criteria v 3.0
1993  FC: Federal Criteria for Information Technology Security v 1.0 (USA); CC Editorial Board
1996  CC v 1.0 ISO Committee Draft (CD)
1998  CC v 2.0 ISO Committee Draft (CD)
1999  CC v 2.1 = ISO/IEC 15408

CC v 2.1 consists of the following parts:

Part 1: Introduction and general model
Part 2: Security functional requirements
Part 3: Security assurance requirements

It is a common perception that understanding the Common Criteria (CC) evaluation process requires painstakingly inspecting multiple documents and cross-referencing innumerable concepts and definitions [Prieto-Díaz, 2002]. The first challenge is the digestion of the abbreviations, of which here is a brief extract for our immediate purposes:

TOE: Target of Evaluation — An IT product or system and its associated administrator and user guidance documentation that is the subject of an evaluation.

TSP: TOE Security Policy — A set of rules that regulate how assets are managed, protected and distributed within a TOE.
TSF: TOE Security Functions — A set consisting of all hardware, software, and firmware of the TOE that must be relied upon for the correct enforcement of the TSP.

PP: Protection Profile — An implementation-independent set of security requirements for a category of TOEs that meet specific consumer needs.

ST: Security Target — A set of security requirements and specifications to be used as the basis for evaluation of an identified TOE.

EAL: Evaluation Assurance Level — A package consisting of assurance components from Part 3 that represents a point on the CC predefined assurance scale.

Here is an illustrative list of the classes of security functional requirements discussed in Part 2 of the CC:

FAU Security audit
FCO Communication
FCS Cryptographic support
FDP User data protection
FIA Identification and authentication
FMT Security management
FPR Privacy
FPT Protection of the TOE security functions
FRU Resource utilisation
FTA TOE access
FTP Trusted path / channels

The following are classes of security assurance requirements discussed in Part 3:

ACM Configuration Management
ADO Delivery and Operation
ADV Development
AGD Guidance Documents
ALC Life Cycle Support
ATE Tests
AVA Vulnerability Assessment
AMA Maintenance of Assurance
APE Protection Profile Evaluation
ASE Security Target Evaluation

And finally, Table 1 reproduces table B.1 from Appendix B of Part 3 of CC v 2.1, which describes the relationship between the evaluation assurance levels and the assurance classes, families and components.

Table 1

3 Enlightening Analogies

The above sample from the CC naturally raises many questions whose answers would require the already mentioned inspection and cross-referencing of multiple documents comprising hundreds of pages. As an introductory alternative approach, the analogies below offer a shortcut to those who already have a basic understanding of models of software quality and process capability.

CC certification is performed after the system is developed.
In this sense, CC is closer to the software product quality evaluation standards ISO/IEC 9126 and ISO/IEC 14598, and their follow-up being developed under the acronym SQuaRE (ISO/IEC 25000, Software Quality Requirements and Evaluation). As far as the ISO/IEC 9126 standard is concerned, the classes of security functional requirements and the classes of security assurance requirements are analogous to the high-level quality characteristics, while the requirement families are analogous to the subcharacteristics. Evaluation Assurance Levels (EAL) can simply be interpreted as measurement results on an ordinal scale, analogously to measurements of subcharacteristics in ISO/IEC 9126.

A key concept of ISO/IEC 14598 is that of the evaluation module. "An evaluation module specifies the evaluation methods applicable to evaluate a quality characteristic and identifies the evidence it needs. It also defines the elementary evaluation procedure and the format for reporting the measurements resulting from the application of the techniques." It also defines its own scope of applicability. In other words, an ISO/IEC 14598 evaluation module defines a consistent set of requirements and procedures for evaluating a quality characteristic independently of the concrete product, but depending on its application environment. If we consider the concept of Protection Profile (PP) as an implementation-independent set of security requirements for a category of TOEs that meet specific consumer needs, as introduced above, we can immediately see the analogy.

Even though CC certification is performed after the system is developed, its structure shows a striking analogy with the continuous and staged representation structures of the Capability Maturity Model Integration (CMMI).
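The interpretation of EALs as points on an ordinal scale, mentioned above, can be sketched as follows. This is a minimal illustration only: the `meets` helper is ours, not part of the CC, while the level names in the comments follow Part 3 of CC v 2.1.

```python
from enum import IntEnum

# The seven predefined Evaluation Assurance Levels of CC v 2.1, modelled
# as an ordinal scale: the order is meaningful, the distances are not.
class EAL(IntEnum):
    EAL1 = 1  # functionally tested
    EAL2 = 2  # structurally tested
    EAL3 = 3  # methodically tested and checked
    EAL4 = 4  # methodically designed, tested and reviewed
    EAL5 = 5  # semiformally designed and tested
    EAL6 = 6  # semiformally verified design and tested
    EAL7 = 7  # formally verified design and tested

def meets(achieved: EAL, required: EAL) -> bool:
    """Ordinal comparison: an achieved level satisfies any lower requirement."""
    return achieved >= required

print(meets(EAL.EAL4, EAL.EAL3))  # True
print(meets(EAL.EAL2, EAL.EAL5))  # False
```

Only order-based comparisons are defensible on such a scale; averaging EALs, for instance, would have no meaning, exactly as with ISO/IEC 9126 subcharacteristic ratings.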
In order to highlight the analogy, let us consider the table in Appendix F of version 1.1 of the CMMI Continuous Representation, showing the process area capability level (CL) target profiles that make an organization's capability level profile equivalent to a maturity level (ML) defined in the Staged Representation.

Name                                        Abbr  ML
Requirements Management                     REQM  2
Measurement and Analysis                    MA    2
Project Monitoring and Control              PMC   2
Project Planning                            PP    2
Process and Product Quality Assurance       PPQA  2
Supplier Agreement Management               SAM   2
Configuration Management                    CM    2
Decision Analysis and Resolution            DAR   3
Product Integration                         PI    3
Requirements Development                    RD    3
Technical Solution                          TS    3
Validation                                  VAL   3
Verification                                VER   3
Organizational Process Definition           OPD   3
Organizational Process Focus                OPF   3
Integrated Project Management (IPPD)        IPM   3
Risk Management                             RSKM  3
Organizational Training                     OT    3
Integrated Teaming                          IT    3
Organizational Environment for Integration  OEI   3
Organizational Process Performance          OPP   4
Quantitative Project Management             QPM   4
Organizational Innovation and Deployment    OID   5
Causal Analysis and Resolution              CAR   5

(The CL1-CL5 columns of the original table indicate graphically, for each process area, the capability levels belonging to Target Profiles 2-5.)

Table 2

Let us equivalently transform this table so that the last columns contain maturity levels instead of capability levels, and the cells underneath contain the capability level of the given process area necessary for achieving the given maturity level.
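The transformation just described can be sketched as follows. This is a minimal illustration in Python; the dictionary extract and the `required_capability` helper are ours, not part of the CMMI, and encode the target-profile rule of Appendix F as reflected in the table below.

```python
from typing import Optional

# Process areas with the maturity level (ML) at which the staged
# representation introduces them (extract of CMMI v 1.1, as in Table 2).
process_areas = {
    "REQM": 2, "MA": 2, "PMC": 2, "PP": 2, "PPQA": 2, "SAM": 2, "CM": 2,
    "DAR": 3, "PI": 3, "RD": 3, "TS": 3, "VAL": 3, "VER": 3, "OPD": 3,
    "OPF": 3, "IPM": 3, "RSKM": 3, "OT": 3, "IT": 3, "OEI": 3,
    "OPP": 4, "QPM": 4,
    "OID": 5, "CAR": 5,
}

def required_capability(intro_ml: int, target_ml: int) -> Optional[int]:
    """Capability level a process area must reach to support a target
    maturity level: not required below its introduction level, CL2 for
    target profile 2, CL3 for target profiles 3 to 5."""
    if target_ml < intro_ml:
        return None   # process area not yet required at this maturity level
    if target_ml == 2:
        return 2      # target profile 2 requires capability level 2
    return 3          # target profiles 3-5 require capability level 3

# Rebuild the rows of the transformed table: one cell per level ML2..ML5.
for abbr, intro_ml in process_areas.items():
    cells = [required_capability(intro_ml, ml) or "-" for ml in range(2, 6)]
    print(f"{abbr:5} {cells}")
```

Running the loop reproduces the cell contents of the transformed table row by row.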
Name                                        Abbr  ML  ML1  ML2  ML3  ML4  ML5
Requirements Management                     REQM  2   -    2    3    3    3
Measurement and Analysis                    MA    2   -    2    3    3    3
Project Monitoring and Control              PMC   2   -    2    3    3    3
Project Planning                            PP    2   -    2    3    3    3
Process and Product Quality Assurance       PPQA  2   -    2    3    3    3
Supplier Agreement Management               SAM   2   -    2    3    3    3
Configuration Management                    CM    2   -    2    3    3    3
Decision Analysis and Resolution            DAR   3   -    -    3    3    3
Product Integration                         PI    3   -    -    3    3    3
Requirements Development                    RD    3   -    -    3    3    3
Technical Solution                          TS    3   -    -    3    3    3
Validation                                  VAL   3   -    -    3    3    3
Verification                                VER   3   -    -    3    3    3
Organizational Process Definition           OPD   3   -    -    3    3    3
Organizational Process Focus                OPF   3   -    -    3    3    3
Integrated Project Management (IPPD)        IPM   3   -    -    3    3    3
Risk Management                             RSKM  3   -    -    3    3    3
Organizational Training                     OT    3   -    -    3    3    3
Integrated Teaming                          IT    3   -    -    3    3    3
Organizational Environment for Integration  OEI   3   -    -    3    3    3
Organizational Process Performance          OPP   4   -    -    -    3    3
Quantitative Project Management             QPM   4   -    -    -    3    3
Organizational Innovation and Deployment    OID   5   -    -    -    -    3
Causal Analysis and Resolution              CAR   5   -    -    -    -    3

Table 3

The analogy between Table 1 and Table 3 is immediately apparent if we consider the following equivalence of the concepts of the Common Criteria and of CMMI:

Common Criteria                           CMMI
Assurance Family                          Process Area
Evaluation Assurance Level (EAL)          Maturity Level
Assurance value                           Capability Level
Classification of Security Requirements   Categorization of Process Areas

Table 4

This analogy not only helps those already familiar with CMMI to better understand the Common Criteria, but also provides a new perspective on CMMI itself.

4 Conclusion

The analogies discovered between the complex standards and methodologies described in this paper help those familiar with one of the systems of concepts to better understand the other on the one hand, and contribute to the potential establishment of a cost- and resource-effective multiple certification process on the other.
5 Literature

Biró, M.; Tully, C. The Software Process in the Context of Business Goals and Performance. In: Better Software Practice for Business Benefit (ed. by R. Messnarz, C. Tully). IEEE Computer Society Press, Washington, Brussels, Tokyo, 1999. ISBN 0-7695-0049-8.

Biró, M.; Messnarz, R. Key Success Factors for Business Based Improvement. Software Quality Professional (ASQ, American Society for Quality), Vol. 2, Issue 2 (March 2000), pp. 20-31. (http://www.asq.org/pub/sqp/past/vol2_issue2/biro.html)

Boegh, J.; Rêgo, C. M. Combining software process and product quality standards. Presented at the 2nd World Conference on Software Quality, Japan, September 2000.

Prieto-Díaz, R. Understanding the Common Criteria Evaluation Process. Commonwealth Information Security Center Technical Report CISC-TR-2002-003, September 2002.

Taylor, C.; Alves-Foss, J.; Rinker, B. Merging Safety and Assurance: The Process of Dual Certification for Software. In: Proc. Software Technology Conference, March 2002.

6 Author CV

Dr. Miklós Biró

Dr. Miklós Biró ([email protected]) is a professor at the Department of Information Systems of the Budapest University of Economic Sciences and Public Administration, with 26 years of software engineering and university teaching experience (including professorship in the USA), and 16 years of management experience. He has a Ph.D. in mathematics (operations research) from the Loránd Eötvös University in Budapest, an Executive MBA (Master of Business Administration) degree from ESC Rouen, France, and a Master of Science in Management degree from Purdue University, USA. He is fluent in Hungarian, English, and French. He is a Bootstrap and SPICE (Software Process Improvement and Capability dEtermination - ISO/IEC 15504) assessor. He gives Ph.D. courses and company training courses on software quality management and on the Capability Maturity Model Integration (CMMI - service mark of Carnegie Mellon University, USA).
He initiated and managed the Hungarian participation in numerous European multinational projects and organisations committed to software process improvement (European Software Institute, Bootstrap Institute). He was the initiator and head of the Information Society Technologies Liaison Office in Hungary for the European Union's 5th Framework Programme. He is invited as an expert consultant by Hungarian and international organizations (European Commission; Irish National Policy and Advisory Board for Enterprise, Trade, Science, Technology & Innovation (Forfás); Communications Authority of Hungary; Hungarian Committee for Technological Development; Investment and Trade Development Agency of Hungary; Hungarian Airlines; United Nations Industrial Development Organization (UNIDO); International Software Consulting Network; ...). He has numerous publications in international scientific and professional journals (Software Process Improvement and Practice, Software Quality Professional, Software Process Newsletter, European Journal of Operational Research, Zeitschrift für Angewandte Mathematik und Mechanik, Optimization, Information Processing Letters, Discrete Mathematics, Journal of Advanced Transportation, Acta Cybernetica) and conference proceedings. He is the co-author of Hungarian and English language books on operations research models, software engineering, software process improvement and business motivations. He is a member of the Editorial Board of the journal Software Process Improvement and Practice published by John Wiley & Sons, and founding president of the professional division for Software Quality Management of the John von Neumann Computer Society. He is the Hungarian member of Technical Committee 2 (TC-2), Software: Theory and Practice, of the International Federation for Information Processing (IFIP). He is a member of several other professional bodies and societies.