EMI INFSO-RI-261611

Software Metric Definitions, Reports and Analysis in EMI
Authors: Eamonn Kenny (TCD), Gianni Pucciani (CERN)
Date: Tuesday 12th April 2011

2  Why do we need metrics?
• To measure the quality and quantity of the software.
• To encourage conformance between the 4 middleware groups.
• To highlight previously overlooked problems (e.g. using static analysers).

3  Why do we need metrics?
• To measure:
  – Performance: e.g. how long does it take to fix a bug?
  – Support: e.g. how many bugs were fixed?
  – Complexity: e.g. how complex is a component? E.g. Cyclomatic Complexity and SLOC.
• EMI is funded to provide quality software and to measure its quality.
• With metrics one can see trends and understand what needs to be improved and how many defects a component has.

4  Metrics Role in EMI
[Diagram: the role of metrics in EMI SA2 — SA2.1 Coordination, SA2.2 QA Plan, SA2.3 Metrics (definitions & analysis, metrics area leaders), SA2.4 Tools (tool evaluation, report generation), SA2.5 QA Review — linked to the PEB, the weekly EMT meetings, JRA1, SA1 Quality Control, and the Product Teams (PTs), which contribute their existing middleware metrics and top 3 metrics.]

5  Model for Metric Collection
• Practical metrics, not a theoretical set!
• Governed by the end-goals of the project: an end-goal suggests a question, which leads to a metric.

6  Metric Template
Each metric is described using a common template with the following fields:
• Id (e.g. PriorityBugs)
• Name
• Description
• Measurement
• Goals
• Input(s) & units (e.g. time range in days)
• Output(s) & units (e.g. average time in hours)
• Quality factor (follows the McCall factors)
• Risks
• Scope
• Special notes
• Thresholds/target value
• Tools
• Availability (per-middleware availability)
• Calculation (mathematical formula)

7  Metrics Description Example

8  Metric Categories
Areas of coverage and the types of metrics in each:
• Process management (bug-tracking related): priority, severity, open/closed bugs, state changes in bugs, improving turnaround times.
• External: quality in use (bug-tracking/EGI): 3rd-level GGUS-related metrics (KPIs for SA1).
• Product Team software – static analysers: language-specific analysers, SLOC, CLOC, Cyclomatic Complexity, etc.
• Product Team software – testing, platforms, etc.: unit tests, supported platforms, bug density.
• Optional add-on tools: Valgrind/Helgrind (memory-leak and thread checking).

9  Bug-tracking Process
Suggested metrics attached to the bug workflow:
• Untouched Open Bugs (DSA1.1 requirement)
• Time To Fix a Bug (SA1-QC)
• Open Priority Bugs
• Average Time To Close a Bug (PriorityBugs, BugSeverityDistribution)
Workflow: Open → Is it accepted? (NO → Rejected; YES → Accepted) → Fixed → Can I test it? / Test successful?
→ Tested or Not Tested → Closed.

10  Handling Multiple Bug-Trackers
Each middleware uses its own bug-tracker:
• ARC: Bugzilla
• dCache: Request Tracker (RT)
• gLite: Savannah
• UNICORE: SourceForge
A daily snapshot of each tracker is taken (Middleware Bugtracking); a Middleware BugMapping (which needs periodic intervention) maps them onto one schema, exposed as XML to ETICS (can be run at any time).
Multiple bug-trackers, one common interface.

11  SQAP/Bug-tracking Metrics
(The original slide is a table marking each metric as designed and/or implemented; the tick marks are not recoverable here.)
• Open Priority Bugs (High/Immediate)
• Successful Builds metric
• Open Untouched Bugs (> 14 days)
• Fixed Bugs
• Priority Bugs (High/Immediate)
• Bug Severity Distribution
• Backlog Management Index
• Integration test effectiveness metric
• Delay on release schedule metric
• Up-to-date documentation metric
Recipients: EMT/reports, SA1-QC, periodic reports/deliverables/Product Team reports.

12  Fixed Bugs (SA1-QC)
Required for assessing the number of regression tests produced in association with each fixed bug.

13  Open Untouched Bugs (weekly EMT)
• Report from 10th March 2011.
• Inter-quartile ranges used in the visualisation.

14  Successful Builds Metric (for EMT)
• Highlights product teams having failures due to their own internal issues.
• Highlights product teams causing other product teams to fail.

15  Backlog Management Index
Tracks whether product teams have an increasing or decreasing backlog of bugs/defects.
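The slides do not spell out the formula for the Backlog Management Index; it is conventionally defined as bugs closed in a period divided by bugs opened in the same period, expressed as a percentage (above 100% the backlog is shrinking). A minimal sketch using that common definition — the function name and the weekly counts are hypothetical, not EMI tracker data:

```python
def backlog_management_index(closed: int, opened: int) -> float:
    """BMI = bugs closed in a period / bugs opened in the same period, as a %.

    BMI > 100 means the backlog is shrinking; BMI < 100 means it is growing.
    """
    if opened == 0:
        raise ValueError("no bugs opened in this period; BMI is undefined")
    return 100.0 * closed / opened

# Hypothetical weekly (closed, opened) counts for one product team
weekly = [(12, 10), (8, 14), (15, 15)]
for closed, opened in weekly:
    print(f"BMI = {backlog_management_index(closed, opened):.1f}%")
```

Plotted week by week, this gives exactly the increasing/decreasing backlog trend the slide describes.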
16  Product-related Metrics
(As above, the original slide marks each metric as designed and/or implemented; recipients are reports, deliverables and periodic PT reports.)
• Unit test coverage metric
• Number of supported platforms
• Total bug density
• Bug density per release
• Cyclomatic complexity
• C/C++ metrics: CCCC, cppcheck
• Java metrics: FindBugs, PMD, Checkstyle
• Python metrics: pylint
• Code commenting metrics

17  Static Analysers: Java – FindBugs (PT)
• Project overview: results per Product Team.
• Finer-grained reporting for Product Teams.

18  Source Lines of Code
Source lines of code is particularly important when assessing the Bug Density Distribution (i.e. open bugs per lines of code).

19  More Static Analysers

20  Current Status
• The metrics are useful and we defined them for the whole of EMI:
  – There is now a common framework for producing reports for the EMT, deliverables and Product Teams (PTs).
  – The reporting structure is interpretable at the project, activity and Product Team levels.
  – Static analysers produce reports highlighting problems previously unseen by developers.

21  Conclusions
• We now have an objective way to measure performance, complexity and support.
• EMI can provide quality software and measure its quality.
• With the current metric reports one can see trends and understand what needs to be improved and how many defects a component has.

22  Future Work
• A few important metrics are currently being implemented.
• The metrics will be reassessed and evaluated after the EMI-1 release.
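The bug-density idea from the Source Lines of Code slide (open bugs per lines of code) can be sketched as follows; the component names and figures are hypothetical illustrations, not EMI data, and density is normalised to thousands of lines (KSLOC) only as an assumed convention:

```python
def bug_density_per_ksloc(open_bugs: int, sloc: int) -> float:
    """Open bugs per thousand source lines of code (KSLOC)."""
    if sloc <= 0:
        raise ValueError("SLOC must be positive")
    return 1000.0 * open_bugs / sloc

# Hypothetical components: (name, open bugs, SLOC)
components = [("componentA", 12, 48000), ("componentB", 3, 6000)]
for name, bugs, sloc in components:
    print(f"{name}: {bug_density_per_ksloc(bugs, sloc):.2f} bugs/KSLOC")
```

Note how normalising by size changes the ranking: componentB has far fewer open bugs in absolute terms but a higher density, which is the point of combining the SLOC and bug-tracking metrics.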