Supplementary Slides for Software Engineering: A Practitioner's Approach, 5/e
Copyright © 1996, 2001 R.S. Pressman & Associates, Inc.
For university use only. May be reproduced ONLY for student use at the university level when used in conjunction with Software Engineering: A Practitioner's Approach. Any other reproduction or use is expressly prohibited. This presentation, slides, or hardcopy may NOT be used for short courses, industry seminars, or consulting purposes.

Chapter 7: Project Scheduling and Tracking

Why Are Projects Late?
- an unrealistic deadline established by someone outside the software development group
- changing customer requirements that are not reflected in schedule changes
- an honest underestimate of the amount of effort and/or the number of resources required to do the job
- predictable and/or unpredictable risks that were not considered when the project commenced
- technical difficulties that could not have been foreseen in advance
- human difficulties that could not have been foreseen in advance
- miscommunication among project staff that results in delays
- a failure by project management to recognize that the project is falling behind schedule, and a lack of action to correct the problem

Scheduling Principles
- compartmentalization: define distinct tasks
- interdependency: indicate task interrelationships
- time allocation: assign each task a work unit (e.g., person-days of effort) or start and finish dates
- effort validation: be sure resources are available
- defined responsibilities: people must be assigned to each task
- defined outcomes: each task must have an output
- defined milestones: each milestone is reviewed for quality

People - Effort Relationship
The relationship between the number of people on a project and team productivity is not linear.
Example: 4 software engineers, each capable of producing 5,000 LOC/year.
Assume each communication path reduces productivity by 250 LOC/year.
Number of communication paths: 4!/(2!2!) = 6
Team productivity: 5,000 × 4 − 250 × 6 = 18,500 LOC/year

People - Effort Relationship (2)
Example 2: with 2 months remaining, 2 additional people are added.
Number of communication paths: 6!/(2!4!) = 15
Output of the 2 new staff: 2 × (5,000/12) × 2 months ≈ 1,670 LOC
Team productivity: 20,000 + 1,670 − 250 × 15 ≈ 17,900 LOC/year, which is less than the original 18,500 LOC/year
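To make the arithmetic in these two examples reproducible, here is a minimal sketch in Python of the communication-overhead model; the function names, and the assumption that every pair of engineers forms exactly one communication path, are illustrative rather than part of the original slides.

```python
from math import comb

LOC_PER_ENGINEER_PER_YEAR = 5000
LOC_LOST_PER_PATH_PER_YEAR = 250

def communication_paths(n_people: int) -> int:
    # Assumption: every pair of team members forms one communication path,
    # i.e. C(n, 2) = n! / (2! * (n - 2)!)
    return comb(n_people, 2)

def annual_team_productivity(n_people: int) -> int:
    # Raw output minus the overhead contributed by every communication path.
    return (n_people * LOC_PER_ENGINEER_PER_YEAR
            - communication_paths(n_people) * LOC_LOST_PER_PATH_PER_YEAR)

# Example 1: a stable team of 4 engineers.
print(communication_paths(4))          # 6 paths
print(annual_team_productivity(4))     # 18500 LOC/year

# Example 2: 2 people added with 2 months remaining.
new_staff_output = 2 * (LOC_PER_ENGINEER_PER_YEAR / 12) * 2   # ~1667 LOC in 2 months
paths_after = communication_paths(6)                          # 15 paths
print(4 * LOC_PER_ENGINEER_PER_YEAR + new_staff_output
      - paths_after * LOC_LOST_PER_PATH_PER_YEAR)             # ~17917, below 18500
```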
Defining Task Sets
A task set is a collection of software engineering work tasks, milestones, and deliverables. To define a task set:
- determine the type of project:
  - concept development
  - new application development
  - application enhancement
  - application maintenance
  - reengineering
- assess the degree of rigor required
- identify adaptation criteria
- compute the task set selector (TSS) value
- interpret the TSS value to determine the degree of rigor
- select the appropriate software engineering tasks

Degree of rigor: Adaptation Criteria
- size of the project
- number of potential users
- mission criticality
- application longevity
- stability of requirements
- ease of customer/developer communication
- maturity of applicable technology
- performance constraints
- embedded and non-embedded characteristics
- project staff
- reengineering factors

Degree of rigor (2): Task Set Selector
Each adaptation criterion is given a grade (1 to 5), a weight, and an entry point multiplier that depends on the project type.

  Adaptation criteria          Weight   Entry point multiplier by project type
                                        Concept  NDev.  Enhan.  Maint.  Reeng.
  Size of project              1.20     0        1      1       1       1
  Number of users              1.10     0        1      1       1       1
  Business criticality         1.10     0        1      1       1       1
  Longevity                    0.90     0        1      1       0       0
  Stability of requirements    1.20     0        1      1       1       1
  Ease of communication        0.90     1        1      1       1       1
  Maturity of technology       0.90     1        1      0       0       1
  Performance constraints      0.80     0        1      1       0       1
  Embedded/non-embedded        1.20     1        1      1       0       1
  Project staffing             1.00     1        1      1       1       1
  Interoperability             1.10     0        1      1       1       1
  Reengineering factors        1.20     0        0      0       0       1

Degree of rigor (3): Task Set Selector, example for a new application development project

  Adaptation criteria          Grade (1 to 5)   Weight   Entry point multiplier   Product
  Size of project              2                1.20     1                        2.4
  Number of users              3                1.10     1                        3.3
  Business criticality         4                1.10     1                        4.4
  Longevity                    3                0.90     1                        2.7
  Stability of requirements    2                1.20     1                        2.4
  Ease of communication        2                0.90     1                        1.8
  Maturity of technology       2                0.90     1                        1.8
  Performance constraints      3                0.80     1                        2.4
  Embedded/non-embedded        3                1.20     1                        3.6
  Project staffing             2                1.00     1                        2.0
  Interoperability             4                1.10     1                        4.4
  Reengineering factors        0                1.20     0                        0.0
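A small sketch of the TSS computation for this new-application-development example follows. The grades, weights, and multipliers come from the table above; the final combination step is an assumption (the slide stops at the per-criterion products), so the averaging over applicable criteria shown here should be checked against SEPA 5/e before relying on it. It is chosen only because it yields a value in the 1-to-5 range used by the interpretation table on the next slide.

```python
# (criterion, grade, weight, entry point multiplier for "NDev.")
CRITERIA = [
    ("Size of project",           2, 1.20, 1),
    ("Number of users",           3, 1.10, 1),
    ("Business criticality",      4, 1.10, 1),
    ("Longevity",                 3, 0.90, 1),
    ("Stability of requirements", 2, 1.20, 1),
    ("Ease of communication",     2, 0.90, 1),
    ("Maturity of technology",    2, 0.90, 1),
    ("Performance constraints",   3, 0.80, 1),
    ("Embedded/non-embedded",     3, 1.20, 1),
    ("Project staffing",          2, 1.00, 1),
    ("Interoperability",          4, 1.10, 1),
    ("Reengineering factors",     0, 1.20, 0),
]

def task_set_selector(criteria) -> float:
    # Per-criterion product = grade * weight * entry point multiplier.
    products = [grade * weight * mult for _, grade, weight, mult in criteria]
    applicable = sum(1 for _, _, _, mult in criteria if mult > 0)
    return sum(products) / applicable   # assumed averaging step, not from the slides

print(round(task_set_selector(CRITERIA), 2))   # ~2.84 -> "structured" per the next slide
```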
Degree of rigor (4): Interpreting the TSS Value

  Task set selector value   Degree of rigor
  TSS < 1.2                 casual
  1.0 < TSS < 3.0           structured
  TSS > 2.4                 strict

The overlap in TSS values illustrates that sharp boundaries are impossible to define.

Degrees of rigor:
- Casual: all process framework activities are applied, but only a minimum task set is required; umbrella tasks are minimized and documentation is reduced.
- Structured: the process framework is applied; framework activities and related tasks, plus umbrella activities such as SQA, SCM, documentation, and measurement tasks, are applied.
- Strict: the full process is applied, with all umbrella activities.
- Quick reaction: the process framework is applied, but only essential tasks are performed; documentation and reviews are conducted after delivery.

Selecting Software Engineering Tasks: Example

TASK NETWORK
- a graphic representation of the task flow for the project
- sometimes used as a mechanism through which task sequence and dependencies are input to an automated scheduling tool
- the concurrent nature of the tasks leads to a critical path: the chain of tasks that must be completed on schedule if the project as a whole is to finish on schedule

Define a Task Network

SCHEDULING
Methods: PERT (Program Evaluation and Review Technique) and CPM (Critical Path Method).
PERT and CPM are used to:
1. determine the critical path
2. establish most likely time estimates for individual tasks
3. calculate "boundary times" that define a time "window" for a particular task
Both methods are driven by a breakdown of the project into tasks, sometimes called a work breakdown structure (WBS).

SCHEDULING (2)
Important boundary times (see the sketch after the Effort Allocation slide):
1. the earliest time a task can start
2. the latest time a task can start
3. the earliest time a task can finish
4. the latest time a task can finish
5. total float: the amount of surplus time allowed so that the critical path is maintained

Effort Allocation
- 40-50%: "front end" activities (customer communication, analysis, design, review and modification)
- 15-20%: construction activities (coding or code generation)
- 30-40%: testing and installation (unit and integration testing, white-box and black-box testing, regression testing)
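The five boundary times listed on the SCHEDULING (2) slide can be computed with the standard CPM forward and backward passes over a task network. The sketch below is only an illustration under assumed data: the task names and durations are hypothetical and do not come from the slides.

```python
# Hypothetical task network (not from the slides): task -> (duration, predecessors),
# listed in topological order.
TASKS = {
    "analysis":  (3, []),
    "design":    (4, ["analysis"]),
    "coding":    (5, ["design"]),
    "test_plan": (2, ["analysis"]),
    "testing":   (3, ["coding", "test_plan"]),
}

def boundary_times(tasks):
    """Forward/backward pass computing the five boundary times for each task."""
    es, ef = {}, {}                                   # earliest start / earliest finish
    for name, (dur, preds) in tasks.items():          # forward pass
        es[name] = max((ef[p] for p in preds), default=0)
        ef[name] = es[name] + dur
    project_end = max(ef.values())

    ls, lf = {}, {}                                   # latest start / latest finish
    for name in reversed(list(tasks)):                # backward pass
        dur, _ = tasks[name]
        succs = [s for s, (_, ps) in tasks.items() if name in ps]
        lf[name] = min((ls[s] for s in succs), default=project_end)
        ls[name] = lf[name] - dur

    # Total float = LS - ES; tasks with zero float form the critical path.
    return {n: (es[n], ls[n], ef[n], lf[n], ls[n] - es[n]) for n in tasks}

for task, (ES, LS, EF, LF, total_float) in boundary_times(TASKS).items():
    print(f"{task:10s} ES={ES} LS={LS} EF={EF} LF={LF} float={total_float}")
```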
Use Automated Tools to Derive a Timeline Chart

Project tables

  Work tasks           Planned start   Actual start   Planned complete   Actual complete   Assigned person   Effort allocated   Notes
  1.1 Identify needs   Wk1, d1         Wk1, d1        Wk1, d2            Wk1, d2           BLS               2 p-d

Tracking the schedule
- conduct periodic project status meetings
- evaluate the results of all reviews
- determine whether milestones have been accomplished by the scheduled date
- compare the actual start date of each task to the planned start date
- meet informally with practitioners
- use earned value analysis to assess progress quantitatively

Tracking the schedule (2): Time Boxing
- a project scheduling and control technique used when the team faces severe deadline pressure
- the complete product may not be deliverable by the predefined deadline, so an incremental paradigm is chosen
- the tasks associated with each increment are time-boxed, i.e., the schedule for each task is adjusted by working backward from the increment delivery date
- when a task hits the boundary of its time box (±10 percent), work stops and the next task begins

EARNED VALUE ANALYSIS (EVA)
Using only actual and planned costs can mislead management and customers.
Example: a project has a duration of 10 months and a cost of $200,000 per month (total cost = $2 million).
- If the actual cost after the first 5 months is $1.3 million, is there a cost overrun of $300,000, or is the project ahead of schedule?
- If the actual cost after the first 5 months is $0.8 million, is the cost $200,000 less than expected, or is the project behind schedule?
Schedules and budgets must be tracked together against time to answer such questions.

EARNED VALUE ANALYSIS (EVA): Steps
1. Determine the BCWS (budgeted cost of work scheduled) for each task; the BCWS values form the baseline.
2. Compute the BAC (budget at completion): the sum of all BCWS values.
3. Compute the BCWP (budgeted cost of work performed): the earned value.
4. Compute the ACWP (actual cost of work performed): the actual cost.
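As a concrete illustration of the four steps above, here is a minimal sketch in Python. The task names, budgets, and completion states are hypothetical, and the derived indicators reuse the formulas given later on the EVA (2) slide.

```python
# Hypothetical work tasks: budgeted cost, actual cost spent so far,
# whether the task was scheduled to be done by now, and whether it is done.
TASKS = [
    # name,      budgeted, actual, scheduled_by_now, completed
    ("analysis",     40.0,   45.0, True,  True),
    ("design",       60.0,   70.0, True,  True),
    ("coding",       80.0,   55.0, True,  False),
    ("testing",      60.0,    0.0, False, False),
]

# Steps 1-2: BCWS for the work scheduled to date, and BAC over all tasks.
bcws = sum(b for _, b, _, sched, _ in TASKS if sched)
bac  = sum(b for _, b, _, _, _ in TASKS)

# Step 3: BCWP (earned value) -- budgeted cost of the work actually completed.
bcwp = sum(b for _, b, _, _, done in TASKS if done)

# Step 4: ACWP -- actual cost of the work performed to date.
acwp = sum(a for _, _, a, _, _ in TASKS)

# Derived indicators (formulas from the EVA (2) slide).
spi = bcwp / bcws            # schedule performance index
sv  = bcwp - bcws            # schedule variance
cpi = bcwp / acwp            # cost performance index
cv  = bcwp - acwp            # cost variance
pct_complete  = bcwp / bac
pct_scheduled = bcws / bac

print(f"BCWS={bcws} BAC={bac} BCWP={bcwp} ACWP={acwp}")
print(f"SPI={spi:.2f} SV={sv} CPI={cpi:.2f} CV={cv}")
print(f"% complete={pct_complete:.0%}  % scheduled for completion={pct_scheduled:.0%}")
```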
Example
[Figure: BCWS (baseline), BCWP (earned value), and ACWP (actual cost) plotted against project duration (periods 10-50), with cost on one axis ($100-$400) and percent complete on the other (25%-100%); at period 25, BCWS = $300 (75%), ACWP = $340 (85%), and BCWP = $200 (50%), with the gaps marked as schedule variance (SV) and cost variance (CV).]

Interpretation of Graph
- At the end of period 25, 75% of the work ($300) was scheduled to be accomplished (BCWS).
- At the end of period 25, the value of the work actually accomplished is 50%, or $200 (BCWP).
- At the end of period 25, the actual cost is $340, or 85% (ACWP).
- The cost variance shows that the project is over budget by $140 (CV = $200 - $340 = -$140).
- The schedule variance shows that the project is behind schedule (SV = $200 - $300 = -$100).

EVA (2)
- schedule performance index: SPI = BCWP / BCWS
- schedule variance: SV = BCWP - BCWS
- percent scheduled for completion = BCWS / BAC
- percent complete = BCWP / BAC
- cost performance index: CPI = BCWP / ACWP
- cost variance: CV = BCWP - ACWP

ERROR TRACKING
Defect removal efficiency: DRE = E / (E + D), where E is the number of errors found before delivery and D is the number of defects found after delivery.
Metrics:
- errors per requirements specification page, E_req
- errors per component at the design level, E_design
- errors per component at the code level, E_code
- DRE for requirements analysis
- DRE for architectural design
- DRE for component-level design
- DRE for coding
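The DRE calculation above is easy to automate per development phase. The sketch below uses hypothetical error and defect counts purely to show the formula in use.

```python
def defect_removal_efficiency(errors_before_delivery: int,
                              defects_after_delivery: int) -> float:
    """DRE = E / (E + D), as defined on the ERROR TRACKING slide."""
    e, d = errors_before_delivery, defects_after_delivery
    return e / (e + d) if (e + d) else 1.0   # nothing found at all counts as 100%

# Hypothetical per-phase counts: problems caught before delivery of the phase's
# work product (E) versus problems that slipped through and were found later (D).
phases = {
    "requirements analysis":  (12, 3),
    "architectural design":   (18, 2),
    "component-level design": (25, 5),
    "coding":                 (40, 8),
}

for phase, (e, d) in phases.items():
    print(f"DRE ({phase}) = {defect_removal_efficiency(e, d):.2f}")
```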