Project Plan

Group name: TPlan
Authors: Ioana Matei (PM), Irja Rautio (PM), Jukka Pitkänen, Reko Linko, Riina Pakarinen, Valtteri Pihlajamäki
Client: Computer Centre, Tampere University
Representative for client: Names removed
Change date: 17.11.03
Version: 1.0.0
Reviewed: 17.11.03

Contents

1 Introduction
1.1 Project scope
1.2 Major software functions
1.3 Performance and behaviour issues
1.4 Management and technical constraints
2 Project Estimates
2.1 Estimation techniques applied and results
2.1.1 COCOMO model
      Formula
      Effort adjustment factor
2.1.2 Use case points approach
2.1.3 Results
2.2 Reconciled Estimate
2.3 Project Resources
2.3.1 Human resources
2.3.2 Other resources
3 Risk management
3.1 Project Risks
3.2 Risk Table
3.3 Overview of risk mitigation, monitoring and management
4 Project schedule
4.1 Project task set
4.1.1 First half: Project plan and requirement specification
4.1.2 Second half: Implementation, testing and finalization
4.2 Functional decomposition (work breakdown)
4.3 Task network
4.4 Timeline chart
5 Staff Organization
5.1 Team structure
5.2 Organization structure
5.3 Management reporting and communication
6 Tracking and Control Mechanisms
6.1 Quality assurance and control
6.2 Change management and control
7 Appendix

1 Introduction

The project request came from the Computer Centre of the University of Tampere. Its aim is to develop and improve the current process for handling teachers' work plans.

Background

At the end of the spring term, each teacher at the University of Tampere needs to fill in his or her work plan for the next academic year. This process is based on a lot of manual work and Excel files:
- the teacher downloads the work plan form (in Excel format) from the University web pages
- based on the previous work plan and future work activities, the teacher fills in the form
- the work plan is printed in 3 copies and signed
- the Department Secretary collects all the teachers' work plans, gives one copy to the Head of Department and stores one copy
- after discussions between the teacher and the Head of Department, the work plan is finalized and sent to the Department Council (the third copy)
- the Department Council discusses and approves the work plan
- the information from the work plan is entered into the HeVi system
- the Accounting and HR departments use this information for calculation and reporting purposes.

Figure 1.1. The old process of the teacher's work plan.

New system

TPlan will be pilot software for decreasing manual work and improving the teachers' work plan process. Teachers will be able to fill in their work plan form online. Once a teacher decides to save it, the data will be saved into the TPlan database. Printing capabilities will be provided to everyone who has access to the work plan. After approval from the Department Head and Council, the data will be ready to be transferred to the HeVi system. All along, reporting features will be available to all people involved in the process.

Figure 1.2. The conceptual model of the new process of the teacher's work plan.

1.1 Project scope

The main scope of the project is to redesign and improve the teachers' work plan part of the HeVi system. TPlan is prototype software, which will later be integrated with the HeVi system as the teachers' work plan module. It will provide a pilot system for the current process and an improved architecture for the teacher module. TPlan will be a starting point for a future teacher work plan and working-hours monitoring system.
In the scope:
- a teacher WWW form for filling in the work plan
- direct and instant access to the work plan for modifications
- approval methods for the department head and council (if needed, based on user authentication)
- design of an interface for transferring data to HeVi's work plan part and to cost accounting
- a reporting system for all the people involved in the process
- getting user feedback and, based on it, designing an improved teacher work plan system which will cover all the user requirements and needs
- finding out all users and user expectations based on interaction with TPlan, and adding the results to the specification document
- taking into consideration a larger scope of the project (i.e. a follow-up work plan)

Out of the scope:
- the actual interface with the HeVi system will not be implemented
- the approval part will be based only on user authentication; no other approval system will be implemented
- reporting features will be implemented according to the user specification and the project scope; no additional reporting will be done after the specification phase closes.

1.2 Major software functions

The TPlan functions will be grouped into modules, which will be used for time, effort and cost estimation:
- web interface design: teacher work plan form, reporting forms
- WWW programming
- database design and implementation: interface with the HeVi system, interface with the cost accounting and HR systems
- web authentication: user login and approval

1.3 Performance and behaviour issues

The user interface of the system has to be easy to use, and the interfaces to HeVi and other external connections have to be flexible. The WWW form should be easy to modify.

1.4 Management and technical constraints

The project has constraints concerning report deadlines, the last delivery date and the available working hours. Constraints about the license are in the contract (GPL). There are some technical constraints on the implementation phase: the database has to be PostgreSQL, the web programming has to be done with XML and PHP, and the web interface has to be compatible with common web browsers. There is no constraint concerning the platform.

2 Project Estimates

2.1 Estimation techniques applied and results

For estimating the cost, effort and schedule of this project we used the COCOMO model and the use case points approach.

The COCOMO model (constructive cost model) offers formulas for estimating the effort and time of a project. The formulas use constants and project-dependent values. The constants are originally based on averages from a database of 56 projects; the project-dependent values characterize each project. In this project we use the semidetached mode, which means that the project lies between a tight and a loose project and has features of both modes. The project is tight because of the fixed deadlines and limited working hours. On the other hand, the project is loose because it is a pilot project and we can develop the existing process instead of only automating it. With the COCOMO model we got the following values: effort 34 staff-months and time 8.5 months. These values include only development, not support. That is about 215 working hours for each team member.

We also used another estimation technique, the use case points approach. We wanted to use more than one method to find out how much the results vary between different approaches; the average of the results gives more accurate values than a single technique. The use case points approach uses the project's use cases, in the role of function points, as the requirements of the project.
It consists of five steps: classifying the use cases (simple, medium, complex); giving a certain factor for each type; computing the technical complexity factor based on weights and our own evaluation; computing the environmental factor; and finally counting the amount of working hours with a certain formula. With use case points we got the following value: about 182 working hours for each team member.

2.1.1 COCOMO model

Formula

In estimating with the semidetached mode we used the following formulas:

Effort = C1 × EAF × (Size)^P1
Time = C2 × (Effort)^P2

where:
Effort = number of staff-months
C1 = a constant scaling coefficient for effort; in semidetached mode 3.0
EAF = an effort adjustment factor that characterizes the domain, personnel, environment and tools used to produce the artefacts of the process
Size = the size of the end product (in human-generated source code), measured by the number of delivered source instructions (DSI) required to develop the required functionality
P1 = an exponent that characterizes the economies of scale inherent in the process, i.e. the ability to avoid non-value-adding activities (rework, bureaucratic delays, communications overhead); in semidetached mode 1.12
Time = the total number of months; a staff-month consists of 152 hours
C2 = a constant scaling coefficient for schedule; in semidetached mode 2.5
P2 = an exponent that characterizes the inherent inertia and parallelism in managing a software development effort; in semidetached mode 0.35

In semidetached mode the formulas look as follows:

Effort = 3.0 × EAF × (Size)^1.12
Time (in months) = 2.5 × (Effort)^0.35

Size: The amount of code lines, without comments, in units of a thousand code lines, is based on our evaluation of the functionality of the program and on the use of Java as the programming language. We estimated the amount of code to be 8 KDSI.

Effort adjustment factor

The effort adjustment factor, EAF, is based on our estimate of the effect of each factor. The value 1.1 consists of the default value (1) plus the average gap (0.1) between the estimated values and the default value. The effort adjustment factors and their effects are listed in the following table.

Table 2.1. Effort adjustment factors.

Identifier | Effort adjustment factor | Setting | Effect
Product attributes
RELY | Required reliability | Nominal | 1.0
DATA | Database size | Nominal | 1.0
CPLX | Product complexity | Nominal | 0.8
Hardware attributes
TIME | Execution time constraint | Nominal | 1.0
STOR | Main storage constraints | Nominal | 1.0
VIRT | Virtual machine volatility | Nominal | 1.0
TURN | Computer turnaround time | Nominal | 1.0
Personnel attributes
ACAP | Analyst capability | Nominal | 1.0
AEXP | Applications experience | Nominal | 1.0
PCAP | Programmer capability | Nominal | 1.0
VEXP | Virtual machine experience | High | 1.15
LEXP | Language experience | Nominal | 1.0
Project attributes
MODP | Use of modern practices | Nominal | 1.0
TOOL | Use of software tools | Nominal | 1.0
SCED | Required development schedule | High | 1.15

Comments:
RELY – only nominal, because our software does not handle, for example, money flows directly. Errors can easily be corrected before the data goes further.
CPLX – product complexity is lower than normal because this pilot program covers only the main tasks.
SCED – high, because the deadlines are already fixed without knowing the exact extent of the project.
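To make the arithmetic above easy to check, the following small Java sketch (an illustration only, not a TPlan deliverable) computes the semidetached COCOMO estimate with the constants and inputs used in this plan: EAF = 1.1, size = 8 KDSI, a team of 6 and 152 hours per staff-month.

// Illustrative sketch only: semidetached COCOMO estimate with the values used in this plan.
public class CocomoEstimate {
    public static void main(String[] args) {
        double c1 = 3.0, p1 = 1.12;   // effort coefficient and exponent, semidetached mode
        double c2 = 2.5, p2 = 0.35;   // schedule coefficient and exponent, semidetached mode
        double eaf = 1.1;             // effort adjustment factor (Table 2.1)
        double sizeKdsi = 8.0;        // estimated size in thousands of delivered source instructions

        double effortStaffMonths = c1 * eaf * Math.pow(sizeKdsi, p1);  // about 34 staff-months
        double scheduleMonths = c2 * Math.pow(effortStaffMonths, p2);  // about 8.5 months
        double hoursPerMember = scheduleMonths / 6.0 * 152.0;          // roughly 215-218 hours per member

        System.out.printf("Effort: %.1f staff-months%n", effortStaffMonths);
        System.out.printf("Schedule: %.1f months%n", scheduleMonths);
        System.out.printf("Hours per team member: %.0f%n", hoursPerMember);
    }
}

Running the sketch reproduces the 34 staff-month and 8.5-month figures quoted below; small differences in the per-person hours come only from rounding.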
Results

Based on the constants of the semidetached mode and the values above, we get the following results:

Effort = 3.0 × 1.1 × (8)^1.12 ≈ 34
Time (in months) = 2.5 × (34)^0.35 ≈ 8.5

(The formulas and the table are based on the book Software Project Management: A Unified Framework, Walker Royce, 1998.)

2.1.2 Use Case Points Approach

The results of the use case points approach are based on three values: the analysis of the use cases, the technical factors and the environmental factors. These are computed by going through the following tables.

Use cases

The use cases are at the moment quite general, but they give information about the length of the project. The classification of the use cases is based on project meetings; later, more accurate cases will be based on interviews with the end users.

Table 2.2. Use cases.

No | Description | Complexity
1 | Navigate to the right form | Simple
2 | Fill form | Complex
3 | View summary report | Medium
4 | View report of last year | Medium
5 | Update form | Complex
6 | Work plan approval | Complex

The complexity is based on the number of transactions needed when performing the use case. A simple use case has 3 or fewer transactions, a medium one has 4–7 transactions and a complex one has more than 7 transactions. A transaction is defined as an atomic set of activities that is either performed entirely or not at all.

Build effort

The build effort for each use case type and the factors for calculating the unadjusted use case points (UUCP):

Table 2.3. Build effort.

Use case type | Factor | Number of units | Effort (per use case, in person-days) | Total build effort (person-days)
Simple use cases | 5 | 1 | 1 | 1
Medium use cases | 10 | 2 | 6 | 6
Complex use cases | 15 | 2 | 10 | 10

Based on the table above we can count the following values:

UUCP = 1 × 5 + 2 × 10 + 2 × 15 = 55

The result is based on the number of case units and the constant factors of this approach.

Total build effort: 1 + 12 + 10 = 16 days. The result is based on the number of case units and the days used for each use case type.

Technical factors

The technical factors and their weights are based on the common factors and constant weights of the method. We rated each factor with a value from 0 to 5, from irrelevant to essential, from the project's point of view.

Table 2.4. Technical factors and weights.

No | Factor | Weight | Rating
1 | Distributed system | 2 | 1
2 | Response or throughput performance objectives | 1 | 1
3 | End-user efficiency (online) | 0.5 | 2
4 | Complex internal processing | 0.5 | 1
5 | Code must be reusable | 1 | 0.5
6 | Easy to install | 1 | 0.5
7 | Easy to use | 1 | 4
8 | Portable | 0.5 | 2
9 | Easy to change | 0.5 | 0.5
10 | Concurrent | 2 | 3
11 | Includes special security features | 1 | 0.5
12 | Provides direct access for third parties | 1 | 2.5
13 | Special user training facilities required | 1 | 0.5

Comments on the rating factors: While reading the report that included interviews with end users, we noticed that they emphasized the usability of the programs. That is why we put more emphasis on the easy-to-use factor. Concurrency is rated higher than average because it is important to give access to many users and other programs at the same time. Direct access for third parties is relevant because this program is going to be pilot software and a part of a bigger system; that is why we need to pay special attention to possible interfaces for other programs.

Based on the table above we can count the following values:

TFactor = 18.25

TFactor (the technical factor) is the sum of the weights multiplied by the ratings.

TCF = 0.6 + (0.01 × 18.25) = 0.78

The technical complexity factor (TCF) is based on a constant formula and TFactor.

Environmental factors

The environmental factors are based on the common factors and constant weights of the method.
We rated each factor with a value from 0 to 5, from no experience/motivation to expert/high motivation.

Table 2.5. Environmental factors for the team and weights.

No | Factor | Weight | Rating
1 | Familiar with Internet process | 1.5 | 3
2 | Application experience | 0.5 | 3
3 | Object-oriented experience | 1 | 3
4 | Lead analyst capability | 0.5 | 3
5 | Motivation | 1 | 3
6 | Stable requirements | 2 | 1
7 | Part-time workers | -1 | 5
8 | Difficult programming language | -1 | 2

Comments on the rating factors: Part-time working is rated high, as most of us are part-time workers from the project's point of view. Stable requirements is rated low because the requirements are still not clarified.

Based on the table above we got the following values:

EFactor = 8

EFactor (the environmental factor) is the sum of the weights multiplied by the ratings.

EF = 1.4 + (-0.03 × 8) = 1.16

The environmental complexity factor (EF) is based on a constant formula and EFactor.

To the three values counted above (unadjusted use case points, the technical factor and the environmental factor) we apply the following formula to get the final result:

UCP = UUCP × TCF × EF = 55 × 0.78 × 1.16 = 49.764

(The formulas and tables are based on the book Software Project Management: A Unified Framework, Walker Royce, 1998.)

2.1.3 Results

The results of the estimation:

COCOMO model
Effort = 3.0 × 1.1 × (8)^1.12 ≈ 34 staff-months
Time (in months) = 2.5 × (34)^0.35 ≈ 8.5
When the months are divided among the six project members and converted to hours, we get 8.5 / 6 × 152 ≈ 215 hours for each person.

Use case points approach
UCP = 49.764
One UCP unit is 20–28 hours, depending on whether the ratings are near the average value 3. In this case most of them are under 3, so we use 22 hours per unit. Then we get 49.764 × 22 / 6 ≈ 182 hours for each person.

The use case points approach gives smaller values than the COCOMO model. A reason for the difference is probably that in the use case points approach not all use cases are listed before the interviews are made.

2.2 Reconciled Estimate

Both estimates give roughly the same amount of working hours. We decided to use the larger amount of hours as a guideline, because the second estimate could change when more use cases are found.

Table 2.6. Time estimation comparison.

Phase | Standard estimation | TPlan estimation | Reasons
Planning & design | 15–20% | 35% | The TPlan design phase is more important for this project than the actual implementation. The main scope of the project is to get user feedback based on the prototype and to prepare for a new, improved system.
Coding | 50% | 45% | Less coding is needed for a prototype.
Testing | 30% | 20% | Testing of the prototype will not take much effort, but we will focus on user feedback.

Please refer to Appendix C for the detailed task list and durations.

2.3 Project Resources

2.3.1 Human resources

Please refer to Chapter 5, Staff Organization, section 5.1 Team structure and section 5.2 Organization structure.

2.3.2 Other resources

Software for users:
- net connection
- Internet browser; Internet Explorer 6.0 or similar browsers are supported

Tools, software and hardware for the project:
- PostgreSQL database
- access to the database during testing
- if testing is needed while the program is in use, a copy of the original database is needed as a test database
- Java support from the server
- backup for the database
- a project server for sharing documentation and code

3 Risk management

3.1 Project Risks

The project can face many kinds of risks. The main risk types are:
- technology risks
- human risks
- organizational risks
- client risks
- tool risks
- demand risks
- estimation risks

Technology risks
- The new HeVi system architecture is not clearly defined.
- The TPlan requirements are not clearly defined.
- Technical restrictions: hardware resources not available (e.g. web server, database server), software constraints, test environment constraints.

Human risks
- English as the working language can cause problems.
- This project is the team's first assignment and we could face some skills issues, which will be hard to overcome.
- Illnesses, lack of motivation, ability to participate in meetings.

Client risks
- Too many end users, which will lead to difficult standardisation.
- Will the end users use our system as planned?
- The users will face the challenge of using a new system.
- The users' availability in giving feedback during the testing phase.

Tool risks
- The implementation plan may not be accurate.
- Real test data provided by the client is not ready.
- Creating a work plan template which will satisfy every user requirement.
- New client requirements during the implementation phase can cause serious delays in delivering the final version.

Time estimation risks
- The project has fixed deadlines, no possible negotiations.
- The testing of use cases can take more time than reserved.

3.2 Risk Table

Each risk has a probability, seriousness, observability (ability to be observed), mitigation steps and effect.

Probability is rated on a scale from 0 to 4: 0 = very low, 1 = low, 2 = fair, 3 = high, 4 = very high.

Seriousness is categorised into four categories: Catastrophe, Serious, Tolerable, Trivial.

Observability is categorised into five categories: Very uncertain, Low, Fair, High, Almost certain.

Effect is categorised into long-term effect and short-term effect.

Table 3.1. Risks met and likely to be met in the project.

Technology risks

Risk: The new HeVi system architecture is not clearly defined.
  Seriousness: Serious. Effect: long-term. Probability: 3. Observability: Fair.
  Mitigation steps: Meetings with the client and weekly meetings with the team.

Risk: The TPlan requirements are not clearly defined.
  Seriousness: Serious. Effect: long-term. Probability: 3. Observability: Fair.
  Mitigation steps: Make sure that human resources are available from the client side. Interview questions should be planned beforehand.

Risk: Technical restrictions.
  Seriousness: Tolerable. Effect: long-term. Probability: 3. Observability: Almost certain.
  Mitigation steps: We should define the test environment properly in the test plan and use platform-independent techniques.

Human risks

Risk: English as the working language can cause problems.
  Seriousness: Trivial. Effect: short-term. Probability: 2. Observability: High.
  Mitigation steps: Finnish can also be used as an alternative for explanation purposes.

Risk: This project is the team's first assignment and we could face some skills issues, which will be hard to overcome.
  Seriousness: Trivial. Effect: short-term. Probability: 2. Observability: Low.
  Mitigation steps: Use tools and methods already familiar to the group members. Solutions: learning from each other, minimizing the need for unfamiliar methods, dividing the tasks to the right people.

Risk: Illnesses, lack of motivation, ability to participate in meetings and so on.
  Seriousness: Tolerable. Effect: short-term. Probability: 2. Observability: Fair.
  Mitigation steps: Communication methods: weekly meetings, emails and the web page. Try to motivate the project group and provide support when necessary.

Client risks

Risk: Too many end users.
  Seriousness: Tolerable. Effect: long-term. Probability: 2. Observability: Fair.
  Mitigation steps: We should discuss with the client and the project owner and set up a group of end users who will clarify the requirements and perform the testing.

Risk: Will the end users use our system as planned?
  Seriousness: Tolerable. Effect: long-term. Probability: 3. Observability: Very uncertain.
  Mitigation steps: Manuals and support should be delivered with the product. End users test the prototype. Information about the correct usage of the product should be given.

Risk: The users will face the challenge of using a new system.
  Seriousness: Tolerable. Effect: long-term. Probability: 2. Observability: Low.
  Mitigation steps: Accurate training for the users. Usability tests will be performed so that the system will be user-friendly.

Risk: The users' availability in giving feedback during the testing phase.
  Seriousness: Tolerable. Effect: long-term. Probability: 3. Observability: Almost certain.
  Mitigation steps: Make sure that the users are testing properly. If no feedback is available, escalate.

Tool risks

Risk: The implementation plan may not be accurate.
  Seriousness: Serious. Effect: long-term. Probability: 3. Observability: Fair.
  Mitigation steps: The implementation plan will be based on the user requirements and specifications.

Risk: Real test data provided by the client is not ready.
  Seriousness: Tolerable. Effect: long-term. Probability: 3. Observability: Almost certain.
  Mitigation steps: The TPlan pilot will not have an interface with the HeVi system. Make sure the client will provide accurate test data; if not, this will be a risk that the client accepts.

Risk: Creating a standardised base for the work plan.
  Seriousness: Serious. Effect: long-term. Probability: 3. Observability: Almost certain.
  Mitigation steps: Reviews are arranged regularly. Also discussion in the weekly meetings.

Risk: New client requirements during the implementation phase can cause serious delays in delivering the final version.
  Seriousness: Tolerable. Effect: long-term. Probability: 3. Observability: Almost certain.
  Mitigation steps: Once the specifications are ready, only minor changes will be allowed. Those changes should not impact the project timetable by more than 3 days of work.

Time estimation risks

Risk: The project has fixed deadlines, no possible negotiations.
  Seriousness: Tolerable. Effect: short-term. Probability: 3. Observability: Very high.
  Mitigation steps: If the deadlines need to be changed, we will have a meeting with the client (c1) and the owner (Isto).

Risk: The testing of use cases can take more time than reserved.
  Seriousness: Tolerable. Effect: short-term. Probability: 3. Observability: High.
  Mitigation steps: Work estimations are followed and updated if needed.
3.3 Overview of risk mitigation, monitoring and management

Mitigation of risks
Mitigation of risks is easier if we follow the most important risks (marked as Serious in the table above). The table also lists the mitigation steps that can be followed to avoid, or at least reduce, each identified risk.

Monitoring risks
Every risk in the table above should be considered carefully and tracked. Risk prioritisation and the consequent planning are based on the risk perception at the time the risk analysis is performed. Monitoring the risks will reveal the actual risks found and the methods applied to avoid them.

Risk management
Once the risks are clearly defined, the main task is to identify the actions needed to handle the risk consequences, referred to as mitigation steps in our risk table. The TPlan project managers will make sure that the identified risks are prevented and will take the necessary actions to handle the risk consequences.

Note: in the risk table there are 4 Serious risks that will be monitored carefully.

4 Project schedule

4.1 Project task set

At the moment the project is still without a strict definition, so this schedule is a draft. All of the points below will be corrected, focused and verified later during the project.

4.1.1 First half: Project plan and requirement specification

During the autumn term the TPlan team handles the Project Plan phase and the User Requirements phase.

Project Plan – due 17.11.2003
During this phase the TPlan team has the following tasks:
- understand the current system
- estimate the effort and time the team needs to finalize the project
- spot the project risks and find mitigation steps
- schedule the project
- set up the team structure and assignments according to the team members' experience
- set up a controlling and monitoring system for the project

The Project Plan review is done together with the client and the course coordinator on 13.11.2003. The steering group meeting is on 17.11.2003, when the Project Plan will be approved.
User Requirements Specification – due 8.12.2003
During this phase the TPlan team has the following tasks:
- set up meetings with the end users
- get the user requirements
- make the specification document
- meeting with the steering group on 8.12.2003

The specification includes everything the system should be capable of executing and the system limitations. Usage models (user profiles and use cases), data models (e.g. ER diagrams), functional models (process diagrams), user descriptions (including control-flow diagrams), modelling of program operation (e.g. CSPEC), testing and other required supplements (mostly diagrams) are to be specified during the requirement specification phase. The review is set for 8.12.2003.

Marketing Day is on 12.12.2003, when we will try to "sell" our project to the clients.

4.1.2 Second half: Implementation, testing and finalization

During the spring term the TPlan team will handle the implementation phase, the testing phase, and project closure and documentation. We have a Christmas holiday during the last two weeks of December and the first week of January.

Implementation phase – finalized on 2.04.2004
The TPlan team will have the following tasks:
- make the implementation plan based on the user requirements; the implementation plan is reviewed on 16.02.2004
- implement the TPlan pilot software

Coding ends at the beginning of April, and the last code review will be on 2.04.2004. … be planned for the implementation plan review, March 15th.

Testing phase – finalized on 24.04.2004
The TPlan team will have the following tasks:
- make the test plan based on the specifications
- divide the testing into test cases according to the system design and implementation
- modify the TPlan software according to the test case results
- get user feedback and, based on it, make an overall system specification for the future

Coding and testing will include internal testing done during coding and for 2 weeks after the coding phase ends, and BAT testing for … weeks.

Project closure
The TPlan team will have the following tasks:
- revise the estimations
- prepare the Project Closure report
- make the User Guide and the Technical Documentation
- deliver the product
- training, if necessary

Delivery will be done in the first week of May.

4.2 Functional decomposition (work breakdown)

The work breakdown will be done after the requirements are specified. The system functionality will be decomposed into small pieces, which will be used for the scheduling of the project. This part of the plan needs a proper system specification from the client side and the group.

4.3 Task network

The tasks will be presented using a process or data flow chart. That will happen at the latest on 8.12.2003, when the system specification is ready.

4.4 Timeline chart

The timeline chart will be defined in more detail after the requirements are specified.
This chart is a very preliminary plan and it will be updated during the project.

TPlan schedule/tasks (timeline chart)

Autumn 2003 (October–December, weeks 43–50) – task rows in the chart:
- Project plan draft, proposal, review and correction
- Requirement specification draft, proposal, review and correction
- Weekly meeting (autumn)
- Marketing day preparation
- Marketing day

Spring 2004 (January–May, weeks 2–22) – task rows in the chart:
- Project plan correction
- Requirement specification correction
- Testing plan draft, proposal, review and correction
- Implementation plan draft, proposal, review and correction
- Coding, code review and code correction
- Testing
- Manual
- Documentation
- Weekly meeting (spring)
- Delivery day preparation
- Delivery day

The colours and patterns used in the chart indicate preparing, finalizing, reviewing and correcting work, and flexibility in the schedule.

5 Staff Organization

5.1 Team structure

The project team is the following:

Ioana Matei, [email protected], Project Manager
Irja Rautio, [email protected], Project Manager

Team members:
- Jukka Pitkänen, [email protected], responsible for: web page design, database implementation, test plan
- Reko Linko, [email protected], responsible for: implementation plan, user interface design, usability, user guide, testing
- Riina Pakarinen, [email protected], responsible for: software specifications, XML, XSLT, PHP and Java programming, testing
- Valtteri Pihlajamäki, [email protected], responsible for: technical documentation (involving UML diagrams), Java programming and testing

The roles of the group members are presented in Table 5.1. They are defined according to each person's interests and former knowledge of the subject. The person responsible coordinates the work and arranges the reviews and the documentation.

Table 5.1. Roles of the project team members.
The table maps the team members (Ioana, Irja, Jukka, Reko, Riina and Valtteri) to the following roles: Project Manager and Chairman/Secretary for the weekly meetings; Chairman/Secretary for the code reviews; Project Plan document & tracking; gathering requirements; Specifications document & tracking; Testing plan document & tracking; Implementation plan document & tracking; implementation/coding tracking; monitoring & control of all activities; testing; Java implementation; web implementation; web design; user guide/usability; XML/XSLT implementation; database design and implementation (SQL); and support for technical documentation (UML), the database, programming and all documentation. The main responsibilities of each member are listed in section 5.1 above.

5.2 Organization structure

From the project's point of view, the organization structure consists of the departments in the different faculties of the University of Tampere, the administrative centre, cost accounting and the Computer Centre.

User groups
- Teachers – fill in the work plan for the whole academic year
- Visiting teachers – fill in the work plan for part of the academic year
- Director – revises and approves the teacher's work plan
- Cost Accounting Manager – manages financial resources
- HR Manager – manages human resources, tracks the teachers' work
- HeVi system responsible – administrates the HeVi system
- Department Secretary – handles administrative tasks in the department
- Admin – the TPlan administrator

Interest groups
- Clients: names removed
- Steering group: names removed
- End users: names removed
- Other: Isto Aho, [email protected], responsible for project courses, project owner, Department of Computer Science

5.3 Management reporting and communication

The project managers do the management reporting regularly. Communication between the project team members is done in the weekly meetings, by email and through the project's web page. Communication with the steering group is done via email and meetings, usually initiated by the client or the project managers.

Project group meetings take place once a week, two hours per meeting in autumn and one hour in spring. For the autumn term the weekly meeting is scheduled on Wednesdays, 14:15 to 16:00. The weekly meeting report is written and delivered by a project manager via email, copying the project team and Isto Aho, the course teacher. We are using the template available at: http://www.cs.uta.fi/~tyisah/projektikurssit/ .

The project group web page can be found at: http://opjopt.cs.uta.fi/~tplan. This information is available to all people involved in the project. All report documents can be found on the web page.

Table 5.2. Distribution of reports and documents made by the project group.
Name of report/document | File name
Monthly report (progress) | tplan_progress_yymmdd.doc
Minutes of meeting with steering group | tplan_minutes_yymmdd.doc
Review report (steering group) | tplan_review_yymmdd.doc
Minutes of weekly meeting | tplan_weekly_minutes_yymmdd.xls
Project plan | tplan_projectplan.doc
Requirement specification | tplan_requirementspec.doc
Evaluation of specification | tplan_evalspec_nnnn.doc
Testing plan | tplan_testplan.doc
Implementation plan | tplan_implementationplan.doc
Testing report | tplan_testreport.doc
Review report (code) | tplan_code_review_yymmdd.doc
Final report | tplan_finalreport.doc
Evaluation of results | tplan_evalresults_nnnn.doc

For each document, the table also marks whether it is distributed to the steering group, the project team and the project owner.

The documents that should be reviewed by the steering group are distributed at the latest two days before the review date. The review dates can be found in the tracking and control mechanisms part of the project plan. The corrected document is sent to the review participants as soon as the corrections are approved.

6 Tracking and Control Mechanisms

6.1 Quality assurance and control

The tracking and control cycle of the TPlan software is based on the framework shown in Figure 6.1: analyze the project characteristics, execute, measure, and apply control and prevention mechanisms.

Figure 6.1. Framework for tracking and control of the project.

Measuring the actual process against the planned process ensures project quality in all project phases: the specification phase, the design phase and the coding phase. The quality will be measured by defect density: the number of defects per unit size.

Tracking

The defect density will be measured through:
1. requirements reviews – checking whether the specification document complies with the users' requirements; this will be done through feedback
2. design reviews – checking whether the design document (diagrams) complies with the specifications
3. code reviews – checking whether the coding is done according to the specifications and the system architecture
4. unit testing – testing the software by modules; this will be done individually by each team member during the programming phase
5. integration testing – testing the integrated software (all the components together) on the real platform
6. BAT (Business Acceptance Test) – testing together with the users and getting feedback.

For the TPlan project, we have planned the reviews as follows:

Table 6.1. Reviews of the TPlan project.

Document name | Review date | Responsible (chairman)
Project plan | 13.11.2003 | Ioana (Irja)
Requirement specification | 8.12.2003 | Riina
Test plan | 26.1.2004 | Jukka
Implementation plan | 16.2.2004 | Reko
Code review 1 | W9/2004 | Valtteri
Code review 2 | W11/2004 | Riina
Code review 3 | W11/2004 | Jukka
Code review 4 | W13/2004 | Reko
Manual/Instructions | W18/2004 | Valtteri
Other review (re-review) | When needed | Irja (Ioana)

Responsible roles:
Chairman – invites the participants to the review. The invitation is sent at the latest 5 days before the review date. The document to be reviewed is sent by the chairman to the participants at the latest 2 days before the review date.
Secretary – responsible for noting the review findings in the review template.

All group members discuss the documents, and each of us will take charge of writing a part of them. Each group member will participate in the code implementation and the whole team will review the code.

For the Defect Report and the Defect Report details, please check Appendix A (1). Based on this report we will be able to calculate the quality met as a percentage, for every stage or overall.
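The plan does not fix how the defect-density figures will be collected, so the following Java sketch is only one possible illustration. The stage names, unit sizes and defect counts are invented for the example, and the "quality met" percentage is taken here simply as the share of found defects that were fixed, which is an assumption rather than a definition from this plan.

// Illustrative sketch only: defect density (defects per unit size) and a simple
// "quality met" percentage per review stage. All figures below are invented examples.
public class QualityTracking {
    record Stage(String name, int defectsFound, int defectsFixed, double unitSize) {}

    public static void main(String[] args) {
        Stage[] stages = {
            new Stage("Requirements review", 4, 4, 20.0),  // unit size in document pages (assumption)
            new Stage("Code review", 12, 10, 8.0),         // unit size in KDSI
            new Stage("Integration testing", 6, 5, 8.0)
        };
        for (Stage s : stages) {
            double density = s.defectsFound() / s.unitSize();                // defects per unit size
            double qualityMet = 100.0 * s.defectsFixed() / s.defectsFound(); // share of found defects fixed
            System.out.printf("%s: density %.2f, quality met %.0f%%%n",
                              s.name(), density, qualityMet);
        }
    }
}

Such a calculation could be filled in from the Defect Report in Appendix A at each review stage or aggregated over the whole project.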
(1), (2) The model is taken from Pankaj Jalote's book "Software Project Management in Practice".

6.2 Change management and control

In every phase of the project we may have to deal with changes:
- requirement changes coming from the client
- architecture changes coming from changes in the HeVi system
- implementation changes coming from changes in the design

Each change will be analyzed and monitored according to its importance. Some changes will be implemented and some will not.

Versioning the product releases

The product will bear a version code. The code will have the following format: TPlan x.y.z, where
- z reflects minor changes; this is important for the team members to know that there is a new software version
- y reflects major changes: architecture changes, design changes, implementation of big new features
- x is the release version number, changed when a new version of the software is released.

Each review document will carry the project name and version. Old review versions will be kept for reference in an archive directory on the TPlan server.

Please refer to Appendix B for the change report and tracking (2).

7 Appendix

Appendix A – Defect Report and Defect Report details
Appendix B – Change report and tracking
Appendix C – Detailed task list and durations