
PROGRAM & PORTFOLIO MANAGEMENT OFFICE (PPMO)
<PROJECT> TEST PLAN
<Project Name>: <fill in>

Project Manager
First Name: <fill in>
Last Name: <fill in>
e-mail: <fill in>
Phone: <fill in>
Revision History

Version   Date        Author(s)   Reviewer(s)   Change Description
v1        <fill in>   <fill in>   <fill in>     <fill in>
v2        <fill in>   <fill in>   <fill in>     <fill in>
v3        <fill in>   <fill in>   <fill in>     <fill in>
v4        <fill in>   <fill in>   <fill in>     <fill in>
v5        <fill in>   <fill in>   <fill in>     <fill in>
v6        <fill in>   <fill in>   <fill in>     <fill in>
COPYRIGHT INFORMATION
This document is the exclusive property of the University of North Carolina at Charlotte ("UNCC"); the recipient agrees that he/she may not copy, transmit, use, or disclose the confidential and proprietary information set forth herein by any means without the express written consent of UNCC. By accepting a copy hereof, the recipient agrees to adhere to and be bound by these conditions regarding the confidentiality of UNCC's practices and procedures, and to use these documents solely in connection with UNCC's operations methodology. All rights reserved, UNCC, 2000. UNCC IT reserves the right to revisit the Business Requirements and Functional Specifications Documents if approval to proceed is not received within 90 days of the issue date.
1. INTRODUCTION
<Describe the project in a high-level summary, giving key deliverables and milestones, and name the sponsor, key stakeholders, etc.>
2. TEST PLAN APPROVAL PROCESS
<Describe the approval plan for adoption of this document and state why that approval process contributes to the project's success. Reference any documents required to be reviewed or used with this document.>
3. ASSOCIATED & REFERENCED DOCUMENTATION <MODIFY TEXT AS APPROPRIATE>
3.1 Framework, Elements, Events, & User Flows
<Describe the framework of the test strategy and cite academic references used to develop the framework. List the major elements or stages of the testing; briefly describe the testing events or sequences and information flows between users.>
3.2 Testing Project Plans
<Outline or describe the plans for testing in detail greater than that of the Project Management Plan which
references this document.>
3.3 Test Plans and Test Scripts
As test plans and test scripts are completed and assigned a version number, they will be placed in <give path to file folder> under the Test Documents folder. As plans and scripts are completed or modified, notification will be sent to the appropriate individuals. Scripts created for the purpose of testing will be placed in the above-referenced Test Documents folder under "Integration, QA and UAT".
4. TESTING STRATEGY <MODIFY TEXT AS APPROPRIATE>
4.1 Scope
The following types of testing will be conducted for this project: (remove any testing not required for the project
and its associated section below)
Unit Testing (see Section <11>)
Integration Testing (see Section <12>)
System (QA) Testing (see Section <13>)
Host's System Testing (for hosted applications – see Section <14>)
Load Testing (see Section <15>)
Regression Testing (see Section <16>)
User Acceptance Testing (see Section <17>)
Participation from each SME's specialty technical staff will be required as documented below. Each SME must develop a specialized Testing Plan to account for their staff's participation in each phase of testing. It is anticipated that the level of developer involvement will decrease as testing progresses.
4.2 Testing Approach

- Manual test script generation will be the preferred method until the Testing Project Manager determines that site stability is adequate for automated script creation.
- A "two pass" per iteration approach will be used.
  - A-Pass: focuses on "normal" conditions to ensure all parts of the application are working in a normal test script. Immediate identification of major issues is required.
  - B-Pass: focuses on "exception" conditions to ensure boundary conditions, error handling, etc. are working correctly. Immediate identification of major issues is required.
- Four (4) types of test scripts will be created:
  1. Normal (N): test scripts that test the expected behavior under normal, or "pass", conditions.
  2. Exception (E): test scripts that test the expected behavior under exception, or "fail", conditions.
  3. Data Normal (DN): test scripts that test the expected behavior under data-specific normal conditions.
  4. Data Exception (DE): test scripts that test the expected behavior under data-specific exception conditions.
- Iteration: one (1) complete end-to-end A-Pass and one (1) complete end-to-end B-Pass across all modules.
- The number of iterations for Unit Testing is to be determined by the Development Project Manager(s).
- Unit test script creation and execution is the responsibility of the development staff(s).
- The number of iterations for Integration Testing will be on an as-needed basis within the Integration Testing Cycle, to be determined by the Testing Project Manager and Customer Application Project Manager(s) during the Integration Testing Cycle.

- Integration Testing will include all (N) test scripts during the A-Pass and (E) test scripts during the B-Pass.
- The number of iterations for System Testing will be up to three. Should issues arise that justify additional iterations, the testing timeline will increase by five (5) days per iteration.
- System Testing will include all (N) & (DN) test scripts during the A-Pass and (E) & (DE) test scripts during the B-Pass.
- User Acceptance Testing: test scripts to be created by the QA team with the help of the Business Analyst and the User Acceptance Group (Business).
- Load Testing will consist of a select group of (N) scripts that accurately represent a cross section of functionality against a predetermined load.
- Regression Testing will be created from the (N), (DN), (E), & (DE) test scripts.
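To make the pass/script-type pairing above concrete, the following is a minimal illustrative sketch (Python, with hypothetical field names) of how scripts tagged N/E/DN/DE could be selected for the A-Pass and B-Pass of a phase. It is an example of the idea only, not part of the required project tooling.

```python
# Illustrative sketch only: selecting test scripts for a pass, based on the
# pairing described above (hypothetical data structures, not project tooling).
from dataclasses import dataclass

PASS_PLAN = {
    # phase: {"A": types run in the A-Pass, "B": types run in the B-Pass}
    "Integration": {"A": {"N"}, "B": {"E"}},
    "System":      {"A": {"N", "DN"}, "B": {"E", "DE"}},
}

@dataclass
class TestScript:
    script_id: str
    module: str
    script_type: str  # one of "N", "E", "DN", "DE" (Section 4.2)

def scripts_for_pass(scripts, phase, pass_name):
    """Return the scripts to execute for a given phase and pass (A or B)."""
    wanted = PASS_PLAN[phase][pass_name]
    return [s for s in scripts if s.script_type in wanted]

if __name__ == "__main__":
    inventory = [
        TestScript("TS-001", "Login", "N"),
        TestScript("TS-002", "Login", "E"),
        TestScript("TS-003", "Booking", "DN"),
        TestScript("TS-004", "Booking", "DE"),
    ]
    print([s.script_id for s in scripts_for_pass(inventory, "System", "A")])
    # -> ['TS-001', 'TS-003']
```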
4.3 Test Setup
The diagram below gives a high-level overview of the proposed system.
(Insert here a "flowchart style" diagram of the relationships of the major components. Remove this text after inserting the diagram.)
4.4 Test Environment Details
(Insert here a “grid / matrix style” diagram of the relationships of the various types of tests.
First column should be date ranges for each test
Second column should be Phase of test
Third column should be name of organization, department or SME responsible for tests
Fourth column should be type of test
Column(s) and content may vary depending on needs of the project.
Show solid lines for interdependencies of primary tests and dashed lines for secondary or
limited test(s) relationships. Remove this text after inserting diagram.)
4.5 Responsibilities
The following table gives the basic information for all testing included in this Test Plan. The Lead System Tester
<XYZ> will be “accountable” for completion of testing activities until all tests produce acceptable results. Persons
listed in the table have personal responsibility for actually planning and conducting (or overseeing) specific tests.
Testing Responsibilities

No.   Date        Responsible for Testing   Next Steps / Important Notes
1     <fill in>   <fill in>                 <fill in>
2     <fill in>   <fill in>                 <fill in>
3     <fill in>   <fill in>                 <fill in>
4     <fill in>   <fill in>                 <fill in>
5     <fill in>   <fill in>                 <fill in>
6     <fill in>   <fill in>                 <fill in>
7     <fill in>   <fill in>                 <fill in>
8     <fill in>   <fill in>                 <fill in>
9     <fill in>   <fill in>                 <fill in>
10    <fill in>   <fill in>                 <fill in>
11    <fill in>   <fill in>                 <fill in>
12    <fill in>   <fill in>                 <fill in>

4.6 Risks that Impact Testing
The following table shows the primary risks, probabilities, impacts, and contingent responses. These apply to the various tests as appropriate. Individuals listed in the above table must coordinate with the Lead System Tester when requesting authorization to implement the response plans.
Testing Risks

Risk        Probability   Impact      Contingency Response
<fill in>   <fill in>     <fill in>   <fill in>
<fill in>   <fill in>     <fill in>   <fill in>
<fill in>   <fill in>     <fill in>   <fill in>
<fill in>   <fill in>     <fill in>   <fill in>
<fill in>   <fill in>     <fill in>   <fill in>
<fill in>   <fill in>     <fill in>   <fill in>
<fill in>   <fill in>     <fill in>   <fill in>
<fill in>   <fill in>     <fill in>   <fill in>
<fill in>   <fill in>     <fill in>   <fill in>
<fill in>   <fill in>     <fill in>   <fill in>
<fill in>   <fill in>     <fill in>   <fill in>
<fill in>   <fill in>     <fill in>   <fill in>

4.7 Test Suspension / Resumption Criteria
A sanity test will be carried out on every build received from the development team to ensure the application is suitable for further testing. A set of functional test-cases will be identified to run during the sanity test. Testing will be suspended when it is not possible to proceed with test execution due to a major showstopper error in the application.
Testing shall be resumed once the above problems are addressed.
4.8 Test Stop Criteria
All planned tests have been executed.
5. TEST-CASE DESIGN AND DEVELOPMENT
5.1 Test-case Design Instructions <Modify the following text as appropriate>
1. Use the following documents to create test-cases:
   - <fill in>
   - <fill in>
2. Generate test-cases using … value analysis techniques.
3. Record traceability for the test-cases back to the Use Case document.
4. The number of steps to execute a test-case should not be more than 20; such scenarios can be split into a number of test-cases for better tracking.
5. Test-case execution steps need to be detailed enough that any tester can complete the test-case without any system knowledge.
6. Whenever required, mention the business rules and formulas under the expected-result column for reference.
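Item 2 above leaves the analysis technique elided; assuming a boundary-value style technique is intended, the sketch below (Python, illustrative names and numbers only) shows one way candidate input values could be derived for a numeric field. It is an example of the idea, not a prescribed tool or the project's method.

```python
# Illustrative sketch: deriving candidate test values for a numeric input field
# using a boundary-value style technique (an assumption; the technique name is
# elided in item 2 above).
def boundary_values(minimum, maximum):
    """Return typical boundary-value candidates for an inclusive numeric range."""
    nominal = (minimum + maximum) // 2
    valid = sorted({minimum, minimum + 1, nominal, maximum - 1, maximum})
    invalid = [minimum - 1, maximum + 1]  # expected to be rejected by the application
    return valid, invalid

if __name__ == "__main__":
    valid, invalid = boundary_values(1, 9)   # e.g. a field that accepts 1-9
    print("valid candidates:", valid)        # [1, 2, 5, 8, 9]
    print("invalid candidates:", invalid)    # [0, 10]
```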
5.2 Test-case Design Deliverables
Test-cases will be developed by the Test team and reviewed by the Business before test execution. In the case of a requirements change, refer to the Change Request Process defined in the Approval section.
6. TESTING TEAM <MODIFY TEXT AS APPROPRIATE>
6.1 Core Team
- Testing Project Manager: <fill in>
- Lead System Tester: <fill in>
- System Testers: <fill in>
- User Acceptance Group Coordinator: <fill in>
6.2 Technical Support Team
Please refer to <document source> for Technical Support Team details of <fill in>. (Insert hyperlinks to shared drives)
6.3 Assumptions, Constraints, and Exclusions
Any reference to the Testing Team means the individuals listed above under "Core Team."
7. TESTING TOOLS
7.1 Testing Tools <Modify text as appropriate>
Manual: Test-cases will be created via <fill in>. Refer to the Templates section under <fill in> for the Test-case template.
Automated: Automated scripts will be created/executed using the following:
- <fill in>
- <fill in>
- <fill in>
The testing tool decision is pending budget approval. The tools listed above are the proposed testing tools.
7.2 Assumptions, Constraints, and Exclusions
<Modify text as appropriate>
- <fill in>
- <fill in>
- <fill in>
Limitations of test automation:
- Problems with the tool

- Support from the vendor
- Rapidly changing requirements
- <fill in>
8. KEY EXTERNAL DEPENDENCIES <MODIFY TEXT AS APPROPRIATE>
8.1 Below is a list of all key external dependencies:
- <fill in>
8.2 Assumptions, Constraints, and Exclusions
None noted at this time.
9. METRICS COLLECTION <MODIFY TEXT AS APPROPRIATE>
Detailed defect analysis shall be done for the reported defects, and test-case execution status shall be reported for each module.
The metrics to be collected during the test life cycle are:
1. Defect Location Metrics – Defects raised against each module shall be plotted on a graph to indicate the affected module.
2. Severity Metrics – Each defect has an associated severity (Critical, High, Medium, or Low), which reflects how much adverse impact the defect has or how important the affected functionality is. The number of issues raised per severity shall be plotted on a graph. By examining the severity of a project's issues, discrepancies can be identified.
3. Defect Closure Metrics – To indicate progress, the number of raised and closed defects over time shall be plotted on a graph.
4. Defect Status Metrics – Indicates the number of defects in various states, such as new, assigned, resolved, verified, etc.
5. Re-opened Bugs – The number of defects re-opened by the testing team after being fixed by the development team shall be reported, and the percentage shall be calculated with respect to the total number of defects logged.
6. Test-case Progression Trend – This trend shall indicate the progress of test execution module-wise. It shall state the number of test-cases planned, executed, passed, and failed.
These metrics shall be collected and presented as a test summary report after each test cycle; they shall also be part of the weekly status report.
Refer to the Templates section under <fill in> for the Metrics Analysis template. (Insert Hyperlinks)
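As an illustration of the calculations behind these metrics (not a required deliverable), the sketch below assumes defects are exported as simple Python dictionaries with hypothetical field names; any real extract from the issue-tracking tool would need its own field mapping.

```python
# Illustrative sketch of the metrics in this section, assuming defects are
# available as dictionaries with hypothetical fields: module, severity,
# status, and a re-opened flag.
from collections import Counter

def defect_metrics(defects):
    by_module = Counter(d["module"] for d in defects)        # 1. defect location
    by_severity = Counter(d["severity"] for d in defects)    # 2. severity
    by_status = Counter(d["status"] for d in defects)        # 4. defect status
    reopened = sum(1 for d in defects if d.get("reopened"))  # 5. re-opened bugs
    reopened_pct = 100.0 * reopened / len(defects) if defects else 0.0
    return {
        "by_module": by_module,
        "by_severity": by_severity,
        "by_status": by_status,
        "reopened_pct": round(reopened_pct, 1),
    }

if __name__ == "__main__":
    sample = [
        {"module": "Login", "severity": "High", "status": "resolved", "reopened": False},
        {"module": "Booking", "severity": "Critical", "status": "new", "reopened": True},
    ]
    print(defect_metrics(sample))  # reopened_pct -> 50.0
```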
10. CLASSIFICATION OF ISSUES <MODIFY TEXT AS APPROPRIATE>
10.1 The following standards will be used to classify issues found during testing:
Severity 1: Critical Issues:
The application crashes, returns erroneous results, or hangs in a major area of functionality and there is no workaround. Examples include the inability to navigate to/from a function, application timeout, and incorrect application of business rules.
Severity 2: High Functional Issues:
Functionality is significantly impaired. Either a task cannot be accomplished or a major workaround is necessary. Examples include erroneous error handling, partial results returned, and form pre-population errors.
Severity 3: Medium Functional Issues:
Functionality is somewhat impaired. A minor workaround is necessary to complete the task. Examples include inconsistent keyboard actions (e.g. tabbing), dropdown list sort errors, navigational inconsistencies, and serious format errors causing usage issues (e.g. incorrect grouping of buttons).
Severity 4: Low Functional Issues:
Functionality can be accomplished, but either an annoyance is present or efficiency can be improved. Cosmetic or appearance modifications to improve usability fall into this category. Examples include spelling errors, format errors, and confusing error messages.
Examples of each severity level will be delivered to participants prior to Integration Testing.
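For teams that mirror these classifications in scripts or tracker exports, the following is a minimal sketch (Python, illustrative only, not tied to any particular issue-tracking tool) of encoding the four severity levels defined above, together with a rough reading of how they gate the exit criteria later in this document.

```python
# Illustrative encoding of the severity classifications defined above.
from enum import IntEnum

class Severity(IntEnum):
    CRITICAL = 1  # crash, erroneous results, or hang in a major area; no workaround
    HIGH = 2      # functionality significantly impaired; major workaround needed
    MEDIUM = 3    # functionality somewhat impaired; minor workaround needed
    LOW = 4       # annoyance or cosmetic issue; functionality can be accomplished

def blocks_promotion(severity: Severity, phase: str) -> bool:
    """Rough reading of the exit criteria: Severity 1-2 block System Test exit,
    and Severity 1-3 block UAT exit (see Sections 13.6 and 17.6)."""
    limit = Severity.MEDIUM if phase == "UAT" else Severity.HIGH
    return severity <= limit

if __name__ == "__main__":
    print(blocks_promotion(Severity.MEDIUM, "UAT"))     # True
    print(blocks_promotion(Severity.MEDIUM, "System"))  # False
```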
10.2 Assumptions, Constraints, and Exclusions
Issue classifications can include creative or content-related issues.
<MODIFY OR DELETE THE FOLLOWING SECTIONS AS APPROPRIATE>
11. UNIT TESTING
11.1 Purpose
The purpose of Unit Testing is to deliver code that has been tested for end-to-end functionality within a given
module and normal interfacing between dependent modules in the development environment.
11.2 Responsibility
Testing will be the responsibility of the individual developers. Ultimate signoff for promotion into Integration
Testing will be the responsibility of the Development Project Manager(s). Configuration management, builds, etc.
will be the responsibility of the Configuration Management Team at the direction of the Development Project
Manager.
11.3 Environment
Refer to Section 4.4, Test Environment Details.
11.4 Exit Criteria
In order to be accepted for Integration Test, each component must:

- Successfully compile in the development environment
- Be tested for complete threads for all code, from UI to data access and back to UI, using test data created by developers
- Be tested for one example each of normal, high, and low boundary conditions for data input, where appropriate
- Be tested for one example of error handling per event
- Successfully execute pairwise tests as required for inter-module interfaces, including likely error conditions (e.g. common data entry errors)
- Have no high-severity issues in an open state
- Have Project Manager signoff on individual modules
Specifically excluded from the Unit Test exit criteria are:
- Comprehensive data validation
- Exhaustive testing of various entry and exit points across modules

- Comprehensive testing for abnormal situations and error-handling combinations

11.5 Assumptions, Constraints, and Exclusions
Creation of test data and scripts for the purpose of Unit Testing is the responsibility of the development staff(s).
12. INTEGRATION TESTING
12.1 Purpose
The purpose of Integration Testing is to deliver code that has been comprehensively tested for Normal (N) and
Exception (E) conditions across all modules in the Development environment.
12.2 Responsibility
The Testing Team holds the primary responsibility for the execution of Normal (N) and Exception (E) test scripts. All N & E type test scripts will be completed prior to the start of Integration Testing. The N & E test scripts will be executed for the following modules:

- <fill in>
- <fill in>
Configuration management, builds, etc. will be the responsibility of the Configuration Management Team at the
direction of, and with the agreement of the Development Project Managers and Testing Project Manager. Ultimate
sign-off of Integration Testing and promotion into System Testing resides with the Testing Project Manager.
12.3 Environment
Refer to Section 4.4, Test Environment Details.
12.4 Test Data
12.4.1 Mainframe test data
The Test Lead will request test data migration/creation and will forward details surrounding the migration/creation to the appropriate individuals. After data migration/creation, the test data list will be forwarded to the Test Lead by the mainframe team.
12.4.2 Local test data
The Test Lead will identify test data for the local database (<fill in>). With the help of the development team, the Test Lead will ensure test data is set up before the start of Integration Testing.
A Testing Data Repository Document will be delivered on or before Integration Testing. Specific reference will be
made in the N & E Test Scripts to the data types listed in the Testing Data Repository Document.
12.5 Test Execution Process
Integration Testing will consist of <fill in> weeks.
Issue Identification:
Integration Testers will log issues as they are identified.
Issue Resolution:
It is expected that the Development Team(s) will undertake issue resolution based on severity and priority. Every effort will be made to turn around Critical/High category issues in the next scheduled Build/Release.
Build and Release Process:
The Configuration Management Team will deliver fresh builds as requested by the Development Project Managers
and Testing Project Manager along with release notes.
Issue Closure:
After each build, the Testing Team will review the issues that have been resolved in order to verify them and either close or re-instate them, along with their resolution priority. Every effort will be made to close resolved issues as soon as possible.
Issue Tracking:
The Testing Project Manager will be responsible for the administration of the issue tracking tool.
12.6 Exit Criteria
In order to be accepted for System Test, each component must:

- Be successfully deployed to the System Test Environment
- For transaction-based data access, be tested successfully for "normal" and "exception" conditions
- Contain no "dead" links or inaccessible pages
- Contain no Severity 1 or 2 issues
- Have Testing Project Manager signoff
12.7 Additional Information
Test scripts will be provided to the <fill in> prior to Integration Testing. The test scripts provided should be used as a baseline for exit criteria expectations. Any additional tests or scripts that the development staff deems necessary will be left to the discretion of the Development Project Managers. Should the Development Project Managers feel that such scripts should be incorporated into the Testing Team scripts, they may submit a request to the Testing Project Manager. It will be the responsibility of the Testing Project Manager to analyze the feasibility of such incorporation.
12.8 Assumptions, Constraints, and Exclusions
Assumptions:
Functionality testing of <fill in> by the Testing Team will also include entry points from other websites via links, travel portals, etc.
Exclusions from Integration Testing:
Delivery of code that has been comprehensively tested for Data Normal (DN) and Data Exception (DE) conditions across all modules in the Development environment. Any issues discovered with the informative pages, creative design, or content should be reported to the respective development area.
13. SYSTEM (QA) TESTING
13.1 Purpose
The purpose of System Testing is to deliver code that has been comprehensively tested and functionality that is
certified to be end-to-end user ready in the System Test environment.
13.2 Responsibility
The Testing Team holds the primary responsibility for the execution of Normal (N), Exception (E), Data Normal (DN), and Data Exception (DE) test scripts. Test scripts will include field form validation and display rules as stated in the Elements section of the <fill in>. The N, E, DN, & DE test scripts will be executed for the following modules:
1. <fill in>
2. <fill in>
The System Testing Team will be comprised of individuals from the Testing Staff. Configuration management,
builds, etc. will be the responsibility of the Configuration Management Team at the direction of the Development
Project Managers and requires the agreement of the Testing Project Manager. Ultimate sign-off of System Testing
and promotion into User Acceptance Testing resides with the Testing Project Manager.
13.3 Environment
Refer to Section 4.4, Test Environment Details.
System Test Script execution will be completed as per the following Operating System / Browser matrix: <Edit as appropriate>
Windows XP      IE 6.0 – C; IE 5.5 – U; IE 5.0 – C; Mozilla 1.7.2 – C; Netscape 7.1 – U; AOL 5.0 – U
Windows 2000    U
Windows 98      U
Mac OS/9        U

C – The complete test-case suite will be executed on the OS/Browser combination
U – Only critical functionality and UI test-cases will be executed on the OS/Browser combination
Blank – The OS/Browser combination will not be tested
13.4 Test Data
13.4.1 Host Environment test data
The Test Lead will request test data migration/creation and will forward details surrounding the migration/creation to the appropriate individuals. After data migration/creation, the test data list will be forwarded to the Test Lead by the mainframe team.
13.4.2 Local test data
The Test Lead will identify test data for the local database (Oracle 9i). With the help of the development team, the Test Lead will ensure test data is set up before the start of integration testing.
A Testing Data Repository Document will be delivered on or before System Testing. Specific reference will be made
in the N, E, DN, & DE Test Scripts (see 13.2) to the data types listed in the Testing Data Repository Document.
Additional specific data may be required. Should this be the case, the data will be listed on the corresponding test
script.
13.5 Test Execution Process
System Testing will consist of <fill in> (<fill in>) weeks.
Issue Identification:
System Testers will log issues as they are identified.
Issue Resolution:
It is expected that the Development Team(s) will undertake issue resolution based on severity and priority. Every effort will be made to turn around Critical/High category issues in the next scheduled Build/Release.
Build and Release Process:
The Configuration Management Team will deliver fresh builds to the System Test environment, along with release notes, as directed by the Testing Project Manager. If the situation warrants, an emergency build may be released.
The Testing Project Manager and all Development Project Managers must be in agreement to proceed with the
emergency build.
Issue Closure:
The Testing Team will review the issues that have been resolved in order to verify them and either close or re-instate them, along with their resolution priority. Every effort will be made to close resolved issues as soon as possible.
Issue Tracking:
The Testing Team will be responsible for the administration of the tracking tool.
13.6 Exit Criteria
In order to be accepted for User Acceptance Test, the application must:

- Be successfully deployed to the System Test environment
- Be tested by the System Test Team according to all System Test Scripts
- Be Performance / Load tested
- Have no Severity 1 or 2 issues
- Have minimal Severity 3 or 4 issues, documented and with a known resolution path
- Have Testing Project Manager signoff and acceptance by the User Acceptance Team

Specifically excluded from the System Test exit criteria are:
- Security Testing
- System Crash / Restart Testing
  o DB crash
  o iPlanet crash
  o Hardware
  o DB capacity/resources

13.7 Additional Information
Regression and Load testing will take place prior to promotion to User Acceptance Testing. Please see the Regression Testing and Load Testing sections of this document for further information.
A copy of the test plan will be provided to the User Acceptance Group prior to System Testing for their review.
The test plan provided should be viewed as a baseline for System Testing exit criteria expectations. Any items in
the test plan that the System Testing Team or User Acceptance Group feels should be modified or added should be
submitted to the Testing Project Manager. It will be the responsibility of the Testing Project Manager to analyze
the feasibility of such incorporation or modification.
13.8 Assumptions, Constraints, and Exclusions
Assumptions:
Functionality testing of <fill in> by the Testing Team will also include entry points from other websites via links, travel portals, etc.
Testing of the Personalization engine will be limited to business rules created by the developer.
Visitor tracking details will be verified only at the login level by <fill in>, as the Reporting tool has not been finalized.
14. HOST'S SYSTEM TESTING
14.1 Purpose
The purpose of testing the host’s system is to determine the health, fail-over, vulnerabilities, customer support and
functionality of the vendor’s code under stress for the new functionality of the project. It may also test
compatibility with UNCC firewalls and existing functionality of associated applications.
14.2 Responsibility
The vendor's hosted environment testing will be carried out by <fill in>. It will be scheduled and coordinated by the <fill in> Test team according to the test execution dates for System Testing and UAT.
14.3 Environment
The vendor's modules will reside in the <fill in> Acceptance Test Region (ATR).
14.4 Test Execution Process
The <fill in> QA team at <fill in> will deliver the unit-tested code of the <fill in> feature to the <fill in> development team. After integration with the application, the <fill in> Test team will verify the <fill in> feature from an end-to-end user perspective, i.e. from the front end to <fill in>. The <fill in> Test team will be trained on using the vendor's screens to verify <fill in> data. The <fill in> Test team will raise issues using <fill in> Tracker and escalate them to the IT Project Manager (<fill in>), who will take them further with the vendor's team for fixes.
To follow up on vendor testing progress (during the vendor's testing period), a status report will be provided to the <fill in> team on a weekly basis by the <fill in> QA team.
During test execution, the <fill in> test team will provide a list of <fill in> input test data to <fill in>'s team so that it can verify <fill in>.
15. LOAD TESTING
15.1 Purpose
The purpose of Load Testing is to deliver code that has been stress tested to the upper and lower control limits of
its design specification and is ready for promotion into the Production Environment.
15.2 Scope
Load Testing will consist of a select group of (N) scripts that accurately represent a cross section of functionality.
Scripts will be executed to generate up to <fill in> user peak load levels.
Tests will be executed for <fill in> concurrent users at load levels of <fill in> and <fill in>. The test execution would be
completed when the <fill in> user load is ramped up or any failure condition necessitates stopping the test. The team
would monitor the test execution and record the timings and errors for report preparation.
15.3 Responsibility
The creation and execution of the Load Testing Scripts is the responsibility of the Testing Team. Ultimate authority rests with the Testing Project Manager, who will be in close contact with the User Acceptance Group.
15.4 Environment
The <fill in> testing tool will be physically located on a server at <fill in>. For the purpose of test execution, the <fill in> testing tool will be pointed to the (System Testing Environment, which will become the Production Environment – edit as appropriate) upon Implementation. The <fill in> Test team will access <fill in> using a remote client tool to execute the scripts. (The <fill in> Test team will be allocated one VU Gen license to create scripts offline. – edit as appropriate)
Environment                    Description   IP Address
Budget Application Servers     <fill in>     <fill in>
Controller                     <fill in>     <fill in>
Load Generator                 <fill in>     <fill in>
DB Server                      <fill in>     <fill in>
Web Server                     <fill in>     <fill in>
Details pertaining to the network:
- Network card setting – <fill in> Mbps duplex
- Bandwidth of LAN – <fill in> Mbps
15.5 Testing Methodology
15.5.1 Load testing
Load testing will be carried out under varying workloads to assess and evaluate the ability of the system under test to continue to function properly under these different workloads. The goal of load testing is to determine and ensure that the system functions properly beyond the expected maximum workload. Additionally, load testing evaluates the performance characteristics (response times, transaction rates, and other time-sensitive issues).
15.5.1.1 Serviceability
Approach
- Determine the serviceability of the system for a volume of <fill in> concurrent users.
- Measure response times for users.
Steps
1. <fill in> users estimation: Arrive at the maximum number of concurrent users hitting the system for which the system response time is within the response-time threshold and the system is stable. This number would be the virtual user number and should be higher than the average load by a factor of x.
2. <fill in> user profiles and their distribution for client operations:

   <fill in>     Users / Operations
   <fill in>     <fill in>
   <fill in>     <fill in>
   <fill in>     <fill in>
   <fill in>     <fill in>
   <fill in>     <fill in>
   <fill in>     <fill in>
   <fill in>     <fill in>

3. Load simulation schedule:
Schedule for concurrent user testing with a mix of user scenarios and the acceptable response times: <edit table as appropriate – sample info for suggested level of detail, only>

Load Simulation Schedule
Scenario | On Dial-up (56 Kbps) | On Broadband
Homepage Load | 19 seconds | NA – Dial-up is considered more relevant
Log-in / Log-out | NA – Broadband is considered more relevant | Sub-second
Rate Request-Response | NA – As above | Sub 20 seconds
Rate Request-Response for multi-BCD rate shop | NA – As above | Sub 20 seconds
Create a booking | NA – As above | 1 min 30 seconds (inclusive of mandatory intermediate steps)
Modify / Cancel Booking | NA – As above | 1 min 3 seconds (inclusive of mandatory intermediate steps)
One-Click Booking | NA – As above | 30 seconds
Statistics
- A graph with the y-axis representing response times and the x-axis representing concurrent users will depict the capability of the system to service concurrent users.
- The response times for slow users will provide worst-case response times.
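If the tool's raw results are exported, a graph like the one described above can also be produced outside the tool. The sketch below (Python with matplotlib, assumed to be available; the sample numbers are made up) only illustrates the plot's shape, not real measurements.

```python
# Illustrative sketch of the serviceability graph described above:
# response time (y-axis) versus concurrent users (x-axis).
# Sample numbers are made up; real values come from the load-test tool's results.
import matplotlib.pyplot as plt

concurrent_users = [50, 100, 200, 500, 1000, 1500]
avg_response_s   = [1.2, 1.4, 1.9, 3.1, 5.8, 9.5]    # average response time
worst_response_s = [2.0, 2.5, 3.4, 6.2, 11.0, 19.0]  # slow-user / worst-case response time

plt.plot(concurrent_users, avg_response_s, marker="o", label="Average")
plt.plot(concurrent_users, worst_response_s, marker="s", label="Worst case (slow users)")
plt.xlabel("Concurrent users")
plt.ylabel("Response time (seconds)")
plt.title("Serviceability: response time vs. concurrent users")
plt.legend()
plt.savefig("serviceability.png")
```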
15.5.2 Endurance testing
Validate the system's behavior over continuous hours of operation (CHO) under projected load conditions. The number of continuous hours of operation is to be discussed with the Business.
Approach
- Endurance testing – check resource usage and release, namely CPU, memory, disk I/O, and network (TCP/IP sockets) congestion, over continuous hours of operation.
- Determine robustness – check for breakages in the web server, application server, and database server under CHO conditions.
Steps
1. Arrive at a baseline configuration of the web server and application server resources (i.e. CPU, RAM, and hard disk) for the endurance and reliability test.
2. The test will be stopped when one of the components breaks. A root-cause analysis is to be carried out based on the data collection described under the server-side monitoring section.
Client-side monitoring
- Failure rate – web server responses/timeouts/exceptions and incomplete page downloads
- Response-time degradation under peak load numbers (concurrent users)
Server-side monitoring
- Collect CPU, disk, and memory usage for analysis
- Check for application server slowdown/freeze/crash

- Check for resource contention/deadlocks
- Database server load and slowdown
- Web server crashes
- Collect data for analysis to tune the performance of the web server, application server, and database server
- If there is alarm support in the tool through an agent, check for alerts when the activity level exceeds preset limits
- If a load-balancing configuration is deployed, check whether it is able to distribute the requests
Result
The result of this test will be a proof of confidence for Continuous Hours of Operation. The data collected in this phase will give pointers to improve the reliability of the system and to fix any configuration or component parameters for reliable performance.
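Where the load tool does not collect server-side counters itself, the CPU/memory/disk/network sampling described above can be approximated with a small script. The sketch below uses Python with the psutil package (an assumption – any equivalent OS monitor would do) and an illustrative output file name; it is a sampling aid, not part of the required tooling.

```python
# Illustrative server-side monitoring sketch for an endurance run: samples CPU,
# memory, disk I/O, and network counters at a fixed interval and appends them to
# a CSV file for later analysis. Assumes the psutil package is installed.
# Stop with Ctrl-C when the endurance window ends.
import csv
import time
import psutil

SAMPLE_INTERVAL_S = 60                   # sampling interval; adjust to the agreed CHO plan
OUTPUT_FILE = "endurance_samples.csv"    # illustrative file name

with open(OUTPUT_FILE, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp", "cpu_pct", "mem_pct", "disk_read_bytes",
                     "disk_write_bytes", "net_sent_bytes", "net_recv_bytes"])
    while True:
        disk = psutil.disk_io_counters()
        net = psutil.net_io_counters()
        writer.writerow([
            time.strftime("%Y-%m-%d %H:%M:%S"),
            psutil.cpu_percent(interval=1),     # CPU usage over a 1 s window
            psutil.virtual_memory().percent,    # memory usage
            disk.read_bytes, disk.write_bytes,  # cumulative disk I/O
            net.bytes_sent, net.bytes_recv,     # cumulative network I/O
        ])
        f.flush()
        time.sleep(SAMPLE_INTERVAL_S)
```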
15.5.3 Planned testing cycles
Load testing will be done on the <fill in>.com Web Application against a range of operational conditions and factors, including network bandwidth, data volumes, and transaction frequency. The test cycles shall be run on the network, measuring performance from <fill in> users to <fill in> users.
The test cycle shall be run for <fill in> users initially (incrementing <fill in> users per <fill in> seconds until it reaches <fill in> concurrent users). The test shall be stopped if the application crashes before reaching 50 users, and the issue shall be reported to the development team. The response time shall be noted for <fill in> concurrent users before stopping the test. If the response time exceeds the benchmark limit, the load test shall be stopped until the development team fixes the issue. If the response time is well within the benchmark limit, a fresh test cycle shall be run with the aim of reaching <fill in> concurrent users. The same process shall be used until the <fill in> concurrent users target is met within an acceptable response time.
The response times will be noted for the following user loads within the same test cycle: (Edit as appropriate)
- 50 users
- 100 users
- 200 users
- 500 users
- 1000 users
- 1500 users
The first cycle of Load testing will be carried out on the QA environment and the second cycle on the Production environment during the System Testing phase. (Edit as appropriate)
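The ramp-up described above can be expressed as a simple schedule that the chosen tool (or a wrapper script) follows. The sketch below (Python, with placeholder numbers standing in for the <fill in> values) only illustrates the increment logic and is not prescriptive about the actual user counts or tool.

```python
# Illustrative ramp-up schedule for the cycle described above: start with an
# initial user count and add a fixed step every interval until the target is
# reached. The numbers stand in for the <fill in> values in this section.
def ramp_schedule(initial_users, step_users, step_interval_s, target_users):
    """Yield (elapsed_seconds, concurrent_users) checkpoints up to the target."""
    users, elapsed = initial_users, 0
    while users < target_users:
        yield elapsed, users
        users = min(users + step_users, target_users)
        elapsed += step_interval_s
    yield elapsed, target_users

if __name__ == "__main__":
    # e.g. start at 50 users, add 50 users every 30 seconds, aim for 200 users
    for elapsed, users in ramp_schedule(50, 50, 30, 200):
        print(f"t={elapsed:4d}s  users={users}")
    # t=0s users=50, t=30s users=100, t=60s users=150, t=90s users=200
```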
15.6 Metrics to be measured
Client-Side Primary Metrics: (Edit as appropriate)
- Response Time
- Throughput
- Concurrent Users
OS-Level Primary Metrics: (Edit as appropriate)
- Processor Usage
- Memory Usage
- Disk I/O Rates
App Server Primary Metrics: (Edit as appropriate)
- To be discussed with the Technical team.

15.7 Test Deliverables (Edit as appropriate)
- Test scripts using LoadRunner
- Performance test report containing all the analysis graphs
15.8 Assumptions, Limitations and Constraints (Edit as appropriate)
Assumptions
1. The transaction mix (user mix) shall be provided by the XXX Business team.
2. The <fill in> Team shall provide the application setup. The application provided will have ensured the following:
   - Successfully deployed to the Test environment.
   - Tested by the System Test Team according to the System Test scripts.
   - All requirements verified as per the requirements specification.
Constraints
If Load test scripts are executed from offshore, network delay will add to the response times.
15.9 Exit Criteria (Edit as appropriate)
In order for Load Testing to be considered successful, the Load Scripts must be successfully executed under the following conditions:
- Executed by the Test Team
- Simulate up to 1500 virtual users
- Testing Project Manager sign-off
- Meet the exit criteria for the phase in which the Load Test is executed
16. REGRESSION TESTING
16.1 Purpose
Deliver code that has been regression tested and is ready for promotion into the Production Environment. Regression Testing will consist of a majority of the (N), (E), (DN), and (DE) type test scripts.
16.2 Responsibility
The creation of the Regression Testing Scripts is the responsibility of the Testing Team. Regression Test Scripts will be created and executed using <fill in>. The execution of the Regression Testing Scripts is the responsibility of the Testing Team.
16.3 Environment
The <fill in> software will be physically located in <fill in>. For the purpose of test execution, the front-end will be
pointed to the System Testing Environment.
16.4 Exit Criteria
In order for Regression Testing to be considered successful, the results must meet the exit criteria stated for the corresponding testing phase. For example, Regression Scripts executed during the System Testing phase must meet the exit criteria stated in the System Testing section of this document.
17. USER ACCEPTANCE TESTING
17.1 Purpose
The purpose of User Acceptance Testing is to deliver code that has been tested by the User Acceptance Test Group
and functionality that is certified to be end-to-end user ready for promotion into the Production Environment.
17.2 Responsibility (Edit as appropriate)
User Acceptance Testing is to be executed by the User Acceptance Group (Business). Management of the User Acceptance Testing phase will be the responsibility of the Testing Project Manager via the User Acceptance Group Coordinator. The test scripts used during User Acceptance Testing are to be created by the Test Team with the help of the Business Analyst and the User Acceptance Group. Test scripts should accurately reflect the functionality documented in the <fill in>.
Ultimate authority rests with the Testing Project Manager, who will be in close contact with the User Acceptance
Group Coordinator. Configuration management, builds, etc. will be the responsibility of the Configuration
Management Team at the direction of the Development Project Manager and requires the agreement of the Testing
Project Manager.
17.3 Environment
Refer to Section 4.4, Test Environment Details.
17.4 Test Data
The requesting of data migration/creation is the responsibility of the Testing Project Manager. Details surrounding
the migration/creation will be forwarded to the appropriate individuals. A Testing Data Repository Document will
be delivered on or before User Acceptance Testing. Specific reference will be made in the N & E Test Scripts to the
data types listed in the Testing Data Repository Document. Additional specific data may be required. Should this
be the case, the data will be listed on the corresponding test script.
17.5 Test Execution Process (Edit as appropriate)
User Acceptance Testing will consist of five (5) weeks.
Issue Identification:
UAT Testers will log issues as they are identified.
Issue Resolution:
It is expected that the Development Team(s) will undertake issue resolution based on severity and priority. Every effort will be made to turn around Critical/High category issues in the next scheduled Build/Release.
Build and Release Process:
The Configuration Management Team will deliver fresh builds to the UAT Test environment, along with release notes, as directed by the Testing Project Manager. If the situation warrants, an emergency build may be released.
The Testing Project Manager and all Development Project Managers must be in agreement to proceed with the
emergency build.
Issue Closure:
The UAT Testing Team (Business) will review the issues that have been resolved in order to verify them and either close or re-instate them, along with their resolution priority. Every effort will be made to close resolved issues as soon as possible.
Issue Tracking:
The Testing Team will be responsible for the administration of the tracking tool.
(Edit diagram below as appropriate)
Defect flow depicted in the diagram:
1. A UAT group tester identifies a defect.
2. The UAT group tester logs the defect in PVCS Tracker and assigns it to the UAT coordinator.
3. The UAT coordinator reviews the defect for validity and details. If it is not valid or lacks detail, the defect is re-assigned to the tester for more clarification.
4. If the defect is valid, the UAT coordinator assigns it to a developer.
5. The developer fixes the defect, and the Test team verifies the fix.
6. A new application version is released into the UAT environment with Release Notes.
7. The UAT group tester verifies the fixed defect. If the defect passes, it is closed; if not, it is re-assigned for rework.
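The defect flow above can also be summarised as a simple state model; the sketch below (Python, with hypothetical state names) is only an aid for reading the flow, not a replacement for the workflow configured in PVCS Tracker.

```python
# Illustrative state model of the UAT defect flow described above
# (hypothetical state names; the actual workflow lives in PVCS Tracker).
ALLOWED_TRANSITIONS = {
    "identified":           {"logged"},                     # tester logs the defect
    "logged":               {"with_coordinator"},           # assigned to UAT coordinator
    "with_coordinator":     {"assigned_to_developer",       # valid and detailed enough
                             "returned_to_tester"},         # needs clarification
    "returned_to_tester":   {"with_coordinator"},
    "assigned_to_developer": {"fix_released_to_uat"},       # fix built and released with notes
    "fix_released_to_uat":  {"verified_by_tester"},
    "verified_by_tester":   {"closed",                      # defect passed re-test
                             "assigned_to_developer"},      # defect failed re-test
    "closed": set(),
}

def can_move(current, new):
    """Return True if the flow above allows moving a defect from current to new."""
    return new in ALLOWED_TRANSITIONS.get(current, set())

if __name__ == "__main__":
    print(can_move("verified_by_tester", "closed"))     # True
    print(can_move("logged", "assigned_to_developer"))  # False (coordinator review first)
```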
17.6 Exit Criteria
In order to be accepted for promotion to the Production Environment, the application must: (Edit as appropriate)
- Be tested by the User Acceptance Group (Business)
- Be Regression and Load tested (responsibility of the Testing Team)
- Have no Severity 1, 2, or 3 issues
- Have minimal Severity 4 issues, documented and with a known resolution path
- Have User Acceptance Group approval for any Severity 4 issues to be included in the production release
- Have User Acceptance Group Coordinator and Testing Project Manager sign-off
Specifically excluded from the User Acceptance Test exit criteria are: (Edit as appropriate)
- Security Testing
- System Crash/Restart Testing
  o DB crash
  o iPlanet crash (Software Crash – Hang)
  o Hardware
  o DB capacity/resources

17.7 Additional Information
Regression and Load testing will take place prior to promotion to the Production Environment. Please see the
Regression Testing and Load Testing sections of this document for further information.
17.8 Assumptions, Constraints, and Exclusions (Edit as appropriate)
- UAT is conducted as per documented and signed-off requirements.
- Any changes or new functionality that comes up during UAT will go through the Change Management process.
- The system is stable and available during the scheduled testing period.

18. SOFT LAUNCH
TBD (details will be entered after discussion with the Business)