Towards Service-Oriented Testing of Web Services

Workshop on Web Service and Testing, 15 Oct. 2008
Hong Zhu
Department of Computing, Oxford Brookes University
Oxford OX33 1HX, UK
Yufeng Zhang
Dept of Computer Sci, National Univ. of Defense Tech.,
Changsha, China, Email: [email protected]
Overview
 Motivation
 The impact of WS on software testing
 The requirements on support for testing WS
 Proposed framework
 Prototype implementation
 Case studies
 Conclusion
Characteristics of Web Services
 The components (services) of WS applications are:
   Autonomous: they control their own resources and their own behaviours
   Active: execution is not triggered by messages
   Persistent: computational entities that last a long time
 Interactions between services:
   Social ability: services discover each other and establish interactions at runtime
   Collaboration: as opposed to control; a service may refuse a request, follow a complicated protocol, etc.
The WS technology stack
 Basic standards:
   WSDL: service description and publication
   UDDI: service registration and retrieval
   SOAP: service invocation and delivery (a minimal invocation sketch follows below)
 More advanced standards for collaborations between service providers and requesters:
   BPEL4WS: business process and workflow models
   OWL-S: an ontology for describing the semantics of services
[Diagram: the provider registers its service with the registry; the requester searches the registry for registered services, then requests the service from the provider, which delivers the service.]
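As a concrete illustration of SOAP-based invocation, the following sketch builds and sends a SOAP request with the standard Java SAAJ API. The endpoint URL, namespace and operation name are hypothetical examples, not services from these slides.

import javax.xml.namespace.QName;
import javax.xml.soap.MessageFactory;
import javax.xml.soap.SOAPBodyElement;
import javax.xml.soap.SOAPConnection;
import javax.xml.soap.SOAPConnectionFactory;
import javax.xml.soap.SOAPMessage;

public class SoapInvocationSketch {
    public static void main(String[] args) throws Exception {
        SOAPConnection connection = SOAPConnectionFactory.newInstance().createConnection();

        // Build a request for a hypothetical GetQuote operation.
        SOAPMessage request = MessageFactory.newInstance().createMessage();
        QName operation = new QName("http://example.com/insurer", "GetQuote", "ins");
        SOAPBodyElement call = request.getSOAPBody().addBodyElement(operation);
        call.addChildElement("carModel", "ins").addTextNode("Ford Focus");
        request.saveChanges();

        // Invoke the service endpoint and print the SOAP response envelope.
        SOAPMessage response = connection.call(request, "http://example.com/insurer/quote");
        response.writeTo(System.out);
        connection.close();
    }
}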
Testing developer’s own services
 Services are similar to software components
   Much existing work on software component testing can be applied or adapted
 Special considerations are still required:
   The stateless nature of the HTTP protocol
   The XML encoding of the data passed between services, as in the SOAP standard
   Conformance to the published descriptions (a conformance-check sketch follows below):
     WSDL for the syntax of the services
     workflow specifications in BPEL4WS
     semantic specifications in, e.g., OWL-S
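As an illustration of checking conformance to a published description, this sketch validates a recorded message against the XML Schema types declared for a service. The file names are hypothetical placeholders; assume the schema was extracted from the types section of the service's WSDL.

import java.io.File;
import javax.xml.XMLConstants;
import javax.xml.transform.stream.StreamSource;
import javax.xml.validation.Schema;
import javax.xml.validation.SchemaFactory;
import javax.xml.validation.Validator;

public class PayloadConformanceCheck {
    public static void main(String[] args) throws Exception {
        SchemaFactory factory =
            SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
        // Schema types published in the service description (hypothetical file).
        Schema schema = factory.newSchema(new File("insurer-types.xsd"));
        Validator validator = schema.newValidator();
        // A recorded response payload; validate() throws if it does not conform.
        validator.validate(new StreamSource(new File("quote-response.xml")));
        System.out.println("Payload conforms to the published schema.");
    }
}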
Testing developer’s own services (continued)
 Dealing with requesters’ abnormal behaviours
   Requesters are autonomous, so their behaviours may be unexpected
   Need to ensure that the service handles abnormal behaviours properly
 Dealing with unexpected usages/loads
   As with all web-based applications, load balancing is essential
   The usage profile of a WS may not be available during the design and implementation of the system
 Dealing with incomplete systems
   A service may have to rely on other services, so it is hard to separate the testing of one’s own services from integration testing, especially when complicated workflows are involved
   In the worst case, the other services are only bound dynamically at run-time
Testing of others’ services in composition
 There is some similarity to component integration testing
 However, the differences are dominant
 Problems in applying existing integration testing techniques:
   Lack of software artifacts
   Lack of control over test executions
   Lack of means of observing system behaviour
Lack of software artifacts
 The problem:
   No design documents, no source code, no executable code
 The impacts:
   For statically bound services:
     Techniques that automatically derive stubs from source code are not applicable
     Automatic instrumentation of the original source code or executable code is not applicable
   For dynamically bound services:
     Human involvement in the integration becomes impossible
 Possible solutions:
   (a) Derive the test harness from WS descriptions (a stub sketch follows below)
   (b) The service provider makes test stubs and drivers available for integration
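A minimal sketch of solution (a): a locally written stub standing in for a remote partner service, with its operation signature derived from the partner's published description. The interface, operation and canned values are hypothetical illustrations, not artefacts from these slides.

/** Hypothetical operation signature derived from a partner service's WSDL. */
interface InsurerQuoteService {
    double getQuote(String carModel, int driverAge);
}

/** Test stub used in place of the real (remote, uncontrollable) service,
    so that the requester's integration logic can be exercised locally. */
class InsurerQuoteServiceStub implements InsurerQuoteService {
    @Override
    public double getQuote(String carModel, int driverAge) {
        // Canned response chosen to drive a particular test scenario.
        return driverAge < 25 ? 1200.0 : 650.0;
    }
}

public class StubDemo {
    public static void main(String[] args) {
        InsurerQuoteService insurer = new InsurerQuoteServiceStub();
        System.out.println("Quote: " + insurer.getQuote("Ford Focus", 30));
    }
}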
Lack of control over test executions
 Problem:
   Services are typically located on computers over which testers have no control of execution
 Impact:
   An invocation of the service as a test must be distinguished from a real request for the service
   The system may need to be restarted, or put into a certain state, in order to test it
   The situation becomes much more complicated when a WS is simultaneously tested by many service requesters
 Possible solution:
   The service provider provides a mechanism and a service that enable service requesters to control test executions of the service (a sketch follows below)
   Currently, there is no support for such mechanisms in the W3C WS standards
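One possible shape for such a control mechanism, sketched as a Java interface. The operation names and the session-token protocol are assumptions for illustration; they are not part of any WS standard or of these slides.

/** Hypothetical test-control operations a provider could expose alongside
    its functional service, so requesters can control test executions. */
interface TestControlService {
    /** Opens an isolated test session; returns a token that marks
        subsequent invocations as test requests rather than real ones. */
    String beginTestSession();

    /** Puts the service into a named, provider-defined state
        (e.g. "empty-database") before a test run. */
    void setState(String sessionToken, String namedState);

    /** Ends the session and discards any state the tests created. */
    void endTestSession(String sessionToken);
}

Invocations carrying a session token could be routed to a sandboxed copy of the service state, which would also keep concurrent test sessions from different requesters apart.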
Lack of means of observation
 The problem:
   A tester cannot observe the internal behaviour of the services
 The impacts:
   No way to measure test coverage
   No way to ensure that the internal state is correct
 Possible solutions:
   The service provider provides a mechanism and services that allow an outside tester to observe the software’s internal behaviour, so that the test adequacy required by a service requester can be achieved (a sketch follows below)
   The service provider opens its documents, source code and other software artifacts necessary for testing to some trusted test service providers
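A sketch of what such an observation service could look like. The coverage criterion, report fields and operation names are illustrative assumptions, not an API defined in the slides.

/** Summary of the coverage achieved within one test session (fields are illustrative). */
class CoverageReport {
    final String criterion;  // e.g. "branch coverage"
    final int covered;
    final int total;
    CoverageReport(String criterion, int covered, int total) {
        this.criterion = criterion; this.covered = covered; this.total = total;
    }
    double ratio() { return total == 0 ? 0.0 : (double) covered / total; }
}

/** Hypothetical observation operations a provider could expose to an outside tester. */
interface TestObservationService {
    /** Coverage measured for the invocations made within the given test session. */
    CoverageReport getCoverage(String sessionToken, String criterion);

    /** Snapshot of the provider-defined internal state, so the tester can check
        that the service ended up in the expected state after a test. */
    String getInternalStateSnapshot(String sessionToken);
}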
The proposed approach
 A WS should be accompanied by a testing service:
   Functional services (F-services): the services providing the original functionality
   Testing services (T-services): the services that enable testing of the functional services
 Testing services can be provided either by the vendor of the functional services or by a third party
Architecture of service-oriented testing
[Architecture diagram: the F-services and T-services of application providers (A1, A2, ...) and of testers (T1, T2, ...) interact through a test Broker; the Broker is supported by a Matchmaker, an Ontology management component and the UDDI Registry, and is accessed through a GUI.]
A Typical Scenario: Car Insurance Broker
[Scenario diagram: a requester uses Bank B's GUI to reach the Car Insurance Broker's (CIB) service interface; the CIB's F-services and T-services interact with the F-services and T-services of insurers A1, A2, ..., An, with testers T1 and T2 (each offering F-services and T-services), with a Test Broker, and with the WS Registry.]
How does it work?
Suppose the car insurance broker wants to search for insurers' web services and to test a web service before using it to make quotes for its customers.

[Diagram: the customer sends information about the car and the user to the Car Insurance Broker (CIB); the CIB obtains insurance quotes from an Insurer Web Service (IS). It is the integration of these two services that needs to be tested.]
[Sequence diagram of the scenario, involving the Car Insurance Broker CIB, the Test Broker TB, the Matchmaker, the Insurer Web Service IS (with its F-service and T-service), and the testing services TG (Test Case Generator) and TE (Test Executor). The numbered interactions are: 0. intended composition of services; 1-2. register service meta-data (F-service and T-service); 3. request test service; 4. search for testers; 5. list of testers; 6. request test service; 7. request service; 8. testing-related meta-data; 9. test results; 10. request test service; 11. request service; 12. testing-related meta-data; 13. test invocation of services; 14. results of the test invocations of services; 15. test case; 16. test report. A code sketch of this flow follows below.]
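The flow of that diagram, sketched as plain Java against hypothetical interfaces for the matchmaker, test-case generator and test executor. Every name here is an illustrative assumption, not the actual broker API.

import java.util.List;

/** Hypothetical collaborator interfaces, one per role in the scenario. */
interface Matchmaker   { List<String> searchTesters(String testTask); }
interface TestCaseGen  { List<String> generateTestCases(String serviceMetaData); }
interface TestExecutor { String executeTests(String serviceEndpoint, List<String> testCases); }

/** Sketch of the Test Broker's orchestration for one test request from the CIB. */
class TestBroker {
    private final Matchmaker matchmaker;
    private final TestCaseGen generator;
    private final TestExecutor executor;

    TestBroker(Matchmaker m, TestCaseGen g, TestExecutor e) {
        this.matchmaker = m; this.generator = g; this.executor = e;
    }

    String handleTestRequest(String serviceEndpoint, String serviceMetaData) {
        // Find testers whose capabilities match the requested testing task.
        List<String> testers = matchmaker.searchTesters("integration test of " + serviceEndpoint);
        if (testers.isEmpty()) return "No suitable tester found";
        // Have a test case generator produce test cases from the service meta-data.
        List<String> testCases = generator.generateTestCases(serviceMetaData);
        // Have a test executor invoke the service under test with those cases.
        String results = executor.executeTests(serviceEndpoint, testCases);
        // Compile and return the test report to the requester (the CIB).
        return "Test report for " + serviceEndpoint + ":\n" + results;
    }
}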
Key Issues to Automate Test Services
 How a testing service should be described, published and registered at a WS registry
 How a testing service can be searched for and retrieved automatically, even for testing dynamically bound services
 How a testing service can be invoked both by a human tester and by a program that dynamically discovers a service and then tests it before binding to it
 How testing results can be summarized and reported in forms that are suitable both for human beings to read and for machines to understand

These issues can be resolved by utilizing a software testing ontology (Zhu & Huo 2003, 2005).
STOWS: Software Testing Ontology for WS
 An ontology defines the basic terms and relations comprising the vocabulary of a topic area, as well as the rules for combining them to define extensions to the vocabulary
 STOWS is based on an ontology of software testing originally developed for agent-oriented software testing (Zhu & Huo 2003, 2005)
 The concepts of software testing are divided into two groups: basic concepts and compound concepts
 Knowledge about software testing is also represented as relations between concepts
STOWS (1): Basic concepts
 Tester: the particular party who carries out a testing activity
 Activity: the actions performed in the testing process, including test planning, test case generation, test execution, result validation, adequacy measurement, test report generation, etc.
 Artefact: the files, data, program code, documents, etc. involved in testing activities. An Artefact has a Location attribute expressed as a URL or URI
 Method: the method used to perform a test activity; test methods can be classified in a number of different ways
 Context: the context in which testing activities occur during software development to achieve various testing purposes; typical contexts include unit testing, integration testing, system testing, regression testing, etc.
 Environment: the hardware and software configurations in which the testing is to be performed
STOWS (2): Compound concepts
 Capability: describes what a tester can do:
   the activities that the tester can perform
   the context in which the activity is performed
   the testing method used
   the environment in which the testing is performed
   the required resources (i.e. the input)
   the output that the tester can generate

[UML diagram: a Capability is composed of exactly one Activity, an optional Context, a Method, an optional Environment, and Capability Data items, each an Artefact marked with a Capability Data Type (Input or Output).]
 Task: describes what testing service is requested (a model sketch of Capability and Task follows below):
   a testing activity to be performed
   how the activity is to be performed:
     the context
     the testing method to be used
     the environment in which the activity must be carried out
   the available resources
   the expected outcomes

[UML diagram: a Task is composed of exactly one Activity, an optional Context, a Method, an optional Environment, and one or more Task Data items, each an Artefact marked with a Task Data Type (Input or Output).]
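A minimal sketch of how the two compound concepts could be modelled as data types, assuming simplified string-valued basic concepts. This illustrates the structure described above, not the OWL-S encoding actually used by STOWS.

import java.util.List;

/** Simplified stand-in for the Capability/Task Data Type enumeration. */
enum DataKind { INPUT, OUTPUT }

/** An artefact involved in testing, with its location (URL/URI) and role. */
class Artefact {
    final String type;      // e.g. "test case", "test report"
    final String location;  // URL or URI of the artefact
    final DataKind kind;    // whether it is consumed or produced
    Artefact(String type, String location, DataKind kind) {
        this.type = type; this.location = location; this.kind = kind;
    }
}

/** Capability: what a tester can do (activity, optional context/method/environment, data). */
class Capability {
    final String activity;      // e.g. "test case generation"
    final String context;       // e.g. "integration testing", may be null
    final String method;        // e.g. "algebraic specification based"
    final String environment;   // e.g. "Java / Tomcat", may be null
    final List<Artefact> data;  // inputs required and outputs produced
    Capability(String activity, String context, String method,
               String environment, List<Artefact> data) {
        this.activity = activity; this.context = context; this.method = method;
        this.environment = environment; this.data = data;
    }
}

/** Task: what testing service is requested, mirroring the structure of Capability. */
class Task {
    final String activity;
    final String context;
    final String method;
    final String environment;
    final List<Artefact> data;
    Task(String activity, String context, String method,
         String environment, List<Artefact> data) {
        this.activity = activity; this.context = context; this.method = method;
        this.environment = environment; this.data = data;
    }
}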
STOWS (3): Relations between concepts
 Relationships between concepts are a very important part of the knowledge of software testing:
   Subsumption relations between testing methods
   Compatibility between artefacts’ formats
   Enhancement relations between environments
   Inclusion relations between test activities
   Temporal ordering between test activities
 How such knowledge is used:
   Instances of the basic relations are stored in a knowledge base as basic facts
   The testing broker uses them to search for test services through compound relations
Compound relations
 MorePowerful relation: between two capabilities
   MorePowerful(c1, c2) means that a tester with capability c1 can do all the tasks that can be done by a tester with capability c2
 Contains relation: between two tasks
   Contains(t1, t2) means that accomplishing task t1 implies accomplishing task t2
 Matches relation: between a capability and a task
   Matches(c, t) means that a tester with capability c can fulfil the task t (a matching sketch follows below)
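A sketch of how a broker might evaluate the Matches relation over simplified, string-valued capabilities and tasks, using a small fact base of method-subsumption relations. The data structures and the matching rule are illustrative assumptions, far simpler than the ontology-based reasoning used in STOWS.

import java.util.Collections;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

/** Simplified matcher: Matches(c, t) holds if the capability's activity equals the
    task's activity and the capability's method subsumes the task's method. */
class SimpleMatcher {
    /** Basic facts stored in the knowledge base: method -> methods it subsumes. */
    private final Map<String, Set<String>> subsumes = new HashMap<>();

    void addSubsumptionFact(String general, String specific) {
        subsumes.computeIfAbsent(general, k -> new HashSet<>()).add(specific);
    }

    /** Transitive closure of the subsumption facts (assumed to be acyclic). */
    boolean methodSubsumes(String general, String specific) {
        if (general.equals(specific)) return true;
        for (String s : subsumes.getOrDefault(general, Collections.<String>emptySet())) {
            if (methodSubsumes(s, specific)) return true;
        }
        return false;
    }

    boolean matches(String capActivity, String capMethod,
                    String taskActivity, String taskMethod) {
        return capActivity.equals(taskActivity)
            && methodSubsumes(capMethod, taskMethod);
    }

    public static void main(String[] args) {
        SimpleMatcher m = new SimpleMatcher();
        m.addSubsumptionFact("specification-based testing", "algebraic specification testing");
        System.out.println(m.matches("test case generation", "specification-based testing",
                                     "test case generation", "algebraic specification testing"));
    }
}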
Prototype Implementation
 STOWS is represented in OWL-S:
   Basic concepts as XML data definitions
   Compound concepts defined as service profiles
 A UDDI/OWL-S registry server acts as the test broker:
   Using the OWL-S/UDDI Matchmaker
 The environment:
   Windows XP, Intel Core Duo CPU 2.16 GHz
   JDK 1.5, Tomcat 5.5 and MySQL 5.0
Case Study:
 The automated software testing tool CASCAT is wrapped into a test service (a wrapper sketch follows below):
   Registered: its capability is described using the ontology and represented in OWL-S
   Searchable: it can be found when a testing task matches its capability
   Invokable through the Internet: as a web service that generates test cases based on algebraic specifications
 A web service and its corresponding test service have been implemented:
   Both are registered
   Testing of the WS can be invoked through the corresponding T-service
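A rough illustration of what wrapping a testing tool as a web service can look like, using standard JAX-WS annotations. The class name, method and data formats are hypothetical; this is not the actual CASCAT wrapper.

import javax.jws.WebMethod;
import javax.jws.WebService;

/** Hypothetical JAX-WS wrapper exposing a test case generator as a T-service. */
@WebService(serviceName = "TestCaseGenerationService")
public class TestCaseGenerationService {

    /** Generates test cases from an algebraic specification supplied by the caller
        and returns them as an XML document (both formats are assumptions). */
    @WebMethod
    public String generateTestCases(String algebraicSpecification) {
        // In a real wrapper this would call into the underlying testing tool.
        return "<testCases spec-length='" + algebraicSpecification.length() + "'/>";
    }
}

Published with a JAX-WS runtime (for example via javax.xml.ws.Endpoint.publish), such a wrapper obtains a WSDL description that can then be registered with the broker like any other service.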
Conclusion
 Challenges to testing web service applications:
   Testing a web service as the developer’s own software
   Integration testing at development time and at run-time
   No support in the current WS standards stack
 A service-oriented approach is proposed:
   The architecture fits well into the service-oriented architecture
   It is supported by a software testing ontology
 The feasibility of the approach has been demonstrated through a case study
Advantages
 An automated process that meets the requirements of on-the-fly service integration testing:
   Automation without human involvement
   Testing without interfering with the provision of normal functional services
   Testing without affecting the real-world state
 Security and IPR can be managed through a certification and authentication mechanism for third-party specialised testing services
 Business opportunities for testing tool vendors and software testing companies to provide testing services online as web services
Remaining challenges and future work
 Technical challenges:
   To develop a complete ontology of software testing (e.g. covering the formats of the many different representations of testing-related artefacts)
   To implement the test brokers efficiently
   To devise the mechanism of certification and authentication for testing services
 Social challenges:
   For the approach to be practically useful, it must be adopted by web service developers, testing tool vendors and software testing companies
   Standards are needed, such as a standard software testing ontology
References
 Zhang, Y. and Zhu, H., Ontology for Service Oriented Testing of Web Services, Proc. of the Fourth IEEE International Symposium on Service-Oriented System Engineering (SOSE 2008), Dec. 18-19, 2008, Taiwan. In press.
 Zhu, H., A Framework for Service-Oriented Testing of Web Services, Proc. of COMPSAC’06, Sept. 2006, pp. 679-691.
 Zhu, H. and Huo, Q., Developing a Software Testing Ontology in UML for a Software Growth Environment of Web-Based Applications, Chapter IX in Software Evolution with UML and XML, Hongji Yang (ed.), IDEA Group Inc., 2005, pp. 263-295.
 Zhu, H., Cooperative Agent Approach to Quality Assurance and Testing Web Software, Proc. of QATWBA’04/COMPSAC’04, Sept. 2004, IEEE CS, Hong Kong, pp. 110-113.
 Zhu, H., Huo, Q. and Greenwood, S., A Multi-Agent Software Environment for Testing Web-based Applications, Proc. of COMPSAC’03, 2003, pp. 210-215.