external service with respect to the service under
test. A late change can be detected at the URL level
or from the message header of any service message.
Tarhini et al. [15] proposed a safe regression
testing technique based on an event dependency
graph. Naslavsky et al. [16] proposed a fine-grained
traceability relationship between model elements
and the test cases that traverse those elements; it is
used to locate test cases for retest and to support
model-based regression test selection. Tsai et al.
[17] proposed a verification mechanism for the
UDDI server, including check-in and check-out of
web services; the UDDI server has a test
infrastructure consisting of a test master, test agents
and a test monitor to perform web service testing
remotely. Mezhi et al. [18] presented an SOA model
based on an autonomic registry, composed of a
controller named the autonomic registry manager
and a traditional UDDI registry; they extend
WS-Policy with QoS data and additional
information such as service-specific adaptation
actions. Boghdady et al. [19] proposed an
XML-based automatic test case generation approach
from activity diagrams. Yuan et al. [20] proposed a
model-driven approach to generating executable test
cases from a given business process. There are two
key transformations in their approach: Process
Under Test (PUT) to Abstract Test Case (ATC), and
Abstract Test Case (ATC) to Executable Test Case
(ETC). Here, the PUT is visualized as an activity
diagram and the ATC as a sequence diagram.
III. PROPOSED REGRESSION TESTING
PROCESS
Here we define a regression testing process with
the help of UML use case diagrams and activity
diagrams. A use case diagram helps us break our
requirements into short stories that are easy to
understand. Use cases focus on the user of the
system and describe the ways the system can be
used by the user. In an activity diagram, nodes
represent user actions, conditions and system
outputs, and edges represent transitions from node
to node.
Our approach differs from the existing
approaches in that our testing process generates test
cases from activity nodes. It also helps to identify
changes in existing services by detecting changes in
the node versions of the activity diagram of the
service. Fig. 1 shows the regression testing process
of an SOA-based application. We briefly explain the
steps of our testing process below:
Step 1. The first step of the regression testing
process is to identify atomic services and composite
services from the given service descriptions. An
atomic service has a fine-grained structure, so
further decomposition is not possible; its
implementation is self-contained and does not
invoke any other service. A composite service, in
contrast, is a service whose implementation calls
other services, aggregating child services into a
bigger service [10].
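The atomic/composite distinction of Step 1 can be sketched in code. The following is a minimal, illustrative Python sketch, assuming the service descriptions have already been reduced to a map from each service to the services it invokes; all service names here are made up:

```python
# Hypothetical sketch of Step 1: a service that invokes no other service is
# atomic; one whose implementation calls other services is composite.
def classify_services(invocations):
    """invocations maps each service name to the set of services it calls."""
    atomic = {s for s, calls in invocations.items() if not calls}
    composite = {s for s, calls in invocations.items() if calls}
    return atomic, composite

invocations = {
    "GetQuote": set(),                      # self-contained -> atomic
    "PlaceOrder": {"GetQuote", "Billing"},  # calls others   -> composite
    "Billing": set(),
}
atomic, composite = classify_services(invocations)
# atomic == {"GetQuote", "Billing"}, composite == {"PlaceOrder"}
```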
Step 2. After identifying the atomic and
composite services, we build use case diagrams that
cover all composite services. This helps us identify
the black-box functional requirements from which
test cases can be generated.
Step 3. After building the use case diagrams, we
draw an activity diagram for each use case. With the
help of the activity diagram we can identify the
business process workflow, business logic,
functional processes and the flow within a use case.
Based on it, we can generate test cases that suit our
service requirements.
Fig. 1. Regression Testing Process of SOA Based
Application.
Step 4. While building an activity diagram, we
assign a unique node version to each activity node.
Suppose an activity diagram initially has five
nodes; we then assign a unique node version to
each of them (say A1 to A5). If any change occurs
in the service specification, the corresponding
activity node version is changed; for example, a
change at activity node A1 changes its version to
A1.1. This allows us to identify the changes in a
service that evolve from changes made in the
service requirements.
Steps 4.1 to 4.4 are optional steps associated with
Step 4. If any change occurs through Step 4.1
(Modify Existing Node), Step 4.2 (Add New Node),
Step 4.3 (Delete Existing Node) or Step 4.4 (Shift
Node), that change affects the node versions of
Step 4. After identifying the modified part with the
help of the node versions, we can select test cases
for that modified part by adopting our test case
selection technique, which is discussed in
Section IV.
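The node-versioning idea of Step 4 and its sub-steps can be sketched as follows. This is an illustrative Python sketch, assuming a simple base.minor version string per node; the node names and versions are made up, not taken from the paper:

```python
# Hedged sketch of Step 4: each activity node carries a version tag
# (A1, A2, ...); a modification bumps it (A1 -> A1.1), and changed nodes are
# found by comparing two snapshots of the diagram's node->version map.
def bump(version):
    """A1 -> A1.1, A1.1 -> A1.2 (minor revision on each modification)."""
    if "." in version:
        base, minor = version.rsplit(".", 1)
        return f"{base}.{int(minor) + 1}"
    return f"{version}.1"

def changed_nodes(old, new):
    """Nodes added, deleted, or whose version differs between snapshots."""
    return {node for node in old.keys() | new.keys()
            if old.get(node) != new.get(node)}

old = {"start": "A1", "validate": "A2", "ship": "A3"}
new = {"start": "A1", "validate": bump("A2"), "ship": "A3", "notify": "A4"}
changed = changed_nodes(old, new)
# changed == {"validate", "notify"}: one modified node, one added node
```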
Step 5. Identify test cases based on the defined
requirements and constraints, the business process
workflow, the data flow, the decision points and the
control flow between services.
Step 6. After identifying the test cases, we build
a regression test suite (say T) that includes the set
of test cases before and after service evolution.
IV. PROPOSED SOA TESTING
PERSPECTIVE MODEL (STPM)
SOA testing is a multi-agent, multi-phase
process. Testing SOA imposes different needs and
challenges on the different stakeholders involved in
the testing activities, and each stakeholder has his
own testing perspective. For example, service
developers test the services in order to release
highly reliable service-based applications. The
service provider tests the services to ensure that
they meet the service level agreements agreed upon
with the customer. The end user is concerned with
the services they use as per their requirements
[12] [13].
We have named our proposed model the "SOA
Testing Perspective Model (STPM)". The model has
three perspectives: the Service Developer
perspective, the Service Tester perspective and the
Service Provider perspective. The model we
propose (Fig. 2) supports multi-agent testing; the
agents in this model are the service developer, the
service tester and the service provider. The model
also focuses on service validity when a service is
registered in the UDDI.
Model Explanation:
The STPM is a composite model consisting of the
Service Developer Model (SDM), the Service
Tester Model (STM) and the Service Provider
Model (SPM).
In the sub-model SDM, service developers know
the internal structure of the service and have
knowledge of the service specification. They test
the services in terms of service functionality,
quality of service and interaction with other
services. Service developers deliver both the
interface and the implementation of the services
and are responsible for detecting bugs in order to
release reliable services.
In a service-based application, services may
invoke other services to fulfill clients' needs.
Among these services, some may evolve, so there is
a need to test the service composition. A Web
Services Description Language (WSDL) document
contains information about the service interface
name, service operation names, operation input
parameters, operation return values, service
message formats, the service location and the data
to be transmitted. In our STM sub-model, the
service tester gets information about the service
from the WSDL document.
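As a rough illustration of how a tester might extract such information, the sketch below pulls operation names out of a WSDL 1.1 fragment using Python's standard library; the tiny WSDL document here is a made-up example, not one from the paper:

```python
# Illustrative sketch: extract service operation names from a WSDL 1.1
# document. The WSDL content below is a hypothetical minimal example.
import xml.etree.ElementTree as ET

WSDL_NS = "{http://schemas.xmlsoap.org/wsdl/}"

wsdl = """<definitions xmlns="http://schemas.xmlsoap.org/wsdl/" name="Quote">
  <portType name="QuotePort">
    <operation name="getQuote"/>
    <operation name="listSymbols"/>
  </portType>
</definitions>"""

root = ET.fromstring(wsdl)
# Every <operation> element declared under the portType:
operations = [op.get("name") for op in root.iter(WSDL_NS + "operation")]
# operations == ["getQuote", "listSymbols"]
```

A real tester would fetch the WSDL from the service endpoint and also read message formats and parameter types, but the same namespace-qualified traversal applies.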
In the sub-model STM, service testers execute
and manage the test cases. Service testers first
identify the changes in the services. After
identifying the changes, the tester needs to identify
the missed coverage items, and then to select and
execute the test cases that cover the missed
coverage items in the service. The execution order
should be such that test cases covering more items
are executed first.
Let S be the service and S1 its evolved version.
A tester can identify the missed coverage items by
executing a test case t against the evolved version
of the service and comparing the results of t with
those from the preceding version of the service.
Our test case selection technique helps to test the
missed coverage items.
Test Case Selection Technique:
Step 1: Let T be the prioritized test suite and
t ⊆ T, where t = {t1, t2, t3, ..., tn} is the
set of test cases executed before service
evolution.
Step 2: Execute test cases other than t from T to
cover all missed coverage items after
service evolution. Let this test case set be
T', where T' = T - t.
Step 3: If needed, add additional test cases, say
t', to T to cover the missed coverage
items. t' may cover additional coverage
items that are not yet covered by
previously executed test cases.
Step 4: Check the pre-conditions and
post-conditions, based on the 'Design by
Contract' document, for each test input
and test output.
Step 5: If the pre-conditions and post-conditions
of the test cases are satisfied, continue
executing the test cases until all items
are covered;
else
go to Step 3.
Step 6: Exit.
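A minimal sketch of Steps 1-3, combined with the coverage-based execution order described above (test cases covering more still-missing items run first), might look like the following in Python; the test identifiers and coverage data are illustrative, and contract checking (Steps 4-5) is omitted:

```python
# Hedged sketch of the selection technique: from T' = T - t, greedily pick
# the test case covering the most still-missing items until all missed
# coverage items are covered or no candidate helps (Step 3 would then add t').
def select_and_order(T, executed, missed, coverage):
    """coverage maps each test case to the set of items it covers."""
    candidates = [tc for tc in T if tc not in executed]  # T' = T - t
    remaining, ordered = set(missed), []
    while remaining:
        best = max(candidates,
                   key=lambda tc: len(coverage[tc] & remaining),
                   default=None)
        if best is None or not coverage[best] & remaining:
            break  # Step 3: additional test cases t' would be needed here
        ordered.append(best)
        remaining -= coverage[best]
        candidates.remove(best)
    return ordered, remaining  # execution order, still-uncovered items

coverage = {"t1": {"a"}, "t2": {"b"}, "t3": {"b", "c"}, "t4": {"c", "d"}}
ordered, uncovered = select_and_order(
    ["t1", "t2", "t3", "t4"], executed={"t1"},
    missed={"b", "c", "d"}, coverage=coverage)
# ordered == ["t3", "t4"], uncovered == set(): t2 is skipped because t3
# already covers item "b".
```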
Fig. 2. SOA Testing Perspective Model (STPM)
In the SPM, the service provider publishes
services in the UDDI. Before publishing, it is
necessary to test a service so that the service
provider can guarantee the quality of the registered
services. The service provider generates test cases
from a model provided by the service developer and
submits them to the UDDI to confirm service
standards. During registration of a service, the
server uses the provided test cases to test the
service. The service provider also sets various
criteria and matching rules for service consumers
and ensures that only authorized service consumers
are able to use the service. There is also a need to
develop test cases for the notification mechanism,
which ensures that when a service evolves or a new
version of the service is registered in the
infrastructure, an automatic notification is sent to
the service consumers. Changes to the UDDI
registry are made by authenticated users;
unauthenticated users can access the UDDI registry
for read-only purposes.
Information is available and searchable in the
UDDI registry. The UDDI contains contact
information about the service provider's company,
stored in the form of business name, address,
contact information and other short descriptive
information about the service provider. There is a
category partition for web services in the UDDI
based on web service functionalities; each category
represents a different quality of service (QoS)
attribute.
UDDI also contains information about the
specific adaptation policy which is defined by the
service provider at design time [19].
Fig. 3 describes the type of information that is
available and searchable in the UDDI registry.
Fig. 3. The information available in the UDDI
This information can be used to create a
checklist. It is the responsibility of the service
provider to request the UDDI server to create
another checklist if any changes occur in the
service specification. When such changes occur in
the UDDI, a notification must be sent to the service
consumers, who can then use the modified and/or
updated services. So there is a need to continuously
monitor the services. Monitoring the services also
helps to identify non-working services present in
the UDDI. When the service provider finds
non-working services, they can be removed from
the UDDI, making the UDDI more trustworthy.
V. CONCLUSION AND FUTURE WORK
Our contribution through this paper is a
regression testing process and an SOA testing
perspective model, which help to test complex
SOA-based applications and to validate services
when they are registered in the UDDI. To the best
of our knowledge, no SOA testing perspective
model has been proposed before. We believe that
the model we have developed for SOA regression
testing could be adopted by software industries, and
this constitutes our future research.
VI. REFERENCES
[1] Antonia Bertolino, Andrea Polini. "The Audition
Framework for Testing Web Services Interoperability", In
Proceedings of 31st EUROMICRO Conference on
Software Engineering and Advanced
Applications(EUROMICRO-SEAA'05), 2005.
[2] Izzat Alsmadi, Sascha Alda. "Test Case Reduction and
Selection Optimization in Testing Web Services",
International Journal of Information Engineering and
Electronic Business, MECS, October 2012.
[3] Hai Huang, Rick A. Mason. "Model Checking
Technology for Web Services", In Proceedings of The
Fourth IEEE Workshop on Software Technology for
Future Embedded and Ubiquitous Systems and Second
International Workshop on Collaborative Computing,
Integration and Assurance (SEUS- WCCIA 06), IEEE
2006.
[4] Testing Service-Oriented Architecture (SOA)
Applications and Services, White Paper, HP,
www.hp.com/go/software.
[5] W. T. Tsai, Y. Chen, R. Paul, N. Liao and H. Huang.
" Cooperative and Group Testing in Verification of
Dynamic Composite Web Services", In Proceedings of
28th Annual International Computer Software and
Applications Conference (COMPSAC’04) 0730-
3157/04 , IEEE 2004.
[6] Xiaoying Bai, Wenli Dong. "WSDL-Based Automatic
Test Case Generation for Web Services Testing", In
Proceedings of International Workshop on Service
Oriented System Engineering (SOSE’05)0- 7695-
2438-9/05, IEEE 2005.
[7] Tamim Ahmed Khan, Reiko Heckel,"A Methodology for
Model-Based Regression Testing of Web Services",
In Proceedings of Testing: Academic and Industrial
Conference- Practice and Research Techniques,
IEEE 2009.
[8] Athira B, Philip Samuel. "Web Services Regression Test
Case Prioritization", In Proceedings of International
Conference on Computer Information Systems and
Industrial Management Applications (CISIM),
IEEE, 2010.
[9] Rajani Kanta Mohanty, Binod Kumar Pattanyak,
Bhagabat Puthal, Durga Prasad Mohapatra." A Road
Map to Regression Testing of Service Oriented
Architecture (SOA) Based Applications", Journal of
Theoretical and Applied Information Technology
Vol.36 No.1 15th February 2012.
[10] Bobby Woolf: Web Sphere SOA and JEE in
Practice,www.ibm.com/developerworks/community/
blogs/woolf/?lang=en.
[11] Orest Pilskalns, Gunay Uyan and Anneliese
Andrews."Regression Testing UML Design", In
Proceedings of 22nd IEEE International Conference on
Software Maintenance ( ICSM-06), IEEE,2006.
[12] Poonkavithai Kalamegam and Zayaraz Godandapani.
"A Survey on Testing SOA Built Using Web Services",
International Journal of Software Engineering and
Its Applications, Vol.6, No.4, October 2012.
[13] Massimiliano Di Penta, Marcello Bruno and
Gerardo Canfora. "Web Service Regression Testing",
RCOST - Research Centre on Software Technology,
University of Sannio, Palazzo Ex Poste, via Tralanno,
82100 Benevento, Italy, May 2007.
[14] Lijun Mei, Kezhai Bojiang, W.K.Chan and T.H.Tse."
Preemptive Regression Test Scheduling Strategies: A
New Testing Approach to Thriving on The Volatile
Service Environments", In Proceedings of 36th
International Conference on Computer Software and
applications, IEEE 2012.
[15] Abbas Tarhini, Zahi Ismail and Nashat Mansour."
Regression Testing Web Application", In Proceedings of
International Conference on Advanced Computer Theory
and Engineering, IEEE, 2008.
[16] Leila Naslavsky, Hadar Ziy and Debra J.Richardson. "
A Model-Based Regression Test Selection Technique",
In Proceedings of ICSM 2009, Edmonton, Canada, IEEE
2009.
[17] W.T.Tsai, R.Paul, Z.cao, L.Yu, A.Saimi, B.Xiao.
"Verification of Web Services Using an Enhanced
UDDI Server", In Proceedings of The Eighth IEEE
International Workshop on Object-Oriented Real-Time
Dependable Systems, IEEE, 2003.
[18] Haithem Mezhi, Walid Chainbi, Khaled Ghedira,
"An Autonomic Registry-Based SOA Model", IEEE,
2011.
[19] Pankinam N. Boghdady, Nagwa L.Badr and Mohamed
F.Tolba, " An Enhanced Test Case Generation
Technique Based on Activity Diagrams", IEEE, 2011.
[20] Qiulu Yuan, Jiwu Chao Liu and Lizhang. " A Model
Driven Approach Toward Business Process Test Case
Generation", IEEE, 2008.