WEB SERVICES TESTING CHALLENGES AND APPROACHES


Sana Azzam, CIS Department, IT Faculty, Yarmouk University, Irbid, Jordan (sana_azzam@yahoo.com)
Mohammed Naji Al-Kabi, CIS Department, IT Faculty, Yarmouk University, Irbid, Jordan (mohammedk@yu.edu.jo)
Izzat Alsmadi, CIS Department, IT Faculty, Yarmouk University, Irbid, Jordan (ialsmadi@yu.edu.jo)

Abstract- The web is evolving and expanding continuously, and its services are becoming more complex, more public, and more usable. One of the major benefits of, and challenges in, web services is offering those services in a useful, flexible, effective, and secure way. In this paper we investigate web services, present their generic structure, and discuss examples of and challenges in testing them.

Keywords- Software testing, web services, e-transactions, e-services, service security, performance, reliability.

I. INTRODUCTION

Web Services technology is one of the most significant web technologies today, and it is an important topic in the field of software testing. Accordingly, different tools have been developed and designed to work with different types of Web Services, which has motivated web experts to propose different methods and tools to test new Web Services efficiently. In this study, we compare three important tools for testing Web Services: SoapUI, PushToTest, and WebInject.

A Web Service (WS) can be defined as a service-oriented architecture, or a software system designed to provide interoperable application-to-application interaction over a network, where the systems interact using the Simple Object Access Protocol (SOAP). SOAP exchanges XML-based messages over HTTP or HTTPS, so a WS is addressed programmatically rather than through a GUI.
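As a concrete sketch of the SOAP exchange just described, the following Python snippet builds a SOAP 1.1 envelope for a hypothetical GetWeather operation and parses a canned XML reply. The service name, operation, and namespace are illustrative assumptions, not one of the services tested in this paper; a real client would POST the envelope to the service endpoint over HTTP(S).

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
SVC_NS = "http://example.com/weather"  # hypothetical service namespace

def build_request(city):
    # A SOAP 1.1 envelope for a hypothetical GetWeather operation;
    # a client would POST this to the service endpoint over HTTP(S).
    return (
        '<?xml version="1.0" encoding="utf-8"?>'
        f'<soap:Envelope xmlns:soap="{SOAP_NS}"><soap:Body>'
        f'<GetWeather xmlns="{SVC_NS}"><City>{city}</City></GetWeather>'
        '</soap:Body></soap:Envelope>'
    )

def parse_response(xml_text):
    # Pull the result element out of the XML reply the service returns.
    root = ET.fromstring(xml_text)
    result = root.find(f".//{{{SVC_NS}}}GetWeatherResult")
    return result.text

# A canned reply, standing in for what the server would send back.
sample_reply = (
    f'<soap:Envelope xmlns:soap="{SOAP_NS}"><soap:Body>'
    f'<GetWeatherResponse xmlns="{SVC_NS}">'
    '<GetWeatherResult>Sunny, 27 C</GetWeatherResult>'
    f'</GetWeatherResponse></soap:Body></soap:Envelope>'
)

print(parse_response(sample_reply))  # -> Sunny, 27 C
```

Testing tools such as those compared below automate exactly this request/response cycle, generating envelopes from the service's WSDL instead of by hand.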
The user requests a service from a specific Web Service through its Application Programming Interface (API), and the Web Service returns the response to the user in a formal XML form [1][2][3][4]. In a WS, the client communicates with a service on the server through a program.

Testing Web Services is a significant problem that should be studied carefully; the testing has to be extensive and comprehensive across all important levels (unit, component, and system). Some recent studies show that the error rate of new software programs ranges from 3 to 6 errors per 1000 statements [8]. Because Web Services are an important topic in the field of software testing, different tools have been developed and designed to enhance their use. In this research we test three such tools: SoapUI, PushToTest, and WebInject. These tools are used primarily for collecting, creating, and testing Web Services.

The major question of this research is: "Which is the best tool for testing Web Services?" This paper determines the efficiency of each tool (such as SoapUI) on a group of available and newly created Web Services. The proposed process is as follows: understanding the tools selected for the comparison; applying the created and collected Web Services to each selected tool; and comparing the selected tools based on a set of factors (such as testing coverage). The results of this paper answer the main question and identify the most efficient tools for testing different Web Services.

ICCIT 2012

II. BACKGROUND

Web Services provide many means of interoperation between different software applications and can work on many frameworks and platforms [4]. The architecture of WS is a logical evolution of component architecture and of object-oriented analysis and design. It is designed to achieve three main purposes: first, providing a conceptual model and context and clarifying the relationships among their elements; second, describing a number of characteristics common to Web Services; third, supporting interoperability with older applications [5][6][7].

Many tools have been implemented for testing Web Services, but few approaches have been proposed to study the quality of these tools. This study is based on three open-source tools that are considered effective software solutions for testing Web Services. The next subsections briefly describe each of them.

SoapUI Tool

SoapUI is a Java-based open-source tool that can run on any platform provided with a Java Virtual Machine (JVM). The tool is implemented mainly to test Web Services such as SOAP-, REST-, HTTP-, and JMS-based services. Although SoapUI concentrates on functionality, it also considers performance, interoperability, and regression testing [11].

PushToTest Tool

One objective of this open-source tool is to support reusability and sharing among the people involved in software development by providing a robust testing environment. PushToTest is implemented primarily for testing Service-Oriented Architecture (SOA), Ajax, Web applications, Web Services, and many other applications. The tool adopts a methodology used in many reputable companies, consisting of four steps: planning, functional testing, load testing, and result analysis. PushToTest can determine the performance of Web Services and report the broken ones; it can also recommend solutions to performance problems [12].

WebInject Tool

WebInject is used to test Web applications and services. It can report testing results in real time and monitor applications efficiently. Furthermore, the tool supports sets of multiple test cases and can analyze them in reasonable time. WebInject is written in Perl and works on any platform with a Perl interpreter. Its architecture comprises the WebInject engine and a graphical user interface (GUI); test cases are written in XML files, and results are presented in HTML and XML files [13].

III. LITERATURE REVIEW

Several studies have been conducted to improve the testing of Web Services. This section describes related work in the domain, starting with techniques that have been proposed for testing Web Services.

Ashok Kumar et al. proposed an Automated Regression Suite model. The model parses the WSDL file and generates SOAP requests, showing how the testing of any Web Service in any environment can be automated by comparing the golden (expected) responses with the SOAP responses; the number of methods that passed or failed is identified through a generated response report [9].

Sneed et al. developed WSDLTest for testing Web Services. The technique generates a request and adjusts it based on a pre-condition, then dispatches the requests and captures the responses [8].

Siblini et al. proposed a technique that relies on mutation analysis for testing Web Services. It applies mutation operators to the WSDL document in order to generate mutated Web Service interfaces, and tests Web Services by discovering errors in the WSDL interface and in the Web Services themselves. Nine mutation operators, mutation groups (switch, special, occurrence), and the WSDL document are used in the proposed technique [2].

Automatic test generation from GUI applications for testing Web Services is another approach, proposed by Conroy et al. This novel generic approach generates test cases from reference legacy GUI applications (GAPs) by extracting data from the GUI, and then applies these tests to Web Services to reduce the time required [10].

IV. GOALS AND APPROACHES

To use the selected tools to test the collected Web Services, we have to apply these Web Services in the tools; in other words, the collected Web Services have to be tested by each tool. Each testing tool has its own method of testing Web Services: importing the Web Services into the tool, dealing with the tool's options involved in the testing process, and finally showing the results of the analysis in the supported format. This means that we have to understand in depth the whole process of applying Web Services and obtaining the result report from each tool.

1. Results Collection and Analysis

Testing tools generate their results based on testing criteria, including functionality, performance (loading), interoperability, and many others. Performance testing has several types, and discussing them helps clarify how the performance of a specific Web Service is tested. Table 1 shows the different types of performance testing.

Table 1: Types of Performance Testing

  Type                 Definition
  Baseline Testing     Normal-mode performance testing.
  Load Testing         High-load performance testing.
  Stress Testing       Exceptional-load testing.
  Soak Testing         Long-period testing.
  Scalability Testing  Large-volume testing.

2. Evaluation and Comparison

Comparing different Web Service testing tools is a complex task. The testing criteria are not the same across the selected tools, which means one tool may be able to test interoperability, for example, while another does not test that criterion. Furthermore, some criteria may be affected by factors unrelated to the testing tools or the Web Services; for example, the performance of the internet connection may affect the loading test of a Web Service and therefore bias the comparison between tools. To compare the selected testing tools, some factors were chosen as the basis of the comparison process. Table 2 shows the comparison factors.

Table 2: Comparison Factors

  Factor            Details
  Ease of Use       Is the tool user friendly or not?
  Testing Coverage  How many criteria can the tool test?
  Functionality     What is the testing tool's response time?
  Performance       How does the tool deal with loading when testing Web Services?
  Throughput        How many Web Services can the tool handle in a period of time?

Our method for comparing the testing tools consists of the following steps: testing each of the collected Web Services with the three selected tools, taking the testing criteria into consideration; analyzing the testing results of each tool based on the comparison factors; and reporting the results across the testing tools. The comparison results help us identify the more efficient testing tools. Note that a tool may be the best at testing one criterion and the worst at testing another.

3. Discussing the Results

The tools used to test the Web Services vary in the time they need to react; in other words, some tools take less time than others to test the Web Services. In the evaluation of the three selected tools, the response time of each tool for each Web Service was calculated. Table 3 shows the response time of each tool for the tested Web Services, and Figure 1 shows the average response time of each tool over all Web Services. The results show that SoapUI is the fastest tool in responding to and testing the Web Services, PushToTest is the second fastest, and WebInject is the slowest.

Table 3: Response Time for the Web Services

  Web Service         Response time for each tool (ms)
  Weather Forecast    1237     832      1120
  Global Weather      1198     943      428
  Send Fax            1264     1412     755
  Currency Convertor  1603     692      776
  Country Details     1869     1364     606
  Bible Web Service   1401     1542     373
  Text to Braille     958      1526     631
  Weight Unit Con.    977      601      369
  Com. Unit Con.      981      635      619
  V. Credit Card      929      394      639
  Statistic           932      895      804
  Stock Quote         953      742      682
  Fedach              1329     683      565
  Send SMS World      1272     842      478
  V. Email Address    966      732      596
  Overall average     1029.14  813.761  519

Figure 1: Average response time of each tool (1029.14, 813.761, and 519.88 ms).

4. The Performance Comparison Factor

The performance (load testing) of the tools is a critical factor in the comparison process. The tests conducted in this study examine the normal load of each of the three tools under consideration, where several requests are sent per minute, and determine how each tool withstands the load. Table 4 shows the performance of each tool when testing the Web Services, and Figure 2 shows the average performance of each tool over all Web Services. The results show that SoapUI has better performance than PushToTest and WebInject, while PushToTest performs better than WebInject.
Table 4: Tools Performance

  Web Service         Performance (loading test) in ms
  Weather Forecast    1271.8   860      399.98
  Global Weather      1115     1040     447.81
  Send Fax            1215.8   1286     1459.35
  Currency Convertor  1119.6   960      1382.46
  Country Details     1028.6   1534     6
  Bible Web Ser.      1544.6   1260     379.46
  Text to Braille     977.8    850      650
  Weight Unit C.      951.2    700      367.69
  Com. Unit Con.      1152.4   800      940
  V. Credit Card      1066.8   360      353.09
  Statistic           1082.4   553      377.72
  Stock Quote         884.8    462      459.62
  Fedach              1028.4   1046     1635.55
  Send SMS            1388     930      337.71
  V. Email Add.       1164     560      321.58
  Overall average     983.70   730.047  556.751
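The kind of response-time and load measurement behind Tables 3 and 4 can be sketched in a few lines of Python. This is a simplified illustration, not the actual procedure used by the tools: call_service is a stand-in for dispatching a real SOAP request, and the request counts and simulated latency are arbitrary assumptions.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def call_service(request_id):
    # Stand-in for posting one SOAP envelope and awaiting the reply;
    # a real load test would send an HTTP request to the service here.
    time.sleep(0.01)  # simulated service latency (10 ms)
    return request_id

def load_test(n_requests=50, concurrency=10):
    def timed_call(i):
        t0 = time.perf_counter()
        call_service(i)
        return (time.perf_counter() - t0) * 1000.0  # latency in ms

    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        latencies = list(pool.map(timed_call, range(n_requests)))
    elapsed = time.perf_counter() - start

    return {
        "avg_response_ms": sum(latencies) / len(latencies),
        "throughput_rps": n_requests / elapsed,  # requests per second
    }

stats = load_test()
print(f"average response: {stats['avg_response_ms']:.1f} ms, "
      f"throughput: {stats['throughput_rps']:.1f} req/s")
```

The same run also yields throughput, used later in the comparison, as the number of completed requests divided by the elapsed wall-clock time.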

Figure 2: Average performance of each tool (983.70, 730.05, and 556.75 ms).

5. The Throughput Comparison Factor

Throughput can be defined as the number of requests per second; handling a high number of requests under low load indicates high performance. This research studies the throughput of each tool on the selected Web Services, but only SoapUI and PushToTest support this kind of testing. Table 5 shows the throughput of the SoapUI and PushToTest tools for the tested Web Services; SoapUI has the better average throughput.

Table 5: Throughput of SoapUI and PushToTest (requests per second)

  Web Service              SoapUI  PushToTest
  USA Weather Forecast     4.13    1.6
  Global Weather           4.02    0.86
  Send Fax                 0.84    0.81
  Currency Convertor       0.82    1.26
  Country Details          6.8     0.78
  Bible Web Service        4.21    0.88
  Text to Braille          6.04    1.68
  Weight Unit Convertor    4.4     5.5
  Computer Unit Convertor  1.32    4.9
  Validate Credit Card     4.49    4.4
  Statistic                4.42    6.5
  Stock Quote              4.09    4.8
  Fedach                   2.04    0.82
  Send SMS World           4.54    3.9
  Validate Email Address   4.64    5.1
  Average                  4.169   3.9947

6. The Ease of Use Comparison Factor

The study of the testing tools shows that SoapUI is the most user-friendly of the selected tools, thanks to the excellent graphical user interface it provides for testing Web Services. PushToTest can be used with a little effort, but it is still more complex than SoapUI. WebInject is the most complex tool to use: it requires considerable effort because it needs a configuration process, and since it does not provide a graphical user interface, it pushes the user to the command line.

7. The Testing Coverage Comparison Factor

The study of the three tools shows that SoapUI and PushToTest can cover more cases within the same testing criterion than WebInject. For example, on the loading side, SoapUI and PushToTest can test a Web Service under multiple cases: baseline, load, stress, soak, and scalability testing. SoapUI also has a big advantage over the other two tools in that it supports a high level of security.

Overall, the results rank the three testing tools, from best to worst across all factors, as: SoapUI, PushToTest, and WebInject. SoapUI outperforms PushToTest and WebInject on all comparison factors, and by a wide margin: it has excellent response time, performance, and throughput, it is easy to use, and it covers multiple cases within the same criterion. PushToTest is a good testing tool even though it could not outperform SoapUI on any factor; it achieved good values in response time and performance. WebInject has the lowest values on the comparison factors; it cannot test Web Services with the same quality as the other tools, and it is complex and difficult to use.

V. CONCLUSION AND FUTURE WORK

In this paper, several selected Web Service testing tools were evaluated on a group of web services, based on several known important quality factors, including performance, reliability, security, and throughput. The results obtained from the evaluation show that SoapUI outperforms PushToTest and WebInject on all the comparison factors; likewise, PushToTest outperforms WebInject on the compared factors. In general, this work suggests using SoapUI to test different Web Services because of its good performance and the high quality of its tests.

References

[1] "Web Service (WS)", visited 11 October 2010, http://en.wikipedia.org/wiki/web_service.
[2] Siblini, R.; Mansour, N. (2005), "Testing Web Services", ACS/IEEE 2005 International Conference on Computer Systems and Applications (AICCSA), pp. 135-vii.
[3] Martin, E.; Basu, S.; Xie, T. (2006), "Automated Robustness Testing of Web Services", in Proceedings of the 4th International Workshop on SOA and Web Services Best Practices.
[4] Bertolino, A.; Polini, A. (2005), "The Audition Framework for Testing Web Services Interoperability", in Proceedings of the 31st EUROMICRO Conference on Software Engineering and Advanced Applications.
[5] "What's a Web Service", visited 21 October 2010, http://www.w3.org/tr/ws-arch/#gengag.
[6] "Web Service architecture", visited 22 October 2010, https://www.ibm.com/developerworks/webservices/library/w-ovr/.
[7] Judith, M., "Web Service Architectures", published by Tect, 29 South LaSalle St., Suite 520, Chicago, Illinois 60603, USA.
[8] Sneed, H. M.; Huang, S. (2006), "WSDLTest: A Tool for Testing Web Services", Eighth IEEE International Symposium on Web Site Evolution.
[9] Kumar, A. S.; Kumar, P. G.; Dhawan, A. (2009), "Automated Regression Suite for Testing Web Services", International Conference on Advances in Recent Technologies in Communication and Computing, pp. 590-592.
[10] Conroy, K. M.; Grechanik, M.; Hellige, M.; Liongosari, E. S.; Xie, Q. (2007), "Automatic Test Generation from GUI Applications for Testing Web Services", IEEE International Conference on Software Maintenance (ICSM 2007), 2-5 October 2007, pp. 345-354.
[11] "SoapUI tool", visited 14 November, http://www.soapui.org.
[12] "PushToTest tool", visited 16 November, http://www.pushtotest.com.
[13] "WebInject", visited 30 October 2010, http://www.webinject.org/.