Experimental Evaluation of Effectiveness of E-Government Websites


A. Basit Darem 1, Dr. Suresha 2
1 Research Scholar, DoS in Computer Science, University of Mysore
2 Associate Professor, DoS in Computer Science, University of Mysore

Abstract — Usability of e-government websites is a crucial factor in improving the effectiveness, efficiency and satisfaction of services delivered to citizens. In this study, the effectiveness of an e-government website is measured using a usability testing approach. The aim of the study is to support the mission of government websites by evaluating the existing design of a government website. The results indicate an urgent need to improve the usability of the e-government website so that it becomes more effective for citizens.

Keywords — e-government, usability test, tasks, effectiveness, success rate, performance metrics.

I. INTRODUCTION

The number of citizens (ranging from novice to expert) seeking information and services online is increasing rapidly. These citizens expect government websites to save them time and money, and it is the responsibility of government to design easy-to-use websites that are usable by all types of citizens. Implementing the simple principle of having a website that works well and does not confuse or frustrate the user helps to reduce abandonment of the website by visitors; abandonment by users renders the website effectively unusable.

Usability is defined as the degree to which people (users) can successfully perform a set of required tasks [4]. Current web applications are complex and highly sophisticated software products whose usability can largely determine their success or failure [8]. Different definitions of usability have been proposed. Nielsen, in his book Usability Engineering, defined usability as a quality attribute that assesses how easy user interfaces are to use [9]. He adds that the word "usability" also refers to the methods for improving ease of use during the design process. The international standard ISO 9241 defines usability as the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use. In this definition, effectiveness means the accuracy and completeness with which users achieve specified goals, efficiency is the resources expended in relation to the accuracy and completeness with which users achieve goals, and satisfaction is the comfort and acceptability of use [6]. Usability problems are therefore aspects that make an application ineffective, inefficient, and difficult to learn and use. In the e-government context, ensuring usability means ensuring that the website interface is easy to use.

Usability is critical in e-government, as even a single usability problem can adversely affect millions of citizens and cost them time and money [5]. Maguire states that many poorly designed and unusable systems exist which users find difficult to learn and complicated to operate [7]. He adds that such systems are likely to be underused, misused or fall into disuse, with frustrated users maintaining their current working methods; the outcome is costly for the organization using the system [7]. Moreover, usability is a qualitative or quantitative measure of the relative factors with which a novice user interacts with an e-government website to accomplish the user's goal(s) [2].
The purpose of this study is to conduct a usability test to evaluate the effectiveness of current government websites at the local government (district) level in India. Local government is an important part of good governance, as most citizens are normally affected more by local governance than by the other layers of governance [3]. The study tries to determine to what extent e-government websites of local government in India are effective and easy to use for citizens (users). The aim of this evaluation is to better understand how website visitors use government websites. Performance metrics are powerful tools for evaluating the usability of any product. According to Tullis and Albert [10], there are five general types of performance metrics that a usability test can capture: 1) Task success metrics are used when researchers are interested in whether participants are able to complete tasks using the website.

2) Time-on-task is helpful when researchers are concerned with how quickly users can perform tasks with the website. 3) Errors are a useful measure based on the number of mistakes made while attempting to complete a task. 4) Efficiency is a way of evaluating the amount of effort (cognitive and physical) required to complete a task. 5) Learnability involves looking at how an efficiency metric changes over time. Learnability is useful if researchers want to examine how and when participants reach proficiency in using a product.

II. EXPERIMENTAL SETUP

To avoid unreliable and biased results, the design of a user test evaluation and its execution should be carefully planned and managed. The researcher followed the framework proposed in the literature ([8], [10], and [1]) to conduct the usability test and collect the data. The framework has the following steps: 1) define the goals of the test, 2) define the user sample to participate in the test, 3) select tasks and scenarios, 4) define how to measure usability, 5) prepare the material and the experimental environment. Each individual session consisted of a set of tasks and a questionnaire for the participants to complete. The individual evaluations took place in the following order: a) a performance evaluation in which each participant was asked to perform a series of real-life tasks, b) a questionnaire after each performance evaluation to gather additional insights from the participants about the Mysore District website.

Data Scoring: The data was scored using the approach found in the literature [10]. A task was counted as a Success if the participant was able to achieve the correct outcome, without assistance, within the time allotted for that task. The total number of successes was calculated for each task and then divided by the total number of times that task was attempted; the results are reported as a percentage. If the participant abandoned (gave up) the task, did not reach the correct answer, or performed it incorrectly, the task was counted as a Failure.

Participants: The test was conducted with a group of potential users who have knowledge of computers and the internet. Fourteen participants (2 female, 12 male) in the age group of 20 to 30 years volunteered for this study. All participants were familiar with the use of the internet (average experience of internet use was 3 years).

Procedure: Users were asked to complete a series of routine tasks. Sessions were recorded and analyzed to identify the success rate, gave-up rate, time spent on task, number of pages visited per task, etc. The Mysore district website (www.mysore.nic.in) was selected for this test; it serves as an information gateway for Mysore district citizens to obtain information and services. An online usability test was conducted using the live version of www.mysore.nic.in on the internet. The Loop11 online tool (available at www.loop11.com) was used to capture the participants' comments, navigation paths, heat maps, overall satisfaction ratings, questions and feedback. The usability test was intended to determine the extent to which the interface facilitates a user's ability to complete routine tasks.

Evaluation Tasks: The tasks were intended to be general, simple, and drawn from citizens' real daily needs. Test participants attempted completion of the tasks shown in Table I.

TABLE I - TASKS ATTEMPTED

Task 1: Find important centers of learning in Mysore.
Task 2: Download the application form for birth registration.
Task 3: Find the page that gives information about tourist places in Mysore.
Task 4: Find the Public Grievance System.
Task 5: Find the contact details of the Karnataka Urban Water Supply And Drainage Board.

Data Collection: The data were collected in an Excel file. The first step after exporting the data was to prepare the data for analysis. The second step was filtering the data. The filter was used to screen out participants who were not serious and completed the test without care. For this, we set a criterion with two thresholds: time spent per task of less than 5 seconds and only 1 page viewed per task. The filter used was: if (Page Views = 1 and Time spent < 5 sec) for more than 2 tasks, then the participant is considered not serious and the participant's data is deleted.
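For illustration, this filtering rule can be expressed as a short script. The following is a minimal sketch in Python, assuming the Loop11 export has been saved as a CSV with hypothetical columns participant, page_views and time_spent_sec; the file name and column names are illustrative and not part of the original study setup.

```python
import csv
from collections import defaultdict

def flag_non_serious(rows, max_suspicious_tasks=2, min_time_sec=5):
    """Return participant ids whose data should be dropped.

    A task attempt is 'suspicious' when the participant viewed only one
    page and spent less than `min_time_sec` seconds on it. A participant
    is flagged when this happens for more than `max_suspicious_tasks` tasks,
    mirroring the filter described in the text.
    """
    suspicious = defaultdict(int)
    for row in rows:
        if int(row["page_views"]) == 1 and float(row["time_spent_sec"]) < min_time_sec:
            suspicious[row["participant"]] += 1
    return {p for p, count in suspicious.items() if count > max_suspicious_tasks}

# Usage sketch: read the exported task-level data and drop flagged participants.
with open("loop11_export.csv", newline="") as f:  # hypothetical file name
    rows = list(csv.DictReader(f))

dropped = flag_non_serious(rows)
clean_rows = [r for r in rows if r["participant"] not in dropped]
```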

By applying the above filter, the data of participant number 10 was deleted and not included in the final analysis: he spent 2 seconds on each page and visited only 1 page per task, which means he navigated through the test by clicking the Next button only. The total number of participants included in the final analysis was 14; they were coded as P1 to P14.

III. RESULT AND DISCUSSION

Figure 1 shows a screenshot of the website used in this study. It can be accessed at www.mysore.nic.in.

Figure 1 - Mysore district website

Success: Success rates are best used to provide a general picture of how effectively the website supports users' goals and how much improvement is needed to make the website usable [1]. There are two ways of calculating task success rates: first, the percentage of tasks each participant completed successfully; second, the percentage of participants who completed each task successfully.

As we can see in Table II, the success rate for Task 1 is 50% (SD 0.519), and Tasks 3 and 4 scored 57% each (SD 0.514). Figure 2 shows a statistically significant difference (at the p = 0.05 level for the confidence intervals) between Task 2 and Tasks 1, 3 and 4, as well as between Task 5 and Tasks 1, 3 and 4.

TABLE II - TASK SUCCESS RATE

                            Task 1   Task 2   Task 3   Task 4   Task 5
Sum of successful tasks        7        0        8        8        0
Average                       50%       0%      57%      57%       0%
SD                           0.519      -      0.514    0.514      -
Upper bound                  77.2%      -      84.0%    84.0%      -
Lower bound                  22.8%      -      30.2%    30.2%      -
Confidence interval (95%)    27.2%      -      26.9%    26.9%      -

Figure 2 - Task success rates (error bars represent a 95% confidence interval for the mean)

The results indicate that Tasks 2 and 5 were very difficult: participants failed to download the birth registration form (Task 2) and to find the contact information of the water supply board (Task 5).

Figure 3 - Participants' task success rates
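As a reference for how figures of the kind reported in Table II can be obtained, the sketch below computes a per-task success rate, sample standard deviation, and a z-based 95% confidence interval from binary success data in Python. This normal-approximation interval (mean ± 1.96·SD/√n) is consistent with the Task 1 values in Table II, though the exact tool used in the original analysis is not specified.

```python
import math

def success_stats(successes, z=1.96):
    """Success rate, sample SD, and a z-based 95% CI for one task.

    `successes` is a list of 0/1 outcomes, one per participant.
    """
    n = len(successes)
    mean = sum(successes) / n
    # Sample standard deviation of the 0/1 outcomes.
    sd = math.sqrt(sum((x - mean) ** 2 for x in successes) / (n - 1))
    margin = z * sd / math.sqrt(n)
    return mean, sd, (max(mean - margin, 0.0), min(mean + margin, 1.0))

# Example with Task 1 from Table III: 7 of 14 participants succeeded.
task1 = [1, 1, 1, 1, 0, 0, 0, 0, 1, 0, 1, 1, 0, 0]
mean, sd, (lo, hi) = success_stats(task1)
print(f"success rate {mean:.0%}, SD {sd:.3f}, 95% CI [{lo:.1%}, {hi:.1%}]")
# -> success rate 50%, SD 0.519, 95% CI [22.8%, 77.2%]
```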

TABLE III - PARTICIPANTS' TASK SUCCESS RATES

Participant   Task 1   Task 2   Task 3   Task 4   Task 5   Mean   SD
P1              1        0        1        1        0      0.60   0.55
P2              1        0        1        1        0      0.60   0.55
P3              1        0        1        1        0      0.60   0.55
P4              1        0        1        1        0      0.60   0.55
P5              0        0        0        0        0      0.00   0
P6              0        0        1        0        0      0.20   0.45
P7              0        0        0        0        0      0.00   0
P8              0        0        1        1        0      0.40   0.55
P9              1        0        1        1        0      0.60   0.55
P10             0        0        0        1        0      0.20   0.45
P11             1        0        0        0        0      0.20   0.45
P12             1        0        1        1        0      0.60   0.55
P13             0        0        0        0        0      0.00   0
P14             0        0        0        0        0      0.00   0

Table III and Figure 3 show that, out of 14 participants, only 6 completed 60% of the tasks successfully, 1 participant completed 40%, and 3 participants completed 20% of the tasks. Four participants failed to complete any task successfully.

Gave-up rate: To measure the frustration of users with the website, we recorded the data of users who gave up on tasks. The gave-up rate indicates that participants became frustrated trying to complete a task and moved on to another task. The gave-up rate was calculated separately to quantify the rate of frustration, and it was then added to the failed-task results to obtain the overall failure rate.

Figure 4 - Gave-up rate for each task

Tasks 3 and 4 had no gave-ups, as they have the highest success rates, which indicates that they are easy to achieve. There is a statistically significant difference (at the p = 0.05 level for the confidence intervals) between Task 1 and Tasks 2 and 5. Of the 67% overall failure rate, around 23% is due to participants abandoning the tasks. Tasks 2 and 5 showed the highest gave-up rates, at 50% each, which indicates that these two tasks were difficult or confusing; their success rate was 0.

Figure 5 - Results of each participant for all tasks

Figure 5 shows the completion rate of each participant.

Summary of Data: Table IV below displays a summary of the test data. It shows a low success rate of around 33% and a high failure rate (23% gave-up and 44% failure to access the right page). The low success rate indicates that the effectiveness of the website is very low. In general, all participants found the mysore.nic.in website to be unclear, not straightforward, and not easy to use. The test identified many major problems, including the lack of categorization of topics on the home page, confusion over apparently duplicative usage of some terms, and confusion over terms and abbreviations.
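The per-task breakdown reported in Table IV (success, gave-up, fail) can be derived by tallying the three possible outcomes for each task. The following is a minimal sketch in Python; the outcome labels and the example split of 7 successes, 2 gave-ups and 5 failures for Task 1 are illustrative values inferred from the percentages in Table IV, not raw data from the study.

```python
from collections import Counter

def outcome_rates(outcomes):
    """Return (success, gave_up, fail) shares for one task.

    `outcomes` holds one label per participant, drawn from
    {'success', 'gave_up', 'fail'}; gave-ups and failures together
    form the overall failure rate.
    """
    counts = Counter(outcomes)
    n = len(outcomes)
    return tuple(counts[k] / n for k in ("success", "gave_up", "fail"))

# Example shaped like Task 1 in Table IV: 7 successes, 2 gave-ups, 5 failures
# out of 14 participants (roughly 50% / 14% / 36%).
task1 = ["success"] * 7 + ["gave_up"] * 2 + ["fail"] * 5
success, gave_up, fail = outcome_rates(task1)
print(f"success {success:.0%}, gave-up {gave_up:.0%}, fail {fail:.0%}")
```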

TABLE IV - OVERALL COMPLETION RATES

Task                                  Success   Gave-up   Fail
Educational institutes in Mysore        50%       14%      36%
Download the application form            0%       50%      50%
Tourist information                     57%        0%      43%
Public Grievance System                 57%        0%      43%
Find contact details                     0%       50%      50%
Average task completion rate            33%       23%      44%

The users also found many orphan pages from which they could not return to the home page or the previous page; the site has no site index or site map; and there is a lack of color contrast between text and background. Moreover, no search facility, help or FAQ page was available.

Figure 6 - Overall completion rates

IV. CONCLUSION

A usability test was conducted to evaluate the effectiveness of an e-government website at the district level in India. The live website of Mysore district was selected for this study. Fourteen participants completed the test, and each participant was given 5 tasks. The success rate across all tasks was 33%. The results indicate that this website has serious usability problems. There is an urgent need to improve the interface design, search and navigation of the e-government website to make it more effective and usable for citizens.

ACKNOWLEDGMENT

We would like to thank the software producers of Loop11 for the free use of their useful product, as well as the participants in this study.

REFERENCES

[1] Albert, B., Albert, W., Tullis, T., & Tedesco, D. (2010). Beyond the Usability Lab: Conducting Large-scale User Experience Studies. Morgan Kaufmann.
[2] Baker, D. L. (2004). E-government: Website usability of the most populous counties (Doctoral dissertation, Arizona State University).
[3] Basit, A. D., Suresha, & Al-Hashmi, A. (2011). Evaluation of the Use of Local Government Websites: Internet Users' Perspective. The IUP Journal of Information Technology, 7(2), 47-55.
[4] Brinck, T., Gergle, D., & Wood, S. D. (2002). Usability for the Web: Designing Web Sites that Work. Morgan Kaufmann Publishers.
[5] Katre, D. (2007). Indian E-Government Initiative: An Ideal Case for Universal Usability. Retrieved Dec 2, 2011, from http://www.hceye.org/usabilityinsights/?p=21
[6] ISO (1997). ISO 9241: Ergonomic Requirements for Office Work with Visual Display Terminals (VDTs), Parts 1-17.
[7] Maguire, M. (2001). Methods to support human-centred design. International Journal of Human-Computer Studies, 55(4), 587-634.
[8] Matera, M., Rizzo, F., & Carughi, G. (2006). Web usability: Principles and evaluation methods. Web Engineering, 143-180.
[9] Nielsen, J., & Hackos, J. T. (1993). Usability Engineering. San Diego: Academic Press.
[10] Tullis, T., & Albert, W. (2008). Measuring the User Experience: Collecting, Analyzing, and Presenting Usability Metrics. Amsterdam: Elsevier/Morgan Kaufmann Publishers.