The Reliability of Usability Tests: Black Art or Science? Paul Naddaff. Bentley University


1.1 INTRODUCTION

The field of user-centered design research can be broadly divided into three main categories: testing, inspection, and inquiry. Each category employs various methods that can be used to evaluate how well a given design supports users in completing tasks, to assess the general usability of a system, or to explore users' needs and understanding of that system. This paper focuses primarily on the category of testing and discusses the reliability of its core method, the usability test.

1.2 USABILITY TESTING OVERVIEW

Usability testing (UT) has long been viewed as a core method of user-centered design research. In a usability test, a testing team selects representative users to interact with a system or design, either to accomplish a set list of tasks or to explore it openly (Dumas & Redish, 1999). While the tasks are being attempted, the practitioners observe and record the specific experiences, both positive and negative, each user has in completing a given task. This data is later analyzed, and feedback is delivered to the development team and other stakeholders.

Broadly speaking, usability testing as a data collection method is not fully reliable on its own. As Wilson (2006) points out, usability testing is one of many methods used in user research, and using multiple methods can provide different perspectives on the same problems. The convergence of those methods can lead to true, actionable, and reliable insight. Relying too heavily on any one method, usability testing in particular, is not sound, because many aspects of a given system may not be evaluated thoroughly enough, if at all; problems will go undetected. Based on the literature, there are many reasons why a usability test may not be as reliable as practitioners once thought. Designing something that is simple to use is extremely difficult, and there is no formula.

Usability testing attempts to perfect the design of something that will be used by imperfect beings; it is no wonder that it is not perfectly reliable. That said, we will examine specific reasons for this lack of reliability, and potential methods for making usability testing more reliable. The main areas we will discuss are:

1. The evaluator effect
2. Synchronization between client and testing team
3. Task selection and formulation
4. Who to test?
5. Problem severity
6. Making change happen

2.1 THE EVALUATOR EFFECT

The evaluator effect refers to the way multiple evaluators record differing findings when analyzing the same usability test sessions, or when evaluating the same system (in the case of expert evaluation). Humans, no matter how well trained, will make errors and show variability in judgment. Many studies have scientifically examined this effect (Jacobsen, 1998; Kessner, 2001; Lindgaard, 2007; Molich, 1998). The validation of the evaluator effect sent shockwaves through the foundations of the usability testing community because it suggests that the method as a whole is not as repeatable as once thought. In the scientific community, a method must generate reproducible results to be considered reliable; this is an imperative property of an established methodology (Kessner, 2001). Molich has led a series of studies targeted specifically at analyzing the different methods used by professional usability testing labs. These Comparative Usability Evaluation (CUE) studies have consistently shown striking differences in approach, reporting, and findings between the labs (Molich, 1998), with a surprising amount of variability in the results. For example, in CUE-4, only 9 of the 340 different usability issues detected by the 17 participating teams were reported by more than half of the teams (Molich & Dumas, 2006). 205 (60%) of the reported issues were each reported by a single team; no other team reported the same issue. Statistics like this run throughout the CUE studies as well as the similar studies cited earlier.
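The kind of overlap statistic reported in CUE-4 is easy to compute once each team's findings are treated as a set of issue identifiers. The sketch below uses invented toy data, not the CUE-4 dataset; the team names, issue ids, and the Jaccard-based "any-two agreement" metric are all illustrative assumptions.

```python
from itertools import combinations

# Hypothetical findings: which issues (by id) each of five evaluation
# teams reported for the same system. Toy data, not the CUE-4 dataset.
findings = {
    "team_a": {1, 2, 3, 4},
    "team_b": {1, 5, 6},
    "team_c": {2, 7},
    "team_d": {1, 8, 9},
    "team_e": {10},
}

all_issues = set().union(*findings.values())

# Fraction of issues reported by only one team (CUE-4 reported ~60%).
counts = {i: sum(i in s for s in findings.values()) for i in all_issues}
unique_fraction = sum(1 for c in counts.values() if c == 1) / len(all_issues)

# Any-two agreement: average Jaccard overlap between pairs of teams,
# one common way of quantifying the evaluator effect.
def jaccard(a, b):
    return len(a & b) / len(a | b)

pairs = list(combinations(findings.values(), 2))
any_two = sum(jaccard(a, b) for a, b in pairs) / len(pairs)

print(f"unique-issue fraction: {unique_fraction:.0%}")
print(f"mean any-two agreement: {any_two:.2f}")
```

Running metrics like these across real study data makes the evaluator effect concrete: a high unique-issue fraction and a low pairwise agreement signal exactly the variability the CUE studies describe.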

The most important response to the evaluator effect is to know that it exists and to act accordingly. As mentioned earlier, Wilson highlights the importance of using multiple evaluation methods to triangulate, or converge upon, product requirements and issues. Each method a user research team uses has its strengths and weaknesses; it is up to the team to advocate for the resources needed to run the appropriate methods for a given design.

2.2 SYNCHRONIZATION BETWEEN CLIENT AND TEST TEAM

While close client-testing team integration is not critical for every evaluation (for example, an expert evaluation of a competitor's website), specific and focused requests from the client lead to more overlap in the findings of the testing team (Kessner, 2001). Usability testing is not a perfect science, this much is clear, but when dealing with humans, imperfect beings, the more insight a testing team can gather before building a test plan, the better. While clients may not always have as deep an understanding of their customers as they think they do, the knowledge they do have should not be discounted. The earlier in the design process this synchronization happens, the better.

2.3 TASK SELECTION AND FORMULATION

One of the most widely discussed, researched, and debated aspects of usability testing is the question of how many test participants are necessary to reveal the maximum number of usability problems in a system over a series of usability tests (Virzi, 1992). The topic has been put to rest and resurrected, and highly regarded experts in the field are still not in perfect agreement. The specific number of test participants required, however important, is only part of the equation when designing a reliable usability test. Even if a research lab had all the time and resources in the world to test millions of users, the test would still not necessarily be perfect.

Task selection and formulation is perhaps the most important (and difficult) aspect of designing a usability test. Evidence of this importance is not hard to find. For example, Cockton & Woolrych (2001) developed concurrent tests of the same system to determine whether multiple passes would uncover new problems; they did. A task is
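The sample-size debate that Virzi (1992) opened rests on a simple discovery model: if each user independently hits a given problem with probability p, then n users are expected to find 1 - (1 - p)^n of the problems. The sketch below uses p = 0.31, a commonly cited average from this literature, as an assumption rather than a fact about any particular system.

```python
# Classic problem-discovery model (Virzi, 1992): the expected share of
# problems found by n users is 1 - (1 - p)^n, where p is the average
# probability that a single user encounters a given problem.
# p = 0.31 is a frequently quoted average, assumed here for illustration.

def problems_found(n_users: int, p: float = 0.31) -> float:
    """Expected proportion of usability problems found by n_users."""
    return 1 - (1 - p) ** n_users

for n in (1, 3, 5, 10, 15):
    print(f"{n:2d} users -> {problems_found(n):.0%} of problems")
```

With these assumptions, five users already find roughly 84% of problems, which is why small-sample testing became the norm; the paper's point is that this curve says nothing about which problems those are, or whether two teams would find the same ones.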

merely a different way of looking at a system. Each task pokes and prods a design in a certain place and in a certain way, revealing strengths and weaknesses. Tasks should be designed to simulate a user's real-world experience with the product. By giving many sets of tasks to a small number of users, rather than giving many users the same limited set of tasks, the testing team will reliably reveal more issues (Lindgaard, 2001). This assumes, of course, that the tasks are well designed: a well-designed task probes a system's usefulness, efficiency, effectiveness, satisfaction, and accessibility (Rubin & Chisnell, 2008).

[Figures: % Problems Found vs. Task Coverage, and % Problems Found vs. Number of Users]

2.4 WHO TO TEST?

To increase the reliability of a usability test, it is important to test various types of potential users. For example, novice and experienced users will each interact with a system in a different way. Each user has a different mental model of a given system, and collecting that data helps the development team craft the design around those models. Selecting non-target users can also pay off: testing elderly consumers with arthritis while designing kitchen utensils led the design firm Smart Design to create an extremely successful collection of utensils for OXO (designcouncil.org).
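The task-coverage tradeoff above (many tasks for few users versus few tasks for many users) can be illustrated with a toy simulation. Everything here is an assumption for illustration: the number of latent issues, how many issues a task touches, and the per-user detection probability are invented, and the model is not from the paper.

```python
import random

random.seed(0)

# Illustrative simulation: a system has 50 latent issues, each task can
# expose only the handful of issues it touches, and a user exposes a
# touched issue with some probability. All parameters are assumptions.
N_ISSUES = 50
ISSUES_PER_TASK = 5       # each task can reveal at most 5 issues
P_HIT = 0.4               # chance a user exposes an issue a task touches

def make_tasks(n_tasks):
    return [random.sample(range(N_ISSUES), ISSUES_PER_TASK)
            for _ in range(n_tasks)]

def run(tasks, n_users):
    found = set()
    for _ in range(n_users):
        for task in tasks:
            found.update(i for i in task if random.random() < P_HIT)
    return len(found)

many_tasks_few_users = run(make_tasks(12), n_users=3)
few_tasks_many_users = run(make_tasks(3), n_users=12)

print("12 tasks x 3 users found:", many_tasks_few_users)
print("3 tasks x 12 users found:", few_tasks_many_users)
```

The structural point survives any reasonable choice of parameters: three tasks can only ever touch at most 15 of the 50 issues, so adding users past a point buys nothing, while spreading more tasks over fewer users keeps widening coverage, which is the shape of the two curves in the figures above.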

2.5 PROBLEM SEVERITY

Even a poor usability test will uncover some problems. However, a testing method that uncovers only minor problems (assuming there are severe problems to be discovered) is not of much use (Lindgaard, 2007). A study by Hertzum and Jacobsen (2003) used multiple expert evaluators to examine, in part, whether the evaluator effect applied as much to severe problems as to minor ones. Evaluators in that study were slightly more likely to find severe problems than problems overall (24% vs. 16%, respectively). Another study, by Kessner (2001), tells a slightly more favorable story for the reliability of expert evaluators: as can be seen in the chart below, severe problems had more overlap than minor problems. This matters because it illustrates the ability of usability testing to reliably detect severe problems. As mentioned earlier, no one method is going to detect all problems.

Another aspect of problem severity is the question of when a problem is severe enough to count as a problem. This depends strongly on what stage of development the system is in. A relatively early iteration of a system will be more plagued by problems, large and small, than a near-final iteration. It is advised that in early iterations, a task that fewer than 70% of users complete successfully be treated as problematic (Rubin & Chisnell, 2008). In later iterations of the system, the testing team should aim for a 95% success rate. This focus on defining severe problems increases reliability by providing a firm framework for testing teams to work within, and it relates strongly to section 2.6 below.

2.6 MAKING CHANGE HAPPEN

In Molich's CUE-4, 340 usability problems were found on the hotel website: 340 different issues that a development team is being asked to address. It is possible that the practitioners in this case were being too sensitive (Molich & Dumas, 2006). At a certain point, a testing team has to decide which problems it wants to take issue with and which are not worth fighting for. If somebody told you they had 340 unique issues with something you created, you might not be open to hearing what they have to say. The ultimate goal of usability testing is to make usability changes happen, not just to test and report on them. While usability testing is part art and part science, delivering the results of a test in a productive manner definitely requires tact. While user-centric change does not happen by means of any one test, it is the raison d'être of the field, and thus the ability of practitioners to reliably make change happen cannot be overlooked. It is important to build partnerships with internal teams in marketing, engineering, and management early in the design process. It is also advisable to apply usability methods to those internal teams in an effort to learn more about their goals, priorities, customer contacts, and customer data; doing so can help the strategic penetration of usability within organizations (Rosenbaum, 2000).

3.1 CONCLUSION

This paper has presented numerous facts and pieces of evidence that paint a relatively bleak picture of the reliability of usability testing.
Some endearingly call it a black art (Lindgaard, 2000), while others continue to revere it as the gold standard of user-centered design research. Regardless of how reliable it is or is not, we keep using it. Most advanced fields are at first considered a concoction of magic and science until they are fully understood.

While usability testing is not the silver bullet of the design process, it provides a large amount of information about a design, much of it useful. In the cause of usability, doing something is almost always better than doing nothing (Gray & Salzman, 1998). Today, it appears that many companies do not do any sort of user research before launching a product; usability testing has proven itself extremely valuable despite its drawbacks. By taking the above considerations into account, usability testing teams will be more likely to produce reliable and repeatable results in their testing.

REFERENCES

Cockton, G. & Woolrych, A. (2001). Understanding inspection methods: Lessons from an assessment of heuristic evaluation. In A. Blandford & J. Vanderdonckt (Eds.), People & Computers XV. Springer-Verlag.

Dumas, J.S. & Redish, J.C. (1999). A practical guide to usability testing (Rev. ed.). Exeter, UK: Intellect.

Hertzum, M. & Jacobsen, N.E. (2003). The evaluator effect: A chilling fact about usability evaluation methods. International Journal of Human-Computer Interaction, 15(1). Lawrence Erlbaum Associates.

Jacobsen, N.E., Hertzum, M., & John, B.E. (1998). The evaluator effect in usability studies: Problem detection and severity judgments. Human Factors and Ergonomics Society 42nd Annual Meeting, 5-9 October 1998. Santa Monica: Human Factors and Ergonomics Society.

Kessner, M., Wood, J., Dillon, R.F., & West, R.L. (2001). On the reliability of usability testing. Conference on Human Factors in Computing Systems: CHI 2001 (extended abstracts), 31 March - 5 April 2001. Seattle: ACM Press.

Molich, R., Bevan, N., Butler, S., Curson, I., Kindlund, E., Kirakowski, J., & Miller, D. (1998). Comparative evaluation of usability tests. Usability Professionals Association 1998 Conference, June 1998. Washington, DC: Usability Professionals Association.

Molich, R. & Dumas, J.S. (in press). Comparative Usability Evaluation (CUE-4). Behaviour & Information Technology. Taylor & Francis.

OXO. Retrieved 10/30/11 from Good-Grips/User-research/

Rosenbaum, S., Rohn, J., & Humburg, J. (2000). A toolkit for strategic usability: Results from workshops, panels, and surveys. Conference on Human Factors in Computing Systems: CHI 2000. New York: ACM Press.

Rubin, J. & Chisnell, D. (2008). Handbook of usability testing: How to plan, design and conduct effective tests (2nd ed.). Indianapolis, IN: Wiley Publishing, Inc.

Virzi, R.A. (1992). Refining the test phase of usability evaluation: How many subjects are enough? Human Factors, 34.

Wilson, C. (2006). Triangulation: The explicit use of multiple methods, measures, and approaches for determining core issues in product development. Interactions, Nov-Dec 2006.


How to Choose the Right UX Methods For Your Project

How to Choose the Right UX Methods For Your Project How to Choose the Right UX Methods For Your Project Bill Albert, Ph.D. Director, Bentley Design and Usability Center May 4, 2011 walbert@bentley.edu 781.891.2500 www.bentley.edu/usability Motivation Our

More information

A short introduction to. designing user-friendly interfaces

A short introduction to. designing user-friendly interfaces A short introduction to designing user-friendly interfaces Usability is often ignored until it becomes a problem Introduction This booklet is about Usability and User Experience design. It is aimed at

More information

Usability Testing. November 14, 2016

Usability Testing. November 14, 2016 Usability Testing November 14, 2016 Announcements Wednesday: HCI in industry VW: December 1 (no matter what) 2 Questions? 3 Today Usability testing Data collection and analysis 4 Usability test A usability

More information

The Analysis and Proposed Modifications to ISO/IEC Software Engineering Software Quality Requirements and Evaluation Quality Requirements

The Analysis and Proposed Modifications to ISO/IEC Software Engineering Software Quality Requirements and Evaluation Quality Requirements Journal of Software Engineering and Applications, 2016, 9, 112-127 Published Online April 2016 in SciRes. http://www.scirp.org/journal/jsea http://dx.doi.org/10.4236/jsea.2016.94010 The Analysis and Proposed

More information

PLEASE SCROLL DOWN FOR ARTICLE

PLEASE SCROLL DOWN FOR ARTICLE This article was downloaded by: [Det Kgl Bibl Nationalbibl og Kbh Univ] On: 15 September 2008 Access details: Access Details: [subscription number 776111357] Publisher Taylor & Francis Informa Ltd Registered

More information

ITIL implementation: The role of ITIL software and project quality

ITIL implementation: The role of ITIL software and project quality ITIL implementation: The role of ITIL software and project quality Tom Roar Eikebrokk Department of Information Systems University of Agder Kristiansand, Norway tom.eikebrokk@uia.no Abstract: This research

More information

Speed and Accuracy using Four Boolean Query Systems

Speed and Accuracy using Four Boolean Query Systems From:MAICS-99 Proceedings. Copyright 1999, AAAI (www.aaai.org). All rights reserved. Speed and Accuracy using Four Boolean Query Systems Michael Chui Computer Science Department and Cognitive Science Program

More information

Usability Report. Author: Stephen Varnado Version: 1.0 Date: November 24, 2014

Usability Report. Author: Stephen Varnado Version: 1.0 Date: November 24, 2014 Usability Report Author: Stephen Varnado Version: 1.0 Date: November 24, 2014 2 Table of Contents Executive summary... 3 Introduction... 3 Methodology... 3 Usability test results... 4 Effectiveness ratings

More information

Standard Glossary of Terms used in Software Testing. Version 3.2. Foundation Extension - Usability Terms

Standard Glossary of Terms used in Software Testing. Version 3.2. Foundation Extension - Usability Terms Standard Glossary of Terms used in Software Testing Version 3.2 Foundation Extension - Usability Terms International Software Testing Qualifications Board Copyright Notice This document may be copied in

More information

Automate Transform Analyze

Automate Transform Analyze Competitive Intelligence 2.0 Turning the Web s Big Data into Big Insights Automate Transform Analyze Introduction Today, the web continues to grow at a dizzying pace. There are more than 1 billion websites

More information

Usability Report Cover Sheet

Usability Report Cover Sheet Usability Report Cover Sheet Project Overall Project: MLibrary Website Project Title: Search/Browse Card Sort Committee & Members Usability Group: Suzanne Chapman (chair), Shevon Desai, Kat Hagedorn, Julie

More information

Acurian on. The Role of Technology in Patient Recruitment

Acurian on. The Role of Technology in Patient Recruitment Acurian on The Role of Technology in Patient Recruitment Wearables smartphones social networks the list of new technological tools available to patients and healthcare providers goes on and on. Many clinical

More information

Verification and Validation. Ian Sommerville 2004 Software Engineering, 7th edition. Chapter 22 Slide 1

Verification and Validation. Ian Sommerville 2004 Software Engineering, 7th edition. Chapter 22 Slide 1 Verification and Validation 1 Objectives To introduce software verification and validation and to discuss the distinction between them To describe the program inspection process and its role in V & V To

More information

Step-by-Step Instructions for Pre-Work

Step-by-Step Instructions for Pre-Work Step-by-Step Instructions for Pre-Work We recommend that you schedule 20-25 hours to complete your pre-work assignment. Please track your hours. We will ask you to record them on your pre-work evaluation

More information

ACL Interpretive Visual Remediation

ACL Interpretive Visual Remediation January 2016 ACL Interpretive Visual Remediation Innovation in Internal Control Management SOLUTIONPERSPECTIVE Governance, Risk Management & Compliance Insight 2015 GRC 20/20 Research, LLC. All Rights

More information

CS 160: Evaluation. Professor John Canny Spring /15/2006 1

CS 160: Evaluation. Professor John Canny Spring /15/2006 1 CS 160: Evaluation Professor John Canny Spring 2006 2/15/2006 1 Outline User testing process Severity and Cost ratings Discount usability methods Heuristic evaluation HE vs. user testing 2/15/2006 2 Outline

More information

CS 160: Evaluation. Outline. Outline. Iterative Design. Preparing for a User Test. User Test

CS 160: Evaluation. Outline. Outline. Iterative Design. Preparing for a User Test. User Test CS 160: Evaluation Professor John Canny Spring 2006 2/15/2006 1 2/15/2006 2 Iterative Design Prototype low-fi paper, DENIM Design task analysis contextual inquiry scenarios sketching 2/15/2006 3 Evaluate

More information

User-centered design in technical communication

User-centered design in technical communication User-centered design in technical communication Information designer & information architect Sharing knowledge is better than having it. Tekom - TC Europe November 19-20, 2003 Nov. 19-20, 2003 User-centered

More information

The Changing. Introduction. IT Business Brief. Published By IT Business Media Cofounders Jim Metzler

The Changing. Introduction. IT Business Brief. Published By IT Business Media   Cofounders Jim Metzler MAY 19, 2004 Branch Office Networking IT Business State-of-the-Market Brief By Jim Metzler The Changing Branch Office Network State of the Market Brief Introduction Branch offices are key to the success

More information

Choosing the Right Usability Tool (the right technique for the right problem)

Choosing the Right Usability Tool (the right technique for the right problem) Choosing the Right Usability Tool (the right technique for the right problem) User Friendly 2005 December 18, Shanghai Whitney Quesenbery Whitney Interactive Design www.wqusability.com Daniel Szuc Apogee

More information

Joint Application Design & Function Point Analysis the Perfect Match By Sherry Ferrell & Roger Heller

Joint Application Design & Function Point Analysis the Perfect Match By Sherry Ferrell & Roger Heller Joint Application Design & Function Point Analysis the Perfect Match By Sherry Ferrell & Roger Heller Introduction The old adage It s not what you know but when you know it that counts is certainly true

More information

Course Information

Course Information Course Information 2018-2020 Master of Information Systems: Management and Innovation Institutt for teknologi / Department of Technology Index Index... i 1... 1 1.1 Content... 1 1.2 Name... 1 1.3 Programme

More information

Hyper Mesh Code analyzer

Hyper Mesh Code analyzer Hyper Mesh Code analyzer ABSTRACT Hyper Mesh Code Analyzer (HMCA) is a text based programming environment, designed for programmers to write their source code in a more organized and manageable fashion.

More information

June 2017 intel.com schneider-electric.com

June 2017 intel.com schneider-electric.com DCIM Solution Deployment June 2017 intel.com schneider-electric.com DCIM Solution Deployment Introduction Current state of data center management Do you currently have a solution deployed? 20% 80% The

More information

Evaluation and Design Issues of Nordic DC Metadata Creation Tool

Evaluation and Design Issues of Nordic DC Metadata Creation Tool Evaluation and Design Issues of Nordic DC Metadata Creation Tool Preben Hansen SICS Swedish Institute of computer Science Box 1264, SE-164 29 Kista, Sweden preben@sics.se Abstract This paper presents results

More information

THINK THE FDA DOESN T CARE ABOUT USER EXPERIENCE FOR MOBILE MEDICAL APPLICATIONS? THINK AGAIN.

THINK THE FDA DOESN T CARE ABOUT USER EXPERIENCE FOR MOBILE MEDICAL APPLICATIONS? THINK AGAIN. THINK THE FDA DOESN T CARE ABOUT USER EXPERIENCE FOR MOBILE MEDICAL APPLICATIONS? THINK AGAIN. When granting regulatory approvals for medical devices, both IEC 62366 and the FDA have emphasized the importance

More information

HCI and Design SPRING 2016

HCI and Design SPRING 2016 HCI and Design SPRING 2016 Topics for today Heuristic Evaluation 10 usability heuristics How to do heuristic evaluation Project planning and proposals Usability Testing Formal usability testing in a lab

More information

Shaping User Experience

Shaping User Experience A Workshop presented 1 April 2012 for Electronic Resources and Libraries 2012 by Tara Carlisle, University of North Texas Libraries Kate Crane, Texas Tech University Ana Krahmer, University of North Texas

More information

Usability Testing CS 4501 / 6501 Software Testing

Usability Testing CS 4501 / 6501 Software Testing Usability Testing CS 4501 / 6501 Software Testing [Nielsen Normal Group, https://www.nngroup.com/articles/usability-101-introduction-to-usability/] [TechSmith, Usability Basics: An Overview] [Ginny Redish,

More information

Step by Step Instructions for Pre Work

Step by Step Instructions for Pre Work Step by Step Instructions for Pre Work We recommend that you schedule 20 25 hours to complete your pre work assignment. Please track your hours. We will ask you to record them on your pre work evaluation

More information

ArticlesPlus Launch Survey

ArticlesPlus Launch Survey University of Michigan Deep Blue deepblue.lib.umich.edu 2011-07-25 ArticlesPlus Launch Survey Chapman, Suzanne http://hdl.handle.net/2027.42/106781 Project ArticlesPlus Launch Survey Report Info Report

More information

Integrating Usability Design and Evaluation: Training Novice Evaluators in Usability Testing

Integrating Usability Design and Evaluation: Training Novice Evaluators in Usability Testing Integrating Usability Design and Evaluation: Training Novice Evaluators in Usability Testing Mikael B. Skov and Jan Stage Department of Computer Science Aalborg University Aalborg Øst, Denmark +45 9635

More information

Preliminary Findings. Vacation Packages: A Consumer Tracking and Discovery Study. Exploring Online Travelers. November 2003

Preliminary Findings. Vacation Packages: A Consumer Tracking and Discovery Study. Exploring Online Travelers. November 2003 Exploring Online Travelers Vacation Packages: A Consumer Tracking and Discovery Study Preliminary Findings November 2003 PhoCus Wright +1 860 350-4084 www.phocuswright.com Exclusive preview for PhoCusWright

More information

Case Study: Successful Adoption of a User-Centered Design Approach During the Development of an Interactive Television Application

Case Study: Successful Adoption of a User-Centered Design Approach During the Development of an Interactive Television Application Case Study: Successful Adoption of a User-Centered Design Approach During the Development of an Interactive Television Application Sheri Lamont Microsoft Corporation One Microsoft Way Redmond, WA U.S.A.

More information

Foundation Level Syllabus Usability Tester Sample Exam Answers

Foundation Level Syllabus Usability Tester Sample Exam Answers Foundation Level Syllabus Usability Tester Sample Exam s Version 2017 Provided by German Testing Board Copyright Notice This document may be copied in its entirety, or extracts made, if the source is acknowledged.

More information

Rapid Software Testing Guide to Making Good Bug Reports

Rapid Software Testing Guide to Making Good Bug Reports Rapid Software Testing Guide to Making Good Bug Reports By James Bach, Satisfice, Inc. v.1.0 Bug reporting is a very important part of testing. The bug report, whether oral or written, is the single most

More information

Toward an Automated Future

Toward an Automated Future 2017 State of the Network Engineer: Toward an Automated Future netbraintech.com Executive Summary Today s enterprises have reached a tipping point when it comes to network management. Networks are growing

More information

Usability Inspection Report of NCSTRL

Usability Inspection Report of NCSTRL Usability Inspection Report of NCSTRL (Networked Computer Science Technical Report Library) www.ncstrl.org NSDL Evaluation Project - Related to efforts at Virginia Tech Dr. H. Rex Hartson Priya Shivakumar

More information

The Study on Cost Comparisons of Various Card Sorting Methods

The Study on Cost Comparisons of Various Card Sorting Methods The Study on Cost Comparisons of Various Card Sorting Methods Jiann-Cherng Shieh, National Taiwan Normal University, Taiwan Chih-Hwei Lu, National Taiwan Normal University, Taiwan Yi-Ching Wu, National

More information

The Bizarre Truth! Automating the Automation. Complicated & Confusing taxonomy of Model Based Testing approach A CONFORMIQ WHITEPAPER

The Bizarre Truth! Automating the Automation. Complicated & Confusing taxonomy of Model Based Testing approach A CONFORMIQ WHITEPAPER The Bizarre Truth! Complicated & Confusing taxonomy of Model Based Testing approach A CONFORMIQ WHITEPAPER By Kimmo Nupponen 1 TABLE OF CONTENTS 1. The context Introduction 2. The approach Know the difference

More information

Improving user interfaces through a methodological heuristics evaluation framework and retrospective think aloud with eye tracking

Improving user interfaces through a methodological heuristics evaluation framework and retrospective think aloud with eye tracking Improving user interfaces through a methodological heuristics evaluation framework and retrospective think aloud with eye tracking Progress Report Supervisors: Dr. Tom Gedeon Mr. Christopher Chow Principal

More information

User Experience Report: Heuristic Evaluation

User Experience Report: Heuristic Evaluation User Experience Report: Heuristic Evaluation 1 User Experience Report: Heuristic Evaluation Created by Peter Blair for partial fulfillment of the requirements for MichiganX: UX503x Principles of Designing

More information

Evaluation in Information Visualization. An Introduction to Information Visualization Techniques for Exploring Large Database. Jing Yang Fall 2005

Evaluation in Information Visualization. An Introduction to Information Visualization Techniques for Exploring Large Database. Jing Yang Fall 2005 An Introduction to Information Visualization Techniques for Exploring Large Database Jing Yang Fall 2005 1 Evaluation in Information Visualization Class 3 2 1 Motivation What are the advantages and limitations

More information

USER-CENTERED DESIGN KRANACK / DESIGN 4

USER-CENTERED DESIGN KRANACK / DESIGN 4 USER-CENTERED DESIGN WHAT IS USER-CENTERED DESIGN? User-centered design (UCD) is an approach to design that grounds the process in information about the people who will use the product. UCD processes focus

More information

Up and Running Software The Development Process

Up and Running Software The Development Process Up and Running Software The Development Process Success Determination, Adaptative Processes, and a Baseline Approach About This Document: Thank you for requesting more information about Up and Running

More information

GOMS Lorin Hochstein October 2002

GOMS Lorin Hochstein October 2002 Overview GOMS Lorin Hochstein lorin@cs.umd.edu October 2002 GOMS is a modeling technique (more specifically, a family of modeling techniques) that analyzes the user complexity of interactive systems. It

More information