Improving user interfaces through a methodological heuristics evaluation framework and retrospective think aloud with eye tracking

Progress Report

Supervisors: Dr. Tom Gedeon, Mr. Christopher Chow
Principal Researcher: Manik Mahajan

Overview

Discount usability evaluations: low-cost, low-time, and low-resource techniques for conducting usability evaluations.

Heuristics analysis: a discount usability technique that typically sets a select number of evaluation criteria (heuristics) against which the usability of an interface is evaluated. It is typically conducted by expert usability evaluators, that is, individuals with some background in or familiarity with usability evaluation theory or techniques. However, typical heuristics evaluations are drafted uniquely for the interface whose usability is to be evaluated.

Heuristics Analysis Framework: this research project proposes the development and then use of a framework that abstracts and encapsulates complex concepts and practices into a standardised platform that expert usability evaluators can use to conduct stand-alone and comparative heuristics analyses using mixed methods.

Human Centered Computing (HCC) Workshop: the primary goal of this research project is to help improve the usability of the HCC Workshop. The secondary goal is to develop the Heuristics Analysis Framework, which would remain open to use and improvement beyond the scope of this research project.

Research Processes

Background research: the cumulative methodological repertoire of HCI research is substantially broad; as a result, a systematic review of the literature was deemed best practice for the literature and background analysis in this HCI research.

Environment scan: insights generated through consultation with usability practitioners and subject matter experts, to evaluate the difference between best practices and the practices employed in the real world.

Gap analysis (design thinking): instead of creating something for its own sake, a substantial gap analysis of existing literature and practices was undertaken to find scope for targeted solutions in the problem space. As a consequence, the Heuristics Analysis Framework was created.

GOMS (Goals, Operators, Methods, and Selection Rules)

Image sources: https://en.wikipedia.org/wiki/File:Extraction_machine.gif, http://designthinkingforlibraries.com, http://technologyadvice.com/blog/sales/finding-filling-gaps-sales-pipeline/

The Design Squiggle, Damien Newman

Current Project State

Heuristics Analysis Framework
Usability heuristics: visibility of system status; match between system and real world; user control and freedom; consistency and standards; error prevention; recognition rather than recall; flexibility and efficiency of use.
Evaluation criteria: effectiveness, efficiency, satisfaction.
Usability score: determined as a weighted average of 33 usability heuristics evaluated against 4 evaluation criteria.

Pilot studies: in order to evaluate the desirability, feasibility, and viability of the Heuristics Analysis Framework (HAF), pilot studies were undertaken with 4 experts: Chris Chow, Manik Mahajan, and two professional usability testing practitioners. These enabled the generation of early feedback to help improve HAF by reducing the cognitive workload and improving other aspects of the experience of using it. In the pilot studies, the experts participated in a Think Aloud conducted by the researcher and then used HAF as a structured methodology to evaluate the HCC Workshop's usability.

Insights generated: the HCC Workshop has been rated a usability score of 3.85 on average in the studies so far. The feedback and experience of users during the Think Aloud were generally good. On average it took users 40 minutes to apply HAF to evaluate the usability of the HCC Workshop. Simplification of some aspects of HAF is required to further reduce the time it takes users to fully apply it, so as to achieve a higher degree of "discount-ness" in its construct. HAF accomplishes its intention of enabling users to perform a structured heuristics analysis.
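To make the scoring concrete, the following is a minimal sketch of how a weighted-average usability score of the kind described above might be computed. The heuristic names are taken from the list on this slide, but the 1-5 rating scale, the criterion weights, and the showing of only two heuristics and three named criteria are illustrative assumptions, not the actual HAF weighting or data.

```python
# Hypothetical weights for the evaluation criteria named on the slide
# (HAF uses four criteria; only the three named here are shown).
CRITERION_WEIGHTS = {
    "effectiveness": 0.4,
    "efficiency": 0.3,
    "satisfaction": 0.3,
}

# Hypothetical expert ratings on a 1-5 scale: ratings[heuristic][criterion].
# HAF evaluates 33 heuristics; two are shown here for brevity.
ratings = {
    "Visibility of system status": {"effectiveness": 4, "efficiency": 4, "satisfaction": 3},
    "Error prevention": {"effectiveness": 3, "efficiency": 4, "satisfaction": 4},
}

def usability_score(ratings, weights):
    """Weighted average over all heuristic x criterion ratings."""
    total, weight_sum = 0.0, 0.0
    for per_criterion in ratings.values():
        for criterion, rating in per_criterion.items():
            total += weights[criterion] * rating
            weight_sum += weights[criterion]
    return total / weight_sum

print(f"Usability score: {usability_score(ratings, CRITERION_WEIGHTS):.2f}")
```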

Future Work

Eye tracking studies: the Heuristics Analysis Framework helped develop key insights into usability testing and user experience design; eye tracking will further develop HCI research skills and knowledge. Retrospective Think Alouds with eye gaze tracking allow the rapid generation of user-testing-driven insights that can be coupled with empirical data from eye gaze tracking. Additionally, this empirical data could be used to verify and validate the effectiveness of HAF (if time permits). 10-20 participants will be recruited to participate in a Retrospective Think Aloud coupled with eye gaze tracking. This project will compare the effectiveness of Retrospective Think Alouds with and without eye tracking, analysing the participants' ability to identify usability issues in each case (a minimal sketch of how the two conditions might be compared follows below). These studies will be conducted using the HCC Workshop as the test object, and will use the Eye Tribe and Empatica E4 devices to gather the empirical data.

HCC Workshop: the HCC Workshop is the test object for this experiment. The experiment will test the usability of four tasks: register/login; using the eye tracking activity; using the electrocardiography activity; and using the neural network activity. These activities were chosen because observations from workshops conducted in the past demonstrated particular interest in the three activities mentioned above. As a result of this experiment, useful feedback for the HCC Workshop will be obtained.
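As referenced above, the following is a minimal sketch of how the comparison between the two conditions could be tallied: count the usability issues each participant identifies under each condition and compare the means. The participant sessions, issue labels, and counts are entirely hypothetical placeholders; the actual analysis plan for this project is not specified here.

```python
from statistics import mean

# Hypothetical example data: the set of usability issues identified by each
# participant, keyed by study condition.
sessions = {
    "RTA": [
        {"login: unclear error message", "eye tracking: calibration confusing"},
        {"login: unclear error message"},
    ],
    "RTA + eye tracking": [
        {"login: unclear error message", "ECG: missing progress indicator",
         "neural network: jargon-heavy labels"},
        {"eye tracking: calibration confusing", "ECG: missing progress indicator"},
    ],
}

# Compare mean issues identified per participant and total distinct issues
# found under each condition.
for condition, per_participant in sessions.items():
    counts = [len(issues) for issues in per_participant]
    distinct = set().union(*per_participant)
    print(f"{condition}: mean issues per participant = {mean(counts):.1f}, "
          f"distinct issues overall = {len(distinct)}")
```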

Summary

The Heuristics Analysis Framework is an attempt to marry multiple theories and practices in discount usability testing, ergonomics, efficiency and standardisation, and human-computer interaction principles, enabling a cheap, easy, and standardised way to test interfaces.

Retrospective Think Aloud with eye gaze tracking is a relatively discount usability method that can be run cheaply and easily by an expert. Eye tracking allows verification and validation (V&V) of the verbalisations that are part of the Think Aloud, in addition to allowing users to identify usability issues more deeply.

This research project will generate useful feedback for the HCC Workshop, helping to improve its user experience and interface design. This adds direct value to the HCC Research Group and RSCS. Additionally, the Heuristics Analysis Framework and the comparative analysis of Retrospective Think Alouds add value to HCI as a discipline by: one, creating a new benchmarked usability testing framework; and two, corroborating the effectiveness of multiple usability testing methods.

Core Research Considerations: The World View Model

Questions?