iscreen Usability


INTRODUCTION

Context and motivation

The College of IST recently installed an interactive kiosk called iscreen, designed to serve as an information resource for students and visitors to the College of IST. The content is designed to answer common questions students and visitors might have, for example: faculty office locations, building hours, etc. The hardware comprises a touch-screen-capable 42-inch plasma monitor featuring a walk-up-and-use interface. The interface needs to support novice as well as repeat users in terms of content and accessibility.

A user-based evaluation and an expert-based cognitive walkthrough of the interface previously revealed issues that could hinder the usability of the iscreen. In this report, we use heuristic evaluation (Nielsen 1994) to investigate usability issues. Each of these methods addresses different aspects of usability, so using multiple approaches enables us to identify problems at various levels.

Goal and findings

Our goal is to study the interface using heuristics, or guidelines, to find usability problems and generate recommendations and design implications for the (re)design of the information kiosk. The findings reveal a number of elements that violate one or more heuristics. The majority of the problems encountered relate to interface elements such as icons and feedback. We also found that the choice of information presentation and layout may confuse users when finding and processing the information they request.

METHODOLOGY

Heuristic evaluation involves using a small set of evaluators to examine an interface and judge its compliance with a set of recognized usability heuristics or principles (Nielsen 1994). We used Jakob Nielsen's (1994) ten usability heuristics, shown in Table 1, to conduct the heuristic evaluation.

H1. Visibility of system status: The system should always keep users informed about what is going on, through appropriate feedback within reasonable time.

H2. Match between system and the real world: The system should speak the users' language, with words, phrases, and concepts familiar to the user, rather than system-oriented terms. Follow real-world conventions, making information appear in a natural and logical order.

H3. User control and freedom: Users often choose system functions by mistake and will need a clearly marked "emergency exit" to leave the unwanted state without having to go through an extended dialogue. Support undo and redo.

H4. Consistency and standards: Users should not have to wonder whether different words, situations, or actions mean the same thing. Follow platform conventions.

H5. Error prevention: Even better than good error messages is a careful design which prevents a problem from occurring in the first place. Either eliminate error-prone conditions or check for them and present users with a confirmation option before they commit to the action.

H6. Recognition rather than recall: Minimize the user's memory load by making objects, actions, and options visible. The user should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate.

H7. Flexibility and efficiency of use: Accelerators, unseen by the novice user, may often speed up the interaction for the expert user, such that the system can cater to both inexperienced and experienced users. Allow users to tailor frequent actions.

H8. Aesthetic and minimalist design: Dialogues should not contain information which is irrelevant or rarely needed. Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility.

H9. Help users recognize, diagnose, and recover from errors: Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution.

H10. Help and documentation: Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on the user's task, list concrete steps to be carried out, and not be too large.

Table 1. Ten Usability Heuristics

Although it is recommended to use three to five evaluators for a heuristic evaluation, in our situation we had only two. We first conducted the heuristic evaluation individually: each of us reviewed the ten usability heuristics, then explored and inspected the interface at least three times with the list of heuristics in front of us. Each evaluator noted the problems discovered on a form (Appendix A), recording for each problem the heuristic it violated and where in the system it occurred. After both evaluators had completed the individual evaluation, we came together to discuss the problems we found and assigned each problem a severity rating from 0 to 4, based on the ratings seen below.

0  No problem
1  Cosmetic problem
2  Minor usability problem
3  Major usability problem; important to fix
4  Usability catastrophe; imperative to fix

Table 2. Severity Ratings

After each problem was assigned a severity rating, we discussed possible solutions for each usability problem. The problems found during the heuristic evaluation are listed in Table 3 below.

1. Touch screen to begin not visible at all times.
   Heuristic violated: H1 Visibility of system status. Place: Inert screen. Severity: 4.

2. No affordance regarding the purpose/content of the screen.
   Heuristic violated: H* Lack of affordance. Place: Inert screen. Severity: 4.

3. Users cannot search for a person by name to find their phone number or office number; the user must first know whether someone is faculty or staff (or that there are phone numbers in Building info) in order to find their phone number or office location.
   Heuristics violated: H2 Match between system and the real world; H10 Help and documentation. Places: Faculty / Staff / Phone numbers. Severity: 3.

4. No icons to supplement the text in the buttons.
   Heuristic violated: H6 Recognition rather than recall. Places: All screens. Severity: 3.

5. The information displayed does not give the user any aid on how to use the system.
   Heuristics violated: H8 Aesthetic and minimalist design; H10 Help and documentation. Places: Main / Directory / Building info / IST calendar / Course schedule / Academic FAQ. Severity: 3.

6. No indication of whether the user is on the main screen, especially when other screens provide a "Main" button.
   Heuristic violated: H1 Visibility of system status. Places: All screens. Severity: 3.

7. No permanent navigation bar to let the user know about the other information categories.
   Heuristic violated: H1 Visibility of system status. Places: All screens. Severity: 3.

8. The system does not provide any help to the user in choosing between faculty and staff.
   Heuristics violated: H5 Error prevention; H6 Recognition rather than recall. Place: Directory. Severity: 3.

9. Phone numbers are shown in extension format only.
   Heuristic violated: H2 Match between system and the real world. Places: Staff / Faculty / Telephone numbers. Severity: 3.

10. Some pages do not present any additional information after a selection has been made.
    Heuristic violated: H8 Aesthetic and minimalist design. Places: Main / Directory / Building info / IST calendar / Course schedule / Academic FAQ. Severity: 3.

11. The main menu button is not on all pages, so users must wait for the screen to re-enter the inert phase in order to choose another option.
    Heuristic violated: H3 User control and freedom. Place: Fall schedule. Severity: 2.

12. The user cannot tell where he/she is in the system: there are no titles on the pages, and some pages say iscreen at the top while others do not.
    Heuristic violated: H1 Visibility of system status. Places: All screens. Severity: 2.

13. Abbreviations and acronyms are used that people may not understand.
    Heuristic violated: H2 Match between system and the real world. Place: Phone numbers. Severity: 2.

14. Position and size of the "touch screen to begin" icon are not standardized.
    Heuristic violated: H4 Consistency and standards. Place: Inert screen. Severity: 2.

15. No feedback to communicate that a selection has been made (buttons appear inactive).
    Heuristic violated: H1 Visibility of system status. Places: All screens. Severity: 2.

16. Too many names presented together inhibit the user's ability to find information quickly.
    Heuristic violated: H8 Aesthetic and minimalist design. Places: Faculty / Staff. Severity: 2.

17. The system does not inform the user when the iscreen will revert to its default state.
    Heuristic violated: H1 Visibility of system status. Places: All screens. Severity: 2.

18. The screen is not visually divided to depict categories.
    Heuristic violated: H8 Aesthetic and minimalist design. Places: Telephone numbers / Directory / Fall courses. Severity: 2.

19. Links for the "graduate fall term" are not consistent; the user may not know whether they are clickable.
    Heuristic violated: H4 Consistency and standards. Places: Graduate / Fall courses / Academic calendar. Severity: 2.

20. Similar transitions between screens during the inert state and the active state are misleading.
    Heuristics violated: H1 Visibility of system status; H4 Consistency and standards. Places: All screens. Severity: 1.

21. There is a screen that presents only one option to the user: when the user selects the course schedule, they are presented with a blank screen containing a single button for the Fall course schedule.
    Heuristic violated: H8 Aesthetic and minimalist design. Place: Course schedule. Severity: 1.

Table 3. Heuristic Evaluation Problems

The second problem listed in the table did not violate one of Nielsen's ten heuristics; rather, it is specific to the system we evaluated. We noted it as H* Lack of affordance, because there are no affordances that inform the user of the purpose or content of the system. The design implications that we discussed are included in the following section.
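The workflow described in the Methodology section, individual evaluation forms that are then merged, rated for severity, and discussed most-severe-first, can be sketched in code. This is a minimal illustration under stated assumptions, not tooling from the study: the example rows and heuristic codes follow Table 3, but the per-evaluator ratings are assumed values, since the report publishes only the agreed ratings.

```python
from dataclasses import dataclass
from statistics import mean
from collections import Counter

@dataclass
class Finding:
    description: str
    heuristics: list  # e.g. ["H1"]; "H*" marks a problem outside Nielsen's ten
    location: str
    severities: list  # one 0-4 rating per evaluator

    @property
    def avg_severity(self):
        # The Appendix A form records an average severity rating per problem.
        return mean(self.severities)

# Two rows from Table 3; the per-evaluator ratings are illustrative.
findings = [
    Finding("Touch screen to begin not visible at all times",
            ["H1"], "Inert screen", [4, 4]),
    Finding("Main menu button is not on all pages",
            ["H3"], "Fall schedule", [2, 2]),
]

# Discuss the most severe problems first, as the Design Implications section does.
ranked = sorted(findings, key=lambda f: f.avg_severity, reverse=True)

# Count how often each heuristic was violated across all findings.
violations = Counter(h for f in findings for h in f.heuristics)
```

Sorting by the averaged rating reproduces the report's presentation order, and the violation counts support the observation that most problems concern feedback and interface elements.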
DESIGN IMPLICATIONS

We discuss the design implications based on the results of the heuristic evaluation in terms of the organization of information in the iscreen: first the design implications for the system in its inert phase, and then for the system as a whole. Within each part, problems and their recommendations are discussed in order of severity, with the most severe problems first.

Inert system state

The "touch screen to begin" text and accompanying hand icon should always be visible from a distance, and should always appear in the same place on the screen.
Problems 1 and 14: "Touch screen to begin" is not visible at all times.

The inert screens should convey what type of information the system contains. This could be accomplished by using the content or main menu screens as the inert screens; for example, one of the inert screens could display the advising walk-in hours or the faculty and staff directory.
Problem 2: The purpose of the iscreen and its content are not readily apparent. It is also not apparent that the iscreen may be touched.

System-wide design implications

A search function should be added to the directory to allow users to search for a faculty or staff member without knowing whether they are faculty or staff. Another possible solution is to display faculty and staff information on the same screen and distinguish faculty from staff by supplying a title or differentiating by color.
Problems 3 and 8: Users must know the position a person holds in order to find their phone number and office location. The system does not provide any help to aid users in choosing between faculty and staff.

All phone numbers should be displayed in the directory.
Problem 3: Some phone numbers are included only in Building info.

Icons should be added to supplement the text-only buttons. Icons with labels may be most useful.
Problem 4: The main menu buttons are plain text.

A help option should be added to the system, or descriptions of where each button leads; for example, text could appear reading "click here if you would like to find information about faculty and staff" and pointing to the directory button.
Problem 5: The information displayed does not give the user any aid on how to use the system.

A title reading "Main Menu" should be added to the main menu screen, and all other screens should also be given appropriate titles.
Problems 6 and 12: The system does not inform the user of where they are in the system.

A permanent navigation bar should always be displayed to allow for easier navigation.
Problem 7: There is no permanent navigation; the user must return to the main menu to navigate to another part of the system, and the main menu button is not on all pages.

Phone numbers should be displayed in full format rather than just the extension, for example (555) 555-5555.
Problem 9: Phone numbers are displayed in extension format.

Unnecessary pages should not be included in the design, and pages that currently present only one button should provide additional information.
Problems 10 and 21: Some pages do not present any additional information after a selection has been made.

Abbreviations should not be used; everything should be spelled out in full.
Problem 13: Abbreviations and acronyms are used that people may not understand.
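The full-format phone number recommendation above is mechanical enough to sketch. The helper below is hypothetical, and the area code and exchange are placeholder values, since the report does not give the kiosk's real ones; it only illustrates expanding a four-digit extension into the recommended display format.

```python
def expand_extension(extension, area_code="555", exchange="555"):
    """Expand a 4-digit campus extension such as 'x1234' into a full
    display number like '(555) 555-1234'. area_code and exchange are
    placeholders, not values from the report."""
    digits = extension.strip().lstrip("xX").lstrip("-")
    if not (digits.isdigit() and len(digits) == 4):
        raise ValueError("not a 4-digit extension: %r" % extension)
    return "(%s) %s-%s" % (area_code, exchange, digits)

# expand_extension("x1234") -> "(555) 555-1234"
```

Displaying the expanded form keeps the kiosk consistent with how visitors see phone numbers in the real world (heuristic H2), at the cost of a little extra screen space per entry.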

Buttons should visibly depress when touched, or make an appropriate clicking sound, to give users feedback that the button was actually pressed.
Problem 15: The main menu buttons and all other system buttons do not provide any feedback to communicate that a selection has been made.

Text on screens should meet a minimum font size, there should be a limit on how much text is displayed, and there should be minimum spacing between lines.
Problems 16 and 18: It is difficult to read screens when too much text is presented.

There should be some mechanism for informing users that the screen will return to its inert phase after a period of time; for example, the system could ask the user to make another selection.
Problem 17: Users are not informed of when the system will revert to the inert phase, so the screen a user is viewing simply disappears after a number of seconds without any notice.

The names in the directory should be spaced out to facilitate browsing and searching, and the text size may need to be increased to make names easier to find.
Problem 18: The screen is not visually divided to depict categories.

All links should be standardized: buttons should be used throughout the system instead of switching between buttons and plain text.
Problem 19: Links are not standardized throughout the system.

Transitions between the inert screens and active screens should be differentiated in some way.
Problem 20: Transitions between screens during the inert phase and the active state are the same and therefore misleading.

Appendix A. Heuristic Evaluation Data Form

Problem description:
Heuristic(s) violated:
Average severity rating:
Place(s) where problem occurs:

References

Nielsen, J. (1994). Heuristic evaluation. In Nielsen, J., and Mack, R. L. (Eds.), Usability Inspection Methods. John Wiley & Sons, New York, NY.