Software Development and Usability Testing

Reading:
- Shneiderman, Chapter 4
- Preece et al., Ch. 9, 11-15
- Krug, Rocket Surgery Made Easy
- Rubin, Handbook of Usability Testing
- Nielsen Norman Group, www

HCI in Software Development
- Goals for the designer
- How design fits into the software life cycle
- How to test your design: what is usability testing? how do we do it?
- Case study: Microsoft Windows 95 (mainly)
Autumn 2016, ITNP023: Foundations of IT

The Software Life Cycle
Different models all include these main areas of activity, to a greater or lesser degree:
- Specification
- Analysis
- Design
- Implementation
- Testing
- Maintenance
Where should the designer be considering good design? (Shneiderman, Section 3.4)

Usability Engineering Model
It is possible to test from very early on. Possible places to test during development:
- Early stages: testing (of previous versions or competitors' products) provides goals for the designers
- Middle stages: design testing validates the designers' choices and provides feedback for improvement
- Later stages: testing ensures that the product meets the design objectives

Rapid Iterative Design
Design → Prototype → Lab Test → Redesign → Code → Field Test
- Involves a lot of prototypes and testing
- Gives fast feedback about design flaws

Scenarios
A special kind of prototype that makes rapid prototyping simpler. The idea behind prototyping is to cut costs by implementing only parts of the system:
- Horizontal prototypes: full range of features, with reduced functionality
- Vertical prototypes: limited range of features, each implemented with full functionality
- Scenarios: combine the horizontal and vertical ideas in the extreme, implementing a limited range of features with reduced functionality

Goals for the Developer
The big picture:
- Adequate functionality
- System reliability
- Good interface design
- Other qualities such as standardization, portability, consistency and integration
(Shneiderman, Section 1.2)

Specific Goals for the Designer
General aims:
- Improving the quality of work produced (discouraging errors, for example)
- Increasing throughput (making it easy to do things quickly)
- Minimizing training time (making the system easy to learn)
But we need measures in order to see whether these goals are achieved!
(Shneiderman, Section 1.3; Preece, Section 1.3 and Chapter 10)

Goals for the Designer
Specific measurable goals:
- Subjective satisfaction: use surveys with satisfaction scales, e.g. like feedback questionnaires for lectures
- Rate of errors: how many, and what kind?
- Speed of performance: use benchmark tests
- Time to learn: measure for a specific set of tasks
- Retention over time: how easy is it to remember how to use the system?
There are trade-offs that may have to be made between these.
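As an illustration (not from the lecture), the speed-of-performance, error-rate and satisfaction measures above could be computed from simple per-task test records; the record fields and values here are hypothetical:

```python
# Hypothetical per-task records from a benchmark usability test.
# Field names and values are illustrative, not from the lecture.
from dataclasses import dataclass
from statistics import mean

@dataclass
class TaskRecord:
    user: str
    seconds: float     # speed of performance on the benchmark task
    errors: int        # rate of errors
    satisfaction: int  # subjective satisfaction on a 1-5 scale

records = [
    TaskRecord("u1", 42.0, 1, 4),
    TaskRecord("u2", 55.5, 3, 2),
    TaskRecord("u3", 38.2, 0, 5),
]

print(f"mean task time:    {mean(r.seconds for r in records):.1f}s")
print(f"errors per task:   {mean(r.errors for r in records):.2f}")
print(f"mean satisfaction: {mean(r.satisfaction for r in records):.1f}/5")
```

Measuring the same record set before and after a redesign gives a concrete way to check whether a design change actually met its goal.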

Testing: Crucial for Good Design
- Feedback post-release is not enough.
- Testing after all development is not enough.
- Good design requires early (and frequent) feedback.

Before Usability Testing
Obviously, test functionality first:
- White-box testing: check that buttons link to the right place, media appears, and control code works as expected
- Black-box testing: does the software/tool/presentation/web site match the specification? Can the tasks identified in your task profiles be carried out?
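A white-box check such as "buttons link to the right place" can be automated. A minimal sketch using Python's standard-library HTML parser; the page snippet and the set of known targets are made up for illustration:

```python
# Minimal white-box link check: every anchor on a page should point
# at a known target. The page content and target set are hypothetical.
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects the href of every <a> tag encountered."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links += [v for k, v in attrs if k == "href"]

page = '<a href="/help">Help</a> <a href="/serch">Search</a>'
known = {"/help", "/search"}

collector = LinkCollector()
collector.feed(page)
broken = [href for href in collector.links if href not in known]
print("broken links:", broken)  # the misspelled "/serch" target is caught
```

Checks like this belong in ordinary functional testing, before any users are brought in for usability testing.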

What is usability testing?
- Simply carrying out experiments to find out specific information about a design
- Getting users to give feedback
- May involve statistical analysis, but doesn't have to
- Output: usability error reports, so that the errors can be fixed

Why bother with usability testing? (you ≠ me)
- Just because something has been carefully designed doesn't mean that it has been designed well
- Designers have a very designer-centric view of their creations

Why bother with usability testing?
What happens if we don't test?
- It is much more expensive to fix a design fault after a software release than before it
- On a web site the cost of fixing is low, but winning back lost customers takes far greater effort
How many errors are there anyway? The number can be very large:
- Remember the average of 11 usability catastrophes, 20 serious problems and 29 cosmetic problems on each web site tested
- Check any bug-report site to see many examples in software
An old idea is that usability testing is simply part of beta testing.

How to do usability testing
Very simply: get some users, and see how they work with your product.
Phases of testing (from Rubin's Handbook of Usability Testing):
1. Determine what it is you're trying to find out, e.g.
   - Is the product good enough? Does it meet company standards?
   - How well does it work in the real world?
   - Which is best, our product or our competitors'?
   Then use the purpose to create some objectives, e.g.
   - Can customers successfully use the help facility?
   - Is the download time of the front web page too long?

2. Design the test
   - Identify the people you are going to need: do you need men, women, novices, experts?
   - Determine the experimental design: the arrangements of when folks do what
   - Develop the tasks that your users will perform
   - Specify what equipment and personnel you'll need
3. Get some users
   - Fellow employees, friends and family of employees
   - For specialist user populations you might use a recruitment agency
   - But don't use biased people (e.g. those with inside knowledge)

Users: you only need 5?
Nielsen and Landauer's study showed...

Users: you only need 5?
Other insights from the data:
- With zero users, you detect zero usability problems
- You do find more problems with more users, but with 5 users you can find about 80% of the problems
- If you have the money for 15 users, it is better to run 3 small tests of 5 users each, at different stages of development
- 15 tests with 1 user is not a better plan: people vary widely, so one individual gives you skewed results
- But 15 tests with 15 users, spread over time, is fine
- Krug suggests testing little and often (e.g. 1 day a month)
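The "80% with 5 users" rule of thumb comes from Nielsen and Landauer's model, in which the expected proportion of problems found by n users is 1 − (1 − L)^n, where L is the probability that a single user reveals a given problem (about 0.31 in their original data). A minimal sketch, assuming L = 0.31:

```python
# Nielsen-Landauer model: expected fraction of usability problems
# found with n test users, where each user reveals a given problem
# with probability L (~0.31 in the original studies).

def problems_found(n, L=0.31):
    """Expected fraction of problems uncovered by n users."""
    return 1 - (1 - L) ** n

for n in (1, 5, 15):
    print(f"{n:2d} users -> {problems_found(n):.1%} of problems")
```

With L = 0.31 this gives roughly 84% coverage at five users, which is why three rounds of five users each beat one round of fifteen: each round starts against a fresh set of problems in the revised design.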

Nielsen studies with number of users
[Figure: problems found plotted against number of test users, 5-30]

How to do usability testing
4. Set up the test
   - Prepare the equipment
   - Provide a script for your test supervisors
5. Run the test
   - Prepare your users: put them at ease, explain the test conditions, get them to sign consent forms
   - Run your users through their tasks and collect the data
6. Don't forget to thank your users afterwards! You might want to use them again in future.
7. Analyse your data

Testing techniques: heuristic evaluation
- Done without users!
- There are many usability guidelines, so it can be useful to focus on one small set, such as Shneiderman's

Testing techniques: think aloud
- Thinking-aloud protocol: the user talks through their thought and action process
- Krug provides a script
- See the demo usability video from Rocket Surgery Made Easy

Testing techniques: Krug usability test
- 45 minutes
- Trunk test (where am I? what is this? where do I go?)
- Think aloud for a given task
- Resources available on sensible.com

Testing techniques: usability labs
Labs range from the very simple to the very sophisticated. Features may include:
- Desks, chairs, computer...
- Nice décor (comfortable, non-intimidating)
- Audio/visual recording equipment
- Eye tracker
- Two-way mirrors
- Computer logging
- Intercom
- Soundproofing
- Good lighting
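"Computer logging" in a lab setup usually means capturing timestamped interaction events for later analysis. A minimal sketch of such a logger; the event names and details are illustrative, not from the lecture:

```python
# Minimal timestamped interaction log of the kind a usability lab's
# "computer logging" might capture. Event names are illustrative.
import time

class InteractionLog:
    def __init__(self):
        self.start = time.monotonic()
        self.events = []  # (seconds since session start, event, detail)

    def record(self, event, detail=""):
        elapsed = time.monotonic() - self.start
        self.events.append((round(elapsed, 3), event, detail))

log = InteractionLog()
log.record("click", "Help button")
log.record("error", "invalid form input")
for t, event, detail in log.events:
    print(f"{t:8.3f}s  {event:6s} {detail}")
```

Logs like this let the analysis phase reconstruct where in a task users hesitated or made errors, complementing the audio/visual recordings.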

Testing techniques: online testing
There are a few cases where it is appropriate:
- International usability testing
- When specialized testers are needed, e.g. people suffering from a rare disease, or teachers of an obscure topic
- When suitable local testers are scarce, e.g. the average Silicon Valley resident is more aware of technology than the average user
It can be quite easy to recruit people online for this (e.g. via Amazon Mechanical Turk).

Testing techniques: online surveys
Pluses:
- Surveys can collect information from real users while they are actually using your software or interface
- Good at conveying why people want to use the product, whether they like or dislike it (in particular its visual appearance), and what additional features they might like
Minuses:
- Surveys collect opinions, not facts
- Users have selective memories
- Users can't always tell you why or where they have difficulties, only that they have them

Usability testing isn't perfect: how to get more feedback
Testing won't catch certain sorts of errors:
- Obscure errors
- Errors only found after long use
- Complex feature interactions
Additional forms of feedback can be very valuable, including:
- User groups, mailing lists, surveys
- Technical support (online or telephone): log those calls!
- Comparison testing of your product against others on the market
- Beta testing

Myths of Usability Testing
- "Testing is expensive: we can't afford a usability lab"
- "Testing is time-consuming"
- "Testing requires experts in usability"

End of Lecture