Improving Government Websites and Surveys with Usability Testing


Improving Government Websites and Surveys with Usability Testing: A Comparison of Methodologies. Jen Romano Bergstrom & Jon Strohl. FCSM, Washington, DC.

About this talk: 1. Assessing the user experience is important. 2. There are many ways you can do this.

What is Usability? "The extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use." (ISO 9241-11)

Usability vs. User Experience (UX)? Whitney Quesenbery's 5 Es of Usability and Peter Morville's User Experience Honeycomb. The 5 Es to Understanding Users (W. Quesenbery): http://www.wqusability.com/articles/getting-started.html. User Experience Design (P. Morville): http://semanticstudios.com/publications/semantics/000029.php

What People Do on the Web. (Krug, S., Don't Make Me Think)

Mental models and repeating behaviors.

Measuring the UX: "the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use" (ISO 9241-11). How does it work for the end user? What does the user expect? How does it make the user feel? (+ emotions)
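Under this ISO framing, effectiveness, efficiency, and satisfaction are usually operationalized as task-level metrics (completion rate, time on task, post-task ratings). A minimal sketch in Python, not from the talk; the record format and field names are illustrative assumptions:

```python
# Minimal sketch (assumed data format): summarizing the three ISO 9241-11
# dimensions from hypothetical task-level usability-test observations.
from statistics import mean

# One record per participant per task (hypothetical values).
observations = [
    {"participant": "P1", "task": "find_form", "completed": True,  "seconds": 95,  "satisfaction": 4},
    {"participant": "P2", "task": "find_form", "completed": False, "seconds": 210, "satisfaction": 2},
    {"participant": "P3", "task": "find_form", "completed": True,  "seconds": 130, "satisfaction": 5},
]

# Effectiveness: share of attempts completed successfully.
effectiveness = mean(1 if o["completed"] else 0 for o in observations)

# Efficiency: mean time on task, computed here for successful attempts only.
efficiency = mean(o["seconds"] for o in observations if o["completed"])

# Satisfaction: mean post-task rating on a 1-5 scale.
satisfaction = mean(o["satisfaction"] for o in observations)

print(f"Effectiveness: {effectiveness:.0%}")
print(f"Mean time on task (successes): {efficiency:.0f} s")
print(f"Mean satisfaction: {satisfaction:.1f} / 5")
```

How each dimension is operationalized (success criteria, which attempts count toward timing, which rating scale is used) is a study-design choice, not fixed by the standard.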

Where to test

LABORATORY
- Controlled environment
- All participants have the same experience
- Record and communicate from a control room
- Observers watch from the control room and provide additional probes (via the moderator) in real time
- Incorporate physiological measures (e.g., eye tracking, EDA)

REMOTE
- No travel costs
- Participants are in their natural environments (e.g., home, work)
- Use video chat (moderated sessions) or online programs (unmoderated)
- Conduct many sessions quickly
- Recruit participants in many locations (e.g., states, countries)
- Participants tend to be more comfortable in their natural environments
- Recruit hard-to-reach populations (e.g., children, doctors)

IN THE FIELD
- Moderator travels to various locations
- Bring equipment (e.g., eye tracker)
- Natural observations

How to test

ONE-ON-ONE SESSIONS
- In-depth feedback from each participant
- No groupthink
- Can allow participants to take their own route and explore freely
- No interference
- Can be conducted remotely, in the participant's environment
- Flexible scheduling
- Qualitative and quantitative

FOCUS GROUPS
- Participants may be more comfortable with others
- Interview many people quickly
- Opinions collide
- Peer review
- Qualitative

SURVEYS
- Representative
- Large sample sizes
- Collect a lot of data quickly
- No interviewer bias
- No scheduling sessions
- Quantitative analysis

When to test

What to measure

EXPLICIT
+ Post-task satisfaction questionnaires
+ In-session difficulty ratings
+ Verbal responses
+ Moderator follow-up
+ Real-time +/- dial

IMPLICIT
+ Facial expression analysis
+ Eye tracking
+ Electrodermal activity (EDA)
+ Behavioral analysis
+ Linguistic analysis of verbalizations
+ Implicit associations
+ Pupil dilation

OBSERVATIONAL
+ Ethnography
+ Time to complete task
+ Reaction time
+ Selection/click behavior
+ Ability to complete tasks
+ Accuracy
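The satisfaction questionnaires listed under the explicit measures are often scored with standard instruments; one common example is the System Usability Scale (SUS), which maps ten 1-5 ratings onto a 0-100 score. A minimal scoring sketch, not from the talk; the example ratings are hypothetical:

```python
# Minimal sketch: standard SUS scoring (odd-numbered items contribute r-1,
# even-numbered items contribute 5-r; the sum is scaled by 2.5 to 0-100).

def sus_score(responses):
    """Convert ten 1-5 SUS item ratings into a single 0-100 score."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten ratings between 1 and 5")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # i is 0-based: even index = odd-numbered item
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5

print(sus_score([4, 2, 5, 1, 4, 2, 5, 1, 4, 2]))  # 85.0 for these hypothetical ratings
```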

Case 1: Department of Veterans Affairs. Problems: What do users want? Does the new design work? Methods: Focus groups, one-on-one interviews, in-lab usability testing with eye tracking.

                  Accuracy   Steps to Complete Task*   Time to Complete Task*
Users             10%        8                         170 seconds
Admins            21%        8.3                       32 seconds
All Participants  15%        8.2                       101 seconds

Eye-tracking findings:
1. One participant repeatedly fixated on the upper right-hand corner and said he/she was looking for a search tool on the page; the search tool was in a disappearing banner on the page.
2. Participants had similar fixation counts across the bottom links, indicating uncertainty about where to click to get started.

Case 1: Department of Veterans Affairs (methods by stage)
- Focus groups, early in development: what people want and expect from this type of site
- Usability testing, throughout development: what works well on the new site and what needs work; performance data (time on task, accuracy, steps); what people explore first (first click)
- Eye tracking, final product: what attracts attention; where people look for information, repeatedly

Case 2: Internal Revenue Service. Problem: What parts of the form do people actually read? Method: In-lab usability testing with eye tracking.

Case 2: Internal Revenue Service. Participants did not read the instructions in their entirety; rather, they skimmed and then moved on to the form where they needed to enter information. [Figures: aggregate fixation-count heat maps across all participants for instruction pages 1 (left) and 3 (right); participants looked at the Purpose of Form section most often. Chart: length of time, in seconds, spent on each page of the instructions (Pages 1-3) before working on the form.]
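Time-per-page summaries like the chart described above are typically rolled up from the eye tracker's raw fixation export. A minimal sketch of that aggregation, not from the talk; the export format and field names are assumptions:

```python
# Minimal sketch (assumed export format): summing fixation durations per
# instruction page to get total dwell time, as in the time-per-page chart.
from collections import defaultdict

fixations = [  # hypothetical fixation records from an eye-tracking export
    {"participant": "P1", "page": "Page 1", "duration_ms": 310},
    {"participant": "P1", "page": "Page 1", "duration_ms": 240},
    {"participant": "P1", "page": "Page 3", "duration_ms": 180},
    {"participant": "P2", "page": "Page 1", "duration_ms": 420},
]

dwell_seconds = defaultdict(float)
for f in fixations:
    dwell_seconds[f["page"]] += f["duration_ms"] / 1000.0

for page in sorted(dwell_seconds):
    print(f"{page}: {dwell_seconds[page]:.1f} s total fixation time")
```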

Case 2: Internal Revenue Service (findings informing the redesign)
- Usability testing, final product: what people think of the instructions; what parts of the form are unclear; how they complete the form
- Eye tracking: how much of the instructions people read; which sections in particular; where they drop off; when they flip back, what they read

Case 3: Department of Defense. Problem: Can people complete the survey as intended? Method: In-lab usability testing.

Case 3: Department of Defense. Findings: The default selection sat within the first box, and slider-scale selection was unclear; are these nonresponses? Switching from a left-to-right rating scale to a top-to-bottom rating scale was confusing (bottom = low value).
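One way survey teams guard against this ambiguity is to flag slider items whose value never moved off the default, so they can be reviewed as potential item nonresponse rather than treated as substantive answers. A minimal sketch, not from the talk; the field names, the default value, and the assumption that the platform logs whether a slider was touched are all hypothetical:

```python
# Minimal sketch (assumed data format): flag slider responses left at the
# pre-selected default as possible item nonresponse.

SLIDER_DEFAULT = 1  # hypothetical pre-selected value inside the first box

responses = [  # hypothetical survey records
    {"respondent": "R1", "q7_slider": 1, "q7_touched": False},
    {"respondent": "R2", "q7_slider": 1, "q7_touched": True},
    {"respondent": "R3", "q7_slider": 4, "q7_touched": True},
]

for r in responses:
    if r["q7_slider"] == SLIDER_DEFAULT and not r["q7_touched"]:
        print(f"{r['respondent']}: possible nonresponse on Q7 (default never moved)")
```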

Case 4: Department of Defense. Problem: What difficulties do respondents have with the survey? Method: In-the-field ethnography.

Case 4: Department of Defense. Ethnography findings: respondents had many distractions, completed the survey in more than one sitting, and took a lot longer than expected.

Case 5: National Institute of Standards and Technology. Problem: What makes a password secure and usable? Method: In-lab UX study on tablet and smartphone.

Case 5: National Institute of Standards and Technology. Usability testing findings: people had greater difficulty with passwords that required multiple screens; the number of screens had more impact than the number of characters.

Case 6: National Cancer Institute. Problems: What do users want? Do the labels make sense? Which designs work best? Methods: In-lab and remote usability testing.

Case 6: National Cancer Institute. In-lab usability testing: sessions with survivors and supporters. Remote testing: sessions with oncologists, advocates, and PIs. With feedback from all user groups, the redesigns were better informed.

Conclusion

Better UX means:
- Higher user satisfaction
- Increased efficiency and accuracy
- Repeat visits and recommendations
- Decreased costs for the organization (reduced call-center phone calls and staffing)
- Data you can trust: empirically tested products, from the end users' perspective

It's up to you. At any point in product development, and on any size budget, you have the power to improve your product.

Thank you! Jennifer Romano Bergstrom, jbergstrom.com. Twitter: @romanocog. LinkedIn: http://www.linkedin.com/company/fors-marsh-group. Blog: www.forsmarshgroup.com/index.php/blog. FCSM 2013, Washington, DC.