Case Study: Successful Adoption of a User-Centered Design Approach During the Development of an Interactive Television Application

Sheri Lamont
Microsoft Corporation
One Microsoft Way
Redmond, WA, U.S.A.
sheril@microsoft.com

Abstract

When creating a user experience for interactive television applications, or for any interface for that matter, it is important to start user research early in the design cycle so that the user experience can be improved through iteration. Unfortunately, this does not often happen during the development of user experiences. This paper discusses how a user-centered design approach was successfully adopted during the creation of an interactive program guide. Specific tips and tricks for working effectively with development teams are also presented.

Keywords

Interactive television, interactive program guide, usability, user-centered design, case study

Introduction

The term usability, in the context of creating software, represents an approach that puts the user, rather than the system, at the center of the process. This philosophy, called user-centered design (UCD), incorporates user concerns and advocacy from the beginning of the design process and dictates that the needs of the user should be foremost in any design decision [ISO/IEC, 1999].

In large software companies, usability professionals often find themselves coming into the process after the planning stages of a project, because the team thinks of them as a resource to be used once a user experience has already been created. Alternatively, many teams believe that, given their shipping deadlines, they don't have time to incorporate user-interface changes based on user research and usability evaluations of the product. As [Rosenbaum et al., 2002] conclude, integrating a UCD approach into the development process is challenging and requires political savvy, knowledge of a wide variety of methods, flexibility in using those methods, inspiration and innovation.

This paper discusses how the Microsoft TV usability group rallied the development team and got them to adopt a UCD approach for the creation of an interactive program guide (IPG).

Case Study: Designing the Microsoft TV Interactive Program Guide

The Microsoft TV Interactive Program Guide (IPG) is an embedded interactive television application that resides natively on a low-end set-top box. This simple guide allows users to scan television shows listed in a grid format, search for television programs by category or keyword, set TV reminders so they are informed when a favorite program is on, and set parental controls for rated content.

A recent report from [Datamonitor, 2002] identifies the following key attributes as central to the development of advanced IPGs: simplicity (requiring a low level of input and providing users with a simple and familiar interface), usability (the environment should reflect the likely nature of usage, for instance by making the initial options relevant to the context from which the user entered the IPG), accuracy (program listings are up to date at all times to ensure the trust of the end user) and objectivity (one type of content is not prioritized over another).

When we created the Microsoft TV IPG, the entire team's primary focus was on simplicity and usability, identified as the key consumer needs left unfulfilled by IPGs already on the market. One of the main business goals was to build a product that was faster, easier to use, more attractive and more fun than competitive products. To achieve this, the usability team proposed that a UCD approach be adopted by the product team and tightly integrated with the development process. Because the business goals aligned with usability's goals for the product, this was easy to achieve. Additionally, the development team for this product was small (approximately 25-30 people), which made it a good environment in which to adopt a UCD approach. The user interface (UI) design specifications were written and owned by a single person acting as both program manager (spec writer) and UI designer, so influence did not need to be directed at a group of people. With everything in place, usability embarked on a challenging and very exciting process to create a best-of-breed user experience for an IPG application. The following sections walk through each step of our approach to implementing UCD for this project and the specific methodologies used.

Have a Business Goal Related to Ease of Use

Many usability professionals feel that for good UCD awareness to take root, the attitudes and philosophies of usability (and design, for that matter) must be internalized by those who make the decisions that affect the final design of a product. The shift to internalizing usability can be accelerated by thinking about usability from a strategic (proactive and integrated) rather than a tactical (responsive and isolated) perspective [Rosenbaum et al., 2000]. Because some of the main business goals for the IPG product specified ease of use and simplicity, a good user experience became a strategic initiative. The end result was that the entire team, from senior management down to the individual contributor level, fostered and promoted the need to ship a best-of-breed user experience for our product.

Fix What Was Broken

Because the entire team creating the Microsoft TV IPG product was very small, usability resources were limited as well. For the UCD approach to succeed, the first thing usability did was reflect on what had and had not worked well with product teams in the past. One major issue that surfaced was usability's lack of awareness of which milestone the team was at in the development process and how each milestone was defined. From here, usability began meeting with the product team members to ensure the following:

- Usability understood the various milestones in the development process.
- Usability worked with the team to define usability exit criteria for each milestone (e.g., there must be no severity 1 usability issues in order for the team to declare code complete; a sketch of such a gate appears at the end of this section).
- Usability proposed a specific milestone for UI by establishing a formal UI freeze date.
- During the development of the product, the team understood when usability would need to run evaluations on the product and what support usability would need from them (such as builds on set-top boxes).

The other major issue that surfaced was the product team's inability to see how effective this usability support really was. To address this, usability brainstormed with members of the product team and came up with usability projects that the team was excited about and that also made it easy to track progress during the development cycle. These included benchmarking lab studies and competitive evaluations.
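As a concrete illustration of the exit-criteria idea mentioned in the list above, the minimal Python sketch below encodes a milestone gate of the kind described, failing whenever an unresolved severity 1 issue remains. The data model and issue IDs are hypothetical; the paper only states the rule itself.

    # Minimal sketch of a usability exit-criteria gate (assumed data model;
    # the paper only states the rule: no severity 1 issues at code complete).

    def can_declare_code_complete(issues):
        """Fail the gate if any unresolved severity 1 (most severe) issue remains."""
        blockers = [i for i in issues if i["severity"] == 1 and not i["resolved"]]
        return len(blockers) == 0, blockers

    # Hypothetical open-issue records from the bug tracking system.
    open_issues = [
        {"id": "UI-101", "severity": 1, "resolved": True},
        {"id": "UI-117", "severity": 2, "resolved": False},
    ]

    ok, blockers = can_declare_code_complete(open_issues)
    print("Code complete allowed:", ok)  # True: no unresolved severity 1 issues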
Benchmark Using Consistent Tasks for Each Study

To prepare for the benchmarking lab studies, usability identified and wrote a set of 13 core tasks we thought most users would complete with the product. Data from previous Microsoft site visits to WebTV customers and satellite subscribers were used to help narrow down the task list. Some of the 13 tasks included:

- Changing channels/channel surfing
- Finding out the name of the TV program currently being watched and what it was about
- Searching for a specific TV program
- Setting a list of favorite channels
- Finding child-appropriate cartoons
- Setting parental controls

These 13 core tasks were given to participants each time a lab study was conducted, so that task performance could be used to benchmark and measure the team's progress over time. This benchmarking approach enabled usability to track the improvements the team made during each stage of the development process over a period of about one year. In total, three benchmark studies were conducted during that period.
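To show how a fixed task set supports this kind of tracking, here is a minimal Python sketch that compares per-task success rates across three studies. The task names are drawn from the list above, but every number is invented for illustration.

    # Sketch: compare per-task success rates across benchmark studies.
    # Task names echo the core task list above; all figures are invented.

    TASKS = ["Changing channels", "Searching for a specific TV program",
             "Setting parental controls"]

    studies = [  # one dict of success rates per benchmark study (hypothetical)
        {TASKS[0]: 0.60, TASKS[1]: 0.40, TASKS[2]: 0.50},
        {TASKS[0]: 0.80, TASKS[1]: 0.70, TASKS[2]: 0.75},
        {TASKS[0]: 0.95, TASKS[1]: 0.90, TASKS[2]: 0.85},
    ]

    for task in TASKS:
        trend = " -> ".join(f"{study[task]:.0%}" for study in studies)
        print(f"{task}: {trend}")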

Track the Changes to the User Experience

As mentioned above, benchmarking enabled usability to track progress toward the UI goals throughout the year. To ensure that the benchmarking results were visible to the team and easily interpreted, we created a special report format that allowed the product team to compare and contrast the success of tasks from study to study. With this format, each finding and recommendation from the previous study could be compared with the current finding, making it easy to verify whether the last UI fix had fully alleviated the usability issue. If the fix had not alleviated the issue, additional usability findings were reported and further recommendations were presented in the report. If the issue was fully addressed, no further recommendations were listed in the row for that particular issue. In addition to making it easy to compare and contrast the success of tasks from study to study, the report format also highlighted how the iterations improved the overall experience. Refer to Table 1 for an excerpt of the report format.

There were some instances where usability issues surfaced that had not been identified in the previous baseline study. When this occurred, the issues were listed in a "New Issues" section of the usability report: the findings were listed for the current study, and the text "Not a usability issue" was placed in the previous study's findings column.

Finally, to help the product team understand how serious an issue really was, usability prioritized each issue using a system of asterisks and check marks. Three asterisks indicated that the issue was the most severe, while a check mark indicated that the issue had been solved or that the task was a success (see Table 2).

Table 2: Legend used in the report format to explain priority.

  Priority   Definition
  ***        Serious - this can prevent users from completing tasks
  **         Moderate - this can cause users to have difficulty completing tasks
  *          Minor - this can cause users to hesitate and complete tasks less efficiently
  ✓          Fixed - no more design changes are required
  NEW!       New finding - this is a new usability issue identified in the current test

Table 1: Excerpt of the report format for tracking usability issues.

  Study #1 findings: Expected to be able to specify a time rather than a timeframe. Three participants wanted the ability to specify specific times rather than a timeframe in the Search by Time feature. These participants also indicated that the labels Morning, Afternoon, Night and Late Night were ambiguous, because they were not certain whether 11:00 pm constituted Night or Late Night.

  Action taken: Changed the search capabilities from four generic categories to specific hours of the day to allow users to specify a specific time. Bug # A21B5.

  Study #2 findings: Able to successfully search using the By Day and Time label in Search. All participants were able to correctly search on a specific day (tomorrow) at a specific time for a specific program using the By Day and Time label. All participants easily picked out the day and time to complete the task. When asked to search for programs on today, six participants selected the Today label within Search, while only one participant opted to use the Browse feature.

  Further recommendations: Continue to allow users to specify a specific day and time to complete a search. No recommendations are necessary.
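The row structure of Table 1 maps naturally onto a small record type. The sketch below is one hypothetical way to encode it in Python; the field names are ours, and the sample values paraphrase the excerpt above.

    # Hypothetical record type for one row of the tracking report (Table 1).

    from dataclasses import dataclass, field

    @dataclass
    class TrackedIssue:
        priority: str                  # "***", "**", "*" or "fixed", per Table 2
        study1_finding: str
        action_taken: str
        study2_finding: str
        recommendations: list = field(default_factory=list)  # empty once resolved

    row = TrackedIssue(
        priority="fixed",
        study1_finding="Participants wanted exact times, not timeframes, in Search by Time.",
        action_taken="Replaced four generic categories with specific hours (Bug # A21B5).",
        study2_finding="All participants searched successfully using By Day and Time.",
    )
    print("Resolved" if not row.recommendations else row.recommendations)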

After each benchmark study was completed, we quantified the team's progress by tallying the number of usability issues found and comparing it against the number of usability issues that had been addressed. We also found it useful to compare the number of usability issues discovered from one study to the next, so the team could see that the overall discovery of usability issues was trending downward (see Table 3). Reporting the usability findings quantitatively emphasized the team's progress and highlighted the merits of the UCD approach.

Table 3: Example of how usability findings were reported quantitatively.

  Benchmark Study A
  - 33 usability issues were discovered
  - 30 of these issues were addressed through a redesign or UI change (to be verified in the next study)
  - Result: the team is addressing more than 90% of the usability issues found with the user experience

  Benchmark Study B
  - 15 usability issues were discovered
  - 10 of these were new usability issues
  - 5 were re-occurring issues, all of which had dropped in severity from the previous study

Deliver on Your Promise

One of the most important things usability can do to build a sense of trust and commitment is to deliver on promises made to the team. This includes not only delivering usability reports on time (as agreed between usability and the product team) but also delivering them within an appropriate amount of time after the study has been completed. Too often, usability studies are run and it takes another two to three weeks for the usability report to be completed. In our case, for each study run for the IPG product, usability agreed that the full report would be completed and ready for reading four days after the last participant came through the lab. When the team was working hard toward a specific milestone, with a deadline beyond which UI changes could not be accepted, usability worked with the team to provide quick findings, so that a high-level summary of the results could be shared with the team the day after the study sessions finished. This allowed issues to be tracked faster and gave the product team time to make the necessary UI changes before critical deadlines.

Make it Easy for the Product Team

Once product specifications are created and signed off, the product team typically lives and breathes through its bug tracking system: changes don't get made unless they are filed as bugs. This was no different in our situation. To make it easy for the product team to track the usability and design recommendations from the benchmark studies, we followed suit and filed all usability issues and recommendations for improving the user experience as (UI) bugs in the bug tracking system. Each bug contained steps to reproduce the issue, a detailed list of the usability findings, and recommendations for how to fix the issue. Usability assigned the bug to the person who owned that particular area of the product, and when it was fixed and the new code was checked in, usability received notification of the fix. Because usability had filed the bug on the UI change, closing it followed the same bug-filing procedure as the product team's: the person who opened the bug was responsible for verifying that the change indeed fixed the problem, and then for assigning the bug to a Test team member for closing. In essence, we had final say on the bug! To further enhance tracking, we filed the UI bugs before distributing the report, so that the bug ID number could be written alongside each issue listed in the usability report (see Table 1).
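The arithmetic behind Table 3 is easy to automate once issues live in a tracking system. This hypothetical Python sketch tallies the fraction of filed issues that were addressed; the records are invented, but the Study A tally mirrors the real figures (30 of 33, roughly 91%).

    # Sketch of the Table 3 tally: fraction of usability issues addressed.
    # Issue records are invented; Study A's real tally was 30 of 33 (~91%).

    def percent_addressed(issues):
        addressed = sum(1 for issue in issues if issue["addressed"])
        return 100.0 * addressed / len(issues)

    study_a = [{"id": n, "addressed": n >= 3} for n in range(33)]  # 30 of 33 addressed
    print(f"{percent_addressed(study_a):.0f}% of usability issues addressed")  # -> 91%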
Conduct Competitive Evaluations

As previously mentioned, it was agreed that a competitive usability evaluation would be conducted against the incumbent guide product near the end of the development cycle. For this study, we used as many of the benchmarking tasks as possible, since those were common tasks that users could perform with either product. Ten participants were recruited for the study by an outside agency and asked to come into the labs to use both products (in a living room setting, of course). The order in which participants interacted with the products was counterbalanced to control for order effects. Feedback from the participants was obtained through the verbal protocol method, and metrics such as task success, number of errors, and ability to complete tasks within the time limit were recorded. Participants also completed post-task and post-test questionnaires to gather subjective feedback.
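Counterbalancing with two products and ten participants can be as simple as alternating the presentation order down the schedule. The Python sketch below assumes exactly that design; the scheduling logic is ours, not taken from the study.

    # Sketch of counterbalancing product order across the ten participants:
    # half start with one product, half with the other, to control order effects.

    PRODUCTS = ("Microsoft TV IPG", "incumbent guide")

    def order_for(participant: int) -> tuple:
        """Alternate which product comes first from one participant to the next."""
        a, b = PRODUCTS
        return (a, b) if participant % 2 == 0 else (b, a)

    for p in range(10):  # ten recruited participants, as in the study
        first, second = order_for(p)
        print(f"Participant {p + 1}: {first} then {second}")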

Because the results turned out in favor of the IPG 1.0 product, usability wanted to evangelize the success of the UCD process throughout the organization. Once data collection was complete, usability analyzed the results and presented the findings in a side-by-side comparison slide presentation at a division-wide usability meeting. Video highlights were used to help illustrate the findings and make the presentation a little more interesting for the audience.

Results and Discussion

Overall, usability ran three benchmark studies on the IPG product and one competitive study during the year of development. Over 40 major UI/usability bugs were filed in the bug tracking system. In the end, the development team had made three full iterations on the design and had addressed over 90% of all usability issues identified through the research. Ultimately, almost 90% of the users who evaluated our product in the labs found the IPG easier to use and more visually pleasing than the incumbent guide, required fewer steps to complete tasks, and completed tasks more easily.

Looking Back: What Could We Have Done Better?

It is always important to reflect on your work to see where there is room for improvement on future projects. We identified a few key learnings based on our experience:

Try to run the competitive study before the product ships, so that any feedback can be used to improve the user experience in the current version. In our case, we had to apply what we learned from the competitive study to future versions of our product, because the study was conducted after IPG 1.0 had shipped.

Find an informal usability review methodology that evaluates the user experience before conducting the benchmark studies. Since shipping IPG 1.0, the usability group has worked hard to find an informal technique that works well with the product team. We now use a methodology adapted from [Wharton et al., 1994] called the Streamlined Cognitive Walkthrough (SCW) [Spencer, 2000]. Usability has found this methodology highly successful for identifying usability issues that can be fixed easily before putting the product in front of users. The other benefit the SCW offered was increased trust among the usability engineer, the program manager, and the UI designer. Due to the SCW's success, usability now requires running an SCW on each user experience before it is put in front of real users in a costly lab setting.

Conduct more field research. During the creation of IPG 1.0, we had to rely on our previous experience from related site visit work on satellite and cable, because we had to forego site visits specific to the IPG in order to spend time with the development team in the planning/scheduling meetings defining the milestones. Now that those relationships have been forged, we recommend conducting more user research and site visits during the planning stage of the development process.
Fortunately, because the UCD approach was so successful, it helped the usability group justify more funding for these types of projects, so that in the future we will have enough resources both to continue spending time in the planning meetings and to conduct the up-front user research.

Increase your involvement in the planning/scheduling meetings by writing a formal usability plan for the release, so that members of the product team have a document they can refer to. Since shipping IPG 1.0, we have written a formal usability plan for our other major shipping product and have worked with the product team to re-order development tasks so that larger chunks of the user experience are implemented at one time (as opposed to the tasks being spread out across the implementation phase). Since the feature set for this new project is much larger than IPG 1.0's, this has helped us get some of the features in front of users even earlier in the development process.

Conclusions

The UCD philosophy incorporates user concerns and advocacy from the beginning of the design process and dictates that the needs of the user should be foremost in any design decision. The results presented in this case study show that a UCD approach was successfully adopted within a development cycle and provide evidence that it positively impacted the overall user experience of the IPG product. The case study also reinforces that if usability engineers want a team to adopt a UCD approach during the development cycle of a product, they must be willing to work at it with the team. Understanding the development team's processes and the constraints under which they develop user experiences will go a long way toward ensuring that the tactics you use to influence the user experience are positively received.

References

Datamonitor. 2002. "The Evolution of the EPG: Electronic Program Guide Development in Europe and the US", September.

ISO/IEC. 1999. Human-Centred Design Processes for Interactive Systems, ISO/IEC 13407:1999(E).

Rosenbaum, S., J. Rohn & J. Humburg. 2000. A Toolkit for Strategic Usability: Results from Workshops, Panels and Surveys. Proceedings of the Conference on Human Factors in Computing Systems, The Hague, Netherlands, April 1-6, pp. 337-344.

Rosenbaum, S., C. Wilson, T. Jokela, J. Rohn, T. Smith & K. Vredenburg. 2002. Usability in Practice: User Experience Lifecycle Evolution and Revolution. Proceedings of the Conference on Human Factors in Computing Systems, Minneapolis, Minnesota, April 20-25, pp. 898-903.

Spencer, R. 2000. The Streamlined Cognitive Walkthrough Method: Working Around Social Constraints Encountered in a Software Development Company. Proceedings of the Conference on Human Factors in Computing Systems, The Hague, Netherlands, April 1-6, pp. 353-359.

Wharton, C., J. Rieman, C. Lewis & P. Polson. 1994. The Cognitive Walkthrough Method: A Practitioner's Guide. In J. Nielsen & R. L. Mack (Eds.), Usability Inspection Methods. Wiley, New York.