ASSIGNMENT 06: Heuristic Evaluation


SI 622: Evaluation of Systems & Services
Prof. Mark Newman, Winter 2012
March 15, 2012

Group 4: Charles Adams, Jane Davis, Nina Elias, Liu Liu

Word Count: 3,481

Table of Contents

Executive Summary
Introduction
Methods
Findings & Recommendations
Discussion
Conclusion
References
Appendix

Executive Summary

The Ubi Group performed a heuristic evaluation of the entire Ubi service, which was feasible because the site is fairly straightforward and simple. To maximize findings, each group member completed an individual heuristic evaluation using Nielsen's ten heuristics, assigning a severity rating to each violation. The group then combined all findings in a matrix and identified patterns. Seven key findings emerged from this matrix:

1. Unclear language and terms used
2. Difficult process for adding shows or using the channel guide
3. User not clearly notified of which video is currently playing
4. Confusing menus and repeated links
5. Unexplained errors and unavailable content
6. Ambiguous button functionality
7. Lack of help

Each finding resulted in several recommendations for Ubi. These recommendations range from using more common language to consolidating links, all of which will help Ubi create a more pleasing and satisfying user experience on the site.

Introduction

MyUbi.tv is an online streaming video service that focuses on content organized by the user. Ubi is currently in beta, with approximately 500 users. Although the startup is based in Detroit, Michigan, its user base is not limited to the Detroit area. On the site, users can curate content by using the DVR function or by creating their own channel. Ubi sees this user content curation as the key function that differentiates it from other similar services, and places most emphasis on features that support it. In the future, Ubi would like to offer its services on more platforms, specifically mobile devices and televisions. In its current iteration, however, it is an online service that incorporates functions and features native to computer users.

Ubi users are heavy television watchers, most of whom are closely connected to the tech community in some way. Ubi seeks to capture a significant number of users who enjoy watching television online rather than while shows are being aired. There is significant competition in the online television streaming market, however. Interviews conducted previously with a few current beta users indicated discomfort and confusion with various aspects of the service. Thus, Ubi would benefit from a better understanding of how it can support a more intuitive, satisfying user experience.

To accomplish this, the Ubi Group conducted a heuristic evaluation using Jakob Nielsen's heuristics (1994). With four evaluators, this method is a faster, more efficient way of identifying usability issues without the overhead of recruiting users. Since the interface is changing rapidly, a heuristic evaluation allows the Ubi Group to give the developers feedback sooner, and with more guidance and structure than a traditional usability test. Additionally, since a usability test of this service is planned for the near future, the findings and recommendations from this heuristic evaluation can inform the tasks and major issues to focus on in that testing.

Methods

Ubi is a relatively simple site, with almost all post-login interactions taking place on a single main screen, where windows and menus open or close based on the user's actions. Since the site is of manageable size, we performed our heuristic evaluation on its entirety, determining that the scope of the product was such that we could do so without our assessment suffering from a lack of focus. For our evaluation, we chose to focus on the following heuristics:

1. Visibility of system status - Is it easy to determine if a feature is loading? Can the user easily determine if the system is still working to finish an operation?
2. Match between system and real world - Does the site use terms that are familiar to users, in a way that is familiar to them?
3. User control and freedom - Can the user move freely throughout the site without getting errors? Is it easy to move back and forth between locations on the site?
4. Consistency and standards - Are the aesthetics consistent?
5. Error prevention - Are unavailable or broken features and content hidden? Are links functional?
6. Recognition rather than recall - Are there cues that help the user remember how to perform a specific task? Is the user forced to rely on their memory to navigate the site?

To maximize our findings and identify areas that were consistently flagged as problematic, we each performed a separate heuristic evaluation of Ubi. Performing four separate evaluations allowed us to see patterns in our assessments. It also minimized the possibility that one team member's findings would influence the rest of the group, or lead other members to report issues they might not otherwise have considered severe enough to include. Each team member entered their findings into a compiled matrix and rated the severity of each finding on a scale from 0 to 4, with 0 meaning the finding was not a usability problem and 4 indicating an extremely severe problem. Once these findings were compiled, the matrix was used to identify the most severe issues, as well as problems experienced by multiple evaluators; a sketch of this consolidation step appears below. We identified the seven key findings we considered most severe, which are discussed in further depth in the following section of this report.
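To make the consolidation step concrete, the following is a minimal sketch in Python. The team's actual matrix is reproduced in Appendix 4; the sample findings, field names, and tie-breaking rule here are illustrative assumptions rather than the team's exact procedure.

from statistics import mean

# Illustrative sample only; each finding records the heuristic violated,
# a short description, and the 0-4 severity ratings from whichever
# evaluators identified it (see Appendix 4 for the team's real matrix).
findings = [
    {"heuristic": "Recognition rather than recall",
     "problem": "Hard to remember how to add a show to the DVR",
     "ratings": [4, 4]},
    {"heuristic": "Match between system and real world",
     "problem": "Terms such as 'uvue' and 'uimport' are unfamiliar",
     "ratings": [3, 4]},
    {"heuristic": "Consistency and standards",
     "problem": "(+)/(-) buttons both open menus and add/remove items",
     "ratings": [2, 2, 2]},
]

def consolidate(findings):
    """Rank findings by average severity, breaking ties by evaluator count."""
    ranked = []
    for f in findings:
        ranked.append({
            **f,
            "avg_severity": mean(f["ratings"]),
            "evaluators": len(f["ratings"]),
        })
    return sorted(ranked,
                  key=lambda f: (f["avg_severity"], f["evaluators"]),
                  reverse=True)

for f in consolidate(findings):
    print(f'{f["avg_severity"]:.1f} ({f["evaluators"]} evaluators): {f["problem"]}')

Averaging over only the evaluators who identified a problem means a single very high rating can outrank a problem that all evaluators agreed was moderate; the Discussion section returns to this limitation.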

Findings & Recommendations

Finding 1: Unclear language and terms used

The site uses some terms that are not regular or common words, such as "DVR," "uvue," and "uimport" (see Figure 1). Without further explanation or guidance, users may have no idea what each menu item actually does. For instance, the term "DVR" does not translate well to an online system, because the concept is associated with regular television, not online viewing. This violates the match between the system and the real world heuristic, as these Ubi-specific terms are unfamiliar to the user. Additionally, even though users can click the information icon to see more about the functionality of uvue, the message is confusing. It states, "Anytime you click a uvue icon, you will be directed to the latest episode." However, the uvue icon referenced in this tip refers to the small boxes below the menu, not the larger uvue icon with the information option. Therefore, it may be unclear to some users which uvue icon has this functionality.

Figure 1: This screenshot shows icons in the Ubi interface that use unfamiliar terms.

Recommendation: These terms could be replaced with more regular, commonly used words that users can easily associate with watching television shows online. For uapps, DVR, and uimport, more intuitive icons alongside the current guide may help as well. For uvue, it may make more sense to use a term such as "Playlist" or "My Favorite Shows," since its purpose is to let users add shortcuts to their favorite shows. In addition, the uvue icon should be used in a more consistent manner, either for adding items to uvue or for loading the latest episode of a show chosen from uvue.

Finding 2: Unintuitive process for adding shows and using the channel guide

Even after multiple visits to Ubi, the DVR feature still causes confusion, and users may need several attempts to accomplish what they came to the DVR to do. This is a clear violation of the recognition heuristic: users are forced to rely on recall, and when their memory fails them, performing the task becomes an undue burden. No shortcut helps users quickly add shows to the DVR from anywhere but the channel guide, and there is no way to search for additional shows to add while still in the DVR menu.

The channel guide is also a source of confusion, based on its violation of our heuristic evaluating the match between the system and the real world. The icons on shows in the channel guide do not match industry standards or user expectations. Each show has a green plus icon, which does not add the show to the DVR but instead adds it to the uvue feature, and a blue icon with a lowercase "i" on it, which offers users the option of going to the show page (which does not work in all browsers) or adding the show to the user's DVR. The channel guide's preset functions for creating personalized channels do not seem to serve any particular purpose, since the user can already access all of the channels from the main channel guide. All of these factors add to the confusion of using the guide.

Recommendation: Ubi offers users multiple ways to accomplish seemingly identical tasks, all revolving around saving videos for future viewing. Although this duplication was intended to assist the user, we find it unnecessary. In addition, it goes against the delightfully simple, minimalist nature of the user interface. When saving shows for future viewing, users should only need to save them once, and saved shows should all go to a single, easily accessible location. Shortcuts should be easy to understand and serve a single purpose. When clicking a plus button, the expectation is that this lets the user save the show they want to save. If it is a series, they should be given the option of saving a single episode or all available episodes. Buttons with the blue information-style icon, although informative and well placed, should contain only information about a particular show, and should not be combined with options to save the video for future viewing.

Finding 3: User is not clearly notified of which video is currently playing

There is more than one way to add a TV show to the playlist. For example, the user can click the title of the show in the main Schedule/Channel window, and that show will then be highlighted as NOW PLAYING. Another option is to hover over the title and click the (+) button to add that item to the uvue window.

Multiple options are great because they provide the user with more freedom to switch between shows. However, inconsistencies can arise in displaying the title of the currently playing show. As shown in Figure 2, the show Jimmy Kimmel Live is still highlighted as NOW PLAYING, although the user is in reality watching the newest episode of Glee. Since the title of the show the user is actually watching only appears after the user clicks the small triangle button to the right of the "Autoplay is on" feature under the video, it can be unclear which show he or she is viewing at that moment. This violates the recognition rather than recall heuristic, because it forces the user to remember their last selection or perform an extra click to confirm which show they are watching.

Figure 2: Inconsistency between NOW PLAYING and the playlist (taken Feb 24, 2012).

Recommendations: Ubi should let users know where they are in the interface. As more than one option is available for choosing which show to watch, the indication of which show is currently playing should be clear. In particular, if the show is selected from the uvue window, the NOW PLAYING label should disappear and all shows displayed in the Schedule/Channel window should have a consistent black background, since they are unselected. Even better, the border or outer box of the selected show in the uvue playlist could be highlighted while it is playing, or some other clear sign could indicate that that particular show is playing now.

Finding 4: Menus and repeated links are confusing

Certain areas of the Ubi interface are cluttered with duplicate links, violating the aesthetic and minimalist design heuristic to some degree.

As discussed in the findings above, some functions can be accessed from more than one location. For example, the DVR menu is linked from both the drop-down menu and the uvue window, and the See All Channels function is linked from both the drop-down menu and the channel guide. This crossover of links crowds the interface and may cause confusion as to why the site's functions have been divided into three areas (left-hand menu, uvue window, and channel guide). In addition, the drop-down menu is organized in a confusing manner. It is divided into three headings titled Menu, Tools, and Help, but it is unclear why only two links appear under the Tools header when other functions are clearly accessible elsewhere in the interface.

Recommendations: The site's functions could be organized in a more concise and clearly structured way to make accessing them more intuitive. For example, the Tools section of the drop-down menu could offer links to every function (DVR, uimport, uapps, etc.), and the other links to these options could be removed from the interface. Links to the site's main functions should be concentrated in one area of the screen in order to avoid confusion.

Finding 5: Unexplained errors and unavailable content

The ability to easily view shows is an integral part of an online television streaming service. The user needs to know which video clips are viewable before committing to the action, clicking, and then failing to find any content. The user is also not alerted beforehand to the blank results they are about to see, which can be an extremely frustrating and disappointing experience, especially when it happens repeatedly. This violates the error prevention heuristic by not allowing the user to make an informed choice beforehand. It also violates the heuristic of helping the user diagnose and recover from errors, since there are no further instructions as to what the user should do when no clips are available.

Figure 3: This screenshot shows the message displayed after clicking Video Clips in the menu on the left.

Recommendation: We recommend not only telling the user what is available beforehand, but also providing instruction when no video is available. Ubi can help the user in two specific ways when content is missing. First, inform users before they click on an option that has no available videos. This can be done by simply not offering the option, graying the button out, or making it unclickable. Second, after an error is reached (which will likely happen regardless of how well Ubi plans ahead), provide users with alternatives. These alternatives should be polite and instructive, or at least informative. Examples of ways to mitigate this issue include offering assistance with searching for another video, displaying related clips that are actually available, or explaining how the clips the user was originally looking for might be found.

Finding 6: Ambiguous button functionality

Some buttons within the Ubi interface are used in different locations to accomplish different functions, violating the consistency and standards heuristic. For example, clicking the (i) button within the uvue window offers the user a tip for using the feature, while the same button on the channel guide opens information and further actions related to a specific show. This may cause confusion over the use of the (i) button: if users decide it triggers a help feature, they will not realize it is also used to access other crucial features. Similarly, the (+) and (-) buttons are used for multiple tasks. Within menus, they maximize or minimize options; in the channel guide, they add shows to uvue; and on other screens, they remove items from lists.

Recommendation: To make the interface more consistent, buttons used to complete one type of task should not be used elsewhere to complete unrelated tasks. For example, the graphic used to open and close menus should be distinct from the one used to add and remove items from lists.

Finding 7: Lack of help

The only help available to users is via videos. There are no hands-on tutorials or FAQ pages.

One of the most important things a new product or service can do for its users is orient them by providing help in useful and appropriate places. While Ubi does offer videos teaching users how the service works, these are the only guidance the system provides. Watching a full video is time-consuming, and much of its content can be forgotten after the initial viewing, leading to frustration when using the actual site, since a user's only recourse is to re-watch the help video. It also means that information about individual features cannot be looked up if a user runs into confusion or trouble when trying to, for example, add a show to the DVR.

Recommendations: An initial help video is a useful way for viewers to get a high-level view of Ubi's capabilities, but not all users will take advantage of this option. Ubi should create tutorials that walk through each feature and show the user how to move through the site. After an initial visit, this tutorial could be turned off, but additional help should remain available via a thorough FAQ page. Each feature in the menus could also have a small information or help icon that users can click if they have difficulty remembering how to use it.

Discussion

Based on the ten usability heuristics developed by Jakob Nielsen, this heuristic evaluation was intended to reveal potential usability issues within MyUbi.tv. Individual evaluators have different foci and areas of expertise, and each may fail to identify certain problems. It is not realistic to expect each evaluator to catch all of the usability problems in the Ubi site. The reliability and comprehensiveness of a heuristic evaluation therefore depends on the experience, opinions, and insights of the individual evaluators. Our findings are based on only four evaluators. Since our team is limited to this size, the number of evaluators may be an issue, as Nielsen recommends five (1994), and it is possible that we missed some usability issues. Additionally, because we have been evaluating Ubi for about two months, the diversity of the evaluators' experience is limited.

Furthermore, the method we used to consolidate the individual evaluation matrices may produce biased results. Usability problems were ranked according to their average severity score (the sum of the severity scores divided by the number of evaluators who identified that problem). As a result, when only one evaluator found a problem and rated it as extremely severe, that problem received a high severity score, whereas if all four evaluators found the same or a similar issue but held different opinions about its severity, the averaged score came out more moderate. For example, a problem that a single evaluator rated 4 averages to 4.0, while a problem rated 3, 3, 2, and 2 by all four evaluators averages to only 2.5, even though the latter reflects broader agreement.

We regarded usability problems from both cases as key findings, so that problems with high severity scores and problems uncovered by more than two evaluators would both be fully considered when the scores and results were combined.

Finally, we discovered that Ubi made some changes to the interface after we completed our individual evaluations, fixing some of the problems we point out in this report. This means that not all of the findings discussed here may apply to the current iteration of Ubi.

Conclusion

By performing individual heuristic evaluations, rating the severity of usability issues, and combining our results, the group was able to generate seven key findings. We found that some terms are confusing; that some features, such as the DVR and the channel guide, are difficult to use; that it is not always clear which video the user is viewing; that menus and links are generally confusing; that unavailable content is not signified or explained to the user; that the function of some buttons is ambiguous; and that Ubi generally lacks help documentation and assistance. We then made recommendations for addressing these issues, including changing Ubi's vocabulary to more recognizable terms; offering simpler shortcuts for key functions; making it clearer which video is currently playing; creating simpler, more concise menus; preventing the user from clicking on unavailable videos and explaining alternate steps; making the function of individual buttons consistent across the interface; and offering tutorials and in-screen help. While our evaluation was limited by the small size of the group and the continually evolving nature of the streaming site, we feel that Ubi can significantly improve its user experience by addressing the major usability issues we have pointed out as development moves forward.

References

Newman, M. (2012). Report Writing & Comparative Analysis [PowerPoint slides].

Nielsen, J. (1994). Heuristic evaluation. In Nielsen, J., & Mack, R. L. (Eds.), Usability Inspection Methods. New York, NY: John Wiley & Sons.

Appendix

Appendix 1: Evaluators

Chuck (C): Male; MacBook; Firefox; Mac OS X; 13-inch monitor, 32-bit color
Jane (J): Female; MacBook Pro; Chrome; Mac OS; 13-inch monitor, 32-bit color
Nina (N): Female; MacBook; Chrome; OS X Snow Leopard; 13-inch monitor, 32-bit color
Liu (L): Female; MacBook Air; Firefox; OS X Snow Leopard; 11.6-inch monitor, 32-bit color

Appendix 2: Jakob Nielsen's Heuristics for User Interface Design

Reference: Nielsen, J. (1994). Heuristic evaluation. In Nielsen, J., & Mack, R. L. (Eds.), Usability Inspection Methods. New York, NY: John Wiley & Sons.

1. Visibility of system status: The system should always keep users informed about what is going on, through appropriate feedback within reasonable time.
2. Match between system and the real world: The system should speak the users' language, with words, phrases and concepts familiar to the user, rather than system-oriented terms. Follow real-world conventions, making information appear in a natural and logical order.
3. User control and freedom: Users often choose system functions by mistake and will need a clearly marked "emergency exit" to leave the unwanted state without having to go through an extended dialogue. Support undo and redo.
4. Consistency and standards: Users should not have to wonder whether different words, situations, or actions mean the same thing. Follow platform conventions.
5. Error prevention: Even better than good error messages is a careful design which prevents a problem from occurring in the first place. Either eliminate error-prone conditions or check for them and present users with a confirmation option before they commit to the action.
6. Recognition rather than recall: Minimize the user's memory load by making objects, actions, and options visible. The user should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate.
7. Flexibility and efficiency of use: Accelerators, unseen by the novice user, may often speed up the interaction for the expert user such that the system can cater to both inexperienced and experienced users. Allow users to tailor frequent actions.
8. Aesthetic and minimalist design: Dialogues should not contain information which is irrelevant or rarely needed. Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility.
9. Help users recognize, diagnose, and recover from errors: Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution.
10. Help and documentation: Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on the user's task, list concrete steps to be carried out, and not be too large.

Appendix 3: Severity Rating Scale

0 = I don't agree that this is a usability problem at all
1 = Cosmetic problem only: need not be fixed unless extra time is available on project
2 = Minor usability problem: fixing this should be given low priority
3 = Major usability problem: important to fix, so should be given high priority
4 = Usability catastrophe: imperative to fix this before product can be released

Appendix 4: Consolidated Heuristic Evaluation

The consolidated matrix recorded each problem or question, severity ratings from evaluators L, J, N, and C, and comments. Problems and comments are grouped by heuristic below.

- The color change for like/dislike is confusing.

Visibility of system status
- After hitting "refresh," the video screen goes blank, even if something was playing before this occurred. There is no indication of whether the video is simply loading slowly or will not be loading at all.

Match between system and the real world
- Some of the terms are not "regular" English and come without explanation. Terms such as "DVR," "uvue," "uapps," and "uimport" are unclear. Using the term "DVR" for an online system doesn't translate, because this concept is associated with regular television, not watching things online.

User control and freedom
- When a user hits "reload" in their browser, the site does not return them to the video they were watching.
- The channel guide does not support scrolling using a mousepad on a MacBook.
- The back button is not supported well enough. Using the back button results in a "confirm form resubmission" screen in place of the video viewer; it would be nice if the back button were supported.
- Removing shows from the playlist cannot be undone.
- When viewing a show, the rating box overlays the search box, so users can't use the search function while the menu is unfolded.

Consistency and standards
- Users don't really know what is clickable.
- Minus (-) and plus (+) signs are unclear. Both the "Menu" and "uvue" areas use (+) and (-) signs; users can click either the signs or the "Menu" title to fold and unfold, but for "uvue" only the signs are clickable, not the title. Minus (-) and plus (+) signs are used to open and close menus in some places and to add or delete items in other places.
- Certain items occur on both menus, but it is not clear why.
- The search box has two different default texts: "type search here" in the left sidebar under Search, and "type show name" in the profile account under Search Shows.
- The location of the "Add to DVR" button is not consistent: on the timeline, the user must first open a show's "information" page, but while browsing, it appears directly underneath each thumbnail.
- TALK TO US and FEEDBACK are the same feature, but the names are not consistent.

- The info (i) button is used for help in some places (such as uvue) and for show/episode information in other places.
- There are two signals for loading status: one for loading TV shows, and another for loading channels, the DVR, and other content.

Error prevention
- Users are allowed to click on unavailable videos. Some videos have expired, but Ubi does not prevent users from selecting them from the channel guide.
- There is an error message at the bottom of the section when the user enters an ill-formatted address in FEEDBACK/TALK TO US.
- Certain areas just say "coming soon," and there are a lot of blank spaces.

Recognition rather than recall
- It is hard to remember how to add a show to the DVR and how to create a channel guide; there is no clear step-by-step guide.
- Users do not always know what they are viewing. If a user clicks a TV show in the channel/timeline window, its status changes to "NOW PLAYING," but if the user then clicks to view another show from the playlist, the previous show still shows "NOW PLAYING" even though it is not the one actually playing.

Flexibility and efficiency of use
- There is no shortcut for users who want to add videos to their DVR.
- Managing "favorites" should have more complexity.

Aesthetic and minimalist design
- The menu is confusing; some links appear in more than one place, and the drop-down menu on the left is confusingly organized.
- It is unnecessary to include SEARCH SHOWS in the PROFILE ACCOUNT section.

Help users recognize, diagnose, and recover from errors
- Video expiration occurs, but the user is not informed why, or what to do to find more recent videos.
- When no shows fit the user's description, the site just says "no results found."

Help and documentation
- The small blue (i) button in the corner of shows does not do anything or provide information about what it is supposed to do.
- The only help available to users is via videos; there are no hands-on tutorials or FAQ pages.
- The explanation for uimport is too long and involved.

Appendix 5: Heuristic Evaluation Notes

1. Notes from Jane Davis

Visibility of system status
- After hitting "refresh," the video screen goes blank, even if something was playing before this occurred; there is no indication of whether the video is simply loading slowly or will not be loading at all. (2)

Match between system and the real world
- Using the term "DVR" for an online system doesn't translate, because this concept is associated with regular television, not watching things online. (3)

User control and freedom
- Using the back button results in a "confirm form resubmission" screen in place of the video viewer. Most people do not navigate this way, but it is still a problem. (2)
- When a user hits "reload" in their browser, the site does not return them to the video they were watching. (2)
- The channel guide does not support scrolling using a mousepad on a MacBook. (2)

Consistency and standards
- Aesthetically the site is quite consistent, but the menus are not: certain items occur on both menus, and it is not clear why. (2)

Error prevention
- Some videos have expired, but Ubi does not prevent users from selecting them from the channel guide. (3)

Recognition rather than recall
- It was impossible for me to remember how to add a show to my DVR, despite having done so in the past, and there are no cues on the site indicating how to do it. (4)

Flexibility and efficiency of use
- There is no shortcut for users who want to add videos to their DVR. (3)

Help users recognize, diagnose, and recover from errors
- Video expiration occurs, but the user is not informed why, or what to do to find more recent videos. (3)

Help and documentation
- The small blue (i) button in the corner of shows does not do anything or provide information about what it is supposed to do. (2)
- The only help available to users is via videos; there are no hands-on tutorials or FAQ pages.

2. Notes from Nina Elias

Visibility of system status
- There is a loading symbol when the channel guide is loading and any other buttons are clicked. There is no wait time at all when clicking the menu buttons around the channel guide. (0)

Match between system and the real world
- The terms are not technical, they just aren't common language. I don't know what ShowTilePlus buttons are. I have a vague concept of what DVR is; I have no idea what uapps and uimport might be. (3)

User control and freedom
- There is really no need to support this, as the interface stays the same; you can't really go back to something, but you can just click on the button that is nearly always there on the homepage. It would be nice if the back button was supported, though. (2)

Consistency and standards
- Everything is fairly similar except links: users don't really know what is clickable. The menu text looks like it should be a link. (3)

Error prevention
- It just says "coming soon" in certain areas. There are a lot of blank spaces where I would look for text, but the channel guide is just blank. (1)

Recognition rather than recall
- Task: trying to create a channel guide. I tried using help to do it since I've never done it before, but now I don't know where my step-by-step guide to creating my channel guide went! I tried following the instructions and it disappeared :( So I tried to do the first step, adding a channel by dragging. But when I tried to drag, it didn't stay in one place! And how do I know where I'm dragging to? I am unclear where to get channels and how to drag them. (4)

Flexibility and efficiency of use
- It looks like there are shortcuts if you're interested, but right now no one seems to be an advanced user. (1)

Aesthetic and minimalist design
- They are great with this in the learning center. The menu on the left is still confusing. (2)

Help users recognize, diagnose, and recover from errors
- When there are no shows that fit my description, it just says "no results found" or whatever. I think they should suggest other ones! I tried clicking around; I wasn't able to break anything :) (1)

Help and documentation
- Help is starting to look a lot better! I would provide text explanations as a pop-out window or something so that people can follow along as they do it. But the explanation for uimport is still too long and involved. Users don't care how it works; they just want to access their videos! (1)

3. Notes from Liu Liu

Visibility of system status
- The function of "uvue" is unclear because its meaning is hard to understand. (4)

Match between system and the real world
- The function of "uvue" is unclear because its meaning is hard to understand. (4)

User control and freedom
- When viewing a show, the rating box overlays the search box; users can't use the search function while the menu is unfolded. (3)
- Removing shows from the playlist cannot be undone. (2)

Consistency and standards
- Both "Menu" and "uvue" use (+) and (-) signs. Users can click either the signs or the "Menu" title to fold and unfold, but for "uvue" only the signs are clickable, not the title. (2)
- There are two signals for loading status: one for loading TV shows, and another for loading channels, the DVR, and other content. (1)
- The search box has two different default texts: "type search here" in the left sidebar under Search, and "type show name" in the profile account under Search Shows. (1)
- TALK TO US and FEEDBACK are the same. Why not name them the same to be consistent? (2)

Error prevention
- An error message appears at the bottom of the section when the user enters an ill-formatted address in FEEDBACK/TALK TO US. Why not use the signed-in user's name and address as defaults to prevent the error? (2)

Recognition rather than recall
- Users do not always know what they are viewing now, because if they click a TV show in the channel/timeline window its status changes to "NOW PLAYING," and if they then click to view another show from the playlist, the previous show's status is still "NOW PLAYING" while the show actually playing is not that one. (4)

Aesthetic and minimalist design
- It is unnecessary to include SEARCH SHOWS in the PROFILE ACCOUNT section. (2)

Help users recognize, diagnose, and recover from errors
- There is a red error message at the bottom of the section when the user enters an ill-formatted address in FEEDBACK/TALK TO US. It is not obvious enough to help users recognize errors. (1)

4. Notes from Chuck Adams

Visibility of system status
- The user sees a spinning icon to indicate that screens are loading and is given messages about video status, although these icons do not always appear when videos initially load or when thumbnails are loading. (1)

Match between system and the real world
- Most of the system uses recognizable television terminology, such as "channel guide," but some unfamiliar terms, such as "uvue," require more explanation. (3)

User control and freedom
- Positive: most tasks at this point are relatively simple, and they all allow the user to cancel or undo changes. (0)

Consistency and standards
- Minus (-) and plus (+) signs are used to open/close menus in some places and to add or delete items in other places. (2)
- The minus (-) sign is used to remove items from some lists, while removing an item from uvue requires dragging it to a "trash can." (2)
- The info (i) button is used for help in some places (such as uvue) and for show/episode information in other places. (2)
- The location of the "Add to DVR" button is not consistent: on the timeline, the user must first open a show's "information" page, but while browsing, it appears directly underneath each thumbnail. (2)

Error prevention
- Users are allowed to click on unavailable videos. (2)

Flexibility and efficiency of use
- Managing "favorites" should have more complexity (ability to bring up a list of recent episodes for favorite shows, etc.). (2)

Aesthetic and minimalist design
- The interface and menus are cluttered: some links appear in more than one place (e.g., DVR, See All Channels), and basic functions are organized into three separate areas. (3)
- The drop-down menu on the left is confusingly organized. (3)

Help and documentation
- Help is currently extremely limited and only available through videos preceded by advertisements. (4)


More information

Severity Definitions:

Severity Definitions: Heuristic Evaluation: Death by Hypothermia Overall, this project does a good job keeping the user interface simple. It is not difficult to go from one place to another within the application, but navigation

More information

User Interfaces Assignment 3: Heuristic Re-Design of Craigslist (English) Completed by Group 5 November 10, 2015 Phase 1: Analysis of Usability Issues Homepage Error 1: Overall the page is overwhelming

More information

The 23 Point UX Design Checklist

The 23 Point UX Design Checklist The 23 Point UX Design Checklist The 23 Point UX Design Checklist During the design process, some flaws in your product will go unnoticed. Those little (or sometimes big) things can do a lot to hurt the

More information

MBooks Usability Reports

MBooks Usability Reports University of Michigan Deep Blue deepblue.lib.umich.edu 2007-03 MBooks Usability Reports Bailey, Kehr; Chen, Wei; Karimova, Valentina; Raux, Sarah; Williams, Mary Ann http://hdl.handle.net/2027.42/107044

More information

Heuristic Evaluation Report. The New York Philharmonic Digital Archives archives.nyphil.org

Heuristic Evaluation Report. The New York Philharmonic Digital Archives archives.nyphil.org Heuristic Evaluation Report The New York Philharmonic Digital Archives archives.nyphil.org Cassie Hickman Wednesday, October 14, 2015 Table of Contents Executive Summary... 3 Introduction... 4 Methodology...

More information

SFU CMPT week 11

SFU CMPT week 11 SFU CMPT-363 2004-2 week 11 Manuel Zahariev E-mail: manuelz@cs.sfu.ca Based on course material from Arthur Kirkpatrick, Alissa Antle and Paul Hibbits July 21, 2004 1 Analytic Methods Advantages can be

More information

Usability & User Centered Design. SWE 432, Fall 2018 Design and Implementation of Software for the Web

Usability & User Centered Design. SWE 432, Fall 2018 Design and Implementation of Software for the Web Usability & User Centered Design SWE 432, Fall 2018 Design and Implementation of Software for the Web Review: Mental models Only single temperature sensor. Controls not independent, need to adjust both.

More information

Heuristic Evaluation. Heuristic evaluation evaluates the interface to identify usability problems against recognized usability design heuristics.

Heuristic Evaluation. Heuristic evaluation evaluates the interface to identify usability problems against recognized usability design heuristics. Heuristic Evaluation Heuristic evaluation evaluates the interface to identify usability problems against recognized usability design heuristics. Usability heuristics are best practices developed and identified

More information

Getting Help...71 Getting help with ScreenSteps...72

Getting Help...71 Getting help with ScreenSteps...72 GETTING STARTED Table of Contents Onboarding Guides... 3 Evaluating ScreenSteps--Welcome... 4 Evaluating ScreenSteps--Part 1: Create 3 Manuals... 6 Evaluating ScreenSteps--Part 2: Customize Your Knowledge

More information

Heuristic Evaluation of Mango

Heuristic Evaluation of Mango Heuristic Evaluation of Mango 1. Problem Mango is an application that makes it easier to plan group travel and collaborate on group itineraries by providing an interface to invite friends to a group trip,

More information

Usability. HCI - Human Computer Interaction

Usability. HCI - Human Computer Interaction Usability HCI - Human Computer Interaction Computer systems optimization for easy access and communication Definition design Basic principles Testing assessment implementation Quality Utility funcionality

More information

User Experience Design

User Experience Design User Experience Design PRESENTED BY Morgan Bibbs Director of Creative Services J. William Fulbright College of Arts & Sciences John C. Dailey, Ph.D. Content Strategist University Relations WHAT IS USER

More information

PROJECT 1. Heuristic Evaluation and Cognitive Walkthrough of Goroo.com

PROJECT 1. Heuristic Evaluation and Cognitive Walkthrough of Goroo.com PROJECT 1. Heuristic Evaluation and Cognitive Walkthrough of Goroo.com Cherese Cooper, Tatiana Iegorova, Andrew Wasowicz HCI 460 Usability Evaluations Spring 2013 1 Executive Summary In this report we

More information

Heuristic Evaluation

Heuristic Evaluation Heuristic Evaluation Assignment 11: HE of Prototypes (Individual) PROBLEM PlateList is a mobile application designed to help people overcome small obstacles when trying to cook by allowing users to (1)

More information

MSN TV Heuristic Evaluation

MSN TV Heuristic Evaluation Final Report IDIS: 1845 9 th September 009 This document was prepared by: Robert Murphy Table of Contents Executive Summary... Purpose & Aims of the Study... 5 Methodology... 7 Date and Location... 7 Procedure...

More information

User-Centered Design. SWE 432, Fall 2017 Design and Implementation of Software for the Web

User-Centered Design. SWE 432, Fall 2017 Design and Implementation of Software for the Web User-Centered Design SWE 432, Fall 2017 Design and Implementation of Software for the Web In class exercise As you come in and take a seat Write down at least 3 characteristics that makes something usable

More information

HEURISTIC EVALUATION WHY AND HOW

HEURISTIC EVALUATION WHY AND HOW HEURISTIC EVALUATION WHY AND HOW REF: Scott Klemmer Jacob Nielsen James Landay HEURISTIC EVALUATION Multiple ways to evaluate Empirical: Assess with real users trying s/w Formal: Models and formulas to

More information

Applying Usability to elearning

Applying Usability to elearning Applying Usability to elearning 6 December 08 John Wooden, PhD Director of Usability Services Fredrickson Communications jwooden@fredcomm.com We make information useful! About Fredrickson Communications

More information

Usability Test Report: Bento results interface 1

Usability Test Report: Bento results interface 1 Usability Test Report: Bento results interface 1 Summary Emily Daly and Ian Sloat conducted usability testing on the functionality of the Bento results interface. The test was conducted at the temporary

More information

Heuristic Evaluation of [Pass It On]

Heuristic Evaluation of [Pass It On] Heuristic Evaluation of [Pass It On] Evaluator #A: Janette Evaluator #B: John Evaluator #C: Pascal Evaluator #D: Eric 1. Problem Pass It On aims to transform some of the numerous negative and stressful

More information

Usability analysis and inspection

Usability analysis and inspection Usability analysis and inspection Why and how? 1MD113 Why? Iterative design Prototyping Measuring usability Objective/subjective feedback Quick and dirty Slow and clean With or without users 1 Evaluation

More information

SkillSwap. A community of learners and teachers

SkillSwap. A community of learners and teachers Team: Jacob Yu Villa, Dana Murphy, Tuan Tran SkillSwap A community of learners and teachers Problem During our needfinding process, we found that many people felt discouraged about learning due to the

More information

Why? Usability analysis and inspection. Evaluation. Evaluation. Measuring usability. Evaluating usability

Why? Usability analysis and inspection. Evaluation. Evaluation. Measuring usability. Evaluating usability Usability analysis and inspection Why and how? Iterative design Prototyping Measuring usability Why? Objective/subjective feedback Quick and dirty Slow and clean With or without users 1MD113 Evaluation

More information

The Pluralistic Usability Walk-Through Method S. Riihiaho Helsinki University of Technology P.O. Box 5400, FIN HUT

The Pluralistic Usability Walk-Through Method S. Riihiaho Helsinki University of Technology P.O. Box 5400, FIN HUT The Pluralistic Usability Walk-Through Method S. Riihiaho Helsinki University of Technology P.O. Box 5400, FIN-02015 HUT sirpa.riihiaho@hut.fi Abstract Pluralistic usability walkthrough is a usability

More information

CMSC434 Intro to Human-Computer Interaction. Visual Design #3 and Evaluation #1 Monday, April 8th, 2012 Instructor: Jon Froehlich TA: Kotaro Hara

CMSC434 Intro to Human-Computer Interaction. Visual Design #3 and Evaluation #1 Monday, April 8th, 2012 Instructor: Jon Froehlich TA: Kotaro Hara CMSC434 Intro to Human-Computer Interaction Visual Design #3 and Evaluation #1 Monday, April 8th, 2012 Instructor: Jon Froehlich TA: Kotaro Hara #inspiration [Applied Sciences Group: High Performance Touch,

More information

MiPhone Phone Usage Tracking

MiPhone Phone Usage Tracking MiPhone Phone Usage Tracking Team Scott Strong Designer Shane Miller Designer Sierra Anderson Designer Problem & Solution This project began as an effort to deter people from using their phones in class.

More information

Amsterdam Medical Center Department of Medical Informatics. Improve. Usability evaluation of the sign up process of the Improve app

Amsterdam Medical Center Department of Medical Informatics. Improve. Usability evaluation of the sign up process of the Improve app Amsterdam Medical Center Department of Medical Informatics Improve Usability evaluation of the sign up process of the Improve app Author L.J.M. Heerink Principal investigator Prof. Dr. M.W.M Jaspers Supervised

More information

Neon Carrot Prototype I Evaluation. Dan Cody, Logan Dethrow, Ben Fisher, Jeff Stanton April 6, Preamble

Neon Carrot Prototype I Evaluation. Dan Cody, Logan Dethrow, Ben Fisher, Jeff Stanton April 6, Preamble Neon Carrot Prototype I Evaluation Dan Cody, Logan Dethrow, Ben Fisher, Jeff Stanton April 6, 2009 Preamble Overall, we were impressed with the prototype's visual style, although some interface elements

More information

evaluation techniques goals of evaluation evaluation by experts cisc3650 human-computer interaction spring 2012 lecture # II.1

evaluation techniques goals of evaluation evaluation by experts cisc3650 human-computer interaction spring 2012 lecture # II.1 topics: evaluation techniques usability testing references: cisc3650 human-computer interaction spring 2012 lecture # II.1 evaluation techniques Human-Computer Interaction, by Alan Dix, Janet Finlay, Gregory

More information

Heuristic Evaluation: OneView CRM current application

Heuristic Evaluation: OneView CRM current application Evaluation: OneView CRM current application Angela Edwards 22nd August 2016 Executive Summary This report contains the results of a heuristic evaluation undertaken on the OneView CRM application for Lloyds

More information

Design Principles. Overview. User-Center Design. SMD157 Human-Computer Interaction Fall User-center design Guidelines

Design Principles. Overview. User-Center Design. SMD157 Human-Computer Interaction Fall User-center design Guidelines INSTITUTIONEN FÖR SYSTEMTEKNIK LULEÅ TEKNISKA UNIVERSITET Design Principles SMD157 Human-Computer Interaction Fall 2005 Nov-4-05 SMD157, Human-Computer Interaction 1 L Overview User-center design Guidelines

More information

Team Manatee Group Heuristic Evaluation

Team Manatee Group Heuristic Evaluation Team Manatee Group Heuristic Evaluation by Andy Barry, Greg Marra, Brad Powers, Will Yarak. Identified Issues Team Manatee's prototype performs well in the given scenarios but still has a number of major

More information

Ryan Parsons Chad Price Jia Reese Alex Vassallo

Ryan Parsons Chad Price Jia Reese Alex Vassallo Ryan Parsons - Paper Prototype, Writing Chad Price - Paper Prototype, Digital Mockup Jia Reese - Paper Prototype, Usability Testing Alex Vassallo - Usability Testing, Writing All we have to decide is what

More information

Design Principles. Overview. User-Center Design. SMD157 Human-Computer Interaction Fall User-center design Guidelines

Design Principles. Overview. User-Center Design. SMD157 Human-Computer Interaction Fall User-center design Guidelines INSTITUTIONEN FÖR SYSTEMTEKNIK LULEÅ TEKNISKA UNIVERSITET Design Principles SMD157 Human-Computer Interaction Fall 2003 Nov-6-03 SMD157, Human-Computer Interaction 1 L Overview User-center design Guidelines

More information

Evaluating myat&t Redesign. Conner Drew

Evaluating myat&t Redesign. Conner Drew Evaluating myat&t Redesign Conner Drew Types of 1. Think Aloud Protocol Evaluation 2. Cognitive Walkthrough 3. Heuristic Evaluation Think Aloud Protocol Think Aloud Protocol Evaluate the usability of designs

More information

Arduino IDE Friday, 26 October 2018

Arduino IDE Friday, 26 October 2018 Arduino IDE Friday, 26 October 2018 12:38 PM Looking Under The Hood Of The Arduino IDE FIND THE ARDUINO IDE DOWNLOAD First, jump on the internet with your favorite browser, and navigate to www.arduino.cc.

More information

CHAPTER 1 COPYRIGHTED MATERIAL. Finding Your Way in the Inventor Interface

CHAPTER 1 COPYRIGHTED MATERIAL. Finding Your Way in the Inventor Interface CHAPTER 1 Finding Your Way in the Inventor Interface COPYRIGHTED MATERIAL Understanding Inventor s interface behavior Opening existing files Creating new files Modifying the look and feel of Inventor Managing

More information

Nielsen s 10 Usability Heuristics. Heuristics evaluations and identifying heuristics violations

Nielsen s 10 Usability Heuristics. Heuristics evaluations and identifying heuristics violations Nielsen s 10 Usability Heuristics Heuristics evaluations and identifying heuristics violations Introduction About me - background with HCI HCI/usability in educational software Game based and gamified

More information