1 MiLE (Milano Lugano Evaluation Method): a systematic approach to usability evaluation. Part I.
LAB HCI, prof. Garzotto - HOC, POLITECNICO DI MILANO, a.a.

2 ABOUT USABILITY - 1. WHAT IS USABILITY?
- Usability is "the effectiveness, efficiency and satisfaction with which specified users can achieve specified goals in particular environments" (ISO 9241-11)
- What determines usability:
  - What the product is: product characteristics
  - Who is using it: user characteristics
  - What they want to achieve: goals and tasks
  - The context: circumstances of use

3 ABOUT USABILITY - 2. USABILITY'S CHALLENGE
- Applications that people want to use, understand how to use, are successful in using, are satisfied with, and make fewer errors with
- Higher levels of use: conquer new visitors, make customers come back
- Lower user support costs: fewer problems and complaints
- Contributes to trust building and branding

4 ABOUT USABILITY - 3. EVALUATION DEPTH
- The usability of an application can be analysed at three levels of depth:
  1. general (for any interactive digital artifact)
  2. category (e.g. web application)
  3. destination (e.g. museum web site)

5 ABOUT USABILITY - 4. EXISTING USABILITY METHODS / 1
- Two categories:
- Empirical testing: involves some representative users; evaluators collect usability data from users simulating a session of use
  - Effective for detecting the impact of the overall look & feel
  - Problems: gap between real sessions and simulated ones; the problem of finding representative users; high costs
  - Main techniques: thinking aloud, contextual inquiry, focus group

6 ABOUT USABILITY - 5. EXISTING USABILITY METHODS / 2
- Inspection methods: involve usability experts ONLY; the evaluator systematically performs tasks on the application
  - Effective for detecting usability and design errors and breakdowns
  - Analytic, repeatable, at lower cost than empirical testing
  - Drawback: may neglect or underestimate some problems
  - Main techniques: heuristic evaluation, pluralistic walkthrough

7 ABOUT USABILITY - 6. EXISTING USABILITY METHODS / 3
- It has been empirically proved that the most effective evaluation (i.e., the one discovering the highest number of usability problems) is achieved by systematically combining inspection and empirical testing

8 MiLE methodology: principles

9 MiLE methodology: principles - WHAT IS MiLE?
- MiLE is a systematic usability evaluation method that systematically combines inspection with empirical testing
- Developed at Politecnico di Milano (Hypermedia Open Centre) and at the University of Lugano, Communication Sciences (TEC-lab)
- Up to now applied to museum web sites, e-government, and e-commerce

10 MiLE methodology: principles - MiLE ACTIVITIES
- Combination of inspection and empirical testing
- Inspection: evaluation of usability attributes that are independent from the specific goals and users of the application
- Empirical testing: driven by (focused by means of) the results of inspection
- Organized in two main phases:
  - Preliminary phase: defining or choosing the evaluation tools (U-KIT)
  - Execution phase: performing the evaluation

11 MiLE methodology: principles - KEY CONCEPTS
- Analysis levels
- Usability attributes (heuristics)
- Abstract tasks
- Scenarios
- Measurement based on scoring and weighting

12 MiLE methodology: principles - Different perspectives/dimensions of analysis during inspection
During inspection, MiLE emphasizes the need for:
- separating different perspectives of analysis:
  - Technical perspective: the analysis of the aspects that are independent from the specific user requirements
  - Requirement perspective: the analysis of the aspects that can be judged only in relationship to the actual user requirements
- separating different dimensions of analysis, corresponding to different design dimensions:
  - navigation
  - content
  - operational functionality
  - cognitive aspects
  - semiotic aspects
  - graphics
  - technology (performance)

13 MiLE methodology: principles - ATTRIBUTES (HEURISTICS)_1
MiLE attributes:
- are usability factors: application properties that contribute to usability, e.g. learnability, predictability, ...
- are heuristic, i.e., based on common sense, usability experience, and principles from different disciplines
- have different levels of abstraction and generality, and should be decomposed into finer-grained, more measurable sub-factors

14 MiLE methodology: principles - ATTRIBUTES (HEURISTICS)_2
MiLE attributes can be organized as:
- Technical attributes: can be measured or judged largely independently from the actual user requirements; e.g. lay-out consistency
- Requirement-dependent attributes: can be measured or judged only in relationship with the actual user requirements (profile and goals of the user); e.g. content completeness, content richness, functional completeness, learnability
- Orthogonally, most attributes can be organized according to the different design/analysis dimensions: navigation, content, lay-out, ... (see previous slide)

15 MiLE methodology: principles - ATTRIBUTES (HEURISTICS)_3
Evaluating usability requires evaluating the attributes by which we choose to describe usability.
- During inspection: the inspector argues about the satisfaction/violation of some attributes; assigning a quantitative value to each attribute (evaluation) is based on the inspector's personal judgement
- MiLE makes inspection activities more systematic (based on scenarios and abstract tasks), but the evaluation/ranking of attributes is still subjective

16 MiLE methodology: principles - ATTRIBUTES (HEURISTICS)_4
- During empirical testing: collection of empirical data about a set of MEASURABLE factors associated with some usability attribute
  - Performance measures (data about the user behavior with the system), e.g.:
    - Efficiency (how well a user achieves the goals of interest): time to complete a task; number and percentage of tasks completed correctly (resp. incorrectly), with and without assistance
    - Orientation (the degree to which the user understands where he/she is in the hyperspace): number of uses/requests of help; number of incorrect links traversed; number of "backs"; ...
  - Preference measures: data about the user opinion or user satisfaction, collected by means of questionnaires, debriefings, user rankings, etc.
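As a concrete illustration (not part of MiLE itself), the sketch below shows how raw observations of this kind could be summarized into the performance measures listed above; the task name and record fields are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class TaskObservation:
    task: str
    completed: bool        # was the task completed correctly?
    assisted: bool         # did the user need help?
    seconds: float         # time to complete (or abandon) the task
    incorrect_links: int   # incorrect links traversed (orientation)
    backs: int             # uses of "back" (orientation)

def summarize(observations: list) -> dict:
    """Aggregate raw observations into the performance measures above."""
    n = len(observations)
    done = [o for o in observations if o.completed]
    return {
        "pct_completed": 100 * len(done) / n,
        "pct_completed_unassisted": 100 * sum(not o.assisted for o in done) / n,
        "avg_time_s": sum(o.seconds for o in done) / max(len(done), 1),
        "avg_incorrect_links": sum(o.incorrect_links for o in observations) / n,
        "avg_backs": sum(o.backs for o in observations) / n,
    }

obs = [TaskObservation("find course schedule", True, False, 48.0, 1, 2),
       TaskObservation("find course schedule", False, True, 120.0, 4, 6)]
print(summarize(obs))  # {'pct_completed': 50.0, ...}
```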

17 MiLE methodology: principles - GUIDING THE INSPECTION AND THE EMPIRICAL TESTING
Planning: how to define WHAT to do with the application?
- Abstract tasks (inspection)
- Concrete tasks derived from scenarios (empirical testing)
... and WHERE to do it?
- Scenarios

18 MiLE methodology: principles - ABSTRACT TASK / 1
- An abstract task (A.T.) describes a pattern for an inspection activity: a (set of) parametric action(s) that the inspector should perform on the application in order to evaluate TECHNICAL attributes
- A.T. structure: <technical attribute, (set of) inspection action(s)> (see the .doc files)
- Abstract tasks:
  - capture evaluation experience
  - enforce standardisation and uniformity, thus reducing the time and cost needed for inspection
- So far, A.T.s have been defined for:
  - level 2: navigation / content / semiotics / layout / multiple media dynamics (see the .doc files)
  - level 3: museum web sites
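A minimal sketch of the <technical attribute, inspection action(s)> pair as a data structure; the attribute name and the action text below are invented for illustration, not taken from the MiLE A.T. library.

```python
from dataclasses import dataclass

@dataclass
class AbstractTask:
    attribute: str   # the technical attribute under evaluation
    actions: list    # parametric inspection action(s) to perform

# Hypothetical example of a navigation-related A.T.
backtracking = AbstractTask(
    attribute="Navigation: backtracking consistency",
    actions=["From a page of type X, follow a link and go back; "
             "check that the starting context is fully restored"],
)
```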

19 MiLE methodology: principles - ABSTRACT TASK / 2
During inspection, the inspector:
- performs the abstract tasks corresponding to the chosen attributes
- subjectively measures the corresponding attributes (see section: scoring and weighting)
The portions of the application to be inspected can be selected:
- according to some previously defined scenarios, to focus where to move
- randomly, when the application is small or scenarios are missing
- comprehensively (the entire application), when the application is small or there are enough resources to inspect all pages

20 MiLE methodology: principles - SCENARIOS_1
- Scenario: "story about use" (Carroll, 2002); the description of a concrete episode of use of the application (Cato, 2001)
- A scenario in MiLE is composed of:
  - a user profile
  - a user goal: something the user wants to achieve with the application
  - a set of tasks: something the user wants to do with the application to achieve the goal
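The scenario structure can be made concrete with a small sketch; the class layout below is an illustration, not a format prescribed by MiLE.

```python
from dataclasses import dataclass

@dataclass
class Scenario:
    profile: str   # user profile
    goal: str      # something the user wants to achieve with the application
    tasks: list    # things the user does with the application to reach the goal
```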

21 SCENARIOS_2 - Example (web-based learning application)
- Scenario 1: Student
  - Goal: Plan the study
  - Tasks: Know the time required to attend a course; Find the ideal period to attend a classroom session
- Scenario 2: Student
  - Goal: Know the course conditions
  - Tasks: Identify course goals; Understand the course structure; Understand how to communicate with tutors and peers
- Scenario 3: Instructor
  - Goal: Know the achieved learning level
  - Tasks: Make a test in order to verify the level of learning achieved; Verify in which topics there are learning gaps
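For instance, Scenario 1 above could be encoded with the Scenario structure from the previous sketch:

```python
scenario_1 = Scenario(
    profile="Student",
    goal="Plan the study",
    tasks=[
        "Know the time required to attend a course",
        "Find the ideal period to attend a classroom session",
    ],
)
```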

22 MiLE methodology: principles - USER PROFILE
- Defines the main target(s) of the application
- Some criteria for creating user profiles:
  - Person profile: age, job, degree of general culture, provenance, ...
  - Background with similar applications: ability with the internet, degree of familiarity with the web site (first-time visitor, frequent user, ...)

23 MiLE methodology: principles - SCENARIOS_3 - Why scenarios?
1) During inspection:
- Scenarios act as inspection focus: they help inspectors to identify the portion(s) of the application that are representative for evaluation purposes (those which look most useful to fulfill the scenario's goal)
- The inspector carries out the abstract tasks and measures the corresponding usability attributes while trying to achieve the scenario's goal and performing the scenario's tasks

24 MiLE methodology: principles - SCENARIOS_3 - Why scenarios? (cont.)
2) During inspection:
- Scenarios define what the inspector should try to do/achieve with the application when putting him/herself in the user's point of view
- Scenarios define the tasks that inspectors should try to carry out to:
  - evaluate requirement-dependent usability attributes (related to the domain and to user goals/needs) which cannot be supported by A.T.s, e.g. content completeness, ...
  - get an idea of what may feel problematic to the user

25 MiLE methodology: principles - SCENARIOS_4 - Why scenarios? (cont.)
3) During empirical testing:
- The user carries out (a subset of) the scenarios' tasks, at least those which helped to identify more problems during inspection

26 MiLE methodology: principles - THE CONCEPT OF EVALUATION KIT (U-KIT)
A MiLE U-KIT includes:
- a set of specialized scenarios, for a specific domain (e.g., museums) or for a category of applications (e.g., cell phones)
- technical attributes (heuristics) coupled with abstract tasks
- requirement-dependent attributes
The U-KIT is defined during the preparatory phase (choosing the kit elements from an existing library, if available)
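A minimal sketch of a U-KIT as a bundle of these three ingredients, reusing the illustrative objects from the earlier sketches; the dictionary layout is an assumption, not a MiLE-prescribed format.

```python
u_kit = {
    "scenarios": [scenario_1],               # specialized for the domain
    "technical_attributes": [backtracking],  # heuristics coupled with A.T.s
    "requirement_dependent": ["Content completeness", "Learnability"],
}
```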

27 MiLE methodology: execution process

28 MiLE methodology: execution process
- The MiLE execution process involves two main phases: INSPECTION and USER TESTING

29 MiLE methodology: execution process - 0. THE PHASES OF INSPECTION
MiLE inspection involves several activities (not all of which are strictly necessary, nor are they performed in a strict sequence):
- Modeling the application under inspection (the mandatory first step)
- Performing abstract tasks + scoring technical usability attributes
- Performing scenario tasks + scoring requirement-driven usability attributes (possibly, also higher-level technical attributes)
- Weighting the scores according to the actual user requirements
- Reporting: the evaluation activity, the results, descriptions of improvements, design and requirements revisions, ...

30 MiLE methodology: execution process - MODELING THE APPLICATION UNDER INSPECTION
- Goal: understanding what the application is about and building a mental model of it
  - Mental model = a cognitive structure of concepts and procedures that helps users choose the appropriate actions and understand what happens with the application (Carroll)
- Activity: the reviewer draws a high-level model of the application under inspection, either totally informally or adopting a semi-formal model (e.g. W2000)
- Expected output: a general schema of the application features, e.g., structure of the content, navigational capabilities, interface elements, services. The output varies in level of detail according to the aspects of the web site that the reviewer is committed to evaluate (navigation, content, interactions, layout readability, etc.) and the required depth of the evaluation

31 MiLE methodology: execution process - PERFORMING ABSTRACT TASKS (scenario-driven approach)
- Goal: evaluating technical attributes
- Activity: for each task of a salient scenario, while trying to accomplish the task the reviewer:
  - performs the abstract tasks for the technical attributes defined in the preliminary phase
  - assesses whether or not the user task can be properly accomplished
  - assigns a score to each attribute
- Expected output: for each user task, a two-value mark (YES: can be accomplished; NO: impossible to accomplish) and a score for each technical attribute

32 MiLE methodology: execution process - EVALUATING REQUIREMENT-DEPENDENT USABILITY ATTRIBUTES
- Goal: evaluating requirement-dependent usability attributes
- Activity: for each task of a scenario, while trying to accomplish the task the reviewer:
  - assesses whether or not the user task can be properly accomplished
  - assigns a score to each relevant requirement-dependent attribute
- Expected output: for each user task, a two-value mark (YES: can be accomplished; NO: impossible to accomplish) and a score for each requirement-dependent attribute
- SEE PART II

33 MiLE methodology: execution process - EVALUATING REQUIREMENT-DEPENDENT USABILITY ATTRIBUTES (cont.)
- Scores: the metrics can be chosen by the evaluator
- The metrics can be quantitative (numbers) or qualitative (a judgment, e.g., bad, average, good, very good)
- Any qualitative metric can be converted into a numeric one and normalized according to the chosen numeric interval
- Our suggestion, SCORE RANGE 1-5:
  - 1 = the attribute is totally violated
  - 2 = insufficient fulfillment (attribute violated in most cases)
  - 3 = on average, sufficient fulfillment
  - 4 = good fulfillment, with minor violations
  - 5 = excellent fulfillment (no violations detected)
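A small sketch of the suggested conversion and normalization, assuming the 1-5 range above; the qualitative-to-numeric mapping is only illustrative.

```python
# Hypothetical mapping of the qualitative judgments onto the 1-5 scale.
QUALITATIVE = {"bad": 1, "average": 3, "good": 4, "very good": 5}

def normalize(score: float, lo: float = 1, hi: float = 5) -> float:
    """Map a score on the interval [lo, hi] onto [0, 1]."""
    return (score - lo) / (hi - lo)

print(normalize(QUALITATIVE["good"]))  # 0.75
```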

34 MiLE methodology: execution process - REPORTING THE SCORING
Basic scoring schema: one table for each task of each scenario.

Scenario number: ...
Task number: ...
Task description: ...
Accomplishment (Yes/No): ...

Dimension of analysis (e.g. navigation):
  Attribute | Score (1..5) | Source of problems (brief description of a situation where violations occurred)
  A1        | ...          | ...
  ...       | ...          | ...
  An        | ...          | ...

Dimension of analysis (e.g. layout):
  Attribute | Score (1..5) | Source of problems
  H1        | ...          | ...
  ...       | ...          | ...
  Hn        | ...          | ...

35 MiLE methodology: execution process - REPORTING THE SCORING (cont.)
- The set of all tables for all scenarios is a hypercube (a multidimensional matrix): hard to represent!
- Data can be grouped in different ways (e.g., by attribute)
- Several aggregated values can be derived for further analysis (e.g., average score for each attribute, global score for each task, average global score for each scenario, ...)
- The kind of aggregation depends on which data the inspector looks for
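For example, a minimal sketch of one such aggregation (average score per attribute across all scenarios and tasks), using the scores that appear in the table of slide 38:

```python
from collections import defaultdict

# (scenario, task, attribute, score) records; values taken from slide 38
records = [(1, 1, "A1", 2), (1, 1, "A2", 4), (1, 2, "A1", 1)]

by_attribute = defaultdict(list)
for scenario, task, attribute, score in records:
    by_attribute[attribute].append(score)

averages = {attr: sum(s) / len(s) for attr, s in by_attribute.items()}
print(averages)  # {'A1': 1.5, 'A2': 4.0}
```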

36 MiLE methodology: execution process - WEIGHTING THE RESULTS
Goals:
- defining the relevance and criticality of the various problems, i.e., how important they are in real-world situations
- establishing a priority for correction actions
How:
- weighting the evaluation results according to the relevance of the task, of the scenario, or both (combination of weights)
- and/or weighting the evaluation results according to the relevance of the attributes (in some cases, an attribute which is important for one scenario is less important for another)
N.B.:
- a subjective measure (to be agreed with the stakeholders and grounded in the requirements analysis)
- to be validated (or contradicted) by means of empirical testing

37 MiLE methodology: execution process - WEIGHTING THE RESULTS (cont.)
The weighting method depends on how the data are analysed. For example:
- identification of the attributes that are most violated in the most important scenarios/tasks
- assignment of a priority rate to scenarios (in the range 1-3: 1 = high priority; 3 = low priority)
- weight = priority (see the next table)

38 MiLE methodology: execution process
It is more important to find corrections for the violations here (Task 1), even if A1 is more violated in Task 2!

Task 1 (task priority = 1, high):
  Attribute | Score | Weighted score (weight = priority)
  A1        | 2     | 2
  A2        | 4     | 4

Task 2 (task priority = 3, low):
  Attribute | Score | Weighted score
  A1        | 1     | 3
  A2        | ...   | ...
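One reading of the arithmetic in this table, sketched in code: weight = task priority, weighted score = score x weight, so that lower weighted scores surface the most urgent corrections. This ordering rule is an assumption inferred from the table, not spelled out in the slides.

```python
tasks = {
    "Task 1": {"priority": 1, "scores": {"A1": 2, "A2": 4}},  # 1 = high priority
    "Task 2": {"priority": 3, "scores": {"A1": 1}},           # 3 = low priority
}

weighted = {
    (task, attr): score * info["priority"]   # weight = priority
    for task, info in tasks.items()
    for attr, score in info["scores"].items()
}

# Assumed ordering: a LOWER weighted score flags a more urgent correction.
for (task, attr), w in sorted(weighted.items(), key=lambda kv: kv[1]):
    print(task, attr, w)   # Task 1 A1 2 / Task 2 A1 3 / Task 1 A2 4
```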
