Introduction to Web Surveys. Survey Research Laboratory University of Illinois at Chicago October 2010


Web Surveys. First reported use in the early 1990s; dramatic increase in use over the past decade; numerous web survey software packages are now available.

Basic Advantages of Web Surveys: speed; cost; convenience (self-administered); multimedia delivery (sound, video); the power of computer-assisted programming; a unique, high-tech feel. Similar arguments were made for CATI (in the 1970s) and CAPI (in the 1980s) technologies.

Typology of Web Surveys, with thanks to Mick Couper, University of Michigan Survey Research Center (see Couper, "Web Surveys: A Review of Issues and Approaches," Public Opinion Quarterly 2000; 64: 464-494). Two broad families: non-probability methods and probability methods.

Non-probability Methods 1. Polls as entertainment 2. Unrestricted self-selected surveys 3. Volunteer opt-in panels

Probability Methods 4. Intercept surveys 5. List-based samples 6. Web option in mixed-mode surveys 7. Pre-recruited panels of internet users 8. Pre-recruited panels of full population

Type 1: Entertainment Polls. The "question of the day," intended primarily for entertainment; no pretense at science or representativeness. Basically harmless, so long as the audience can distinguish them from real surveys.

Type 2: Unrestricted Self-Selected Surveys. Open invitation on portals, frequently visited web sites, or dedicated survey sites; no access restrictions; ballot-stuffing is possible. The equivalent of a 1-900 poll or magazine-insert survey. The main distinction from Type 1 is that claims to legitimacy are sometimes made here.

Type 3: Volunteer Panels of Internet Users. Create a volunteer (opt-in) panel using an open invitation, recruiting a large number of people willing to do web surveys; use quota controls or random sampling to select persons from this group for a particular survey; control access through an invitation and PIN. Although the 2nd step (selection within the panel) is controlled, the 1st step is self-selected and uncontrolled.

Type 3, Continued. Claims to generalize to the total population; maybe the most common approach to web surveys now. As with all panels, some of these pay respondents to participate. Some claim that these panels are equal to or better than other forms of data collection based on probability methods.

Probability-Based Methods. #4: Intercept Surveys. The target is visitors to a web site (customer satisfaction, web site evaluation). A systematic sample is commonly used: every nth visitor. There is no coverage problem, because the population of interest is active web users; the biggest problem is nonresponse. There is also a problem of timing: when to intercept?

#5: List-Based Samples of High-Coverage Populations. Recruit and invite those with web access to participate; restricted to current internet users; controlled access. Nonresponse occurs at many stages in the process but can be measured (via analyses of baseline data). Examples of high-coverage populations: college students, members of professional organizations, IT professionals.

#6: Mixed-Mode Designs with Choice of Completion Method. The web survey is part of a mixed-mode design; for example, a mail survey with an invitation to respond via the web. Requires controlled access. There are concerns over equivalence of measurement (mode effects), and it is more expensive than a single-mode survey, but offering respondents a choice helps confront nonresponse problems.

#7: Pre-Recruited Panels of Internet Users. Random probability methods are used to contact and invite persons with internet access to participate. This approach is limited to active internet users, is more expensive than some other approaches, and may make it difficult to assess nonresponse.

#8: Probability Samples of the Full Population. Start with a probability sample of the target population and give everyone internet access in exchange for participation. This is the only approach with the potential to be representative of the general population, and its biases and errors can be measured. Nonresponse, panel effects, and costs are big concerns. Example: Knowledge Networks.

Designing Web Questionnaires

Basic Design Approaches. Static web questionnaire: the survey is a single HTML document; respondents can scroll through it; data are sent to the server once, when the survey is completed. Interactive web questionnaire: questions are delivered one at a time or in modules; data are sent to the server after each screen is completed; conducive to the use of skip patterns, consistency checks, range checks, etc.
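Because an interactive questionnaire posts each screen to the server as it is completed, range checks and skip patterns can run before the next screen is delivered. A minimal Python sketch of both devices (the field names, rules, and screen identifiers here are illustrative assumptions, not taken from the presentation):

```python
def validate_screen(answers, range_rules):
    """Range check: flag any answer that falls outside its allowed interval."""
    errors = {}
    for field, (low, high) in range_rules.items():
        value = answers.get(field)
        if value is None or not (low <= value <= high):
            errors[field] = f"expected a value between {low} and {high}"
    return errors

def next_screen(answers):
    """Skip pattern: route the respondent around follow-ups that do not apply."""
    if answers.get("uses_internet") == "no":
        return "demographics"          # skip the internet-use follow-up screen
    return "internet_use_detail"
```

For example, `next_screen({"uses_internet": "no"})` routes straight to the demographics screen. A side effect of this per-screen flow, noted on the slides, is that partial data survive even when a respondent breaks off.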

Static Web Questionnaires. Very similar to mail and other self-administered questionnaires; can minimize download time. Respondents can skip questions, but the process is not usually automated; hypertext links can be used to facilitate skips. All information is lost if the respondent quits before finishing. More advantageous for short questionnaires.

Interactive Web Questionnaires. This approach permits the use of all computer-assisted programming devices. It may increase the length of the survey due to additional download time, but partial data are captured for respondents who quit before finishing. More advantageous for longer and more complex questionnaires.

Progress Indicators. The purpose is to motivate respondents to complete the questionnaire in the absence of an interviewer. Couper et al. (2001): 89.9% completed the survey with a progress indicator vs. 86.4% without one. Very useful in interactive questionnaires, where the respondent does not know how long the questionnaire is; not necessary in static questionnaires, where respondents can determine the length by scrolling through them. May add to survey length if it increases download time, and there is some concern about increased break-offs. Transition sentences are an alternative, though empirical evidence regarding their effectiveness is not clear.
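A progress indicator in an interactive questionnaire is usually just a percent-complete label computed from the respondent's position. One wrinkle (an implementation note of ours, not from the presentation): with skip patterns the true number of remaining screens varies by path, so a common compromise is to use the longest possible path as the denominator so the indicator never runs backwards.

```python
def progress_label(screens_completed, total_screens):
    """Percent-complete label shown to the respondent.

    total_screens is the length of the longest possible path, so that
    skipped screens can only make the bar jump forward, never backward.
    """
    pct = round(100 * screens_completed / total_screens)
    return f"{pct}% complete"
```

For instance, 9 of 20 screens yields the label "45% complete".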

General Screen Design. Do not use background colors or images: they can create contrast and reading problems (visual noise).

General Screen Design #2. Be aware that images may bias responses. Witte et al. (2004), National Geographic survey: images increased support for species protection. Couper et al. (2007), healthy vs. sick person image: when exposed to the fit person, respondents consistently rated their own health as lower than when exposed to the sick person. Use the upper right corner for contact information; privacy/IRB information can be clickable from there. If the top-of-screen format is consistent, respondents will tend to ignore that section across pages ("banner blindness").

General Screen Design #3. Access to other relevant information can also be provided: answers to commonly asked questions about the survey, or PDF versions of the full questionnaire.

Effect of Color on Web Survey Completion. Do not overuse color, but use it consistently. Use red only for emergency messages: red-green distinctions are a problem for persons who are color-blind (10% of males are color-blind, and 99% of color-blind persons cannot distinguish green and red). White or off-white backgrounds seem to work best; there is some evidence that respondents view black-on-white web pages as more professional than white-on-black pages. Couper (2008) prefers light blue backgrounds.

Color, Continued. For maximum readability, there should be high contrast between the text color and the background color; bright colors are easier to see than pastels. Colored backgrounds are often used by spammers and may reduce response rates.
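The high-contrast advice can be checked numerically. The sketch below computes the WCAG 2.0 contrast ratio between two sRGB colors: black text on a white background scores the maximum 21:1, while low-contrast pairings like gray-on-white score far lower.

```python
def relative_luminance(rgb):
    """WCAG 2.0 relative luminance of an sRGB color given as 0-255 integers."""
    def linearize(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(foreground, background):
    """WCAG contrast ratio between two colors, from 1:1 up to 21:1."""
    lighter = max(relative_luminance(foreground), relative_luminance(background))
    darker = min(relative_luminance(foreground), relative_luminance(background))
    return (lighter + 0.05) / (darker + 0.05)
```

For example, `contrast_ratio((0, 0, 0), (255, 255, 255))` returns 21.0, while mid-gray text on white falls well below the 4.5:1 minimum WCAG recommends for body text.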

Text. Always avoid small font sizes (use 10-12 point). There appears to be some preference for Arial over Times Roman. Do not overuse bold, underline, italics, and other forms of emphasis.

Question Presentation. Avoid requiring the respondent to scroll horizontally; avoiding any scrolling may be best. There is no agreement about the inclusion of question numbers; excluding them may avoid skip-logic confusion. Likert questions (fully labeled) should be displayed vertically.

Question Presentation #2. Respondents are less likely to skip words when lines are kept short. Provide computer-operating instructions at the precise point where a respondent may need that information. When the number of responses cannot fit on a single screen, double- or triple-banking may be the best approach; place a box around the categories to group them as relevant to the question.

Question Presentation #3. Visibility principle: options that are visible are more likely to be selected than those that are not visible until the respondent takes some action to display them. Response models: serial processing model (search the options for a pre-existing judgment); deadline processing model (spend a certain amount of time and select the best answer found before a cognitive deadline, a form of satisficing).

Common Types of Response Options for Web Surveys 1. Radio buttons or boxes 2. Drop-down boxes 3. Check boxes 4. Slider bars 5. Text boxes 6. Open-ended questions

Radio Buttons Options are typically mutually exclusive

Be careful not to use long grids that lose their column headings.

Boxes instead of buttons

Drop-Down Boxes. Useful only for closed lists of response options; can be designed to allow single or multiple choices; the options provided must be exhaustive. Drop-down boxes are more difficult to use than radio buttons.

Beware of Scroll Mice. Healy (2007): drop-downs (compared to radio buttons) led to higher item nonresponse and longer response times, and respondents using scroll mice to complete the survey were prone to accidentally changing an answer when presented with drop-down questions.

Check Boxes Unlike radio buttons, multiple choices can be clicked via check boxes
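The single- versus multiple-choice distinction comes down to one HTML attribute: radio inputs that share a `name` are mutually exclusive, while checkboxes with the same `name` can all be ticked. A minimal renderer illustrating the difference (the function name and option values are hypothetical, not from the presentation):

```python
def render_options(name, options, multiple=False):
    """Render a question's response options as HTML.

    multiple=False gives radio buttons (one choice); multiple=True gives
    check boxes (any number of choices). Options are (value, label) pairs.
    """
    input_type = "checkbox" if multiple else "radio"
    return "\n".join(
        f'<label><input type="{input_type}" name="{name}" value="{value}"> '
        f'{label}</label>'
        for value, label in options
    )
```

For example, `render_options("q1", [("y", "Yes"), ("n", "No")])` emits two radio inputs sharing the name `q1`, so the browser enforces mutual exclusivity; passing `multiple=True` swaps in check boxes.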

Radio Button/Check Box Hybrid

Slider Bars (a.k.a. visual analog scales, graphic rating scales, "sliders")

Slider Bars: Research. In a random experiment by Bayer & Thomas (2004) of Harris Interactive, slider bars took about twice as long to complete as any other scale type (including semantic differentials, Likert scales, etc.): answering 2 slider-bar questions averaged 42.3 seconds, compared to 21.3 seconds for semantic differential questions. Couper (2008) says results using slider bars are quite similar to those obtained from a scale that uses radio buttons.

Open-Ended Questions. Providing more space encourages respondents to provide longer answers.

Present Single or Multiple Items per Screen? For interactive questionnaires, multiple items per screen are completed more quickly by respondents, may provide more context, and produce less missing data. Intercorrelations among items are consistently higher when they are grouped together on one screen (Couper et al. 2001).

Survey Navigation. A consistent format should be followed. Use action buttons that are different from any response input elements such as radio buttons. "Next screen" or "next question" buttons should appear on all pages; Crawford et al. (2005) recommend putting them in the lower left corner, with "previous screen" or "previous question" buttons in the bottom right corner.

Key Point. Never force respondents to answer a question: it adds to frustration, it has IRB implications, and no other questionnaire format forces answers.

Key Questionnaire Design Principles, Summary. Minimize respondent burden and frustration: the fewer clicks, the better; the less scrolling, the better; the fewer distractions, the better; the fewer problems knowing how to navigate the questionnaire, the better; the less download time required, the better. And never force respondents to answer questions.

Some other design recommendations to consider (from Couper 2008): remove unneeded content and clutter; minimize the number of different colors and fonts used; use consistent design formats throughout the entire instrument; avoid putting too much material on any page.

Summary. Web surveys vary greatly in their goals, design, execution, analysis, etc., so evaluation must be done in the context of the type of survey being conducted; one cannot say that all web surveys are good or bad. Methodological research is being done on a moving target.

Thank You timj@uic.edu