Should the Word "Survey" Be Avoided in Email Invitation Messaging?


ACT Research & Policy | Issue Brief | 2016

Raeal Moore, PhD

Raeal Moore is a senior research scientist specializing in survey methodological research and research on education best practices in P-12 schools.

www.act.org/research-policy | Email research.policy@act.org for more information
© 2016 by ACT, Inc. All rights reserved.

Introduction

The wording of email invitations requesting respondents' participation in a survey has the potential to affect the data collected.[1] Research shows that a short invitation message,[2] a survey URL placed toward the top of the invitation,[3] and an explicit indication of a short completion time[4] all positively affect response rates relative to when these design elements are absent. Evidence also suggests that email subject line messaging affects response rates. An authoritative subject line ("MSU Vice President asks you to take a survey") improves response rates compared with one of higher subject matter salience ("Take an MSU survey on campus environmental stewardship").[5] Likewise, a plea ("Please help us to make improvements") improves response rates relative to a subject line that poses a question ("Would you like to provide your feedback?").[6] Similarly, Trouteaud (2004)[7] found that a simple "Please help" produced a higher response rate than subject lines that presented an offer ("Share your advice" or "Take some of your time to share").

To the author's knowledge, little research has examined whether including the word "survey" in either the invitation message or the subject line has a differential impact on response rates. The question is particularly salient because when ACT Survey Research (ACT SR) sends a survey invitation, the word "survey" is typically excluded from the messaging to avoid having the email flagged as spam; in the past, ACT SR encountered situations in which spam filters flagged such messages and prevented them from reaching email inboxes. ACT SR therefore implemented two experimental studies to determine whether including or excluding the word "survey" in the email invitation messaging had a differential impact on response rates.

Experiment 1

Students who took the ACT test were invited to participate in an online survey about their test-taking experience (N = 50,000). To study the relationship between invitation messaging and survey participation, an equal number of students was randomly assigned to one of two groups based on whether the word "survey" was included or excluded in the email invitation.
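As a concrete illustration of this design, the sketch below splits a hypothetical list of 50,000 student IDs into two equal groups of 25,000. The IDs, the seed, and the use of NumPy are assumptions made for illustration; this is not ACT SR's actual assignment procedure.

    # A minimal sketch of equal random assignment to two invitation conditions.
    # Hypothetical student IDs; not ACT SR's production code.
    import numpy as np

    rng = np.random.default_rng(seed=2016)        # fixed seed so the split is reproducible
    student_ids = np.arange(1, 50_001)            # hypothetical IDs for the 50,000 invitees

    shuffled = rng.permutation(student_ids)
    survey_included, survey_excluded = np.split(shuffled, 2)   # 25,000 students per group

    assert len(survey_included) == len(survey_excluded) == 25_000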

Figure 1 presents a sample invitation message illustrating the inclusion of the word "survey"; the invitation message that excluded the word "survey" replaced it with the word "questions." A total of 2,242 students participated in the survey.[8] Three research questions were investigated.

Research Question 1: Does including the word "survey" in the invitation message decrease the number of students who open the email?

Of central importance was determining whether including the word "survey" in the invitation message increased the number of emails flagged as spam. Given that the collection of these data was not possible, the number of emails opened was used as a proxy indicator. Figure 2 shows the number of respondents, by survey behavior and by whether the word "survey" was included or excluded in the invitation message. Almost twice as many students opened the email when the word "survey" was included in the invitation message (n = 6,073; 24%) as when it was excluded (n = 3,619; 14%). This difference was statistically significant (χ²(1) = 813.46, p < .05) and had a small effect size (w = .13).[9] Not only did emails that included the word "survey" successfully reach their destinations, they were also associated with higher open rates. Including the word "survey" therefore did not decrease the number of students who opened the email; it increased it.

Figure 1. Including the word "survey" in the email invitation (sample message not reproduced here)

Figure 2. Number of responses, by survey behavior and invitation message type

  Survey behavior     "Survey" included in invitation    "Survey" excluded from invitation
  Opened email        6,073 (24%)                        3,619 (14%)
  Completed survey    890 (4%)                           728 (3%)
  Broke off survey    361 (1%)                           263 (1%)
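The open-rate comparison for Research Question 1 can be approximated from the counts in Figure 2. The sketch below is a minimal illustration, not ACT SR's analysis code; it assumes equal group sizes of 25,000 (an even split of the 50,000 invitations), so the result will be close to, but not necessarily identical to, the published value.

    # Pearson chi-square on the 2 x 2 table of opened vs. not-opened email by
    # invitation wording, with Cohen's w as the effect size (w equals phi for a
    # 2 x 2 table; see note 9). Group sizes of 25,000 each are an assumption.
    import math
    from scipy.stats import chi2_contingency

    group_size = 25_000
    opened = {"survey included": 6_073, "survey excluded": 3_619}   # counts from Figure 2

    table = [[n, group_size - n] for n in opened.values()]
    chi2, p, dof, expected = chi2_contingency(table, correction=False)

    w = math.sqrt(chi2 / (2 * group_size))   # Cohen's w = sqrt(chi-square / N)
    print(f"chi2({dof}) = {chi2:.2f}, p = {p:.3g}, w = {w:.2f}")

SciPy applies the Yates continuity correction to 2 x 2 tables by default; correction=False is passed here on the assumption that the brief reports the uncorrected Pearson statistic.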

Research Question 2: Does including the word "survey" in the invitation message increase the number of students who complete the survey?

Since including the word "survey" in the invitation message increased the number of students who opened the email, the second research question asked whether it also increased the number of students who completed the survey. Figure 2 displays the number of students who completed the survey and the number who broke off before completion. Slightly more students completed the survey when the invitation included the word "survey" (n = 890; 4%) than when it was excluded (n = 728; 3%). This difference in completion rates was statistically significant (χ²(1) = 16.76, p < .05) but had little practical significance (w = .02). Statistically speaking, completion rates were higher when students received an invitation message with the word "survey" included relative to when the word was excluded, but the magnitude of the difference (162 students) was virtually non-existent, implying that the statistical difference was most likely due to the large sample size. Interestingly, students who received an invitation with the word "survey" included had a higher break-off rate (n = 361; 1%) than students who had the word excluded (n = 263; 1%); however, this difference was neither statistically (χ²(1) = 1.48, p > .05) nor practically significant (w = .03). Including the word "survey" in the invitation message did not necessarily increase the number of students who completed the survey, nor did it influence whether students broke off from participating.

Research Question 3: Do response patterns before and after the reminder message differ between students who received the invitation message with the word "survey" included and students who received the invitation message with the word "survey" excluded?

A total of 2,242 students responded to at least one survey question (i.e., they completed the survey or broke off before completion); these students were included in this analysis. Figure 3 presents the percentage of students who answered at least one survey question, disaggregated by whether participation occurred before or after the reminder message was received and by whether the student received the invitation message with the word "survey" included. The results demonstrate differential response patterns before and after the reminder message. When the invitation message included the word "survey," a higher proportion of responses occurred before the reminder message (61%) than after it (39%). The inverse pattern emerged when students received an invitation message excluding the word "survey": a smaller proportion responded before the reminder message (43%) than after it (57%). This difference in response patterns between the two invitation message types was statistically significant (χ²(1) = 73.57, p < .05) and had a small effect size (w = .18).

Figure 3. Percentage of responses, by invitation message type and reminder message status

  Invitation message      Before the reminder message    After the reminder message
  "Survey" included       61%                            39%
  "Survey" excluded       43%                            57%

Summary

Three research questions were investigated to determine whether including or excluding the word "survey" in the invitation message produced differences in response rates. Differences emerged. Including the word "survey" in the invitation message increased the number of students who opened the email and the number of students who completed the survey. A total of 162 more students completed the survey when the invitation message explicitly articulated what was being asked of them; this gain was statistically but perhaps not practically significant. Differences in response patterns by invitation message type before and after the reminder message was received also appeared. These results suggest that if there is no time to administer a reminder message, including the word "survey" in the original email invitation is important for obtaining a higher response rate.

Experiment 2

Since the first experiment showed that including the word "survey" in the invitation message increased the likelihood of opening the email invitation, the second experiment sought first to replicate the effects on response rates of including or excluding the word "survey" in the invitation message and, second, to extend the question by determining whether including or excluding the word "survey" in the subject line also affects response rate patterns.

Students who took the ACT test in February 2016 were invited to participate in an online survey about how they prepared for the ACT (N = 45,400). To study the relationship between invitation messaging and survey participation, students were randomly assigned to one of four groups, based on whether the word "survey" was included in the email invitation and in the subject line. Figure 4 shows these four groups and the number of students randomly assigned to each group.

Figure 4. Random assignment of students to four experimental groups

                                      Invitation message:      Invitation message:
                                      "survey" included        "survey" excluded
  Subject line: "survey" included     11,350                   11,350
  Subject line: "survey" excluded     11,350                   11,350
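The sketch below illustrates the 2 x 2 crossing in Figure 4, again with hypothetical IDs rather than ACT SR's actual procedure: 45,400 shuffled IDs are dealt into four cells of 11,350 each.

    # Minimal sketch of the 2 x 2 factorial assignment in Figure 4:
    # subject-line wording crossed with invitation wording. Hypothetical IDs.
    from itertools import product
    import numpy as np

    rng = np.random.default_rng(seed=2016)
    student_ids = rng.permutation(np.arange(1, 45_401))    # N = 45,400

    cells = list(product(("subject line: survey", "subject line: no survey"),
                         ("invitation: survey", "invitation: no survey")))
    groups = dict(zip(cells, np.split(student_ids, 4)))    # 11,350 students per cell

    for cell, ids in groups.items():
        print(cell, len(ids))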

Figures 5a and 5b present a sample invitation message and subject line illustrating the inclusion and exclusion of the word "survey." A total of 7,204 students responded.[10] Three research questions were investigated.

Figure 5a. Including the word "survey" in the email invitation and subject line (sample message not reproduced here)
Figure 5b. Excluding the word "survey" in the email invitation and subject line (sample message not reproduced here)

Research Question 1: Does including the word "survey" in the email invitation and/or subject line influence the percentage of students who answer at least one question in the survey?

Figure 6 presents the percentage of respondents in each of the four invitation messaging groups. When the word "survey" was included in either the invitation message or the subject line, the percentage of students who answered at least one survey question was higher than when it was excluded from both. The largest response rate (17.1%) occurred when the subject line included the word "survey" but the invitation message did not, followed by including the word "survey" in the invitation message only (15.6%) and in both the invitation message and the subject line (15.4%). The smallest response rate (15.3%) occurred when both the invitation and the subject line excluded the word "survey." These differences in response patterns were statistically significant (χ²(3) = 17.86, p < .05), but the associated effect size was negligible (w = .02). Including the word "survey" in the invitation messaging did not have a meaningful impact on response rates, although including the word in the subject line increased response rates by approximately two percentage points relative to when the word "survey" was excluded from both the subject line and the invitation message.

Figure 6. Response rates, by invitation messaging

  Subject Line Only          17.1%
  Invitation Message Only    15.6%
  Both                       15.4%
  Neither                    15.3%

Note: Subject Line Only = only the subject line included the word "survey" (n = 1,941); Invitation Message Only = only the invitation message included the word "survey" (n = 1,772); Both = both the invitation message and the subject line included the word "survey" (n = 1,751); Neither = neither the invitation message nor the subject line included the word "survey" (n = 1,739).
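The recurring "statistically significant but practically negligible" pattern follows from the very large sample: with roughly 45,400 invited students, differences of one or two percentage points can produce a significant chi-square while Cohen's w stays near zero. The sketch below uses the respondent counts from the Figure 6 note and the assumed cell size of 11,350; small discrepancies from the published statistic may arise from rounding in the reported counts.

    # 4 x 2 chi-square of respondents vs. non-respondents across the four
    # messaging groups, with Cohen's w to show the negligible effect size.
    # Assumed illustration, not ACT SR's analysis code.
    import math
    from scipy.stats import chi2_contingency

    group_size = 11_350                     # per Figure 4
    respondents = {                         # counts from the Figure 6 note
        "Subject Line Only": 1_941,
        "Invitation Message Only": 1_772,
        "Both": 1_751,
        "Neither": 1_739,
    }

    table = [[n, group_size - n] for n in respondents.values()]
    chi2, p, dof, _ = chi2_contingency(table)      # Yates correction applies only to 2 x 2 tables
    w = math.sqrt(chi2 / (4 * group_size))         # Cohen's w over the full sample

    print(f"chi2({dof}) = {chi2:.2f}, p = {p:.3g}, w = {w:.3f}")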

Research Question 2: Does including the word "survey" in the email invitation and/or subject line increase the percentage of students who complete the survey?

Figure 7 summarizes the percentage of students who completed the survey or broke off, by invitation messaging type. A larger percentage of students completed the survey when both the invitation message and the subject line included the word "survey" than in the other three groups. This difference was statistically significant (χ²(3) = 14.21, p < .05), but the associated effect size was negligible (w = .02). Including the word "survey" in the invitation message and the subject line did not appear to improve completion rates; neither did it decrease them.

Figure 7. Percentage of students, by survey behavior and invitation messaging type

  Group                      Completed    Broke off
  Subject Line Only          90%          11%
  Invitation Message Only    89%          12%
  Both                       92%          8%
  Neither                    90%          10%

Note: Subject Line Only = only the subject line included the word "survey" (finished = 1,738); Invitation Message Only = only the invitation message included the word "survey" (finished = 1,569); Both = both the invitation message and the subject line included the word "survey" (finished = 1,611); Neither = neither the invitation message nor the subject line included the word "survey" (finished = 1,564).

Research Question 3: Does including the word "survey" in the email invitation and/or subject line influence response rates before and after the reminder message is received?

As Figure 8 illustrates, responses were split roughly evenly between students who participated before the reminder message was received and those who participated after it, and this pattern was relatively consistent across the four invitation messaging groups. The differences in response patterns before and after the reminder message were statistically significant (χ²(3) = 11.04, p < .05) but not meaningfully so (w = .04). Including the word "survey" in the invitation messaging did not appear to influence response rates before or after the reminder message.

Figure 8. Percentage of responses, by invitation messaging type and reminder message status

  Group                      Before the reminder message    After the reminder message
  Subject Line Only          54%                            46%
  Invitation Message Only    55%                            45%
  Both                       50%                            50%
  Neither                    53%                            47%

Note: Subject Line Only = only the subject line included the word "survey"; Invitation Message Only = only the invitation message included the word "survey"; Both = both the invitation message and the subject line included the word "survey"; Neither = neither the invitation message nor the subject line included the word "survey." This figure includes only those students who answered at least one survey question (n = 7,204).

Summary

No differences in response rate patterns emerged between the four invitation messaging types. Whether students received an invitation message or a subject line that included or excluded the word "survey" had little bearing on survey participation or completion rates. Furthermore, the invitation messaging did not create a differential response pattern before and after the reminder message was received.

Conclusion

Two experimental studies were implemented to determine whether including or excluding the word "survey" in the invitation messaging would affect response rates. The most surprising result was that students who received an invitation message with the word "survey" included were more inclined to open the email message. This is good news, as it provides evidence refuting the prior belief that including the word "survey" would flag the message as spam. It also means that the ACT Survey Research team does not need to avoid the word "survey" when soliciting students' participation in online surveys.

Both experiments showed that including or excluding the word "survey" in an invitation message, whether in the email invitation or the subject line, did not have a meaningful impact on completion or break-off rates. It is worth noting, however, that in both experiments students who received messaging with the word "survey" included had higher survey participation rates than those whose messaging excluded the word, although the associated effect sizes were negligible.

Inconsistencies between the two experiments emerged with respect to survey behavior before and after the reminder message. In the first experiment, more students participated after the reminder message when the word "survey" was excluded, whereas more students participated before the reminder message when the word "survey" was included. The second experiment showed no differences in response patterns before and after the reminder message, whether or not the word "survey" was included in the messaging. It is unclear why these contradictory results emerged; however, the time of year the invitations were administered might relate to the inconsistent trends. Both studies used ACT test takers as the population of interest, but the surveys were administered at different times of the year. In the first experiment, surveys were administered in June, a few weeks after the students had taken the test. The invitations for the second experiment were sent in February, three hours after students had taken the test.

Readers outside of ACT should interpret these results with caution. Students who complete ACT surveys have had prior experience with the organization, and this prior interaction is not a luxury all survey researchers have. Regardless, the results are promising in that, for these respondents, including the word "survey" in invitation messaging did not negatively impact the data collected.

Notes

1. Dillman, D. A., Smyth, J. D., & Christian, L. M. (2013). Internet, phone, mail, and mixed-mode surveys: The tailored design method (3rd ed.). New York, NY: John Wiley; Kaplowitz, M. D., Lupi, F., Couper, M. P., & Thorp, L. (2012). The Effect of Invitation Design on Web Survey Response Rates. Social Science Computer Review, 30(3), 339-349; Mavletova, A., Deviatko, I., & Maloshonok, N. (2014). Invitation Design Elements in Web Surveys: Can One Ignore Interactions? Bulletin de Méthodologie Sociologique, 123, 8-79.
2. Dillman et al., 2013.
3. Couper, M. P. (2008). Designing effective web surveys. New York, NY: Cambridge University Press.
4. Marcus, B., Bosnjak, M., Lindner, S., Pilischenko, S., & Schutz, A. (2007). Compensating for low topic interest and long surveys: A field experiment on nonresponse in web surveys. Social Science Computer Review, 25, 372-383.
5. Kaplowitz et al., 2012.
6. Henderson, V. (2011). Increasing (or Decreasing) Response Rate by Changing the Subject of Email Invitations. Paper presented at the AAPOR Annual Conference, May 12-15, 2011, Phoenix, USA.
7. Trouteaud, A. R. (2004). How You Ask Counts: A Test of Internet-Related Components of Response Rates to a Web-Based Survey. Social Science Computer Review, 22(3), 385-392.
8. Participation is defined as answering at least one survey question.
9. Phi was used to calculate effect size. A value of .1 is considered a small effect, .3 a medium effect, and .5 a large effect.
10. See note 8.