Factors in Errors of Omission on a Self-Administered Paper Questionnaire




This article was downloaded by: [University of California Santa Cruz] on 22 November 2014, at 10:14. Publisher: Taylor & Francis. Informa Ltd, registered in England and Wales, registered number 1072954; registered office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK.

Journal of Health Communication: International Perspectives. Publication details, including instructions for authors and subscription information: http://www.tandfonline.com/loi/uhcm20

Factors in Errors of Omission on a Self-Administered Paper Questionnaire. Brett McBride & David Cantor, Westat, Rockville, Maryland, USA. Published online: 10 Dec 2010.

To cite this article: Brett McBride & David Cantor (2010) Factors in Errors of Omission on a Self-Administered Paper Questionnaire, Journal of Health Communication: International Perspectives, 15:sup3, 102-116, DOI: 10.1080/10810730.2010.525690

To link to this article: http://dx.doi.org/10.1080/10810730.2010.525690





Survey Methodology

Factors in Errors of Omission on a Self-Administered Paper Questionnaire

BRETT MCBRIDE AND DAVID CANTOR

Westat, Rockville, Maryland, USA

This article examines the role of question and respondent characteristics in omission errors made on the 2007 Health Information National Trends Survey (HINTS) questionnaire. Higher omission error rates were found for items with open-ended response formats, items placed outside of the body of the questionnaire, and items following skip instructions. Respondent and survey completion characteristics seen to impact omission error included age, education level, household income level, and the amount of time respondents reported having spent on the questionnaire.

Having complete and accurate health data is important for identifying the effectiveness of communication strategies. Knowledge of individuals' health-seeking behavior requires the collection of attitudinal and behavioral data, often through surveys. Problems involved in collecting data include inaccurate reports and individuals' refusal or inability to provide some or all of the requested data. This article focuses on factors leading to missing data in a questionnaire on health-seeking behaviors.

Omission error (OE) occurs when a respondent who should answer an item does not. Respondents might leave items blank if they do not know how to answer, if the item asks for information they do not want to provide, or if other aspects of the item inhibit them from responding. This research looked at design and respondent characteristics leading to errors of omission on a self-administered paper survey covering health and health communications. The goal is to assess which characteristics were associated with OE. The implications for the design of the questionnaire are discussed. This research also sought to identify population groups vulnerable to higher rates of OE. Once population groups vulnerable to OE are identified, it may be possible to minimize OE by reducing the complexity of the items to which they are exposed.

Errors of commission occur when respondents mistakenly answer an item that is not in the intended navigational path. The research discussed below concentrates on errors of omission, since these errors limit researchers to using data collected from individual respondents who provide complete questionnaires. This reduced data universe may obscure health attitudes or behaviors specific to certain groups, such as the elderly, if these groups are more likely to omit data. There are imputation methods available that provide plausible values to substitute for values respondents leave unanswered. Various methods may be used depending upon assumptions about the causes of missing data (Groves, 2006; De Leeuw, Hox, & Huisman, 2003); however, these methods require respondents to provide data for some questions and will not provide a perfect substitute for the data that were not collected.

Address correspondence to Brett McBride, Statistician, Westat, RE447, 1600 Research Boulevard, Rockville, MD 20850, USA. E-mail: [email protected]

Journal of Health Communication, 15:102–116, 2010. Copyright © Taylor & Francis Group, LLC. ISSN: 1081-0730 print/1087-0415 online. DOI: 10.1080/10810730.2010.525690

Research into question characteristics has suggested that question types demanding greater cognitive sophistication on the part of respondents may lead to higher levels of OE (Dillman, Redline, & Carley-Baxter, 1999). Some of the features likely to impose greater cognitive burden include items with a large number of words, items rated as being challenging to comprehend, items in unconventional locations within the questionnaire, and those that follow a skip instruction. The interplay of different features may also contribute to OE. For example, items located at the bottom of a questionnaire page that require respondents to skip elsewhere in the questionnaire were found to lead to higher OE (Dillman et al., 1999; Redline & Dillman, 2002; Redline, Dillman, Carley-Baxter, & Creecy, 2003).

Past research has suggested adapting designs to reflect natural question response processes in order to simplify the respondent's task. One design principle is that instructions should be placed where they are visible and will immediately be used. As applied to skip instructions, placing instructions on the last response category that respondents see before continuing to the next item has been associated with reduced rates of skip errors (Jenkins & Dillman, 1997). In contrast, having a skip instruction associated with a response category in the middle of a long list of categories may lead to it being forgotten by the time the respondent finishes reading the entire list (Eysenck, 2001). Another common item format that may increase cognitive burden is open-ended response items. These items require respondents to use free recall to supply a response instead of selecting a response from a set of options.

Studies have sought to determine which respondent characteristics may be responsible for higher rates of missing data. Although items imposing greater cognitive burden are seen as problematic, research has been mixed on the role of the education level of respondents, with some finding decreased omissions for more-educated respondents (Craig & McCann, 1978; Ferber, 1966) but others not finding education to play a significant role (Messmer & Seymour, 1982). Results have also been mixed as to the importance of respondent gender (Craig & McCann, 1978; Ferber, 1966; Messmer & Seymour, 1982). Research has found that OE increased among older respondents (Ferber, 1966; Messmer & Seymour, 1982). This may be a result of a diminished capacity to handle complex cognitive processes among older respondents, who have less working memory capacity.

The current research revisits some of these findings and examines other features thought to have an impact on OE. As with earlier research, these findings are shaped by the context of the questionnaire, in this case respondents answering items about health awareness and behavior. The intent is to provide a "case study" of factors and the extent to which they lead to OE. The questionnaire used as the example includes a wide variety of items and skip instructions administered to a nationally representative population of adults. The level of OE and the factors associated with these types of errors give survey designers perspective on the possible magnitude of the errors and how they might be avoided.


Data Source

This research uses data from the Health Information National Trends Survey (HINTS), sponsored by the National Cancer Institute (National Institutes of Health). HINTS is a nationally representative survey of American adults' use of health and cancer-related information, assessing health communication, perceptions, knowledge, and behaviors. HINTS has been conducted three times since 2003, with the current research making use of data collected by mail in 2007. Data were also collected in 2007 by use of an RDD landline survey. In the mail administration of the questionnaire, households were sent three copies of the questionnaire booklet with instructions that each adult living in the household fill one out. The questionnaire contained 189 items, 31 of which contained skip instructions. Respondents reported that the questionnaire took an average of 30 minutes to complete. A total of 3,582 respondents returned partial or completed questionnaires,1 which translated into a weighted response rate of 31% (using American Association for Public Opinion Research [AAPOR] response rate formula #3).
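AAPOR's response rate formula #3 divides completed returns by the known-eligible cases plus an estimated-eligible share of the cases whose eligibility is unknown. A minimal sketch of that formula follows; the disposition counts are hypothetical (the actual HINTS 2007 case counts are not given here), and the published 31% figure was additionally weighted, which this sketch ignores:

```python
def aapor_rr3(complete, partial, refusal, noncontact, other,
              unknown_household, unknown_other, e):
    """AAPOR Response Rate 3: completes divided by known-eligible cases
    plus an estimated-eligible share of unknown-eligibility cases.

    `e` is the estimated proportion of unknown cases that are eligible.
    """
    known_eligible = complete + partial + refusal + noncontact + other
    return complete / (known_eligible + e * (unknown_household + unknown_other))

# Hypothetical counts, NOT the real HINTS dispositions:
rate = aapor_rr3(complete=3000, partial=582, refusal=2000, noncontact=1500,
                 other=500, unknown_household=4000, unknown_other=0, e=0.5)
print(round(rate, 3))  # 0.313 with these made-up counts
```

Note that RR3 counts only fully completed cases in the numerator; AAPOR's RR4 variant adds partials.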

Errors of Omission and Question Characteristics

The analysis of OE examines question characteristics that are related to cognitive, motivational, and navigational issues in filling out a paper self-administered questionnaire. In particular, we examined the following characteristics as they relate to OE:

• Series of Items Sharing the Same Response Categories (Figure 1). Faced with the task of answering a seemingly repetitive set of items with the same response categories, respondents may exhibit acquiescence and choose only to answer those items they believe are relevant to them.

1 A questionnaire was determined to be partially completed if the respondent had at least filled out key health communication items.

Figure 1. Example of items sharing the same response categories.


• Items With Open-Ended Response Categories. Answering an open-ended item requires more cognitive energy because respondents have to provide their own context for the item. This contrasts with close-ended items, which provide cues that assist in defining the scope of the item as well as serve as recall cues.

• Long Items. Long items may intimidate respondents who are not adept at reading. Long items, with multiple conditions and clauses, also increase demands on carrying more information in short-term memory. For purposes of the analysis below, we define a long item as one that contains more than 25 words.2

• Item Location. Two types of location are examined.

  • Items Located on the Front and on the Reverse of the Last Page of the Questionnaire. These items may get overlooked because they are not part of the main body of items. For HINTS, there was a single item on household size included on the inside cover of the instrument. There were also items included on the back cover of the instrument. Because of their location, respondents may inadvertently miss these items because they are not looking for them.

  • Items Located Toward the End of the Questionnaire. These may receive less effort from respondents as they have waning motivation to attend to the question response task.

• Items That Are Part of a Skip Instruction. A skip instruction requires that respondents read and comprehend the correct navigational path. In addition, it increases burden by requiring respondents to process the instruction. Several specific types of skip instructions were examined.

  • Items That Are Part of an Embedded Skip (Figure 2). Items that are located within two or more skip instructions (referred to as "embedded skips") require respondents to successfully interpret multiple skip instructions.

  • Skip Instructions for Items at the Bottom of the Page (Figure 3). When a question with a skip instruction is located at the bottom of the page, the respondent must attend to the navigational task of going to the next column or page. This disruption may lead them to incorrectly carry out the navigational task, leading to skip errors (Dillman et al., 1999; Redline & Dillman, 2002; Redline et al., 2003).

2 Questions in which the question stem was more than 25 words but was followed by a series of component items were not included, as many of these question stems were primarily composed of introductory instructions.

Figure 2. Example of a single and an embedded skip type.

To test whether the above characteristics make a difference in OE, an OE rate for an item was calculated as the percentage of respondents who were instructed to answer the item but did not (more information on the calculation of OE rates is provided in the Appendix). When testing whether the above question characteristics were related to errors of omission, the proportion in error for items with and without the characteristic was computed. Whether the difference of the two proportions was significantly different from zero was tested through the use of replication methods using WesVar 4.2 (for more details, see Westat, 2002; Wolter, 1985).
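The per-item rate just defined can be sketched concretely: the denominator is the set of respondents instructed to answer the item, and the numerator is the subset of those who left it blank. The function and toy arrays below are illustrative only; coding blanks as NaN is an assumption, not the HINTS data coding:

```python
import numpy as np

def oe_rate(responses, instructed):
    """OE rate for one item: the share of respondents who were
    instructed to answer it (True in `instructed`) but left it
    blank (NaN in `responses`)."""
    instructed = np.asarray(instructed, dtype=bool)
    blank = np.isnan(np.asarray(responses, dtype=float))
    return (blank & instructed).sum() / instructed.sum()

# Toy data: 10 respondents; the last 2 were routed past the item by a skip.
responses = np.array([1, np.nan, 2, 1, np.nan, 3, 1, 2, np.nan, 1])
instructed = np.array([1, 1, 1, 1, 1, 1, 1, 1, 0, 0], dtype=bool)
print(oe_rate(responses, instructed))  # 2 blanks among 8 instructed -> 0.25
```

The blank left by the ninth respondent does not count against the item, since a skip instruction routed that respondent around it.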

Across all 189 items, the questionnaire had an average OE rate of 5.7%. Figure 4 indicates OE rates for the 108 items that all respondents were instructed to answer (i.e., they exclude those items that were affected by a prior skip instruction).

The 10 items that shared the same response categories were not found to have significantly higher OE rates than the 98 items that had independent response categories. Open-ended items were more likely to have higher OE rates than items that had close-ended response categories. Items with open-ended response categories had an average OE rate of 4.7%. The 11 items that contained more than 25 words, when compared with the 97 shorter items, were not found to have significantly different average OE rates (3.2% and 2.9%, respectively). With respect to the location of items, four items on the reverse of the last page contained a continuation of demographic and administrative items. These items had a higher OE rate, averaging 4.6%, compared with the other 14 items in the demographic section, averaging 3.1%.

Figure 3. Example of an item (H10a) skipped from the bottom of the page.


The questionnaire additionally featured an item placed on the inside cover of the instrument asking about the number of adults living in the household. This item had the second highest OE rate, 10.1%, of any item in the questionnaire that all respondents were instructed to answer.3 The chronological location of items did not seem to affect OE. As respondents progressed through the questionnaire, the rate at which items were left unanswered remained surprisingly steady (Figure 5). Dividing the questionnaire into 9 sets of 12 items that all respondents were instructed to answer, the OE rate remained at almost 3% throughout the questionnaire. Only at the end, where the demographic and reverse-of-the-last-page items were located, did OE rates increase slightly.

With respect to items that followed skip instructions, the 81 items that some respondents were instructed to skip had elevated OE rates, averaging 9.3% (Figure 6). In contrast, the 108 items that no respondents were instructed to skip had an average OE rate of 2.9%. The OE rate on the 81 items following skip instructions was elevated by respondents following skip instructions incorrectly: choosing as their answer to the skip item (the item with a skip instruction) the "nonskip category" but then omitting following items. Among these 81 items, open-ended items were found to have higher OE rates (averaging 20.8%) than close-ended items (averaging 8.1%). Apparently the combination of both the skip instruction and the open-ended response format significantly increases the probability that an item will not be filled out. It is also important to note, however, that even for the close-ended items, the OE rate was higher than the average for items that all respondents were instructed to answer.

Perhaps more telling are items that immediately followed embedded skip items; embedded skip items are skip items located within a larger skip instruction from a previous skip item. These were associated with higher OE rates than those following only one skip instruction. Figure 2 shows an example of an item following one skip instruction (H2) and an item following an embedded skip item (H2a).

Figure 4. Average OE rates by item characteristics. The number at the bottom of the bars shows the number of items represented (* represents significant difference at p ≤ .01 level). Shared response: if item was part of a series that shared the same response categories. Reverse of last page: if demographic section item was on the very last questionnaire page.

3 The nonskip item with the highest OE rate, 10.3%, asked respondents to indicate their household income level.


The 12 items, such as H2a, that immediately followed an embedded skip had an average OE rate of 13.9%, compared with an OE rate averaging 7.0% for the 18 items that immediately followed a single skip. In this example, 13.0% of respondents who were supposed to answer item H2a omitted H2a, whereas a relatively small 5.2% omitted H2. Part of the increase in OE rates for items following embedded skips was due to a reduction in the number of respondents who were supposed to answer these items, leading any omission errors to have a larger impact on the OE rate.

A set of items expected to result in higher OE rates, items skipped from the bottom of a page, had the reverse effect. Items following a page-bottom skip instruction had a lower average OE rate (5.5%) than the rate for the other items that only some respondents were instructed to answer (which averaged 9.7%). Items H10, H10a, and I1 (Figure 3) provide an example of this type of item series.

Figure 5. Average of OE rates by item location in questionnaire (among items that all respondents were instructed to answer).

Figure 6. Average OE rates by item characteristics affected by skip instructions (the number at the bottom of the bars shows the number of items represented).


Only 3.3% of respondents who were supposed to answer item H10a were missing on this item, and most had omitted item H10 as well.4 While the skip instruction at item H10 resulted in an unusually low number of skip errors of omission, all but one of the other items skipped from the bottom of a page also had OE rates below the average for other skipped items.

Respondent-Level Analysis

The role attributed to respondent characteristics in OE rates was calculated separately for items that were located within a skip instruction and those that were not. As part of the respondent-level analysis, we conducted linear regressions, regressing OE rates on demographic variables. The same independent variables were used to model skip errors of omission and nonskip errors of omission. The dependent variable in the model of nonskip errors of omission indicated the number of omissions made on the 90 nondemographic items that all respondents were supposed to answer. The dependent variable in the model of skip errors of omission was a count of OEs for the 16 nondemographic items at which all respondents could make a skip error (thus excluding several embedded skips), which was categorized into three levels: 0, 1, and 2–7.5

The linear regressions were weighted to account for the stratified sampling design of the survey. Regressions were carried out using WesVar 4.2, making use of replicate weights. To test whether OE is correlated with certain types of respondents, the following respondent characteristics were used as independent variables in the models: gender; age; income, including a category for missing; and the respondent's highest education level. Also of interest was the effect of the survey completion process on OE. Paradata that respondents self-reported on the questionnaire were thus used as independent variables in the models: respondents indicated whether or not they completed the survey in one sitting, and respondents entered the time, in minutes or hours, that it took them to complete the survey. Table 1 lists the weighted values of these variables among those reporting this information on the questionnaire.
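The replicate-weight approach can be outlined in a few lines: fit the weighted regression once with the full-sample weights, refit it with each set of replicate weights, and derive standard errors from the spread of the replicate estimates around the full-sample estimate. This is a generic sketch with made-up data and weights, not the actual HINTS replication scheme or WesVar's exact computation:

```python
import numpy as np

def wls_beta(X, y, w):
    """Weighted least-squares coefficients: solve (X'WX) b = X'Wy."""
    Xw = X * w[:, None]
    return np.linalg.solve(Xw.T @ X, Xw.T @ y)

def replicate_estimates(X, y, full_w, rep_ws, scale):
    """Full-sample point estimates plus standard errors from the spread
    of replicate-weight re-estimates. `scale` depends on the replication
    method (e.g., (R - 1) / R for a delete-one jackknife)."""
    beta = wls_beta(X, y, full_w)
    reps = np.array([wls_beta(X, y, rw) for rw in rep_ws])
    var = scale * ((reps - beta) ** 2).sum(axis=0)
    return beta, np.sqrt(var)

# Toy data: an OE count regressed on an intercept and one dummy predictor.
X = np.column_stack([np.ones(6), np.array([0, 0, 0, 1, 1, 1])])
y = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
w = np.ones(6)
# Hypothetical replicate weights (drop-one-case, reweight the rest).
rep_ws = np.array([[0, 1.2, 1.2, 1, 1, 1],
                   [1.2, 0, 1.2, 1, 1, 1],
                   [1, 1, 1, 0, 1.5, 1.5],
                   [1, 1, 1, 1.5, 0, 1.5]], dtype=float)
beta, se = replicate_estimates(X, y, w, rep_ws, scale=3 / 4)
```

With equal weights this reduces to ordinary least squares, so `beta` here is the intercept and dummy contrast of the two group means.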

Respondent Characteristics

Only 28% of respondents made one or more skip errors of omission.6 In terms of nonskip omissions, 50% of respondents left one or more items unanswered; respondents made an average of 2.5 nonskip omissions on the questionnaire.7 Table 2 presents respondent-level factors influencing the extent to which OE occurred. Whether a respondent was male or female had no effect on OE rates. In both models, the oldest respondents had higher rates of OE. Focusing on skip errors of omission, age was one of only two respondent characteristics seen to influence these OE rates. Higher respondent education levels were found to be associated with significant reductions in nonskip omissions. Unexpectedly, respondents with postsecondary education did not make significantly fewer skip errors of omission than those with less than 12 years of education. Although these respondents did make fewer skip errors of omission, this association was not significant. Examining the role of income, respondents with higher household income levels were seen to have lower rates of OE. This was significant in both models for those with income levels from $50,000 to $99,999. Not surprisingly, respondents who did not answer this item left other nonskip items unanswered at a higher rate than those who reported having low income levels, though this finding was not significant in the model.

4 Only 2% of those who selected the nonskip category at H10, "yes," did not answer item H10a.

5 As this dependent variable is actually ordinal, a multinomial regression was also carried out. This resulted in similar findings, with the exception that the coefficient for respondents in the $50,000–$99,999 income range was significant at the 0.06 level rather than the 0.01 level.

6 This is calculated for the 16 nondemographic items at which all respondents could make a skip error.

7 These figures are also calculated for the items included in the dependent variable (i.e., they do not include omissions in the demographic section).

Survey Completion Factors

Although respondents reported taking an average of 30 minutes to complete the questionnaire, some respondents spent over an hour. Of interest was whether these respondents were putting more effort into correctly filling out the questionnaire. HINTS also collected data revealing that a small proportion of respondents, 15%, reported taking a break before completing the questionnaire (see Table 1). Even among respondents indicating that the questionnaire took an hour or longer to complete, a majority completed the questionnaire without taking a break. Surprisingly, respondents taking over 40 minutes to complete the questionnaire left more items unanswered compared with those completing the questionnaire in less than 20 minutes. Controlling for how long respondents took on the questionnaire, those who completed the questionnaire in one sitting had fewer nonskip omissions (this finding approached significance, p = 0.084).

Table 1. Weighted respondent demographic composition and associated paradata, HINTS mail survey (2007; N = 3,582)

Characteristic                         %
Female                                 51.5
Age
  18 to 29                             21.9
  30 to 44                             28.2
  45 to 74                             41.9
  75 and older                         8.1
Household income level
  $0 to $19,999                        19.1
  $20,000 to $49,999                   27.3
  $50,000 to $99,999                   28.1
  $100,000 and above                   16.1
  Missing                              9.3
Education level
  Up to 12 years                       13.9
  12 years/high school graduate        24.6
  Postsecondary                        61.6
Survey completed in one sitting        85.0
Time spent on questionnaire
  0 to 19 minutes                      20.8
  20 to 29 minutes                     26.4
  30 to 39 minutes                     25.6
  40 to 59 minutes                     11.2
  60 minutes and greater               16.0

Conclusions

These findings suggest that certain question and respondent characteristics lead to higher rates of OE on a paper self-administered questionnaire. With respect to question characteristics, respondents had higher OE rates for open-ended response items, items located outside the perceived body of the questionnaire, and those requiring skips. Open-ended items on a self-administered questionnaire are generally avoided because of the labor involved with coding, as well as the possibility that respondents may not understand the proper context of the item. On the other hand, use of response categories can lead respondents to fit their answer into categories seen as typical or mainstream (Schwarz & Hippler, 1987; Schwarz, Hippler, Deutsch, & Strack, 1985). HINTS used open-ended items asking for elapsed time durations, partly to avoid this "typical" response set. However, the increase in OE rates for these open-ended items in this questionnaire makes clear that there is a trade-off between the potential improvements in response accuracy that they provide and the number of respondents that will answer them. The threefold increase in OE rates for these open-ended items illustrates the increase in response burden that these question types can pose.

Table 2. Linear regression estimates (SEs) of respondent characteristics/behavior on OE rates (N = 3,338)

Variable (reference category)     Skip error of omission    Nonskip omission
Intercept                         0.22 (0.056)***           3.02 (0.607)***
Gender (Male)
  Female                          -0.02 (0.026)             0.03 (0.221)
Age (18-29 years old)
  30 to 44 years old              0.04 (0.036)              0.27 (0.181)
  45 to 74 years old              0.14 (0.035)***           1.00 (0.179)***
  75 to 99 years old              0.31 (0.054)***           4.06 (0.737)***
Education (<12 years)
  12 years/HS grad                0.04 (0.055)              -0.57 (0.399)
  Postsecondary                   0.03 (0.047)              -1.28 (0.334)***
Income (<$20,000)
  Missing                         0.00 (0.059)              0.93 (0.636)
  $20,000 to $49,999              -0.03 (0.040)             -0.97 (0.448)*
  $50,000 to $99,999              -0.12 (0.041)**           -1.23 (0.436)**
  $100,000 and above              -0.08 (0.052)             -1.36 (0.413)**
Survey completion (break)
  Uninterrupted survey            0.01 (0.033)              -0.41 (0.235)
Time on survey (<20 minutes)
  20 to 39 minutes                -0.03 (0.029)             -0.02 (0.226)
  40 minutes and greater          -0.01 (0.041)             0.65 (0.266)*
R2                                0.03                      0.13

*p ≤ .05, **p ≤ .01, ***p ≤ .001.

The location of the item was important for one of the two hypothesized effects. Items on the inside front cover and the back of the questionnaire had significantly higher rates of OE. This has clear design applications when deciding where to place items, even if those items are intended for special use, as was the case for the household size item on HINTS.8 Placing an item within the body of the questionnaire is more likely to elicit a response than placing it on a facing page of a questionnaire booklet. It was not found that respondents were more likely to commit omission errors toward the end of the questionnaire. This runs counter to conventional thinking that respondents may get fatigued as they progress through the questionnaire. The nonsignificant result may be due to a number of factors, including a keen interest in the topic by the respondents, the general ease of use of the questionnaire, and/or the filtering out of respondents who may be most subject to low motivation. For example, the overall response rate was 31%, which may mean that those most subject to low motivation did not answer the survey at all.

As one might expect, items associated with skip instructions had higher OE rates. This was especially true for embedded skips. A majority of questionnaire items with OE rates over 10%9 were located within an embedded skip pattern. A single skip instruction interrupts the respondent's routine of moving sequentially through the questionnaire. Having multiple items that redirect some respondents makes it even more likely that a respondent will follow a skip path that does not apply to them. Aside from the navigational burden of answering these items, the content involved, which often seeks detailed information thought to be available to those who answer "yes" to a filter question, may be more challenging for respondents to answer. OE rates for these items may therefore be driven by both skip errors and inability to respond to the item content. This certainly suggests that one should avoid embedded skips whenever possible. If it is necessary to include these types of skips, one possible solution is to use an exclusionary response category rather than a second skip pattern. For example, in Figure 2, it would be possible to remove the second skip around H2a and add a category ("do not smoke now") to that item.

Skip patterns that redirect respondents from the bottom of a page to elsewhere in the questionnaire have been found in prior research to lead to higher OE (Dillman et al., 1999; Redline & Dillman, 2002; Redline et al., 2003). This was not found to be the case for the HINTS questionnaire. While in some cases the skips redirected respondents only to the top of the same page, a potentially less burdensome task, even skip instructions directing respondents to a new questionnaire page had OE rates below the average for items some respondents were instructed to skip. One potential explanation for this finding is that skip instructions were located next to the last response category in three of the four page-bottom skips in this

8This item was used to calculate the within-household response rate. Ironically, it was placed at the beginning of the booklet to increase the chance that it would be filled out.

9This includes nonskip omission rates.


questionnaire. This is in keeping with findings that placing skip instructions next to the last response category reduces burden by providing instructions at the point where respondents need to use them. For items with only two response categories, respondents may not have difficulty remembering the skip instruction even if it is placed on the first response category. An advantage of placing the skip instruction with the last response category in that case is that it is less likely to be noticed by respondents who do not need to read it. Respondents selecting "yes" to item H10 (see Figure 3) can continue to the top of the page and answer item H10a without having to visually scan over a skip instruction located below their response category (and one that comes after they have already finished their response task).

The research into the role of respondent characteristics revealed an increased likelihood of skip errors of omission among older respondents, as found in prior research. As a result, reducing the use of skip instructions may be beneficial when carrying out surveys targeted to older populations. This research also suggests that items that do not follow skip instructions may have elevated OE rates among this group, suggesting a need to reduce overall item complexity where possible. While past research found mixed effects for gender, no difference was noted between males and females on OE rates in this questionnaire. Although higher education did reduce nonskip omission rates, it was not found to be significant for skip errors of omission. We do not have a specific hypothesis for why this difference occurs. It may be indicative of the different cognitive and/or motivational skills needed to navigate skip instructions, as opposed to the willingness (and ability) to answer nonskip items. Finally, it was notable that the time respondents devoted to the questionnaire was not associated with improvements in correctly following skip instructions and completely answering the questionnaire. Holding constant the effect of survey duration (among other factors), there was some evidence that respondents who were able to complete the survey in one session were more thorough in answering all of the items they were supposed to answer.

Applying these findings to other surveys should be done with this study's nonexperimental design in mind. Items on the HINTS questionnaire had multiple characteristics that influenced OE, so it was hard to separate the role of the questionnaire from that of the items it contained. Item placement, for example, may play a greater role for surveys containing a greater number of items, or surveys containing items that are more challenging for respondents to answer. The fit of the respondent-level models was relatively low, suggesting that there are other factors, not controlled for, that affect omissions. It is possible that including other characteristics not in the models would affect the conclusions that were drawn. Another caveat is the role that respondents' answers play in the skip errors they can make. For example, only respondents who answered that they had looked for information on cancer were capable of incorrectly omitting follow-up items asking about their sources of information. This accordingly limited the analysis of how respondents handled items with embedded skip instructions, which were highly confounded with the characteristics of the respondents who answered them. Furthermore, this analysis could not distinguish between omissions caused by the complexity of the item or navigation and intentional omissions caused by respondents not knowing, or not wanting to provide, the information. Our analysis assumes that the patterns found across the entire questionnaire are indicative of complexity issues, but they are confounded by some respondents intentionally leaving items blank. Further research


into the differential placement of skip instructions within response option sets would additionally provide a greater understanding of the causes of respondent skip errors. A better understanding of OEs on self-administered questionnaires will help identify the item characteristics best able to maximize the data collected from respondents.

References

Craig, S., & McCann, J. (1978). Item nonresponse in mail surveys: Extent and correlates. Journal of Marketing Research, 15, 285–289.

De Leeuw, E. D., Hox, J., & Huisman, M. (2003). Prevention and treatment of item nonresponse. Journal of Official Statistics, 19, 153–176.

Dillman, D. A., Redline, C. D., & Carley-Baxter, L. R. (1999). Influence of type of question on skip pattern compliance in self-administered questionnaires. Retrieved December 9, 2009, from http://www.sesrc.wsu.edu/dillman/papers.htm

Eysenck, M. W. (2001). Principles of cognitive psychology (2nd ed.). Hove, UK: Psychology Press.

Ferber, R. (1966). Item nonresponse in a consumer survey. Public Opinion Quarterly, 30, 399–415.

Groves, R. M. (2006). Nonresponse rates and nonresponse bias in household surveys. Public Opinion Quarterly, 70, 646–675.

Jenkins, C., & Dillman, D. A. (1997). Towards a theory of self-administered questionnaire design. In L. Lyberg, P. Biemer, M. Collins, L. Decker, E. De Leeuw, C. Dippo, N. Schwarz, & D. Trewin (Eds.), Survey measurement and process quality (pp. 165–196). New York: Wiley-Interscience.

Messmer, D. J., & Seymour, D. T. (1982). The effects of branching on item nonresponse. Public Opinion Quarterly, 46, 270–277.

Redline, C., & Dillman, D. A. (2002). The influence of alternative visual designs on respondents' performance with branching instructions in self-administered questionnaires. In R. Groves, D. Dillman, E. Eltinge, & R. Little (Eds.), Survey nonresponse (pp. 179–195). New York: John Wiley & Sons.

Redline, C. D., Dillman, D. A., Carley-Baxter, L., & Creecy, R. (2003). Factors that influence reading and comprehension in self-administered questionnaires. Retrieved November 17, 2009, from http://www.sesrc.wsu.edu/dillman/papers.htm

Schwarz, N., & Hippler, H.-J. (1987). What response scales may tell your respondents: Information functions of response alternatives. In H.-J. Hippler, N. Schwarz, & S. Sudman (Eds.), Social information processing and survey methodology (pp. 163–178). New York: Springer-Verlag.

Schwarz, N., Hippler, H.-J., Deutsch, B., & Strack, F. (1985). Response categories: Effects on behavioral reports and comparative judgments. Public Opinion Quarterly, 49, 388–395.

Westat. (2002). WesVar 4.2 user's guide. Rockville, MD: Westat.

Wolter, K. (1985). Introduction to variance estimation (2nd ed.). New York: Springer-Verlag.

Appendix

Generally, OE rates were computed following the skip patterns in the questionnaire. It was straightforward to calculate OEs for those items that every respondent was supposed to answer. However, for items containing a skip, hereafter referred to as a "skip item," rules were developed for situations where it was not clear what the


proper path was for respondents to follow. In this Appendix, we provide the rules we used to decide whether or not to count something as an OE.

I. Selection of multiple responses.

• On a skip item: If a respondent selected both the skip category and the nonskip category of a skip item, items that were left unanswered within that skip instruction were not counted as omission errors.

• On an item following a skip: This was not counted as an OE.

II. Leaving a skip item unanswered.

• This was counted as an OE if the item was to be filled out by all respondents.

III. Situations where the item following an omitted skip item was unanswered.

• This was not counted as an OE if all of the items up to the "return" item were unanswered. The "return" item is the item that the skip category instructed the respondent to go to. This rule assumed that respondents should have answered the skip category of the skip item, because they followed the instructions for that category. An illustration: assume the respondent omitted D8 and D8a but had data for D9. In this case, D8 was counted as an OE but D8a was not. The assumption is that the respondent should have answered "no" to D8, since they followed the skip instruction for that category. See Figure A1 for an illustration of this skip sequence.

Figure A1. Illustration of skip sequence.
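Rules II and III amount to a small decision procedure, which can be sketched in code. The sketch below is our own illustrative implementation, not the scoring program the authors used; the item names follow Figure A1, and the `answers` dictionary (with None meaning unanswered) is an assumed representation of a respondent's record.

```python
def rule_iii_oes(answers, skip_item, interior_items):
    """Rules II-III sketch: a blank skip item is an OE (rule II);
    items inside its skip instruction are OEs only if the respondent
    did NOT leave the whole skipped block blank, i.e., only if they
    entered the block (rule III).
    `answers` maps item name -> response, with None = unanswered."""
    oes = []
    if answers.get(skip_item) is None:
        oes.append(skip_item)  # rule II: the skip item itself is an OE
        if all(answers.get(i) is None for i in interior_items):
            # rule III: infer that the respondent meant the skip
            # category, so the interior items are not OEs.
            return oes
    oes += [i for i in interior_items if answers.get(i) is None]
    return oes

# Figure A1 example: D8 and D8a blank, D9 answered.
print(rule_iii_oes({"D8": None, "D8a": None, "D9": "yes"},
                   "D8", ["D8a"]))  # → ['D8']
```

As in the text, D8 is counted as an OE but D8a is not, because the blank block up to the return item (D9) is read as the respondent having followed the skip instruction.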


IV. Embedded skips.

This is a skip item that is located within a skip instruction from a previous item. For example, in Figure A2, skip item H2 is preceded by skip item H1; H2 is an embedded skip.

• If a respondent omitted an embedded skip item (e.g., H2) and all subsequent items up to the skip item return (e.g., H6), then an OE was not counted for the item following the embedded skip (e.g., H2a).

Similarly, if a respondent omitted an embedded skip item (e.g., H2) and all subsequent items up to the embedded skip item return (e.g., H5), then an OE was not counted for the item following the embedded skip (e.g., H2a). See Figure A2.

Figure A2. Section illustrating embedded skip sequence with different return items.
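The rule IV logic can likewise be sketched as a predicate on a respondent's record. This is an illustrative reading of the rule, not the authors' actual program; the item names (H2, H2a, H5) follow Figure A2, and the dictionary encoding (None = unanswered) is our assumption.

```python
def follow_up_is_oe(answers, embedded_item, follow_up, block_to_return):
    """Rule IV sketch: decide whether the follow-up to an embedded
    skip item (e.g., H2a after H2) counts as an omission error (OE).
    `block_to_return` lists every item from the follow-up through the
    item just before the return point (e.g., H2a through H5)."""
    if answers.get(embedded_item) is not None:
        # Respondent answered the embedded skip item, so the
        # follow-up was in play: blank means OE.
        return answers.get(follow_up) is None
    # Embedded item blank: the follow-up is not an OE when the whole
    # block up to the return item is also blank, since the respondent
    # presumably skipped the block.
    whole_block_blank = all(answers.get(i) is None for i in block_to_return)
    return answers.get(follow_up) is None and not whole_block_blank

# H2 and everything through H5 left blank: H2a is not counted as an OE.
record = {"H2": None, "H2a": None, "H3": None, "H4": None, "H5": None}
print(follow_up_is_oe(record, "H2", "H2a", ["H2a", "H3", "H4", "H5"]))  # → False
```

If instead the respondent had answered anything inside the block while leaving H2a blank, the same call would return True, mirroring the rule's requirement that the entire block be blank before the inferred-skip exemption applies.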
