
Factors affecting response times:

A comparative analysis of smartphone and PC web surveys

Running Header: Smartphone versus PC web response times

Christopher Antoun, University of Maryland

Alexandru Cernat, University of Manchester


Abstract

This paper compares the factors affecting response times (RTs) to survey questions when they

are answered using two different devices: personal computers (PCs) and smartphones. Several

studies have reported longer RTs when respondents use smartphones than PCs. However, few

have analyzed the specific reasons for the time difference. We analyze timing data

from 822 respondents who completed the same web survey twice, once using a smartphone and

once using a PC, as part of a randomized crossover experiment in the LISS panel (Longitudinal

Internet Studies for the Social Sciences). We include both page-level and respondent-level

factors that may contribute to the time difference in multilevel models. We find that respondent-

level characteristics account for more of the time difference than page-level characteristics.

Specifically, slower RTs by those with low levels of familiarity with smartphones and

multitasking on smartphones contribute to the time difference.

Keywords: completion times; mobile web surveys; smartphone surveys

Article type: Article


Factors affecting response times: A comparative analysis of smartphone and PC web

surveys

Introduction

As respondents increasingly use smartphones rather than personal computers (PCs) to

complete web surveys, researchers are eager to study the design and implementation of mobile

web surveys (for reviews, see Couper, Antoun, & Mavletova, 2017; Link et al., 2014). But

because this shift to mobile technology is relatively recent, the survey field is missing a full

understanding of its implications – including why respondents tend to take longer to complete

surveys when using smartphones than PCs. This paper intends to address this question by

investigating the factors that affect how long it takes respondents to answer survey questions in

web surveys when using smartphones and when using PCs.

The article begins with a review of the literature. It follows with our data and analysis plan

and concludes with our results and their implications.

Previous Literature

A consistent finding is that web surveys take longer to complete on mobile devices,

particularly smartphones, than on PCs. Most evidence comes from secondary analyses of timing

data from web surveys where respondents choose to use different devices, including PCs, tablets,

and smartphones. For example, in a secondary data analysis of 21 web surveys taken on different

devices, Gummer and Roßmann (2015) report that respondents using smartphones took

significantly longer than those using PCs, even after controlling for demographic differences

between the groups. Couper and Peterson (2017) present ratios of web survey completion times

on mobile devices and PCs across 26 studies that included a mix of experiments with random


assignment to device and non-experiments. For 24 of them the ratios were greater than one (with a

median ratio of 1.4), indicating longer completion times among respondents using mobile

devices. In one of the few studies that looked at question-level RTs by device in a web survey,

Andreadis (2015) reports that respondents using smartphones spent about 1.5 seconds longer per

question (on average) than those using PCs. By contrast, recent investigations suggest that

respondents using tablets do not necessarily take more time to complete web surveys than those

using PCs (see e.g., Gummer & Roßmann, 2015). Thus, we focus our attention on the

smartphone-PC time difference.

Why do web surveys take longer on smartphones? The current state of knowledge about the

topic is still evolving. Gummer and Roßmann (2015) outline three potential explanations. One is

that mobile respondents take more time because of the extra scrolling required to view and

answer questions on a small device. Another explanation is that survey pages load more slowly

on smartphones because they rely on a cellular Internet connection (e.g., 3G, 4G LTE) that is

slower and sometimes less reliable than a typical Wi-Fi connection. A third explanation is that

respondents respond to questions more slowly because of increased multitasking or distractions

when using smartphones. This includes doing two activities simultaneously (e.g., completing the

survey while riding a bus) and taking interstitial breaks to switch from one task to another (see

Sendelbah, Vehovar, Slavec, & Petrovčič, 2016).

Unfortunately, Gummer and Roßmann were not able to test their proposed mechanisms.

Couper and Peterson (2017) expand upon these explanations and describe two others. The first is

that respondents may take longer to record their answers using touch input. In particular, typing

answers to open-ended questions may take longer on a smaller virtual keypad than a larger

physical one. The other explanation is that some of the respondents using smartphones have low


levels of comfort and familiarity with their devices. These novice users might take more time

than experienced users to navigate through the questionnaire.

Couper and Peterson (2017) tested some of these mechanisms in a secondary data analysis

of page-level time data from three web surveys of college students. They found that survey pages

do indeed load more slowly on mobile devices. They estimated the time to load a new survey

page after clicking the “Next” button by taking the difference between the time spent on a page

according to server-side timestamps (for total time) and client-side timestamps (for the time when a

page is loaded in the respondent's browser). They report that average transmission times across the

three surveys were between 0.31 and 0.57 seconds longer per page on smartphones than PCs.

This is consistent with Mavletova and Couper (2016) who also investigated transmission times in

a survey in Russia, finding that it took about two seconds longer to load each survey page (on

average) on mobile devices than PCs. Across the two studies, this extra loading time accounted

for between 13% and 28% of the overall difference in timings between devices. It seems that,

other things being equal, the faster the cellular Internet connection, the smaller the time

difference will be.
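The page-load estimate described above can be sketched as a simple subtraction; the function and the timestamp values below are illustrative assumptions, not the actual computation from Couper and Peterson (2017).

```python
# Sketch of the page-load estimate: the time to transmit and render a page
# is approximated as the server-side elapsed time (which includes
# transmission) minus the client-side on-page time. All names and sample
# values here are illustrative assumptions.

def transmission_time(server_elapsed_s: float, client_elapsed_s: float) -> float:
    """Estimated page-load time in seconds for one survey page."""
    return server_elapsed_s - client_elapsed_s

# Hypothetical page: 12.9 s between server timestamps, 12.4 s measured
# in the respondent's browser -> roughly half a second spent loading.
print(transmission_time(12.9, 12.4))
```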

Couper and Peterson (2017) also found support for the idea that screen size affects RTs

because it leads to increased scrolling. They report that smartphone users scrolled on a higher

percentage of screens (49%) than PC users (4%), and were especially likely to scroll on grid

questions. Mobile users took about 1.5 times longer than PC users for screens that required

scrolling, which in their analysis explained the majority of the difference in timings.

Another possible explanation for the time difference by device is that mobile respondents

take longer to read words because they are displayed on the screen in small fonts. However, no

studies have found evidence for this. In a secondary data analysis of page-level timings from a


web survey conducted in Greece, Andreadis (2015) tested for interactions between device and

question length and found no significant results (reported at the 0.01 significance level).

Likewise, Couper and Peterson (2017) report that the difference in timings was comparable for

questions with few words and questions with many words.

Another possible explanation for the difference in timings by device, that as far as we know

has not yet been tested, is that some mobile respondents take more time because of physical

limitations that make it difficult to use smartphones. For example, those with poor eyesight may

have greater difficulty reading questions on a small screen. Those who have problems using

their hands (i.e. low dexterity) may find it harder to select a response and type on smartphones.

It should be noted that these prior investigations made an implicit assumption that group

differences between mobile users and PC users in completion times reflected device differences

rather than unaccounted-for differences in the composition of the groups. By contrast, the

analysis presented here makes within-respondent comparisons using data from an experiment

where participants were invited to complete the same survey on both a smartphone and a PC.

Further, the study was conducted in a probability-based web panel and the sample members who

did not own smartphones were provided with devices. Thus, the resulting study sample is quite

diverse in terms of respondents’ demographics and level of smartphone experience.

Hypotheses and Research Question

Here, we test five of the hypotheses described above: the smartphone-PC time difference is

explained by (i) increased scrolling on smartphones, (ii) slower typing speed on smartphones,

(iii) increased completion of other tasks (multitasking, distractions) on smartphones, (iv) slower

completion on smartphones by those with low familiarity with their devices, and (v) slower

completion on smartphones by those with physical limitations.


Since both page-level and respondent-level factors are thought to affect survey RTs, some of

our hypotheses are related to page-level characteristics (whether scrolling and typing is required

to answer the questions), while others are related to respondent-level characteristics (the

respondent’s level of experience and physical capabilities). It is unclear which level contributes

more to the overall time difference. Here, we investigate this issue using multilevel models that

partition total variance into variance between pages and variance between respondents. Our

specific research question is: Will the time difference be more influenced by page-level or

respondent-level characteristics? If it is the former, then it suggests that the time gap can be

reduced substantially through better questionnaire design. On the other hand, if respondent

characteristics explain more variation, then this suggests there is little that researchers can do

(apart from recruiting different respondents) to reduce the time difference.

In sum, initial work shows that web surveys take longer on smartphones than PCs. Several

explanations for this have been offered. It is important to continue to test these explanations as

well as to test new ones using a carefully designed experiment and diverse study sample.

Methods

Data

The data are from an experiment conducted in the LISS panel (Longitudinal Internet Studies

for the Social Sciences), a probability-based online panel consisting of about 7,000 individuals

from the Netherlands who were originally recruited from the national population register. They

are invited to complete surveys for 15-30 minutes each month in exchange for payment (see

Scherpenzeel, 2011). The experiment used a crossover design: panelists were invited to complete

the same survey twice, once on a smartphone and once on a PC, with the order randomized and


with a month-long break in between. It was conducted in 2013 – from October 7 to October 29

for wave 1 and from December 2 to December 31 for wave 2. This design enables us to do two

types of analyses. The first is to investigate the determinants of RTs in each individual survey

and compare the results. The second is to investigate the determinants of the within-respondent

change in RTs from one survey to the other.

Panelists who did not have their own smartphone were provided one for the wave in which

they were invited to participate in the mobile web survey. While providing devices is not common

in practice, it is sometimes used as a way to include mobile users in smartphone surveys

conducted in online panels (e.g., Fernee & Sonck, 2013).

The mobile version of the questionnaire was adapted for small screens by eliminating the

sponsor’s logo from each page and by using larger fonts and wide, rectangular buttons for

response options.

The questionnaire contained 46 questions on topics ranging from health to politics. They

were displayed on 32 survey pages (consisting of one or more questions). Our analysis focuses

on the 22 pages that contained single-choice (radio button) and text entry questions. The two

pages near the end of the questionnaire that contained special formats (a slider question and

spinner question) were excluded because respondents using smartphones took an especially long

time to answer them (reported in Antoun, Couper, & Conrad, 2017). Eight pages at the end of the

questionnaire that contained non-traditional questions about respondents’ experience taking the

survey and current setting (e.g., location, whether they were distracted or not) were also

excluded. Most pages (19) contained only one question. The three pages that had more than one

question displayed only one type of question (i.e. all text entry questions or all single-choice

questions). In total, 10 pages contained one or more single-choice question and 12 pages


contained one or more text entry question. See Figures 1 and 2 for examples of both types of

questions in mobile and PC web. We viewed each page on a standard (4-inch) smartphone to

determine whether its question text was fully visible from the outset or not: 4 pages required

scrolling in mobile web and 18 did not.

[Figure 1 about here]

[Figure 2 about here]

Panelists were first asked in a screener questionnaire whether they were willing to

participate in the experiment. Of the 5,486 panelists who responded to the screener, 2,263

indicated that they were willing to participate in the experiment. A sample of willing panelists

was drawn. Of the 1,390 panelists who were selected and invited to participate, 895 completed

both surveys using the assigned device for an overall response rate of 64% (RR1, AAPOR 2015).

Cases with missing values for at least one of the variables used in our multivariate analysis were

removed, resulting in a final sample size of 822. User agent strings were collected (see e.g.,

Callegaro, 2010). When more than one device was used for a single survey, we used information

from the last one under the assumption that respondents switched devices early in the

questionnaire. Although the final sample would not be considered representative of the full LISS

panel, it was still quite diverse in terms of age: 22% 16-30 years, 27% 31-45 years, 30% 46-60

years, and 21% 61-75 years; education: 60% no college and 40% college; and gender: 51% male

and 49% female.


The dependent variable for our analysis is page-level timings. They were captured using

server-side timestamps and measured in seconds. In order to ensure an approximately normal

distribution, we excluded the longest 1% of RTs and log-transformed the dependent variable. The

complete case dataset used in the multilevel models has 18,011 rows.
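A minimal sketch of this preprocessing step, assuming list-based timings (the helper and sample data are hypothetical, not the code used for the analysis):

```python
import math

# Sketch of the preprocessing described above: drop the longest 1% of
# page-level RTs, then log the remainder to approximate normality.
# The helper and sample data are illustrative assumptions.

def trim_and_log(rts, trim_share=0.01):
    """Drop the longest `trim_share` of RTs (seconds); return log-RTs."""
    rts = sorted(rts)
    n_drop = int(round(trim_share * len(rts)))
    keep = len(rts) - n_drop
    return [math.log(rt) for rt in rts[:keep]]

page_rts = [5.2, 8.1, 12.4, 24.8, 34.8] * 20  # 100 hypothetical timings
print(len(trim_and_log(page_rts)))  # 99: the single longest timing dropped
```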

Several characteristics were used as predictors (Table 1). Two page-level characteristics

were included: (1) question type and (2) whether scrolling was required on a smartphone.

Six respondent-level characteristics were included, all of which were self-reported. Two

were indicators of whether respondents’ attention was fragmented when they completed the

survey, specifically: (1) the number of other activities they completed during the survey and (2)

whether they were distracted. These indicators are time-varying because they could take on

different values for each survey, for example, if someone multitasked while completing the

mobile web survey but not the PC web survey. Two indicators about respondents’ familiarity

with smartphones were included: (1) a three-level indicator of experience using smartphones and

(2) their frequency of smartphone use. Finally, two indicators of physical limitations were

included: (1) visual acuity and (2) whether someone had problems working with their hands.

These last four indicators were fixed across the two surveys.

[Table 1 about here]

Analytic approach

In order to answer our research question we estimate multilevel models (see e.g., Snijders &

Bosker, 2011; Yan & Tourangeau, 2008). RT, the dependent variable, is measured for each

combination of survey page and type of survey. In order to take into account the hierarchical


structure of the data, we estimate random effects at the page and respondent level. The advantage

of this approach, in addition to estimating correct regression coefficients, is that we can

investigate the proportion of variation in RTs that is due to page characteristics and respondent

characteristics. Models are estimated using R 3.5 with the package lme4.
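The variance partition such an empty model yields can be illustrated with a small helper; the lme4 formula in the comment is our rendering of the model described here, and the variance values plugged in are hypothetical:

```python
# Sketch of the variance decomposition discussed below. The empty multilevel
# model has crossed random intercepts for pages and respondents; in lme4
# notation, roughly: lmer(log_rt ~ 1 + (1 | page) + (1 | respondent)).
# Each level's share of total variance is its estimated variance component
# divided by the sum of all components. The input values are hypothetical.

def variance_shares(var_page, var_respondent, var_residual):
    """Proportion of total RT variance attributable to each level."""
    total = var_page + var_respondent + var_residual
    return {
        "page": var_page / total,
        "respondent": var_respondent / total,
        "residual": var_residual / total,
    }

print(variance_shares(var_page=0.30, var_respondent=0.10, var_residual=0.60))
```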

We estimated separate models for RTs when respondents used a smartphone and when they used a

PC. Since RTs come from a crossover experiment, the smartphone times include a

combination of values recorded in the first wave and second wave of the experiment. Similarly,

the PC times include a combination of values recorded in the first and second experimental

wave. The fact that the order of the surveys was randomized ensures that this does not have a

large impact on our estimates. Still, we included a control variable for the wave in which the

survey was completed in our models.

To test if there are significant differences between PC and smartphone RTs we will

investigate if the confidence intervals of the corresponding estimates from two models overlap or

not. Due to our design we can also investigate the differences at the respondent level. For this,

we estimate a model in which the dependent variable is the within-respondent change in RTs

from one survey to the other for each survey page. This will help bring more insight to the

question regarding how RTs change when the same respondent completes the survey with a

different device, an approach that can’t be used without a within-person experimental design.
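Both comparison strategies can be sketched in a few lines; the function names and the estimates plugged in are illustrative assumptions, not model output from this study:

```python
# Sketch of the two comparisons described above. First, the dependent
# variable for the within-respondent model: the change in RT between the
# smartphone and PC surveys for one respondent on one page. Second, a
# rough check of whether two 95% confidence intervals overlap. All
# names and sample values are illustrative assumptions.

def rt_change(smartphone_rt, pc_rt):
    """Within-respondent change in RT (seconds) for one survey page."""
    return smartphone_rt - pc_rt

def ci_overlap(est1, se1, est2, se2, z=1.96):
    """True if the two 95% confidence intervals overlap."""
    lo1, hi1 = est1 - z * se1, est1 + z * se1
    lo2, hi2 = est2 - z * se2, est2 + z * se2
    return lo1 <= hi2 and lo2 <= hi1

print(rt_change(34.8, 24.8) > 0)           # True: slower on the smartphone
print(ci_overlap(0.40, 0.05, 0.10, 0.04))  # False: estimates clearly differ
```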

Results

Descriptive Results

Respondents took an average of 34.84 seconds per page when using smartphones and an

average of 24.83 seconds per page when using PCs. Thus, respondents took an average of 1.4


times (or 40%) longer to complete each survey page when using smartphones than PCs. The size

of the time difference is comparable to several other studies (e.g., Mavletova & Couper, 2015;

McGeeney & Marlar, 2013) and happens to be the same as the median difference from Couper

and Peterson’s (2017) summary of 26 studies.
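As a quick arithmetic check of the reported ratio, using only the two means quoted above:

```python
# Quick check of the descriptive result above: the ratio of mean page-level
# RTs (smartphone over PC), using the means reported in the text.
smartphone_mean_s = 34.84
pc_mean_s = 24.83

ratio = smartphone_mean_s / pc_mean_s
print(round(ratio, 2))  # 1.4, i.e. about 40% longer per page on smartphones
```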

Multivariate Models

Table 2 shows the results of the multilevel models predicting page-level RTs. For each of

the three dependent variables – RTs in the smartphone survey, RTs in the PC survey, and

differences in RTs between the two surveys – we run two models: an empty one, which allows us

to investigate the variance composition, and a full one, which allows us to see the contribution of

different page- and respondent-level factors.

[Table 2 about here]

We see that the models include 18,011 timings, 22 pages, and 822 respondents. The

variance decomposition for the empty models is presented in Figure 3. This indicates the

proportion of variation in our dependent variable, RTs, that can be accounted for by pages and

by respondents. The figure highlights that for the first two dependent variables most of the

variance is residual (i.e., unexplained), followed by page characteristics and then respondent

characteristics. We observe that respondent characteristics are more important in the model for

the smartphone survey than the model for the PC survey. This is true even though we have the

same respondents in both groups.


Our research question focused on the variance decomposition for the within-respondent time

differences. The third bar of Figure 3 shows that most of the variance in the time difference is

unexplained, around 10% of the variation is explained by respondent characteristics, and less

than 2% is explained by page characteristics. Thus, we conclude that smartphone-PC time

difference was more influenced by respondent characteristics than page characteristics.

[Figure 3 around here]

Next we investigate the coefficients in the full models to test our hypotheses. In all of the

models, we included demographic variables (gender, age, education) as well as the

experimental wave in which the survey was completed. Not surprisingly, we found a significant

wave effect for RTs in both surveys, indicating that respondents were faster when answering

questions for the second time. The effects of age appear to be approximately linear, with

older respondents taking more time in both surveys. Surprisingly, the time difference between

surveys was smaller for middle-aged and older adults (31 years and older) than for younger adults

(16-30 years). Education was not associated with RTs in the PC web survey. However, in the

mobile web survey those with a college degree were faster than those without a college degree.

We expected that the extra step of scrolling adds time and helps explain the time difference.

Scrolling did significantly increase mobile web RTs, consistent with other research. However,

scrolling did not significantly predict the within-respondent differences in RTs. In other words,

the smartphone-PC time difference was not significantly larger for survey pages that required

respondents using smartphones to scroll.


We also expected that typing would be slower on a smartphone than PC and contribute to

the time difference. Having to type a response did take significantly longer than selecting one

from a list in both surveys, but text entry did not significantly predict the smartphone-PC time

difference. This is because the increase in RTs due to typing was approximately the same in both

surveys.

We also expected the completion of other tasks (multitasking, distractions) would contribute

to the time difference. As expected, respondents who divided their attention between other

activities did respond more slowly. Those who reported multitasking took more time, regardless of

their device. Similarly, those who reported being distracted were slower, regardless of their

device. It is possible that multitasking or distractions had a larger impact when carried out using

one device or the other. But the activities had approximately the same effect on response timings

in both surveys. Nonetheless, because multitasking was more prevalent in mobile web than PC

web (53% versus 44%), it was still a significant predictor of the time difference. Thus, we

conclude that fragmented attention among mobile respondents does indeed contribute to the time

difference.

We predicted that the time difference would be larger for respondents who had low

familiarity with smartphones, and this was strongly supported. Focusing on descriptive statistics

(not shown), “low familiarity” users – that is, those who were using devices provided to them –

took 1.7 times longer when using smartphones, those classified as “medium” familiarity – that is,

those who use smartphones but never for surveys – took only 1.2 times longer, and those

classified as “high” familiarity – that is those who sometimes use their smartphones to complete

surveys – took approximately the same amount of time when using smartphones. Similarly, those

who report “never” using smartphones took 1.8 times longer when using such devices, whereas


those who report using smartphones “every day” took only 1.2 times longer. In the multivariate

models, as expected, having “low” or “medium” familiarity or being an infrequent smartphone

user significantly increased mobile web RTs and had no significant effect on PC web RTs. Most

importantly, both familiarity variables were significant predictors of the time difference. Taken

together this suggests that the more experience respondents have using smartphones, the smaller

the time difference will be.

Finally, we expected that the time difference would be larger for respondents who had

certain physical limitations. This was not supported. Physical characteristics were not significant

predictors of RTs in either of the individual surveys or of the time difference. This is perhaps

because the text and buttons in the optimized mobile survey were sufficiently large for those

with low levels of visual acuity or dexterity to read and select without slowing down.

Discussion

We compared the effects of several factors on RTs when respondents use smartphones and

when they use PCs. We have three main findings. First, those with fragmented attention are

slower on either type of device and respondents are more likely to divide their attention when

using smartphones than PCs. Thus, multitasking is an important factor in explaining the time

difference not because it was more impactful on one type of device than the other but because

it is more prevalent when using smartphones than PCs.

Second, familiarity and experience with smartphones had large effects on RTs in mobile

web. Novice users, especially those who were provided smartphones for the experiment (and

thus using an unfamiliar device), were slower when using smartphones than PCs. However,

experienced users took about the same amount of time to complete the survey on either device.


At first blush, this seems to suggest that the time difference should disappear in surveys where

respondents choose which device to use, as one might expect that it is only the expert users who

choose to use smartphones. However, such surveys still report time differences (e.g., Andreadis,

2015). This is likely due to the fact that respondents may choose their device not based solely on

comfort and familiarity but rather based on which device is available to them. For example, they

might choose to use a smartphone because it happens to be nearby, they are on-the-go, or it is

their only device. Further, increased multitasking on smartphones contributes to the time

difference irrespective of respondents’ level of comfort with their devices.

Our third result was that respondent characteristics, not page characteristics, explain most of

the time difference. In other words, the time gap had more to do with the composition of the

sample than with the design of individual pages, at least in our particular survey. It is possible

that if there had been more variation in question types (e.g., if grids were included) and page

design (e.g., if more pages had multiple questions) then there would have been more variation at

the page level. Nonetheless, for our study this implies that if a researcher wanted to eliminate the

time gap then changes to the questionnaire design (e.g. eliminating scrolling or text entry

questions) would have had less impact than changes to the sample (e.g., recruiting more

experienced smartphone users). This finding presents a challenge for researchers given that

changes to the questionnaire design are often in their control whereas changes to the sample are

not if trying to recruit a representative sample.

We found no evidence that slower typing on touchscreens or respondents’ level of vision or

level of dexterity account for the time difference. While scrolling did add time in the mobile web

survey, we found that it did not have a significant effect on the smartphone-PC time difference.

The questions in the survey required little scrolling, and those that did required only vertical


scrolling which we assume to be quite natural on smartphones (unlike horizontal scrolling). It is

likely that if the survey had longer questions that required more scrolling (especially horizontal

scrolling), then scrolling would have contributed to the time difference.

There are limitations with this analysis. We could not construct a measure of transmission

time between pages because we did not have access to client-side timestamps. We did not have a

direct, passively measured indicator of whether respondents actually scrolled (see Couper &

Peterson, 2017). Instead, we viewed each page on a standard (4-inch) smartphone to determine

whether the full question text was initially visible or not. It is possible that some users of large

phones did not actually have to scroll on every page that required scrolling on a 4-inch screen.

Another limitation is that several measures used in our analysis were self-reported and thus

subject to measurement error. Further, the multitasking and distraction measures were reported at

the end of the survey even though they were likely intermittent, with respondent attention

fluctuating throughout the survey. Future research that uses passively measured, fine-grained

indicators of respondents’ activities and environment would be valuable (see Sendelbah et

al., 2016). Nonetheless, we think this analysis demonstrates that respondent characteristics – in

particular, multitasking and respondents’ level of experience with smartphones – contribute to

the smartphone-PC time difference.

Will the time gap eventually be erased? If mobile survey design improves and users gain

more experience using smartphones, the time gap will likely shrink. However, it seems unlikely

to be eliminated given the prevalence of multitasking on smartphones.

The shift towards mobile data collection shows no signs of slowing down, and research in

this area must keep pace. An important task will be to investigate the factors affecting response

behaviors when using smartphones, including response latencies. The resulting findings will

likely inform new strategies to improve the design and implementation of mobile surveys.

Author Information

Christopher Antoun is an assistant research professor in the Joint Program in Survey

Methodology and College of Information Studies at the University of Maryland. E-mail:

[email protected]

Alexandru Cernat is a lecturer in Social Statistics in the School of Social Sciences at the

University of Manchester. E-mail: [email protected]

Acknowledgements

The data for this analysis come from an experiment carried out by the LISS panel (Longitudinal

Internet Studies for the Social Sciences), which is administered by CentERdata (Tilburg

University, The Netherlands). We are grateful to them for conducting this experiment and for

providing the timing data.

References

American Association for Public Opinion Research (AAPOR). (2015). Standard definitions:

Final dispositions of case codes and outcome rates for surveys. Available at

http://www.aapor.org/AAPORKentico/AAPOR_Main/media/publications/Standard-Definitions2015_8theditionwithchanges_April2015_logo.pdf.

Andreadis, I. (2015). Comparison of response times between desktop and smartphone users. In

Toninelli, D. et al. (eds.) Mobile Research Methods, 63–79. London: Ubiquity Press.

Antoun, C., Couper, M. P., & Conrad, F. G. (2017). Effects of mobile versus PC web on survey

response quality: A crossover experiment in a probability web panel. Public Opinion

Quarterly, 81, 280–306.

Callegaro, M. (2010). Do you know which device your respondent has used to take your online

survey? Using paradata to collect information on device type. Survey Practice, 3 (6).

Couper, M. P., & Peterson, G. J. (2017). Why do web surveys take longer on smartphones?

Social Science Computer Review, 35, 357–377.

Couper, M. P., Antoun, C. & Mavletova, A. (2017). Mobile web surveys: A total survey error

perspective. In Biemer, P. et al. (eds.) Total Survey Error in Practice, 133–154. New York:

Wiley.

Fernee, H., & Sonck, N. (2013). Is everyone able to use a smartphone in survey research?

Survey Practice, 6, 1–7.

Gummer, T., & Roßmann, J. (2015). Explaining interview duration in web surveys: A multilevel

approach. Social Science Computer Review, 33, 217–234.

Link, M. W., Murphy, J., Schober, M. F., Buskirk, T. D., Hunter Childs, J., & Langer Tesfaye,

C. (2014). Mobile technologies for conducting, augmenting and potentially replacing

surveys: Executive summary of the AAPOR task force on emerging technologies in public

opinion research. Public Opinion Quarterly, 78, 779–787.

Mavletova, A., & Couper, M. P. (2016). Device use in web surveys: The effect of differential

incentives. International Journal of Market Research, 58, 523–544.

McGeeney, K., & Marlar, J. (2013). Mobile browser web surveys: Testing response rates, data

quality, and best practices. Paper presented at the AAPOR annual conference, Boston,

May.

Scherpenzeel, A. C. (2011). Data collection in a probability-based internet panel: How the LISS

panel was built and how it can be used. Bulletin of Sociological Methodology 109, 56–61.

Sendelbah, A., Vehovar, V., Slavec, A., & Petrovčič, A. (2016). Investigating respondent

multitasking in web surveys using paradata. Computers in Human Behavior, 55, 777–787.

Snijders, T. A. B., & Bosker, R. (2011). Multilevel Analysis: An Introduction to Basic and

Advanced Multilevel Modeling (2nd rev. ed.). Sage Publications Ltd.

Yan, T., & Tourangeau, R. (2008). Fast times and easy questions: the effects of age, experience

and question complexity on web survey response times. Applied Cognitive Psychology, 22,

51–68.

Table 1. Description and distribution of page-level and respondent-level predictors.

Predictor                          Description                                      Percentage/Mean

Page-level
  Question type
    Single-choice                  Radio button and check box questions             55%
    Text entry                     Mix of numeric entry and text entry              45%
  Scrolling required in mobile version
    Yes                                                                             18%
    No                                                                              82%

Respondent-level (time varying)
  Multitasking (# of other tasks)  Self-reported at end of each survey
    0                                                            Mobile web: 47%; PC web: 56%
    1                                                            Mobile web: 28%; PC web: 25%
    2                                                            Mobile web: 17%; PC web: 15%
    3                                                            Mobile web: 5%;  PC web: 4%
    4                                                            Mobile web: 1%;  PC web: <1%
    5+                                                           Mobile web: 1%;  PC web: <1%
  Distracted                       Self-reported at end of each survey
    Yes                                                          Mobile web: 38%; PC web: 36%
    No                                                           Mobile web: 62%; PC web: 64%

Respondent-level (fixed)
  Experience with smartphones      Survey-taking experience self-reported in
                                   earlier survey (a)
    Low                            Smartphone provided for experiment               38%
    Medium                         Own smartphone but don’t use it to complete      56%
                                   web surveys
    High                           Own smartphone and sometimes use it to            7%
                                   complete surveys
  Frequency of smartphone use      Self-reported in earlier survey (a)
    “Never”                                                                         26%
    “Rarely”                                                                         5%
    “Some days”                                                                     10%
    “Most days”                                                                     12%
    “Every day”                                                                     46%
  Corrected eyesight               Eyesight with “(reading) glasses or contact
                                   lenses.” Self-reported in earlier survey (b)
    “excellent”/“very good”/“good”                                                  89%
    “reasonable”/“poor”                                                             11%
  Working with hands and fingers   Self-reported in earlier survey (b)
    “no complaints”                                                                 92%
    “complaints”                                                                     8%

(a) Nonresponse and Measurement Error in Mobile Web Surveys – Baseline (Sept 2013)
(b) LISS Core Study – Health Wave 7 (Nov–Dec 2013)

Table 2. Results from multilevel models explaining log response times for PC web, mobile web, and within-respondent differences between the two. Standard errors in parentheses.

Coefficients            PC-empty    PC-full     Mobile-empty  Mobile-full  Diff-empty  Diff-full
Intercept               2.30***     3.34***     2.65***       3.60***      0.35***     0.20*
                        (0.12)      (0.25)      (0.11)        (0.21)       (0.03)      (0.10)
Female                              0.01                      0.01                     0.00
                                    (0.02)                    (0.02)                   (0.02)
Age: 31-45                          0.15***                   0.08*                    -0.07**
                                    (0.03)                    (0.03)                   (0.03)
Age: 46-60                          0.28***                   0.21***                  -0.07*
                                    (0.03)                    (0.03)                   (0.03)
Age: 61-75                          0.44***                   0.34***                  -0.10**
                                    (0.04)                    (0.04)                   (0.03)
College                             -0.03                     -0.07***                 -0.04
                                    (0.02)                    (0.02)                   (0.02)
Text input                          -0.81***                  -0.73***                 0.08
                                    (0.15)                    (0.13)                   (0.05)
Scrolling                           0.60**                    0.60***                  0.00
                                    (0.20)                    (0.16)                   (0.06)
Multitasking                        0.07***                   0.06***                  0.06***
                                    (0.01)                    (0.01)                   (0.01)
Distracted                          0.06**                    0.06**                   -0.02
                                    (0.02)                    (0.02)                   (0.02)
Experience: Medium                  -0.03                     0.06                     0.09*
                                    (0.04)                    (0.04)                   (0.04)
Experience: Low                     -0.06                     0.24***                  0.30***
                                    (0.05)                    (0.05)                   (0.05)
Freq. smartphone use                -0.03**                   -0.07***                 -0.04***
                                    (0.01)                    (0.01)                   (0.01)
Poor eyesight                       0.02                      0.03                     0.00
                                    (0.03)                    (0.03)                   (0.03)
Dexterity complaints                0.01                      0.01                     0.00
                                    (0.04)                    (0.04)                   (0.04)
Second wave                         -0.08***                  -0.07***
                                    (0.02)                    (0.02)
Mobile first                                                                           0.16***
                                                                                       (0.02)
AIC                     34037.16    33821.29    33635.21      33144.53     42942.45    42706.76
BIC                     34068.35    33977.26    33666.40      33300.50     42973.64    42862.74
Log Likelihood          -17014.58   -16890.64   -16813.60     -16552.26    -21467.22   -21333.38
N: Respondents x Pages  18011       18011       18011         18011        18011       18011
N: Respondents          822         822         822           822          822         822
N: Pages                22          22          22            22           22          22
Var: Respondents        0.10        0.06        0.15          0.07         0.09        0.05
Var: Pages              0.33        0.13        0.26          0.09         0.01        0.01
Var: Residual           0.35        0.35        0.34          0.34         0.59        0.59

* p < 0.05, ** p < 0.01, *** p < 0.001
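Because these models are fit on log response times, coefficients are most easily read as approximate multiplicative effects after exponentiation. As a minimal illustration (the arithmetic below is ours, not code from the original analysis; the numeric values are copied from the Diff models in Table 2):

```python
import math

# Estimates from the Diff (within-respondent) models in Table 2;
# all models are fit on log response times.
gap_empty = 0.35      # smartphone-PC gap, empty model
gap_full = 0.20       # gap after adding page- and respondent-level predictors
multitasking = 0.06   # coefficient per additional concurrent task (Diff-full)

# exp() converts a log-scale coefficient into a multiplicative factor.
print(f"Unadjusted gap:  smartphone RTs about {math.exp(gap_empty):.2f}x PC RTs")
print(f"Adjusted gap:    about {math.exp(gap_full):.2f}x")
print(f"Each extra task: about {(math.exp(multitasking) - 1):.1%} longer")
```

On this reading, adding the predictors shrinks the estimated smartphone-PC gap from roughly 42% to roughly 22% longer response times.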

Figure 1. Single-choice question in mobile (left) and PC web (right).

Figure 2. Text-entry question in mobile (left) and PC web (right).

Figure 3. Variance decompositions for the three models explaining log times in PC web and mobile web as well as the difference between them.
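The shares plotted in Figure 3 can be approximated directly from the variance components reported in Table 2. A short sketch (our computation, using the empty-model variances; the labels are ours):

```python
# Variance components copied from the empty models in Table 2.
components = {
    "PC":     {"respondents": 0.10, "pages": 0.33, "residual": 0.35},
    "Mobile": {"respondents": 0.15, "pages": 0.26, "residual": 0.34},
    "Diff":   {"respondents": 0.09, "pages": 0.01, "residual": 0.59},
}

# Express each component as a share of the total variance for its model.
for model, var in components.items():
    total = sum(var.values())
    shares = ", ".join(f"{k}: {v / total:.0%}" for k, v in var.items())
    print(f"{model}: {shares}")
```

For the Diff model, pages account for only about 1% of the variance while respondents account for about 13%, consistent with the conclusion that respondent-level characteristics, rather than page characteristics, drive the smartphone-PC time difference.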
