Introduction
Most people live busy lives, filling their days with a wide range of activities and experiences. While retrospective questions on traditional surveys have been the typical method employed to measure these experiences, they suffer from a number of weaknesses, most notably the inability of people to accurately recall the minute details of what they did and how they interpreted those experiences in the moment. A vast literature chronicles the difficulties and biases inherent in such retrospective measurement (Tourangeau, Rips, & Rasinski, 2000).
A particular form of survey research known as “experience sampling” offers an alternative that addresses some of the weaknesses of traditional surveys in this respect. Proponents of this method argue that its strength is “to capture daily life as it is directly perceived from one moment to the next, affording an opportunity to examine fluctuations in the stream of consciousness and the links between the external context and the contents of the mind” (Hektner, Schmidt, & Csikszentmihalyi, 2007).
Experience sampling entails short, repeated data collection from a sample of individuals at specified times, usually upon a signal from the researcher. When first introduced, experience sampling required special technology for contacting individuals, who had to carry a copy of the survey with them or answer a telephone call. The widespread adoption of smartphones has made it feasible both to contact people and to collect data from them using short, self-administered surveys.
This study presents the results of an experiment with “signal-contingent experience sampling” using smartphones to study attitudes and behavior related to smartphone use itself. In addition to documenting the use of experience sampling among a nationally representative survey panel, the study examines the feasibility, costs and benefits of using mobile software applications (smartphone “apps”) rather than web-based means of contact and data collection for studies of this nature. The research questions this study was intended to begin to address included whether panelists would respond to experience sampling surveys at all, how the response rate to these surveys would differ by assignment to the app vs. web treatment, whether different demographic groups would be more or less likely to respond using the app, and how the substantive survey responses would differ by experimental treatment.
About This Report
This report utilizes a form of survey known as “signal-contingent experience sampling” to gather data about how Americans use their smartphones on a day-to-day basis. Respondents were asked to complete two surveys per day for one week (using either a mobile app they had installed on their phone or a web survey) and describe how they had used their phone in the hour prior to taking the survey. This report examines whether this type of intensive data collection is possible with a probability-based panel and explores the differences in participation and responses when a smartphone app, as opposed to a web browser, is used for this type of study.
This study is a complement to the core data collection of the report U.S. Smartphone Use in 2015, which examined the increasingly important role that smartphones play in helping Americans access, share, and create information and communicate with others. It places a particular focus on the sometimes-fragile financial and technical circumstances of those who rely heavily on their smartphones for internet access.
These findings are based on 2,011 smartphone owners in Pew Research Center’s American Trends Panel, surveyed in November 2014.
Pew Research Center is a subsidiary of The Pew Charitable Trusts, its primary funder. This report was made possible by The Pew Charitable Trusts, which received support for the project from the John S. and James L. Knight Foundation.
This report is a collaborative effort based on the input and analysis of the following individuals.
Kyley McGeeney, Research Methodologist
Scott Keeter, Director of Survey Research
Ruth Igielnik, Research Analyst
Aaron Smith, Senior Researcher
Lee Rainie, Director, Internet, Science & Technology Research