Published on 29.06.16 in Vol 5, No 2 (2016): Apr-Jun
Preprints (earlier versions) of this paper are available at http://preprints.jmir.org/preprint/5170, first published Sep 28, 2015.
Feasibility of an Electronic Survey on iPads with In-Person Data Collectors for Data Collection with Health Care Professionals and Health Care Consumers in General Emergency Departments
Background: Translating Emergency Knowledge for Kids was established to bridge the research-practice gap in pediatric emergency care by bringing the best evidence to Canadian general emergency departments (EDs). The first step in this process was to conduct a national needs assessment to determine the information needs and preferences of health professionals and parents in this clinical setting.
Objective: To describe the development and implementation of two electronic surveys, and determine the feasibility of collecting electronic survey data on iPads with in-person data collectors in a busy clinical environment.
Methods: Two descriptive surveys were conducted in 32 general EDs. Specific factors were addressed in four survey development and implementation stages: survey design, survey delivery, survey completion, and survey return. Feasibility of the data collection approach was determined by evaluating participation rates, completion rates, average survey time to completion, and usability of the platform. Usability was assessed with the in-person data collectors on five key variables: interactivity, portability, innovativeness, security, and proficiency.
Results: Health professional participation rates (1561/2575, 60.62%) and completion rates (1471/1561, 94.23%) were strong. Parental participation rates (974/1099, 88.63%) and completion rates (897/974, 92.09%) were excellent. Mean time to survey completion was 28.08 minutes for health professionals and 43.23 minutes for parents. Data collectors rated the platform “positively” to “very positively” on all five usability variables.
Conclusions: A number of design and implementation considerations were explored and integrated into this mixed-mode survey data collection approach. Feasibility was demonstrated by the robust survey participation and completion rates, reasonable survey completion times, and very positive usability evaluation results.
JMIR Res Protoc 2016;5(2):e139
- survey development;
- electronic survey;
- survey implementation;
- needs assessment;
- pediatric emergency medicine
The Translating Emergency Knowledge for Kids (TREKK) project aims to bridge the research-to-practice gap in pediatric emergency medicine, and reduce variability in emergency care, by ensuring that practitioners in general emergency departments (EDs) have access to and apply the latest research evidence in their pediatric practice. The first phase of the TREKK project (the Needs Assessment) surveyed health care professionals, and parents seeking care for their children, in general EDs to determine the information needs and preferences that would guide the development of knowledge translation tools on key child health topics. Given the well-documented challenges of survey research [ ], particularly in health research [ - ], specific factors were addressed in four survey development and implementation stages: survey design, survey delivery, survey completion, and survey return. The aim of this approach was to increase data quality (ie, increasing participation rates, reducing item nonresponse, and reducing dropouts) [ ].
Despite a wealth of research, much debate remains regarding the superiority of electronic and/or mailed paper surveys [, ]. Recent studies indicate that the future of survey research involves mixed-mode approaches (ie, two or more modes of administration including mail, web, telephone, and/or in-person) [ ] and/or additional recruitment techniques to generate higher response rates [ - ]. However, further research is recommended to describe variations in survey content and administration, and effects on participation rates and data quality [ ]. In this paper, we describe electronic survey development and implementation using iPads and in-person data collectors. We also detail the feasibility of this novel mixed-mode approach to survey research by providing survey response rate results, average length of time for survey completion, and the results of a usability evaluation with data collectors.
Given the complexity of the ED environment (ie, fast-paced, high volume, high acuity), traditional paper-based surveys were too cumbersome and resource-intensive for this study. An electronic survey was determined to be the most appropriate method to meet study timelines, due to ease of implementation across a large geographic area and decreased administrative costs [, , , ]. Eligible participants were health care professionals working in a participating general ED, or parents seeking care for a child in the ED. Participants were excluded if they were unable to read or write English or French.
Survey questions were developed using relevant research literature and in consultation with content experts in pediatric emergency medicine, nursing, and information science. The surveys underwent several iterations, and face validity was determined through team meetings and pilot testing within the research team. Both surveys were developed in English and translated into French. The Health Care Professional Needs Assessment survey collected demographic information, current information-seeking practices, information needs related to caring for children in the ED, and preferences for receiving new information related to caring for children in the ED ( ). The Parent Needs Assessment survey collected demographic information, information about the current visit to the ED, and health information needs and preferences ( ).
In addition to survey content, six features affecting response rate of web-based surveys were considered: (1) general format, (2) length, (3) disclosure of survey progress, (4) visual presentation, (5) interactivity, and (6) question/response format. A screen design format was selected to display one question per page, as this design has been shown to have lower item nonresponse than scrolling designs [ ]. Careful consideration was given to survey length, and the surveys were constructed to achieve an average survey completion time of approximately 20 minutes. This target aimed to mitigate busyness as a barrier to health care professional participation [ ], and to maximize the opportunity for parents to complete the survey in the waiting room before being brought into an examination room. Disclosure of survey progress has shown limited effect on response rates, so we did not incorporate this feature in our surveys [ ]. In terms of visual presentation, a plain approach with selective use of color was used, based on research indicating higher completion rates and later dropouts with this design [ ]. Sans serif font was selected for ease of readability on a screen [ ], questions and responses were numbered, and bolding, shading, italics, and color were used in a consistent fashion, with the aim of enhancing understandability. Arrows were used to direct participants to subsequent screens. Interactivity was also incorporated, as it has been linked to lower item nonresponse [ ]. This feature included automatic jumps to questions based on previous answers, and a missing data message was displayed when an item was left blank; however, responses were not forced due to the association of this option with higher dropout rates [ ].
Four question/response formats were used throughout the surveys, and included single and multi-touch responses with radio buttons, sliding scales, and drag and drop boxes in which responses could be dragged to a new column and rank ordered ( ).
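The skip logic and non-forced ("soft") validation described above can be illustrated with a minimal sketch. The questions, branching rule, and function names below are hypothetical examples, not the actual TREKK survey content:

```python
# Illustrative sketch of conditional skip logic and non-forced validation.
# All question text and branching rules are hypothetical, not taken from
# the actual TREKK surveys.

def ask(question, response):
    """One question per page; a blank response triggers a missing-data
    message but is NOT forced -- the participant may continue anyway."""
    if response is None:
        print(f"Reminder: '{question}' was left blank. You may continue.")
    return response

def run_survey(answers):
    """Walk the survey, jumping over questions based on earlier answers."""
    collected = {}
    collected["role"] = ask("What is your role in the ED?", answers.get("role"))
    # Automatic jump: only physicians see the physician-specific question.
    if collected["role"] == "physician":
        collected["specialty"] = ask("What is your specialty?",
                                     answers.get("specialty"))
    collected["info_needs"] = ask("What are your information needs?",
                                  answers.get("info_needs"))
    return collected

# A nurse skips the physician-only question; the blank final item is
# flagged but still accepted, mirroring the non-forced response design.
result = run_survey({"role": "nurse", "info_needs": None})
```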
Consistent hardware was used at each site to streamline training and mitigate technological issues in survey delivery. iPads were selected as the most effective survey delivery and data collection device because of their functionality and participant preference. iPads are lightweight devices that are easy to transport and hand to participants. The iPad interface is user-friendly and the touch screen technology, which has been shown to reduce mean time for patients completing questionnaires [ ], enabled new and interesting approaches to survey question design [ ]. In addition, previous research has demonstrated that both health care professionals and parental respondents preferred participation on a tablet compared to paper-based surveys [ - ], and found these devices easy to use [ , ]. Furthermore, electronic tablets have been shown to be a viable method of collecting patient self-report data in pediatric waiting rooms [ ] and in rural settings [ ].
External validity has been identified as a methodological issue of concern in survey research [- , , ]; we attempted to address this issue by having in-person data collectors accompany the technology while conducting on-site recruitment. Census sampling aimed to recruit all health care professionals, and convenience sampling was used to recruit parents. The protocol for recruitment involved approaching all staff and parents to introduce TREKK and determine study eligibility. iPads were provided to interested parties to review the electronic consent form; once consent was indicated, participants automatically proceeded to the electronic survey. Data collectors were available throughout survey recruitment and completion to answer questions and assist with overcoming any technological barriers, including comfort with web browsers and touch screen technology. This consideration was based on research suggesting that the amount of contact and length of time in the field are important factors in health care professional response rates [ ]. Surveys could be kept open for any length of time and were closed when the Submit button was tapped at the end of the survey, or the browser window was closed. The survey was designed with this flexibility, as data collection occurred in unpredictable and busy EDs.
Survey Return & Technical Issues
The electronic survey platform incorporated synchronous and asynchronous data collection capability, meaning that data could be collected online and automatically uploaded to a secure server when a wireless connection was present, and data could also be collected offline, safely stored on the device, and later uploaded to a secure server once a wireless connection was available. This feature was particularly important in the ED setting, as many hospitals do not provide wireless internet and 3G/4G connectivity is limited or non-existent in rural and remote regions. This approach also addressed previously identified security concerns with cloud-based data storage [, ]. Automatic data upload also eliminated the need for data entry, thus reducing the potential for error, and allowed the research team to monitor data quality via a secure, password-protected portal to provide feedback or additional training to data collectors. This functionality required enhanced device security; however, iPads are equipped with the appropriate security features to meet this need, including passcodes and restrictions to limit access and usage, encryption to protect information stored on the device, GPS technology to track the device, and remote and automatic data wiping capabilities (in the event that the device is lost or stolen) [ ].
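The store-and-forward behavior described above can be sketched as follows. This is an illustrative sketch only; the class and method names are assumptions, not the actual survey platform's API:

```python
# Minimal sketch of the synchronous/asynchronous upload pattern described
# above: responses are stored locally on the device and flushed to the
# secure server whenever a wireless connection is available.
# All names here are illustrative assumptions.
import json

class SurveyStore:
    def __init__(self, upload_fn):
        self.pending = []           # responses held safely on the device
        self.upload_fn = upload_fn  # e.g. an HTTPS POST to a secure server

    def save(self, response, online):
        """Upload immediately when online; otherwise queue for later."""
        if online:
            self.flush()  # clear any offline backlog first
            self.upload_fn(json.dumps(response))
        else:
            self.pending.append(response)

    def flush(self):
        """Upload everything queued while the device was offline."""
        while self.pending:
            self.upload_fn(json.dumps(self.pending.pop(0)))

uploaded = []
store = SurveyStore(uploaded.append)
store.save({"id": 1}, online=False)  # no wireless: held on the device
store.save({"id": 2}, online=True)   # connection restored: backlog + new record sent
```

Because the device holds unsent responses, it becomes the point of risk, which is why the device-level security features listed above (passcodes, encryption, remote wipe) matter in this design.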
Usability Evaluation of Data Collection Platform
A usability evaluation was conducted with data collectors using an anonymous 20-question electronic survey. The content of the survey was theoretically informed by a small-scale review of previous studies employing electronic platforms for data collection and the National Institutes of Health’s Usability Guidelines. These guidelines outline key features of visual design and user experience, including interactivity, portability, innovativeness, security, and proficiency. The usability survey asked data collectors to rate features of the electronic survey and iPad on a five-point scale according to how positively or negatively each feature affected their experience collecting data for the TREKK Needs Assessment ( ). Face validity was determined via team meetings and pilot testing within the research team.
Survey Participation & Completion Rates
The recruitment rate for health care professionals was 68.66% (1768/2575) and the participation rate was 60.62% (1561/2575); among participants, the survey completion rate was 94.23% (1471/1561). Among parents, the recruitment rate could not be determined because the eligible population depended on who presented to the general EDs; however, the participation rate was 88.63% (974/1099) and, among participants, the survey completion rate was 92.09% (897/974). See the accompanying figure for recruitment and participation details for both populations.
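The rates reported above follow directly from the raw counts; as a quick check, the percentages can be reproduced with a few lines of Python:

```python
# Reproducing the participation and completion rates reported in the text
# from the raw counts given alongside them.
def rate(numerator, denominator):
    """Percentage rounded to two decimal places, as reported in the paper."""
    return round(100 * numerator / denominator, 2)

assert rate(1768, 2575) == 68.66  # health professional recruitment rate
assert rate(1561, 2575) == 60.62  # health professional participation rate
assert rate(1471, 1561) == 94.23  # health professional completion rate
assert rate(974, 1099) == 88.63   # parent participation rate
assert rate(897, 974) == 92.09    # parent completion rate
```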
Length of Time for Survey Completion
Of the 1471 health care professional surveys included in the final analysis, 79 surveys did not have a time stamp (ie, participants did not press Submit at the end of the survey) and were not included in the calculation of average survey completion time. Of the 1391 surveys with a time stamp, the mean time to survey completion was 28.08 minutes (standard deviation = 118.54 minutes). This estimate includes participants who left the survey open and returned to complete it at a later time (eg, 9 surveys were open for more than 1000 minutes).
Of the 897 parental surveys included in the final analysis, 25 did not have a time stamp and were not included in the calculation of average survey completion time. Of the 872 surveys with a time stamp, the mean time to survey completion was 43.23 minutes (standard deviation = 691.25 minutes). This estimate includes participants who left the survey open and returned to complete it at a later time (eg, 4 surveys were open for more than 1000 minutes).
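The effect of a few long-open surveys on these summary statistics can be illustrated with a small sketch. The times below are invented for illustration only and are not the study data:

```python
# Sketch of how long-open surveys inflate the mean and standard deviation
# of completion time. The values below are fabricated for illustration;
# they are NOT the study data.
import statistics

# Five typical completion times plus one survey left open and resumed later.
times = [12.0, 15.0, 18.0, 20.0, 22.0, 1200.0]  # minutes

mean = statistics.mean(times)      # dragged far above the typical value
sd = statistics.stdev(times)       # similarly inflated by the outlier
median = statistics.median(times)  # robust to the long-open survey

# One long-open survey pulls the mean above 100 minutes even though most
# surveys finished in about 20, mirroring the pattern reported above.
assert mean > 100
assert median < 25
```

This is why the large standard deviations reported above (118.54 and 691.25 minutes) coexist with completion times that were, for most participants, much closer to the 20-minute design target.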
Usability Evaluation of Data Collection Platform
Eight data collectors were approached to participate in the usability evaluation, six of whom agreed to participate and complete the survey (75% participation and completion rate). On a five-point scale, responses were largely positive (score=4) to very positive (score=5) on the five usability measures (see the accompanying table). Additionally, respondents could enter free text to describe, in detail, their perspectives on the strengths and weaknesses of iPads as a data collection tool. Respondents noted that the professional look and feel of the survey created a sense of trustworthiness and legitimacy of the research study, as demonstrated by the following excerpt: “With a unique online survey specific for TREKK, it appears more trustworthy as a legitimate research study, rather than having… paper surveys.” However, respondents also noted some drawbacks to this approach, such as: “The only real negative of using the iPad relates to the survey participants [sic] level of comfort with technology, but not to such an extent that it affects participation - only initial comfort.”
Our findings make an important contribution to the web-based survey literature by addressing calls for research to examine web-based survey response and completion issues. Generally, web-based surveys are associated with low response rates [ ], with estimates suggesting an average 10% decrease in response rates compared to traditional paper-based surveys. Given that 68.66% of health care professionals approached in our study reviewed the consent, and 94.23% of those who consented completed the survey, it is apparent that our mixed-mode approach mitigated these commonly accepted disadvantages of web-based surveys.
Utilizing data collectors to approach potential survey participants and explain the study eliminated the need for email or web-based invitations and completion reminders, and substantially enhanced survey participation and completion rates. We suggest that the addition of in-person data collectors offered the benefit of personal connection, and prompted potential survey participants to make an active decision about participating in the study. In-person data collectors were also able to engage with potential participants and answer any questions they had about the study or the technology. By contrast, deploying a web-based survey without a mixed-mode approach allows potential participants to easily ignore electronic invitations to participate in survey research. With a mixed-mode approach, we were able to capitalize on the many benefits of web-based surveys, including improved data quality and the ability to immediately begin data analysis, while simultaneously mitigating previously reported drawbacks of web-based surveys, including lower response and completion rates.
Vicente and Reis itemized six areas to consider when designing and implementing web-based surveys, and our usability findings support five of these recommendations (general structure, survey length, visual presentation, interactivity, and question/response format). Specifically, the usability findings collected by our in-person data collectors were largely positive to very positive for interactivity, portability, innovativeness, security, and proficiency. Free text responses further strengthened these findings by highlighting that the general structure, visual presentation, and question/response format of our survey helped to legitimize and enhance the credibility of our study. These findings highlight the importance of investing time and resources in survey design and implementation. Our participation and completion rates, and survey usability findings, are evidence that attention to survey design and implementation is strategic.
This study provides strong evidence for the feasibility of a mixed-mode approach to survey data collection using iPads and in-person data collectors, based on strong response rates, reasonable survey completion times, and very positive usability evaluation results. This study also details survey development and implementation considerations that will be useful to survey researchers working with a variety of populations. Great potential exists for utilizing a mixed-mode approach for future survey research in clinical settings.
TREKK is funded by the Networks of Centers of Excellence – Knowledge Mobilization initiative, and supported (either financially or through in-kind contributions) by the following agencies and institutions: Pediatric Emergency Research Canada, Children’s Hospital Research Institute of Manitoba, University of Manitoba, Research Manitoba, ManitobaCPD.com, Cochrane Child Health Field, Alberta Children’s Hospital Research Institute, Children’s Hospital of Eastern Ontario, Ontario ED Local Health Integration Network Leads, Alberta Health Services, and Women and Children’s Health Research Institute. SDS is a Canada Research Chair (tier II) for knowledge translation in child health and a Population Health Investigator with Alberta Innovates – Health Solutions (formerly Alberta Heritage Foundation for Medical Research). LA is supported by Alberta Innovates – Health Solutions Graduate Studentship and Women & Children’s Health Research Institute Graduate Studentship.
We would like to acknowledge and thank the TREKK Team, including the Board of Directors, Prioritization Committee, Nodal Leaders, and Pediatric Emergency Research Canada Representatives, Coordinators, and Administrative staff for their dedication to this project.
TPK and SDS obtained research funding. SDS conceptualized and led the research study. SDS, LA and LMG designed the electronic surveys. LA conducted data collection and data analysis for the Needs Assessment. SDS and DA conceptualized the small-scale iPad evaluation project. DA conducted data collection and data analysis for this evaluation. SDS and LA wrote the manuscript. All authors provided substantive feedback and approved the manuscript prior to submission.
Conflicts of Interest
Multimedia Appendix 1
TREKK Needs Assessment Healthcare Professional Survey.PDF File (Adobe PDF File), 51KB
Multimedia Appendix 2
TREKK Needs Assessment Parent Survey.PDF File (Adobe PDF File), 49KB
Multimedia Appendix 3
Electronic survey question design.PDF File (Adobe PDF File), 219KB
Multimedia Appendix 4
TREKK Usability Survey with In-person Data Collectors.PDF File (Adobe PDF File), 34KB
- Translating Emergency Knowledge for Kids. Translating Emergency Knowledge for Kids. 2015. URL: http://trekk.ca/about [accessed 2015-08-26] [WebCite Cache]
- Shih TH, Fan X. Comparing response rates from web and mail surveys: a meta-analysis. Field Methods 2008 Mar 03;20(3):249-271. [CrossRef]
- Burns KE, Duffett M, Kho M, Meade M, Adhikari NK, Sinuff T, et al. A guide for the design and conduct of self-administered surveys of clinicians. CMAJ 2008 Jul 29;179(3):245-252 [FREE Full text] [CrossRef] [Medline]
- VanGeest JB, Johnson TP, Welch VL. Methodologies for improving response rates in surveys of physicians: a systematic review. Eval Health Prof 2007 Dec;30(4):303-321. [CrossRef] [Medline]
- Braithwaite D, Emery J, De Lusignan S, Sutton S. Using the Internet to conduct surveys of health professionals: a valid alternative? Fam Pract 2003 Oct;20(5):545-551 [FREE Full text] [Medline]
- Morris M, Edwards P, Doyle P, Maconochie N. Women in an infertility survey responded more by mail but preferred a choice: randomized controlled trial. J Clin Epidemiol 2013 Feb;66(2):226-235. [CrossRef] [Medline]
- Golnik A, Ireland M, Borowsky IW. Medical homes for children with autism: a physician survey. Pediatrics 2009 Mar;123(3):966-971. [CrossRef] [Medline]
- Rodriguez H, von Glahn T, Rogers W, Chang H, Fanjiang G, Safran D. Evaluating patients' experiences with individual physicians: a randomized trial of mail, Internet, and interactive voice response telephone administration of surveys. Med Care 2006 Feb;44(2):167-174. [Medline]
- Yusuf TE, Baron TH. Endoscopic transmural drainage of pancreatic pseudocysts: results of a national and an international survey of ASGE members. Gastrointest Endosc 2006 Feb;63(2):223-227. [CrossRef] [Medline]
- Fan W, Yan Z. Factors affecting response rates of the web survey: a systematic review. Comp Human Behav 2010 Mar;26(2):132-139. [CrossRef]
- Cunningham CT, Quan H, Hemmelgarn B, Noseworthy T, Beck CA, Dixon E, et al. Exploring physician specialist response rates to web-based surveys. BMC Med Res Methodol 2015 Apr;15:32 [FREE Full text] [CrossRef] [Medline]
- Beebe TJ, Locke GR, Barnes SA, Davern ME, Anderson KJ. Mixing web and mail methods in a survey of physicians. Health Serv Res 2007 Jun;42(3 Pt 1):1219-1234 [FREE Full text] [CrossRef] [Medline]
- Raziano DB, Jayadevappa R, Valenzula D, Weiner M, Lavizzo-Mourey R. E-mail versus conventional postal mail survey of geriatric chiefs. Gerontologist 2001 Dec 01;41(6):799-804. [CrossRef] [Medline]
- Potts HW, Wyatt JC. Survey of doctors' experience of patients using the Internet. J Med Internet Res 2002;4(1):e5 [FREE Full text] [CrossRef] [Medline]
- Dykema J, Jones NR, Piché T, Stevenson J. Surveying clinicians by web: current issues in design and administration. Eval Health Prof 2013 Sep;36(3):352-381. [CrossRef] [Medline]
- VanDenKerkhof EG, Parlow JL, Goldstein DH, Milne B. In Canada, anesthesiologists are less likely to respond to an electronic, compared to a paper questionnaire. Can J Anaesth 2004 May;51(5):449-454. [Medline]
- Vicente P, Reis E. Using questionnaire design to fight nonresponse bias in web surveys. Soc Sci Comput Rev 2010 May 10;28(2):251-267. [CrossRef]
- Thompson DS, O'Leary K, Jensen E, Scott-Findlay S, O'Brien-Pallas L, Estabrooks CA. The relationship between busyness and research utilization: it is about time. J Clin Nurs 2008 Feb;17(4):539-548. [CrossRef] [Medline]
- Josephson S. Keeping your readers' eyes on the screen: an eye-tracking study comparing sans serif and serif typefaces. Vis Comm Q 2011 Jan 20;15(1-2):67-79. [CrossRef]
- Salaffi F, Gasparini S, Ciapetti A, Gutierrez M, Grassi W. Usability of an innovative and interactive electronic system for collection of patient-reported data in axial spondyloarthritis: comparison with the traditional paper-administered format. Rheumatology (Oxford) 2013 Nov;52(11):2062-2070 [FREE Full text] [CrossRef] [Medline]
- Linder LA, Ameringer S, Erickson J, Macpherson CF, Stegenga K, Linder W. Using an iPad in research with children and adolescents. J Spec Pediatr Nurs 2013 Apr;18(2):158-164. [CrossRef] [Medline]
- Aiello EJ, Taplin S, Reid R, Hobbs M, Seger D, Kamel H, et al. In a randomized controlled trial, patients preferred electronic data collection of breast cancer risk-factor information in a mammography setting. J Clin Epidemiol 2006 Jan;59(1):77-81. [CrossRef] [Medline]
- Tscholl DW, Weiss M, Spahn DR, Noethiger CB. How to conduct multimethod field studies in the operating room: the iPad combined with a survey app as a valid and reliable data collection tool. JMIR Res Protoc 2016 Jan;5(1):e4 [FREE Full text] [CrossRef] [Medline]
- Newell SM, Logan HL, Guo Y, Marks JG, Shepperd JA. Evaluating tablet computers as a survey tool in rural communities. J Rural Health 2015;31(1):108-117 [FREE Full text] [CrossRef] [Medline]
- Fanning J, McAuley E. A comparison of tablet computer and paper-based questionnaires in healthy aging research. JMIR Res Protoc 2014 Jul;3(3):e38 [FREE Full text] [CrossRef] [Medline]
- Anand V, McKee S, Dugan TM, Downs SM. Leveraging electronic tablets for general pediatric care: a pilot study. Appl Clin Inform 2015 Jan;6(1):1-15 [FREE Full text] [CrossRef] [Medline]
- Akl EA, Maroun N, Klocke RA, Montori V, Schünemann HJ. Electronic mail was not better than postal mail for surveying residents and faculty. J Clin Epidemiol 2005 Apr;58(4):425-429. [CrossRef] [Medline]
- Granello D, Wheaton J. Online data collection: strategies for research. J Couns Dev 2004;82(4):387-393. [CrossRef]
- Wilcox AB, Gallagher KD, Boden-Albala B, Bakken SR. Research data collection methods: from paper to tablet computers. Med Care 2012 Jul;50 Suppl:S68-S73. [CrossRef] [Medline]
- Wilcox AB, Gallagher K, Bakken S. Security approaches in using tablet computers for primary data collection in clinical research. EGEMS (Wash DC) 2013 Feb;1(1):1008 [FREE Full text] [CrossRef] [Medline]
- Apple Inc. iOS Security. 2015 Jun. URL: http://www.apple.com/business/docs/iOS_Security_Guide.pdf [accessed 2015-08-26] [WebCite Cache]
- U.S. Department of Health & Human Services. Research-Based Web Design & Usability Guidelines, Enlarged/Expanded edition. Washington: U.S. Government Printing Office; 2006. URL: https://www.usability.gov/sites/default/files/documents/guidelines_book.pdf [accessed 2015-08-26] [WebCite Cache]
|ED: emergency department|
|TREKK: Translating Emergency Knowledge for Kids|
Edited by G Eysenbach; submitted 28.09.15; peer-reviewed by J Avis, P Muller, P Paulo Tanaka; comments to author 10.02.16; revised version received 13.04.16; accepted 07.05.16; published 29.06.16
©Shannon D Scott, Lauren Albrecht, Lisa M Given, Danielle Arseneau, Terry P Klassen. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 29.06.2016.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Research Protocols, is properly cited. The complete bibliographic information, a link to the original publication on http://www.researchprotocols.org, as well as this copyright and license information must be included.