Published in Vol 12 (2023)

Decision Trade-Offs in Ecological Momentary Assessments and Digital Wearables Uptake: Protocol for a Discrete Choice Experiment



1Division of Intramural Research, National Institute on Minority Health and Health Disparities, National Institutes of Health, Rockville, MD, United States

2Department of Epidemiology, Bloomberg School of Public Health, Johns Hopkins University, Baltimore, MD, United States

Corresponding Author:

Sherine El-Toukhy, MA, PhD

Division of Intramural Research, National Institute on Minority Health and Health Disparities, National Institutes of Health

11545 Rockville Pike

Rockville, MD

United States

Phone: 1 3015944743


Background: Ecological momentary assessments (EMAs) and digital wearables (DWs) are commonly used remote monitoring technologies that capture real-time data in people’s natural environments. Real-time data are core to personalized medical care and intensively adaptive health interventions. The utility of such personalized care is contingent on user uptake and continued use of EMAs and DWs. Consequently, it is critical to understand the user preferences that may increase uptake of EMAs and DWs.

Objective: The study aims to quantify users’ preferences for EMAs and DWs, examine variations in these preferences across demographic and behavioral subgroups, and assess the association between users’ preferences and intentions to use EMAs and DWs.

Methods: We will administer 2 web-based discrete choice experiments (DCEs), each paired with a self-report survey, to a total of 3260 US adults through Qualtrics. The first DCE will assess participants’ EMA preferences using a choice-based conjoint design that will ask participants to compare the relative importance of prompt frequency, number of questions per prompt, prompt type, health topic, and assessment duration. The second DCE will measure participants’ DW preferences using a maximum difference scaling design that will quantify the relative importance of device characteristics, effort expectancy, social influence, and facilitating technical, health care, and market factors. Hierarchical Bayesian multinomial logistic regression models will be used to generate subject-specific preference utilities. Preference utilities will be compared across demographic (ie, sex, age, race, and ethnicity) and behavioral (ie, substance use, physical activity, dietary behavior, and sleep duration) subgroups. Regression models will determine whether specific utilities are associated with attitudes toward or intentions to use EMAs and DWs. Mixture models will determine the associations of attitudes toward and intentions to use EMAs and DWs with latent profiles of user preferences.

Results: The institutional review board approved the study on December 19, 2022. Data collection started on January 20, 2023, and concluded on May 4, 2023. Data analysis is currently underway.

Conclusions: The study will provide evidence on users’ preferences for EMA and DW features that can improve initial uptake and, potentially, continued use of these remote monitoring tools. The sample size and composition allow for subgroup analyses by demographics and health behaviors and will provide evidence on associations between users’ preferences and EMA and DW uptake intentions. Limitations include the cross-sectional design, which precludes measuring actual behavior; instead, we capture behavioral intentions for EMA and DW uptake. The nonprobability sample limits the generalizability of the results and introduces self-selection bias related to the demographic and behavioral characteristics of participants who belong to web-based survey panels.

International Registered Report Identifier (IRRID): DERR1-10.2196/47567

JMIR Res Protoc 2023;12:e47567



Ecological momentary assessments (EMAs) and digital wearables (DWs) are exemplars of remote monitoring technologies that afford us the ability to monitor the individual and their environment around the clock [1]. Remote monitoring technologies are a hallmark of precision medicine, whereby medical and health decisions can be tailored to the individual based on their unique and up-to-date data [2]. Specifically, EMA is an approach where people’s experiences and behaviors are repeatedly captured through brief self-report surveys [3]. DWs are electronic devices that repeatedly collect, process, store, and transmit data either directly through an internet connection or indirectly through a smartphone [4]. EMAs and DWs capture momentary contextual, sociopsychological, physiological, and behavioral data in the wild (ie, in real time and in a person’s natural environment), which improves the accuracy, granularity, and ecological validity of the measurements [5].

User-generated data on behavioral and medical conditions from EMAs and DWs can be used for diagnostic, management, and treatment purposes [1]. Examples include (1) detection of smoking and alcohol use episodes and lapses [6-8] and of various medical conditions [9] (eg, seizures [10] and atrial fibrillation) [11,12]; (2) management of diabetes [13], hypertension [14], physical activity, and diet [15-18]; and (3) treatment of cigarette smoking [19], sleep [20], and mood disorders [21] by intervening whenever and wherever support is needed. Data from EMAs and DWs can improve health and clinical outcomes through various mechanisms. EMAs and DWs capture events and behaviors, their determinants, and their effects. These data relate to several cognitive processes (eg, self-monitoring) underlying the self-regulatory mechanism that motivates and guides proactive and purposeful actions [22]. Additionally, data from EMAs and DWs represent feedback information to their users that is used to form and assess self-views and goals [23], thereby reinforcing desired behaviors [24]. Indeed, evidence shows that EMA and DW use is associated with improved health outcomes [25,26] (cf [27,28]).

Real-time EMA and DW data are the foundation of intensively adaptive health interventions and personalized medical care. Intensively adaptive interventions use ongoing information about the user to disseminate (or not) an appropriate treatment type and dose at the right time and place, relying on predefined decision rules that accommodate between- and within-user characteristics and other tailoring factors (eg, intrapersonal state and contextual cues) that change by the day, hour, or second over the course of an intervention [29,30]. For example, in Text2Quit, an interactive text messaging smoking cessation intervention, users’ data are periodically updated throughout the intervention (eg, number of cigarettes smoked) and subsequently integrated in pre- and postquit support messages [19]. In Sense2Stop, users wear chest and wrist DWs and answer smoking- and mood-related EMAs, both of which trigger digital prompts for users to engage in exercises to manage stress, a known antecedent of smoking lapse [31]. Although evidence on the efficacy and effectiveness of intensively adaptive interventions is largely based on pilot studies [32], they often outperform nonadaptive interventions [33]. Similarly, in clinical settings, continuous input from remote monitoring tools can be integrated into a patient’s electronic health record, which subsequently informs medical decision-making and improves clinical outcomes [34,35].

To reap the benefits associated with the use of remote monitoring technology, end users must adopt and continue to use these digital tools. Uptake refers to the likelihood that an individual is willing to take part in an EMA or use a DW [36]. Continued use refers to the likelihood that an individual consistently completes the EMAs or uses DWs as prescribed [37]. Rates of EMA and DW uptake and continued use are frequently low or variable. For example, health and fitness mobile apps have a mere 3.7% retention rate 30 days after installation [38], and a third of DW owners stop using their devices within 6 months [39]. When reported, EMA completion rates vary across studies, ranging between 20% and 90% in substance use studies [8] and between 44% and 96% in diet and physical activity studies [16]. Compounding these issues, the operationalization of uptake and continued use is often absent or inconsistent across research studies. For example, some studies report percentages of participants who engage in an activity or meet a previously set threshold of that activity, while others report averages or the exact number of times or number of days participants engage in a given activity [40]. Researchers have documented some of the facilitators of and barriers to the uptake and continued use of EMAs and DWs. Facilitators of use include perceptions of utility, usefulness, ease of use, usability, and having the motivation and ability to use the technology, to name a few [40]. Barriers include technical (eg, technology malfunction and incompatibility) and nontechnical (eg, digital literacy and cost) factors [40].

Uptake and continued use may be improved when the features of EMAs and DWs match the preferences of potential users. However, there is scarce evidence on the relative importance of different attributes and attribute levels of EMAs and DWs that might affect uptake and potentially continued use, although uptake and continued use are driven by different factors [4]. Indeed, the design of EMAs varies greatly regarding key attributes. For instance, the length of EMA studies ranges from 1 to 182 days, with a median of 7 days, whereas the number of daily prompts ranges from 1 to 42 [41]. Additionally, user preferences for specific attributes are rarely examined across sociodemographic and behavioral subgroups, despite well-documented differences in uptake and continued use of remote monitoring technologies between these groups [41,42]. There is also little evidence on the relationship between EMA and DW preferred attributes and intentions to use remote monitoring technologies.

Using a discrete choice experiment (DCE) [43], this study aims to identify the optimal attributes of EMAs and DWs that may increase uptake and continued use. A DCE is a quantitative survey-based methodology that elicits preferences by presenting participants with alternatives and asking them to make a choice. The process is repeated with different combinations of attributes, and the resulting data are used to calculate the relative importance each participant places on each attribute [44]. In previous studies, this method has been used to elicit preferences for health mobile apps [45-47], sharing health data [48,49], and health interventions [50]. In this study, 2 DCEs will elucidate preferences for EMAs and DWs, examine how these preferences vary across demographics (ie, sex, age, and race and ethnicity) and behaviors (ie, substance use, physical activity, dietary behavior, and sleep duration), and assess whether specific preferences are associated with attitudes toward or intentions to use EMAs and DWs in the future.


This study aims to (1) quantify the relative importance of 5 EMA attributes (ie, prompt frequency for surveys, number of questions per prompt, prompt type, health topic, and assessment duration) and 30 DW features centered around 6 domains (ie, device characteristics, effort expectancy, social influence, and facilitating conditions related to technical, health care, and market factors); (2) identify the relative importance of EMA and DW attributes within demographic (ie, sex, age, and race and ethnicity) and behavioral (ie, those who meet vs who do not meet predefined criteria or guidelines for a given behavior) subgroups; and (3) examine the associations between users’ preferences and EMA and DW uptake intentions.


The study centers on 2 DCEs (Multimedia Appendix 1). Each participant will complete only 1 DCE.

The first DCE will assess EMA preferences using a choice-based conjoint design [51,52] in which 5 attributes (eg, number of surveys per day) are presented as a package. Each attribute is assigned one of several levels (eg, 2-3 surveys or 6 or more surveys). Participants choose between 2 packages or neither package. Attributes and levels (Table 1) were selected from previous research on EMAs [15,16,18,41,53]. Collectively, 1280 unique packages (4 × 4 × 4 × 5 × 4 attribute levels) can be constructed from these attribute levels. However, a balanced fractional factorial design will be implemented in Qualtrics so that each participant only needs to repeat the task 9 times, evaluating 18 packages in total.
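As an illustration, the structure of a conjoint choice task can be sketched in a few lines. The attribute names and levels below mirror Table 1; the random pairing of packages is only a stand-in for the balanced fractional factorial design Qualtrics will generate, and all identifiers are illustrative.

```python
import itertools
import random

# EMA attribute levels, mirroring Table 1
attributes = {
    "prompt_frequency": ["0-1/day", "2-3/day", "4-5/day", ">=6/day"],
    "questions_per_prompt": ["1", "2-3", "4-5", ">=6"],
    "prompt_type": ["event-contingent", "signal-contingent", "time-contingent", "mixed"],
    "health_topic": ["tobacco", "alcohol", "marijuana", "physical activity", "nutrition"],
    "duration": ["<=1 month", ">1 to <6 months", "6 to <12 months", ">=1 year"],
}

# Full factorial: every possible package (one level per attribute)
profiles = [dict(zip(attributes, combo))
            for combo in itertools.product(*attributes.values())]

def make_tasks(profiles, n_tasks=9, seed=0):
    """Illustrative only: draw n_tasks pairs of distinct packages plus a
    'neither' option. The actual survey uses a balanced fractional
    factorial design, not random pairing."""
    rng = random.Random(seed)
    tasks = []
    for _ in range(n_tasks):
        a, b = rng.sample(profiles, 2)
        tasks.append({"option_a": a, "option_b": b, "option_c": "neither"})
    return tasks

tasks = make_tasks(profiles)  # 9 tasks, so 18 packages evaluated per participant
```

Random pairing keeps the sketch short; production conjoint designs additionally balance how often each level appears and minimize attribute overlap within a task.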

The second DCE will measure DW preferences using a maximum difference scaling design [54,55] in which participants view a subset of 4 features from a larger set of 30 features (Table 2). Participants choose which feature in the subset they value the most and which they value the least. The attributes and corresponding statements were adapted and developed from previous research on DWs [4,56-60] and digital health tools more broadly [40,46,48,50,61-65]. Attribute domains correspond to constructs derived from technology acceptance theoretical frameworks [66]. The attributes and corresponding statements center around effort expectancy, defined as the level of ease in using the technology; social influence, defined as the level of support the user of the technology receives from important others; and facilitating conditions related to technical infrastructure, health care, and market factors that can prohibit or support the technology use [66]. We also include device characteristics as an external factor associated with technology acceptance [59]. Although 27,405 subsets of 4 features are possible (30 choose 4), each participant will only be required to complete the task 23 times.
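A quick calculation verifies the scale of the maximum difference scaling design and sketches a single best-worst task. Feature names are placeholders, and the random "responses" stand in for real participant choices.

```python
from math import comb
import random

N_FEATURES = 30   # attributes in Table 2
SUBSET_SIZE = 4   # features shown per task
N_TASKS = 23      # tasks completed per participant

# Number of distinct 4-feature subsets that could in principle be shown
n_subsets = comb(N_FEATURES, SUBSET_SIZE)   # 27,405

# Average number of times each feature is seen by one participant,
# a common balance check for MaxDiff designs (typically ~3-5 exposures)
avg_exposures = N_TASKS * SUBSET_SIZE / N_FEATURES

def one_task(features, rng):
    """Illustrative best-worst task: show 4 features; the respondent marks
    the most and least valued (random stand-ins here)."""
    shown = rng.sample(features, SUBSET_SIZE)
    best = rng.choice(shown)
    worst = rng.choice([f for f in shown if f != best])
    return {"shown": shown, "best": best, "worst": worst}

features = [f"feature_{i}" for i in range(N_FEATURES)]
task = one_task(features, random.Random(1))
```

With 23 tasks of 4 features each, every participant sees each feature about 3 times on average, which is why far fewer than 27,405 tasks suffice.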

Table 1. Attributes and attribute levels in an ecological momentary assessment discrete choice experiment.

Attributes and attribute levels: Descriptions

Prompt frequency
  0-1 per day: None or only 1 survey
  2-3 per day: 2-3 surveys
  4-5 per day: 4-5 surveys
  ≥6 per day: 6 or more surveys

Number of questions per prompt
  1 question: 1 question
  2-3 questions: 2-3 questions
  4-5 questions: 4-5 questions
  ≥6 questions: 6 or more questions

Prompt type
  Event-contingent: Self-initiated when a predefined event has occurred (eg, smoking a cigarette, snacking between meals, or being in a specific location such as a bar)
  Signal-contingent: At random times
  Time-contingent: At fixed times
  Mixed: Combination of random and fixed times

Health topic
  Nicotine or tobacco use: Nicotine or tobacco use
  Alcohol use: Alcohol drinking
  Marijuana use: Marijuana use
  Physical activity: Exercise
  Nutrition: Diet or nutrition

Assessment duration
  ≤1 month: 1 month or shorter
  >1 but <6 months: More than 1 month but less than 6 months
  ≥6 but <12 months: 6 months or more but less than 1 year
  ≥1 year: 1 year or longer
Table 2. Attributes examined in the digital wearables discrete choice experiment.

Domains and attributes: Statements

Device characteristics

  • The device collects nonmedical data (eg, steps)
  • The device collects medical data (eg, blood glucose)

Point of contact
  • I must wear the device on my chest or ankle

Battery life
  • The device must be charged every 48 hours

  • The device has a touchscreen interface

  • The device has a memory chip to prevent data loss

Cellular data
  • The device must be always connected to the internet

  • I can personalize the device to my individual goals and preferences
  • I can change the look and feel of the device (eg, by using different strap styles and colors)

Added features
  • The device provides information in languages other than English (eg, Spanish)
  • The device allows me to interact with other users if I want to

  • The accuracy of the device has been proven in scientific studies

Effort expectancy

Ease of use
  • The device is easy to use

Data sync
  • The device syncs automatically with a smartphone

  • The device fits with my lifestyle and daily activities
  • The device can cause skin irritations

Social influence

Peer recommendation
  • A friend or family member recommended the device to me

Facilitating technical factors

  • The device is compatible with all smartphones (eg, Android and Windows)
  • The device is compatible with popular health apps

Data security and privacy
  • I can enable or disable location tracking on the device
  • I get to choose when and how the device shares data
  • My data are encrypted
  • My data will be sold to third parties for profit

Facilitating health care factors

  • The device is covered by health insurance

Doctor recommendation
  • My doctor recommended the device to me

Data exchange
  • I can share the data with my doctor

Facilitating market factors

  • The device is under US $150

Customer reviews
  • The device has a high customer rating

  • The device comes with a 2-year warranty

Customer service
  • There is a number I can call if the device stops working


A nonprobability sample of 3260 US adults, aged 18 years and older, will be recruited nationwide to participate in a web-based survey. The sample will be equally split across biological sex, age groups, and race and ethnicity to allow comparisons of users’ preferences within demographic subgroups (Table 3). Sample size calculations were performed using standard DCE formulas [67,68]. A sample of 333 participants is required for the EMA DCE based on the formula n ≥ 1000x/(yz), where x is the maximum number of levels of any attribute, y is the number of choices per task excluding the option none, and z is the number of times the task is completed per participant [ie, 1000(6)/(2 × 9) ≈ 333]. A sample of 163 participants is required for the DW DCE based on the formula n ≥ 500w/(yz), where w is the number of features and y and z are defined as in the previous formula [ie, 500(30)/(4 × 23) ≈ 163]. However, the minimum sample for the DW DCE will be set at 300 participants in accordance with recommended guidelines [68]. Based on these calculations, the study should be well powered even when the sample is further stratified by demographic and behavioral subgroups.
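The rules of thumb above can be reproduced directly. The function names and the constant k are our own; the inputs (x = 6, y = 2, z = 9 for the EMA DCE; w = 30, y = 4, z = 23 for the DW DCE) are the values reported in this protocol.

```python
def ema_min_n(max_levels, choices_per_task, n_tasks, k=1000):
    """Rule-of-thumb minimum sample for a choice-based conjoint DCE:
    n >= k * x / (y * z)."""
    return k * max_levels / (choices_per_task * n_tasks)

def maxdiff_min_n(n_features, shown_per_task, n_tasks, k=500):
    """Rule-of-thumb minimum sample for a MaxDiff DCE:
    n >= k * w / (y * z)."""
    return k * n_features / (shown_per_task * n_tasks)

ema_n = ema_min_n(max_levels=6, choices_per_task=2, n_tasks=9)     # ~333.3
dw_n = maxdiff_min_n(n_features=30, shown_per_task=4, n_tasks=23)  # ~163.0
dw_n = max(dw_n, 300)  # the protocol raises the MaxDiff minimum to 300
```

Because these are per-subgroup floors, recruiting 3260 participants leaves ample headroom for the demographic and behavioral stratifications described above.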

Table 3. Target sample characteristics (N=3260).

Characteristics: Participants, n (%)

Biological sex
  Female: 1630 (50)
  Male: 1630 (50)

Age (years)
  18-29: 815 (25)
  30-44: 815 (25)
  45-59: 815 (25)
  ≥60: 815 (25)

Race and ethnicity
  Non-Hispanic American Indian, Alaska Native, Native Hawaiian, or Pacific Islander: 652 (20)
  Non-Hispanic Asian: 652 (20)
  Hispanic or Latino: 652 (20)
  Non-Hispanic Black: 652 (20)
  Non-Hispanic White: 652 (20)


Through its research panels, Qualtrics will send an invitation email to individuals whose demographic characteristics match the target sample. The email includes links to 2 otherwise identical surveys: the first contains the EMA DCE, and the second contains the DW DCE. The email also states the estimated time to complete the survey and the incentive offered. Qualtrics will compensate participants and distribute the incentives at a rate equivalent to surveys of similar scope, burden, and duration. Data collection will continue until the desired sample size and composition are reached.


Participants will consent before proceeding to any survey questions. Consenting participants will respond to inclusion and exclusion questions. To be included in the study, participants must be adults residing in the United States. Inclusion questions include zip code, age, biological sex, and race and ethnicity. The latter 3 questions are the basis of built-in hidden quotas for sample characteristics (Table 3). When a quota is met (eg, 50% females already recruited), participants who belong to that subgroup will not be allowed to take the survey. Participants are excluded only if they respond “yes” to “Do you use any fitness tracker, smartwatch, or electronic medical device to monitor or track your health?” [69]. Participants who use a DW are excluded to improve inferences about preferences among nonusers. Ineligible participants receive a thank you message and are not allowed to proceed any further. Eligible participants will proceed to the remainder of the study questions. Except for completing 1 of the 2 DCEs, all instructions and questions are identical for all participants. The survey will take about 30 minutes to complete.
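The hidden-quota screening flow can be sketched as follows. Field names and the in-memory counters are illustrative (Qualtrics manages quotas internally), and only the sex and age quotas from Table 3 are shown for brevity.

```python
# Quota targets mirroring part of Table 3 (race/ethnicity omitted for brevity)
TARGETS = {
    ("sex", "Female"): 1630, ("sex", "Male"): 1630,
    ("age", "18-29"): 815, ("age", "30-44"): 815,
    ("age", "45-59"): 815, ("age", ">=60"): 815,
}

counts = {key: 0 for key in TARGETS}

def screen(participant):
    """Return True if the participant is eligible and every quota cell
    they belong to is still open; otherwise False."""
    if participant.get("uses_wearable"):       # exclusion: current DW user
        return False
    if participant.get("age_years", 0) < 18:   # inclusion: adults only
        return False
    cells = [("sex", participant["sex"]), ("age", participant["age_group"])]
    if any(counts[c] >= TARGETS[c] for c in cells):
        return False                           # quota already met
    for c in cells:
        counts[c] += 1
    return True
```

The key design point is that quotas are checked before the survey proper begins, so no participant is dropped after investing effort in the 30-minute instrument.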


We will collect sociodemographic (ie, zip code, biological sex, age, race and ethnicity, sexual orientation, education, household income, employment, marital status, and English language proficiency) [69,70] and health (ie, weight and height [69], general health [71], emotional health [72], perceived susceptibility to disease [73], underlying health conditions [70], having a regular health care provider [69], and health insurance [69]) data. We will also collect behavioral data on nicotine and tobacco use [74,75], marijuana use [76], alcohol use [77], fruit and vegetable intake [69], physical activity [69], and sleep [69], all of which are risk factors for chronic conditions and are thus the focus of health and medical interventions. These behavioral data allow comparisons of users’ preferences within behavioral subgroups. Because each participant may engage in any number of the 6 behaviors examined in the study, we follow public health guidelines to classify participants, for each behavior, into 1 of 2 groups: those who do versus do not meet the relevant guideline or threshold, as follows: (1) Nicotine and tobacco use is defined as current use of any nicotine or tobacco product [78]. (2) Marijuana use is defined as current use of marijuana or cannabis [78]. (3) Alcohol misuse is defined as consumption of >1 drink per day for women and >2 drinks per day for men, or as total consumption of >7 drinks per week for women and >14 drinks per week for men [79,80]. Alcohol use questions are asked only of participants 21 years or older. (4) Insufficient physical activity is defined as <150 minutes (2.5 hours) of physical activity per week [81]. (5) Inadequate fruit and vegetable intake is defined as <5 total servings of fruits and vegetables per day [79]. (6) Insufficient sleep is defined as <7 hours of sleep per night [82].
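A minimal sketch of this behavioral grouping, assuming illustrative self-report field names; True flags a behavior for which the participant does not meet the guideline or threshold defined above.

```python
def classify_behaviors(p):
    """Flag each of the 6 study behaviors for one participant.
    Input keys are illustrative self-report fields, not survey item names."""
    flags = {
        "tobacco_use": p["uses_nicotine_or_tobacco"],        # any current use
        "marijuana_use": p["uses_marijuana"],                # any current use
        "insufficient_activity": p["activity_min_per_week"] < 150,
        "inadequate_fruit_veg": p["fruit_veg_servings_per_day"] < 5,
        "insufficient_sleep": p["sleep_hours_per_night"] < 7,
    }
    # Alcohol misuse: sex-specific daily and weekly limits, asked only of 21+
    if p["age_years"] >= 21:
        daily_limit, weekly_limit = (1, 7) if p["sex"] == "Female" else (2, 14)
        flags["alcohol_misuse"] = (p["drinks_per_day"] > daily_limit
                                   or p["drinks_per_week"] > weekly_limit)
    return flags
```

Treating each behavior as an independent binary flag is what allows the planned subgroup comparisons (meeting vs not meeting guidelines) to be run separately per behavior.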

We will also collect data on access to and use of digital technologies [69,83]; phone affinity [84]; previous use of [69], attitudes toward [85], and satisfaction with [86] health applications; attitudes toward wearable devices [85]; technology acceptance [59,87-89]; and willingness and intentions to use EMAs and DWs.

Data Analysis

Data from both DCEs will be analyzed using hierarchical Bayesian multinomial logistic regression models [90,91]. These models will generate subject-specific utilities quantifying the relative importance of each attribute to each participant. EMA and DW attribute preferences will be characterized among all participants and within subgroups defined by sex (female or male), age group (18-29 years, 30-44 years, 45-59 years, or 60 years or older), race and ethnicity (Non-Hispanic American Indian, Alaska Native, Native Hawaiian, or Pacific Islander; Non-Hispanic Asian; Hispanic or Latino; Non-Hispanic Black; or Non-Hispanic White), or health behavior (meeting vs not meeting established public health guidelines for each behavior). Differences between subgroups will be evaluated using t tests, analysis of variance, or linear regression, as appropriate. The association between each utility and each measure of attitude toward or intention to use EMAs and DWs will be depicted in correlation matrices generated from data comprising all participants or data from specific demographic or behavioral subgroups. A latent profile analysis [92,93] of individual-level utilities will be performed to detect segments of individuals who share common preferences. The extent to which demographics or behavioral characteristics predict membership in these segments will be quantified using the 3-step method [94,95]. The Bolck, Croon, and Hagenaars method [96] will evaluate whether specific segments are associated with specific attitudes or intentions.
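At the core of the hierarchical Bayesian model is the multinomial logit kernel, which maps a participant's part-worth utilities to choice probabilities. The sketch below shows that kernel only; the utility values are invented for demonstration, and the full model additionally places a population-level prior over each participant's utilities.

```python
import math

def choice_probabilities(utilities):
    """Multinomial logit: P(choose j) = exp(V_j) / sum_k exp(V_k).
    In the hierarchical Bayesian model, each participant's part-worth
    utilities V are drawn from a population-level distribution."""
    m = max(utilities)                       # subtract max for numerical stability
    exps = [math.exp(u - m) for u in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# One illustrative EMA task: total utility of option A, option B, and "neither"
# (each option's utility is the sum of its attribute-level part-worths;
# the numbers here are made up for demonstration)
v_a, v_b, v_none = 0.8, -0.3, 0.0
probs = choice_probabilities([v_a, v_b, v_none])
```

Fitting the hierarchy yields a utility vector per participant, which is what enables the planned subgroup comparisons and the latent profile analysis of individual-level preferences.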

Ethical Considerations

The study was deemed exempt by the National Institutes of Health Institutional Review Board (001208) on December 19, 2022, under category 2: research that only includes interactions involving educational tests, survey procedures, interview procedures, or observation of public behavior [45 CFR 46.104(d)(2)].

Data collection started on January 1, 2023, and concluded on May 4, 2023. Data analysis is ongoing.

This protocol outlines a survey-based DCE to identify attributes associated with the uptake of EMAs and DWs. EMAs and DWs are remote monitoring technologies that capture real-time data in users’ natural environments; such data are foundational to precision medicine. These data-driven approaches to health and medical care are the face of an increasingly digital ecosystem and align with the emphasis on person-centered care [97].

The study will generate insights on the optimal attributes and features that users value to maximize the uptake of EMAs and DWs. Such insights have significant health and medical implications given the ubiquity of mobile technologies. In 2021, smartphone ownership was at 85% in the United States [98] and there were approximately 100,000 health care apps on the Apple App Store and Google Play Store [99,100]. As of 2022, wearables penetration was at 25.3% among US adults [101] and is forecast to reach 628.3 million devices globally by 2026 [102]. Additionally, mobile health technologies enjoy acceptance and demand from users and patients [103]; interest of commercial, research, and health care entities [104,105]; and support of national and international policies and initiatives [106,107]. These technologies have given rise to terms like digital biomarkers, digital diagnostics, digital therapeutics, and digital treatments [1]. They are becoming common in medical research as well as available directly to users who can engage in self-care within (eg, electronic health records) [108] or outside (eg, faith organizations) [109] the traditional health care system. This is especially relevant given the prevalence of chronic conditions (eg, heart disease) and associated modifiable risk factors (eg, tobacco use) that are suited for remote monitoring [110]. Programs that integrate telemonitoring into clinical and medical care for managing and treating chronic conditions are being developed and piloted [26,34], providing a far-reaching platform for health care delivery that can benefit underserved populations such as those with limited access to health care or those who disproportionately bear the highest burden of disease and risk factors [111].

The study has several limitations. Participants belong to web-based survey panels, rendering the results nongeneralizable to the US population and raising concerns about self-selection bias. The Qualtrics platform does not permit the inclusion of 2 DCEs in the same survey. Accordingly, participants received an invitation email with links to both surveys but were permitted to take only one. Although current use of a wearable device is an exclusion criterion for this study, the sample can include ex-users of wearables who may hold preexisting attitudes toward DWs based on previous experiences. Because of the cross-sectional nature of the survey, we capture behavioral intentions rather than EMA and DW use behavior. To avoid survey burden, we limited the number of EMA attributes and DW features. The study has several strengths, including a robust sample size that allows for subgroup analysis by sociodemographic and behavioral groups. The study also examines the preferences-intentions relationship that provides actionable information to increase the uptake of EMAs and DWs, especially among at-risk or disadvantaged populations.


This work and the effort of SE, GZ, and PH were supported by the Division of Intramural Research of the National Institute on Minority Health and Health Disparities, National Institutes of Health.

Data Availability

Data sharing is not applicable to this article as no data sets were generated or analyzed during this study.

Authors' Contributions

SE conceptualized the study and drafted the manuscript. JP developed the statistical analysis plan. GZ and PH built and tested the web-based surveys. All authors approve the manuscript as submitted.

Conflicts of Interest

None declared.

Multimedia Appendix 1

Discrete choice experiment tasks.

DOCX File , 15 KB

  1. Sim I. Mobile devices and health. N Engl J Med 2019;381(10):956-968 [CrossRef] [Medline]
  2. Kosorok MR, Laber EB. Precision medicine. Annu Rev Stat Appl 2019;6:263-286 [] [CrossRef] [Medline]
  3. Shiffman S, Stone AA, Hufford MR. Ecological momentary assessment. Annu Rev Clin Psychol 2008;4:1-32 [CrossRef] [Medline]
  4. Canhoto AI, Arp S. Exploring the factors that support adoption and sustained use of health and fitness wearables. J Mark Manage 2016;33(1-2):32-60 [CrossRef]
  5. Hamaker EL, Wichers M. No time like the present: discovering the hidden dynamics in intensive longitudinal data. Curr Dir Psychol Sci 2017;26(1):10-15 [CrossRef]
  6. Saleheen N, Ali AA, Hossain SM, Sarker H, Chatterjee S, Marlin B, et al. puffMarker: a multi-sensor approach for pinpointing the timing of first lapse in smoking cessation. Proc ACM Int Conf Ubiquitous Comput 2015;2015:999-1010 [] [Medline]
  7. Alharbi R, Shahi S, Cruz S, Li L, Sen S, Pedram M, et al. SmokeMon: unobtrusive extraction of smoking topography using wearable energy-efficient thermal. Proc ACM Interact Mob Wearable Ubiquitous Technol 2023;6(4):1-25 [CrossRef]
DCE: discrete choice experiment
DW: digital wearable
EMA: ecological momentary assessment

Edited by A Mavragani; submitted 24.03.23; peer-reviewed by H Yin, H Kim; comments to author 31.07.23; revised version received 05.08.23; accepted 07.08.23; published 25.09.23


©Sherine El-Toukhy, James Russell Pike, Gabrielle Zuckerman, Phillip Hegeman. Originally published in JMIR Research Protocols, 25.09.2023.

This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Research Protocols, is properly cited. The complete bibliographic information, a link to the original publication, as well as this copyright and license information must be included.