This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Research Protocols, is properly cited. The complete bibliographic information, a link to the original publication on https://www.researchprotocols.org, as well as this copyright and license information must be included.
Ecological momentary assessment (EMA) is an innovative tool for capturing in-the-moment health behaviors as people go about their daily lives. EMA is an ideal tool to measure weight-related behaviors, such as parental feeding practices, stress, and dietary intake, as these occur on a daily basis and vary across time and context. A recent systematic review recommended standardized reporting of EMA design for studies that address weight-related behaviors.
To answer the call for reporting study designs using EMA, this paper describes in detail the EMA design of the Family Matters study.
A detailed description of the EMA strategies, protocols, and methods used in phase 1 of the Family Matters study is provided.
The results from this study provided an important next step in identifying best practices for EMA use in assessing weight-related behaviors in the home environment.
DERR1-10.2196/30525
Ecological momentary assessment (EMA) is an innovative method used to capture real-time information about people’s health behaviors (eg, eating, physical activity, parental feeding practices, and stress and mood), which is becoming more commonly used by researchers who study weight-related behaviors. Although EMA is new to the field of weight-related health, it has been used for decades in other related fields such as smoking cessation, eating disorders, chronic pain, and sleep [
EMA was developed as a dynamic tool in response to the static nature of other self-report tools, such as retrospective surveys, which are subject to both random error and systematic bias [
Within the field of weight-related health behavior research, EMA is being used in the assessment of behaviors such as dietary intake and physical activity [
A systematic review that evaluated the use of EMA to assess weight-related behaviors in youth and their families recommended standardizing the reporting of EMA measures across studies as details about EMA design are often absent and can lead to misinterpretation of study results [
Concerns have also been raised regarding the use of EMA with participants from low-income or low-educational attainment households and in populations who may be less technologically savvy [
Given the increased use of EMA in the field of weight-related health research and its high potential for measuring important weight-related behaviors in the home environment, this study seeks to provide a detailed description of the EMA strategies, protocols, and methods used in the
Participants from phase 1 of the Family Matters study are described in the table below.
In phase 1, mixed methods data were collected from family participants during an 8- to 10-day period, which included 2 home visits (parents or primary guardians were registered for EMA on an iPad mini [Apple Inc] at the first home visit, as described below) and an 8-day observation period between home visits during which EMA data were collected. Detailed information about the other phase 1 measures collected besides EMA (eg, 24-hour dietary recalls, home food inventory, accelerometry, video-recorded task, and qualitative interviews) during in-home visits has been published elsewhere [
All study materials, including EMA survey questions, were translated from English into Spanish, Hmong, and Somali. The following process was used in the translation of materials: (1) a bilingual and bicultural team member translated materials into Spanish, Hmong, or Somali; (2) two additional bilingual and bicultural team members reviewed the translated materials; and (3) the 3 translators met to resolve any differences in translation, focusing on capturing the intent of the English question. The Institutional Review Board Human Subjects Committee of the University of Minnesota approved all protocols used in both phases of the
Table. Characteristics of primary caregivers in phases 1 and 2.

| Participant characteristic | Phase 1 (n=150) | Phase 2 (n=1307) |
| --- | --- | --- |
| Female, n (%) | 137 (91) | 1171 (90) |
| Age (years), mean (SD) | 34.5 (7.1) | 35.7 (7.9) |
| Born in the United States, n (%) | 87 (58) | 859 (66) |
| Years lived in the United States (among foreign-born), n (%) |  |  |
| ≤1 | 1 (2) | 8 (2) |
| 1 to ≤5 | 5 (8) | 52 (12) |
| 5-10 | 8 (13) | 51 (11) |
| ≥10 | 48 (76) | 336 (75) |
| Not reported | 1 (1) | —a |
| Race and ethnicity, n (%) |  |  |
| Native American | 25 (17) | 211 (16) |
| Hmong | 25 (17) | 226 (17) |
| Black | 25 (17) | 280 (21) |
| White | 25 (17) | 239 (18) |
| Somali or Ethiopian | 25 (17) | 136 (10) |
| Hispanic | 25 (17) | 215 (16) |
| Survey language, n (%) |  |  |
| English | 107 (71) | 1148 (88) |
| Spanish | 16 (11) | 134 (10) |
| Hmong | 7 (5) | 8 (1) |
| Somali | 20 (13) | 17 (1) |
| Educational attainment, n (%) |  |  |
| Some high school | 32 (21) | 183 (14) |
| High school or associate degree | 88 (59) | 521 (40) |
| Some college or bachelor's degree | 11 (7) | 409 (31) |
| Graduate degree | 18 (12) | 194 (15) |
| Not reported | 1 (1) | — |
| Household income (US $), n (%) |  |  |
| ≤20,000 | 50 (33) | 393 (30) |
| 20,000-34,999 | 55 (37) | 323 (25) |
| 35,000-49,999 | 16 (11) | 203 (16) |
| 50,000-74,999 | 12 (8) | 143 (11) |
| 75,000-99,999 | 7 (5) | 75 (6) |
| ≥100,000 | 9 (6) | 159 (12) |
| Not reported | 1 (1) | 11 (1) |
aNo data were missing from phase 2.
For the Family Matters study, parents completed 3 types of EMA surveys:
event-based or event-contingent messaging, where parents completed a survey after every meal occurrence they shared with the study child aged 5-7 years;
time-based or signal-contingent, which assessed momentary constructs (eg, stress and coping) at random intervals throughout the day;
end-of-day, where parents provided an overall summary of their day. All surveys were designed with a calm, soothing blue color scheme.
Screenshots of ecological momentary assessment survey questions answered by Family Matters participants.
At the end of the first home visit, parents worked with research staff on reviewing EMA instructions and protocols and registering for EMA on a study-provided iPad mini. During the visit, research team members logged into a web-based EMA registration portal developed by the Family Matters study team.
A research team member reviewed a binder of EMA instructions with the parent, which was then left in the home. This binder included (1) descriptions of the surveys to be completed, as well as the number of daily surveys to complete; (2) screenshots reminding participants how to access EMA surveys; (3) basic information about the iPad mini, such as charging, turning it on and off, turning up the volume (to hear survey notifications), and finding the home screen; and (4) contact information for the study team. In addition, before the end of the first home visit, the parent completed a practice event-contingent meal survey. This ensured that surveys were submitted appropriately and helped introduce parents to EMA surveys. The practice surveys were excluded from the analysis. After the second home visit, the iPad mini was collected from the home, and the participant was deactivated from the EMA system.
Parents were instructed to complete an event-contingent survey (or meal survey) after every meal the study child ate when they (the parent) were present. Parents accessed the meal survey by clicking the EMA survey icon on the iPad mini home screen.
Parents were sent 4 signal-contingent surveys per day. Signal-contingent surveys were spaced in blocks across the parent's waking hours, beginning after the parent woke up and ending before bedtime (wake and sleep times were reported during EMA registration).
In addition, for the Family Matters study, each signal-contingent survey began by asking whether the parent had shared a meal with the study child that had not yet been reported.
Parents were sent the final (fifth) survey at the start of the final scheduled block (determined by parents’ wake and sleep time). As with the signal-contingent surveys, the end-of-day survey began by asking whether the parent had shared a meal with the study child since either waking up or completing the last survey that had not yet been reported. Unlike signal-contingent surveys, which asked about in-the-moment measures (eg, How stressed are you right now?), the end-of-day survey asked for an overall assessment of the day (eg, Overall, how stressed were you today?). To help ensure that the end-of-day survey was completed, participants were given 6 hours to complete the final survey. The last question of the end-of-day survey asked the parents to assess how difficult it was for them to fill out the surveys during the day.
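The block-based scheduling described above can be sketched in code. The following is a simplified illustration only, not the study's actual system: it assumes the waking day is divided into 5 equal blocks (one random signal-contingent prompt in each of the first 4, with the end-of-day survey at the start of the fifth), which is an interpretation of the protocol rather than a documented algorithm.

```python
import random
from datetime import datetime, timedelta

def schedule_surveys(wake, sleep, n_signal=4, seed=None):
    """Divide the waking day (wake to sleep) into n_signal + 1 equal blocks.
    One signal-contingent prompt is placed at a random time within each of
    the first n_signal blocks; the end-of-day survey is sent at the start
    of the final block (hypothetical logic, for illustration only)."""
    rng = random.Random(seed)
    block = (sleep - wake) / (n_signal + 1)
    signal_times = [wake + i * block + block * rng.random() for i in range(n_signal)]
    end_of_day = wake + n_signal * block  # start of the final block
    return signal_times, end_of_day

# Example: a parent who wakes at 7:00 and sleeps at 22:00 has 3-hour blocks;
# prompts fall within 7-10, 10-13, 13-16, and 16-19, and the end-of-day
# survey is sent at 19:00.
signals, eod = schedule_surveys(datetime(2024, 1, 1, 7), datetime(2024, 1, 1, 22), seed=1)
```

Randomizing prompt times within blocks (rather than fixed times) is what makes the design signal-contingent in the sense used above: behaviors are sampled at unpredictable moments while still guaranteeing coverage across the day.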
All phase 1 participants (n=150) completed 8 days of EMA. The decision to include 8 days instead of 7 was based on prior research using observational methods, suggesting the need to allow for an initial acclimation day, as participants may alter their behavior when an observation period begins.
A computer programmer (third author, MJ) with experience in technology-assisted research methods developed the Family Matters EMA system.
Staff members were involved in phase 1 EMA in the following ways: (1) educating participants on how to use EMA and registering them in the EMA system at the first home visit, (2) tracking participants to ensure EMA surveys were completed, (3) remotely troubleshooting EMA issues with participants, and (4) deactivating participants from the EMA system at the end of their observation window.
To ensure that the participants were able to meet the minimum criteria for a complete EMA day and for staff to be able to identify any participant’s troubleshooting needs, our EMA programmer built a web-based tracker. The staff members were able to monitor participants’ EMA progress, including seeing when signal-contingent and end-of-day surveys were scheduled to be sent and when surveys were started and completed. Contact information for the participant, language of the surveys, and the participant’s wake and sleep times were also identified. The tracker also allowed the staff to make some changes to the EMA (eg, change the survey language) without burdening the EMA programmer.
Using the EMA tracker, staff members were instructed to contact participants (1) if the participant did not have a complete EMA day on their first observation day or (2) if the participant went more than 2 days without completing a full EMA day.
The most common EMA difficulty was some participants’ inability to complete EMA surveys while at work, particularly because some participants were not allowed to have the iPad mini with them during their shift. In these cases, the staff worked with parents on a case-by-case basis. Most parents with difficulties completing surveys because of work were able to meet the minimum requirements (eg, completing a signal-contingent survey before work and another after their shift). A small number of parents had work schedules that did not allow for the completion of EMA (eg, a nurse working a 12-hour shift). In these rare cases, parents were asked to complete the EMA on nonwork days, and extra days were added to the observation period to ensure that 8 days could be completed. Separately, although this information was provided in the registration binder, some parents were unfamiliar with iPads or tablets and needed additional remote assistance (eg, learning how to close a survey tab and return to the home screen).
Depiction of the ecological momentary assessment tracker used in phase 2 of the Family Matters study.
Overall, participants completed 8 complete days of EMA (ie, days with a minimum of 2 signal-contingent surveys, 1 meal survey, and 1 end-of-day survey) over an average observation period of 10.5 days (SD 7.5 days).
Regarding meal surveys, participants completed an average of 3.7 (SD 1.5) meal surveys on weekdays and 4.3 (SD 1.6) on weekend days. There was no variation in this pattern across the 6 racial and ethnic groups. As described earlier, participants could complete a meal survey by either (1) self-initiating the survey after sharing a meal with their child or (2) as part of the signal-contingent prompt (eg, if they forgot to submit a meal that was previously eaten). There was no difference in participants’ patterns of self-initiating meal surveys in the first half of the observation period compared with the second half (ie, participants did not stop self-initiating meal surveys once they understood that meal surveys could also be taken as part of the signal-contingent survey).
For most analyses of phase 1 data, the data set of 1200 days (150 participants × 8 days) was used. However, there were times when using all submitted data, including surveys from incomplete days, was more appropriate for a given analysis.
EMA data can take both wide and long data formats, resulting in complex data management and analytical needs. As part of the data cleaning and management protocol, the analyst team developed a reference document that contained key information about how to explore the data, investigate missingness, describe panel data frequencies, and merge multiple sources of study data for analysis. The purpose of this document was to establish consistent data integrity procedures to ensure that analysts used descriptive and inferential procedures appropriate for intensive longitudinal data. Furthermore, EMA data collection often exploits the nested structure of the data (ie, repeated surveys within days within participants), which calls for analytic methods that account for within-person clustering.
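The wide-versus-long distinction mentioned above can be made concrete with a small example. The sketch below is illustrative only; the record fields (`pid`, `day`, `survey`, `stress`) are hypothetical and do not reflect the study's actual variable names. It pivots long-format EMA records (one row per completed survey) into a wide per-day format (one row per participant-day, with each survey's rating in its own column).

```python
from collections import defaultdict

# Hypothetical long-format EMA records: one row per completed survey.
long_records = [
    {"pid": 101, "day": 1, "survey": "signal_1", "stress": 3},
    {"pid": 101, "day": 1, "survey": "signal_2", "stress": 5},
    {"pid": 101, "day": 1, "survey": "end_of_day", "stress": 4},
    {"pid": 101, "day": 2, "survey": "signal_1", "stress": 2},
]

def long_to_wide(records):
    """Pivot to one row per participant-day; each survey's stress rating
    becomes its own column. Surveys missing from a day simply produce no
    column, making missingness easy to inspect."""
    wide = defaultdict(dict)
    for r in records:
        wide[(r["pid"], r["day"])][f"stress_{r['survey']}"] = r["stress"]
    return {k: dict(v) for k, v in wide.items()}

wide = long_to_wide(long_records)
# wide[(101, 1)] → {'stress_signal_1': 3, 'stress_signal_2': 5, 'stress_end_of_day': 4}
```

Long format is generally preferred for multilevel models of intensive longitudinal data, whereas wide format is convenient for computing per-day completion criteria, which is why a documented, repeatable reshaping step belongs in the data management protocol.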
In phase 2, 1307 diverse parent-child dyads took a web-based survey at 2 time points, approximately 18 months apart, and approximately half of these families (n=627) were also eligible for enrollment in our EMA subsample. Participants were enrolled in the study between 2016 and 2018. The EMA survey design for phase 2 differed slightly from that of phase 1 in the following ways: (1) parents were asked to complete up to 4 surveys per day, including 3 signal-contingent surveys during the day and an end-of-day survey that combined the signal-contingent questions (eg, stress level) and event-contingent meal questions specific to the family’s dinner meal; (2) parents had to complete a minimum of 2 signal-contingent surveys and the end-of-day survey for a day to be considered complete; and (3) parents were asked to complete 7 complete days of EMA rather than 8.
Whether to again provide iPad minis or to use participants’ own phones was deliberated for phase 2 EMA. The main concern was whether all participants had access to a smartphone. Ultimately, we decided to use smartphones, given phase 1 participant feedback that the iPad minis were cumbersome. Thus, in phase 2, all participants chose to use their own phones, even though the study was prepared to provide a device to any participant without a smartphone.
As phase 2 used participants’ smartphones, we were unable to use the same approach of placing an icon on the EMA device (ie, the iPad mini) and therefore developed a new approach. One approach considered was the development of an app, but there were many reasons why this was not ideal for our study. First, an app would be costly and time consuming to develop, and apps would have to be created for the different mobile phone operating systems used by participants (eg, iOS and Android). In addition, there was concern that participants might have trouble with, or be hesitant about, downloading an app and that it would require regular updates. Ultimately, the process for phase 2 EMA included participants being sent an SMS text message every time a survey was available, which contained the survey link that participants followed to access the web-based survey. As there was one unique survey link per participant, participants could follow the link from any SMS text message (ie, not only the most recent one) to access EMA surveys. Participants also had the option to have survey notifications sent to their email addresses if they notified the staff that this was a better fit (eg, participants who worked primarily in front of a computer). Participants were alerted when enrolling in the study that they would receive SMS text messages and would be responsible for any SMS text message charges incurred through their mobile phone plan.
Registration for phase 2 EMA was performed remotely by the participant. After completing the web-based survey, parents who reported more than 3 family meals per week were given the opportunity to participate in an optional EMA substudy. This eligibility criterion aligned with the study aim of examining momentary mealtime routines and behaviors. Participants were able to download a form with substudy information about EMA requirements (eg, number of days and surveys needed); interested participants were given an access code and directed to a web-based form to consent to the optional EMA substudy, after which they were automatically directed to the EMA registration page. To ensure that participants would receive texted survey links, participants were sent a test SMS text message after registration. Although staff were available to assist if necessary (and could even register participants on the web if needed), overall, participants registered themselves for EMA and understood the requirements (eg, the number of surveys to complete) without requiring staff assistance. An important lesson learned from phase 2 of the Family Matters study was that diverse participants were able to self-register for and complete EMA with minimal staff involvement.
Multiple features were added to the phase 2 EMA to support participant compliance and reduce staff time. First, multiple reminder SMS text messages were built into the signal-contingent and end-of-day survey windows. For signal-contingent surveys (expiration of 1 hour), participants received an initial SMS text message with the survey link and an SMS text message alerting the participant of the expiration time. If the survey was not completed, the participant received another reminder SMS text message after 30 minutes and another after 45 minutes, for a maximum of 3 reminder SMS text messages. For end-of-day surveys (expiration of 4 hours), participants received the initial SMS text message with the survey link; they then received reminder SMS text messages every 45 minutes until the survey was completed or expired, for a maximum of 5 reminder SMS text messages.
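The reminder cadence described above can be expressed as a small function. This is a simplified sketch of the schedule as described, not the study's actual implementation; the protocol does not specify exactly when the separate expiration-alert text was sent for 1-hour surveys, so that message is omitted here.

```python
def reminder_offsets(expiry_minutes):
    """Return minute offsets (from the initial survey SMS) at which
    reminder texts are sent, per the phase 2 cadence described above:
    1-hour signal-contingent surveys get reminders at 30 and 45 minutes;
    4-hour end-of-day surveys get a reminder every 45 minutes until expiry."""
    if expiry_minutes == 60:
        return [30, 45]
    offsets = []
    t = 45
    while t < expiry_minutes:
        offsets.append(t)
        t += 45
    return offsets

# End-of-day survey (4-hour window): reminders at 45, 90, 135, 180, and
# 225 minutes — the "maximum of 5" reminders noted above.
eod_reminders = reminder_offsets(240)
```

Encoding the cadence as data rather than hand-scheduling each message is what lets a messaging service send reminders automatically until the survey is completed or expires.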
Immediately after the end-of-day survey was completed or expired, the participant received another SMS text message with a summary of their study participation information to date, including (1) whether the participant had finished a complete day (ie, at least 2 signal-contingent and end-of-day surveys); (2) how many complete days the participant had done; and (3) if the participant had not finished a complete day, a reminder that additional observation days would be added to the EMA window to allow the participant to complete 7 days. The SMS text message also reminded the participants that they would receive US $75 after completing 7 complete days of EMA.
Regarding staff time, a feature was built into the phase 2 EMA system whereby participants were automatically deactivated after completing 7 full EMA days. Therefore, unlike phase 1, the staff did not have to actively track each participant every day and manually deactivate them. In addition, an email system was set up in which a study email account received a daily summary listing (1) EMA participants who had not finished a complete EMA day on their first observation day, (2) EMA participants who had gone more than 2 days without finishing a complete EMA day, and (3) the language of each such EMA participant. This allowed the staff to easily identify the participants who needed to be contacted each day to assist with any EMA difficulties or questions.
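The complete-day and auto-deactivation rules for phase 2 can be sketched as follows. This is an illustrative model, not the study's actual code; the state fields (`complete_days`, `extra_days`, `active`) and survey labels are hypothetical.

```python
REQUIRED_COMPLETE_DAYS = 7  # phase 2 target

def is_complete_day(surveys_completed):
    """Phase 2 rule: a day counts as complete with at least 2
    signal-contingent surveys plus the end-of-day survey."""
    n_signal = sum(1 for s in surveys_completed if s.startswith("signal"))
    return n_signal >= 2 and "end_of_day" in surveys_completed

def update_participant(state, todays_surveys):
    """Apply one observation day: count complete days, extend the
    observation window for incomplete days, and automatically
    deactivate once the target number of complete days is reached."""
    if is_complete_day(todays_surveys):
        state["complete_days"] += 1
    else:
        state["extra_days"] += 1  # another observation day is appended
    state["active"] = state["complete_days"] < REQUIRED_COMPLETE_DAYS
    return state
```

Building these rules into the system, rather than having staff apply them manually, is what removed the daily tracking and manual deactivation burden described above.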
As surveys were conducted on participants’ smartphones in phase 2, questions were formatted so that the participant did not have to scroll to the right or left to see the full question and response option. Similarly, pages of the survey were designed so that they contained only a small number of EMA questions, which minimized how much participants had to scroll down. Related to the smaller screen of a smartphone versus an iPad mini, the style of the response option (eg, radio button vs checkbox) was carefully considered to promote response ease. For example, questions with a Likert scale had response options provided on a slider (with anchors) rather than a pull-down menu. For the slider, the participant only had to select in the general vicinity of the anchor they wanted to choose (ie, they were not required to push a very specific section of the slider bar).
As participants were registering themselves, we provided instructions in a variety of languages (ie, English, Spanish, Somali, and Hmong) to make this possible. After completing the full web-based survey (in their preferred language), participants were directed to the EMA registration page. Instructions were provided on this page in all 4 languages, and participants were asked to enter their unique access code and then select a survey language. For phase 2, 93.3% (585/627) of the sample took EMA surveys in English, 6.2% (39/627) took the surveys in Spanish, and only a few families took the surveys in Hmong (1/627, 0.2%) or Somali (3/627, 0.5%). Upon participant request, staff members were able to change the language of the surveys. In addition, because there was less participant and staff contact in phase 2, an information button was added to each question. Participants could select the information button to view additional detail or clarification about the question.
Using EMA methods in the Family Matters study was feasible and yielded rich, real-time data on weight-related behaviors in diverse households.
Overall, the main aims of this paper were to (1) answer a call in the field to report EMA study designs and (2) extend prior EMA research by providing a detailed description of the Family Matters EMA design, protocols, and methods.
It was feasible for parent participants from low-income, racially and ethnically diverse, immigrant and refugee households (referred to in this textbox as diverse households) to complete the EMA.
Participants were able to complete the EMA via their own smartphones without study provision of such devices.
Participants without familiarity with iPad tablet technology were able to easily learn how to operate these devices.
Participants preferred receiving survey notifications via SMS text message versus via email.
Participants were able to register remotely without staff assistance, which included navigating to the EMA registration page, entering an access code, and entering in registration information (eg, phone number and name). This was the case for all study participants, regardless of the main language they spoke (ie, English, Spanish, Hmong, or Somali).
Participants who moved into a new time zone during phase 2 needed to have their survey times adjusted, as the initial system was set up using only CST.
Future studies can feel confident that EMA studies can be carried out in diverse groups, including in non–English-speaking groups.
Future studies may be able to rely on participants using their own smartphone devices to complete the EMA; researchers may want to consider having a small budget for providing devices to some participants who may have a mobile phone that is not a smartphone or in case of device failure.
Studies providing technology for participants to complete the EMA should consider providing guidance on using the technology (eg, opening web browsers and closing tabs), while at the same time feeling confident that most participants will be able to use mHealth technology.
Providing SMS text message notifications for the EMA is likely sufficient, although this may vary depending on the study population (eg, participants whose jobs primarily involve computer work may prefer email).
Although participants are becoming more familiar with mobile apps, some may be hesitant about, or have trouble, downloading an app. An EMA app may be a useful tool for future research studies, although using one should not feel like a requirement for participants.
Although staff should be available to troubleshoot any participant questions or concerns, the Family Matters experience suggests that most participants can register for and complete EMA without staff assistance.
Future longitudinal studies, or studies in which participants are located in different time zones, should include a time zone question during EMA registration. The SMS text messaging service can then read this information and adjust survey delivery times accordingly.
The EMA system was set up so that (1) participants were automatically deactivated when they had finished enough complete survey days and (2) if a participant failed to finish a complete day, another day was automatically added on to their observation period.
A protocol was designed to alert staff on when to contact participants (eg, if the participant did not finish a complete day on their first observation day). Participants received multiple reminder SMS text messages to complete each EMA survey, and participants also received a summary SMS text message at the end of each day telling them (1) if they had finished a complete day, (2) how many complete days they had done, and (3) a reminder of the incentive amount.
Having these study design elements automatically built into the EMA computer program significantly reduces staff time (eg, time spent tracking and deactivating participants) and can increase participant satisfaction and reduce confusion (eg, it eliminates the possibility of sending surveys after a participant has completed all EMA requirements).
Having multiple reminders to study participants increases the likelihood that participants will complete surveys. Repeated reminders also reduce the staff time needed to contact participants.
EMA design for phase 2 (completed on participant smartphones) needed to consider how the survey would appear on a smartphone screen rather than how it looked on a computer screen.
As EMA is a newer tool for assessing diet and physical activity, the Family Matters study offers several lessons for measuring these behaviors in the home environment.
Participants were able to complete a meal survey (ie, providing information about a meal they shared with their child) in 2 ways: (1) self-initiating the survey and (2) as part of the signal-contingent prompt. Over half of meal surveys in phase 1 were completed via self-initiation. There did not appear to be a difference in participants’ patterns of self-initiating meal surveys in the first half of the observation period versus the second half.
Although phases 1 and 2 of the Family Matters study used different EMA designs, both were implemented successfully.
The web service used during phase 2 of the EMA needed to be changed (from Twilio to Amazon Web Services) to comply with updated privacy policies of the University of Minnesota.
EMA survey design should be smartphone friendly. For example, the question text should be large enough to be viewed on a phone screen. The question should be designed so that participants do not have to scroll to read the full question or to see the response options. Questions should be on multiple pages rather than having all survey questions on only one page.
Researchers should consider adopting EMA questions already used in EMA survey research. Survey questions for constructs that have not yet been assessed via EMA should be selected from validated measures (when possible); questions and response options may need to be altered to be more EMA-friendly.
For researchers wanting to simplify their EMA design, assessing events (eg, smoking and eating a meal) through signaled prompts rather than participant self-initiation may be a viable option. Depending on the event being considered, researchers may want to consider adding in more signal prompts to catch more events.
EMA study designs should capture and retain all data submitted by participants, regardless of whether they meet the full criteria set by the researcher.
As the EMA will likely collect identifiable participant data (eg, phone numbers), researchers should be aware of the privacy policies of the institution they are working in to ensure they are compliant.
One functionality that we built into the design of our EMA event surveys was the ability for the parent to enter a meal survey at the beginning of a signal-contingent survey if they had forgotten to report a meal (event-contingent survey) earlier. This intentional design allowed us to capture any unreported meals that the parent shared with the study child. This design was important because EMA studies with racially and ethnically diverse, low-income households are less common; thus, we wanted to ensure that the EMA system was user-friendly to navigate and allowed us to collect as much data as possible, within the bounds of the data being accurate. However, there are potential disadvantages to consider in this design. First, it is not possible to ascertain the exact time of the behavior (ie, the meal), although it is possible to determine the window of time in which the behavior occurred, and specificity may be lost the longer the delay between the meal and survey entry. It is also important that signal- and event-contingent surveys be collected in the order they occurred so that reports do not become retrospective assessments (eg, participants should not submit signal-contingent surveys on momentary stress throughout the day but then enter all event-contingent meal surveys in the evening). Future EMA research collecting meal-level data through signal-contingent surveys may be able to increase event-contingent (ie, meal survey) completion by offering multiple ways to complete the survey (ie, individually or as part of the signal-contingent survey), while also having the participant report the time the meal was eaten when taking a delayed meal survey. Parents also reported only on meals at which both the parent and the child were present.
This allowed parents to provide detailed information on the meal; however, this design may not allow for the assessment of overall child dietary intake, as many meals (eg, those eaten at school) were not reported. Another important consideration for future research relates to the differing amounts of time parents spend with their child. It may be important to assess the amount of time the parent spent with the child since the last survey to determine whether parent behaviors could have influenced the child. Future research may wish to involve older children, who may be more reliable reporters of dietary intake behaviors than younger children, in EMA data collection alongside their parents.
EMA is a method that was successfully implemented and improved on across both phases of the Family Matters study.
Ecological momentary assessment questions used in phase 1 of Family Matters.
Responsiveness of phase 1 Family Matters participants to ecological momentary assessment surveys.
Selected results from the Family Matters phase 1 study using ecological momentary assessment data.
Peer-review report by the Psychosocial Risk and Disease Prevention Study Section, National Institutes of Health.
EMA: ecological momentary assessment
mHealth: mobile health
None declared.