Transl Behav Med. 2016 Dec; 6(4): 577–586.
Published online 2016 Sep 29. doi: 10.1007/s13142-016-0426-4
PMCID: PMC5110499
PMID: 27688250

Engaging research participants to inform the ethical conduct of mobile imaging, pervasive sensing, and location tracking research

Camille Nebeker, EdD, MS (corresponding author),1,2 Tiffany Lagare, MPH,2 Michelle Takemoto, BS, BA, PhDc,1,2 Brittany Lewars, MPH,2 Katie Crist, MPH,2 Cinnamon S. Bloss, PhD,1,2,3 and Jacqueline Kerr, PhD1,2

Abstract

Researchers utilize mobile imaging, pervasive sensing, social media, and location tracking (MISST) technologies to observe and intervene with participants in their natural environment. The use of MISST methods and tools introduces unique ethical issues due to the type and quantity of data collected, raising new challenges around informed consent, risk assessment, and data management. Since MISST methods are relatively new in behavioral research, there is little documented evidence to guide institutional review board (IRB) risk assessment and inform appropriate risk management strategies. This study was conducted to contribute participant perspectives to the consideration of ethical and responsible practices. Participants (n = 82) enrolled in an observational study in which they wore several MISST devices for 1 week and then completed an exit survey. Survey items focused on the following: 1—device comfort, 2—informed consent, 3—privacy protections, and 4—bystander engagement. The informed consent process accurately reflected participants' actual experience. Device comfort and privacy were raised as concerns by both the participants and bystanders. While the majority of the participants reported a positive experience, it is important to note that the participants were volunteers who were not mandated to wear tracking devices and that persons who are mandated may not have a similar response. Findings support strategies proposed in the Kelly et al. (2013) ethical framework, which emphasizes procedures to improve informed consent, protect privacy, manage data, and respect bystander rights when using a wearable camera.

Keywords: Mobile health, mHealth, Research ethics, Pervasive sensing, Geo-location, Location tracking, GIS, GPS, Wearable camera, SenseCam, IRB, Informed consent, Privacy, Institutional review board

INTRODUCTION

Public health researchers are utilizing mobile imaging, pervasive sensing, social media, and location tracking technologies, which we refer to as "MISST," to measure everyday behaviors of research participants in their natural environment [1]. These tools (i.e., devices and phone apps) are worn or carried by participants to objectively measure behavior in context [1–7]. The purpose of location logging, for example, is to assess spatial predictors of behaviors [2, 3]. By combining a Global Positioning System (GPS) device with an activity monitor, researchers can measure where a person is active or sedentary [2]. By adding a wearable camera, researchers can obtain information about the physical and social environments where the activity is occurring [4–7]. The SenseCam is one type of wearable camera that has been used in studies capturing images of behavior and context [8]. SenseCam is a digital camera that uses multiple sensors (infra-red, lux, thermometer, accelerometer) to detect ambient temperature, changes in light and movement, and the presence of a body in front of the wearer. Participants in the parent study, some of whom subsequently participated in the sub-study reported here, were asked to wear the device on a lanyard around their neck (see Fig. 1) to capture first-person, point-of-view images [8]. Approximately 3000 images are captured daily at roughly 20-s intervals.

With these data from single or multiple devices, public health researchers can better understand a person's daily behavior and design interventions that are tailored to that individual and potentially responsive in real time. In spite of these benefits, mobile sensing technologies introduce unique ethical issues that must be evaluated to better understand study risks, benefits, and best practices for achieving informed consent [9]. Since MISST methods are relatively new in behavioral medicine research, there is little documented evidence to guide risk assessment and appropriate risk management strategies. The current study was prompted by challenges faced by an institutional review board (IRB) tasked with reviewing a research study titled "Validating Machine-Learned Classifiers of Sedentary Behavior and Physical Activity" (working title: iWatch), in which research participants were asked to wear five MISST devices to assess the utility of these technologies in objectively measuring activity in free-living individuals.

IRB and MISST-ethics

Most researchers are familiar with the process of engaging an IRB in the research planning process. The IRB is tasked with evaluating research to ensure that studies involving humans are conducted in compliance with federal regulations and accepted ethical principles [10, 11]. The IRB is composed of at least five people, both scientists and non-scientists, from varying backgrounds who are qualified to review research conducted by the institution [10]. At least one member must be unaffiliated with the institution, and subject matter experts can be brought in to consult as needed [10]. Researchers submit a research plan and informed consent documents for the IRB to review. A primary aim of the IRB is to evaluate the potential study benefits and the probability and magnitude of possible harm to participants, and assess strategies proposed to minimize risk of harm. The IRB evaluation can end in one of three ways—an approval, a list of modifications needed to secure approval, or disapproval [10].

When the iWatch observation study was proposed, asking research participants to wear an outwardly facing camera as a measurement device was fairly novel, and there was little empirical evidence to guide the assessment of potential harm to a research participant. Initially, the IRB surmised that the risk to bystanders (those who could be included in images captured by the camera worn by the research participant) outweighed the potential benefits of the study [9]. The researchers were told they could conduct the study, but only if the wearable camera was removed from the study protocol [9]. Around this same time, the iWatch lead investigator and colleagues were developing a framework to guide the ethical use of automated, wearable cameras in behavioral research in response to concerns noted by IRBs (i.e., documenting illegal or private moments) [12]. This framework (see Table 1) identifies practices that align with the ethical principles found in the Belmont Report, including autonomy, beneficence, and non-maleficence, with attention paid to both the research participant and bystanders [11, 12]. The iWatch team applied elements of this framework to their revised research plan and subsequently received IRB approval to conduct the study.

Table 1

Ethical guidelines for the use of automated, wearable cameras in observational health behavior research

Informed written consent of participant
Participant information should explicitly detail the following:
  • How many images and how much information will be collected
  • The nature and type of data that can be collected by wearing an automated, wearable camera (images will depict where you go, what you do, and for how long), with examples
  • Participants can forget they are wearing the device and record unwanted and unflattering images, with examples provided (e.g., bathroom visits, online banking)
  • Data of illegal activities may not be protected by confidentiality and may be passed to law enforcement, depending on the national law and the nature of the activity
  • No individual will be identifiable in any research dissemination without their consent
  • Participants will have the opportunity to view (and delete if necessary) their images in private
  • Participants are able to remove the device or temporarily pause image capture whenever they wish
  • Participants will not get copies of their images
  • A team of specifically trained researchers will have access to the image data

Privacy and confidentiality
  • Devices should be configured so that data can only be retrieved by the research team. It should be impossible for participants or third parties who find devices to access images
  • Data should be stored according to national data protection regulations
  • Identifying images should not be used without the express consent of the individuals depicted
  • Devices should be configured to allow participants to cease recording for short periods. Participants should be allowed to remove the device at any time, with examples of where this might be appropriate (e.g., airport security)
  • Appropriate training should be provided for all those on the research team who have contact with the image data

Non-maleficence
  • Participants should be prepared for questions from the public with a short sentence that explains the device and concludes with an offer to remove it if they are feeling uncomfortable
  • Participants should be instructed to remove the device in any situation where it is attracting unwanted attention or they feel threatened or uneasy wearing it

Autonomy of third parties
  • Participants should seek verbal permission from family members and cohabitants before study commencement
  • Participants should seek verbal permission of workplace managers or supervisors. If possible, this should be prior to study commencement but, in reality, may be a rolling process. Appropriateness of the device to the work setting should be assessed by the researcher
  • Participants should inform friends and acquaintances of the device when encountered and offer to remove it if they are uncomfortable
  • Participants should be told to inform third parties that they too can request image deletion, either by asking the participant to inform the research team or by contacting the team directly
  • The privacy and anonymity of third parties must be protected; no image that identifies them should be published without their consent
  • Photography is inappropriate in some cultural settings, and automated, wearable cameras should not be used in these instances

The aim of this iWatch sub-study was to gather data from participants after they completed study tasks to determine whether procedures used in the iWatch study facilitated informed consent, provided adequate protections for privacy, attended to data confidentiality, and demonstrated respect for bystander rights. Specifically, we wanted to obtain a first-hand account of study participant experiences to identify whether considerations recommended by the Kelly et al. (2013) ethical framework were useful.

Ethical framework applied to the iWatch study

Specific to visual imaging, both the consent form and research plan included a description of the number, nature, and type of images collected; the purpose and use of the privacy setting (e.g., turn off when at the bank, doctor's office, gym, bathroom); an option to view and delete images prior to review by the research team; and an explanation of who would have access to images. To enhance data confidentiality, the camera was configured so that only members of the research team were able to view the encrypted images. Encrypting the images also prevented the research participant from downloading images to share with others, including on social media sites. Upon study completion, data were stored in a secure location consistent with approved security standards. While this study did not involve private health information, the research team determined that the data may be considered sensitive and opted to apply standards used under the Health Insurance Portability and Accountability Act (HIPAA). Those on the research team with access to participant data received training to enhance their understanding of the importance of data security. To demonstrate respect for others, the prospective participant was asked to confirm whether wearing a camera was permissible at home and/or at the office. Research participants were also given a reference card containing a brief explanation of the study should a bystander ask about the device. To assess whether these procedures were useful, we asked research participants directly about their experience upon completing the iWatch study.

METHODS

Parent study recruitment

Participants enrolled in the iWatch study were recruited through a variety of sources including Craigslist, Research Match, and Clinical Trials. Individuals who had participated in prior studies and indicated interest in being contacted for future studies were also contacted. People who were between the ages of 6 and 85 were eligible to participate provided they agreed to wear the SenseCam imaging device (Vicon Revue v1.0), a Global Positioning System (GPS) tracking device (Qstarz BT100X), and three activity monitors called accelerometers (Actigraph, Inc., GT3X+) for seven consecutive days. The SenseCam was worn on the participant’s chest hanging from a lanyard around the neck (see Fig. 1). The GPS and a single accelerometer were worn around the waist on a belt. The other accelerometers were worn on each wrist like a watch. At the end of the 1-week study period, participants returned to the iWatch office to turn in the devices and complete a survey focused on their usual behaviors and perceptions of health (e.g., physical activity, the environment, eating, sleeping). Following completion of the survey, participants were invited to privately review the images that were recorded on the SenseCam. During their viewing time, participants had the opportunity to either delete images they preferred to omit from the research record or note images they wanted removed.

Sub-study recruitment for iWatch Exit Survey

Upon completion of the parent iWatch study activities, including viewing images from the SenseCam device, participants were asked to take part in this sub-study, which involved completing the iWatch Exit Survey. Since the iWatch parent study had commenced several months prior to the decision to conduct the sub-study, 78 participants had already completed the study tasks and were not re-contacted to participate in the sub-study. Participants who enrolled in the iWatch parent study after the sub-study modification received IRB approval were considered eligible to complete the iWatch Exit Survey. Of the 133 invited to complete the iWatch Exit Survey, 61 % (n = 82) agreed to participate (see recruitment flowchart in Fig. 2).

Fig. 2 iWatch sub-study recruitment flowchart

Data collection

iWatch Exit Survey

The iWatch Exit Survey was developed to learn about participant perceptions and experiences when wearing the imaging, sensing, and tracking devices. Members of the research team (CN, KC, LD, and MT) developed the survey questions, which included both multiple-choice and open-ended prompts (appended). The open-ended prompts were used to clarify multiple-choice responses; however, participants answered them only if they chose to elaborate. Since the questions were administered in a self-report survey, we did not follow up with probing questions to clarify participant responses. As such, conducting a thematic analysis of the data was not possible.

The survey questions (appended) were designed to assess participant experience with 1—wearing the devices (e.g., SenseCam, GPS, activity monitors), 2—perceptions of the informed consent process, 3—use of privacy protection procedures, 4—bystander response, and 5—whether the images were consistent with expectations. To gauge whether participants who were agreeable to wearing the devices were generally more open regarding their privacy, we asked whether they participated in social media (i.e., Facebook, Twitter, blog, etc.) and to rate their preferences or expectations for privacy using a 5-point Likert scale. We developed a similar survey for children who participated in the study; however, this paper focuses specifically on adult participant responses.

Image censoring process

In addition to collecting data via the iWatch Exit Survey, a research assistant completed an image censoring form on a subset of participants. The image censoring form was used to record the following: 1—whether and how long each participant viewed their images, 2—whether participants deleted images, and if so, 3—the number of images deleted. To complete the form, a query was run after the participant viewed their images in private. The query identified how many file numbers had been marked for deletion by the research participant. The researchers then deleted those file numbers without viewing the actual images removed from the record.
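
As a concrete illustration of this censoring step, the short Python sketch below removes participant-flagged images by file number alone so that staff never view their contents. This is a minimal sketch under assumed conditions (a per-participant image folder and a list of flagged IDs); the study's actual query code is not published, and the file names, folder paths, and function name used here are hypothetical.

    # Minimal sketch (assumed layout): delete flagged images by file number
    # without ever opening or displaying the image files themselves.
    from pathlib import Path

    def censor_images(image_dir: str, flagged_ids: list[str]) -> int:
        """Delete images identified only by file number; contents are never read."""
        deleted = 0
        for image_id in flagged_ids:
            image_path = Path(image_dir) / f"{image_id}.jpg"  # hypothetical naming scheme
            if image_path.exists():
                image_path.unlink()  # remove the file without viewing it
                deleted += 1
        return deleted

    # Example (hypothetical IDs): record only the count, as on the censoring form.
    # n_removed = censor_images("iwatch/P042/images", ["0012345", "0012346"])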

Data quality

Each participant completed the survey, and the hard copy was scanned and saved as a portable document file by a research assistant. Each survey was identified using the study participant’s unique ID so that demographic data could be matched. Survey responses were entered into an Excel file. Frequencies were calculated on the multiple-choice responses using SPSS (version 22). The data are reported across the four main themes noted in the Kelly et al. framework and include informed consent, privacy and confidentiality, non-maleficence, and autonomy of bystanders.
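
For readers who prefer a scripted workflow, the sketch below shows one way the same multiple-choice frequencies could be tabulated outside SPSS. It assumes the Excel file has been exported to CSV with one column per survey item; the file and column names are hypothetical, and the actual analysis reported here was run in SPSS (version 22).

    # Minimal sketch (assumed CSV export of the Excel file; hypothetical column name).
    import pandas as pd

    responses = pd.read_csv("iwatch_exit_survey.csv")

    # Percentage of respondents choosing each option, excluding missing answers.
    privacy_freq = (responses["privacy_rating"]
                    .value_counts(normalize=True)
                    .mul(100)
                    .round(1))
    print(privacy_freq)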

RESULTS

Sample description

Upon completion of the iWatch study, the participants (n = 82) completed the iWatch Exit Survey. Of the 82 study participants, 45 (55 %) identified as male and 37 (45 %) as female. The average age was 50 years (range 18–85 years), with 72 % identifying as White, 11 % as Asian, 12 % as Hispanic, 4 % as African American, and 1 % as American Indian/Native American. A slight majority (54 %) reported not having a college degree, and 44 % reported an income greater than $40,000 (Table 2).

Table 2

Demographics of iWatch exit survey participants

Variable: Mean (SD) / frequency (%)
Age (N = 80): 50.2 (21.0)
Gender (N = 82)
  Male: 45 (54.9)
  Female: 37 (45.1)
Ethnicity
  White: 59 (72.0)
  African American: 3 (3.7)
  American Indian/Native American: 1 (1.2)
  Asian: 9 (11.0)
  Hispanic/Latino: 10 (12.2)
Education (N = 80)
  Below college graduate: 44 (53.7)
  College graduate and above: 36 (43.9)
Income (N = 56)
  Below $40,000: 20 (24.4)
  $40,000 and above: 36 (43.9)

Social media and privacy preferences

To explore whether people who volunteered for a study that involved wearing a camera, activity monitors, and a location tracking device would consider themselves more open regarding their privacy, we asked whether they participated in social media and asked them to self-rate their privacy preferences. Specific to social media, we asked whether they participated in activities such as Facebook, Twitter, Foursquare, blogging, or other social media. Of the 76 participants who completed this question, 65.8 % reported being active on social media whereas 34.2 % were not. The participants were asked to rate their preferences or expectations for privacy using a Likert scale where 1 = extremely private and 5 = extremely open. Of the 79 participants responding, 35.4 % reported being extremely (17.7 %) or somewhat (17.7 %) private; 43 % identified as neither private nor open, and 21.5 % reported being extremely (10.1 %) or somewhat (11.4 %) open (Fig. 3).

Fig. 3 Privacy characteristics. The participants were asked to rate their preferences or expectations for privacy using the following Likert scale: 1—extremely private—I prefer to keep information about me to myself; 2—somewhat private—I tend to share my information with a small circle of trusted family and friends; 3—neither private nor open; 4—somewhat open—I'm willing to share my information with friends, family, friends of friends, etc.; 5—extremely open—I share my information with anyone who wants to know about me. Of the 79 participants responding, 35.4 % reported being extremely (17.7 %) or somewhat (17.7 %) private; 43 % identified as neither private nor open and 21.5 % as somewhat (11.4 %) or extremely (10.1 %) open

Informed consent

The iWatch informed consent process included an in-person, verbal overview of the study, a review of the written consent document, and a demonstration of how to use each device. A research assistant reviewed the features of each device and explained when to use the privacy setting (e.g., bathroom, bank, doctor visit) as well as what to do if a bystander asked about the wearable devices. Consistent with the ethical guidelines, the informed consent process included a written description of:

  • the number, nature, and type of images collected,
  • the purpose and use of the privacy setting, including when to activate or remove the device,
  • limitations to privacy and requirements for reporting images that depict a reportable activity (i.e., elder or child abuse),
  • procedures for allowing the participant to review and delete images,
  • standards for managing and sharing data,
  • how to respond to questions about the device or requests to remove the device,
  • when to request permission to wear the device, and
  • the use of encryption to protect images from being viewed.

Upon study completion, we asked:

In retrospect, did the information we presented about the study when you enrolled (e.g., our discussion and review of the Informed Consent Agreement) accurately represent what you thought you would be doing as a study participant?

A majority (n = 78 or 95 %) agreed that information received during the study consent process accurately reflected their experience. A quarter of participants (n = 21 or 27 %) suggested additional information would improve the informed consent content, for example:

“Let [participants] see an example of what the camera records.”

“Add what you will do with the info and how it will relate to me.”

Privacy and confidentiality

The approved research plan included procedures for participant safeguards and control of information (e.g., privacy button, informational card to share with a bystander). A majority (n = 49 or 60 %) reported using the privacy button, whereas about a third of respondents (n = 28 or 34 %) did not. Those who reported not using the privacy button disclosed that they flipped the device over or put the camera inside their shirt instead.

Non-maleficence

The ethical framework recommends that researchers "provide participants with an explanation about the study should someone inquire about the device" [12]. The iWatch study protocol was to give each participant a small card with a brief study description that they could either read or give to the interested party. The card states:

“I am participating in an experiment on physical activity and the environment. This is a digital camera that automatically captures low-resolution still images throughout the day, which will later be used to describe my behavior and environment. It does not record audio or full-motion video. Any images captured will not be made public in any fashion and will only be seen by the researchers. If you would prefer, I can turn off or temporarily deactivate the camera, and/or make a note and have the images just taken deleted without anyone seeing them. I can also provide contact information for the researchers.”

Few participants reported using the card, and most stated it was easier to simply tell the person that they were in a research study. We also asked if there were situations when wearing the camera that made them feel uncomfortable. Nearly one-third of our respondents (n = 26 or 32 %) disclosed feeling uncomfortable while wearing the camera as noted in the following quotes:

“[I] hated it - felt like I was a pervert & invading other’s privacy & mine.”

“Made me feel like I had to explain myself when people looked at it.”

Autonomy of bystanders

The IRB review comments noted a concern about possible risk of harm to bystanders who might be photographed due to their proximity to a research participant. We asked the participants whether anyone asked about the camera or other wearable devices used in this study. Nearly all (n = 69) reported that a bystander asked them about the camera, and of those, 35 (43 %) were asked about one of the devices five or more times during the week. When a bystander inquired about the SenseCam, 16 % (n = 11) of the participants reported being asked to remove or turn off the camera, citing appropriateness of the setting (e.g., work meeting) or discomfort with being photographed. The participants reported that bystanders expressed both positive and negative responses to the camera device as evidenced by the following comments:

“Half thought it was cool. Half thought it was an invasion of privacy.”

“Yes, positive - no objections” “Curious” “Interested”

“They thought it was cool and weird at the same time”

Control and expectations

Upon returning the SenseCam and other study devices to research staff, the participants were given the option of viewing their images. A majority (n = 52 or 63 %) agreed that the images were what they expected. Of the 22 (26 %) who said images were not what they expected, 12 (54 %) commented that images were blurrier than expected, and 4 (18 %) stated that the images were clearer and more detailed than expected.

“Not too high res[olution] to be creepy”

“They are more clear than I thought they would be.”

Image censoring

Research staff who facilitated the image viewing process indicated that participant viewing behaviors varied. Therefore, to better understand participant interest in censoring images, we developed the image censoring form and observed the last 33 of the 82 participants. Research staff recorded (1) whether the participants chose to review images; if so, (2) time spent on image viewing (approx. 26,000 images collected per participant); and (3) speed of image viewing. Of these 33 participants, all but one chose to view the images captured by the SenseCam, suggesting that participants valued having this choice. The participants were given time to review images and delete any they did not want included in the research record. To respect the privacy of the research participants, a SQL script was created to delete selected images by identification number so the images would not be visible to research staff. After the participant viewed the images in private, a query was run to identify time spent viewing and the number of files deleted by the participant. The participants spent an average of 17 min viewing the pictures (range 3–35 min). Of the 32 participants who reviewed their images, 11 (34 %) deleted images, with the number deleted ranging from 1 of 22,589 images to 295 of 28,813 images. The participants were not asked to qualify the nature of the images removed, nor was the one individual who declined asked why they did not want to view the images.

While not directly related to the ethical framework, we were also interested in overall impressions of wearing the camera, activity monitors, and location logging devices. The majority of complaints focused on the wrist-worn devices, with a few comments on the camera interfering with daily activities. Specific to the wrist-worn devices, the participants reported that the strap was difficult to secure and started to smell over the course of their study participation. A few commented that the device did not complement their fashion style. The camera, hanging on a lanyard, would move when the participant was active and was distracting and uncomfortable when playing sports. These comments were shared with the research staff. Overall, the participants reported a positive experience and said that they would recommend participating in this study to others.

DISCUSSION

The use of SenseCam as an objective measurement tool is increasing. A search on PubFacts (http://www.pubfacts.com/search/SenseCam) revealed 30 such studies focused on a variety of topics including memory, physical activity, travel, and diet. Whether researchers conducting these studies faced similar challenges with IRB approval is not known. Given the increasing interest in studying behavior in "free-living" or "in the wild" settings, we anticipate seeing more research utilizing wearable imaging and audio devices, pervasive sensing tools, social media data, location tracking devices, and phone apps to measure and intervene with behavior.

We believe this study is novel in that actual research participants were engaged to learn first-hand about their experience with a study that used several MISST devices to measure daily activity and context. By surveying research participants, we learned that the primary concern was not one of privacy (despite this being a concern of IRBs), but of discomfort related to wearing a MISST device (which was not a concern of the IRB). Most critiques by the participants focused on physical irritation from the wrist-worn activity monitor band and interference of the wearable camera with daily activities. This focus on discomfort may be an artifact of the survey questions; however, it is interesting that privacy concerns were not a larger issue considering that over a third of participants (35.4 %) identified as having "extremely private" or "somewhat private" preferences or expectations for privacy. At some level, it is notable that anyone who self-identifies as "extremely private" would even agree to participate in a study that requires wearing a camera that records daily activity or a tracking device that monitors location. This suggests that the participants may have viewed the loss of privacy as "worth" the potential contributions of the study to science. Alternatively, it may simply reflect the well-known "privacy paradox," which posits that individuals often behave in ways that run contrary to their stated privacy beliefs or preferences. Ultimately, however, these findings strongly suggest that the extent to which IRBs view privacy issues as paramount for potential research participants may not align with the views of the participants themselves.

We learned that the informed consent process accurately reflected participant experiences; however, informed consent may be improved by including examples of SenseCam images and GPS traces. Whether showing examples of data in advance will influence willingness to participate in a study is yet to be determined. Providing an informational card to explain the study to a bystander proved less useful and may not be necessary. A majority of the participants reported a positive experience and said they would recommend the study to others. It is important to note that the iWatch participants volunteered to wear the MISST devices, with most reporting a favorable study experience; these results cannot be generalized to coerced or involuntary device users (e.g., parolees).

Empirical research studies to inform IRB decision-making have increased dramatically over the past decade [13]. We initiated this study to learn about research participant experiences and to assess whether practices to address informed consent, risk assessment, and risk management were appropriate. Rather than having an IRB subjectively determine that the risk to bystanders exceeds the possible benefits of objective measurement methods, or that documenting actual behaviors poses undue risk to participants, we recommend that researchers gather evidence from the participants to qualify risk and determine what risk assessment and management strategies are appropriate. We have not spoken directly to bystanders who may have been in contact with one of our iWatch participants. While we can make assumptions about bystander perspectives based on iWatch participant experiences, risks to a bystander would be more accurately assessed by collecting data from an actual bystander. As pervasive sensing technologies become more sophisticated and useful tools in health research, it will be more important than ever for researchers to take responsibility for identifying actual risk to the participants and employing management strategies deemed appropriate by the actual study participants. Sharing these data-driven strategies or "best practices" with the greater research community will facilitate standardized practices and assist IRBs in their review of MISST research studies.

LIMITATIONS

There are limitations in this study. We did not use an experimental design to test whether the information provided during the informed consent process made a difference in understanding of the study. Likewise, we only surveyed the study participants, and questions posed by the research team may have resulted in socially desirable responses. Open-ended survey items used to clarify responses were not completed by all the participants, limiting our ability to conduct a qualitative thematic analysis. Those who declined to participate may have a different perspective when compared to those who opted in. We intentionally did not analyze images deleted by the participants, as a demonstration of respect for their privacy. However, by knowing what images are deleted, we could learn the extent to which the participant focused on pictures they considered private (e.g., bathroom, locker room) versus removing images based on requests from family, co-workers, or bystanders, and adjust the protocol accordingly. Moving forward, we can modify the image censoring protocol to query the participants upon completion to capture this information.

CONCLUSION

Mobile and digital technologies are changing the way in which the public, research institutions, IRBs, and researchers are thinking about research and related responsibilities. With new research methods and tools come changes in how we view privacy, informed consent, and data management. As we enter this new frontier of big data, pervasive sensing, real-time interventions, and N-of-one studies, it is increasingly important for scientists, IRB members, and other stakeholders to acknowledge the ethical dimensions of this research and determine appropriate oversight [9, 14].

Specific to the use of MISST devices and apps in research, more studies are needed to determine best practices for obtaining meaningful informed consent, assessing the probability and magnitude of possible harm to participants, and identifying appropriate data management strategies. Our preliminary studies indicate that participants may not fully understand the type and amount of personal data collected by MISST (unpublished) and that IRBs are not consistent when it comes to risk assessment [9]. Likewise, it is not clear what type of data storage and security standards are most appropriate for the extensive amount and granular nature of the data, nor when and how these data should be shared. The lack of relevant and responsive guidance creates challenges for IRBs, researchers, and consumers alike. This paper provides evidence that supports the ethical framework recommended by Kelly et al. (2013) for guiding the design and the ethical review of research involving pervasive visual imaging methods [12]. Moving forward, we advocate for researchers who are using MISST tools and methods to work closely with both research participants and their IRB to bridge the gap in our understanding of what constitutes ethical and responsible research practices.

Acknowledgments

We would like to thank the iWatch participants for contributing to this study. We also acknowledge Lindsay Dillon, MPH, who contributed to the development of the survey and Elizabeth Booen, MS, who assisted with the data management.

This study was conducted with the participants from the NCI-funded Validating Machine-Learned Classifiers of Sedentary Behavior and Physical Activity study (PI Kerr, Grant # R01CA164993, IRB # 111160). We also acknowledge the Robert Wood Johnson Foundation's support of the Connected and Open Research Ethics (CORE) initiative (PI Nebeker, #72876, 2015-2017) and the Impact of Privacy Environments for Personal Health Data on Patients (PI Bloss, R01 HG008753).

APPENDIX

iWatch Exit Survey version 2: 041514

iWatch Exit Interview

Thank you for participating in the iWatch study this week. We are interested in learning more about your experience with the devices.

Please answer the following questions.

[Survey instrument provided as images (Figa and Figb) in the original publication.]

Notes

Conflict of interest

The authors declare that they have no conflicts to report.

Informed Consent Statement

All procedures followed were in accordance with the ethical standards of the responsible committee on human experimentation (institutional and national) and with the Helsinki Declaration of 1975, as revised in 2000. Informed consent was obtained from all patients for being included in the study.

Footnotes

1Reprinted from American Journal of Preventive Medicine 2013; 44(3): 314-319, Kelly, P. et al. “An Ethical Framework for Automated, Wearable Cameras in Health Behavior Research” with permission from Elsevier.


Implications

Practice: Stakeholders, including institutional review boards (IRBs), behavioral scientists, and research participants must work collaboratively to advance evidence-based ethical practices responsive to research using emerging technologies.

Policy: Regulations and ethical practices for human research protections must evolve to meet the needs of dynamic twenty-first century science.

Research: There is a growing need for empirical research to 1—“inform” the informed consent process, 2—qualify and quantify the magnitude and probability of potential risks, and 3—guide data management strategies when collecting new forms of personal health data.

References

1. Kumar S, Nilsen WJ, Abernethy A, et al. Mobile health technology evaluation: the mHealth evidence workshop. Am J Prev Med. 2013;45(2):228–236. doi: 10.1016/j.amepre.2013.03.017.
2. Jankowska M, Schipperjin J, Kerr J. A framework for using GPS data in physical activity and sedentary behavior studies. Exerc Sport Sci Rev. 2015;43(1):48–56. doi: 10.1249/JES.0000000000000035.
3. Zenk S, Schulz A, Odoms-Young A, et al. Feasibility of using Global Positioning Systems (GPS) with diverse urban adults. J Phys Act Health. 2012;9:290–296.
4. Kerr J, Marshall SJ, Godbole S, et al. Using the SenseCam to improve classifications of sedentary behavior in free-living settings. Am J Prev Med. 2013;44(3):290–296. doi: 10.1016/j.amepre.2012.11.004.
5. Doherty AR, Kelly P, Kerr J, et al. Use of wearable cameras to assess population physical activity behaviours: an observational study. Lancet. 2012;380:S35. doi: 10.1016/S0140-6736(13)60391-8.
6. Doherty AR, Kelly P, Kerr J, et al. Using wearable cameras to categorise type and context of accelerometer-identified episodes of physical activity. Int J Behav Nutr Phys Act. 2013;10:22. doi: 10.1186/1479-5868-10-22.
7. Oliver M, Doherty AR, Kelly P, et al. Utility of passive photography to objectively audit built environment features of active transport journeys: an observational study. Int J Health Geogr. 2013;12:20. doi: 10.1186/1476-072X-12-20.
8. Hodges S, Williams L, Berry E, et al. SenseCam: a retrospective memory aid. In: Ubicomp 2006, LNCS 4206. 2006:177–193. doi: 10.1007/11853565.
9. Nebeker C, Linares-Orozco R, Crist K. A multi-case study of research using mobile imaging, sensing and tracking technologies to objectively measure behavior. J Res Adm. 2015;46(1):118–137.
10. US Department of Health and Human Services. Protection of human subjects. 45 CFR 46. 1964.
11. National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research. The Belmont Report: ethical principles and guidelines for the protection of human subjects of research. 1978. Available at: http://ohsr.od.nih.gov/guidelines/belmont.html.
12. Kelly P, Marshall S, Badland H, Kerr J. An ethical framework for automated, wearable cameras in health behavior research. Am J Prev Med. 2013;44(3):314–319. Available at: http://www.sciencedirect.com/science/article/pii/S0749379712008628.
13. Anderson EE, Dubois JM. IRB decision-making with imperfect knowledge: a framework for evidence-based research ethics review. J Law Med Ethics. 2012;40(4):951–969. doi: 10.1111/j.1748-720X.2012.00724.x.
14. Christie G, Patrick K, Schmuland D. Consultation for collective action on personalized health technology: eliminating the ethical, legal and social barriers for individual and societal benefit. J Health Commun. 20:867–868.
