
The Qualitative Report 2022 Volume 27, Number 11, 2469-2485

https://doi.org/10.46743/2160-3715/2022.5475

Imposter Participants: Overcoming Methodological Challenges Related to Balancing Participant Privacy with Data Quality When Using Online Recruitment and Data Collection

Jacqueline M. Roehl and Darci J. Harland


Walden University, Minnesota, USA

In this paper we describe the lessons learned when untrustworthy participants were included in a qualitative interview study. In online research, participants
can more easily misrepresent their identity and volunteer for studies even if they
do not meet inclusion criteria. The term “imposter participant” refers to
dishonest participants who completely fake their identities or simply exaggerate
their experiences in order to participate in qualitative studies. Untrustworthy
participants are a threat to data quality, yet little has been published on how
qualitative researchers should prevent and handle this unique methodological
challenge. In this paper, we provide a detailed account of how specific issues
with the research design create methodological challenges related to participant
honesty when participants self-identify as meeting study inclusion criteria and
participate in a virtual interview. Through our experiences as a doctoral student
and dissertation supervisor, we offer lessons learned relating to recruiting online
participants, collecting virtual interview data, and analyzing data for a
qualitative study. Our experiences and reflections might help other qualitative
researchers, including doctoral candidates and their supervising committees,
work with institutional review boards to prevent imposter participants and thereby
contribute to the trustworthiness of their research.

Keywords: online recruitment, virtual interviews, qualitative methods, participant honesty, trustworthiness

During the COVID-19 pandemic, many qualitative researchers had to shift recruitment
and data collection to fully online methods because of social distancing and lockdowns (Jones
et al., 2021; Newman et al., 2021; Salinas, 2022; Schlegel et al., 2021; Sha et al., 2020). This
shift was a logical step since researchers have found that virtual teleconferencing interviews
yield similar data to face-to-face interviews without compromising rapport between researchers
and participants (see Ahmad, 2020; Brown & Danaher, 2017; Jenner & Myers, 2019; Krouwel
et al., 2019). For example, all the participants in a nursing study who opted for the Zoom
platform as their preferred interview method expressed positive ideas about the experience
(Irani, 2019). Additionally, online interviews help researchers overcome financial challenges
and allow for geographically diverse participants in their studies (Jenner & Myers, 2019;
Krouwel et al., 2019).
Even though virtual interviews have been found to be an appropriate data collection
tool for qualitative studies, researchers need to carefully plan their recruitment procedures, data
collection, and analysis methodologies to protect their data from participants who are dishonest
about meeting study inclusion criteria and/or dishonest in their interview responses. In online
research, participants can more easily misrepresent their identities and volunteer for studies
even if they do not meet inclusion criteria (Chandler & Paolacci, 2017; Hydock, 2017; Salinas,
2022). In fact, Hydock (2017) found that a “small but nontrivial portion of participants” in
online survey studies “misrepresented their identity for the chance of financial gain” (p. 1566).

In terms of naming this phenomenon of dishonest participants, several researchers have used the term "fraudulent participants" when discussing participants whose data were eventually questioned for authenticity and trustworthiness (Jones et al., 2021; Salinas, 2022).
The term “catfish” has been used for a person who creates fake online personas; however, that
term is usually associated with behavior in dating apps, not in social science research (see
Lauckner et al., 2019; Nolan, 2015). Although the fraudulent participants discussed in this paper, who faked or exaggerated their identities, might be closer to the "catfish" label, we used the term "imposter participants."
The term "imposter participants" was first introduced by Chandler and Paolacci (2017) in the context of online survey studies. "Fraud" is defined as "wrongful or criminal deception intended to result in financial or personal gain," whereas "imposter" is defined as "a person who pretends to be someone else in order to deceive others" (Oxford English Dictionary). We
believe the term “imposter participant” more accurately describes the dishonest, fraudulent,
fake, or false participants in qualitative research because they completely fake their identities
or exaggerate their experiences in order to participate in qualitative studies. The overconfidence
that a participant must have to volunteer for a study where they will be interviewed shows the
lengths to which they will go to fabricate or elaborate a specific persona to purposefully deceive the
researcher. Yet little has been published on how qualitative researchers should handle this
unique methodological challenge, especially as it relates to virtual interviews for data collection
in qualitative research.

Imposter Participants in a Doctoral Study

This paper comes from the lessons learned from one qualitative doctoral study where
the student (JMR, first author) and her committee chair (DJH, second author) determined that
the novice researcher's study design allowed for the inclusion of imposter participants, which threatened data quality. In the doctoral study, JMR explored how secondary educators use
social media to influence students’ empowerment skills (Roehl, 2021). JMR used a basic
qualitative approach (see Percy et al., 2015) with semi-structured interviews completed via
Zoom or telephone. Participants were recruited online from JMR’s social media professional
learning network and a university’s participant pool website. One of the methodological
assumptions was that participants would be honest during participant recruitment and
interviews. However, an unusual circumstance occurred during data collection when JMR
suspected that imposter participants had volunteered and been interviewed.
In JMR’s study, participants self-identified that they were classroom educators,
instructional coaches, or media specialists who work directly or indirectly with students in
grades six to twelve. Participants also self-identified that they had used social media with
secondary students at some point in the last five years in ways related to JMR’s research
questions. The research questions explored how educators describe their experiences helping
students learn to use social media to find credible information to understand diverse
perspectives about community issues, decide where they stand on issues, and then publish ideas
on social media to advocate for student opinions. Participants were identified from JMR’s
online professional learning network and a university’s participant pool. Both have an
international community. On social media, JMR shared an infographic with a link to a Google
form if the potential participant was interested in learning more about the study. Before
potential participants viewed the informed consent form, they answered questions to determine
if they met inclusion criteria for the study. Those who stated they met the inclusion criteria
then read the informed consent form, and those who wanted to participate gave their implied
consent by selecting “I consent” on the Google form. These recruitment procedures were also
designed to capture a broad range of perspectives: because participants could find the infographic online, recruitment reached a global audience.
Once participants were identified, JMR began data collection with eleven of twelve
participants opting for Zoom interviews. The twelfth opted for a phone call. To meet IRB
privacy standards and to ease Zoom fatigue that educators were facing during the COVID-19
pandemic, JMR gave participants the option to have the camera off or on during the Zoom
interview. After the fifth virtual interview, JMR began to suspect that although participants had self-identified as meeting the inclusion criteria, some were exaggerating their experiences or perhaps even completely faking their identities as educators. JMR shared a 13-page reflexive journal entry with her dissertation committee that
outlined her suspicions that she had interviewed imposter participants. JMR suspected that
people were faking an educator identity to earn a $20 Amazon gift card. JMR was aware that
scholars have warned that incentives can impact data quality (see Jones et al., 2021; Levi et al.,
2022; Patton, 2015; Williams & Walter, 2015); however, she had not anticipated the extent of
imposter participants. In her journal, she posited that the COVID-19 pandemic had perhaps made people more comfortable with Zoom and that this comfort, combined with a struggling international pandemic economy, increased the likelihood of imposter participants. It is possible that the pandemic created a situation in which marginalized and vulnerable individuals' economic circumstances were so critical that they were willing to participate in a research study solely for the compensation (Newman et al., 2021). Although a $20 Amazon gift card might
not seem like a lot of money, imposter participants could have been from other countries where
the U.S. dollar is stronger since the recruitment infographic was shared internationally. Also,
in the reflexive journal, JMR mused that she might have had duplicate participants who were
completing more than one interview, which would make the compensation even more alluring.
JMR supported her suspicions of imposter participants by detailing recruitment and
data collection patterns in her reflexive journal. During recruitment, each of the suspected
imposters had provided Google Voice numbers that did not receive phone calls or text
messages. In JMR’s study design, she did not include a step where she called the participant to
set up an interview. Instead, this was done through email. Calling participants to set up an
interview could have been a first step in determining participant legitimacy. One device can
have multiple Google Voice numbers. However, a potential imposter is less likely to have multiple functioning cell phones or landlines, which would limit their ability to create multiple
identities. Another recruitment procedure implemented to protect participant privacy that
paved the way for imposter participants to enter the study was that JMR did not require
participants to provide social media account information, even though the study was about
social media use. The demographic questionnaire had an item where potential participants
could volunteer social media handle names to showcase accounts with educator and student
work. None of the five suspected imposter participants shared social media names on the
demographic questionnaire whereas the seven participants deemed to be trustworthy during
data analysis had shared social media account information. The imposters did not volunteer
any social media names during the interview either, which varied greatly from the seven other
participants who were eager to showcase educator and student work on social media during the
interviews. During data collection, JMR applied for and was granted an IRB change of
procedures to require that social media account information be provided during recruitment as
one way to verify that participants met the inclusion criteria. Not including a verification step is
an example of a novice researcher erring on the side of participant privacy concerns, resulting
in imposter participants entering a study.
In addition to similarities of how the suspected imposter participants entered the study
during recruitment, JMR’s reflexive journal detailed patterns that emerged during data
collection for the suspected imposter participants. These similar patterns led JMR to suspect
that one participant may have been completing multiple interviews to increase the value of
Amazon gift cards. First, it was suspicious that the interviews were all completed at 9:00 a.m.
Eastern U.S. time at the request of the participant. Second, the suspected imposter participants
each had a baby crying in the background during the interviews. When asked if they would like
to continue the interview at another time because of the baby crying, all replied that they wanted
to continue. The researcher suspected that it went beyond coincidence that all four suspected imposters had babies crying in the background and responded that they wanted to continue the interview. Third, the
imposters all kept their cameras off for the entire Zoom interview compared to the trustworthy
participants who all volunteered to have their cameras on. Fourth, during the interview warm-
up questions, all the suspected imposter participants said that they were teaching at a New York
City school but did not volunteer any specific school information that could be verified online
despite follow-up prompting. Three participants said they taught at a high school, but the name
of the school was inaudible. When JMR asked them to repeat the school's name, one participant said, "It's in the Bronx, New York," another participant said, "It's a high school in New York," and the third participant said, "It's in Brooklyn." In the fourth interview with what JMR suspected
was the same participant, the participant did provide a specific high school name in New York.
Although the fourth interview with the suspected duplicate participant included a
specific high school name, the participant was suspect for yet another reason. The participant
entered the Zoom room for the interview with the screen name of "Jane" (pseudonym), a participant who was scheduled for an interview four days later. When JMR asked for clarification,
the participant replied, “I am using my cousin’s account. My name is Harley (pseudonym).”
This use of the cousin’s computer bolstered JMR’s suspicion that one person was using one
device with multiple names and Google Voice numbers to complete multiple interviews.
During a series of Zoom meetings, JMR discussed the entries from her reflexive journal
with DJH, her dissertation supervisor. These conversations led to a “real-time approach” to
methods adaptation because of the issues that arose from problems in the study design (see
Ravitch & Carl, 2021, p. 228). Specifically, having participants self-identify as meeting
inclusion criteria without a verification step allowed for imposter participants to be
interviewed. DJH suggested that JMR file a “change in procedures” with the IRB to require
participants to share social media account information to verify their identity. The IRB change
of procedures also addressed what to do with fraudulent data. The request read in part:

I believe a few of my participants … could potentially have faked or exaggerated their personas and did not really match with the inclusion criteria
on my recruitment Google form … My chair and methodologist are helping me
decide how much of my interview data needs to be thrown out.

In addition to the IRB-approved changes to methodology, DJH also advised JMR to ask
even more probing questions designed to check for “candor and consistency” in the remaining
interviews (see Rubin & Rubin, 2012, p. 60).
Once the IRB granted JMR a change of procedures requiring participants to provide social media account information or other evidence of their identity as secondary educators, JMR emailed Jane multiple times asking for such identifying information. JMR could not call the participant since they had provided a Google Voice number on the demographic questionnaire.
After not receiving any communication from Jane, JMR canceled the interview. Overall, since
these imposter participants were not required to and had not volunteered school names and did
not provide social media account information, JMR could not verify that they were secondary
educators who used social media with students. Again, the effort to protect participant privacy
may have encouraged imposter participants to enter the study.

Upon suspecting that imposters had entered her study, JMR compared the verbatim transcripts to identify inconsistent or contradictory interview responses from the suspected imposter participants and documented them in the reflexive journal entry she shared with her dissertation committee. For example, one suspected imposter answered "yes" to the following self-selecting inclusion criteria questionnaire item: "I have used social media to help students publish content about current, local social and political issues." However, when asked during the interview, "was there any time you had students post or create their own content?" the participant responded, "we have never posted." Another suspected imposter detailed a lesson about the presidential inauguration that students had completed in class a few weeks earlier. Since a baby had
been crying throughout the interview, JMR asked for clarification with “are you teaching right
now?” The participant responded: “as of now, no.” Overall, these patterns of contradictory
participant behavior during data collection were the most concerning aspects of JMR’s
reflexive journal entries.
Once imposter participants had been identified, JMR, along with her dissertation
committee, had to determine what to do with data from these participants. In dealing with face-
to-face interview data that was implausible, Flicker (2004) suggested three approaches
qualitative researchers might use. The first is the “cynic” approach where the researcher
assumes that the entire interview was fabricated and therefore the entire transcript is excluded
from data analysis. The second approach is the “skeptic,” where the researcher assumes that
only parts of the story are fabricated. In the skeptic approach, the data are included in data
analysis, but careful comparisons are made to see how different the responses are as compared
to more trustworthy participants. The third approach is the “seeker,” where the researcher
assumes that since the participant self-selected, the story deserves to be heard and so the data
are included for analysis.
The evidence of imposter participants in JMR’s study was clear. Between the
inconsistencies and contradictions identified in the transcripts, notes from her researcher’s
reflexive journal, and the participants’ inability to share details about the phenomenon being
discussed, JMR and her committee felt the cynic approach was warranted in this situation.
Because JMR had already started coding transcripts from several of the imposter interviews,
she stopped data analysis and removed the imposter data. Because the thin descriptions in those
interviews likely influenced coding, JMR threw out her first codebook, started a new one, and
moved forward developing new codes using only the trustworthy participant data. Had JMR chosen the skeptic approach to the suspected imposter participant data, she would likely
have had to “cherry pick” data (Morse, 2010), only coding excerpts that were not contradicted
anywhere in the other data. But this is problematic because the existence of one logical
inconsistency “casts doubt on the entire narrative” (Flicker, 2004, p. 532). JMR never
considered the seeker approach since the data were likely from imposters, who likely did not
have the experiences being explored in the study.
To be transparent in the reporting of study results regarding recruitment, data collection
and data analysis issues with imposter participants, JMR included an “Unusual Circumstances”
section in her final dissertation so that readers could understand the context of her data. In this
section, JMR discussed how she decided which participant data to keep in the study for data
analysis and trustworthiness. The section reads in part:

I ultimately decided to eliminate data from participants that I could not completely trust. I was able to verify that seven participants were candid,
consistent, and trustworthy in their responses. I could verify that these seven
met the inclusion criteria for my study through a review of their social media
accounts and other documents that they volunteered during their interviews. I
also did not find any contradictions in the interview responses of these seven
participants.

Based on our experiences as the doctoral student and supervisor involved in this
qualitative study with imposter participants, in the next section we offer suggestions for
preferred methodological choices that could mitigate these problems for future researchers.
Our discussion is organized according to the following phases of qualitative research:
recruiting participants online, collecting virtual interview data with checks for participant
honesty, and analyzing data to ensure trustworthiness.

Recruitment

Recruiting participants online and using social media as part of the process has many
benefits, especially with targeted recruitment that can lead to shorter recruitment periods
(Gupta, 2017; Saberi, 2020). During the COVID-19 pandemic, researchers had to either recruit
online or postpone their research until they could conduct face-to-face interviews (Schlegel et
al., 2021). For online recruitment, researchers can post an infographic or digital flyer on social
media pages, even contacting groups to share the infographic (Webber-Ritchey et al., 2021).
Information provided in the infographic should include the general purpose and topic of the
study as well as the inclusion criteria for those who are being invited to participate and
incentives being offered to participants. It is best practice to obtain permission to post from the
Facebook or LinkedIn page moderators.
To better target the needed participants, researchers should carefully identify and apply
octothorpes, more commonly known as hashtags, that the desired audience is likely to see (Webber-Ritchey et al., 2021). Hashtags are helpful for recruitment because social media users follow hashtags to find content on specific topics, which is different from "following" an individual. Therefore, using hashtags gives researchers access to a wider global audience
centered on a single topic (Kobakhidze et al., 2021). Researchers can even create webpages, or
Facebook or LinkedIn pages with information where participants can go to learn more about
the study (Kozinets, 2020; Schlegel et al., 2021). Schlegel et al. (2021) advised that researchers
can utilize the snowball sampling technique to recruit those who fit inclusion criteria. The
nature of social media, with its sharing of posts and retweeting, allows for organic snowball sampling.
Although recruiting on social media can be successful (Webber-Ritchey et al., 2021),
allowing participants to self-identify that they meet inclusion criteria without a researcher
verification step is problematic. Potential participants may attempt to retake the screening
survey until they “learn” how to answer to be eligible to participate (Saberi, 2020).
Additionally, screening questions may cause people to lie since they think that a certain
demographic profile is necessary to qualify for the study (Chandler & Paolacci, 2017). For
example, during JMR’s study, imposter participants knew that the study was seeking educators
who had used social media with students. Therefore, any imposters seeking financial reward
could exaggerate their experiences on the inclusion questionnaire to show that they had used
social media in the ways discussed in the study description. Additionally, participant
dishonesty may occur more often with online recruitment because it is harder to verify
participant identity, and even a small amount of participant dishonesty can skew data (Chandler
et al., 2020). If a researcher is conducting face-to-face recruitment, an imposter participant
would have a harder time completing more than one demographic questionnaire, whereas in
online recruitment, an imposter can create multiple identities and emails. Therefore,
researchers need to develop recruitment procedures that make faking an identity more difficult
for imposter participants, so that they do not choose to volunteer for study participation.

One way to decrease online imposter participants is to require potential participants to provide documentation that they meet the study's inclusion criteria (see Saberi, 2020). For
example, if one of the inclusion criteria is that participants must be a middle school teacher, a
verification question might ask participants to share a link to the school’s website that lists their
name for identity verification. JMR did not require that potential participants share their
websites, social media accounts, or other written documentation to show they used social media
with secondary students. A verification step that delves into participant privacy raises ethical
concerns, and these concerns regarding online qualitative data collection have not been fully
explored in the literature (Jones et al., 2021). Researchers should determine the absolute
minimum amount of information needed to verify a participant meets inclusion criteria to limit
personal information obtained (Jones et al., 2021). These additional identity verification steps
should be submitted to IRB as part of the approval process and disclosed in the letter of consent.
Ethically, it is important to be transparent about what identifiable information will be
collected and what information will be used for confirming study eligibility versus information
that will be reported as part of the study (Kaiser, 2009). Clearly explaining the verification step
may discourage imposters from entering the study. Here is some suggested wording for the
letter of consent:

If you volunteer to participate, I will ask additional screening questions to confirm that you fit the study's inclusion criteria. The responses to these
eligibility screening questions will remain confidential and will not be used as
data in the study but only to confirm your identity and to discourage imposter
participants.

When requesting online identity information as part of participant verification, the letter
of consent should include a clear description of the steps the researcher will take to maintain
participants’ confidentiality (see Gupta, 2017). In addition to the elements usually included in
a letter of consent, in studies that recruit online for participants we suggest that the letter of
consent contain an assurance for potential participants that their real-life and online identities
will be protected. For example, a letter of consent might include a sentence such as: “Although
you may have been recruited through public social media sites, your social media handles and
online personas will be as equally protected as your real name.” Additionally, if, as part of the
study, researchers plan to quote or even paraphrase from participants’ social media published
posts, an explanation of protections should be noted in the letter of consent (see Roberts, 2015).
Saberi (2020) suggested that another possible solution to potential recruitment
problems is that researchers could ask final verification questions via a phone call. This strategy
might also be used to discourage imposter participants. If JMR had completed a live phone call
for verification and to set up a time for the teleconferencing interview, she could have
potentially eliminated imposter and duplicate participants. In addition, she could have used this
phone call to start building rapport with participants (see Jenner & Myers, 2019).
Verification is important because online participants can create multiple email
addresses and aliases in order to take a survey multiple times (Lawlor et al., 2021). Verification
is also important in qualitative interview studies to make sure that participants not only meet
the inclusion criteria but also to avoid duplicate participants. Jones et al. (2021) reported that
participants might submit multiple responses in quantitative and asynchronous, qualitative
studies for financial gain. Although there is an overall gap in the literature on imposter
participants in synchronous, qualitative studies, some suggestions are emerging in the
literature. Jones et al. (2021) suggested that researchers send their recruitment flyers only to
people personally known to the researcher. However, limiting a study to a convenience sample
may introduce additional methodological design weaknesses. Researchers could also examine
the IP addresses of potential participants to check for any duplicates that could signal that one
person is pretending to be multiple participants for a financial reward (Chandler et al., 2020).
If IP address verification is conducted, researchers must be sure that IRB approves this step
since it does infringe on participant privacy. Overall, IRB approval involves careful planning
that reconciles a participant’s right to privacy with a researcher’s need to obtain honest data
and not include imposter participants in a study. The U.S. Office for Human Research
Protections has suggested that the need for identity confirmation from online research participants should take into account its importance to the research, such as its relation to critical inclusion criteria and whether the study is likely to attract repeat or fraudulent participants (Secretary's Advisory Committee on Human Research Protections, 2013). With careful planning for recruitment, researchers can successfully navigate IRB approval and protect participants appropriately while still decreasing the possibility of imposter participants.
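For researchers who do obtain IRB approval to examine IP addresses, the sketch below illustrates one way such a duplicate check might look. It is a minimal example only: the file name ("screening_responses.csv") and column names ("ip_address", "name") are hypothetical and would need to match however a study's screening responses are actually exported.

```python
# Minimal sketch: flag screening submissions that share an IP address.
# File name and column names are hypothetical; an IRB must approve
# collecting and inspecting IP addresses before a check like this is used.
import csv
from collections import defaultdict


def find_duplicate_ips(path):
    """Return {ip_address: [respondent names]} for IPs used more than once."""
    submissions = defaultdict(list)
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            submissions[row["ip_address"]].append(row["name"])
    return {ip: names for ip, names in submissions.items() if len(names) > 1}


if __name__ == "__main__":
    for ip, names in find_duplicate_ips("screening_responses.csv").items():
        print(f"Review needed: {ip} appears in submissions from {', '.join(names)}")
```

A shared IP address is not proof of a duplicate participant (for example, colleagues at one school may share a network), so flagged submissions should prompt follow-up rather than automatic exclusion.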
One benefit to recruiting online is that interested participants can click on a link to learn
more about a study without being “known” to the researcher. This allows for potential
participants to learn more about a study without feeling coerced to participate (Kozinets, 2020).
However, the anonymity and additional information about the study might provide imposter
participants with enough information that they know how to answer the eligibility screening
questions. For example, in JMR’s study, potential participants were informed about the four
research questions so that they could begin thinking of general ideas about social media before the interview. Researchers should carefully weigh how much information,
beyond the inclusion criteria, should be divulged prior to interviews in an attempt to dissuade
imposter participants.
Another benefit of online recruitment is that online forms can be created with logic
branching that includes conditional questions that lead potential participants to different pages
based on their responses; JMR found Google Forms easy to use for this purpose. Figure 1 shows
a flowchart of an online form that researchers could create and use for the recruitment and
informed consent phases. Potential participants view the invitation in various online spaces,
such as Twitter or Facebook, and follow a link to the online form. The first section of the form
is the self-selecting inclusion criteria questionnaire, where potential participants answer a series
of questions to confirm that they do indeed meet the inclusion criteria. When creating the form,
researchers can have participants exit the form if they do not meet the study’s inclusion criteria.
Those who self-select and meet the criteria move to the next section of the form where they
read the full letter of consent. At the bottom of that page, potential participants indicate whether
they consent to participate or not and land either on the verification page or an exit page.
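For researchers who want to prototype this branching logic before building it in a form tool, the following sketch expresses the flow in Python. It is illustrative only: the question keys, answer values, and page names are hypothetical and would need to mirror a study's own IRB-approved screening and consent instrument.

```python
# Hypothetical sketch of the recruitment-form branching described above.
# Question keys, answer values, and page names are illustrative only.

def next_page(responses):
    """Route a potential participant to the next section of the online form."""
    meets_criteria = (
        responses.get("is_secondary_educator") == "yes"
        and responses.get("used_social_media_with_students") == "yes"
    )
    if not meets_criteria:
        return "exit_page_not_eligible"   # exits before seeing the consent letter

    if responses.get("consent") != "I consent":
        return "exit_page_declined"       # read the consent letter but declined

    return "verification_questionnaire"   # contact info and identity evidence


# Example: an eligible, consenting respondent is routed to verification.
print(next_page({
    "is_secondary_educator": "yes",
    "used_social_media_with_students": "yes",
    "consent": "I consent",
}))
```

Writing the routing out this way can also help a doctoral student and committee agree on exactly which answers end the form and which lead to the verification step before the form is submitted for IRB review.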
The verification questionnaire should include required fields where participants provide
their names and contact information (i.e., email, phone number, and/or social media
information) as well as questions that can be used to verify the aspects of the participants’
identities and experiences that are critical to the inclusion criteria. Participants might also be
required to upload evidence that could provide verification of meeting the inclusion criteria.
For example, if the study is recruiting teachers who have used problem-based learning,
potential participants could be asked to upload a lesson plan to provide evidence that they have
experience in the phenomena being explored. Researchers might also consider including an
open ended “grand tour” question about the study’s topic in this verification stage. This would
allow researchers to gauge how knowledgeable and expressive participants are on the topic
(Humphrey, 2021; Stofer, 2019). The grand tour question might also provide clues to imposter
participants. Once participants submit their responses at the end of the verification stage, they
land on an exit page that should provide information on what they can expect from the
researcher.

Figure 1

Flowchart of Online Recruitment for Qualitative Participants

Although online recruitment procedures are efficient and can reach a broad audience,
there may be a place for personal contact during the recruitment stage of a qualitative study.
As a novice researcher, JMR was focusing on efficiency with the Google form for recruitment
as well as developing a procedure to reduce barriers to participation. JMR’s demographic
questionnaire could have ended with a note saying that “the researcher will call you to set up
an interview and get your first thoughts on the topic.” This phone call would have allowed JMR
to verify that participants were not using one device for multiple identities, and the call could
have served as the place to ask the grand tour question. Of course, a script of that phone call
and any data analysis procedures for the grand tour question would need IRB approval, similar
to an interview protocol. Overall, if researchers plan for the drawbacks of online recruitment
and include methods to protect against imposter participants, the benefits of online recruitment
in terms of efficiency and global reach can be realized.

Data Collection

Researchers can also address methodological challenges related to imposter participants during the data collection process. Even when preventative measures have been
taken, it is possible that imposter participants may be included in a study. Therefore,
researchers need to use the data collection process to further confirm the trustworthiness of
their participants and the validity of the data.
One major consideration for researchers who use teleconferencing interviews (for
instance, through Zoom) is weighing respect for the participant's right to privacy, which includes the option of keeping their camera off, against the benefits of seeing the participant during the interview. Previous research has
shown that when participants’ cameras were on during virtual interviews, researchers had
stronger rapport and were able to consider nonverbal communication compared to when
interviews were conducted with the camera off (Jenner & Myers, 2019; Krouwel et al., 2019).
Heath et al. (2018) found that telephone interviews were less engaging and prevented researchers from reading nonverbal cues. A benefit of Zoom interviews over phone interviews
is that the camera allows researchers “to establish a connection with interviewees through
seeing one another’s faces and expressions” (Kobakhidze et al., 2021, p. 5). Pre-pandemic,
researchers would likely not have video-recorded interviews, only audio-recorded them (Newman et al., 2021). Although JMR used Zoom for most of the interviews, the IRB required specific
methodological justification for why researchers would require cameras be on during
teleconferencing interviews to protect participant privacy, so JMR did not require participants
to show their faces. In communications with consenting participants, JMR indicated that she,
as the researcher, would have her camera on for the virtual interview, but she gave each
participant the choice as to whether they had their camera on or off. In addition to IRB
reluctance to require cameras be on, JMR provided this courtesy since educators may have
been suffering from Zoom fatigue because many were participating in distance learning at the
time of data collection. But requiring cameras to be on might reduce the likelihood of imposter
or duplicate participants from entering the study.
To mitigate privacy issues but still provide opportunities for visual engagement, and to
deter duplicate participants, researchers might require that participants' cameras be on momentarily but then give them the option of keeping their cameras on or off for the remainder of the interview.
Wording in the informed consent might read:

Interviews will take place in Zoom. I will have my camera on during the
interview, and I will want to see your face for at least a few moments. But you
may choose to have your camera on or off for the remainder of the interview. I
will audio record the interview for transcription purposes, but the video
recording will not be saved.

In JMR's study, the five participants who were later determined to be imposters chose
to keep their cameras off during the interviews. After reviewing her reflexive journal, JMR
came to suspect that the same person was completing multiple interviews, a situation that could
have been avoided had the participants been required to show their face even for just a moment
during the virtual interview. Ultimately, researchers need to consider how the requirement of
camera use during virtual interviews may help ensure the quality of participants who may
consent to be part of the study. If researchers feel strongly about having the camera on to build
rapport and help ensure that they do not have duplicate participants, they could engage with
IRB members to explain their justification for using the camera during virtual interviews. If
IRB does require that having a camera on be optional, the researcher could seek IRB approval for IP address analysis. Some subscription levels of Zoom teleconferencing software capture the
IP addresses of meeting attendees, so researchers could investigate and purchase
teleconferencing software with an IP address capture option.
During JMR’s dissertation study, she attempted to contact the suspected imposters after
the interviews. Communication could only occur through email since the phone numbers they
had provided were Google Voice numbers that did not receive calls or texts. As part of the
informed consent of the study, participants had agreed to participate in a review of their
verbatim transcript as part of data confirmation procedures. JMR emailed the verbatim
transcripts to all participants. None of the imposters responded to the email whereas all seven
of the other participants replied to the email with comments. JMR also emailed all participants,
including the suspected imposters, a summary of her dissertation.
JMR received no responses to this summary from the suspected imposters. In fact, the
only follow-up email JMR ever received from a suspected imposter participant was an email
shortly after the interview concluded asking when she would receive the Amazon gift card. In
retrospect, only including a participant transcript review instead of full member checking was
a novice researcher mistake (Carlson, 2010). Perhaps requiring more participative member checking and having participants co-create knowledge with the researcher (see Slettebø, 2021) would have discouraged imposter participants from volunteering for the study.
Another way researchers can protect themselves from imposter participants is through
a carefully constructed interview protocol. Researchers should carefully plan for the interview:
develop a protocol, test the protocol, and practice interviewing (Castillo-Montoya, 2016;
Yeong et al., 2018). Interviews can be started by repeating questions related to inclusion criteria
because imposters may have forgotten how they answered questions from the self-selecting
inclusion criteria questionnaire. In addition to asking interview questions aligned to the
research questions, researchers can include an open-ended question to start the interview and
get the participant talking more generally about the topic (Stofer, 2019). In the interview
protocol, researchers should consider adding one question that an imposter would have
difficulty answering. For example, DJH advised JMR to ask the follow-up question: “What
was your lesson objective?” Imposter participants struggled to come up with a lesson objective,
often pausing for a while and then asking JMR to repeat that question. Even with the question
repeated, imposter participants struggled to explain lesson objectives compared to the
participants in JMR's study who were deemed trustworthy. Below is one
illustrative example from JMR’s study of how a suspected imposter participant responded to
the lesson objective question without naming a specific objective:

Interviewer: When you encourage students to post to social media, what was
your main objective for them doing that?
Participant 2 at 20:52: For me it was mainly to get the, to get the general ideas
of people. And to like focus … the students should also seem focused by nature.
And this focus can be seen from what they are posting, you know. That
encourages them to think about important aspects about the economy, about life,
and not just post things that are not necessary.

For illustrative and comparative purposes, one trustworthy participant explained a lesson
objective with the following response:

We use the Common Core . . . An eighth-grade standard. I bet it’s even an ELA
standard. I know it’s: Identify multiple sources and synthesize a response. . .
Isn’t that the goal of teaching English for them to reason and think and use their
communication skills. I think that’s like the four C’s of Common Core –
creativity, communication.

Participants have a range of experiences, and some will answer interview questions in
more detail. When participants do not provide clear answers, they are not necessarily lying or
exaggerating experiences (Humphrey, 2021). However, researchers could use unclear
responses as an opportunity to probe for consistencies across the entire interview. Researchers
should apply critical listening skills during the interview and probe with follow up questions
to clarify inconsistencies. Morse (2000) suggested that qualitative research is a bit like being a
detective, and that during semi-structured interviews, researchers should “follow clues, check
out leads, confirm facts, and keep your wits about you until everything falls into place and
makes sense” (p. 579).
Throughout the interview, as all good qualitative researchers do, it is important to use
probing and clarifying questions when the participant’s responses are vague, bizarre, or
nonsensical (Jones et al., 2021). Each of the participants JMR identified as imposters answered questions with generalities, struggled to provide details when asked, and were inconsistent, even contradicting themselves, within the interview. Gupta (2017) suggested that
it is a good idea to ask similar questions in varying ways to check for discrepancies.
Similar to Flicker (2004), JMR, in her effort to provide a safe and welcoming space during the interview, may not have asked as many confrontational or clarifying questions as she should have to determine whether a participant was lying or whether there had been a miscommunication. Researchers should listen for contradictions in what participants
share, and then use prompts such as, “Earlier you mentioned…now you shared…could you
clarify?” Although struggling to answer questions might indicate participant jitters because
they are talking to a stranger, it is still important that the researcher have assurance that the
participants have the experiences and expertise being explored in the study to help ensure
confidence in the data.
In the pursuit of a reflexive approach to qualitative research, the reflexive journal and
audit trails are important tools researchers can use for ensuring trustworthiness (Kennedy,
2020; Nowell et al., 2017; Ortlipp, 2008). Thoughts and observations related to the potential
for imposter participants should be a topic that researchers include in their journal and audit
trails. Immediately following the interview, researchers should reflect not only on major
themes they heard from each participant but also on the participant’s honesty. Recording “gut
feelings” or “reflective commentary” (Shenton, 2004) about a participant’s ability to answer
questions and notes on any nonverbal observations made during the interview will aid
researchers in identifying any issues of trustworthiness in participants. Lemon (2017)
suggested that qualitative research could benefit from “incorporating mindfulness interventions
as a tool for data collection and analysis by actively paying attention on purpose, in the present
moment” (p. 3308). Reflexive journals can be typed in Word documents or handwritten.
Researchers might also consider reentering an “empty” Zoom room immediately following the
virtual interview to record their reflections on participant’s responses and behaviors. The audio
file can then be transcribed and added as an entry to the reflexive journal. It was this journal, which JMR brought to her dissertation committee to discuss her suspicions of imposter and duplicate participants, that ultimately led the committee to decide that a change in procedures was appropriate.

Data Analysis

Qualitative researchers should continue to check for imposter participants during the
data analysis stage. Flicker (2004) used interview transcripts to identify specific instances of
inconsistent or contradictory responses when, intuitively, the author knew something did not
make sense during the interview. For example, researchers should determine if references to
times and events are reasonable and logical. At one point in an interview, one of JMR’s
participants shared information about a classroom activity she had done “just a few weeks ago”
but then later in the interview stated that she was currently on maternity leave and had not been
in the classroom for months. Another method is to re-listen to the audio of the interviews and
pay close attention to participants’ pauses or stumbling over words, particularly after being
asked clarifying or probing questions. During data analysis, researchers should also explore
whether the participants were consistent in their answers when asked similar questions in
different ways. These behaviors and actions, alongside inconsistencies in transcripts, could
indicate a nervous participant or poor interviewing skills (Turner, 2010). Alternatively,
inconsistencies may provide evidence of an imposter. To increase confidence in the quality of
the data, researchers should design their data analysis with some check for inconsistencies
while also realizing that other reasons for inconsistent responses exist. During coding, JMR
developed a code named “inconsistent response” and could share those text segments with her
committee in order to decide whether or not the participants were imposters and then what to
ultimately do with the data.
Data analysis plans often employ constant comparison of verbatim transcripts
(Merriam, 2001). Constant comparison is an inductive analysis that the researcher begins with the
first interview and then continues by comparing subsequently gathered data to previously
analyzed data (Patton, 2015; Percy et al., 2015; Ravitch & Carl, 2021). During constant
comparison when coding the fifth interview, JMR noticed even more inconsistencies in the
verbatim transcripts from suspected imposter participants. Constant comparison revealed that
interviews with the two experienced participants who were personally known to JMR lasted
the entire scheduled 60 minutes. In comparison, the three interviews with imposter participants
lasted 24 to 41 minutes. Although interview length alone is not proof that a participant is an
imposter, interview length should be part of a researcher’s reflexive journal and can be used
alongside other facts to determine honesty of participants.
Additionally, the trustworthy participants in JMR’s study provided many examples of
specific social media teaching strategies with plenty of thick descriptions, which are an
important part of qualitative research (Patton, 2015; Ravitch & Carl, 2021). The lack of thick
description in interviews might suggest that the participant is an imposter. In JMR’s reflexive
journals, she noted the lack of thick description from imposter participants; in a journal entry
shared with her committee, JMR wrote, “These participants did not provide much thick
description despite being prompted with many follow-up questions.” The trusted interview
transcripts also echoed details about social media that JMR had included in her literature
review, whereas imposter participant transcripts were general and hypothetical. This
connection of interview responses to the literature is one way to determine whether the data
are what were expected (see Lawlor et al., 2021). Overall, thick description and connections to
the literature should be part of a data analysis plan to build trustworthiness in the data.

Conclusion

Lawlor et al. (2021) posited that it is important to predetermine the level of suspicion
required to remove potentially unreliable data. However, predetermining a level of suspicion
is more difficult in qualitative research. Researchers conducting qualitative studies should be
mindful of the potential for imposter participants, carefully plan for this possibility, and
determine how to ensure trustworthy data. During the research process, suspicions of imposters
should be discussed with the dissertation committee, research team, or trusted research
colleagues to reach a consensus regarding the exclusion of responses (see Jones et al., 2021).
Discussions should be part of the planning stage for a study and continued as circumstances
arise during recruitment or data collection. Some questions researchers may use to guide this
process include:

1. During recruitment, can I verify that the participant met the inclusion criteria?
How confident am I in this information?
2. During data collection, was the participant hesitant or flustered when asked
probing questions for additional detail? What were my first impressions of
honesty in my reflexive journal? Did I note any nonverbal cues that might be a
clue to participant dishonesty?
3. During data analysis, did I find places in verbatim transcripts where the
participant contradicted themselves during the interview? Were a participant’s
answers detailed enough that the participant seemed knowledgeable about the
topic?

When determining whether to exclude data, researchers need to be careful not to eliminate discrepant data from eligible participants, especially if researchers have certain expectations of the data (Jones et al., 2021). Even well-intentioned participants might be glossing over details for reasons such as not wanting to compromise other people's identities
or not wanting to share about their own behavior or thoughts around a phenomenon (Weiss,
1994). Just because a participant does not provide thick descriptions does not necessarily mean
they are an imposter. Not all interviews go well and result in illuminating data. Having a
predetermined level of suspicion will help researchers determine whether the participant is indeed an imposter or whether the interview simply did not yield substantive data. Once evidence
has met the level of suspicion and consensus has been reached, the decision needs to be made
whether only certain responses or all data from that participant should be excluded from
analysis. If the inconsistency is more than a single, minor occurrence, the trustworthiness of
the participant themselves comes into question and Flicker’s (2004) three approaches to dealing
with uncertain data (cynic, skeptic, or seeker) may need to be used to decide how to proceed.
Recruiting participants online and conducting virtual interviews are likely going to
become increasingly common practices for qualitative researchers because of issues related to
social distancing and people’s increasing comfort level with teleconferencing tools. All
qualitative researchers, including doctoral candidates and the faculty members who supervise
them, should work with IRBs to consider and prepare for the possibility of imposter participants
in qualitative studies, even those that include face-to-face interviews. In our paper, we have
outlined precautions that can be taken in the recruitment, data collection, and data analysis
phases to avoid the inclusion of imposter participants in qualitative studies. We have also
provided suggestions on how to identify imposters and determine what to do with imposter
participant data. However, we do not view our shared thoughts as the final word on best
practices. We share our experience to begin the conversation of how qualitative methods might
be changing due to a more virtual world.

References

Ahmad, F. H. (2020). Using video-conferencing for observations and interviews: Gathering data from "home" while studying in New Zealand. Waikato Journal of Education,
25(1), 109-116. https://doi.org/10.15663/wje.v25i0.758
Brown, A., & Danaher, P. A. (2017). CHE principles: Facilitating authentic and dialogical
semi-structured interviews in educational research. International Journal of Research
& Method in Education, 42(1), 76-90.
https://doi.org/10.1080/1743727X.2017.1379987
Carlson, J. A. (2010). Avoiding traps in member checking. The Qualitative Report, 15(5),
1102-1113. https://doi.org/10.46743/2160-3715/2010.1332
Castillo-Montoya, M. (2016). Preparing for interview research: The interview protocol
refinement framework. The Qualitative Report, 21(5), 811-831.
https://doi.org/10.46743/2160-3715/2016.2337
Chandler, J. J., & Paolacci, G. (2017). Lie for a dime: When most prescreening responses are
honest but most study participants are impostors. Social Psychological and Personality
Science, 8(5), 500-508. http://doi.org/10.1177/1948550617698203
Chandler, J., Sisso, I., & Shapiro, D. (2020). Participant carelessness and fraud: Consequences
for clinical research and potential solutions. Journal of Abnormal Psychology, 129(1),
49-55. https://doi.org/10.1037/abn0000479
Flicker, S. (2004). "Ask me no secrets, I'll tell you no lies:" What happens when a respondent's
story makes no sense. The Qualitative Report, 9(3), 528-537.
https://doi.org/10.46743/2160-3715/2004.1922
Gupta, S. (2017). Ethical issues in designing internet-based research: Recommendations for good practice. Journal of Research Practice, 13(2), 1-14.
Heath, J., Williamson, H., Williams, L., & Harcourt, D. (2018). “It’s just more personal”: Using
multiple methods of qualitative data collection to facilitate participation in research
focusing on sensitive subjects. Applied Nursing Research, 43, 30-35.
https://doi.org/10.1016/j.apnr.2018.06.015
Humphrey, B. (2021). Analysis & methods: Tips for recruiting participants for user research.
Method in Madness. https://dovetailapp.com/blog/recruiting-participants-user-
research/
Hydock, C. (2017). Assessing and overcoming participant dishonesty in online data collection.
Behavior Research Methods, 50(4), 1563-1567. https://doi.org/10.3758/s13428-017-
0984-5
Irani, E. (2019). The use of videoconferencing for qualitative interviewing: Opportunities,
challenges, and considerations. Clinical Nursing Research, 28(1), 3-8.
https://doi.org/10.1177/1054773818803170
Jenner, B. M., & Myers, K. C. (2019). Intimacy, rapport, and exceptional disclosure: A
comparison of in-person and mediated interview contexts. International Journal of
Social Research Methodology, 22(2), 165-177.
https://doi.org/10.1080/13645579.2018.1512694
Jones, A., Caes, L., Rugg, T., Noel, M., Bateman, S., & Jordan, A. (2021). Challenging issues of integrity and identity of participants in non-synchronous online qualitative methods. Methods in Psychology, 5, 1-5. https://doi.org/10.1016/j.metip.2021.100072
Kaiser, K. (2009). Protecting respondent confidentiality in qualitative research. Qualitative Health Research, 19(11), 1632-1641. https://doi.org/10.1177/1049732309350879
Kennedy, L. M. (2020). Confessions of a novice researcher: An autoethnography of inherent
vulnerabilities. The Qualitative Report, 25(6), 1526-1539.
https://doi.org/10.46743/2160-3715/2020.4263
Kobakhidze, M. N., Hui, J., Chui, J., & Gonzalez, A. (2021). Research disruptions, new
opportunities: Re-imagining qualitative interview study during the Covid-19 pandemic.
International Journal of Qualitative Methods, 20, 1-10.
https://doi.org/10.1177/16094069211051576
Kozinets, R. (2020). Netnography: The essential guide to qualitative social media research (3rd ed.). SAGE.
Krouwel, M., Jolly, K., & Greenfield, S. (2019). Comparing Skype (video calling) and in-person qualitative interview modes in a study of people with irritable bowel syndrome - An exploratory comparative analysis. BMC Medical Research Methodology, 19(1), 1-9. https://doi.org/10.1186/s12874-019-0867-9
Lauckner, C., Truszczynski, N., Lambert, D., Kottamasu, V., Meherally, S., Schipani-McLaughlin, A. M., Taylor, E., & Hansen, N. (2019). “Catfishing,” cyberbullying, and coercion: An exploration of the risks associated with dating app use among rural sexual minority males. Journal of Gay & Lesbian Mental Health, 23(3), 289-306. https://doi.org/10.1080/19359705.2019.1587729
Lawlor, J., Thomas, C., Guhin, A. T., Kenyon, K., Lerner, M. D., UCAS Consortium, & Drahota, A. (2021). Suspicious and fraudulent online survey participation: Introducing the REAL framework. Methodological Innovations, 14(3), 1-10. https://doi.org/10.1177/20597991211050467
Lemon, L. (2017). Applying a mindfulness practice to qualitative data collection. The Qualitative Report, 22(12), 3305-3313. https://doi.org/10.46743/2160-3715/2017.3161
Levi, R., Ridberg, R., Akers, M., & Seligman, H. (2022). Survey fraud and the integrity of web-based survey research. American Journal of Health Promotion: Critical Issues and Trends, 36(1), 18-20. https://doi.org/10.1177/08901171211037531
Merriam, S. B. (2001). Qualitative research and case study applications in education. Jossey-Bass.
Morse, J. M. (2010). “Cherry picking”: Writing from thin data. Qualitative Health Research, 20(1), 3. https://doi.org/10.1177/1049732309354285
Newman, P. A., Guta, A., & Black, T. (2021). Ethical considerations for qualitative research
methods during the COVID-19 pandemic and other emergency situations: Navigating
the virtual field. International Journal of Qualitative Methods, 20, 1-12.
https://doi.org/10.1177/1609406921104783
Nolan, M. P. (2015). Learning to circumvent the limitations of the written-self: The rhetorical
benefits of poetic fragmentation and internet “catfishing.” Persona Studies, 1(1), 53-
64. https://doi.org/10.21153/ps2015vol1no1art431
Nowell, L. S., Norris, J. M., White, D. E., & Moules, N. J. (2017). Thematic analysis: Striving
to meet the trustworthiness criteria. International Journal of Qualitative Methods,
16(1), 1-13. https://doi.org/10.1177/1609406917733847
Ortlipp, M. (2008). Keeping and using reflective journals in the qualitative research process.
The Qualitative Report, 13(4), 695-705. https://doi.org/10.46743/2160-
3715/2008.1579
Patton, M. Q. (2015). Qualitative research and evaluation methods (4th ed.). SAGE.
Percy, W. H., Kostere, K., & Kostere, S. (2015). Generic qualitative research in psychology.
The Qualitative Report, 20(2), 76-85. https://doi.org/10.46743/2160-3715/2015.2097
Ravitch, S. M., & Carl, N. M. (2021). Qualitative research: Bridging the conceptual,
theoretical, and methodological (2nd ed.). SAGE.
Roberts, L. D. (2015). Ethical issues in conducting research in online communities. Qualitative
Research in Psychology, 12(3), 314-325.
https://doi.org/10.1080/14780887.2015.1008909
Roehl, J. M. (2021). Secondary educator experiences using social media to influence students’
empowerment skills [Doctoral dissertation, Walden University]. ProQuest
Dissertations.
Rubin, H. J., & Rubin, I. S. (2012). Qualitative interviewing: The art of hearing data (3rd ed.).
SAGE.
Saberi, P. (2020). Research in the time of coronavirus: Continuing ongoing studies in the midst
of the COVID-19 pandemic. AIDS and Behavior, 24(8), 2232-2235.
https://doi.org/10.1007/s10461-020-02868-4
Salinas, M. R. (2022). Are your participants real? Dealing with fraud in recruiting older adults
online. Western Journal of Nursing Research, 0(0), 1-7.
https://doi.org/10.1177/01939459221098468
Schlegel, E. C., Tate, J. A., Pickler, R. H., & Smith, L. H. (2021). Practical strategies for
qualitative inquiry in a virtual world. Journal of Advanced Nursing, 77(10), 4035-4044.
https://doi.org/10.1111/jan.15000
Secretary’s Advisory Committee on Human Research Protections. (2013, March). Considerations and recommendations concerning internet research and human subjects research regulations, with revisions. Office for Human Research Protections. https://www.hhs.gov/ohrp/sachrp-committee/recommendations/2013-may-20-letter-attachment-b/index.html
Sha, L. K., Singh, D. R., & Sah, R. K. (2020). Conducting qualitative interviews using virtual communication tools amid COVID-19 pandemic: A learning opportunity for future research. Journal of Nepal Medical Association, 58(232), 1103-1106. https://doi.org/10.31729/jnma.5738
Shenton, A. K. (2004). Strategies for ensuring trustworthiness in qualitative research projects. Education for Information, 22(2), 63-75. https://doi.org/10.3233/efi-2004-22201
Slettebø, T. (2021). Participant validation: Exploring a contested tool in qualitative research. Qualitative Social Work, 20(5), 1223-1238. https://doi.org/10.1177/1473325020968189
Stofer, K. A. (2019). Preparing for one-on-one qualitative interviews: Designing and conducting the interview. UF/IFAS Extension. https://edis.ifas.ufl.edu
Turner, D. W. (2010). Qualitative interview design: A practical guide for novice investigators. The Qualitative Report, 15(3), 754-760. https://doi.org/10.46743/2160-3715/2010.1178
Webber-Ritchey, K. J., Aquino, E., Ponder, T. N., Lattner, C., Soco, C., Spurlark, R., &
Simonovich, S. D. (2021). Recruitment strategies to optimize participation by diverse
populations. Nursing Science Quarterly, 34(3), 235-243.
https://doi.org/10.1177/08943184211010471
Weiss, R. S. (1994). Learning from strangers. The Free Press.
Williams, E. P., & Walter, J. K. (2015). When does the amount we pay research participants
become “undue influence”? American Medical Association Journal of Ethics, 17(12),
1116-1121. https://doi.org/10.1001/journalofethics.2015.17.12.ecas2-1512
Yeong, M. L., Ismail, R., Ismail, N. H., & Hamzah, M. I. (2018). Interview protocol
refinement: Fine-tuning qualitative research interview questions for multi-racial
populations in Malaysia. The Qualitative Report, 23(11), 2700-2713.
https://doi.org/10.46743/2160-3715/2018.3412

Author Note

Jacqueline M. Roehl (ORCID ID: 0000-0003-2565-6666) taught high school English
for 21 years and was named the 2012 Minnesota Teacher of the Year. Jacqueline Roehl earned
a PhD in Education with a specialization in learning, instruction, and innovation from Walden
University in 2021. In her dissertation research she explored how secondary educators use
social media to influence students’ empowerment skills. Please direct correspondence to
jacroehl@comcast.net.
Darci J. Harland (ORCID ID: 0000-0001-9300-434X) is the Academic Program
Coordinator for the doctoral specializations in educational technology and design, and in learning,
instruction, and innovation in the College of Education at Walden University. She teaches and
is a subject matter expert for doctoral courses in her programs. She oversees curriculum
development, assigns course instructors, pairs students with their doctoral committees, and
supervises both PhD and EdD doctoral students.

Copyright 2022: Jacqueline M. Roehl, Darci J. Harland, and Nova Southeastern University.

Article Citation

Roehl, J. M., & Harland, D. J. (2022). Imposter participants: Overcoming methodological challenges related to balancing participant privacy with data quality when using online recruitment and data collection. The Qualitative Report, 27(11), 2469-2485. https://doi.org/10.46743/2160-3715/2022.5475
© 2022. This work is published under https://creativecommons.org/licenses/by-nc-sa/4.0/ (the “License”).