

Special Series: Contributions of Mixed Methodologies

Topics in Early Childhood Special Education, 2023, Vol. 43(3), 214–226
© Hammill Institute on Disabilities 2023
Article reuse guidelines: sagepub.com/journals-permissions
DOI: 10.1177/02711214231163720
tecse.sagepub.com

Early Childhood Professionals’ Reported Use of Culturally and Linguistically Responsive Practices During Initial Evaluations: A Mixed Methods Study

Elizabeth A. Steed, PhD, Rachel Stein, PhD, Heidi Burke, MSM, MA, CCC-SLP, and Renee Charlifue-Smith, MA, CCC-SLP
University of Colorado Denver, Denver, CO, USA

Corresponding Author: Elizabeth A. Steed, University of Colorado Denver, 1380 Lawrence Street Center, Denver, CO 80217, USA. Email: elizabeth.steed@ucdenver.edu

Abstract
We utilized a mixed methods design to analyze responses to a U.S.-distributed survey regarding early childhood professionals’
(N = 1,047) use of culturally and linguistically responsive practices during the initial evaluation for early intervention or
early childhood special education. Findings from the fully mixed concurrent equal status mixed methods design showed
that personnel used some culturally and linguistically responsive evaluation practices, such as using interpreters and
asking families about language use and routines at home. Other culturally and linguistically responsive evaluation practices
were used by fewer than half of survey respondents. Participants noted a lack of bicultural and bilingual staff, training,
materials, and other supports for implementing culturally and linguistically responsive evaluations. We discuss the need for
ongoing efforts to ensure equitable access to early intervention and special education services for racially and linguistically
marginalized young children.

Keywords
assessment, evaluation, early intervention, preschool, diversity, families

The U.S. population is increasingly racially and linguistically diverse. In 2020, for the first time in U.S. history, the White population decreased while the Hispanic, Asian, and Multiracial populations increased, and approximately 22% of the population spoke a language other than English (U.S. Census Bureau, 2022). Concomitantly, the populations of young children with disabilities being served in early intervention (IDEA Part C) and early childhood special education (IDEA Part B) are increasingly linguistically and racially diverse; in 2021, approximately 50% of young children with disabilities were from a historically marginalized racial background (U.S. Department of Education, 2021), an increase of approximately 3% from previous years. Furthermore, 21.5% of children under 6 years of age are dual language learners (DLL) living in homes where adults speak a language other than English (U.S. Census Bureau, 2022).

While the population of children and families seeking and receiving early intervention and preschool special education services is increasingly culturally and linguistically diverse, the early intervention (EI) and early childhood special education (ECSE) workforce continues to remain largely White, female, and English speaking (Bureau of Labor Statistics and U.S. Department of Labor, 2018). Cultural, racial, and linguistic differences between children, families, and EI and ECSE professionals may present barriers on both sides to communicating about the child’s development, sharing concerns, and successfully linking the child to EI or ECSE services (Quebles et al., 2022).

Throughout the article, we will use the phrase “racially and linguistically marginalized children and families” as opposed to “culturally and linguistically diverse children and families” to reflect the systemic inequalities and oppression that puts individuals in marginalized status rather than simple demographic differences. We will use the phrase “culturally and linguistically responsive practices” to refer to educational professionals’ intentionally adapted use of evidence-based practices to address all children’s instructional, social, and language needs (Linan-Thompson et al., 2018).
In this case, culturally and linguistically responsive evaluation practices are explored in the context of the initial evaluation for special education.

Culturally and Linguistically Responsive Evaluation Practices

The initial evaluation for EI or ECSE is an important point in time for professionals to utilize culturally and linguistically responsive practices, given that it is often a family’s first interaction with professionals who will support their child’s learning and development and the importance of the decision made from information gathered. Initial evaluations are conducted as part of the Individuals with Disabilities Education Improvement Act (IDEA, 2004, P.L. 108-446) that requires states to have a comprehensive system to locate, identify, and evaluate young children who would benefit from EI or ECSE. The initial evaluation must gather information using a variety of assessment tools and strategies, with a multidisciplinary team, in all areas of suspected disability, and about families’ daily routines, priorities, and concerns (de Sam Lazaro, 2017). Further, in order to obtain accurate evaluation results when assessing racially and linguistically marginalized children, the initial evaluation should be conducted in a culturally and linguistically responsive manner.

There are various ways for practitioners to conduct a culturally and linguistically responsive initial evaluation, including individualizing the process for the child and family using culturally appropriate strategies, materials, and tools and knowing when and how to collaborate with trained interpreters (Acar & Blasco, 2018; Banerjee & Guiberson, 2012; Cheatham, 2011; McLean, 2001; Nagle et al., 2020). When assessing young children during the initial evaluation for EI or ECSE, personnel should gather information about the child’s culture and home expectations to determine whether the child’s skills are due to cultural and linguistic factors or if the child is exhibiting a delay or disorder (Nagle et al., 2020; Ortiz & Wong, 2020). The Division for Early Childhood (2014) Recommended Practices highlight the need to use culturally relevant assessment materials and tools and to conduct the assessment in the child’s dominant language. Using culturally and linguistically unbiased assessment tools and strategies during the initial evaluation is a key component of equitable access to special education services, as the decision made following the initial evaluation decides who does and does not receive EI and ECSE services (Neisworth & Bagnato, 2004).

Little is currently known about EI/ECSE professionals’ use of culturally and linguistically responsive practices during initial evaluations. Initial evaluation team members in one state used some culturally and linguistically responsive evaluation practices, such as using translators or interpreters, involving families from racially and linguistically marginalized backgrounds in the evaluation, and providing materials and information in the families’ home language (Steed & Stein, 2021). These researchers noted that early childhood personnel did not regularly use higher level culturally responsive evaluation practices, such as asking families about their expectations for the testing process or about their values and hopes for their children (Steed & Stein, 2021). Another recent study found that while most White and Latina mothers felt that their child’s initial evaluation was accurate, Latina mothers reported more challenges during the initial evaluation and barriers to accessing early intervention services (Quebles et al., 2022). More information is needed to understand how EI and ECSE professionals nationally are utilizing culturally and linguistically responsive practices during the initial evaluation, in recognition of the importance of the evaluation in accurately identifying all young children for EI/ECSE and connecting them quickly to services. Knowing about current practices can inform the development of specific recommendations for EI and ECSE teams and preservice and professional development for early childhood professionals who currently conduct or will conduct initial evaluations for young children.

Current Study

This mixed methods study sought to extend the extant literature by utilizing a national sample to explore 1,047 early childhood professionals’ reported use of culturally and linguistically responsive practices during initial evaluations of young children for EI or ECSE.

Equitable assessment practices for racially and linguistically marginalized young children (Ortiz & Wong, 2020) served as the framework for conceptualizing the exploratory study and interpreting the findings. Given the limited information available about equitable assessment practices utilized with young children, there was one primary research question: Which culturally and linguistically responsive practices did professionals on initial evaluation teams for EI and ECSE report using?

Method

Positionality

The researchers were cognizant of our own personal views, experiences, and identities throughout the study conceptualization, survey design, data collection, data coding process, and interpretation of findings (Bourke, 2014). One research team member identified as Latina, and three team members identified as White. All identified as female and had experience working in the field of EI/ECSE, including conducting initial evaluations; two team members had participated in an initial evaluation as a parent of a young child with a possible delay or disability.
All researchers held positive views of the importance of early identification and use of culturally and linguistically responsive practices. While we held personal beliefs about the value of providing flexible, relational, and responsive evaluations with racially and linguistically marginalized children and their families, we sought to limit bias and ensure the voices of participants were centered in the results through an iterative and nonlinear data coding process, careful audit trails, and peer audits with another researcher with expertise in culturally and linguistically responsive assessment (Brantlinger et al., 2005).

Participants

A total of 1,445 individuals completed the survey, with 1,047 meeting criteria for being included in analyses for this project. Survey respondents came from 39 U.S. states and one U.S. commonwealth. Participant recruitment was conducted in the early spring of 2021 and involved snowball sampling of early childhood professionals who conducted initial evaluations for young children for EI and/or ECSE. The researchers first sent an email to the publicly available emails of each U.S. state’s IDEA Part C, Part B 619 coordinator, or Child Find coordinator and asked them to forward the survey link to the relevant professionals in their state. Email contacts without a response were sent two reminder emails approximately 3 weeks apart and 1 month after the original request. For those states in which there was no response to the initial email, other recruitment efforts were utilized, including requesting that EI or ECSE focused university personnel send the link to relevant professionals in their state and posting information about the study and survey link on social media groups related to EI or ECSE. The study was approved by the University’s Institutional Review Board (IRB). Participant informed consent was collected electronically at the outset of initiating the survey; survey respondents did not receive compensation for their participation in the survey.

Of the 1,047 total participants, the majority self-identified as female (n = 1,025, 98%) and White (n = 931, 89%). Participants’ characteristics were overall representative of the majority White and female EI or ECSE workforce (Bureau of Labor Statistics and U.S. Department of Labor, 2018). Out of the 1,047 participants, 346 (33%) worked on initial evaluation teams for Part C, 361 (35%) worked on initial evaluation teams for Part B, and 257 (25%) worked on initial evaluation teams for both Part C and Part B. Participant sociodemographic characteristics, their professional roles, and years of experience are presented in Supplemental Table S1. The most common professional roles represented in the survey were ECSEs (n = 300, 29%) and speech-language pathologists (n = 314, 30%). Participants’ average years of experience conducting initial evaluations was 13.43 (range 0–47).

Survey

The survey utilized for this project was designed by the authors as part of a larger project focused on EI and ECSE professionals’ use of recommended initial evaluation practices. The survey, in total, included 40 closed- and open-ended questions and took approximately 12 to 18 min to complete on Qualtrics. The survey began with eight demographic questions regarding the participants’ gender, race/ethnicity, professional training, education, and years of experience. Of the remaining 32 survey questions, 27 questions asked participants about their use of recommended initial evaluation practices and five questions asked about how initial evaluations were conducted during the COVID-19 pandemic.

Prior to survey distribution, the survey was piloted with five professionals with varying professional roles and experience conducting initial evaluations in EI or ECSE in three states; these individuals were asked to test the survey language, formatting, and time to complete the survey. Minor revisions were made based on their feedback, including such things as adding tools to the response options and allowing respondents to select multiple answers to an item about strategies used when a child’s home language is not English.

The survey questions utilized in this study included the eight demographic questions and four survey items, with the latter consisting of two closed-ended and two open-ended questions that pertained to professionals’ use of culturally and linguistically responsive practices during the initial evaluation for EI or ECSE. The 1,047 participants included in analyses for this project answered at least one of the four survey questions; 1,043 (99.6%) and 1,011 (97%) participants answered the two closed-ended survey items, and 639 (61%) and 634 (61%) participants provided codable responses to the two open-ended survey questions. It is common for response rates to be lower for open-ended survey items when compared to closed-ended survey items (Denscombe, 2008); when answering open-ended survey items is not required, as many as 75% of respondents may skip those items (Zhou et al., 2017). The four survey questions and response options are included in Supplemental Table S2.

Data Analysis

A fully mixed concurrent equal status mixed methods design (QUAN+QUAL) was used wherein quantitative and qualitative data were collected at the same time, followed by separate data analyses and then merging and integration of quantitative and qualitative findings (Creswell & Creswell, 2003). First, quantitative and qualitative data were simultaneously collected via survey.
Then, researchers conducted quantitative analyses on select closed-ended survey responses to yield descriptive statistics (i.e., number and percentage of responses for each response item). Researchers simultaneously conducted qualitative analyses on the two open-ended survey responses using a phenomenological approach (van Manen, 1997) to understand how early childhood personnel approached initial evaluations when children and families were racially and linguistically marginalized. Specifically, we used an open coding approach (Corbin & Strauss, 1990) followed by constant comparison analysis (Savin-Baden & Howell-Major, 2013) to inductively analyze participants’ responses to open-ended survey items and add context to the quantitative descriptive data. We used constant comparison analysis given that it is a rigorous process for the coding of data and identifying underlying themes from the participants’ own voices and allows researchers to calculate intercoder reliability (ICR) by comparing coding across independent researchers (Leech & Onwuegbuzie, 2007). The study meets criteria as a fully mixed methods design given the mixing of quantitative and qualitative techniques within one or more stages of the research process (Leech & Onwuegbuzie, 2009). Quantitative and qualitative data were mixed to glean insights into the initial evaluation process that would not otherwise have been found through analyses of the datasets on their own (Leko et al., 2022).

The qualitative coding process for analyzing participants’ responses is depicted in Supplemental Figure S1. During the initial open coding process for the qualitative component of the study, each member of the three-person coding team, which included one faculty member and two doctoral students, reviewed the first 300 participant responses in the data set for each of the two open-ended survey items. Each researcher went line by line and independently made notes about keywords and impressions on a document. Examples of keyword notes were “finding interpreters, especially if the language is something other than Spanish,” “inaccurate interpretation,” and “lack of culturally relevant materials and tools.” The coding team then met in person to share their keywords and discuss impressions about common phrases and concepts participants shared for each of the two open-ended survey item responses. The initial keywords that were common across coders were the foundation of the Level 1 codes identified at this stage of the process; there were 21 Level 1 codes related to initial evaluations for linguistically marginalized children and families and 19 Level 1 codes related to racially marginalized children and families.

During the next phase of coding, a constant comparison analysis process was used to look for participant statements across all open-ended responses that represented each code. Multiple participant statements were found for each of the 21 Level 1 codes related to initial evaluations for linguistically diverse children and families and 19 Level 1 codes related to culturally diverse children and families. No changes, combinations, or deletions were made to the codes at this stage except for additional language in some cases. For example, a note was added to the code “hard to get the family to open up about their culture” to explain that some participants felt this was related to a lack of time, whereas others expressed that either they or the family were not comfortable discussing the family’s culture. The 40 total codes, their descriptions, and example participant quotations became the code book used for the next phase of analysis.

During the third phase of coding, two members of the original coding team (one faculty member and one doctoral student) independently tallied the number of participant statements that aligned to each of the 21 codes related to initial evaluations for linguistically diverse children and families and 19 codes related to culturally diverse children and families. Participant statements, short phrases or longer sentences about one concept or topic, were identified as the units of analysis; thus, there could be multiple participant statements per participant response. The average number of statements per participant response was 1.21 (range 1–3). Participant responses that were blank, included only punctuation, or were about a topic not related to the question (e.g., about early intervention service delivery rather than the initial evaluation) were not tallied and were left out of analyses. The coding team tallied the participant statements, starting at the top of the dataset and stopping once saturation was reached, defined as no new patterns emerging in the frequency of statements aligned with particular codes and all codes including at least one participant statement. Qualitative responses are presented in the findings as a percentage of the total coded statements; a total of 370 participant statements were tallied for codes related to initial evaluations for linguistically diverse children and families and 314 participant statements were tallied for codes related to culturally diverse children and families. At the conclusion of this phase, 14 codes from the 40 total were set aside due to infrequent occurrences across participants (1%–2% of participant statements); these codes were not utilized further in analyses. Examples of deleted codes were “financial resources” and “scheduling issues.”

For the fourth phase of qualitative coding, the first 10% of the total independently coded participant responses (n = 145) were checked for ICR. Incorporating ICR into this phase of the coding process increases the transparency of the coding process and confidence that the final analytic framework represents a credible account of the data (O’Connor & Joffe, 2020). Total percent agreement was 97% (range 91%–100%, SD = 0.22). The research team discussed areas of agreement and disagreement and came to consensus on the codes that had less agreement. One area of clarification was in how to code statements that referenced both finding and working with interpreters versus just one of these issues.
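To make the percent agreement calculation concrete, the tally can be reproduced with a few lines of code. The sketch below is illustrative only and is not the authors’ analysis script; the statement IDs and code labels are hypothetical, and it assumes each double-coded statement was assigned exactly one code by each coder.

```python
# Illustrative sketch (not the authors' analysis script): overall percent
# agreement between two coders who each assigned one code per statement.
# Statement IDs and code labels are hypothetical.

coder_a = {
    "s001": "hard to find and work with interpreters",
    "s002": "lack of culturally relevant materials and tools",
    "s003": "hard to find and work with interpreters",
}
coder_b = {
    "s001": "hard to find and work with interpreters",
    "s002": "do not have bilingual team members",
    "s003": "hard to find and work with interpreters",
}

double_coded = sorted(set(coder_a) & set(coder_b))  # statements coded by both
matches = sum(coder_a[s] == coder_b[s] for s in double_coded)
percent_agreement = 100 * matches / len(double_coded)

print(f"Double-coded statements: {len(double_coded)}")
print(f"Percent agreement: {percent_agreement:.1f}%")  # the study reports 97% overall
```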
During the fifth and final phase of analysis, researchers on the qualitative coding team met to integrate quantitative and qualitative data.
Side-by-side comparisons of relevant quantitative response options and qualitative codes were conducted to mix the data and allow researchers to look for data convergence across quantitative response options and qualitative codes (Creswell & Creswell, 2003). Specifically, descriptive quantitative response options were noted on one side of a spreadsheet with qualitative codes on another. Response options and codes were moved up or down to align with each other based on convergence on the same or related topic. For example, quantitative response options related to interpretation were aligned with qualitative codes about finding interpreters and challenges using interpreters. Through discussion, the coding team came to consensus on five final themes that were inclusive of quantitative response options and qualitative codes. We created a joint display of the quantitative and qualitative data that depicts how the quantitative responses and qualitative codes aligned with each theme (Figure 1).
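As a rough illustration of the alignment just described, the joint display can be represented as a simple mapping from each theme to the quantitative response options and qualitative codes that converged on it. The sketch below is a hypothetical, abbreviated rendering for one theme, not the authors’ working spreadsheet; see Figure 1 for the published display.

```python
# Hypothetical, abbreviated sketch of the joint display structure: each theme
# groups the quantitative response options and qualitative codes that converged
# on the same topic. See Figure 1 for the published display.

joint_display = {
    "Finding and working with interpreters": {
        "QUAN response options": [
            "provide a professional interpreter",
            "use an interpreter to share evaluation results with the family",
        ],
        "QUAL codes": [
            "it is hard to find and work with interpreters",
        ],
    },
    # ...the remaining four themes follow the same structure
}

for theme, columns in joint_display.items():
    print(theme)
    for label, entries in columns.items():
        print(f"  {label}: {'; '.join(entries)}")
```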
Response options and codes were moved up or down to linguistically responsive strategies during the initial evalua-
align with each other based on convergence on the same or tion was to learn about families’ language and culture before
related topic. For example, quantitative response options or during the evaluation (Figure 1). Specifically, the majority
related to interpretation were aligned with qualitative codes of professionals asked the about the family’s use of language/s
about finding interpreters and challenges using interpreters. in the home and asked about and validated the family’s val-
Through discussion, the coding team came to consensus on ues, expectations, and home routines when families were
five final themes that were inclusive of quantitative response from racially and linguistically marginalized backgrounds.
options and qualitative codes. We created a joint display of With integration of the quantitative data and qualitative
the quantitative and qualitative data that depicts how the participant responses, an expanded understanding of per-
quantitative responses and qualitative codes aligned with sonnel’s approach to gathering information from families
each theme (Figure 1). about their language and culture was possible, especially
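Because participants could select multiple response options, each option’s percentage is taken over the number of respondents to the item rather than the number of selections, which is why the percentages in Figure 1 can sum to more than 100%. The short sketch below illustrates that arithmetic using the interpreter-related counts from Figure 1; the 1,043-respondent denominator is an assumption for illustration, not a figure confirmed for that specific item, and this is not the authors’ analysis code.

```python
# Sketch of the percentage calculation for a multi-select item, using counts
# from the interpreter theme in Figure 1 and assuming (not confirmed in the
# text) that 1,043 participants answered this closed-ended item.

n_respondents = 1043
option_counts = {
    "provide a professional interpreter": 596,
    "have a bilingual team member": 557,
    "have a family member interpret": 403,
}

for option, count in option_counts.items():
    print(f"{option}: n = {count} ({100 * count / n_respondents:.0f}%)")

# Selections can exceed the number of respondents because the item is multi-select.
print("Total selections:", sum(option_counts.values()))
```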
Learning About Families’ Language and Culture

The quantitative findings revealed that the most common way in which EI/ECSE professionals provided culturally and linguistically responsive strategies during the initial evaluation was to learn about families’ language and culture before or during the evaluation (Figure 1). Specifically, the majority of professionals asked about the family’s use of language/s in the home and asked about and validated the family’s values, expectations, and home routines when families were from racially and linguistically marginalized backgrounds.

With integration of the quantitative data and qualitative participant responses, an expanded understanding of personnel’s approach to gathering information from families about their language and culture was possible, especially regarding participants’ challenges to gathering information when families were from a race, culture, or linguistic background that was different from their own. First, participants explained that a difference in language interfered with rapport building. For instance, Participant 311 stated that a language difference made it “more difficult to engage in some forms of play and dialogue with the children and families. It makes building a good rapport somewhat difficult.” Further, Participant 298 described:

I can’t connect as well with families who don’t speak English, and I never feel like I’m doing a good enough job helping those families. How do you evaluate a child who is learning another language that you don’t speak? And I always feel like such a blundering American . . . it’s really awkward.

Several participants noted that the use of an interpreter was important but also could interfere with rapport building between the professionals and the family members, such as Participant 310, who stated, “It makes the eval lengthy at times for the children. It’s hard to build rapport and connection through a third party.” Professionals also noted the challenge of not knowing what the family understood when an interpreter is used. For instance, Participant 232 said:

Often, we have trouble explaining to parents what it is we are asking about even with an interpreter. Many things seem to be lost in translation, where we are not sure parents fully understand what is being discussed. This is all especially true for those students who speak languages other than Spanish, i.e., Farsee, Tagolog, Karin, Arabic, Swahili, and other rare dialects of which there is no interpreter that can fully help convey our message or ask necessary questions related to the evaluation/interview.

Theme: Learning about families’ language and culture
QUAN Results: 85% (n = 888) ask the family about their use of language/s in the home; 72% (n = 753) ask about the family’s values, expectations, and home routines; 71% (n = 743) validate the family’s values, expectations, and home routines.
QUAL Results: 11%: It is difficult to relate to and connect with families when they are from different racial backgrounds; 4%: It is hard to gather information from families who speak a language other than English.
Integrated Findings: ▲ EI/ECSE professionals learn about families’ language and culture before or during the evaluation. ▼ There were some challenges to learning about families’ language and culture.

Theme: Evaluation adaptation and translation
QUAN Results: 71% (n = 744) allow children from racially marginalized backgrounds to show their interests and skills in different ways; 57% (n = 599) provide multiple ways for DLL children to show their interests and skills; 46% (n = 599) use materials and toys from different cultures; 31% (n = 324) ask the family for input about cultural values to omit assessment items or change wording; 33% (n = 349) use translated evaluation tools.
QUAL Results: 18%: We lack culturally appropriate tools and materials; 8%: We do not have access to evaluation tools in various languages.
Integrated Findings: ▲ Some personnel adapted the evaluation or use translated tools. ▼ There were challenges getting culturally appropriate and translated tools and a variety of materials.

Theme: Finding and working with interpreters
QUAN Results: 57% (n = 596) provide a professional interpreter; 53% (n = 557) have a bilingual team member; 39% (n = 403) have a family member interpret; 36% (n = 372) use an interpreter to talk to the family prior to the evaluation; 51% (n = 529) use an interpreter to share evaluation results with the family.
QUAL Results: 36%: It is hard to find and work with interpreters for children and families who were linguistically minoritized.
Integrated Findings: ▲ A little more than half of EI/ECSE professionals found and worked with interpreters. ▼ Many had challenges finding and working with interpreters.

Theme: Bilingual and bicultural personnel
QUAN Results: 28% (n = 291) have team members from racially and linguistically marginalized backgrounds.
QUAL Results: 12%: We do not have bicultural team members; 3%: We do not have bilingual team members.
Integrated Findings: ▼ Teams lacked bicultural and bilingual team members.

Theme: Knowledge about culturally and linguistically responsive evaluation practices
QUAN Results: 49% (n = 514) have attended professional development focused on culturally responsive evaluation practices; 29% (n = 308) have attended professional development sessions focused on evaluation practices for DLLs.
QUAL Results: 5%: I am trying to improve my use of culturally and linguistically responsive evaluation practices; 35%: I need more information about culture; 6%: I want to know more about language and its role in development; 7% indicated deficit or colorblind views.
Integrated Findings: ▲ Some professionals have engaged in professional development to improve their culturally and linguistically responsive evaluation practices. ▼ Many need more training.

Figure 1. Joint display of EI/ECSE professionals’ approach to initial evaluations with racially or linguistically marginalized children and their families.

Even more statements referred to the challenges of establishing a relationship with the family during the information gathering phase when the family was from a different race or culture than the professional. Participant 367 shared:

I naturally find the cultural differences are challenging to navigate at first. Communication takes longer and some team members lack the patience needed to be thorough. Understanding each other takes longer. Identifying the parents’ concerns (vs. expecting them to just agree with our results/plan) and expectations can be challenging, and then also addressing those (not just asking about them) can be difficult as some people on teams have trouble delivering services “outside” of the checklist/rule guide we are taught to proceed with.

Participant 1604 provided one of many examples of personnel who shared discomfort and hesitancy to talk openly about culture and race during the initial evaluation: “I sometimes feel nervous as a White person . . . I don’t want to come across like I am judging them or expect their home life to be different than mine.” Participant 298 further explained, “I feel disconnected from the kids in a way that I don’t if the kids share my own culture. Connecting is the essence of a good evaluation, so I don’t feel like I’m as able to do a good job.”

Other quotations highlighted the short time frame of the initial evaluation as a barrier to deeply learning about families from different cultural backgrounds. One example quotation was from Participant 11 who described, “Getting families to open up about their cultural routines for child rearing is hard, without much time to build rapport due to 45-day timelines for Part C evaluations.”

Evaluation Adaptation and Translation

Quantitative findings showed that the majority of professionals used the culturally and linguistically responsive evaluation practice of allowing children from racially and linguistically marginalized backgrounds to show their interests and skills in different ways during the initial evaluation. It was less common for EI/ECSE professionals to use materials and toys from different cultures during the initial evaluation, ask the family for input about cultural values to omit assessment items or change wording, or to use translated evaluation tools (Figure 1).

Integrated, qualitative findings explained the quantitative findings further, with professionals describing a lack of access to evaluation tools in various languages and culturally appropriate tools and materials. Participant 139 shared that the evaluation tools they used were not culturally appropriate and could not be adapted, creating a problem for cultural relevance and accuracy of the evaluation results:

I don’t feel the evaluation tool we use scores based on culturally relevant criteria. For example, I work and live in a warm climate with a highly Asian and Pacific Islander background. Most adults and children do not wear shoes, instead they wear flip flops/slippers. A question on the standardized assessment we use, and can’t be omitted, asks if a child can remove their shoes with laces or Velcro with their hands. Additionally, many of our families feel it is important to take care of and do things for their children that other cultures might not expect. For example, most of my children are fed with utensils by their parents until they are well past two years old, and one of the test questions that can’t be omitted is about a child’s use of utensils independently. This then does not give a culturally appropriate representation of a child’s self-care skills.

Further, Participant 438, a speech-language pathologist, noted issues with standardized tools’ normative samples, “The current test that we use to determine eligibility is not normed on Spanish-speaking children and the languages rules are different in both languages. I am not skilled in another language to even know how valid the evaluation is.”

Professionals noted that the testing kits lacked a range of dolls of different skin colors or that personnel were required to purchase their own materials. As an example quotation, Participant 1648 stated, “I find it hard to provide materials that represent various cultures from the materials that we currently have in our office.” Participant 338 summarized:

We need to do better here! Our assessment team is 100% White, our school is 80% White, and our materials and practices reflect that. We tend to forget that in a very homogeneous community, how hard it is to walk into our setting and ask for help/be vulnerable when the entire process is centered around White children/families.

Finding and Working With Interpreters

According to the study’s quantitative findings, a little more than half of EI/ECSE professionals found and worked with interpreters as a culturally and linguistically responsive practice during and after the initial evaluation. Only 36% of participants used an interpreter during pre-evaluation meetings with families. Approximately half of the participants had a team member who spoke a language other than English (Figure 1).

Integration of qualitative findings allowed for further understanding of the issues EI/ECSE professionals faced when finding and working with interpreters for linguistically diverse children and families. First, finding interpreters, especially for some languages, was difficult for many EI/ECSE professionals working on initial evaluation teams. For example, Participant 306 shared:

Guess who is the one SLP in the nearest 150-mile radius? That’s right, me. Do you think an SLP who is Colombian is going to be visiting a family who emigrated from Colombia and now lives 150 miles away from the nearest major city? Nope. Ideally all assessments should be conducted in that child’s most comfortable language, but it’s not always possible. I don’t have the resources.

Participant 306 further highlighted the challenge of finding interpreters in some communities:

I live in a rural area. We don’t have access to interpreters, let alone ones who come from the same cultural background as the child/family. Sometimes we have access to phone interpreters, but the connection is weak and fuzzy because of a lack of service and the interpreters are general and don’t know what an SLP evaluation really is. Altogether it’s difficult to use and frustrating for both the therapist and the family.

Some participants explained that logistical issues, such as rules about not using district employees as interpreters for initial evaluations or the lead agency not having an established process for securing interpreters, were cumbersome and extended their federally determined initial evaluation timelines. For instance, Participant 102 explained, “The cost to the district was a huge deterrent. I used my whole Part C timeline convincing my admin that we needed an interpreter to assist during the evaluation.”

Another challenge was identifying interpreters who knew the language of the child and family and were also familiar with working with young children in an evaluation context. Participant 321 described their challenges as:

When the interpreter is not used to working with children and/or is not used to the assessment process. This is typically the case when we have to bring in an interpreter from an external agency (not one of our own interpreters, who we’ve trained). The child may not be as responsive to them because they aren’t as good at working with children.

Participant 418 elaborated on the role of the interpreter’s experience in interpreting for evaluations for young children by saying:

The quality of our interpreters can vary. Sometimes we will be working with a new interpreter who hasn’t worked in an early childhood role and while we try to educate them about our process, it seems like it can be confusing for them to understand their role.

Many participants worried about how information could be misunderstood or lost during the interpretation process, such as Participant 1032 who said, “Sometimes what we are asking doesn’t translate well. The concept can get lost by changing one word. As we do not speak the language, we don’t know exactly what the interpreter is saying.” In addition, professionals noted concerns that interpreters may interfere with the accuracy of the evaluation, such as Participant 579 who stated:

It can be very easy to invalidate even dynamic or informal measures as an SLP. For example, asking the child to complete multi-step directions and the interpreter provides obvious visual cues (e.g., pointing, gesturing, eye gaze, etc.), which impacts identifying appropriate scaffolds and a child’s true level of independence.

Finally, EI/ECSE professionals who worked both as initial evaluation team members and interpreters noted that their challenge was in educating monolingual team members about how to adjust to the flow of an interpreted initial evaluation:

I am a bilingual interpreter for those families I receive, and one struggle is when others speak over others. I always try to explain to English-speaking providers that they need to pause, allow time for what they want to say to be said in the parents’ language and pause still to allow parent to process and then respond. Then allow parents’ responses to be translated, allowing ALL voices to be heard respectfully.

Bilingual and Bicultural Personnel

Quantitative and qualitative data converged to indicate that most EI and ECSE professionals did not have initial evaluation team members from racially or linguistically marginalized backgrounds (Figure 1). Open-ended statements reiterated the lack of diverse representation on initial evaluation teams and the challenges this posed for conducting culturally and linguistically responsive evaluations. For example, Participant 220 said:

Unfortunately, our staff is not diverse at all. With the exception of three people (a Hispanic/Black male social worker, a White male autism specialist, and a Filipino woman physical therapist), our entire staff is White women. We do regular trainings to help us become more culturally aware, but it is sometimes hard for people to let go of their White, middle-class expectations and embrace the families’ expectations.

Participant 1533 shared, “Our staff is not racially diverse, but we do have access to cultural liaisons who are available during the evaluation process. While helpful, this is not the same as having a racially diverse staff working with the family.” Participant 609 elaborated on the lack of racial and linguistic diversity on the initial evaluation team, stating, “We are still a team of women and are mostly White, in spite of efforts to recruit people of color to our office. Our entire system is based on a White, Western construct of appointments and scores and assessments.”

Some participants described the potential extra burden on bicultural and bilingual team members to provide a culturally and linguistically responsive evaluation, such as Participant 708 who described:

There are times, though infrequent, that it is a challenge to find an available staff member to communicate with a family prior to evaluation, and there are also times the staff members that can serve in this role feel over-worked in this capacity. That this task can be a burden to their usual role and duties at times.

Knowledge About Culturally and Linguistically Responsive Evaluation Practices

Quantitative findings revealed that between a third and a little less than half of participants had attended professional development focused on evaluation practices for DLLs or culturally responsive evaluation practices (Figure 1). Integration of qualitative analyses showed that many professionals desire more training in culturally and linguistically responsive evaluation approaches; most comments indicated that the professional wanted to know more about the role of culture in children’s development, families’ developmental expectations, and the initial evaluation process. For example, Participant 918 explained:

There are some other cultures with very different child rearing practices than ours. I’d like to know more about that in a way that is culturally sensitive. When am I giving helpful recommendations and when am I just trying to make them more like my culture? I’m not always sure.

Many professionals wanted to know more about the influence of culture on development and how to use that information in their eligibility determination process, such as Participant 85, who noted that their challenge was in “distinguishing between cultural differences and behavioral differences that may impact development and true developmental delays.”

Some participants shared that they wanted training to increase their cultural competence and to allay their anxieties around difficult conversations, such as Participant 122, who said, “I think there is much to learn about diversity. It is important to ask family members about their routines and beliefs in child development and that can sometimes be uncomfortable because I want to use the right words.” Participant 18 noted varying levels of cultural competence and ease among team members, “Different members of the evaluation team have different levels of comfort and skill talking about race and culture.”

Several EI and ECSE professionals indicated deficit or colorblind views of racially and linguistically marginalized children and families that should be considered when planning trainings for evaluation teams. For example, some respondents indicated that the parents’ choices and cultural practices were the issue, such as Participant 555, who stated:

Oftentimes kids from different cultures . . . are babied for far longer than a child from an American background, so what looks like developmental delay is really just a cultural difference. It is hard to communicate with families that their child looks delayed because of cultural choices they are making.

Participant 588 attributed delays to “cultural poverty and children are not exposed to peers or literacy. The parents often don’t read well in their language.” Other responses expressed that they sought to not see racial diversity. For example, Participant 72 said, “Children are children. Families are families. Humans are humans. Every child in every family is a unique person with a unique life experience. There are no challenges in these areas unless you forget that.” Another professional, Participant 1458, downplayed cultural or racial bias in standardized instruments, saying, “I don’t feel there is a challenge since we are observing the child and their abilities based on testing parameters and standard manipulatives are used for each test which do not signify a particular culture or race.”

Discussion

This study contributes to the sparse literature base on ECE professionals’ reported use of culturally and linguistically responsive assessment practices during initial evaluations of young children for EI and ECSE. Our findings indicate that high numbers of EI and ECSE professionals used culturally and linguistically responsive evaluation practices, such as providing interpreters, learning about children’s language use, and asking about family values and home routines. However, fewer than half of professionals reported using more advanced culturally and linguistically responsive evaluation practices, such as adapting evaluation materials or instructions.

Interpretation During Initial Evaluations

First, it appears that EI and ECSE professionals may be adept at providing some basic culturally responsive evaluation practices during the initial evaluation, such as providing interpretation for linguistically marginalized children and families. EI and ECSE personnel reported the infrequent use of additional, recommended linguistic adjustments to the evaluation, such as using assessment tools that were translated and validated for bilingual children. This finding matches other literature showing that bilingual children are often assessed using instruments developed for monolinguals with monolingual norms, a practice that can lead to misidentification (Thordardottir, 2015). EI and ECSE professionals reported a lack of access to tools for bilingual children, district or agency rules that dictated who could and could not provide interpretation, and insufficient time to use the recommended practice of administering assessments in both languages.
These systems-level barriers led many initial evaluation team members to use whoever they could get for on-the-spot interpretation, including phone interpreters or family members, especially for languages other than Spanish.

EI and ECSE personnel’s reliance on translation alone during initial evaluations for young children is likely to reduce the reliability and validity of assessment results conducted with bilingual children (Peña, 2007). Indeed, EI and ECSE participants in this study noted several of the ways that in-the-moment translation can reduce the reliability and validity of evaluation scores, such as problematic translation of English assessment directions (e.g., “point,” a word that has at least two translations in Spanish) and translations that adjust the meaning of the item (Peña & Halle, 2011). Also, as noted by EI/ECSE personnel in this study, using professional interpreters who are not skilled in assessments with young children or the child’s family members can interfere with the flow of the session, reduce professional objectivity, and hinder confidence in the accuracy of the results (Freeman & Schroeder, 2022).

Culturally Adapted Assessments

Approximately 70% of participants in the study asked about and validated families’ culturally situated values, expectations, and home routines. However, far fewer participants reported that they made adjustments such as incorporating culturally relevant play materials that were familiar to the child and omitting or changing the wording of items on the assessment based on the family’s cultural values. Neglecting to adjust or adapt the initial evaluation process, materials, or assessment items may result in an assessment that is based on White, middle class assessment norms that are unfamiliar to racially or culturally marginalized children and families. In addition to a standard initial evaluation process being a potentially uncomfortable experience for culturally and linguistically marginalized children and families, the evaluation results are less likely to represent the child’s true developmental capacity, strengths, and areas of need (Banerjee & Guiberson, 2012).

EI and ECSE professionals noted several barriers to making cultural adaptations to assessment tools and materials used during the initial evaluation, including a lack of diverse play materials and books and restrictions regarding how assessment items could be adapted for cultural situations. It was not clear whether EI and ECSE professionals were given information that adapting assessment items would invalidate the results from the assessment publisher or from their district or agency; however, it was clear that some providers had interpreted a lack of flexibility in adjusting the assessment protocol for cultural circumstances. Some personnel mentioned the importance of using authentic assessment approaches when assessing culturally and linguistically marginalized children and families, such as observations during the child’s regular routines and with familiar individuals; however, many initial evaluation team members emphasized the need to use standardized assessments to determine eligibility and noted the difficulty of their use with culturally and linguistically marginalized children and families.

Training and Supports

Our findings indicate that professionals reported few supports for the complicated task of using culturally and linguistically responsive evaluation practices. Fewer than half of the participants reported having access to professional development on culturally and linguistically responsive evaluation practices. Other research confirms early childhood professionals’ training needs around culturally and linguistically responsive practices (Banerjee & Luckner, 2014; Moran & Sheppard, 2022). Our findings indicated that initial evaluation team members in some cases, perhaps unknowingly, shared implicit biases when explaining that some cultures had different and less valued expectations for their children than White middle class developmental norms. Some professionals explained that they treated all children and families the same, suggesting that they did not attend to children and families’ racial identities. A colorblind perspective such as this can lead to ignoring racial inequalities and perpetuating imbalances of power and privilege (Gilborn, 2019).

A few members of initial evaluation teams showed awareness of the complexity of using culturally and linguistically responsive evaluation practices when explaining their challenge in noticing cultural or linguistic differences while at the same time trying not to judge those differences or compare children and families’ ways of doing with their own. These study findings elucidate the intricacies and barriers to professionals using culturally and linguistically responsive practices during initial evaluations.

Limitations

The present study contributes to the existing literature in a number of ways but is not without its limitations. First, although survey respondents were from 39 states and one U.S. commonwealth, we did not have representation from every U.S. state. Further, the number of responses from each state was highly variable. Given that state-level policy is highly influential in assessment and eligibility practices (Dempsey et al., 2020), this likely impacted our findings, which may not represent this work across all U.S. states. Second, the respondents primarily identified as White and female. Although this mirrors the early childhood field as a whole (Whitebook et al., 2018), having a less diverse sample limits the range of perspectives that were shared. Therefore, it is unclear whether results from this study are generalizable to other populations.

Finally, participants were not asked about the demographic characteristics of the communities where they work, meaning we do not have information about the cultural and linguistic diversity of those communities. Our sample likely represents the various levels of racial and linguistic heterogeneity and segregation present in U.S. states and cities; at the same time, these differences in exposure to diverse racial and linguistic populations of children and families possibly influenced how many opportunities professionals had to practice utilizing culturally and linguistically responsive practices and how individuals responded to the survey. In sum, although the present study adds to the existing literature, additional research is needed.

Future Research

Prospective research should build upon this study to gather additional information about EI and ECSE professionals’ use of culturally and linguistically responsive evaluation practices. Future studies might empirically assess the relationships between professionals’ reported and actual use of culturally and linguistically responsive evaluation practices. Research could focus on the use of authentic assessment approaches and routines-based interviewing as a component of culturally and linguistically responsive evaluation practices. Other studies could analyze the influence of factors such as professionals’ attitudes and beliefs, implicit biases, and cultural or linguistic identity on their use of recommended culturally and linguistically responsive evaluation practices.

It would also be useful to systematically evaluate how systems-level supports, such as diverse team membership, adequate materials for adapting assessment tools, the time allocated for initial evaluations, and professional development, impact personnel’s use of recommended culturally and linguistically responsive initial evaluation practices. Additionally, it is important for future research to understand racially and linguistically marginalized families’ perspectives of the initial evaluation process.

Implications for Practice and Policy

Our study findings indicate that EI and ECSE professionals would benefit from preservice training, professional development, and specific recommendations for how to provide culturally and linguistically responsive initial evaluation practices. Training and recommendations for professionals who are involved in initial evaluations should focus on how to engage with families during the evaluation when there are cultural or language barriers, utilize interpreters, and distinguish between cultural influences on development and a developmental delay or disability. Early childhood professionals should receive additional training in the nuances of language development for DLLs, recommended practices when evaluating DLLs, and suggested tools that have fewer language demands in English (Sullivan, 2011).

Some participants used language in their survey responses that suggested implicit biases and problematic interpretations of children’s behavior or development that may be couched in stereotypes or assumptions about a family’s race, culture, or language. Specific training in countering implicit biases, stereotypes, and assumptions is needed. Participants expressed less knowledge and more deficit views about race than language; therefore, training and professional development efforts may need to focus more on biases around race than language. Research shows that early childhood professionals are more likely to grow in this area when they have intentional opportunities to explore and learn about implicit bias, can practice strategies to unlearn their attitudes, and have an administrator who is more advanced in their own awareness about implicit biases (Neitzel, 2018).

Additional implications from study findings include the need for early childhood professionals to have more access to culturally relevant evaluation tools and materials and assessment tools that are published in multiple languages. Having assessments that incorporate options for how to adapt materials and protocols for diverse populations would support professionals in implementing more inclusive, child and family friendly, and culturally and linguistically responsive evaluation practices (Espinosa & García, 2012). Further, the use of assessment tools in additional languages would decrease the need for on-the-spot translation (Barrueco et al., 2012). Lastly, EI and ECSE administrators should provide more resources to professionals engaged in initial evaluations so that they have the time, materials, personnel, and competence to administer culturally and linguistically responsive initial evaluations to young children and their families. If the system itself does not provide necessary resources and supports to initial evaluation personnel, they will not be able to carry out recommended culturally and linguistically responsive evaluation practices.

Conclusion

We found that early childhood professionals are using some culturally and linguistically responsive evaluation practices while infrequently utilizing or struggling to implement others. Personnel who conduct initial evaluations for EI or ECSE may need professional development and other systems-level supports in order to realize more widespread use of recommended culturally and linguistically responsive practices during the initial evaluation. A greater awareness, acceptance, and use of culturally and linguistically responsive evaluation practices may lead to improved access to key early intervention and special education services for racially and linguistically marginalized children and families.
Declaration of Conflicting Interests

The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding

The author(s) received no financial support for the research, authorship, and/or publication of this article.

ORCID iD

Elizabeth A. Steed https://orcid.org/0000-0002-6886-9066

Supplemental Material

Supplemental material is available on the Topics in Early Childhood Special Education website with the online version of this article.

References

Acar, S., & Blasco, P. M. (2018). Guidelines for collaborating with interpreters in early intervention/early childhood special education. Young Exceptional Children, 21(3), 170–184. https://doi.org/10.1177/1096250616674516
Banerjee, R., & Guiberson, M. (2012). Evaluating young children from culturally and linguistically diverse backgrounds for special education services. Young Exceptional Children, 15(1), 33–45. https://doi.org/10.1177/1096250611435368
Banerjee, R., & Luckner, J. (2014). Training needs of early childhood professionals who work with children and families who are culturally and linguistically diverse. Infants and Young Children, 27(1), 43–59. https://doi.org/10.1097/iyc.0000000000000000
Barrueco, S., Lopez, M., Ong, C., & Lozano, P. (2012). Assessing Spanish-English bilingual preschoolers: A guide to best approaches and measures. Paul H. Brookes Publishing.
Bourke, B. (2014). Positionality: Reflecting on the research process. The Qualitative Report, 19(33), 1–9. http://www.nova.edu/ssss/QR/QR19/bourke18.pdf
Brantlinger, E., Jimenez, R., Klingner, J., Pugach, M., & Richardson, V. (2005). Qualitative studies in special education. Exceptional Children, 71(2), 195–207. https://doi.org/10.1177/001440290507100205
Bureau of Labor Statistics and U.S. Department of Labor. (2018). Occupational outlook handbook, special education teachers. https://www.bls.gov/ooh/education-training-and-library/special-education-teachers.htm
Cheatham, G. A. (2011). Language interpretation, parent participation, and young children with disabilities. Topics in Early Childhood Special Education, 31(2), 78–88. https://doi.org/10.1177/0271121410377120
Corbin, J. M., & Strauss, A. (1990). Grounded theory research: Procedures, canons, and evaluative criteria. Qualitative Sociology, 13(1), 3–21.
Creswell, J. W., & Creswell, J. (2003). Research design. SAGE.
Dempsey, A. G., Goode, R. H., Colon, M. T., Holubeck, P., Nsier, H., Zopatti, K., & Needelman, H. (2020). Variations in criteria for eligibility determination for early intervention services with a focus on eligibility for children with neonatal complications. Journal of Developmental & Behavioral Pediatrics, 41(8), 646–655. https://doi.org/10.1097/DBP.0000000000000852
Denscombe, M. (2008). The length of responses to open-ended questions: A comparison of online and paper questionnaires in terms of a mode effect. Social Science Computer Review, 26, 359–368. https://doi.org/10.1177/0894439307309671
de Sam Lazaro, S. L. (2017). The importance of authentic assessments in eligibility determination for infants and toddlers. Journal of Early Intervention, 39(2), 88–105. https://doi.org/10.1177/1053815116689061
Division for Early Childhood. (2014). DEC recommended practices in early intervention/early childhood special education 2014. http://www.dec-sped.org/recommended-practices
Espinosa, L. M., & García, E. (2012). Developmental assessment of young dual language learners with a focus on kindergarten entry assessments: Implications for state policies. Center for Early Care and Education Research–Dual Language Learners (CECER-DLL). The University of North Carolina, Frank Porter Graham Child Development Institute.
Freeman, M. R., & Schroeder, S. R. (2022). Assessing language skills in bilingual children: Current trends in research and practice. Journal of Child Science, 12, e33–e46. https://doi.org/10.1055/s-0042-1743575
Gillborn, D. (2019). Hiding in plain sight: Understanding and addressing whiteness and color-blind ideology in education. Kappa Delta Pi Record, 55(3), 112–117.
Individuals with Disabilities Education Improvement Act of 2004, 20 U.S.C. § 1400 et seq. (2004).
Leech, N. L., & Onwuegbuzie, A. J. (2007). An array of qualitative data analysis tools: A call for data analysis triangulation. School Psychology Quarterly, 22(4), 557–584. https://doi.org/10.1037/1045-3830.22.4.557
Leech, N. L., & Onwuegbuzie, A. J. (2009). A typology of mixed methods research designs. Quality & Quantity, 43(2), 265–275. https://doi.org/10.1007/s11135-007-9105-3
Leko, M. M., Hitchcock, J. H., Love, H. R., Houchins, D. E., & Conroy, M. A. (2022). Quality indicators for mixed-methods research in special education. Exceptional Children. Advance online publication. https://doi.org/10.1177/00144029221141031
Linan-Thompson, S., Lara-Martinez, J. A., & Cavazos, L. O. (2018). Exploring the intersection of evidence-based practices and culturally and linguistically responsive practices. Intervention in School and Clinic, 54(1), 6–13. https://doi.org/10.1177/1053451218762574
Lincoln, Y. S., & Guba, E. G. (1985). Establishing trustworthiness. Naturalistic Inquiry, 289(331), 289–327.
McLean, M. (2001). Conducting culturally sensitive child assessments. In Serving the underserved: A review of the research and practice in Child Find, assessment, and the FSP/IEP process for culturally and linguistically diverse young children. ERIC Clearinghouse on Disabilities and Gifted Education, Council for Exceptional Children. https://files.eric.ed.gov/fulltext/ED454640.pdf
Moran, K. K., & Sheppard, M. E. (2022). Finding the on ramp: Accessing early intervention and early childhood special education in an urban setting. Journal of Early Intervention. Advance online publication. https://doi.org/10.1177/10538151221137801
Nagle, R., Glover Gagnon, S., & Kidder-Ashley, P. (2020). Issues in preschool assessment. In V. C. Alfonso, B. A. Bracken, & R. J. Nagle (Eds.), Psychoeducational assessment of preschool children (5th ed., pp. 3–31). Taylor & Francis.
Neisworth, J. T., & Bagnato, S. J. (2004). The mismeasure of young children: The authentic assessment alternative. Infants and Young Children, 17(3), 198–212.
Neitzel, J. (2018). Research to practice: Understanding the role of implicit bias in early childhood disciplinary practices. Journal of Early Childhood Teacher Education, 39(3), 232–242. https://doi.org/10.1080/10901027.2018.1463322
Ortiz, S. O., & Wong, J. Y. T. (2020). Psychological assessment of culturally and linguistically diverse preschool children. In V. C. Alfonso, B. A. Bracken, & R. J. Nagle (Eds.), Psychoeducational assessment of preschool children (5th ed., pp. 346–374). Taylor & Francis.
O'Connor, C., & Joffe, H. (2020). Intercoder reliability in qualitative research: Debates and practical guidelines. International Journal of Qualitative Methods, 19, 1–13. https://doi.org/10.1177/1609406919899220
Peña, E. D. (2007). Lost in translation: Methodological considerations in cross-cultural research. Child Development, 78, 1255–1264.
Peña, E. D., & Halle, T. G. (2011). Assessing preschool dual language learners: Traveling a multiforked road. Child Development Perspectives, 5(1), 28–32. https://doi.org/10.1111/j.1750-8606.2010.00143.x
Quebles, I., Perrigo, J. L., Bravo, R., Patel Gera, M., Poulsen, M. K., Wheeler, B. Y., & Williams, M. E. (2022). Latinx mothers' experiences with linkage to early intervention. Infants and Young Children, 35(3), 189–204. https://doi.org/10.1097/iyc.0000000000000220
Savin-Baden, M., & Howell-Major, C. (2013). Qualitative research: The essential guide to theory and practice. Routledge.
Steed, E. A., & Stein, R. (2021). Initial evaluation practices and processes: A survey of early childhood personnel. Topics in Early Childhood Special Education. Advance online publication. https://doi.org/10.1177/02711214211005856
Sullivan, A. L. (2011). Disproportionality in special education identification and placement of English language learners. Exceptional Children, 77(3), 317–334. https://doi.org/10.1177/001440291107700304
Thordardottir, E. (2015). Proposed diagnostic procedures for use in bilingual and cross-linguistic context. In S. Armon-Lotem, J. de Jong, & N. Meir (Eds.), Assessing multilingual children: Disentangling bilingualism from language impairment (pp. 331–358). Multilingual Matters.
U.S. Census Bureau. (2022). U.S. Census Bureau QuickFacts: United States and nation continues to age as it becomes more diverse. https://www.census.gov/quickfacts/fact/dashboard/US/AGE135221
U.S. Department of Education. (2021). IDEA Section 618 Data Products: Static Tables. https://www2.ed.gov/programs/osepidea/618-data/static-tables/index.html
van Manen, M. (1997). From meaning to method. Qualitative Health Research, 7(3), 345–369. https://doi.org/10.1177/104973239700700303
Whitebook, M., McLean, C., Austin, L. J. E., & Edwards, B. (2018). The early childhood workforce index 2018. https://cscce.berkeley.edu/wp-content/uploads/2022/04/Early-Childhood-Workforce-Index-2018.pdf
Zhou, R., Wang, X., Zhang, L., & Guo, H. (2017). Who tends to answer open-ended questions in an e-service survey? The contribution of closed-ended answers. Behaviour & Information Technology, 36(12), 1274–1284. https://doi.org/10.1080/0144929x.2017.1381165
