What are evidence-based practices, and how often are they used in early childhood education? Evidence-based practices are practices supported by empirical research on their use in the classroom, and they must meet certain criteria. Research shows that although a number of evidence-based practices in early childhood have been identified, few are implemented, or they are not implemented in the way they were intended. For example, can you think of one evidence-based practice that is used across all preschool classrooms or early intervention programs in Indiana? The answer is most likely no, and this article will introduce one possible solution: Implementation Science.
What is Implementation Science? According to the National Implementation Research Network (NIRN), Implementation Science is the study of factors that influence the full and effective use of evidence-based practices. Essentially, implementation science looks at how new practices and procedures are rolled out and sustained over time. For evidence-based practices to be effective, they must be implemented skillfully, by a qualified individual, in the way they were intended. In other words, they must be implemented with fidelity. Fidelity is a cornerstone of implementation science: even an evidence-based practice will only produce changes in child outcomes if its core components (the components shown to improve child outcomes) are implemented with the consistency and intensity that the practice's developers intended.
Implementation Science involves three steps:
1. Choosing evidence-based practices
2. Using improvement cycles
3. Scaling up
To explain these three steps, let’s look at a problem that needs to be solved in early education: the engagement of families in their child’s school. In Indiana, 35% of low-income families reported that they were unlikely to partner with their child’s preschool, and only 55% reported that they shared family information with their child’s preschool. Family engagement includes more than traditional school-directed activities such as family-teacher conferences; it extends to welcoming families as equal partners in supporting their child’s learning and development. So how do we address this problem? First, we decide what practice we want to implement. Choosing evidence-based practices is not easy. According to NIRN, evidence-based practices are those that have research demonstrating their effectiveness and that meet the following criteria: a clear definition of the practice, clearly identified core components of the practice or program, operational definitions of those core components, and a practical assessment of the performance of practitioners who are using the practice. One evidence-based practice that addresses family engagement in preschool education is the Parent Teacher Home Visit Project model (PTHVP). This practice has research showing its effectiveness in improving child outcomes. Specifically, the model has increased communication, trust, and support between families and teachers, which has in turn increased school attendance and student test scores (Sheldon & Jung, 2015). The PTHVP has five core components that are operationally defined (they tell the teacher what to do and how to do it). Lastly, a simple means of assessing implementation fidelity is provided in the PTHVP Evaluation Toolkit, which measures the degree to which each provider implements the five core components of the model.
After selecting the practice, improvement cycles are used to “work out the kinks”. An improvement cycle is like a trial run: a time for practitioners to try the new practice with a small group of individuals in order to fine-tune it. During this cyclic process, three to five groups (e.g., classrooms or child care centers) are chosen based on their willingness to participate. As the practice is implemented, data is gathered on how the families are responding. For the purposes of this example, let’s assume the model is being implemented at a local (community-wide) level. The implementing community then asks what needs to be adjusted based on the data that has been collected. Once the practice has been implemented with three agencies within the community (the first improvement cycle), another improvement cycle starts with three more agencies: the data is reviewed, changes are made, and then another improvement cycle begins with three more agencies. As each group of “guinea pigs” is cycled through, more is learned about how to modify the practice to best serve all families and children. NIRN reports that after three to five improvement cycles, 85% of the problems present in a practice will have been eliminated. So, as you move through improvement cycles and implement the PTHVP, you determine what works best and what you might need to tweak. After each improvement cycle, a new group is identified and more tweaks are made to best prepare the practice for full implementation.
Once the protocol for the PTHVP has been finalized through improvement cycles, the last step is scaling up. Scaling up is the process of implementing an evidence-based practice across an entire group, not just the samples of willing volunteers who made up your improvement cycles. For example, scaling up can occur across providers in a provider agency, at the district level, at the regional level, or even at the state level. In the case of the Parent Teacher Home Visit Project, scaling up might include implementing the practice across all provider agencies within a given city, or even expanding to neighboring communities.
Check out the next two articles to delve into specific examples that highlight the importance of each of the three steps described above. If you would like to learn more about implementation science, please visit the National Implementation Research Network website.
Blase, K. A., Fixsen, D. L., Sims, B. J., & Ward, C. S. (2015). Implementation science: Changing hearts, minds, behavior, and systems to improve educational outcomes. Oakland, CA: The Wing Institute.
Sheldon, S. B., & Jung, S. B. (2015). The Family Engagement Partnership student outcome evaluation. Retrieved from the Flamboyan Foundation website: http://flamboyanfoundation.org/wp/wp-content/uploads/2015/09/JHU-STUDY_FINAL-REPORT.pdf