BY MAI MIKSIC | Evidence from around the world suggests that academic rigor in K-12 education can help to close the achievement gap (Ripley, 2013; Quint, Levy Thompson, & Bald, 2008; Warburton, Bugarin, & Nunez, 2001). Academic rigor refers to challenging content pitched at an appropriate developmental level. In contrast to their counterparts abroad, education policymakers in the United States seem more focused on standards than on the content of learning itself (Steiner, 2014). Although some education research suggests that a strong curriculum can close the achievement gap (Coca, Johnson, & Kelley-Kemple, 2011), there has been little research on the outcomes associated with academic rigor, especially in early childhood education. This void is concerning, since early learning builds the foundation for later skill formation (Heckman, 2006).
Thus, it is refreshing to find a new research study (Claessens, Engel, & Curran, 2014) that explores the effects of academic content in kindergarten. The study examines two related questions: which approach, basic or advanced math and reading content, results in greater gains on standardized tests by the end of the kindergarten year? And does academic rigor in kindergarten mitigate the often-observed preschool “fade-out effect”? Claessens et al. (2014) work with data from the Early Childhood Longitudinal Study-Kindergarten Cohort (ECLS-K), a nationally representative dataset including over 15,000 children. The researchers examined children’s test scores at the beginning and end of kindergarten to determine academic gains, with scores at the end of the spring semester as the key outcome of interest.
The authors constructed four categories into which learning items could fall: “basic math,” “advanced math,” “basic reading,” and “advanced reading,” relying on teachers’ reports of what was being taught in the classroom. An item was considered “advanced” if fewer than 50% of the children in the full sample had mastered the material prior to its introduction in class; conversely, content was considered “basic” if 50% or more of the children were already proficient in it. For example, if 50% or more of all students in the ECLS-K sample had already mastered counting out loud, then counting out loud was placed in the “basic math” category. Each item was thus classified relative to the abilities of the full national sample.
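The 50% threshold can be sketched as a simple rule. This is a hypothetical illustration only: the function name, item names, and mastery rates below are invented, not drawn from the ECLS-K data.

```python
# Minimal sketch of the basic/advanced classification rule described above.
# The mastery rates are invented for illustration, not ECLS-K figures.

def classify_item(share_already_mastered):
    """Label an item 'advanced' if fewer than 50% of the full sample had
    mastered it before it was introduced in class, and 'basic' otherwise."""
    return "advanced" if share_already_mastered < 0.50 else "basic"

print(classify_item(0.80))  # e.g., counting out loud -> "basic"
print(classify_item(0.20))  # e.g., adding single-digit numbers -> "advanced"
```

Note that the comparison is made against the full national sample, not against the children in any one classroom.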
“Basic math” included concepts such as counting out loud, recognizing geometric shapes, and ordering objects. “Advanced math” contained items such as reading two-digit numbers and adding single-digit numbers. “Basic reading” included items such as letter recognition and writing one’s own name, and “advanced reading” consisted of items like matching letters to sounds.
Claessens et al. (2014) used a set of regressions to analyze their data. This was probably the most appropriate approach, given the limitations of observational data, but it is not without problems. Regression extends simple correlation by using multiple independent variables (in this case, basic or advanced reading or math content) to predict a dependent variable (here, academic achievement). The danger is that researchers may omit important variables that are correlated with both the independent and dependent variables. This is called “omitted variable bias,” and it can produce inaccurate estimates. Researchers therefore aim to build a regression equation that contains all the variables that could plausibly explain the dependent variable, in this case academic achievement.
At the same time, it is neither advisable nor possible to include every conceivable variable in a regression equation: doing so makes the model less parsimonious and can result in a type-II error, in which we fail to detect an effect that is actually there. Running regressions thus constitutes a delicate balancing act, in which one wants to include the most important variables without including just any variable. Oftentimes theory, such as the ecological model of child development, guides the choice of which variables to include. Here the authors do not specify a particular theory driving their selection; instead they seem to rely on previous research and convention to choose their control variables.
However, the authors have chosen a set of control variables that I believe are adequate, given the data available, to minimize omitted variable bias, although it is impossible to avoid it completely when running regressions on observational data. The authors control for a number of teacher, classroom, child, parent, and home characteristics, each of which could affect academic achievement. It seems that the authors did their best to isolate the effect of academic content.
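A small simulation shows why omitting a confounder matters. This is not the authors’ model or data; the variable names, coefficients, and the SES confounder below are invented purely to illustrate omitted variable bias.

```python
# Hypothetical simulation of omitted variable bias. The "true" effect of
# content exposure on scores is 2.0; SES is an invented confounder that
# raises both content exposure and scores.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
ses = rng.normal(size=n)                  # confounder (e.g., family SES)
content = 0.5 * ses + rng.normal(size=n)  # exposure, correlated with SES
score = 2.0 * content + 3.0 * ses + rng.normal(size=n)

def ols_slopes(y, *xs):
    """Ordinary least squares slope estimates (intercept included, dropped)."""
    X = np.column_stack([np.ones_like(y), *xs])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]

(biased,) = ols_slopes(score, content)           # SES omitted: inflated
adjusted, _ = ols_slopes(score, content, ses)    # SES controlled: near 2.0
print(round(biased, 2), round(adjusted, 2))
```

Controlling for the confounder recovers an estimate close to the true effect, while omitting it substantially overstates the benefit of content; this is the logic behind the authors’ long list of teacher, child, and home controls.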
For the first research question, results indicate that advanced academic content is associated with greater learning gains than basic content: the more days spent on advanced content, the higher the test scores at the end of kindergarten. Basic academic content, by contrast, did not have as great an effect, and in some cases was negatively associated with test scores; children actually did worse on the tests if they were taught basic math than if they were taught advanced math. What makes these results most striking is that academic rigor benefited children regardless of their socioeconomic background, their academic achievement at the start of kindergarten, or whether they had attended preschool. While some children did seem to benefit more than others, generally all students made greater gains over time.
As exciting as these results are about the promise of advanced content, the authors found some troubling figures on the average number of days teachers devoted to basic versus advanced content. In one month, teachers reported spending an average of 18 days on basic reading content and only 11 days on advanced reading content. For math, teachers spent 10 days per month on basic content and only 6 days on advanced content. The authors suggest that if advanced content indeed leads to greater academic gains, then teachers should spend more time on advanced content instead of basic content, regardless of who is in their classroom.
The questions that immediately follow from the authors’ data are: why are teachers spending less time, on average, on advanced content in kindergarten classrooms? Do they exercise choice over the curriculum and the pace of learning, or are content and pace prescribed? Would teachers argue that children need the basic content in order to master the advanced? And how can teachers ensure that children in fact possess the basic knowledge upon which advanced knowledge builds, while at the same time challenging them sufficiently? Little research investigates these questions, and they are beyond the scope of this study. Still, they are avenues for further exploration, since the policy implications follow from the pedagogical ones.
The second research question Claessens et al. (2014) investigated was the relationship between the preschool fade-out effect and the academic content subsequently taught in kindergarten. The fade-out effect refers to an observed phenomenon in which the academic benefits of preschool seem to disappear quickly after children leave preschool (U.S. Department of Health and Human Services, 2010). This effect is often cited as an argument against expanding preschool funding. What causes the fade-out effect? Is subsequent schooling of such poor quality that it cancels out the original benefits of preschool? Claessens et al. (2014) hypothesize that advanced academic content in kindergarten can sustain preschool effects. Since they measured academic gains only with test scores at the end of kindergarten, the authors imply that higher end-of-kindergarten scores mean that advanced academic content can sustain preschool effects.
Disappointingly, the authors emphasized the fade-out effect in their introduction but seemed to ignore it in their discussion and conclusions. All the authors needed to do was explicitly compare the rate of learning of children who attended preschool and received advanced academic content in kindergarten with that of children who attended preschool and received basic content. While they had the data to make these comparisons, the authors did not discuss them in the body of the article or conduct additional analyses to explore fade-out directly.
The closest the authors came was a set of graphs comparing the groups. The graphs show that children in center-based care (any type of child care or preschool outside the home other than Head Start) start out with higher test scores than children in other types of care and in Head Start, and they maintain this advantage through the end of the year. For every group, children who received advanced content did better than children who did not, consistent with the regression findings. However, interpreting these graphs is difficult, since the regression findings showed that differences between groups were not statistically significant.
While descriptive graphs and statistics can be informative, they are limited in their ability to truly explore fade-out effects. Comparing baseline test scores at the beginning of kindergarten with spring test scores cannot, by itself, tell us anything about fade-out. We would need to know the rate of learning in preschool and compare it to the rate of learning in kindergarten; only then could we determine whether preschool effects were sustained.
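The rate comparison can be made concrete with a toy calculation. The scores and time spans below are invented for illustration; this is the logic of the comparison, not an analysis of the ECLS-K data.

```python
# Minimal sketch of comparing rates of learning across two periods.
# All numbers are hypothetical.

def learning_rate(score_start, score_end, months):
    """Average test-score points gained per month over a period."""
    return (score_end - score_start) / months

preschool_rate = learning_rate(10, 22, 9)     # growth during preschool
kindergarten_rate = learning_rate(22, 31, 9)  # growth during kindergarten

# A slower rate in kindergarten than in preschool would be consistent with
# fade-out; a similar or faster rate would suggest sustained effects.
print(preschool_rate > kindergarten_rate)
```

End-of-kindergarten scores alone collapse this distinction, which is why a single spring test score cannot settle the fade-out question.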
Using regression methods, the authors can only say that students did better in kindergarten if exposed to more rigorous academic content. There is no counterfactual to support any claims about fade-out. An experimental study would be the best way to examine fade-out effects and academic content: we would compare children who attended preschool and received an advanced curriculum in kindergarten to children who attended preschool and received a basic curriculum. The key here, as in any experiment, is that children would be randomly assigned to one group or the other. Then we could make strong assertions about kindergarten academic content and fade-out effects.
Also, a study about fade-out effects should go beyond measuring achievement after a year of exposure to advanced content. The ECLS-K followed children from kindergarten through eighth grade. It would have made for a more compelling research story if the authors had shown results at least through third grade.
Nevertheless, policymakers can draw one very important conclusion from this study: young children respond to strong content in kindergarten, and yet most classrooms under-challenge them. American pedagogy has a longstanding bias against academically advanced content; it certainly resists introducing advanced content into a classroom of children with mixed abilities. However, the Claessens et al. (2014) study suggests that all children, regardless of their initial achievement scores, can benefit from more rigorous academic content. These findings suggest that if we hold our children to high enough standards, they will rise to meet them.
Claessens, A., Engel, M., & Curran, F. C. (2014). Academic content, student learning, and the persistence of preschool effects. American Educational Research Journal, 51(2), 403-434. doi: 10.3102/0002831213513634
Coca, V., Johnson, D., & Kelley-Kemple, T. (2011). Working to my potential: The postsecondary experiences of CPS students in the International Baccalaureate Diploma Programme. Chicago: University of Chicago Consortium on Chicago School Research.
Heckman, J. J. (2006). Skill formation and the economics of investing in disadvantaged children. Science, 312, 1900-1902.
Quint, J., Levy Thompson, S., & Bald, M. (2008). Relationships, rigor, and readiness: Strategies for improving high schools. New York: MDRC. Retrieved from http://www.mdrc.org/sites/default/files/Relationships%20Rigor%20and%20Readiness.pdf
Ripley, A. (2013). The smartest kids in the world: And how they got that way. New York: Simon & Schuster.
Steiner, D. (2014). The coming common core assessments: How they could stop patronizing our students. CUNY Institute for Education Policy. Retrieved from http://ciep.hunter.cuny.edu/coming-common-core-assessments-stop-patronizing-students/
U.S. Department of Health and Human Services, Administration for Children and Families. (2010). Head Start Impact Study: Final report. Washington, DC.
Warburton, E. C., Bugarin, R., & Nunez, A. (2001). Bridging the gap: Academic preparation and postsecondary success of first-generation students. Washington, DC: National Center for Education Statistics.