Methodological factors related to an intervention’s effectiveness

Comparing the effectiveness of different interventions is not straightforward, as several methodological issues have been found to relate to effect size (i.e. a standardised mean difference between groups, often calculated as Cohen’s d). The review by Fischer et al. (2013) focused mainly on evaluating the impact of methodological factors on the effectiveness reported in intervention studies. Several other review authors have also considered the topic as part of their research (e.g. Codding et al., 2011; Gersten et al., 2009; Kroesbergen & Van Luit, 2003; Kunsch et al., 2007).
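To illustrate the effect-size metric referred to above, the following is a minimal sketch of Cohen’s d as a standardised mean difference with a pooled standard deviation; the function name and the data values in the usage comment are illustrative, not taken from any of the reviewed studies.

```python
import statistics

def cohens_d(treatment, control):
    """Cohen's d: standardised mean difference between two groups,
    scaled by the pooled (within-groups) standard deviation."""
    n1, n2 = len(treatment), len(control)
    s1, s2 = statistics.stdev(treatment), statistics.stdev(control)
    # Pooled standard deviation across the two groups
    pooled_sd = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (statistics.mean(treatment) - statistics.mean(control)) / pooled_sd

# Hypothetical post-test scores for two groups:
# cohens_d([2, 4, 6], [1, 3, 5]) gives d = 0.5
```

By convention, d-values around 0.2, 0.5 and 0.8 are often described as small, medium and large effects, although such benchmarks should be applied with caution.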

First, the measurement used may moderate the observed effectiveness of the intervention. Interventions evaluated with standardised tests have been shown to produce significantly smaller effects than interventions evaluated with researcher-made assessments (Zhang & Xin, 2012; Wang et al., 2016). Self-developed tests often specifically measure the skills practised in the intervention, thus producing larger effects.

Second, the type of control group may affect the size of the effect. Studies administering another type of intervention to the control group (i.e. an active control group) tended to yield significantly smaller effects than studies that did not (i.e. studies using passive control groups) (Fischer et al., 2013; Gersten et al., 2009). Furthermore, studies employing a performance-matched control group produced smaller effects than those that did not (Fischer et al., 2013).

Third, sample size and study design have been found to affect the effect size. Kroesbergen and Van Luit (2003) reported that single-subject design studies have significantly higher effect sizes than group-design studies. Likewise, studies with small samples (i.e. few participants) presented higher effects than large studies (Kroesbergen & Van Luit, 2003). Because of the differences in the nature of group- and single-subject-design interventions (e.g. single-subject designs often include baseline measurements and no control participants), effect sizes should be calculated differently for each approach, and the results should be treated separately (e.g. in meta-analyses) to be reliable.
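One common single-subject-design metric that differs from Cohen’s d is the percentage of non-overlapping data (PND): the share of treatment-phase scores that exceed the highest baseline score. The sketch below is illustrative only (the function name and data values are hypothetical) and assumes higher scores indicate improvement; it is not the calculation used in any particular review cited here.

```python
def pnd(baseline, treatment):
    """Percentage of non-overlapping data (PND): the percentage of
    treatment-phase scores exceeding the highest baseline score
    (assumes higher scores = better performance)."""
    ceiling = max(baseline)
    above = sum(1 for score in treatment if score > ceiling)
    return 100.0 * above / len(treatment)

# Hypothetical scores: baseline [3, 4, 5], treatment [6, 7, 4]
# -> 2 of 3 treatment scores exceed the baseline maximum of 5
```

Because PND and d are on entirely different scales, averaging them together in a meta-analysis would be misleading, which is one reason group- and single-subject designs are typically synthesised separately.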

Fourth, Fischer et al. (2013) state that the number of mathematics components in an intervention is related to the size of the effect. They, as well as Wang et al. (2016), found that interventions focusing on training one specific mathematics skill (i.e. one component) produced significantly greater effects than those covering more than one mathematical skill (i.e. multicomponent interventions).

Fifth, interventions focusing on a specific skill are often of short duration. As a specific mathematics skill can often be fully acquired in a short period of time (Kroesbergen & Van Luit, 2003), shorter interventions have generally been found to be more effective than longer ones (Gersten et al., 2009; Kroesbergen & Van Luit, 2003). In contrast to this finding, the length of an intervention was not found to affect intervention outcomes in reviews by Kunsch et al. (2007) and Fischer et al. (2013). These contradictory results may be explained by the fact that the review studies defined long and short interventions differently. Moreover, the length of an intervention does not tell us how intensive the intervention is (i.e. duration and frequency of sessions).

Sixth, the type of interventionist (i.e. a researcher or a teacher) has, on the one hand, been shown to have minimal impact on intervention outcomes (Gersten et al., 2009). On the other hand, Codding et al. (2011) found that using a combination of ‘intervention agents’ (i.e. a teacher and a student, or a teacher and a researcher) was the most effective approach. Moreover, when an intervention study is implemented in an authentic learning environment (e.g. at school), this is said to increase the ecological validity of the study (Reed et al., 2013).

Finally, there are contradictory findings as to whether interventions are more effective for younger or older students. For example, Gersten et al. (2009) and Kunsch et al. (2007) found larger effects with younger students. In contrast, Kroesbergen and Van Luit’s review (2003) found a greater effect with older students than with younger students. In interpreting these results, one should note that the range of ages and school level varied in different reviews, and some age levels may have been overrepresented. For instance, the review by Kunsch et al. (2007) included students from primary to secondary school, and Kroesbergen and Van Luit’s review (2003) included students from kindergarten to primary school.

  • Codding, R. S., Burns, M. K., & Lukito, G. (2011). Meta-analysis of mathematics basic-fact fluency interventions: A component analysis. Learning Disabilities Research & Practice, 26(1), 36–47. doi:10.1111/j.1540-5826.2010.00323.x
  • Fischer, U., Moeller, K., Cress, U., & Nuerk, H.-C. (2013). Interventions supporting children’s mathematics school success. European Psychologist, 18(2), 89–113. doi:10.1027/1016-9040/a000141
  • Gersten, R., Chard, D. J., Jayanthi, M., Baker, S. K., Morphy, P., & Flojo, J. (2009). Mathematics instruction for students with learning disabilities: A meta-analysis of instructional components. Review of Educational Research, 79(3), 1202–1242. doi:10.3102/0034654309334431
  • Kroesbergen, E. H., & Van Luit, J. E. H. (2003). Mathematics interventions for children with special educational needs. A meta-analysis. Remedial and Special Education, 24(2), 97–114. doi:10.1177/07419325030240020501
  • Kunsch, C. A., Jitendra, A. K., & Sood, S. (2007). The effects of peer-mediated instruction in mathematics for students with learning problems: A research synthesis. Learning Disabilities Research & Practice, 22(1), 1–12. doi:10.1111/j.1540-5826.2007.00226.x
  • Reed, D. K., Sorrells, A. M., Cole, H. A., & Takakawa, N. (2013). The ecological and population validity of reading interventions for adolescents: Can effectiveness be generalized? Learning Disability Quarterly, 36(3), 131–144.
  • Wang, A. H., Firmender, J. M., Power, J. R., & Byrnes, J. P. (2016). Understanding the program effectiveness of early mathematics interventions for prekindergarten and kindergarten environments: A meta-analytic review. Early Education and Development. doi:10.1080/10409289.2016.1116343
  • Zhang, D., & Xin, Y. P. (2012). A follow-up meta-analysis for word-problem solving interventions for students with mathematics difficulties. The Journal of Educational Research, 105(5), 303–318. doi:10.1080/00220671.2011.627397