Hattie’s Index Of Teaching & Learning Strategies: 39 Effect Sizes In Ascending Order



by Dana Schon, sai-iowa.org

Effect Size Defined

Statistically speaking, an effect size measures the strength of the relationship between two variables. John Hattie, Professor of Education and Director of the Melbourne Education Research Institute at the University of Melbourne, Australia, says ‘effect sizes’ are the best way of answering the question ‘what has the greatest influence on student learning?’

Effect Size Applied

  • Reverse effects fall below 0.0: the intervention is associated with lower achievement.
  • Developmental effects are 0.0 to 0.15, and the improvement a child may be expected to show in a year simply through growing up, without any schooling. (These levels are determined with reference to countries with little or no schooling.)
  • Teacher effects: “Teachers typically can attain d=0.20 to d=0.40 growth per year—and this can be considered average”, though this is subject to a lot of variation.
  • Desired effects are those above d=0.30 (Wiliam, Lee, Harrison, and Black 2004) and d=0.40 (Hattie, 1999) which are attributable to the specific interventions or methods being researched– changes beyond natural maturation or chance.
  • Blatantly obvious effects: An effect size of d=1.0 indicates an increase of one standard deviation. A one standard deviation increase is typically associated with advancing children’s achievement by two to three years, improving the rate of learning by 50%, or a correlation between some variable (e.g., amount of homework) and achievement of approximately r=0.50. When implementing a new program, an effect size of 1.0 would mean that, on average, students receiving that treatment would exceed 84% of students not receiving that treatment.
    Cohen (1988) argued that an effect size of d=1.0 should be regarded as “a large, blatantly obvious, and grossly perceptible difference,” such as the difference between a person at 5’3″ (160 cm) and one at 6’0″ (183 cm): a difference visible to the naked eye.
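To make the arithmetic behind these bullets concrete, here is a minimal Python sketch (the sample data and function name are illustrative, not from Hattie's studies) of how Cohen's d is computed from two groups of scores, and where the "84%" figure for d=1.0 comes from: a student one standard deviation above the mean sits at roughly the 84th percentile of a normal distribution.

```python
from statistics import NormalDist, mean, stdev

def cohens_d(treatment, control):
    """Cohen's d: difference in group means divided by the pooled standard deviation."""
    n1, n2 = len(treatment), len(control)
    s1, s2 = stdev(treatment), stdev(control)
    pooled_sd = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (mean(treatment) - mean(control)) / pooled_sd

# Illustrative scores: the treated group's mean is half a pooled SD higher.
d = cohens_d(treatment=[2, 4, 6], control=[1, 3, 5])  # -> 0.5

# With d = 1.0, the average treated student scores one standard deviation
# above the control mean, i.e. above about 84% of untreated students:
percentile = NormalDist().cdf(1.0)  # ~0.8413
```

This is the same logic Hattie applies at scale: each meta-analysis aggregates many such standardized mean differences into a single d.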

Effect Size CAUTION

Resist the temptation to oversimplify. This is one more resource in our efforts to problem-solve on behalf of our students. We need to be careful about drawing too definite a conclusion from an effect size without examining the underlying studies. For example, homework is shown to have an overall effect size of 0.29, which is low and well below the average of 0.40. But when you look more closely, you find that primary students gain least from homework (d = 0.15) while secondary students show much greater gains (d = 0.64).

Editor’s Note

Data is only as useful as its application. As hinted at above, don’t fall into the trap of assuming the teaching and learning strategies and other impacts on student achievement at the top of the list are “bad,” and those at the bottom are “good.” These are not recommendations, but rather a comprehensive synthesis of a huge amount of data. Every study has a story, and every strategy and impacting agent below has a background.

The most helpful part of this chart, and the reason we asked Dana to share her work here, was the column on the right where she adds a short statement or tidbit that helped contextualize each data point. Otherwise, judging purely by the chart, inquiry-based learning, self-directed learning, class size, and teacher content knowledge perform terribly, while skipping a year, reciprocal teaching, and the teaching of study skills are through the roof.

Ultimately, to best use this data to inform teaching and planning, every study should be analyzed on its own. We would need to clarify the criteria for success. We’d also need to plainly define every word and phrase for every impacting agent and strategy so that we were all speaking the same language. We would then need to identify and analyze other variables in each study: inquiry with or without technology, with or without access to local communities, with students reading at, below, and above grade level, using culturally relevant or irrelevant text, and so on.

That is what makes two of Hattie’s books, Visible Learning and the Science of How We Learn and Visible Learning: A Synthesis of Over 800 Meta-Analyses Relating to Achievement, must-buys: you can do that kind of analysis on your own rather than skimming a blog post and extracting misguided takeaways (which is why we hesitated to publish this to begin with). That said, the results of the synthesis of the data appear below.


What Has The Greatest Influence On Learning? A Synthesis Of Hattie’s Synthesis

(See above for effect sizes and context/explanation.)

  1. Retention (holding back a year)
  2. Open vs traditional learning spaces
  3. Student control over learning
  4. Teacher subject matter knowledge
  5. Ability grouping/tracking/streaming
  6. Gender (male compared with female achievement)
  7. Matching teaching with student learning styles
  8. Within-class grouping
  9. Extra-Curricular
  10. Reducing class size
  11. Individualized instruction
  12. School finance
  13. Teaching test-taking and coaching
  14. Homework
  15. Inquiry-based teaching
  16. Using simulations and gaming
  17. Decreasing disruptive behavior
  18. Computer-assisted instruction
  19. Integrated curricular programs
  20. How to develop high expectations for each teacher
  21. Professional development on student achievement
  22. Home environment
  23. Peer influences on achievement
  24. Phonics instruction
  25. Providing worked examples
  26. Cooperative vs individualistic learning
  27. Direct instruction
  28. Concept mapping
  29. Comprehension programs
  30. Teaching learning strategies
  31. Teaching study skills
  32. Vocabulary programs
  33. How to accelerate learning (e.g. skipping a year)
  34. How to better teach meta-cognitive strategies
  35. Teacher-student relationships
  36. Reciprocal teaching
  37. How to provide better feedback
  38. Providing formative evaluation to teachers
  39. Teacher credibility in the eyes of the students