Jared Cooney Horvath is a globally recognized Science of Learning expert committed to helping teachers, students and parents achieve better outcomes through applied brain and behavioral science.
Evidence. Based. Teaching.
If you’ve been in education for any time at all, you know these three words have dominated the field for the better part of two decades.
Evidence-based teaching involves the promotion and use of strategies that have been empirically validated through controlled laboratory studies.
To validate such strategies, an intervention is applied to a group of (hopefully) randomly assigned students, and the results are evaluated against a non-intervention control group.
If the intervention group demonstrates superior performance (however that might be defined), that intervention is henceforth deemed to be an ‘evidence-based’ teaching strategy.
Moreover, to stack the evidence for any given intervention, groups of studies are often pooled together to create a 'meta-analysis'. That, for instance, is what John Hattie did with his popular Visible Learning book.
Now, this all may sound like a powerful way to determine which teaching strategies hold the most promise for increasing student achievement … and in many respects it is.
But unfortunately, it is not the infallible panacea that many would have us believe.
Evidence-based teaching suffers from some significant problems, including an over-reliance on standardized tests, a large research-funding bias, and a narrow definition of what student success even means.
Perhaps the biggest problem with leaning too heavily on such models, however, is that teachers are insidiously disempowered as control over classroom instruction gradually shifts to educational researchers, many of whom have never spent a single day in a live classroom.
In other words, when teachers are not only told what to teach but how to teach it, they slowly lose agency and autonomy over their craft as they become cogs in a blunt system that lacks any trace of nuance or appreciation for the individual student.
Okay, so maybe that’s a bit grim … but it’s very cold and overcast outside as I write these words, so please excuse me if I’m being hyper-dramatic ;)
Anyway, in this From Theory to Practice video, I explore a piece of research that can help us develop a deeper appreciation for this issue in a sort of roundabout way:
Is It Live or Is It Internet? Experimental Estimates of the Effects of Online Instruction (David Figlio, Mark Rush, Lu Yin)
Here are some of the questions I tackle in this installment:
How can relying on global averages sometimes cause us to overlook key information when assessing research findings?
What is stratification, and how can it help us achieve a more nuanced understanding of otherwise unremarkable data?
With regard to live versus digital learning, which groups of students tend to suffer the most when thrust into the latter environment?
What are two key takeaways from this research that can help teachers approach their craft with more agency and personalization?
Give it a watch, and let me know what you think in the YT comments section.
And, as always, if you find this video valuable, interesting and/or entertaining, you can support us by liking, sharing and subscribing to our YouTube channel ;)
Hello everybody, and welcome to this week's From Theory to Practice, where I take a look at the research so you don't have to.
The article I've selected this week is called Experimental Estimates of the Effects of Online Instruction by Figlio and colleagues.
Now, I think this research highlights one really important thing that's worth thinking about, and that is the concept of stratification.
So, when it comes to scientific research, a lot of the time we deal in global averages. In essence, we conduct a study whereby we compare a group of people who do one thing against another group of people who do another thing, and in the end we come up with a global average which gives us the power to determine that group A did better or worse than group B.
And this is fine; it can be a great tool … but every once in a while it can also be a blunt tool. On occasion, relying on global averages can cause us to overlook useful or important data that's unique to individuals within a group ...
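To make the idea concrete, here's a minimal sketch of stratification (with made-up numbers, purely illustrative, not data from the Figlio et al. paper): a global average can make two conditions look roughly comparable while hiding the fact that subgroups respond very differently.

```python
# Illustrative only: hypothetical scores for live vs. online instruction,
# stratified by prior achievement. The numbers are invented to show how
# a global average can mask diverging subgroup effects.
from statistics import mean

# Each record: (prior_achievement_band, live_score, online_score)
students = [
    ("high", 88, 89), ("high", 91, 92), ("high", 85, 86),
    ("low",  72, 64), ("low",  70, 61), ("low",  74, 66),
]

# Global averages: the two conditions look only modestly different.
live_avg = mean(s[1] for s in students)
online_avg = mean(s[2] for s in students)
print(f"Global: live={live_avg:.1f}  online={online_avg:.1f}")

# Stratifying by prior achievement reveals the real pattern:
# high achievers hold steady (or tick up) online, while
# low achievers fall sharply behind.
for band in ("high", "low"):
    group = [s for s in students if s[0] == band]
    print(f"{band}: live={mean(s[1] for s in group):.1f}  "
          f"online={mean(s[2] for s in group):.1f}")
```

In this toy data the global gap is only a few points, but stratifying shows the entire deficit is concentrated in the lower-achieving group, which is the kind of pattern a global average quietly buries.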
Coming soon ...
Connect With Us
Copyright © 2022 LME Global
6119 N Scottsdale Rd, Scottsdale, AZ, 85250