Over at the TCEA TechNotes blog, someone posted the following in response to this blog entry, Going Beyond SAMR: Models for Technology Integration:
Thank you so much for this article. This is such a fascinating topic and I love learning more about it.
I took the TIM-T course with the University of Florida and I absolutely loved it. I want to study more about these Technology Integration Models and when to use them. Could you provide me with more resources?
I have also read a lot about two other Technology Integration Models: PICRAT and the 5E model. Have you heard about them too? Thank you once again.
My response got to be quite long and I hated the way the comment was formatted in WordPress. Also, I felt terrible I couldn’t insert any images or easily add links. So I’ve posted my response here, too.
Response begins here:
Hi, Leandro. One of the benefits of edtech frameworks is that they provide different lenses. Each assists our observation and understanding of what is happening in the classroom.
Shifting the Role of the Student
At the heart of educational technology frameworks is how technology impacts classroom instruction, and, in some cases, shifts students into the role of the teacher. In every model you analyze, you will see how the student's role shifts: from passive learner to creator and problem-solver, or from using technology for productivity to using it for collaboration.
Two Continuums
Imagine these as two continuums: one for student/teacher growth and one for tech usage. I could build a theoretical framework out of those two continuums. You would see a sliding scale for each, and you could plot classrooms on each scale. But it would be no more valid than any other untested one (e.g., SAMR, RAT).
One must be careful with ETFs since their goal is to assess movement along the two continuums. If these tools are not evidence-based, they could have disastrous results. A principal could end a teacher’s career because they did not proceed along a false continuum (e.g. SAMR, RAT).
Some TIMs
Tech Integration Models (TIMs) include the 5E/6E Model, the Engineering Design Process, and others, such as PBL, Inquiry-Based Learning, Constructivist Learning, and EduProtocols. There are many more models.
Amplifying Effect
TIMs succeed only insofar as they blend proven, evidence-based instructional strategies into each phase of the model. If the instructional strategies are sound, then the technology amplifies their effect. If the strategies are not, then the technology fails.
For example, corporal punishment is a failed strategy. Using technology to amplify the effect of that strategy would do MORE harm, make the effect MORE negative than it is without tech.
Distinguishing Between TIMs and ETFs
We use Technology Integration Models (TIMs) to learn how to embed technology in the lesson cycle. They allow us to bridge evidence-based instruction with emerging technology. Educational Technology Frameworks (ETFs) assess our success in blending technology into instruction.
ETFs include evidence-based frameworks such as T3, Triple E, HEAT/LOTI, and PAGER. They also include non-evidence-based frameworks such as SAMR, R.A.T., and PICRAT. Those are theoretical explanations, simplifications of complex concepts that you can present to educators.
I loved this quote from the PICRAT source:
Though these models are commonly referenced throughout the literature to justify methodological approaches for studying educational technology, little theoretical criticism and minimal evaluative work can be found to gauge their efficacy, accuracy, or value, either for improving educational technology research or for teaching technology integration (Kimmons, 2015; Kimmons & Hall, 2017).
Relatively few researchers have devoted effort to critically evaluating these models, categorizing and comparing them, supporting their ongoing development, understanding assumptions and processes for adopting them, or exploring what constitutes good theory in this realm (Archambault & Barnett, 2010; Archambault & Crippen, 2009; Brantley-Dias & Ertmer, 2013; Graham, 2011; Graham, Henrie, & Gibbons, 2014; Kimmons, 2015; Kimmons & Hall, 2016a, 2016b, 2017).
In other words, educational technologists seem to be heavily involved in what Kuhn (1996) considered “normal science” without critically evaluating competing models, understanding their use, and exploring their development over time. Reticence to engage in critical discourse about theory and realities that shape practical technology integration has serious implications for practice, leading to what Selwyn (2010) described as “an obvious disparity between rhetoric and reality [that] runs throughout much of the past 25 years of educational technology scholarship” (p. 66), leaving promises of educational technologies relatively unrealized.
It should come as no surprise that educational technology has FAILED. One reason was its reliance on popular instructional approaches with no basis in research. That is, the technology integration models are bankrupt and were adopted because they sounded good. The second reason? The edtech frameworks we relied on to assess technology use in the classroom (the effectiveness of a Tech Integration Model) were also not evidence-based. That is, untested.
The Problem with Frameworks
Conceptual frameworks (e.g., SAMR, RAT, PIC) are fun to discuss around the coffee table or in the teachers' lounge. The reason you should avoid them is that they are not research-based. Approaches focused on educating children and assessing teachers' efforts must be RESEARCH-BASED and SCIENTIFICALLY PROVEN. They must be testable, not subjective measures.
If it lacks scientific evidence, then a TIM or ETF should NOT be used. I know administrators apply them in the classroom even though they are unproven; they are "popular research." The same is true of many other "educational innovations" that find their way into schools. These digital tools and instructional strategies waste precious instructional time in the classroom.
And, worse, in some cases, administrators have censured teachers. This happens when school leaders rely on non-evidence-based frameworks or models. What needs to happen is a purge of non-evidence-based approaches and teacher assessment tools. That includes ineffective uses of technology that fail to amplify the effect of proven instructional strategies.
There is a plethora of research-based TIMs available for you to use. If you search for "5E Model," "Engineering Design Process," or "PBL" here at TCEA, you will find online courses and blog entries you can use to learn more about each. PIC and RAT, or their combination known as the PICRAT Model, are seldom mentioned at TCEA. That's because, like SAMR, they are unproven conjecture. When they are mentioned, it is only to highlight their insufficiency.
Research Based Models and Assessment Frameworks
To find research-based models, I encourage you to review the following:
- Visible Learning MetaX
- the SOLO Taxonomy (in lieu of Bloom's)
- coaching research
- a research-based framework, such as:
  - the T3 Framework
  - the HEAT/LOTI frameworks for assessing students and teachers, respectively
  - the Triple E Framework
Of these, the T3 Framework incorporates John Hattie's and Robert Marzano's work on evidence-based instructional strategies. You can use other frameworks, but you will have to invest more time and effort to blend in strategies that work.