Should students be using AI? If yes, when and where should it be introduced in school? These are tough questions. It’s easy to say, “Introduce it when appropriate,” but WHEN is it appropriate? And are we sure that the AI (and the way we interact with it) we introduce today won’t be obsolete tomorrow? Or can we assume it will get even easier?
I share my response (lame) and Claude.ai’s suggestions. You be the judge of who did the better job.
AI Tidal Wave
The AI Tidal Wave of so-called innovation is going to impact schools no matter what. The reason, as I’ve said before, is big money. Colleague Laura Bresko points out one of the big issues:
Though content creation is democratized for the foreseeable future, I am concerned about AI overlords rationing the keys to their kingdoms. It’s a troubling trend when middleware/wrapper providers are constantly mowed over by OpenAI…It is killing innovation and razing small business!
The EdTech revolution was falsely advertised (I did some of that) as transforming education. But we now know EdTech failed. All those people who told me, “We don’t need technology to ensure critical thinking and student learning,” well, they were right. The only thing that actually works is the use of time-proven, evidence-based, well-researched instructional strategies in schools.
Note: That’s not to say technology isn’t an important tool worth learning how to use. Rather, let’s stop trying to use it as magic fairy dust to improve student achievement.
The problem? The REAL problem? Most educators don’t know about these strategies, argue about what strategies they DO know about, and want to drop technology into the mix to heighten engagement.
Yoked to some worthless strategy, edtech doesn’t get the job done.
So, why so much emphasis on AI when there’s NO research showing it works?
“…to seek for such evidence and appearances as are in favor of our desires, and to disregard those which oppose them…we receive as friendly that which agrees with [us], we resist with dislike that which opposes us; whereas the reverse is required by every dictate of common sense.” – Michael Faraday
What evidence is there that AI improves learning?
When the Sale Comes First
The reason why AI is going to smash through school policies, filters, and administrator memos (all exhorting the life-saving, ramp-to-the-future benefits of AI, even if unproven in schools) is that there is so much MONEY behind AI. We are in an AI “arms race” to deploy AI everywhere throughout society on the possibility that it’s going to change everything. And, like any other technology, it will…in the long run.
For me, this means that AI technologies will change everything, but the race to blend them into everything, seizing upon every single bit of “middleware” and learning to craft prompts and problem formulations, will fade in the face of a much easier, and more complicated, reality.
AI apps popping up, lists of AI tools, TikToks…they are all a distraction at this time. It’s a high-priced, per-user cost that schools are going to be tempted to waste their money on. Soon, richer schools and school leaders will use AI to differentiate themselves from poor public schools that can’t afford AI for every student and staff member.
A Fool’s Errand
Buying it for every student so they can learn how to use AI is a fool’s errand. After all, how has that worked out with Windows vs. Mac, MS Office vs. Google Workspace, iOS vs. Android, and all the other either/or wars? All those did was generate a lot of publicity and funding, siphoning money away from schools.
Money-chasing businesses and organizations have one imperative at this time. That is…
“How can we recast our business so that AI appears to enhance our services in ways that seem dramatic and essential?
How can we make a sale?”
But that’s not what education is about, right?
“…are you not ashamed of heaping up the greatest amount of money and honor, and reputation, and caring so little about wisdom and truth and the greatest improvement of the soul, which you never regard at all?” – Socrates
Great Amounts of…
Sure, providing every teacher with access to AI tools can have a serious impact on time spent prepping lessons, developing materials, and summarizing information and resources. These time-saving uses of AI presume that the teacher did the hard work in the first place.
“It’s not cheating for teachers to use AI,” goes one saying. That’s because the assumption, the expectation, is that teachers already did the hard work in college and in schools.
When I worked as a school district webmaster, a colleague told me, “No point in using a GUI editor if you can’t read and write HTML code. What happens if you run into a problem?” So I spent some time learning HTML, and that knowledge has served me long after the GUI editors I relied on became obsolete. The hard work has paid off many times since, even though I had access to HTML GUI editors from the beginning. That’s a lesson that applies to AI, no?
That’s not necessarily true, though. I’m sure we can all think of educators who short-circuited the process of learning and failed to do their due diligence in processing information. We are all guilty of shirking our responsibilities at times, probably because we’re weak humans, and imperfect teachers sought to assist us in spite of their imperfections. Hey, that’s life.
Ain’t nobody perfect, as the saying goes, neither teacher nor learner. This means that learning really IS a lifelong endeavor, a lifetime chore, a life sentence of slogging through uncomfortable spots until things start to make sense. Who are we to deny our students that by giving them AI? Let them struggle for it.
Productive struggle is a state of engagement that enables students to work through increasingly challenging problems and new problems they have never seen before. In this way, making things harder on your students so they will stretch their brains can be a good thing. (source)
Or, as Hattie terms it, a move from surface learning to deep learning to transfer learning: making a path in our brains that tells the lump in our skulls, “Yes, this is something worth keeping,” even when, sometimes, it’s dead wrong and we have to do our best to set it aside in the face of new evidence.
“All truths–even the laws of science–are subject to revision. We operate by them because they are necessary and they work.” (source)
Making Our Brains Work
As we all rush to use AI, we must do so with the understanding that learning to process information on our own yields tremendous benefits. After all, it’s long been known that if your brain doesn’t do the work, you won’t get the benefit of long-term information retention or develop the facility to process ideas and information:
“We all must add the action of our own minds in order to learn something.” Socrates expressed it two thousand years ago. He said that an idea should be born in the student’s mind, and the teacher should act as a midwife. The midwife shouldn’t interfere too much, too early. But if the labor of birth is difficult, the midwife must then intervene. The student learns by his own actions.
The short response is that we shouldn’t allow AI in elementary schools, then gradually blend it into the curriculum in middle and high school. It will be tempting to give children access early, but if students haven’t learned to process information, what’s the point of AI?
What role/how much should students be using AI, and does this vary based on grade level?
Brady asks an important question. AI’s role here will be the same one that some have tried to fill with Amazon’s and Google’s assistants (e.g., Alexa) in the classroom. Since you can tell AI what role to assume, it’s safe to predict that some will get it to produce instructions for elementary students.
“It’s a new age in content democratization. Prompting is going to go away in a few years. Middleware will disappear in time. Prompting is an artifact, a necessary evil. It will get so much smarter, sophisticated prompts will go away,” says Laura Bresko. Consider this a prediction of the near future.
It’s increasingly important that teachers (and schools) use whatever they can to create situations where students struggle to process content and learn. We don’t want to use AI to take away that valuable experience from learners (nor do we want to give it up as educators).
Listening to Dr. Douglas Fisher at a recent conference, I was struck by this quote:
The ability to think metacognitively helps students plan their learning goals.
We all need to be able to think metacognitively about our own work, how to use AI to support that work, and when NOT to use it. With the gold rush of AI tools, my concern is that we’ve decided “AI is ubiquitous” (it’s not, since, as Jon Fila says, people self-select to use it based on their resources).
While I’m as excited as anyone about what you can do with the shiny tools, I suspect we need to follow something like this:
- Grades PreK-5: No AI use except by teachers. Students need ample opportunities for productive struggle.
- Grades 6-8: Scaffolded use of AI, with support, to get analysis and feedback on their work. The idea is for students to get AI feedback only after the teacher has given feedback first. As the teacher, I get to see the work before the AI does.
- Grades 9-12: Use proven strategies to process ideas and information. Again, use AI to provide feedback to students.
- Adult Learners: Use AI tools to help guide areas of study, develop resources, and analyze information.
Just looking at that bulleted list, I realize how ignorant I am on this topic. I need to flesh this out some more, and the assertions could be wrong. Oh, well. I asked Claude.ai what it would do. Sigh.
Claude.ai’s Suggestions
Here are some suggestions for improving the use of AI across different grade levels and adult learning:
PreK-5:
– Allow limited use of AI for personalized and adaptive learning to reinforce concepts. This provides engagement and scaffolds instruction.
– Use AI tutoring for extra practice on foundational math and literacy skills. This frees up teachers for small group instruction.
– Do not replace human teachers with AI. Focus should be on social-emotional development.
6-8:
– Use AI writing assistants to provide feedback on drafts before submission. This teaches the iterative writing process.
– Have students analyze the AI’s feedback compared to a teacher’s feedback. Discuss similarities/differences.
– Allow open-ended conversational AI tutors for personalized Q&A on academic topics. Monitor for quality.
9-12:
– Apply AI assistants to provide research and study support. Teach evaluation of sources/claims.
– Use AI to flag potential plagiarism and suggest citations. Teach appropriate source integration.
– Allow AI collaboration for project-based learning. Emphasize teamwork and communication skills.
Adult Learners:
– Provide AI-powered individualized guidance for developing learning plans based on skills assessments.
– Curate AI learning resources and tools aligned to adult learning goals and interests. Guide selection.
– Use AI writing support to provide feedback on developing communication and workplace skills.
The key is maintaining human guidance, oversight and supplementing AI with social-emotional development. AI should enhance, not replace, human teachers across all age groups.
Hmm. This saved me a lot of time and effort. But shouldn’t I have done the work and research that went into those suggestions?
Image Credits
I had so much fun generating the images illustrating this blog entry. While I started with Nightcafe, Microsoft Bing’s Image Creator is my go-to tool. The prompt used:
a tidal wave of artificial intelligence sweeping away schools and people with money tycoons looking on
😉
Everything posted on Miguel Guhlin’s blogs/wikis is his personal opinion and does not necessarily represent the views of his employer(s) or their clients.