Well-crafted points on this topic by Miriam Reynoldson:
Using AI is not about communicating. It’s about avoiding communicating. It’s about not reading, not writing, not drawing, not seeing. It’s about ceding our powers of expression and comprehension to digital apps that will cushion us from fully participating in our own lives.
Generative AI use is degenerative to literacy.
Read her engaging piece, even if you are a proponent of “AI Literacy.” That term makes me nervous, and I suggest a different term below.

Having had a quite similar argument with someone about computer literacy, digital literacy, and the like, I am not unfamiliar with this perspective. These terms are a convenient, easy handle for people to grasp a discrete set of skills. But the vocabulary choice is problematic for many reasons, as Miriam asserts so eloquently.
Degenerative AI
Miriam’s point about genAI being degenerative to communication: I must agree, while suggesting that it isn’t always true.
While the BYU study cited below asserts something similar, one of the authors makes a key distinction (keep reading beyond this quote):
“If you use GenAI for all your assignments, you may get your work done quickly, but you didn’t learn at all,” Wells said in a statement. “What’s your value as a graduate if you just off-loaded all your intellectual work to a machine?” https://www.byteseu.com/1082444/
That’s what happens with AI literacy as Miriam describes it. You have given your work to a machine when you needed to do it yourself…rather than using the machine to build yourself up, you let it do it all.
In the BYU study, this point (the one I wanted you to read from earlier) is made as well:
“It is important to understand your goal behind creating something,” Steffen said. “Is it to learn? Is it to get something done fast? Are you presenting something as your own work, or as a gift to someone else? These questions can help us decide when to use AI.”
And that point is so often tossed aside by the AI resistance.
GenAI as A Support
Miriam acknowledges the uses of AI as a scaffold for learning:
Look, I’ve had students who have effectively used it as a scaffold to engage with a meaningful secondary discourse (academic literacy). And that’s wonderful. But it’s a scaffold, not the thing itself. My aim is for the student to achieve the actual skill.
I don’t disagree with AI as a scaffold. I use it exactly that way when exploring models. But at some point, I have to use the model unaided by AI, even if AI assisted in the model’s creation. Otherwise, my brain doesn’t get the benefit of long-term information retention, and I don’t develop skill at performing the task.
Generative AI Discourse Competence
But there are certain ways of using AI, and of knowing how to use it, that constitute a discrete set of skills. This is what people mean when they say “AI literacy.”
Rather than being about literacy as it has been defined in the past, it’s about learning how to use generative AI.
It’s also about being able to have a conversation about genAI, its uses in your field, whether it’s okay to use, when and when not to use it, how to recognize it, how to cite it, etc.
A better way to describe this is “generative AI discourse competence.” One way to define it:
GenAI discourse competence is the ability to confidently discuss generative AI’s applications, limitations, and ethical considerations within your field.
It means knowing when and how to use GenAI, recognizing its impact, and engaging in thoughtful conversations about its responsible use.
That aside, I agree with Miriam’s point about yet another literacy-related term muddying the pool in which other literacies exist. That’s why a different phrase, generative AI discourse competence, may work better.
As to Miriam’s other point, it is compelling to think AI is a scaffold that, if overused, impedes actual learning. For students, I agree wholeheartedly. That’s why the SOLO Taxonomy and Gen AI work well together for educators.
With such easy access, students may skip the hard work of learning. They have and will embrace easy GenAI. But hasn’t that always been true?
I keep coming back to the original purpose of doing something. Is it to learn, or to get something done quickly so you can get back to whatever else you really wanted to do in the first place?
When it’s to learn, the long road, the more difficult path, is the best way. When you’re in it to produce because your employer demands output, well, some combination of learning enough and producing quickly with genAI works, too.
What’s the Purpose?
• If it’s to learn, embrace the challenge.
• If it’s to produce, leverage the tools…but don’t lose sight of the value of genuine understanding.
Consider:
Use GenAI to enhance productivity, but set aside some time for deep learning.
Can students identify when this should be? That’s the discourse we need to be engaged in.
Note: I wrote this on my smartphone before getting up this morning. It’s possible my brain wasn’t working perfectly and I am ok with reading this later and realizing, “Oops, that was interesting.” So if my thinking or reasoning is faulty, better it get some sunshine splashed on it. Have a great day, sunshine. 😉