Earlier today, someone from a private, Catholic school wrote and asked me about COPPA and Gen AI use. I thought I'd share my response and, if someone has additional points to make, they can let me know. I don't pretend to be an expert; I'm only thinking of how I would approach this in a school or district. Not surprisingly, I've implemented similar solutions when I was in the exact same spot as the educator who wrote me.
Question
When I read your extremely well-written posts, I glean information that I regularly share with my teachers. We have a Microsoft environment. We do not use Google. Thank you for all you do.
In today's post, you wrote about Claude AI. I love the ideas, but you refer to using it in classes with elementary students. Can you talk me through how you're handling the age issue, please? My understanding is that since under-13-year-olds are not of age to give consent, they should not be using Claude AI. I look forward to your guidance on this slippery slope.
Response
Thank you for your kind words about my blog posts. I’m glad you find them helpful for sharing with your teachers.
You raise an excellent point about age restrictions and consent for students under 13. Of course, Vibes DIY (the tool I share in today's blog entry) does NOT collect any user data, so no worries about violating COPPA's under-13 rule. Claude Artifacts, however, is a different story. But it's still possible to use it in the classroom. Let's explore a few ways.
The Four Tests
You’re absolutely correct to be cautious. This can be a slippery slope, which is why I recommend applying the Four Tests (which I cover in a future, unpublished blog entry, so you’re getting a sneak preview!):
- The Problem Test: Does it solve a real problem?
- The Student Test: Does it truly improve learning?
- The Sustainability Test: Is the effort worth the reward?
- The Alignment Test: Does it fit our mission?
Aside: You know, as someone who attended private Catholic schools (oh, the privilege) K-12 (St. Mary's Catholic School in Panama, Holy Spirit in San Antonio, and Central Catholic Marianist High School in San Antonio), I really see no issue with using tools like Vibes DIY (which Doug Holton introduced me to in a Mastodon post) with students. What's more, the practical solutions below can work quite well in private, Catholic school environments, as well as in other religious settings.
Here’s how I approach this issue:
The Legal Framework
- COPPA (Children’s Online Privacy Protection Act) requires parental consent for collecting personal information from children under 13
- Most AI platforms, including Claude, have Terms of Service requiring users to be 13+ or have parental consent
- In loco parentis allows schools to act in place of parents for educational activities, but this has limits. I would recommend using the Four Tests above as part of the conversation on campus. The fourth test, The Alignment Test, can be especially helpful on a private school campus.
Practical Solutions I Recommend
- Teacher-Mediated Use: The teacher maintains the account and acts as the intermediary. Students provide input, but the teacher actually operates the AI tool. This keeps students from creating accounts or directly accessing the platform. This is usually how tools like Claude Artifacts are used: students do most of the planning in tools they DO have access to, like Microsoft OneNote or PowerPoint, and the teacher then executes the work in the Gen AI or ed tech tool of choice.
- Whole-Class Demonstrations: Project the AI interface on a screen and have collaborative discussions about prompts and outputs. Students learn AI literacy without individual access. (Aside: In fact, this is quite similar to the approach AI in the Writing Workshop takes).
- Parent Permission Forms: Some schools create specific permission slips for AI tool usage, similar to field trip forms. This provides documented consent.
- Alternative Tools: Consider age-appropriate alternatives like:
- Scratch for coding concepts
- Book Creator for digital storytelling
- Microsoft MakeCode (free), which introduces block coding; ample lessons are available (let me know if you need links, and see the sketch after this list)
- Microsoft's Minecraft, albeit with a per-user license
- MineTest, which is like Minecraft but totally FREE (read this blog entry)
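If you're curious what MakeCode's block coding translates to behind the scenes, here is a minimal sketch from the JavaScript view of the micro:bit editor. This is a hypothetical first-lesson program, not from any particular MakeCode lesson; each statement corresponds to a drag-and-drop block:

```typescript
// MakeCode (micro:bit) JavaScript view of a simple block program.
// Each statement maps one-to-one to a drag-and-drop block, so
// students can flip between blocks and text as they grow.

// When button A is pressed, scroll a greeting across the LED grid
input.onButtonPressed(Button.A, function () {
    basic.showString("Hello!")
})

// Loop forever: show a heart icon, pause one second, clear the screen
basic.forever(function () {
    basic.showIcon(IconNames.Heart)
    basic.pause(1000)
    basic.clearScreen()
})
```

Because blocks and text stay in sync, teachers can start students in blocks and let curious kids peek at the code, and MakeCode runs in the browser without requiring student accounts, which helps with the COPPA concerns above.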
For Your Microsoft Environment
Since you’re in a Microsoft ecosystem, please be aware that Microsoft Copilot is officially available only to users aged 13 and above; users under 13 are not permitted to use it, regardless of parental consent or school policy. That’s why I would prefer to use Vibes DIY, or take advantage of Practical Solutions 1, 2, and/or 4 above.
My Recommendation
For elementary students, I suggest starting with teacher-mediated demonstrations where you control the AI interaction while students contribute ideas and analyze outputs. This provides the educational benefits while maintaining compliance with age restrictions. But again, remember that Vibes DIY collects no student data. If you let parents know and offer an opt-out Responsible Use Agreement (RUA), where parents must specifically deny permission, then you can easily use Gen AI tools in loco parentis in the classroom. You can still follow the Game Design process outlined in this blog entry, Interactive Fiction Game Design with Claude Artifacts.
I hope this helps clarify what you can and cannot do. My suggestion is to get a stakeholder committee discussing the Four Tests. In this way, you can educate everyone on what the possibilities are. Chances are, parents are already making these tools accessible to students without the critical guidance they need. That's an aspect a school like yours (a religious, private school) is uniquely prepared to address.
Further Ruminations
You know, the Alignment Test really comes into play at a private, religious school. It comes to mind when I think about the article 8 Ethical Ways Teachers Can Use AI in the Classroom. Its roadmap offers some interesting points. Some might argue that NO Gen AI use can be ethical given the impact on people and the environment, and the way it's being pushed on people by tyrannical governments (no, I'm not spelling it out; you can make the connection yourself).
You may recall I shared the CARE Framework for Gen AI Ethics. I thought I’d get my CARE bot to create a response to the email…what do you think?
C – Critical Awareness
You are 100% right: students under 13 cannot and should not create their own accounts or consent to the terms of service for platforms like Claude, ChatGPT, or others. This is a firm line dictated by privacy laws like COPPA. This aligns perfectly with the core idea you mentioned from my posts: we can’t pretend the tool is ethical, so our method of use must be.
The foundational awareness here is that individual student use is not the goal or the method for elementary students. The only responsible approach is one where the teacher acts as a secure and thoughtful mediator.
A – Applied Purpose
With younger students, the purpose of using a tool like Claude is not to teach them “prompting” in a 1:1 setting. The purpose is to use the AI as a novel object of inquiry for the entire class. The goal is to:
- Model how these tools work in a safe environment.
- Spark creativity and collaborative storytelling.
- Create a “first draft” or a “silly example” that the students can then critique, improve, and build upon with their own human creativity.
The AI becomes a catalyst for conversation and critical thinking, led by the teacher.
R – Responsible Practice
This is the “how-to” that directly answers your question. The only responsible way to use generative AI with under-13 students is the “Teacher as Mediator” model.
- One Account Only: The teacher is the only one with an account.
- Projected for the Class: The teacher operates their account on a computer that is projected onto a smartboard or screen for the whole class to see. The students never touch the tool directly.
- No PII: The teacher ensures that no personally identifiable information (PII) from any student is ever entered into the prompt.
- Collaborative Prompting: The teacher leads a discussion to create a prompt together. For example: “Okay class, we’re going to write a story about a robot who is afraid of butterflies. What should we ask the AI to write for the first paragraph?” The teacher then types the co-created prompt.
- Shared Analysis: The class reads and analyzes the AI’s output together, led by the teacher.
In this model, the students are not “using Claude AI.” They are participating in a teacher-led lesson where an AI tool is the subject of discussion and interaction, all happening through the teacher’s secure account.
E – Evaluative Outcome
The success of the lesson is not measured by the quality of the AI’s output. The success is measured by the quality of the students’ discussion about the output. The teacher evaluates the outcome by asking questions like:
- “What parts of the AI’s story did we like? Why?”
- “What parts were confusing or didn’t make sense?”
- “How can we, as a class, write a better second paragraph than the AI did?”