May 21, 2023
‘The genie is out of the bottle’: What parents should know about their kids using AI like ChatGPT
Are your kids using ChatGPT? Should they?
The chatbot revolution is here, presenting parents with new challenges in raising kids in a world increasingly reliant on powerful artificial intelligence (AI) programs.
It's OK for parents to be nervous about how technologies like ChatGPT will impact their kids, said Dr. Aimee Roundtree, associate dean of research in the College of Liberal Arts at Texas State University, whose research includes AI technologies like facial and speech recognition. She recently contributed to the World Economic Forum's Artificial Intelligence for Children Toolkit.
"Any parent who cares about their child is going to be concerned about this issue. I have a child myself, so I understand," Roundtree said.
"But we’re not going to retract this technology; there's not going to come a time where we won't use it. The genie is out of the bottle."
Only about 30% of parents say they’ve used ChatGPT, compared to 58% of kids ages 12-18, according to a survey out this month from Common Sense Media. And while half of adolescents report using ChatGPT for school, only about a quarter of parents reported knowing their child had used it for school.
ChatGPT is one of the most widely used AI chatbots. Trained on vast amounts of publicly available online text, it generates responses to user prompts ranging from the basic ("What's a good pot roast recipe?") to the elaborate ("Write a song about a penguin dance competition").
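For parents curious about the mechanics, here is a minimal sketch of how a program can send a question to ChatGPT through OpenAI's Python library. This is an illustration, not part of Roundtree's interview; it assumes you have an OpenAI account and an API key, and the model name and question are placeholders.

    # Minimal sketch: ask ChatGPT a question programmatically.
    # Assumes the official `openai` Python package is installed and an
    # API key is stored in the OPENAI_API_KEY environment variable.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # placeholder; any available chat model works
        messages=[{"role": "user", "content": "What's a good pot roast recipe?"}],
    )

    print(response.choices[0].message.content)

Typing the same question into the ChatGPT website goes through essentially the same steps; the website is just a friendlier front end.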
Roundtree recommends parents take time to think about how they want their kids to understand AI programs more broadly, what to expect from them and how they can hold those technologies accountable. Reckon spoke with Roundtree this week to learn more.
This interview has been edited for length and clarity.
The jury is still out (on students using ChatGPT for schoolwork). There's a big concern right now about ChatGPT and plagiarism. It's compelling us to expand the definition of plagiarism: How much can a computer help you write? If Grammarly or your Microsoft grammar checker can help, can ChatGPT help, and in what ways? The concern for an educator and the concern for a parent – and I’m both – should be: What is this teaching my child? What are we learning?
So will it ever be appropriate for a student to have ChatGPT write an essay for them? No. Could there be a learning task where ChatGPT is involved? I’m thinking, for example, of having ChatGPT write something and then having a student critique it. Maybe so.
The underlying principle should be that these tools – and all technology is a tool – are deployed to help our children learn; they're not a substitute for learning.
I think having familiarity with how these technologies work is really important. They are already invading many aspects of our lives, including things as mundane as your Amazon or Spotify list, recommending things that might interest you: If you like this, you might also like that. That kind of machine learning already underpins a lot of the technologies kids use and the ways they live their lives. It informs Instagram and TikTok. It informs what you’re shown or not shown on YouTube.
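To make that idea concrete, here is a toy sketch of "people who liked this might also like that," written in Python. The listening histories are made up, and real services use far more sophisticated models than this simple co-occurrence count.

    # A toy "people who liked this also liked that" recommender.
    # Real services use far more sophisticated models; this one only
    # counts how often two items appear in the same person's history.
    from collections import Counter
    from itertools import combinations

    # Hypothetical listening histories (invented data for illustration).
    histories = [
        {"Taylor Swift", "Olivia Rodrigo", "Billie Eilish"},
        {"Taylor Swift", "Olivia Rodrigo"},
        {"Billie Eilish", "Lorde"},
        {"Taylor Swift", "Lorde", "Olivia Rodrigo"},
    ]

    # Count how often each pair of artists shows up together.
    pair_counts = Counter()
    for history in histories:
        for a, b in combinations(sorted(history), 2):
            pair_counts[(a, b)] += 1

    def recommend(artist, top_n=2):
        """Suggest artists that most often co-occur with `artist`."""
        scores = Counter()
        for (a, b), count in pair_counts.items():
            if a == artist:
                scores[b] += count
            elif b == artist:
                scores[a] += count
        return [name for name, _ in scores.most_common(top_n)]

    print(recommend("Taylor Swift"))  # e.g. ['Olivia Rodrigo', 'Billie Eilish']

The point for kids: the "recommendation" is nothing more than tallying which items show up together. There is no insight into taste, only counts.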
In my mind, we’d want to begin an age-appropriate path to learning about these technologies from a pretty early age. That can vary (by child), but as soon as they’re using a computer, they can learn.
Having some familiarity with the basic premise of how algorithms work is important because (AI technologies) have a way of operating invisibly – you don't even realize how much they’re impacting your life. They just become a part of how you expect to interact with the world and with others.
It's important for children to understand the limitations of the technology. YouTube isn't feeding you those videos because of who you are or who it has predicted you to be. It's just an interpretation of the numbers: how many times you’ve watched a video, how many other people have watched it and how many times you might share it. It's predictive logic.
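As an illustration of that predictive logic, here is a toy ranking that scores videos purely on engagement counts. The videos, counts and weights are invented, and YouTube's real system is vastly more complex, but the flavor is the same: arithmetic on numbers, not understanding.

    # A toy illustration of "predictive logic": ranking videos purely
    # by engagement counts. The data and weights are made up.

    videos = [
        # (title, your_watches, total_watches, shares)
        ("Penguin dance compilation", 5, 1_200_000, 300),
        ("Pot roast in 10 minutes",   1,   400_000,  50),
        ("Algebra homework help",     3,    90_000,  10),
    ]

    def predicted_interest(your_watches, total_watches, shares):
        """Crude score: weight your own rewatches most heavily,
        then overall popularity, then how often it gets shared."""
        return 10 * your_watches + total_watches / 100_000 + 0.5 * shares

    ranked = sorted(
        videos,
        key=lambda v: predicted_interest(v[1], v[2], v[3]),
        reverse=True,
    )

    for title, *_ in ranked:
        print(title)
    # The "recommendation" is just arithmetic on counts -- no sense
    # of who you are or why you watched.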
Part of what gives humanity the potential for change and growth is self-awareness and the freedom to make decisions beyond the one that's been presented in front of you. The earlier kids can learn that, the better.
My question to my child would be, ‘Why do you want to use it?’ and ‘Where did you hear about it?’
Try to engage with your child so you can gauge the context of the ask. There might be circumstances where it's appropriate; for example, if the child is a little older and it could be useful for an assignment.
I’d be really conscious about making sure we also begin a conversation (with our kids) about the limitations of this kind of technology, about what it can and cannot tell you about decision making and creativity. These are the kinds of characteristics I’m hoping to develop in my child, not outsource to a technology.
Now that we have this technology here with us, I think the issue will be holding ourselves and our kids accountable, and holding the companies that create these technologies and the developers who design these technologies accountable.
We have to teach our children to hold these technologies accountable for being fair, inclusive, responsible and transparent.
More resources:
Guide to ChatGPT for Parents and Caregivers, by Common Sense Media
Artificial Intelligence for Children Toolkit, from the World Economic Forum
RAISE (Responsible AI for Social Empowerment and Education), from MIT