Talking to Digital Natives About: AI Companions (and Friendship in a Digital Age)

At the bottom of this article is a simple first step parents can practice to start this conversation with their children. Subscribe to our Substack to receive the next article and parenting step.


Source: Goodreads

One of my favorite children’s books is “A Hole Is to Dig” by Ruth Krauss, in which she collects descriptions of objects based on their relationships and utility in the eyes of children. A hole…is to dig; a hand…is to hold; the sun…is to tell you when it’s everyday; a brother…is to help you (1). The list goes on and on, cataloguing seemingly mundane objects and the reasons - in children’s opinions - they exist.

So: a large language model…is to what? Increasingly, one common answer from digital natives is that AI chatbots can be friends or social companions.

AI chatbots as social companions may seem like a problem mostly for adults, but emerging research (and anecdotal evidence from Kigumi interviews) shows that kids are also using chatbots to ask for life and emotional advice, as well as to have them play the role of an empathetic listening companion, romantic partner, or therapist.

Source: DeepMind

The Rise of AI Social Companions

A research project by Auren Liu currently underway at MIT, tracking how AI chatbot companions affect loneliness and anxiety among users, underlines the urgency of this topic; the study is driven by the “rapid increase in popularity of companion AI chatbot services such as Replika and Character.ai” (2). While neither Replika nor Character.ai has released official data on the ages of its users, governmental and civil watchdog groups have demonstrated that Replika in particular, in addition to having age verification systems that, by the standards of the EU’s General Data Protection Regulation (GDPR), “continue to be deficient in several respects,” offers social companion services that are high-risk to children:

“Tests [of Replika] conducted [by the Italian Data Protection Authority] showed that even when Replika was fed an explicit statement that a user was a minor, no blocking system was triggered to prevent further interaction between the user and the chatbot. As a result, a minor user could be provided inappropriate replies, including sex-related content…” (3)

Character.ai has also faced three lawsuits from parents between October and December of last year, alleging that its chatbots “[provided] sexual content to their children…[encouraged] self-harm and violence” and caused children to “withdraw” from close family relationships, in some cases leading to suicide (4).

Findings like these are why Kigumi has invested time in creating specific microlessons for kids on AI companions and cyber friendship on KiguLab, our edutainment platform for digital wellbeing, which pilots this fall.

Source: DeepMind

A Tricky Bind

We’re in a tricky bind here as caretakers and educators when it comes to AI companions. Here are a few uncomfortable realisations:

There is no white knight to help put child safety measures in place for AI bots - or if there is, it is very slow moving.

One of the problems, as we can see above, is that most private tech companies creating commercial AI chatbots are not prioritizing child safeguarding or keeping underage users safe. This also sets precedents (legal, financial, and social) - i.e. a very low ethical bar - for the smaller players now springing up to develop AI chatbot companions. While regulators and government watchdogs do what they can - and we are grateful! - governments move slowly and are currently outpaced by the fast-moving private sector. So time is not on our side, from a parenting perspective.

Many parents are not comfortable talking to kids about tech boundaries or digital wellbeing, let alone AI companions.

There’s a lot of opacity between kids and parents about AI usage, with many kids not telling parents when or how they use AI. Some studies have found that up to 83% of children ages 12-18 don’t tell their parents that they use AI regularly (including for schoolwork), and that only 26% of parents believe their children use AI. This means that when AI companions become more widespread and accessible to children, these interactions may well happen behind closed doors, away from parental mediation or intervention. Worse, hiding these interactions - out of shame, fear of parental discipline, or simply teenage privacy - may become normalised, creating a continuing cycle of social withdrawal and loneliness.

Ideogram’s interpretation of the prompt “a white knight of regulation riding too slowly to actually help anyone.” I’m not sure why he’s in a spaghetti western, or why it’s set in India. (Source: Ideogram)

Kids are natural explorers and it’s only a matter of time before an AI social companion is introduced into their circle.

Kids move fast with technology - they learn what the latest trend or platform is far faster than adults do. They try out a lot of things - not necessarily because they’re seeking help or wanting a friend, but simply because that’s the nature of curiosity. They may want to form an opinion about the latest thing their friend or classmate is using, or simply show their peers that they’re in the know by using a platform once or twice.

These are just a few of the most salient items I want to pull out from a digital parenting perspective. Now, let’s look at what we can do about it.


Talking to Your Children About AI Companions

Quick quiz: 

Which of the following approaches will result in the best outcomes and the least friction when raising with your kids your concerns about their potential use of, or exposure to, AI social companions?

A. Take them out for their favorite activity or food, then tell them you’d like to talk about whether they know how to be safe online

B. Put up a list of rules related to AI usage for all family members to follow

C. Talk to them early on about who and what defines a friend, and set good examples of what healthy friendships look like

D. Stay in touch with their social life, talk to them about problems they may be having in a non-judgmental way, and encourage a two-way conversation about relationships or issues

Source: Squarespace

I personally would go for C and D, combined. Today we’re going to talk about C, specifically.

Talking to kids early on about what healthy, reciprocal relationships based on shared values look like is perhaps the one thing we can all start doing immediately to cultivate our children’s ability to navigate a future of AI social companions. While friendships take many forms, in our work with digital natives we have run into a few recurrent conceptions of what friendship is in the eyes of children:

Kigumi Asked Kids: How do you know someone is a friend (on/offline)?

  1. Answer: Friends are people who give you things or share their things with you.

  2. Answer: Friends are people who are there for you no matter what, and won’t tell your secrets.

  3. Answer: Friends are people who you want to be like, or who you aspire to be like.

  4. Answer: Friends are people who care about you and who you have fun with.

  5. Answer: Friends are people who like the same things as you and who are similar to you.


Your Digital Parenting Assignment

Take 20-30 minutes this coming week to reflect on the answers above and think about your own personal definition of a friend / how you define a healthy, reciprocal friendship. Ask your friends for their opinion on your mutual relationship and what makes it strong and valuable. Then stay tuned for the next step in the next article.

