Artificial intelligence and social-emotional learning are on a collision course

AI is expected to significantly impact how children develop their sense of self and interact with each other, their teachers, their families, and the wider world.

That means how schools teach social skills may need to be updated, experts say, even as social-emotional skills matter more than ever in an AI-powered world.

For example, knowing how to build and maintain positive relationships is one of the pillars of social-emotional learning. Artificial intelligence could fundamentally reshape our relationships, including who or what we form them with, experts say.

“Our humanity, our ability to connect, empathize, and experience positive, loving, nurturing relationships that are productive for ourselves and for society, is at the core of who we are as human beings,” said Melissa Schlinger, vice president of innovations and partnerships at the Collaborative for Academic, Social, and Emotional Learning, or CASEL. “It’s exciting when technology can enhance that, but when you start replacing that, I think it becomes a really serious problem. I don’t know how you mitigate that. We see kids who are really addicted to their phones even without AI.”

Generative AI tools — chatbots like ChatGPT and the social media app Snapchat’s bot — may pose problems for developing students’ social and emotional skills: how they learn these skills, how they form relationships, and how they navigate online environments filled with AI-generated misinformation.

Students are already turning to generative AI chatbots for advice on handling their relationships. They’re asking chatbots about romantic relationships, conflicts with family and friends, and even anxiety and other mental health issues, according to a survey of 1,029 high school students conducted by the Center for Democracy and Technology.

Asking a chatbot for relationship advice

Chatbots have quickly become a popular tool for seeking advice on a variety of social and emotional issues, said Pat Yongpradit, chief academic officer at Code.org and the leader of TeachAI, a new initiative to support schools in the use and teaching of artificial intelligence. But there is a lot we don’t know about how these chatbots are trained and what information they are trained on. Generative AI technology is typically trained on vast amounts of data pulled from the internet, Yongpradit said; it is not a search engine or a “fact machine.” There is no guarantee that generative AI tools provide good or accurate advice.

“Kids personify these tools because of how they are represented in the user interface, and they feel they can ask them these questions,” he said. “People have to understand the limitations of these tools and how AI actually works. It’s not a replacement for humans. It’s a predictive text machine.”

Yongpradit pointed out that people are more likely to use a tool that responds in a human-like way, so if the tool is designed properly and provides accurate information, that can be a good thing.

But right now, because many AI-powered tools are so new, children and teens don’t understand how to use them properly, Yongpradit said, and neither do many adults.

This is one way that AI may change how students learn to handle social and emotional situations. But there are others, said Nancy Blair Black, head of the Artificial Intelligence Explorations project at the International Society for Technology in Education, or ISTE, particularly as these rapidly evolving chatbots may replace human relationships for some children.

“We’re talking about AI agents that we interact with as if they were human,” Black said. “Whether it’s chatbots, AI bots, or non-player characters in video games, it’s a whole extra layer. A year ago, those interactions were very simple. Now we’re finding they’re becoming complex.”

“Why would I work so hard to have a friendship when I have this chatbot that is so supportive?”

Some teens and adults are even developing romantic relationships with chatbots designed to provide companionship, such as the service offered by Replika, which allows subscribers to design their own personal chatbot companions.

Replika describes its chatbots as “AI for anyone who wants a friend without any judgement, drama, or social anxiety.”

“You can make a real emotional connection, share a laugh, or chat about anything you want!” the company’s website says. Subscribers can choose their relationship status with their chatbot, including “friend,” “romantic partner,” “mentor,” or “see how it goes.”

Replika also claims that chatbots can help users understand themselves better — from how interesting they are to how they handle stress — through personality tests administered by personal chatbots.

This was once the stuff of science fiction, but there is now concern that compliant chatbots may fuel unrealistic expectations for real relationships, which require give and take, or even dampen children’s interest in forming relationships with other people.

Schlinger said this is new territory for her as well as most teachers.

“Why would I work so hard to have a friendship when I have this very supportive chatbot — wasn’t there a movie about this?” Schlinger said. “I don’t think it’s unrealistic to see that as a scenario.”

How generative AI could help improve social-emotional skills

Generative AI will not be all bad for children’s social and emotional development. Black said there are ways the technology can help children build social and life skills. Imagine, she said, how a chatbot could help children overcome social anxiety by giving them a low-stakes way to practice interacting with people. Or how new AI-powered translation tools could make it easier for teachers who speak only English to communicate with students who are learning English.

This is not to mention the other benefits that AI brings to education, such as personalized virtual tutoring programs for students and time-saving tools for teachers.

When it comes to asking chatbots for advice on handling social situations and relationships, Schlinger said there’s value in kids having a nonjudgmental sounding board for their problems — assuming, of course, that they aren’t getting harmful advice. It’s possible that generative AI tools can give better advice than, say, a 13-year-old’s peers, Schlinger said.

But while the basic ideas that make up social and emotional learning remain relevant, AI will mean changes in how schools teach social and emotional skills.

Social and emotional learning (SEL) curricula will likely need a meaningful update, Black said.

With that in mind, Yongpradit said, schools and families should focus on teaching children from an early age how generative AI works, because it can have a profound impact on how children develop their relationships and sense of self.

Experts suggest that new and improved social and emotional learning curricula will need to teach children how AI can be biased and perpetuate harmful stereotypes. Much of the data used to train generative AI programs is not representative of the population, and these tools often amplify the biases in the information they are trained on. For example, a text-to-image generator that produces an image of a white man when asked to generate a picture of a doctor, and an image of a dark-skinned person when asked for a picture of a criminal, poses real problems for how young people understand the world.

Adults should also adjust how they interact with technology that mimics human interactions, and think about the social-emotional norms they may be inadvertently signaling to children and teens, Black said.

“Chatbots and those voice assistants, like Siri and Alexa, the ones that are supposed to be compliant, the ones that people command, are almost exclusively given a female persona,” she said. “That bias is getting out into the world. Children hear their parents interacting with and talking to female-voiced chatbots in demeaning ways, and ordering them around.”

“I think we will always crave interaction with others, and I don’t think AI can meet those needs.”

Where possible, Black recommends that teachers and parents change chatbot and virtual assistant voices to a gender-neutral option, and model polite interactions with Alexa and Siri.

But in the not-too-distant future, will artificial intelligence diminish our ability to interact positively with others? It’s not hard to imagine how a variety of everyday interactions and tasks — with a bank teller, a waiter, or even a teacher — could be replaced by a chatbot.

Black said she believes these scenarios are exactly why social-emotional learning will become even more important.

Social and emotional skills will play an important role in helping K-12 students distinguish true information from false information online, as artificial intelligence will likely increase the amount of misinformation in circulation. Some experts predict that as much as 90 percent of online content could be synthetically generated within the next few years. Even if that prediction falls short, the volume will still be enormous, and social-emotional skills such as managing emotions, impulse control, responsible decision-making, perspective-taking, and empathy will be essential for navigating this new online reality.

Other skills, such as adaptability and resilience, will be important in helping today’s children adjust to the rapid pace of technological change that many expect AI to usher in.

“I think we will always crave interaction with others, and I don’t think AI can meet those needs in the workplace or at home,” Black said. “I think the things that make us more human — our fallibility, our creativity, our empathy — are the things that will become more valuable in the workplace because they are harder to replace.”
