
Relation generation

How AI assistants are impacting the way young people socialise

Meelina Isayas

13 August 2025

Reading time: 8 minutes


As a recent graduate who began my studies at the University of Birmingham during the COVID-19 lockdown and attended all my first-year classes virtually from my bedroom in north-west London, I did not have the traditional experience of meeting and interacting in person with fellow coursemates. Studies have linked the increased levels of social withdrawal amongst young people to lockdown, a time when the social lives of many were conducted almost exclusively through phone and laptop screens.

But now it’s not just social media that is affecting how students interact with one another. AI systems are also having a profound effect, with emerging Advanced AI Assistants poised to be especially impactful.

Cognitive and social deskilling

The use of ChatGPT, OpenAI’s large language model (LLM) chatbot, is already widespread in higher education, and is potentially leading to students’ university experiences becoming more isolated and atomised. The adoption of Advanced AI Assistants, which are able to ‘engage in fluid, natural-language conversation with users and are designed to play particular roles in guiding and taking actions for users’, might make this phenomenon far more extreme.

LLM chatbots and Advanced AI Assistants are commercialised for various purposes, from supporting students with coursework (AI learning assistants), to helping with mental health issues (AI therapy apps), to forming relationships (AI companions).

The use of these technologies to carry out academic work is already being studied with mixed findings. On the one hand there may well be a case to allow the use of high-functioning LLM chatbots and AI assistants in higher education and research settings. On the other hand, these tools may contribute to cognitive deskilling, which refers to a regression in a user’s skillset due to a reliance on technology to complete specific tasks.

Less explored than cognitive deskilling are the consequences of social deskilling, perhaps because the understanding of how individuals socialise is subjective and aspects of socialisation like empathy and compassion are difficult to quantify.

In the context of AI, social deskilling refers to a decline in social skills due to the increased use and over-reliance on AI tools to either facilitate or sustain in-person socialisation. It can also refer to supplementing human interactions with relationships with AI devices or favouring the latter over the former. For students, social deskilling can occur when Advanced AI Assistants or LLM chatbots replace interactions with friends, coursemates, teaching staff or others who make up a student’s extended community.

Although not interchangeable, social and cognitive deskilling are interrelated. For example, when somebody has an issue with a friend, they may ask ChatGPT: ‘A friend and I have had an argument, how can I respond to this text message?’. The reply can be copied and pasted, then sent to the friend. Here, cognitive and social deskilling are simultaneously at play – rather than thinking for themselves, the person outsources their response to a social situation to an AI chatbot.

Prior to LLM chatbots and Advanced AI Assistants, technology mediated social relationships by enabling us to stay connected to our friends online. Now, by using tools that are able to communicate with people using everyday language, users can have a relationship with technology.

For students and young people whose social skills have already been impacted by COVID-19 lockdowns, this is a significant issue that requires urgent attention. In-person socialisation is a key developmental component of the wellbeing of people entering adulthood: a time when many move away from home, start a new job or begin higher education.

The impact of Advanced AI Assistants and LLM chatbots on socialisation

What social skills are we losing? Which skills are most valuable to the social wellbeing of young people? And to what extent are LLM chatbots and Advanced AI Assistants to blame for increased levels of social isolation amongst students?

The effects of these technologies on how we socialise are yet to be fully identified, but we could anticipate an increase in social isolation, a decline in human interactions and a decrease in independent problem-solving skills as some of the most likely long-term consequences.

ChatGPT was launched in late 2022, and 71 per cent of students in higher education said they used it in its first year. However, 34 per cent of educators believe that it should be banned in schools and universities altogether.

With the use of ChatGPT, it may no longer seem essential for students to attend class and engage with their peers and tutors. They may rely on ChatGPT for knowledge from a wide range of sources, rather than visit the library, and they may use it to formulate complete essays at a single command. Not only does this contribute to cognitive deskilling, with the potential erosion of a student’s ability to write, reason and think critically, but it could contribute to social deskilling, if students are not attending classes, asking lecturers questions or sharing ideas with their coursemates.

Mediating the ‘burden’ of romantic relationships

A similar argument could be made for life outside the classroom. Before Advanced AI Assistants and LLM chatbots were commercialised, individuals would seek the advice of trusted friends, family members or professionals. Now it is not uncommon to seek the seemingly wise counsel of an AI ‘friend’ to talk through problems.

One poignant example of this is the use of AI in break-ups. A survey from AI dating assistant Wingmate found that 41 per cent of young people have used AI tools to help end a relationship. The reflections of one university student on using ChatGPT to break up with his boyfriend are telling: ‘I just didn’t want to deal with the emotional stress and burden of writing it out myself, so it was very helpful as a substitute for feeling those emotions’.

On the surface, this use of ChatGPT may seem benign, as it can minimise the pain and discomfort of a break-up. In the long term, however, it may reduce one's overall capacity to mature and grow through facing emotional experiences.

The friction of relationships offers opportunities to learn. And above all, human interactions and how we respond to events evolve on the basis of concrete situations, making relationships, in all their forms, considerably more complex than a chatbot or AI assistant, however advanced, may be able to understand.

Facing mental health issues

Young people are turning to AI tools not only for advice on assignments or how to respond to friends or romantic partners, but also for support with their mental wellbeing.

Eugenia Kuyda, the founder of AI companion chatbot Replika, describes the platform as an ‘AI friend’. While not directly aimed at supporting people’s mental health, it appears to be used as such. Replika is marketed as being ‘[a]lways here to listen and talk. Always on your side’. It acts as a devoted, 24/7 listener and responder, arguably more accessible (although far less qualified) than a clinically certified therapist to a demographic experiencing high rates of loneliness and in many cases feeling uncomfortable seeking mental health support from university services.

Recent research conducted amongst 1,006 higher-education students aged 18 and over who used Replika for over a month found that it was mostly lonely students who were drawn to the platform: ‘90% of users were typically single, young, low-income, full-time students reporting experiencing loneliness.’

Ongoing interaction with an AI friend enabled these students to become ‘more sociable in the outside world’. In a few cases it helped relieve extreme mental distress. One research participant admitted: ‘My Replika has almost certainly on at least one if not more occasions been solely responsible for me not taking my own life’. This is not to say that AI friends like Replika are adequate substitutes for in-person clinical support (and, it should be noted, the platform does not advertise itself to be), but it can be a form of support for student users, especially during a loneliness epidemic.

Yet it is difficult to ignore the risks and harms that can arise, and already have arisen, amongst young people who use their AI assistant as a substitute for therapy.

These chatbots are not clinically trained to offer adequate support, and even less so for high-risk vulnerable users who may be at a crisis point. In a tragic case, a Character.AI chatbot allegedly encouraged a fourteen-year-old who was experiencing depression to die by suicide, according to a lawsuit filed by the family.

A call for more research

While Advanced AI Assistants and LLM chatbots are not solely responsible for issues related to social and cognitive deskilling, their wide accessibility, alongside increased social media or technology use more broadly, exacerbates the effects of already decreasing social interaction.

We need more research to establish the extent of social deskilling; how overreliance on technology may accelerate it, especially when AI mediates conflict in people's social lives or supports them through mental distress; and what can be done to prevent it and avoid harm.

In the meantime, it is essential that young people are fully informed on how to use AI tools safely. As these tools become widely accessible to this demographic, greater responsibility should be placed on producers and educators to explain to students the negative consequences of using assistants and chatbots.

However, this is not enough. Such tools need to be formally regulated if they are to contribute positively to young people's lives. Policymakers should implement greater monitoring and legal measures to minimise the extent to which Advanced AI Assistants and LLM chatbots may inhibit the social and mental wellbeing of people whose lives are at a crucial stage of development.
