
Short-term gain and long-term loss: social AI and loneliness

Oct 30, 2024 | AI Ethics

Loneliness is something many of us will experience at some point in our lives; it is a natural reaction to solitary situations, given our inherent nature as social creatures. Nor is there a universal experience of loneliness: it is highly subjective and can be triggered by a multitude of circumstances, such as a divorce, an object tied to a painful memory, or even the layout of the places we inhabit.

While there is no single definition of loneliness, it can be described as a discrepancy between the quality and quantity of the relationships you have and those you want. This is why loneliness should not be conflated with being alone: you can be alone yet not lonely (enjoying a private moment in a busy work day, satisfied with the quality and quantity of your social relationships), and you can be lonely yet not alone (sitting in a room full of people you love, yet wanting the quantity and quality of those relationships to be higher).
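To make the discrepancy idea concrete, here is a toy sketch in Python. The fields and numbers are entirely hypothetical illustrations of the definition above, not a validated psychological measure:

```python
# Toy illustration of loneliness as a discrepancy between the social
# life you have and the one you want (hypothetical fields and scales).
from dataclasses import dataclass

@dataclass
class SocialLife:
    actual_quality: float    # perceived quality of current relationships (0-10)
    desired_quality: float   # quality of relationships the person wants (0-10)
    actual_quantity: float   # number of meaningful relationships
    desired_quantity: float  # number of meaningful relationships wanted

def loneliness_score(s: SocialLife) -> float:
    """Loneliness as the shortfall of actual against desired social life.
    A score of 0 means no discrepancy, i.e. not lonely."""
    quality_gap = max(0.0, s.desired_quality - s.actual_quality)
    quantity_gap = max(0.0, s.desired_quantity - s.actual_quantity)
    return quality_gap + quantity_gap

# Alone but not lonely: few contacts right now, but few are wanted.
print(loneliness_score(SocialLife(8, 8, 2, 2)))    # 0.0
# Surrounded by people yet lonely: quality falls short of what is wanted.
print(loneliness_score(SocialLife(3, 9, 10, 10)))  # 6.0
```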

Feelings of loneliness also come in degrees, stretching from short, less intense moments (such as having the house to yourself while your roommate is away) to chronic loneliness, where you feel lonely most, if not all, of the time. It is this latter form whose serious, sometimes fatal, consequences governments are now trying to address.

For example, city authorities in Seoul, South Korea, have this year committed nearly $327 million over the next five years to combat loneliness in the city. The World Health Organisation has launched a Commission on Social Connection (2024-2026) in recognition of loneliness as a global public health priority. There have even been structural governmental changes reflecting the urgency of the issue: both the British and Japanese governments have appointed loneliness ministers in the last five years.

One strategy that has also taken hold is the use of social artificial intelligence (AI), a branch of AI technologies dedicated to conversing and establishing relationships (both romantic and platonic) with humans. Given the scale of the loneliness epidemic and how stretched health practitioners are, social AI can provide 24/7 companionship that alleviates some of the immediate effects of feeling lonely. Some studies even suggest that social AI applications have helped users reconnect with people they had lost touch with. Hence, my research has focused on how effective social AI (particularly chatbots) can be in combatting loneliness.

Loneliness and social AI

I paid particular attention to chronically lonely emerging adults (aged 18-24), given that chronic loneliness is the most pernicious form of loneliness and emerging adults are an often-overlooked group in loneliness research. While such results must be taken with a pinch of salt, since loneliness reports are susceptible to self-reporting bias, they still point to a global trend: young adults, not the elderly as you might expect, are often the loneliest group in society.

With this backdrop in mind, I wanted to explore how a loneliness-specific chatbot intervention would need to be designed to be successful amongst chronically lonely emerging adults. Below are my main findings:

  • Stepping stone: such an intervention cannot replace human help. Instead, it must act as a gateway to further human-centric help. To illustrate: if an individual finding out they are chronically lonely is 0, and them receiving human-centric help (such as local community projects) is 1, a chatbot intervention would need to sit at 0.5 (see the sketch after this list).
  • Current companion apps are unsuitable for combatting chronic loneliness: AI companions (such as Replika, Crushon.AI, etc.) aim to retain users who are feeling lonely, rather than prioritising helping them reconnect with others. Some users do end up reconnecting, but this is likely in cases of transient, short-term loneliness, and it is not the apps’ priority.
  • Offboarding: chatbots are great at creating a ‘safe haven’ for disclosing personal information; you do not feel as though someone is judging you. This can create a strong bond between user and chatbot, so an effective offboarding system, one that allows this relationship to be closed in favour of moving on to human-centric help, is important.
  • There are persistent problems: issues around privacy, attachment and social deskilling, and hallucinations (when large language models present false information as if it were true) will not be solved soon. Hence, further consideration and mitigation are needed.
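To show how the stepping-stone and offboarding findings might fit together, here is a minimal sketch of the intervention's flow. Everything here (the stage values, the session threshold, the readiness signal) is a hypothetical placeholder for illustration, not a production triage system:

```python
# Sketch of the "stepping stone" design: the chatbot sits at 0.5, between a
# user discovering they are chronically lonely (0) and receiving
# human-centric help (1). All thresholds and signals are hypothetical.
from enum import Enum

class Stage(Enum):
    SELF_DISCOVERY = 0.0  # user realises they are chronically lonely
    CHATBOT = 0.5         # safe, non-judgemental disclosure space
    HUMAN_HELP = 1.0      # e.g. local community projects, counselling

def next_stage(stage: Stage, sessions: int, ready_to_offboard: bool) -> Stage:
    """Move the user towards human-centric help; never regress.
    The chatbot must remain a gateway, not a destination."""
    if stage is Stage.SELF_DISCOVERY:
        return Stage.CHATBOT
    if stage is Stage.CHATBOT and (ready_to_offboard or sessions >= 12):
        # Offboarding: deliberately close the user-chatbot bond and hand
        # over to human-centric options rather than maximising retention.
        return Stage.HUMAN_HELP
    return stage

# Example: after enough sessions, the intervention offboards the user.
stage = Stage.SELF_DISCOVERY
stage = next_stage(stage, sessions=0, ready_to_offboard=False)   # CHATBOT
stage = next_stage(stage, sessions=12, ready_to_offboard=False)  # HUMAN_HELP
print(stage)
```

The key design choice, in contrast to commercial companion apps, is that the transition out of the chatbot stage is a success condition rather than a churn event.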

Looking to the future, there will likely be a proliferation of AI companions and assistants, and a general increase in how often we interact with different forms of AI each day. For example, Microsoft has introduced AI agents into its workplace tools, designed to accompany employees in their daily work. As a result, it is important to consider carefully what we want to use AI for. The value of human-to-human connection is immeasurable and linked to many health benefits, so we should moderate our use of social AI accordingly, so as not to jeopardise it.

In sum, social AI can be a powerful tool against loneliness, but to be effective it should be reserved for infrequent periods of short-term loneliness. AI companions are run by private organisations with profit incentives to retain users, meaning they are not long-term solutions to issues such as loneliness. By the same token, not all social problems can be solved by technological means.