Dossier: Intelligence

Can robots replace human interactions for mental health?

Life can be hard sometimes. Luckily, we don’t have to endure it alone. Talking to the people around us can give us hope and make our problems seem easier to tackle. Most people have conversations with their family, friends, and colleagues every day without really noticing their value for their mental wellbeing. Yet at the very moments when such a conversation is needed most, there may be nobody around to share our problems with. This is where Nightline Zurich, the anonymous and confidential listening service of VSETH, comes into play. The student-run service is available every night of the semester between 8 pm and midnight by phone, chat, and mail via nightline.ch. It aims to give students the opportunity to talk to fellow students anonymously about almost anything.

by Polykum Redaktion, May 27, 2024

Nightline: Available via Phone, Chat & Mail

During its early years, Nightline Zurich was reachable only by phone and saw a moderate number of calls. With the introduction of the chat system in 2017, however, the service experienced a huge increase in contacts. Apparently, the possibility of chatting instead of talking made conversations with strangers less intimidating and more accessible, and despite being more impersonal than phone calls, these conversations still provided sufficient relief. Given the improvement of conversational AI over the past few years and the increased demand for mental health support, the question arises whether a well-adjusted chatbot could replace a human partner in these chat conversations and still deliver similar support.

Eliza

The idea of using chatbots as conversational partners in place of humans is older than one might expect. As early as the mid-1960s, Joseph Weizenbaum developed Eliza, a program that could communicate with humans via text messages using a simple rule-based approach to natural language processing. It analyzed written input by looking for keywords, which were then used to craft a response according to a rule set predefined by the developer. One of these rule sets was the Doctor script, which imitated the responses of a therapist. This relatively simple concept already showed impressive conversational capabilities, and many people at the time felt as if they were talking to an actual therapist.
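To make the rule-based approach concrete, the following minimal Python sketch illustrates how an Eliza-style keyword rule could work. The keyword patterns, response templates, and pronoun table here are invented for illustration; they are not Weizenbaum’s original Doctor script.

import random
import re

# Minimal Eliza-style responder (an illustrative sketch, not Weizenbaum's
# original Doctor script). Each rule maps a keyword pattern to response
# templates; "{0}" is filled with a reflected copy of the text that
# follows the keyword.
RULES = [
    (re.compile(r"\bI feel (.*)", re.I), [
        "Why do you feel {0}?",
        "How long have you felt {0}?",
    ]),
    (re.compile(r"\bmy (.*)", re.I), [
        "Tell me more about your {0}.",
    ]),
]

# Swap first- and second-person words so the echoed fragment reads naturally.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are", "you": "I"}

def reflect(fragment: str) -> str:
    return " ".join(REFLECTIONS.get(word.lower(), word) for word in fragment.split())

def respond(user_input: str) -> str:
    for pattern, templates in RULES:
        match = pattern.search(user_input)
        if match:
            return random.choice(templates).format(reflect(match.group(1)))
    # Fall back to a neutral prompt when no keyword matches.
    return "Please tell me more."

print(respond("I feel lost in my studies"))
# -> e.g. "Why do you feel lost in your studies?"

As in the original program, the apparent understanding comes entirely from matching and echoing the user’s own words; there is no model of the user’s problem at all.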

While Eliza was originally developed to explore computers’ natural language processing capabilities, a growing number of services today aim to provide serious AI-based support for mental-health-related topics. Built on large language models, these new chatbots allow for longer and more coherent conversations and often provide reasonable, helpful responses that feel more human than those of the rule-based Eliza. These services aim to satisfy the growing demand for broadly accessible mental health care and to make therapy available to those who cannot, or do not want to, see a human therapist.

The Good vs. the Bad

These systems may have some advantages over traditional therapy, such as lower costs and availability at any time a patient faces a mental health crisis. Some patients may also feel more comfortable talking openly to chatbots, since there is no possibility of judgement or stigma when no real person is involved in the conversation. However, there are growing concerns about this kind of therapeutic approach. Research has shown that patients talking to chatbots were aware that there was no real understanding of their problems on the computer’s side and quickly lost interest in the conversation after receiving responses that did not match their expectations or wishes (Buis, An Overview of Chatbot-Based Mobile Health Apps, JMIR Mhealth Uhealth 2023). This lower acceptance of unwanted responses may be especially critical when the patient’s life is at stake: in the case of a suicidal patient, even if the system correctly identified and reacted to the situation, its response may carry less weight than the same words from a human being.

Tailoring the chatbot’s responses to cater to the patient’s expectations can be equally problematic. Such a chatbot would not only amplify a possibly unhealthy worldview, but would also give users total control over the relationship between themselves and their “therapist”. This may lead to an unhealthy attachment to the bot and lower users’ tolerance for real human relationships, which are shaped by disagreements and reconciliations, misunderstandings and clarifications. These raised expectations of conversation partners could hinder patients’ ability to connect with ordinary people, further isolating them from those around them.

Some Uncertainties Remain

It has been shown that the most significant predictor of the success of a therapy, or of any other interpersonal interaction, is the relationship between the people taking part in it (Flückiger et al., Journal of Counseling Psychology, 2012). While well-trained algorithms allow chatbots to mimic empathy and understanding to a degree that can lead to strong humanization of, and bonding with, the system, it is still unclear how this compares to a therapeutic relationship between real human beings. After all, treating a chatbot as a human involves some kind of (self-)deception about the nature of the conversation partner, and even the most valuable chatbot response does not change the fact that we are still alone with our lives. So while chatbots may become a valuable part of our mental health toolbox, alongside keeping a diary or going for a walk, they will not replace human interaction as an integral part of our wellbeing.

The goal of Nightline Zurich is thus to provide the possibility of a human connection between students in an anonymous and easily accessible fashion. Writing in the chat there, you will always talk to a fellow student who listens and understands, so that you are not alone. So if you need someone to share something with, be it happy or sad, someone to talk to or just to be there for you, write to Nightline or maybe even pick up the phone. We listen to you.

Gabriel Margiani, 29, doctoral candidate in physics. As president of Nightline Zurich, I often think about what constitutes an effective empathetic conversation and what to do about the loneliness of students.
