
Artificial Empathy? – Gabriel Stille
We are currently living through a breakthrough in Artificial Intelligence. New applications appear all the time, and chatbots increasingly take on the roles of buddy, coach, therapist or romantic partner. Applications trained on previous input can react to human emotional expressions and, in turn, express a response: a kind of artificial empathy.
On a historical note, some practices of humanistic psychology have from the very beginning been accused of being "robotic", in that they repeatedly offer reflections that essentially just repeat what has been said. One of the very first chatbots, ELIZA, developed in the 1960s, was modelled to reply like a humanistic therapist. In NVC circles, there is also an ongoing discussion on how the "mechanical" application of the four steps, with the lists of feelings and needs, compares with a grounded, lived-in focus on resonance, embodiment and connection.
The session starts with a presentation offering an overview of the topic and then moves into a space for reflection and discussion: what empathy means in our respective understandings, whether and when artificial empathy is useful, and how to respond to AI applications in our fields. It is possible to simply follow the session, but participants are warmly invited to share their own impressions, opinions and experiences.