Can a robot be a better conversation partner?
For Ursula and Raffaele, this is not a rhetorical question. As part of a pilot project from the Citizens Consider(ed) seminar, a project group looked at care assistance robots. Sarah Sbalchiero was part of this team. She reports on the pilot project, the conversations with two citizen scientists, and the challenges involved.
Author: Sarah Sbalchiero
Our aim was not to get lost in theoretical discussions about AI-supported robots and demographic change. Instead, we left the university buildings to listen to those who don't have to deal with neural networks or ethics guidelines every day, but who do have to deal with the question of how technology is changing their everyday lives. We wanted to find out what happens when care assistance robots not only serve coffee or transport medication, but also talk to us. What happens when elderly people interact with a robot and try to feel understood by it?
We invited Ursula (82) and Raffaele (75), two citizen scientists from the Zurich region, to a discussion format at F&P Robotics. They belong to an age group that is likely to come into increasing contact with robots in the future. In the care sector in particular, robots are seen as a possible answer to the shortage of skilled workers that is increasingly challenging the care of elderly people. At our meeting, the focus lay not on the technology, but on the shared experience: How does it feel to encounter a talking robot? What works well, what is irritating? Ursula and Raffaele were able to touch the robot, talk to it, try it out, observe it, and speak with us researchers and with Niels Schlegel, an AI and robot software developer at F&P Robotics.
A robot as a listener?
The robot is called Lio. It is manufactured by the company F&P Robotics in Zurich. Lio moves slowly so as not to endanger anyone. Its soft surface feels more pleasant than plastic. It shouldn't look too human, but remain clearly recognizable as a robot – yet with eyes, it looks friendlier. And according to Niels, it may soon be equipped with a large language model, i.e. an algorithm that draws on large data sets and is trained for conversation.
Ursula, one of the citizen scientists, met Lio in an office space at F&P Robotics. She was curious and courageous enough to strike up a conversation with it. What she said surprised me:
"I find a conversation with him more pleasant than with a person who doesn't understand me [...]. Because he has no feelings. But I can get rid of mine. And then it's right for me."
For her, the robot was not merely a machine, but a neutral counterpart that does not judge. One that doesn't interrupt when she wants to let off steam about the "jerk next door". Someone who doesn't roll their eyes in annoyance when she raises a topic yet again. Someone who is simply there and doesn't run away when things get personal.
For Ursula, the robot's lack of emotion was liberating. A counterpart that simply listens without judging.
Technology that remains unfamiliar
Not everyone saw it that way. Raffaele remained skeptical:
"For me, a personal conversation means that I have something I want to discuss with someone. But I can't get through to him. He's not a person – he's just a machine! I like him as a household helper, but I can't connect with him personally. He remains something alien to me."
For him, Lio was not neutral, but empty and alien: simply a machine. It was clear to him: a robot can help in everyday life, but it cannot substitute for a conversation partner. His skepticism was understandable. The exchange with Lio didn't feel like a free conversation.
Ursula put it like this:
"He [Lio] is made available to me and I think: Oh, how nice – now I have someone to talk to. And then I'm forced to give answers."
The robot asks questions constantly, a conversational dynamic that resembles a quiz more than listening.
Raffaele added:
"It was the same for me: he forces the question he wants on you."
Humans make machines
There was no malicious intent behind this, but rather a configured algorithm. Niels, who had built the language model for Lio and continues to develop it, openly explained that he had programmed precisely this behavior:
"I basically told him: Try to involve the users, i.e. you, in the interaction again and again. Keep asking questions. And now I realize: Ok, that's too much for you. It's imposing something. I can reduce that."
It was an exciting moment that reminded us that technology is a product of human activity. Whether a conversation is perceived as "free" depends on how the AI has been configured. It is designed by humans, with assumptions, intentions, and sometimes well-intentioned mistakes.
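Behavior like Lio's constant questioning is typically steered through the instructions given to the language model before a conversation starts. A minimal sketch of how such a setting might be made adjustable, assuming a hypothetical configuration function (the names and prompt wording are illustrative, not F&P Robotics' actual code):

```python
# Hypothetical sketch: steering how often a conversational robot asks
# questions via its system instructions. Function and parameter names
# are illustrative assumptions, not Lio's real implementation.

def build_system_prompt(engagement: str = "high") -> str:
    """Compose system instructions that control the questioning behavior."""
    base = "You are a friendly care-assistance robot. Listen attentively."
    styles = {
        # Roughly the behavior Ursula and Raffaele experienced:
        "high": ("Involve the user in the interaction again and again; "
                 "end every reply with a question."),
        # The reduced setting Niels said he could configure:
        "low": ("Only ask a question when the user invites one; "
                "otherwise simply acknowledge what was said."),
    }
    return f"{base} {styles[engagement]}"

# The resulting string would be passed to the language model as its
# system message; changing one parameter changes the felt conversation.
```

The point of the sketch is the one Niels made in person: the "quiz-like" dynamic is not a property of the machine, but a single configurable instruction that a developer can dial up or down.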
This snapshot from the seminar project shows me the potential of participatory research. Involving different interest groups and taking an open approach surfaces surprising and uncomfortable perspectives that can offer real added value for research close to the population.

About Sarah Sbalchiero: Sarah Sbalchiero studied Visual Communication. She worked for several years in various media companies, including SRF, before completing a Master's degree in Strategic Design with a focus on complexity in Barcelona. She is currently completing another Master's degree in sociology. She took part in the "Citizens Consider(ed)" teaching project in the spring semester of 2025.