How robots recognize feelings: study shows new ways of communication!
Craft a short meta-description for an article about "UNI Chemnitz", in German. The article contains the following content: "
Today is September 15th, 2025
Date: September 15th, 2025 - Source 1 ():
- Study by Chemnitz University of Technology examines conversations between humans and robots.
- Goal: Find out whether people speak differently to robots than to other people.
- Interdisciplinary team from neurorobotics and linguistics carried out experiments.
- Experiment: Humans and an industrial robot arm with voice functions build a simple IKEA shelf together.
- Teams were recorded during the task, conversations were transcribed and analyzed.
- Results:
- In purely human teams there were more statements, explanations and questions.
- Robot received more direct instructions.
- There were almost as many emotional statements in human-robot teams as between humans (e.g. “You're doing well.”).
- Study was published “open access” to reach a wider public.
- Podcast “Linguistics Behind the Scenes” published to communicate research topics in a generally understandable way.
- Latest podcast episode covers artificial intelligence and the robot study.
- Publication: Coelho, Kaden, Beccard, Röhrbein & Sanchez-Stockhammer. 2025. "Another bit. Upwards. Okay, stop." Proceedings of Mensch und Computer 2025 (MuC '25), 465-470. DOI:
- Podcast episode available on YouTube, Spotify and Apple Podcasts.
- Contact: Prof. Dr. Christina Sanchez-Stockhammer, telephone +49 (0)371 531-32444, email christina.sanchez@phil.tu-chemnitz.de.
Source 2 ():
- Study by Kim Klüber, Katharina Schwaiger and Prof. Dr. Linda Onnasch investigates the influence of robots' emotional speech on the perception of their social and emotional abilities.
- Title of the study: “Affect-enhancing speech characteristics for robotic communication”.
- Study was published as a highlighted article in Science Robotics.
- Results show that emotional speech and expressive intonation influence human-like perception of robots.
- Particularly pronounced effect on technical-looking agents, where affective communication can increase social acceptance.
- Study contributes to the further development of voice-based robot systems in order to make their use more intuitive and acceptable.
- Full study is available open access.
Source 3 ():
- There have been significant developments in the field of artificial intelligence (AI) in recent months.
- The release of ChatGPT and announcements from other tech giants have increased interest in language models.
- Attention to AI issues increased with the announcement and release of GPT-4.
- Language models could also be important for robotics.
- A research group from Google and TU Berlin presented the language model PaLM-E, which is combined with a vision model.
- PaLM-E enables a robot with a gripper arm and camera to carry out commands such as “Bring me the rice chips from the drawer.” and to react to changes in the environment.
- The interaction between people and machines could change, as people often interact with machines differently than with other people.
- Developers make design choices to make language models appear more human, e.g. by using emojis or delaying the display of answers.
- A press briefing was organized to discuss issues related to research and developments in language models, robotics and human-machine interaction.
- Briefing topics include:
- Next steps in the research and application of language models.
- Possibilities for increasing the size and performance of the models.
- Combination of language models with research systems.
- Advances in robotics through language models.
- Emergent capabilities of language models that have not been explicitly trained.
- Future interactions between humans and machines.
- Psychological effects of these interactions on people.
- Researchers answered questions in a 50-minute virtual press briefing.
". Don't add the title at the beginning of the created content. Write it as if you want to inform the readers about who, what, when, where, why and how. Dont exceed 120 characters. Style: Maintain a professional level of formality suitable for a newspaper, but avoid overly complex language to ensure the content is accessible to a wide audience. Include keywords related to the news event and phrases likely to be used by readers searching for information on the topic. Tone: While keeping the tone professional, use engaging language to capture the reader's interest without sensationalizing. Reply in plain text without putting the meta-description into any quotes.

A recent study by Chemnitz University of Technology examines communication between humans and robots. The aim of the research is to find out whether people speak differently to robots than to other people. An interdisciplinary team of experts in neurorobotics and linguistics carried out a series of experiments.
In one experiment, humans and an industrial robot arm with voice functions worked together to build a simple IKEA shelf. The teams were recorded during the task, and the conversations were transcribed and analyzed. The results show striking differences: in all-human teams there were more statements, explanations and questions, while the robot mainly received direct instructions. Interestingly, there were almost as many emotional expressions in the human-robot teams as in the all-human teams, for example “You're doing well.”
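To illustrate the kind of analysis described above, the following is a minimal Python sketch of how transcribed utterances could be tagged and counted by type. The categories, keyword rules and example sentences are assumptions made purely for illustration; they are not the coding scheme or data used in the Chemnitz study.

from collections import Counter

def tag_utterance(text: str) -> str:
    """Assign a rough utterance type using simple surface cues (illustrative only)."""
    lowered = text.lower().strip()
    if lowered.endswith("?"):
        return "question"
    if any(cue in lowered for cue in ("doing well", "well done", "great", "good job")):
        return "emotional"
    if any(lowered.startswith(verb) for verb in ("move", "stop", "hold", "turn", "lift")):
        return "instruction"
    return "statement"

def count_types(transcript: list[str]) -> Counter:
    """Count how often each utterance type occurs in one transcript."""
    return Counter(tag_utterance(u) for u in transcript)

# Hypothetical mini-transcripts for the two team settings.
human_human = ["Could you hold the shelf?", "I think this screw goes here.", "You're doing well."]
human_robot = ["Move up a bit.", "Stop.", "You're doing well."]

print("human-human:", count_types(human_human))
print("human-robot:", count_types(human_robot))

Running this toy example would show more "instruction" tags in the human-robot transcript and emotional remarks in both, mirroring the pattern the researchers report.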
Emotional communication and robotics
In parallel to these findings, a study by Kim Klüber, Katharina Schwaiger and Prof. Dr. Linda Onnasch at the Technical University of Berlin examined how robots' emotional speech influences the perception of their social and emotional abilities. The study, titled “Affect-enhancing speech characteristics for robotic communication”, was published as a highlighted article in Science Robotics.
The results demonstrate that emotional speech and expressive intonation influence how human-like robots are perceived. The effect was particularly pronounced for technical-looking agents, where affective communication can increase the social acceptance of the machines. The findings contribute to the further development of voice-based robotic systems, with the aim of making their use more intuitive and acceptable to users.
Language models in interactive robotics
The discussion about more human-like interaction between humans and machines is being driven forward by advances in artificial intelligence. In recent months, language models such as ChatGPT and GPT-4 have attracted growing interest, and such models could also become important for robotics. A press briefing organized by the Science Media Center highlighted these developments and their potential applications.
One example is the PaLM-E language model, developed by a research group at Google in collaboration with TU Berlin. It enables a robot with a gripper arm and camera to carry out complex voice commands such as “Bring me the rice chips from the drawer.” and to react to changes in its environment. Such developments could significantly change the nature of interaction between humans and machines, as people often interact with machines differently than with other people. Developers also take deliberate steps to make language models appear more human, for example by using emojis or delaying the display of answers.
In the 50-minute virtual briefing, researchers answered questions about the next steps in language model research, the combination of language models with research systems, and the psychological effects of these interactions on people.
For more information, visit the website of Chemnitz University of Technology or contact Prof. Dr. Christina Sanchez-Stockhammer by telephone at +49 (0)371 531-32444 or by email at christina.sanchez@phil.tu-chemnitz.de.