Trust or distrust? The future of autonomous systems in focus!


The University of Lübeck is hosting an online lecture on trust in autonomous systems on July 15, 2025. Be there!


In a world where autonomous systems increasingly permeate our everyday lives, trust and distrust have become central topics of discourse. An upcoming event at the University of Lübeck on July 15, 2025 will address precisely these questions: the “Virtual Ethical Innovation Lecture” with Alma Kolleck is dedicated to a critical analysis of the differing perceptions of autonomous technologies. As uni-luebeck.de reports, the lecture will discuss the factors that influence trust in artificially intelligent systems.

Skepticism is growing among the general public, particularly towards fully automated driving systems and automated weapon systems. This skepticism is not unfounded, as rejection of such technologies often dominates media debates. The upcoming lecture aims to provide a comprehensive analysis of how these systems are perceived and used, and to shed light on possible causes of the prevailing distrust.

Diversity of autonomous systems

The importance of autonomous systems extends to numerous areas of society. According to iese.fraunhofer.de, they offer potential for solving significant ecological and economic challenges. Autonomous systems are defined as technologies capable of achieving a desired goal without human intervention. A distinction is made between virtual and physical systems, ranging from cyber protection measures to robots and self-driving vehicles.

The applications of these technologies are diverse. Autonomous mobile robots can be used in logistics or healthcare, where they take on tasks that humans cannot perform at all or cannot perform efficiently. However, there are also hurdles that hinder widespread adoption. Trust in these systems is crucial, and that requires them to work safely and reliably. Fraunhofer IESE emphasizes that data protection and compliance with ethical standards are essential; only in this way can social values be anchored in autonomous behavior.

Ethical challenges and social acceptance

The development of autonomous systems also raises profound ethical questions. A central point is responsibility for the decisions these systems make. Who is liable if something goes wrong: the manufacturer, the programmer, or the user? As das-wissen.de emphasizes, these questions must be discussed in interdisciplinary dialogue. Transparency in decision-making processes and the traceability of the actions of autonomous systems are essential prerequisites for user trust.

Another problem is the potential for biases embedded in the systems' algorithms. These can reinforce existing social inequalities if access to the technologies is not guaranteed equally. It is therefore crucial that the algorithms' training data are diverse and representative in order to ensure fairness.

Social acceptance will be decisive in the coming years for advancing the integration of autonomous systems into everyday life. Clear communication, regular workshops and public dialogue are necessary tools that can help build trust. The upcoming event at the University of Lübeck will not only address pressing questions about trust and distrust in autonomous systems, but will also offer long-term perspectives on their development and social integration.