Using self-awareness to explore the world

by Karen Meehan for the magazine ad astra; photo: goinyk/Adobestock

Bernhard Rinner, Professor of Pervasive Systems at the Institute of Networked and Embedded Systems, describes the fascination and challenges associated with a highly topical interdisciplinary research field.

Self-driving cars, networked devices within the Internet of Things, software programmes such as “bots”, and the rover Perseverance, which is currently exploring the Jezero Crater on Mars – all of these are systems that continuously collect information about their own state and their environment and then derive their (future) behaviour from the analysed data. Just like biological systems, they increasingly use capabilities such as “self-awareness” and “proprioception” to improve their perception and their behavioural planning. Originally developed in the field of cognitive science to describe in detail the mechanisms involved in biological processes, these ideas have now been transferred to technical systems.

Working closely with eight European partners in the interdisciplinary, EU-funded project EPiCS (Engineering Proprioception in Computing Systems), Bernhard Rinner and his team first investigated how computer systems that must adapt to constant change can be endowed with “self-awareness”.

The principle can be illustrated using the example of a drone: Through internal sensors (e.g. battery level or position sensor) and external sensors (e.g. camera or laser sensor), the drone collects data and compares it with previously stored models that describe the available knowledge about the flying robot in a structured form. The drone compares the behaviour observed through the sensors with the behaviour predicted by the models and has the ability to detect discrepancies. If a discrepancy exists, the drone’s “self-awareness” triggers a learning process that models the current – as yet unknown – behaviour. Over time, the drone refines the existing models or generates new models and adds these to its memory.
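The monitor–compare–learn loop described above can be sketched in a few lines of code. This is purely illustrative: the class, the battery-drain model and all thresholds are invented for this example and do not reflect the project’s actual software.

```python
# Illustrative sketch of a self-aware monitoring loop: observe, compare
# against a stored model, and trigger learning when they disagree.
# All names and the simple linear battery model are assumptions.

class SelfAwareDrone:
    def __init__(self, threshold=0.05):
        self.threshold = threshold             # tolerated prediction error
        self.models = {"battery_drain": 0.01}  # stored model: fraction drained per second

    def predict_battery(self, level, dt):
        """Predict the battery level dt seconds ahead using the stored model."""
        return level - self.models["battery_drain"] * dt

    def observe_and_compare(self, prev_level, observed_level, dt):
        """Compare observed behaviour with the model's prediction."""
        predicted = self.predict_battery(prev_level, dt)
        discrepancy = abs(observed_level - predicted)
        if discrepancy > self.threshold:
            # "Self-awareness": the mismatch triggers a learning step
            self.learn(prev_level, observed_level, dt)
        return discrepancy

    def learn(self, prev_level, observed_level, dt):
        """Refine the stored model using the newly observed behaviour."""
        observed_rate = (prev_level - observed_level) / dt
        old_rate = self.models["battery_drain"]
        # Simple exponential smoothing towards the observed drain rate
        self.models["battery_drain"] = 0.5 * old_rate + 0.5 * observed_rate


drone = SelfAwareDrone()
# Battery drains faster than the model predicts, so learning is triggered
# and the stored drain-rate model is refined.
drone.observe_and_compare(prev_level=1.0, observed_level=0.8, dt=10)
```

A real system would maintain many such models (position, obstacles, sensor health) and replace the smoothing step with proper machine-learning methods, but the control flow is the same.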

Autonomous systems have to accomplish a number of highly complex tasks at the same time. For instance, a drone has to navigate over unknown terrain, detect obstacles and analyse recorded sensor data. Each task involves generating, reviewing and refining specific models. This means that the complexity and number of models, as well as the drone’s knowledge about its tasks and its environment, are constantly growing. To manage this, researchers can draw on proven methods from the field of machine learning, although these are computationally very intensive. Intensive research is currently underway into resource-efficient learning methods and ways of reducing modelling complexity.

Another compelling research dimension arises when the concept of “self-awareness” is applied to groups, swarms or networks. Under the heading of “collective self-awareness”, researchers in Klagenfurt working in the focal research area of “self-organising systems” are exploring how individual agents interact directly with each other and how group or swarm behaviour emerges autonomously from these interactions.

Autonomous systems hold great interest for a variety of disciplines. Decision-making processes in technical, biological, economic and social systems are the subject of close scrutiny, and cross-disciplinary collaborations in these areas frequently break new scientific ground. Within the doctoral programme DECIDE, decision-making behaviour in the digital age is being investigated from an economic, technical and psychological perspective. The field of robot ethics deals with morality and responsibility in relation to machines that act autonomously. Disciplines in the humanities and social sciences, such as sociology, are increasingly concerned with the interaction between humans and autonomous systems. This poses a significant challenge for both humans and machines, given that the underlying decision-making processes are becoming ever more complex and harder for the respective “other” to predict.