OC-TRUST - Trustworthiness of Organic Computing Systems
Funded by: DFG (Deutsche Forschungsgemeinschaft)
Local project leader: Prof. Dr. Elisabeth André
M.Sc. Stephan Hammer
Dipl.-Inf. Michael Wißner
The DFG-funded project OC-TRUST aims to improve the trustworthiness of Organic Computing (OC) systems so that they can be used in open, heterogeneous, safety-critical, and user-centered scenarios. The project also investigates to what extent trust, as a constitutive element of technical systems, can improve the systems' robustness and efficiency. To this end, methods, models, algorithms, and user interfaces are developed that take trust into account during system design and make it possible to examine the systems' trustworthiness, to measure trust at runtime, and to adapt the systems with regard to different aspects of trust.
A core feature of OC systems is their ability to adjust automatically to dynamically changing usage and context conditions. This confronts developers of user interface concepts with a challenging task, since sudden and often unexpected adjustments cannot be assumed to be self-explanatory. Rather, there is a risk that users will not understand the rationale behind the system's behavior and wonder why it adapted in a specific way. This in turn can lead to a loss of trust and, in the worst case, to rejection of the system. The objective of the project is to develop trustworthy user interfaces that can cope with these challenges.
To achieve this objective, we develop a decision-theoretic approach to trust management on the basis of Bayesian networks. The resulting User Trust Model (UTM) assesses the user's trust in a system, monitors it over time, and applies appropriate system actions to maintain trust in critical situations, such as generating explanations to increase the system's transparency.
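The UTM idea can be illustrated with a minimal Bayesian update. This is only a sketch: the node names, states, probabilities, and the threshold-based action choice are illustrative assumptions, not the project's actual model.

```python
# Minimal sketch of a Bayesian User Trust Model (UTM).
# All node names, states, and probabilities are illustrative assumptions.

# Prior belief about the user's trust in the system.
P_TRUST = {"high": 0.6, "low": 0.4}

# Likelihood of an observable cue given the trust state, e.g. whether
# the user accepts or overrides an automatic adaptation.
P_CUE_GIVEN_TRUST = {
    ("accepts", "high"): 0.8, ("accepts", "low"): 0.3,
    ("overrides", "high"): 0.2, ("overrides", "low"): 0.7,
}

def posterior_trust(cue):
    """Bayes' rule: P(trust | cue) is proportional to P(cue | trust) * P(trust)."""
    joint = {t: P_CUE_GIVEN_TRUST[(cue, t)] * P_TRUST[t] for t in P_TRUST}
    z = sum(joint.values())
    return {t: p / z for t, p in joint.items()}

def choose_action(belief, threshold=0.5):
    """Trigger a trust-repair action (e.g. an explanation of the
    adaptation) when the belief in high trust drops below the threshold."""
    return "explain_adaptation" if belief["high"] < threshold else "none"

belief = posterior_trust("overrides")
print(belief, choose_action(belief))  # belief in high trust drops to 0.3
```

An observed override lowers the posterior for high trust (0.6 prior, 0.3 posterior here), which crosses the threshold and triggers an explanation; a full UTM would chain many such updates over time via a dynamic Bayesian network.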
Figure: Example of a UTM that protects a user's data during the use of public displays
To implement system actions, we rely on a constraint solver that allows us to define hard and soft behavioral corridors at design time, which are then automatically checked at run time.
The developed methods and techniques are evaluated using highly dynamic demonstrators consisting of heterogeneous, flexibly combinable interaction devices and displays. The demonstrators under investigation integrate public displays (called public screens), private and mobile displays (smartphones, tablets), and semi-public displays such as interactive tables (Microsoft Surface). The resulting system class is called the Trusted Display Grid (TDG).
Project webpage (in German only)