Explainability for Public Communication of Expert Knowledge: Translating AI Explainability Research to Public Communication in Complex Crises on the Example of COVID-19 (additional corona-related funding)
- Funded by Volkswagen Stiftung
- Total publications: 0
Grant number: unknown
Key facts
Disease
COVID-19
Funder
Volkswagen Stiftung
Principal Investigator
Prof Dr-Ing Jasminko Novak, Prof Dr Enny Das, Prof Martha Larson, Kalina Drenska…
Research Location
Germany
Lead Research Institution
European Institute for Participatory Media e. V.
Research Priority Alignment
N/A
Research Category
Policies for public health, disease control & community resilience
Research Subcategory
Communication
Special Interest Tags
N/A
Study Type
Non-Clinical
Clinical Trial Details
N/A
Broad Policy Alignment
Pending
Age Group
Not Applicable
Vulnerable Population
Not applicable
Occupations of Interest
Not applicable
Abstract
This project module investigates how research on the explainability of AI systems can be translated and applied to the public communication of expert knowledge in complex crisis situations such as the COVID-19 pandemic. Both AI systems and expert recommendations on COVID-19 measures confront people with the results of what they perceive as opaque, complex systems whose reasoning processes are neither directly observable nor readily understandable. The goal is to examine which AI explanation techniques could be applied to increase public understanding of expert knowledge and decisions in such situations (e.g. combining contrastive and counterfactual explanations with persuasive communication). This could provide experts and decision-makers with novel methods for improving their communication and increasing public acceptance of their recommendations (e.g. COVID-19 containment measures, vaccination).