Detection and elimination of biases in triage and localization algorithms for COVID-19
- Funded by BBVA Foundation (Spain)
- Total publications: 0
Grant number: unknown
Key facts
Disease
COVID-19
Known Financial Commitments (USD)
$84,354.66
Funder
BBVA Foundation (Spain)
Principal Investigator
Ángel Puyol González
Research Location
Spain
Lead Research Institution
Universidad Autónoma de Barcelona
Research Priority Alignment
N/A
Research Category
Clinical characterisation and management
Research Subcategory
Supportive care, processes of care and management
Special Interest Tags
N/A
Study Type
Clinical
Clinical Trial Details
Unspecified
Broad Policy Alignment
Pending
Age Group
Unspecified
Vulnerable Population
Unspecified
Occupations of Interest
Unspecified
Abstract
This project will generate a protocol for the ethical audit of algorithms used during COVID-19 to prioritize admission to intensive care units (ICUs) and to establish, through geolocation via mobile phones, the degree of spread of the virus in a specific area. The authors reason that algorithms facilitate decision-making, but their results may be unfair due to two types of bias: a) the algorithms lack complete information (for example, there is more data on men than on women, or certain minorities or socioeconomic situations are not sufficiently represented); and b) the decisions on which the machine learning system was trained were not themselves fair (for example, the responsible health team decided not to admit the elderly, considering that they were less likely to survive). In the case of ICUs, the aim is to improve decisions when rationing, rather than prioritizing, becomes necessary; in geolocation, it is to avoid discrimination, loss of privacy, and abuse of power.
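The first type of bias described above, under-representation of certain groups in the training data, can be illustrated with a minimal sketch. The function below is purely hypothetical (it is not part of the project's protocol): it compares each group's share in a dataset against its share in a reference population and flags deviations beyond a tolerance, assuming records are dicts and group names and thresholds are illustrative.

```python
from collections import Counter

def representation_audit(records, attribute, population_shares, tolerance=0.05):
    """Flag groups whose share in the data deviates from their share in the
    reference population by more than `tolerance`.

    records           -- list of dicts, one per patient record
    attribute         -- key of the sensitive attribute to audit
    population_shares -- mapping group -> expected proportion in the population
    """
    counts = Counter(r[attribute] for r in records)
    total = sum(counts.values())
    flagged = {}
    for group, expected in population_shares.items():
        observed = counts.get(group, 0) / total
        deviation = observed - expected
        if abs(deviation) > tolerance:
            flagged[group] = round(deviation, 3)
    return flagged

# Illustrative example: women are ~50% of the population but only 30% of the data.
data = [{"sex": "M"}] * 70 + [{"sex": "F"}] * 30
print(representation_audit(data, "sex", {"M": 0.5, "F": 0.5}))
# {'M': 0.2, 'F': -0.2}
```

A check like this addresses only bias (a); bias (b), unfair historical decisions encoded in the labels, cannot be detected from group counts alone and requires examining the outcomes the model was trained to reproduce.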