Active Standard Grant | National Science Foundation (US)

CRII: CIF: Information Theoretic Measures for Fairness-aware Supervised Learning

Amount: $892.2K USD
Funder: National Science Foundation (US)
Recipient Organization: Missouri University of Science and Technology
Country: United States
Start Date: Oct 01, 2024
End Date: Jun 30, 2026
Duration: 637 days
Number of Grantees: 1
Roles: Principal Investigator
Data Source: National Science Foundation (US)
Grant ID: 2452330
Grant Description

Despite the growing success of machine learning (ML) systems at accomplishing complex tasks, their increasing use in making or aiding consequential decisions that affect people's lives (e.g., university admission, healthcare, predictive policing) raises concerns about potential discriminatory practices. Unfair outcomes in ML systems result from historical biases in the data used to train them.

A learning algorithm designed merely to minimize prediction error may inherit or even exacerbate such biases, particularly when observed attributes of individuals, critical for generating accurate decisions, are biased by their group identities (e.g., race or gender) due to existing social and cultural inequalities. Understanding and measuring these biases at the data level is a challenging yet crucial problem. It yields constructive insights and methodologies for debiasing the data and adapting the learning system to minimize discrimination, and it highlights the need for policy changes and infrastructural development.

This project aims to establish a comprehensive framework for precisely quantifying the marginal impact of individuals' attributes on the accuracy and unfairness of decisions, using tools from information theory, game theory, and causal inference, along with legal and social-science definitions of fairness. This multidisciplinary effort will provide guidelines and design insights for practitioners building fair data-driven automated systems and inform the public debate on the social consequences of artificial intelligence.

The majority of previous work formulates the algorithmic-fairness problem from the viewpoint of the learning algorithm: enforce a statistical or counterfactual fairness constraint on the learner's outcome, and design a learner that meets it. Since the fairness problem originates in biased data, however, merely adding constraints to the prediction task may not provide a holistic view of its fundamental limitations.
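As a concrete illustration of the statistical fairness constraints mentioned above, the sketch below computes the demographic-parity gap, the difference in positive-decision rates between two groups. The function name and toy data are illustrative examples, not part of the award text.

```python
import numpy as np

def demographic_parity_gap(y_pred, group):
    """Absolute difference in positive-decision rates between two groups,
    i.e. |P(decision=1 | group=0) - P(decision=1 | group=1)|."""
    y_pred = np.asarray(y_pred, dtype=float)
    group = np.asarray(group)
    return abs(y_pred[group == 0].mean() - y_pred[group == 1].mean())

# Toy decisions: group 0 is approved 3/4 of the time, group 1 only 1/4.
preds  = [1, 1, 1, 0, 1, 0, 0, 0]
groups = [0, 0, 0, 0, 1, 1, 1, 1]
print(demographic_parity_gap(preds, groups))  # 0.5
```

A statistical-parity constraint would require this gap to stay below a chosen tolerance; a counterfactual constraint instead asks whether an individual's decision would change had their protected attribute been different.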

This project looks at the fairness problem through a different lens: instead of asking "for a given learner, how can we achieve fairness?", it asks "for a given dataset, what are the inherent tradeoffs in the data, and, based on these, what is the best learner we can design?". In supervised learning models, the challenge lies in the complex structures of correlation and causation among individuals' attributes (covariates), their group identities (protected features), the target variable (label), and the prediction outcome (decision).

In analyzing the dataset, the marginal impacts of covariates on the accuracy and discrimination of decisions are quantified via carefully designed measures that account for the complex correlation and causation structures among variables and for the inherent tension between the accuracy and fairness objectives. Methods that exploit the quantified impacts to guide downstream ML systems toward a better achievable accuracy-fairness tradeoff will then be investigated.
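To make the idea of a covariate's marginal impact concrete, the following sketch shows one simple leave-one-covariate-out scheme; this is an assumed illustration, not the project's actual measures. On a toy dataset where covariate x0 is a proxy for group membership and x1 is group-independent, removing each covariate in turn reveals how much accuracy it buys and how much of the demographic-parity gap it is responsible for.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: x0 is a proxy for the protected group, x1 is independent of it.
n = 2000
group = rng.integers(0, 2, n)
x0 = group + 0.5 * rng.standard_normal(n)      # correlated with group
x1 = rng.standard_normal(n)                    # independent of group
label = ((x0 + x1 + 0.3 * rng.standard_normal(n)) > 0.5).astype(float)
X = np.column_stack([x0, x1])

def fit_predict(X, y):
    """Stand-in learner: least-squares linear score thresholded at 0.5."""
    Xb = np.column_stack([X, np.ones(len(X))])  # add intercept column
    w, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return (Xb @ w > 0.5).astype(float)

def parity_gap(pred, group):
    return abs(pred[group == 0].mean() - pred[group == 1].mean())

# Marginal impact of each covariate: change in accuracy and in the
# demographic-parity gap when that covariate is withheld from the learner.
full_pred = fit_predict(X, label)
base_acc = (full_pred == label).mean()
base_gap = parity_gap(full_pred, group)
impact = {}
for j in range(X.shape[1]):
    pred = fit_predict(np.delete(X, j, axis=1), label)
    d_acc = base_acc - (pred == label).mean()   # accuracy lost without x_j
    d_gap = base_gap - parity_gap(pred, group)  # unfairness attributable to x_j
    impact[f"x{j}"] = (d_acc, d_gap)
    print(f"x{j}: accuracy contribution={d_acc:+.3f}, "
          f"parity-gap contribution={d_gap:+.3f}")
```

In this toy setup the proxy covariate x0 contributes both accuracy and unfairness, while dropping the group-independent x1 costs accuracy and leaves the learner leaning even harder on the proxy, which is exactly the kind of accuracy-fairness tension the quantified measures are meant to expose.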

Importantly, the proposed framework provides explainable solutions, where the inclusion of certain attributes in the learning system is explained by their importance for accurate as well as fair decisions.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

All Grantees

Missouri University of Science and Technology
