| Funder | National Science Foundation (US) |
|---|---|
| Recipient Organization | University of Minnesota-Twin Cities |
| Country | United States |
| Start Date | Sep 01, 2021 |
| End Date | Aug 31, 2026 |
| Duration | 1,825 days |
| Number of Grantees | 1 |
| Roles | Principal Investigator |
| Data Source | National Science Foundation (US) |
| Grant ID | 2042366 |
Classical statistical testing and estimation, based on the ideas of Fisher, Neyman, and Pearson, are ubiquitous in science. They continue to be central in psychology, medicine, epidemiology, pharmacology, agriculture, and environmental science, among others. Thus, they play an important role in those sciences' empirical successes.
Yet they often fall short of scientists' ambitions to quantify how data positively support hypotheses and to express the variable confidence of estimates. To address such shortcomings, this project develops new conceptual foundations for statistical testing and estimation with quantitative implications. It modifies, extends, and strengthens those foundations by adapting ideas from 20th- and 21st-century epistemology, the philosophical study of the nature and conditions for evidence and knowledge.
In doing so, it shows how to draw more nuanced scientific conclusions from data. This research also integrates with supported activities in pursuit of two educational objectives. First, this project promotes the integration of statistical concepts and methods into college philosophy curricula, especially critical reasoning, epistemology, and philosophy of science courses.
It does so through two faculty summer institutes for college instructors. Second, this project begins to build a network of early-career scholars engaged in graduate-level research in the philosophy of statistics through two summer schools for early-stage graduate students. The main synergy between the educational and research objectives arises from the natural feedback between teaching and new research directions.
The ideas that this research adapts from contemporary epistemology include the modal conditions on evidence and knowledge known as adherence, sensitivity, and safety. From the viewpoint of probabilized reliabilist epistemology, Fisherian p-values are a probabilistic measure of adherence, while two novel quantitative post-data measures of evidence correspond to sensitivity and safety.
These include a distinct post-data analogue of statistical power related to Mayo's severity concept. Data are evidence for a hypothesis to the extent they are sufficiently adherent, sensitive, and safe according to these measures. By contrast, traditional Fisherian significance testing only measures how adherent data are for a hypothesis, which is necessary but not sufficient for positive evidential support of that hypothesis.
And Neyman-Pearson testing does not quantify the evidence that data provide for a hypothesis at all, but rather provides a decision procedure for accepting and rejecting hypotheses with specified long-run error rates. The first part of this project develops the theoretical foundation for these ideas in the context of general statistical testing and estimation.
The second part then implements these ideas computationally by modifying standard testing and estimation packages in the R programming language. This paves the way for the seamless adoption and application of these new nuanced measures of evidence in sciences that continue to use classical statistical testing and estimation.
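To make the abstract's correspondence concrete, here is a minimal illustrative sketch of the two ingredients it names: a Fisherian p-value as an adherence-style measure, and a severity-style post-data quantity in the spirit of Mayo's concept. The project itself targets R packages; this Python sketch, for a one-sample z-test with known standard deviation, uses hypothetical function names (`one_sided_z_test`, `severity_for_discrepancy`) that are not from the project and is not the project's actual formalism.

```python
from math import erf, sqrt


def phi(x):
    """Standard normal CDF, via the error function."""
    return 0.5 * (1 + erf(x / sqrt(2)))


def one_sided_z_test(xbar, mu0, sigma, n):
    """One-sided z-test of H0: mu = mu0 vs H1: mu > mu0.

    The p-value P(T >= t_obs | H0) plays the role of an
    adherence-style measure in the abstract's terminology.
    """
    z = (xbar - mu0) / (sigma / sqrt(n))
    p_value = 1 - phi(z)
    return z, p_value


def severity_for_discrepancy(xbar, mu1, sigma, n):
    """Severity-style post-data quantity (illustrative only):
    the probability, if the true mean were mu1, of observing a
    result less extreme than the one actually observed.
    """
    z1 = (xbar - mu1) / (sigma / sqrt(n))
    return phi(z1)


if __name__ == "__main__":
    # Hypothetical data: sample mean 0.4, sigma 1, n = 100
    z, p = one_sided_z_test(0.4, 0.0, 1.0, 100)
    sev = severity_for_discrepancy(0.4, 0.2, 1.0, 100)
    print(f"z = {z:.2f}, p-value = {p:.2e}, severity(mu > 0.2) = {sev:.3f}")
```

On this sketch, a small p-value alone records only adherence; the severity-style quantity asks, after the data are in, how well the result discriminates a claimed discrepancy, which is the kind of post-data analogue of power the abstract describes.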
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.