Status: Completed | Type: NON-SBIR/STTR RPGS | Source: NIH (US)

Nonlinear performance analysis and prediction for robust low dose lung CT

$1.96M USD

Funder: NATIONAL CANCER INSTITUTE
Recipient Organization: Johns Hopkins University
Country: United States
Start Date: Jan 01, 2021
End Date: Aug 01, 2022
Duration: 577 days
Number of Grantees: 1
Roles: Principal Investigator
Data Source: NIH (US)
Grant ID: 10321949
Grant Description

PROJECT SUMMARY / ABSTRACT

Nonlinear algorithms such as model-based reconstruction (MBR) and deep learning (DL) reconstruction have sparked tremendous research interest in recent years. Compared to traditional linear approaches, the nonlinearity of these algorithms transcends traditional signal-to-noise requirements and offers flexibility to draw information from a variety of sources (e.g., statistical model, prior image, dictionary, training data). MBR has enabled numerous advancements including low-dose CT and advanced scanning protocols. Deep learning algorithms are rapidly emerging and have demonstrated superior dose vs. image quality tradeoffs in research settings. However, widespread clinical adoption of nonlinear algorithms has been impeded by the lack of systematic, quantitative methods for performance analysis. Nonlinear methods come with numerous dependencies on the imaging techniques, the imaging target, the prior information, and the data itself. The relationship between these dependencies and image quality is often opaque. Furthermore, improper selection of algorithmic parameters can lead to erroneous features (e.g., smaller lesions, texture) in the reconstruction. Therefore, methods that quantify and predict performance permit efficient, quantifiable evaluation and provide the robust control and understanding of imaging output necessary for reliable clinical application and regulatory oversight.

We propose to establish a robust, predictive framework for performance assessment and optimization that can be generalized to any reconstruction method. We quantify performance in terms of the perturbation response and covariance as a function of imaging techniques, system configurations, patient anatomy, and, importantly, the perturbation itself. The perturbation response quantifies the appearance (e.g., biases, blurs, distortions) and, together with the covariance, allows the computation of more complex metrics such as task-based performance and radiomic measures including size, shape, and texture information.
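
As a rough formalization (the notation below is illustrative and not taken from the abstract), let A(.) denote the possibly nonlinear reconstruction operator and y(mu) the noisy measurements of anatomy mu. A lesion-like perturbation Delta-mu then induces a mean perturbation response and an image covariance:

    % Illustrative definitions; the project's exact formulation is not stated.
    \[
      \mathrm{PR}(\Delta\mu) = \mathbb{E}\big[A(y(\mu+\Delta\mu))\big] - \mathbb{E}\big[A(y(\mu))\big],
      \qquad
      K = \mathrm{Cov}\big[A(y(\mu))\big].
    \]

For a nonlinear A, both PR and K depend on the background anatomy, the acquisition technique, and the perturbation itself, which is why the abstract stresses prediction as a function of "the perturbation itself."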

We illustrate the utility of the approach in lung imaging with the following specific aims:

Aim 1: Develop a lesion library and generate perturbations encompassing clinically relevant features. We will extract lesions from public databases and develop methods for lesion emulation both in realistic CT simulation and in physical data via 3D printing technology.
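
As a hedged illustration of what "generating perturbations" from a lesion library could look like on the simulation side, the minimal numpy sketch below embeds a lesion attenuation patch into a digital background volume as an additive perturbation; all names and values here (insert_lesion, the contrast level) are hypothetical and not from the project:

    import numpy as np

    def insert_lesion(background, lesion, center):
        """Additively embed a lesion attenuation patch into a background
        volume at the given (z, y, x) center. Hypothetical sketch."""
        out = background.copy()
        z0, y0, x0 = (c - s // 2 for c, s in zip(center, lesion.shape))
        zs, ys, xs = lesion.shape
        out[z0:z0 + zs, y0:y0 + ys, x0:x0 + xs] += lesion
        return out

    # Example: a small spherical lesion in a water-like background.
    bg = np.full((64, 128, 128), 0.02)      # background attenuation (1/mm)
    zz, yy, xx = np.mgrid[-8:8, -8:8, -8:8]
    lesion = 0.005 * (zz**2 + yy**2 + xx**2 <= 6**2)  # +0.005 1/mm contrast
    perturbed = insert_lesion(bg, lesion, center=(32, 64, 64))
    delta_mu = perturbed - bg               # the perturbation Delta-mu itself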

Aim 2: Develop a generalized prediction framework for perturbation response and covariance. Using analytical and neural network modeling, we will establish a framework that predicts perturbation response and covariance across imaging scenarios for classes of algorithms with increasing data-dependence, including MBR with a Huber penalty, MBR with dictionary regularization, and a deep learning reconstructor.
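
For reference, MBR with a Huber penalty is conventionally posed as penalized weighted least squares; the form below is the standard textbook objective, not necessarily the exact one used in the project:

    % Standard Huber-penalized model-based reconstruction objective.
    \[
      \hat{\mu} = \arg\min_{\mu}\; \tfrac{1}{2}\|y - A\mu\|_{W}^{2}
                  + \beta \sum_{j} \psi_{\delta}\big([D\mu]_{j}\big),
      \qquad
      \psi_{\delta}(t) =
      \begin{cases}
        t^{2}/2, & |t| \le \delta,\\
        \delta|t| - \delta^{2}/2, & |t| > \delta,
      \end{cases}
    \]

where A is the forward projector, W a statistical weighting matrix, D a local-difference operator, and beta and delta control regularization strength and the quadratic-to-linear transition. The three algorithm classes named in the aim span increasing data-dependence: a fixed analytic penalty (Huber), a penalty learned from data (dictionary), and a fully trained network.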

Aim 3: Develop assessment and optimization strategies to drive robust, low-dose lung screening CT methods. We will optimize and adapt nonlinear algorithms and protocols for lung cancer screening to achieve faithful representations of clinical features.

This work has the potential to drive much-needed quantitative assessment standards that directly relate image quality to diagnostic performance and optimal strategies for robust, reliable clinical deployment of nonlinear algorithms.

All Grantees

Johns Hopkins University
