Full text release has been delayed at the author's request until August 04, 2025


Likelihood-free Inference via Deep Neural Networks


2024, Doctor of Philosophy, Ohio State University, Statistics.
Many application areas rely on models that can be readily simulated but lack a closed-form likelihood, or an accurate approximation of one, under arbitrary parameter values. In this "likelihood-free" setting, inference is typically simulation-based and requires some degree of approximation. Recent work uses neural networks to reconstruct the mapping from the data space to the parameter space from a set of synthetic parameter-data pairs, but this approach suffers from the curse of dimensionality, resulting in inaccurate estimation as the data size grows.

In this dissertation, we propose new inferential techniques to overcome these limitations, beginning with a simulation-based dimension-reduced reconstruction map (RM-DR) estimation method. This approach integrates reconstruction map estimation with dimension-reduction techniques grounded in subject-specific knowledge. We examine the properties of reconstruction map estimation with and without dimension reduction, and describe the trade-off between the information loss from data reduction and the approximation error that grows with the input dimension of the reconstruction function. Numerical examples illustrate that the proposed approach compares favorably with traditional reconstruction map estimation, approximate Bayesian computation (ABC), and synthetic likelihood estimation (SLE). Additionally, in settings where likelihood evaluation is possible but expensive, we propose combining the RM-DR approach with local optimization as an alternative to expensive global optimizers for parameter estimation, achieving comparable accuracy with improved time efficiency.

To further incorporate uncertainty quantification, which is crucial for interpretation and informed decision-making, we introduce kernel-adaptive synthetic posterior estimation (KASPE). This method employs a deep learning framework to learn a closed-form approximation to the exact posterior, combined with a kernel-based adaptive sampling mechanism that generates the synthetic training data.
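The reconstruction-map-with-dimension-reduction idea can be illustrated with a minimal, hypothetical sketch: a Normal model with unknown mean and standard deviation, summaries given by the sample mean and standard deviation, and a linear least-squares fit standing in for the deep network of the RM-DR method (the model, priors, and summaries below are illustrative choices, not the dissertation's examples):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulator: n i.i.d. Normal(mu, sigma) observations per dataset.
def simulate(theta, n=100):
    mu, sigma = theta
    return rng.normal(mu, sigma, size=n)

# Dimension reduction: subject-knowledge summaries (here, mean and std).
def summaries(x):
    return np.array([x.mean(), x.std()])

# Build synthetic (parameter, summary) training pairs.
M = 2000
thetas = np.column_stack([rng.uniform(0.0, 5.0, M),    # prior draws for mu
                          rng.uniform(0.5, 2.0, M)])   # prior draws for sigma
S = np.array([summaries(simulate(t)) for t in thetas])

# Reconstruction map: regress parameters on summaries. A linear
# least-squares fit stands in for the trained neural network.
X = np.column_stack([np.ones(M), S])
W, *_ = np.linalg.lstsq(X, thetas, rcond=None)

def estimate(y):
    s = summaries(y)
    return np.concatenate([[1.0], s]) @ W

# Apply the fitted map to a new dataset with known parameters.
y_obs = simulate((2.0, 1.0))
theta_hat = estimate(y_obs)
```

Because the summaries are low-dimensional, the regression input does not grow with the number of observations, which is the mechanism by which dimension reduction sidesteps the curse of dimensionality described above.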
We study the convergence properties of the approach and its connections with existing likelihood-free and likelihood-based methods. KASPE consistently outperforms competing methods, demonstrating significantly higher accuracy and robustness in approximating posterior densities.

Building on these advancements, we extend the RM-DR and KASPE methods by integrating a Transformer encoder into their architectures, yielding the RM-TE and KASPE-TE frameworks. Through the self-attention mechanism and model layers tailored to parameter inference tasks, these methods capture complex dependencies and handle high-dimensional data by automatically learning and extracting relevant features, enhancing their generalizability and robustness. Numerical experiments confirm that RM-TE and KASPE-TE offer significant improvements in estimation accuracy over existing methods. In addition to bypassing the need to elicit expert summaries, we discuss how this approach applies to other likelihood-free methods as a means of providing automatic, data-driven summary statistics.
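The general idea of learning a closed-form posterior approximation from simulated pairs can be sketched with a deliberately simple stand-in (this is not the KASPE architecture: here a joint Gaussian is fit to simulated (parameter, summary) pairs, and its conditional distribution yields a closed-form Gaussian posterior approximation; the model and summary are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate (parameter, summary) pairs as in simulation-based inference.
M = 5000
mu = rng.uniform(0.0, 5.0, M)                 # scalar parameter, uniform prior
data = rng.normal(mu[:, None], 1.0, size=(M, 50))
s = data.mean(axis=1)                          # one summary per dataset

# Fit a joint Gaussian over (theta, s). Its conditional theta | s is then
# a closed-form Gaussian posterior approximation -- a linear-Gaussian
# stand-in for the deep-learning approximation developed here.
z = np.column_stack([mu, s])
m = z.mean(axis=0)
C = np.cov(z, rowvar=False)

def posterior(s_obs):
    """Mean and variance of the Gaussian approximation to p(theta | s_obs)."""
    post_mean = m[0] + C[0, 1] / C[1, 1] * (s_obs - m[1])
    post_var = C[0, 0] - C[0, 1] ** 2 / C[1, 1]
    return post_mean, post_var

# Observed dataset generated with mu = 3.0.
y_obs = rng.normal(3.0, 1.0, size=50)
pm, pv = posterior(y_obs.mean())
```

Unlike a point-estimate reconstruction map, the output here is a full (approximate) posterior, so the variance `pv` quantifies the estimation uncertainty that motivates this part of the dissertation.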
Oksana Chkrebtii (Advisor)
Dongbin Xiu (Advisor)
Yuan Zhang (Committee Member)
154 p.

Recommended Citations

  • Zhang, R. (2024). Likelihood-free Inference via Deep Neural Networks [Doctoral dissertation, Ohio State University]. OhioLINK Electronic Theses and Dissertations Center. http://rave.ohiolink.edu/etdc/view?acc_num=osu172131388208005

    APA Style (7th edition)

  • Zhang, Rui. Likelihood-free Inference via Deep Neural Networks. 2024. Ohio State University, Doctoral dissertation. OhioLINK Electronic Theses and Dissertations Center, http://rave.ohiolink.edu/etdc/view?acc_num=osu172131388208005.

    MLA Style (8th edition)

  • Zhang, Rui. "Likelihood-free Inference via Deep Neural Networks." Doctoral dissertation, Ohio State University, 2024. http://rave.ohiolink.edu/etdc/view?acc_num=osu172131388208005

    Chicago Manual of Style (17th edition)