Search Results

(Total results 2)

  • 1. An, Hyoin. Bayesian Quantile Regression via Dependent Quantile Pyramids

    Doctor of Philosophy, The Ohio State University, 2024, Statistics

    Quantile regression (QR) has drawn increased attention as an attractive alternative to mean regression. QR was motivated by the realization that extreme quantiles often have a different relationship with covariates than do the centers of the response distributions. QR can target quantiles in the tail of the response distribution and provide additional insights into the response distribution. QR also tends to be more robust to outliers than is mean regression. A fundamental property of quantiles is their monotonicity, ensuring that they increase with higher quantile levels. When analyzing multiple quantiles of the response distribution, fitting individual quantile regressions may not guarantee the correct ordering of the conditional quantiles. If the ordering of the quantiles is not maintained, the quantile regression curves cross one another. To mitigate the quantile-crossing issue, the idea of simultaneous quantile regression (SQR) has been introduced. SQR aims to fit multiple quantile regression curves under the monotonicity constraint. With SQR, we can understand the relationship between covariates and multiple response quantiles in a relatively comprehensive way. However, despite the practicality of SQR, obtaining multiple QR curves, especially in a flexible, nonlinear form, continues to be challenging. We propose a new class of stochastic processes, a process of dependent quantile pyramids (DQP). This class is applied to build a flexible SQR model that falls within the Bayesian nonparametric framework. The DQP generalizes the quantile pyramid, a model for a single set of quantiles without a covariate. The generalization replaces each scalar variate in the quantile pyramid with a stochastic process whose index set is a covariate space. The resulting model is a distribution-valued stochastic process which provides a nonparametric distribution at each value of the covariate. We rigorously establish the existence of the model and describe several of its properties (open full item for complete abstract). See the illustrative quantile-crossing sketch after the results list.

    Committee: Steven MacEachern (Advisor); Mario Peruggia (Committee Member); Oksana Chkrebtii (Committee Member) Subjects: Statistics
  • 2. Grabaskas, David. Efficient Approaches to the Treatment of Uncertainty in Satisfying Regulatory Limits

    Doctor of Philosophy, The Ohio State University, 2012, Nuclear Engineering

    Utilities operating nuclear power plants in the United States are required to demonstrate that their plants comply with the safety requirements set by the U.S. Nuclear Regulatory Commission (NRC). How to show adherence to these limits through the use of computer code surrogates is not always straightforward, and different techniques have been proposed and approved by the regulator. The issue of compliance with regulatory limits is examined by rephrasing the problem in terms of hypothesis testing. Within this more rigorous framework, guidance is proposed for choosing techniques that increase the probability of arriving at the correct conclusion of the analysis. The findings of this study show that the most straightforward way to achieve this goal is to reduce the variance of the output of the computer code experiments. By analyzing different variance reduction techniques and different methods of satisfying the NRC's requirements, recommendations can be made about best practices that would result in a more accurate and precise result. This study began with an investigation into the point estimate of the 0.95-quantile using traditional sampling methods and new orthogonal designs. From there, new work on establishing confidence intervals for the outputs of experiments designed with variance reduction techniques was compared to current, regulator-approved methods. Lastly, a more direct interpretation of the regulator's probability requirement was used, and confidence intervals were established for the probability of exceeding a safety limit. Efforts were then made to combine methods in order to take advantage of the positive aspects of different techniques. The results of this analysis show that these variance reduction techniques can provide a more accurate and precise result compared to current methods. This means an increased probability of arriving at the correct conclusion, and a more accurate characterization of the risk associated with even (open full item for complete abstract). See the illustrative 0.95-quantile tolerance-bound sketch after the results list.

    Committee: Tunc Aldemir PhD (Advisor); Richard Denning PhD (Committee Member); Marvin Nakayama PhD (Committee Member); Alper Yilmaz PhD (Committee Member) Subjects: Nuclear Engineering; Statistics
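
Illustrative sketch for result 1 (quantile crossing). The abstract above hinges on the observation that separately fitted quantile regressions need not respect the ordering of quantile levels. The short Python sketch below is only a minimal illustration, not the dissertation's dependent-quantile-pyramid method: it fits independent linear quantile regressions with statsmodels on hypothetical simulated data and checks whether the fitted conditional quantiles cross. The data-generating process, quantile levels, and prediction grid are assumptions made purely for the example.

    # Minimal, hypothetical sketch: separately fitted quantile regressions can cross.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    x = rng.uniform(0, 10, 200)
    # Heteroscedastic, heavy-tailed noise so the quantile slopes differ.
    y = 1.0 + 0.5 * x + (1.0 + 0.3 * x) * rng.standard_t(3, 200)
    df = pd.DataFrame({"x": x, "y": y})

    taus = [0.1, 0.5, 0.9]
    fits = {tau: smf.quantreg("y ~ x", df).fit(q=tau) for tau in taus}

    # Separate fits impose no monotonicity constraint, so nothing prevents the
    # predictions from violating q_0.1 <= q_0.5 <= q_0.9, especially when the
    # fitted lines are extrapolated beyond the observed covariate range.
    grid = pd.DataFrame({"x": np.linspace(-5, 15, 100)})
    preds = np.column_stack([fits[tau].predict(grid) for tau in taus])
    print("Quantile crossing detected:", bool(np.any(np.diff(preds, axis=1) < 0)))

A simultaneous formulation such as the SQR model described in the abstract constrains all fitted quantile curves to remain ordered, which is exactly the failure mode this check exposes.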
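
Illustrative sketch for result 2 (0.95-quantile and regulatory limits). The abstract above concerns point estimates of the 0.95-quantile of a code output and confidence statements about exceeding a safety limit. The Python sketch below is not the dissertation's analysis; it simply contrasts a plain Monte Carlo point estimate with a nonparametric one-sided 95/95 tolerance bound built from order statistics (a Wilks-type construction often used in this regulatory setting). The surrogate "code", the sample size of 59, and the safety limit are illustrative assumptions.

    # Minimal, hypothetical sketch: point estimate vs. 95/95 tolerance bound.
    import numpy as np

    rng = np.random.default_rng(1)

    def code_surrogate(n_runs: int) -> np.ndarray:
        # Hypothetical stand-in for expensive safety-analysis code runs.
        return rng.lognormal(mean=0.0, sigma=0.5, size=n_runs)

    n = 59  # smallest n with 1 - 0.95**n >= 0.95 (first-order Wilks bound)
    runs = np.sort(code_surrogate(n))

    point_estimate = np.quantile(runs, 0.95)  # point estimate of the 0.95-quantile
    wilks_bound = runs[-1]  # largest of 59 runs: >=95% content at >=95% confidence

    safety_limit = 3.0  # hypothetical regulatory limit on the output
    print(f"0.95-quantile point estimate: {point_estimate:.3f}")
    print(f"95/95 upper tolerance bound:  {wilks_bound:.3f}")
    print("Limit satisfied at the 95/95 criterion:", wilks_bound < safety_limit)

The point estimate alone carries no confidence statement about limit compliance, whereas the tolerance bound does; that distinction is the kind the abstract's hypothesis-testing framing makes precise, with variance reduction aimed at obtaining a tighter result from the same number of code runs.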