
Search Results


(Total results 10)


  • 1. Zhang, Yuankun (Ultra-)High Dimensional Partially Linear Single Index Models for Quantile Regression

    PhD, University of Cincinnati, 2018, Arts and Sciences: Mathematical Sciences

    Nonparametric modeling tends to capture the underlying structures in the data without imposing strong model assumptions. Nonparametric estimation provides powerful data-driven approaches for fitting a flexible model to the data. Single-index models are useful and appealing tools that preserve flexibility and interpretability while overcoming the “curse of dimensionality” in nonparametric regression. In this dissertation, we consider partially linear single-index models for quantile regression. This class of semiparametric models allows some covariates to enter in linear form and others through a nonparametric term, reflecting non-linear features in modeling the conditional quantiles of the response variable. We first develop efficient estimation and variable selection for partially linear single-index quantile models in fixed dimension. We use spline smoothing with a B-spline basis to estimate the nonparametric component and adopt non-convex penalties to select variables simultaneously. We study the theoretical properties of the resulting estimators and establish the “oracle property” of the penalized estimation. With the rise of new technologies for data collection and storage, high dimensional data have become available in various scientific fields. Researchers often face the new challenge that the dimension of the explanatory variables, p, may increase with the sample size, n, or potentially become much larger than n. For problems of high to ultra-high dimensionality, data are likely to be heterogeneous and the underlying model is prone to be nonlinear. Variable selection also plays a critical role in the dimension reduction and modeling process. Thus, we propose a penalized estimation under a sparsity assumption for partially linear single-index quantile models in high dimension. 
We further investigate ultra-high dimensional penalized partially linear single-index quantile models in which both linear components and single-index vari (open full item for complete abstract)

    Committee: Dan Ralescu Ph.D. (Committee Chair); Yan Yu Ph.D. (Committee Chair); Emily Kang Ph.D. (Committee Member); Ju-Yi Yen (Committee Member) Subjects: Statistics
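The quantile-regression machinery described in the abstract above rests on the check (pinball) loss. A minimal numpy sketch — illustrative only, not the author's penalized spline estimator — shows that minimizing this loss over a constant recovers a sample quantile:

```python
import numpy as np

def check_loss(u, tau):
    # quantile "check" (pinball) loss: rho_tau(u) = u * (tau - 1{u < 0})
    return u * (tau - (u < 0))

rng = np.random.default_rng(0)
y = rng.normal(size=1000)
tau = 0.75

# the constant minimizing the mean check loss is a tau-th sample quantile;
# search over the observed values as candidates
candidates = np.sort(y)
losses = [check_loss(y - c, tau).mean() for c in candidates]
c_hat = candidates[int(np.argmin(losses))]
```

The full models in the dissertation replace the constant with a partially linear single-index function estimated via B-splines; this sketch only isolates the loss that drives the fitting.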
  • 2. Benatar, Michael An Experimental Investigation of the Load Distribution of Splined Joints under Gear Loading Conditions

    Master of Science, The Ohio State University, 2016, Mechanical Engineering

    Splined joints are commonly used to transmit rotary motion from a shaft to machine elements such as gears. While computationally efficient spline load distribution models have recently been proposed, there is no validated model of a spline due to a lack of high-fidelity experimental data. Accordingly, this study aims to (i) establish an extensive experimental database on load distributions of splined joints subject to gear loading conditions and (ii) assess the accuracy of the spline load distribution model of Hong et al. (2014) through direct comparisons of its predictions to experimental measurements. On the experimental side, a quasi-static, spline-specific test setup is designed, fabricated, and instrumented. A test matrix covering various loading conditions is executed in order to compile an extensive spline load distribution database. The modeling effort centers on expanding the model of Hong et al. (2014) by adding a new root stress prediction module. The experimental data illustrate the cyclic nature of loads and resultant stresses on spline teeth caused by rotation of the spline teeth relative to the gear mesh that loads the splined joint. A nonlinear relationship between the applied torque and the resultant stress is revealed, as well as a relationship between the location of maximum stress along the face width and the amount of lead crown modification applied. Through correlations to the experimental results, the model is shown to be accurate; it captures several unique effects of spur and helical gear loading conditions.

    Committee: Ahmet Kahraman (Advisor) Subjects: Mechanical Engineering
  • 3. Tang, Lin Efficient Inference for Periodic Autoregressive Coefficients with Polynomial Spline Smoothing Approach

    Doctor of Philosophy, University of Toledo, 2015, Mathematics (statistics)

    First, we propose a two-step estimation method for periodic autoregressive parameters via residuals when the observations contain a trend and a periodic autoregressive time series. In the first step, the trend is estimated and the residuals are calculated; in the second step, the autoregressive coefficients are estimated from the residuals. To overcome the drawbacks of parametric trend estimation, we estimate the trend nonparametrically by polynomial spline smoothing. Polynomial spline smoothing is one of the nonparametric methods commonly used in practice for function estimation. It does not require any assumption about the shape of the unknown function. In addition, it has the advantages of computational expediency and mathematical simplicity. The oracle efficiency of the proposed Yule-Walker type estimator is established. The performance is illustrated by simulation studies and real data analysis. Second, we consider time series that contain a trend, a seasonal component and a periodically correlated time series. A semiparametric three-step method is proposed to analyze such time series. The seasonal component and trend are estimated by means of B-splines, and the Yule-Walker estimates of the time series model coefficients are calculated from the residuals after removing the estimated seasonality and trend. The oracle efficiency of the proposed Yule-Walker type estimators is established. Simulation studies suggest that the performance of the estimators coincides with the theoretical results. The proposed method is applied to three data sets. Third, we make inference for logistic regression models using nonparametric estimation. The primary interest of this topic is the estimation of the conditional mean for logistic regression models. We propose a local likelihood logit method with linear B-splines to estimate the conditional mean. Simulation studies show that our method works well.

    Committee: Qin Shao (Committee Chair); Donald White (Committee Member); Rong Liu (Committee Member); Jing Wang (Committee Member) Subjects: Statistics
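The two-step scheme in the abstract above — detrend, then estimate autoregressive coefficients from residuals via Yule-Walker — can be sketched in a toy AR(1) setting. Here an ordinary polynomial fit stands in for the thesis's polynomial spline smoothing (an assumption made to keep the sketch dependency-free):

```python
import numpy as np

rng = np.random.default_rng(1)
n, phi = 2000, 0.6

# simulate: smooth trend + AR(1) noise with coefficient phi
t = np.arange(n) / n
eps = np.zeros(n)
for i in range(1, n):
    eps[i] = phi * eps[i - 1] + rng.normal()
y = 5.0 * t + eps

# step 1: estimate the trend and form residuals
# (polynomial least squares here; the thesis uses spline smoothing)
coef = np.polyfit(t, y, deg=3)
resid = y - np.polyval(coef, t)

# step 2: Yule-Walker estimate of the AR(1) coefficient from residuals,
# i.e. the lag-1 sample autocorrelation
r0 = np.mean(resid * resid)
r1 = np.mean(resid[1:] * resid[:-1])
phi_hat = r1 / r0
```

With a well-estimated trend, `phi_hat` behaves as if the true noise series were observed, which is the "oracle efficiency" property the abstract refers to.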
  • 4. Shi, Bibo Diversification and Generalization for Metric Learning with Applications in Neuroimaging

    Doctor of Philosophy (PhD), Ohio University, 2015, Computer Science (Engineering and Technology)

    Many machine learning algorithms rely on “good” metrics to quantify the distances or similarities between data instances. Context-dependent metrics learned from the training data are often effective in improving the performance of metric-based algorithms under different circumstances, or for different tasks at hand. At present, most existing metric learning algorithms learn metrics only from a binary similarity perspective, overlooking the fact that similarities tend to have different levels and that binary configurations cannot fully account for many situations that occur in practice. In addition, many state-of-the-art metric learning solutions only estimate Mahalanobis metrics, which are linear transformation models with limited expressive power. More complicated nonlinear structures embedded in the data often cannot be well handled, thus failing to improve, or even degrading, the performance of the metric-based algorithm that follows. In this dissertation, we address the aforementioned drawbacks along two directions: diversifying and generalizing the forms of metric learning. For diversification, a novel Prior Distance Informed Metric Learning (PDIML) model is developed. Both global and local PDIML implementations have been successfully applied to diversify the similarities between data points by integrating prior distance knowledge into pairwise distance matrices. For generalization, we tackle metric learning from the perspective of feature transformation, and propose a set of novel nonlinear solutions that use deformable geometric models to learn spatially varying metrics. Thin-plate splines (TPS) are chosen as the geometric model due to their remarkable versatility and representation power in accounting for high-order deformations. TPS-based metric learning algorithms have been developed for both k Nearest Neighbor (kNN) and Support Vector Machine (SVM) classifiers. 
Furthermore, a theoretically sound diffeomorphic constraint is proposed to ensure t (open full item for complete abstract)

    Committee: Jundong Liu (Committee Chair); Razvan Bunescu (Committee Member); Cindy Marling (Committee Member); Chang Liu (Committee Member); Robert Colvin (Committee Member); Martin Mohlenkamp (Committee Member) Subjects: Computer Science
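The Mahalanobis metrics that the abstract above contrasts with nonlinear TPS-based metrics amount to choosing a positive semi-definite matrix M and measuring distance through it. A minimal sketch (generic illustration, not the PDIML model itself):

```python
import numpy as np

def mahalanobis(x, y, M):
    # distance induced by a positive semi-definite matrix M:
    # sqrt((x - y)^T M (x - y)); M = I recovers the Euclidean distance
    d = x - y
    return float(np.sqrt(d @ M @ d))

x = np.array([1.0, 0.0])
y = np.array([0.0, 1.0])

euclid = mahalanobis(x, y, np.eye(2))              # baseline Euclidean
stretched = mahalanobis(x, y, np.diag([4.0, 1.0])) # a "learned" M weighting axis 0 more
```

Because M only rescales and rotates the space linearly, such metrics cannot adapt locally — the limitation that motivates the spatially varying, thin-plate-spline-based metrics in the dissertation.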
  • 5. Hong, Jiazheng A Semi-Analytical Load Distribution Model of Spline Joints

    Doctor of Philosophy, The Ohio State University, 2015, Mechanical Engineering

    While spline joints are commonly used in power transmission devices and drivetrains of most automotive, aerospace and industrial systems, the level of design knowledge about them is far lower than that of other components such as gears, shafts and bearings. This study proposes a family of semi-analytical models to predict the load distribution of clearance-fit (side-fit), major diameter-fit and minor diameter-fit spline joints with the intention of enhancing spline design practices. These models include all major components of spline compliance stemming from deformations associated with bending, shear and base rotation of the teeth as well as contact and torsional deformations. For clearance-fit splines, only drive-side tooth surfaces are allowed to contact, while the top and root lands of the external spline are also chosen as potential contact zones in the case of major diameter-fit and minor diameter-fit splines, respectively. Any helix mismatch or interference conditions are also handled by allowing contact on back-side tooth surfaces as well. All of these models are formulated for any general loading condition consisting of torsion, radial forces and tilting moments, such that the loading conditions of gear-shaft splines can be modeled conveniently. Since contacting spline tooth surfaces are conformal, the potential contact zone covers all of the tooth surfaces, whose direct load distribution solution might require significant computational time. A new multi-step discretization solution scheme is devised and implemented in the semi-analytical models to reduce the computational time significantly such that they can be used as convenient design tools. Meanwhile, the accuracy of the predictions of the proposed models is demonstrated through comparisons to those from a detailed deformable-body contact model. 
As afforded by their computational efficiency, proposed models are used to perform extensive parameter studies to quantify influences of loading conditions, misalignments, tooth modi (open full item for complete abstract)

    Committee: Ahmet Kahraman (Advisor); Robert Siston (Committee Member); Soheil Soghrati (Committee Member); Sandeep Vijayakar (Committee Member) Subjects: Mechanical Engineering
  • 6. Wu, Chaojiang Essays on High-dimensional Nonparametric Smoothing and Its Applications to Asset Pricing

    PhD, University of Cincinnati, 2013, Business: Business Administration

    Nonparametric smoothing, a method of estimating smooth functions, has gained increasing popularity in the statistics and applications literature during the last few decades. This dissertation focuses primarily on nonparametric estimation in quantile regression (Chapter 1) and an application of nonparametric estimation to financial asset pricing (Chapter 2). In the first essay (Chapter 1), we consider the estimation problem of conditional quantiles when multi-dimensional covariates are involved. To overcome the "curse of dimensionality" yet retain model flexibility, we propose two partially linear models for conditional quantiles: partially linear single-index models (QPLSIM) and partially linear additive models (QPLAM). The unknown univariate functions are estimated by penalized splines. An approximate iteratively reweighted penalized least squares algorithm is developed. To facilitate model comparisons, we develop effective model degrees of freedom for penalized spline conditional quantiles. Two smoothing parameter selection criteria, Generalized Approximate Cross-validation (GACV) and a Schwarz-type Information Criterion (SIC), are studied. Some asymptotic properties are established. Finite sample properties are investigated through simulation studies. Application to the Boston Housing data demonstrates the success of the proposed approach. Both simulations and real applications show encouraging results for the proposed estimators. In the second essay (Chapter 2), we investigate whether the conditional CAPM helps explain the value premium using the single-index varying-coefficient model. Our empirical specification has two novel advantages relative to those commonly used in previous studies. First, it allows not only for a flexible dependence of the conditional beta on state variables but also for modeling heteroskedasticity. Second, from a large set of candidate state variables, we identify the most influential ones through an exhaustive variable selection method. 
(open full item for complete abstract)

    Committee: Yan Yu Ph.D. (Committee Chair); Hui Guo Ph.D. (Committee Member); Martin Levy Ph.D. (Committee Member) Subjects: Statistics
  • 7. SARTOR, MAUREEN TESTING FOR DIFFERENTIALLY EXPRESSED GENES AND KEY BIOLOGICAL CATEGORIES IN DNA MICROARRAY ANALYSIS

    PhD, University of Cincinnati, 2007, Medicine : Biostatistics (Environmental Health)

    DNA microarrays are a revolutionary technology able to measure the expression levels of thousands of genes simultaneously, providing a snapshot in time of a tissue or cell culture's transcriptome. Although microarrays have been in existence for several years now, research is still ongoing into how best to analyze the data, at least partly due to the combination of small sample sizes (few replicates) with large numbers of genes. Several challenges remain in maximizing the amount of biological information attainable from a microarray experiment. The key components of microarray analysis where these challenges lie are experimental design, preprocessing, statistical inference, identifying expression patterns, and understanding biological relevance. In this dissertation we aim to improve the analysis and interpretation of microarray data by concentrating on two key steps in microarray analysis: obtaining accurate estimates of significance when testing for differentially expressed genes, and identifying key biological functions and cellular pathways affected by the experimental conditions. We identify opportunities to enhance analytical techniques, and demonstrate that these enhancements significantly improve the functional interpretation of microarray results. We develop three related Bayesian statistical models to improve the estimates of significance by exploiting the information available from all genes, and functionally relating the gene variances to their expression levels. These novel methodologies are compared to previously proposed methods both in simulations and with real-world experimental data from multiple microarray platforms. In addition, we introduce a logistic regression method for identifying key biological categories and molecular pathways, and compare this method with the commonly used Fisher's exact test and other relevant previously developed methods. 
We make our statistical methods available to the biomedical research community through the use (open full item for complete abstract)

    Committee: Dr. Mario Medvedovic (Advisor) Subjects:
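The abstract above compares a logistic-regression enrichment method against Fisher's exact test, the standard baseline for category enrichment. The baseline test on a hypothetical 2x2 gene-category table (counts invented for illustration) looks like this with scipy:

```python
from scipy.stats import fisher_exact

# hypothetical contingency table:
# rows = gene in category / not in category
# cols = differentially expressed / not differentially expressed
table = [[8, 2],
         [1, 5]]

# two-sided Fisher's exact test; first value is the sample odds ratio (8*5)/(2*1)
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
```

A small p-value suggests the category is enriched among differentially expressed genes; the logistic-regression approach in the dissertation generalizes this by modeling category membership as a function of gene-level evidence rather than a hard significant/non-significant cut.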
  • 8. Lee, Won Hee Bundle block adjustment using 3D natural cubic splines

    Doctor of Philosophy, The Ohio State University, 2008, Geodetic Science and Surveying

    One of the major tasks in digital photogrammetry is to determine the orientation parameters of aerial imagery correctly and quickly, which involves the two primary steps of interior orientation and exterior orientation. Interior orientation defines a transformation to a 3D image coordinate system with respect to the camera's perspective center, while a pixel coordinate system is the reference system for a digital image, using the geometric relationship between the photo coordinate system and the instrument coordinate system. Since aerial photography provides the interior orientation parameters, the problem reduces to determining the exterior orientation with respect to the object coordinate system. Exterior orientation establishes the position of the camera projection center in the ground coordinate system and the three rotation angles of the camera axis that represent the transformation between the image and object coordinate systems. Exterior orientation parameters (EOPs) of a stereo model consisting of two aerial images can be obtained using relative and absolute orientation. EOPs of multiple overlapping aerial images can be computed using bundle block adjustment. Bundle block adjustment reduces the cost of field surveying in difficult areas and verifies the accuracy of field surveying during the adjustment process. Bundle block adjustment is a fundamental task in many applications, such as surface reconstruction, orthophoto generation, image registration and object recognition. Point-based methods with experienced human operators work well in traditional photogrammetric practice, but not in the autonomous environment of digital photogrammetry. To develop more robust and accurate techniques, higher-level objects such as straight linear features, which accommodate elements other than points, are adopted in place of points in aerial triangulation. 
Even though recent advanced algorithms provide accurate and reliable linear feature extraction, ext (open full item for complete abstract)

    Committee: Anton F. Schenk (Advisor); Alper Yilmaz (Committee Co-Chair); Ralph R.B. von Frese (Committee Member) Subjects: Civil Engineering
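The thesis above builds bundle block adjustment on 3D natural cubic splines. A one-dimensional scipy analogue — a sketch of the spline primitive, not of the adjustment itself — shows the defining "natural" end conditions (zero second derivative at both boundary knots):

```python
import numpy as np
from scipy.interpolate import CubicSpline

# sample a smooth curve at a handful of knots
x = np.linspace(0.0, 4.0, 9)
y = np.sin(x)

# 'natural' boundary conditions force the second derivative to vanish at the ends
cs = CubicSpline(x, y, bc_type="natural")

end_curvatures = (cs(x[0], 2), cs(x[-1], 2))  # second derivatives at the two ends
value_off_knot = cs(1.3)                      # smooth evaluation between knots
```

The 3D curves used in the thesis parameterize each coordinate by such a spline, so a linear feature can enter the adjustment as a differentiable object rather than as isolated tie points.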
  • 9. Twa, Michael Structural classification of glaucomatous optic neuropathy

    Doctor of Philosophy, The Ohio State University, 2006, Physiological Optics

    Glaucoma is a leading cause of blindness. Quantitative methods of imaging the optic nerve head (e.g. confocal scanning laser tomography) are increasingly used to diagnose glaucomatous optic neuropathy and monitor its progression, yet there is considerable controversy about how to interpret and make best use of this structural information. In this research, machine learning methods are proposed and evaluated as alternatives to current methods of disease classification. First, multiple mathematical modeling methods such as radial polynomials, wavelet analysis and B-spline fitting were used to reconstruct topographic descriptions of the optic nerve head and peripapillary region. Next, features derived from these models were extracted and used as classification features for automated decision tree induction. Decision tree classification performance was compared with conventional techniques such as expert grading of stereographic photos, Moorfields Regression Analysis, and visual field-based standards for the cross-sectional identification of glaucomatous optic neuropathy. Pseudo-Zernike polynomial modeling methods provided the most compact and faithful representation of these structural data, albeit at considerably greater computational expense compared to the wavelet and B-spline modeling methods. The pseudo-Zernike-based classifier had the greatest area under the receiver-operating characteristic (ROC) curve, 85% compared to 73% and 71% for the wavelet and B-spline-based classification models respectively. These results show that automated analysis of optic nerve head structural features can identify glaucomatous optic neuropathy in very good agreement with expert assessments of stereographic disc photos. Moreover, these quantitative methods can improve the standardization and agreement of these assessments. Extensions of these methods may provide alternative ways to evaluate structural and functional disease relationships in glaucoma.

    Committee: Mark Bullimore (Advisor) Subjects:
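The ROC areas reported above (85% vs. 73% and 71%) can be computed from classifier scores via the rank-sum (Mann-Whitney) identity. A minimal numpy sketch with made-up scores, assuming no tied scores:

```python
import numpy as np

def roc_auc(scores, labels):
    # AUC via the Mann-Whitney U statistic: probability that a random
    # positive outranks a random negative (assumes untied scores)
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    pos = labels == 1
    n_pos, n_neg = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# toy example: one positive is out-scored by one negative -> 3 of 4 pairs correct
scores = np.array([0.1, 0.4, 0.35, 0.8])
labels = np.array([0, 0, 1, 1])
auc = roc_auc(scores, labels)
```

An AUC of 0.5 corresponds to chance-level ranking and 1.0 to perfect separation, which is the scale on which the three classifiers in the study are compared.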
  • 10. Andrew, Steven Tools for the simulation and analysis of aerodynamic models

    Master of Science (MS), Ohio University, 1999, Electrical Engineering & Computer Science (Engineering and Technology)

    Within this thesis a set of tools is developed for creating and evaluating differentiable functions of two and three variables from tables of data in the Matlab® environment. These functions, known respectively as bivariate and trivariate splines, are based on single-variable cubic piecewise polynomial splines in each variable. For this reason they are easily differentiated, and by definition have continuous first and second order derivatives in each of their variables. A Simulink model is also developed that simulates the flight characteristics of a six degree-of-freedom missile. This model utilizes the multivariate splining techniques to interpolate the missile's aerodynamic force and moment data. The simulation is capable of demonstrating the response of the missile to any arbitrary set of commanded input fin deflections. In addition, it provides a test platform that can be used to evaluate the performance of controllers developed for the missile. A separate set of tools is developed to linearize the model about an arbitrary equilibrium configuration. This provides an initial step towards the development of gain-scheduled controllers for the system.

    Committee: Douglas Lawrence (Advisor) Subjects:
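The bivariate spline interpolation of tabulated data described above can be sketched with scipy's `RectBivariateSpline` standing in for the thesis's Matlab tools. The gridded "table" below is an invented smooth function, not actual aerodynamic data:

```python
import numpy as np
from scipy.interpolate import RectBivariateSpline

# hypothetical gridded table of a smooth function of two variables
# (e.g. a coefficient tabulated over angle of attack and Mach number)
alpha = np.linspace(-10.0, 10.0, 11)   # degrees
mach = np.linspace(0.5, 2.5, 9)
table = np.cos(np.deg2rad(alpha))[:, None] * mach[None, :]

# bicubic spline (kx=ky=3, s=0 interpolates the grid exactly);
# cubic pieces give continuous first and second partial derivatives
spl = RectBivariateSpline(alpha, mach, table, kx=3, ky=3, s=0)

val = spl.ev(3.3, 1.7)            # evaluate off-grid
d_dalpha = spl.ev(3.3, 1.7, dx=1) # first partial derivative in alpha
```

The easy access to partial derivatives is what makes such splines convenient for the linearization step mentioned in the abstract, since the Jacobian of the aerodynamic model at an equilibrium can be read off from the spline derivatives.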