Search Results

(Total results 67)


  • 1. Schafer, Lindsey Statistical Analysis of Mining Parameters to Create Empirical Models to Predict Mine Pool Formation in Underground Coal Mines

    Master of Science (MS), Ohio University, 2018, Geological Sciences (Arts and Sciences)

    Mining has occurred in Ohio for over two hundred years and has resulted in hundreds of flooded or partially flooded mines releasing acid mine drainage. Present mining regulations in Ohio prevent the approval of a mining permit if it is predicted that the future mine will develop a mine pool that will discharge to the surface. However, there is no established methodology that mining companies and regulators can use to determine, with a known degree of uncertainty, whether a mine will develop a pool. This thesis work is part of a larger project that aims to create a tool that can predict whether a mine pool will form in a future coal mine and, if it does, where it could discharge. Mines exploited in Ohio during the last 35 years were investigated. Public data sources were used to identify variables that could influence the water elevation within mines during and after mining. Information about boreholes and wells is reported in mine permits and quarterly monitoring reports and on ODNR and EPA web pages. The following variables were collected and investigated: surface elevation of the well, bottom-of-well elevation, overburden thickness, thickness of the different strata (coal, sandstone, shale, clay, and limestone), cumulative coal volume extracted from the mine at different times, water withdrawn from the mine, and precipitation. Using the statistical program The Unscrambler X, multivariate statistical analysis was applied to identify the most important variables that determine the potentiometric head and to obtain regression equations of potentiometric head as a function of the collected variables. The applied methods include principal component analysis, principal component regression, and partial least squares regression. These methods were performed on these variables for 359 wells in eleven mines. 
This analysis resulted in regression equations that allow for the prediction of potentiometric heads at different depths using hydrological param (open full item for complete abstract)

    Committee: Dina Lopez Dr. (Advisor); Natalie Kruse Dainels Dr. (Committee Member); Gregory Nadon Dr. (Committee Member) Subjects: Geochemistry; Geology; Hydrology; Mining
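
The abstract names principal component analysis, principal component regression, and partial least squares regression as the applied methods. As a minimal sketch of principal component regression with numpy alone (the predictor scales, coefficients, and noise level below are invented stand-ins, not the thesis data):

```python
import numpy as np

rng = np.random.default_rng(0)
n_wells, n_vars = 359, 7   # 359 wells, as in the abstract; variables are hypothetical

# Hypothetical predictors with unequal variances (overburden thickness,
# strata thicknesses, cumulative coal volume, precipitation, ...)
scales = np.array([3.0, 2.5, 2.0, 1.5, 1.0, 0.5, 0.3])
X = rng.normal(size=(n_wells, n_vars)) * scales
head = X.sum(axis=1) + rng.normal(scale=2.0, size=n_wells)  # toy potentiometric head

# Principal component regression: project onto the leading PCs, then regress
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 5                                    # number of retained components
scores = Xc @ Vt[:k].T                   # PC scores
coef, *_ = np.linalg.lstsq(scores, head - head.mean(), rcond=None)
pred = scores @ coef + head.mean()
r2 = 1 - np.sum((head - pred) ** 2) / np.sum((head - head.mean()) ** 2)
print(f"PCR R^2 with {k} components: {r2:.3f}")
```

Dropping the trailing components trades a little fit for a more stable regression when predictors are correlated or noisy.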
  • 2. Shorrab, Yousef Quantifying the Crevice Corrosion Mechanism in Alloy 625 and SS 316L in Chloride-Containing Environments

    Doctor of Philosophy, University of Akron, 2024, Chemical Engineering

    The aim of this work is to draw links between 1-D pitting corrosion in the literature and crevice corrosion in Alloy 625 and SS 316L in chloride-containing environments. Crevice corrosion initiation, propagation, and repassivation were studied using real-time optical imaging and UV spectroscopy, and the crevice corrosion mechanism was investigated for different cases with the ultimate goal of finding a unifying mechanism for crevice corrosion that is in line with 1-D pitting corrosion. In the first part of this work, crevice corrosion was investigated for LPBF AM Alloy 625 specimens in comparison to the conventionally wrought condition. The AM specimens tested were of two different orientations with respect to the build platform. In addition, the tests were carried out for specimens in the as-made (untreated) condition as well as specimens that were subjected to post-manufacturing heat treatments including stress relieving, solution annealing, and solution plus stabilization annealing. Hence, the effect of heat treatment and build orientation on the susceptibility of LPBF additively manufactured Alloy 625 to crevice corrosion was investigated. There was not sufficient evidence that build orientation affects crevice corrosion susceptibility. Nevertheless, it was shown that heat treatment does affect crevice corrosion susceptibility. In addition, though the crevice corrosion susceptibility of as-made AM Alloy 625 was not remarkably different from that of wrought 625, solution annealing improves the crevice corrosion performance of the AM specimens beyond that of the wrought condition. Crevice corrosion performance differences could be explained by microstructure, reflected in the corrosion morphology. Nevertheless, the different AM alloys studied followed the same kinetics/mechanism as the wrought alloy, with similar trends in current density and repassivation potential. 
Such kinetics/mechanisms appear in the literature for 1-D pits, supporting the applicability of the “1D (open full item for complete abstract)

    Committee: Robert Lillard (Advisor); Qixin Zhou (Committee Member); Dmitry Golovaty (Committee Member); Gregory Morscher (Committee Member); Linxiao Chen (Committee Member) Subjects: Chemical Engineering; Materials Science
  • 3. Yazbeck, Maha Novel Forward-Inverse Estimation and Hypothesis Testing Methods to Support Pipeline and Brain Image Analyses.

    Doctor of Philosophy, The Ohio State University, 2024, Industrial and Systems Engineering

    This dissertation addresses two applied problems relating to images. The first relates to images of pipeline corrosion and the second relates to images of the human brain and individuals with Attention-Deficit/Hyperactivity Disorder (ADHD). The corrosion of oil and gas pipelines is important because there are thousands of leaks every year costing billions of dollars for cleanups. ADHD is important because a substantial fraction of the world population has the disorder, causing significant suffering and hundreds of billions of dollars of losses to the world economy. To address both image analysis problems, novel statistical and operations research techniques are proposed which have potentially wide applicability. Relating to pipeline corrosion, an established simulation method is called the “voxel” method, which permits predictions about how images and pipelines or other media will change as corrosion evolves. In most realistic cases, we find that the parameter values or “inputs” (Xs) needed to run the simulation are unknown. We only have the images, which are essentially outputs (Ys) that can be generated by real world experiments or simulations. The phenomenon of having incomplete inputs for simulation is common in many engineering and science situations and a critical challenge for both people and artificial intelligence. We and others have called this important subject “empirical forward-inverse estimation,” since we can gather data (empirically) in the forward manner progressing from assumed inputs (Xs) to measured outputs (Ys) and then generate inverse predictions from Ys to Xs. With (hopefully) accurately estimated X values, the experimental setup or simulation can then predict the future corrosion evolution and whether repair is critically needed. 
Relating to forward-inverse analyses, 24 variants of an established two-stage method or framework are studied in relation to enhanced inverse prediction accuracy for two test cases, including pipeline corrosion (open full item for complete abstract)

    Committee: Theodore T. Allen (Advisor); William (Bill) Notz (Committee Member); Samantha Krening (Committee Member); Marat Khafizov (Committee Member) Subjects: Engineering; Industrial Engineering; Materials Science; Statistics
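
The two-stage "empirical forward-inverse estimation" idea described above can be sketched in a few lines: run a forward model over a design of assumed inputs (Xs), fit a surrogate to the resulting outputs (Ys), and invert the surrogate for an observed Y. The forward function and all numbers here are hypothetical stand-ins for the voxel simulation, not the dissertation's actual models:

```python
import numpy as np

def forward(x):
    # Hypothetical forward model: corrosion depth as a function of a single
    # unknown rate parameter x (a stand-in for the voxel-simulation inputs)
    return 2.0 * np.sqrt(x) + 0.1 * x

# Stage 1 (forward): run the "simulation" over a design of assumed inputs
xs = np.linspace(0.1, 10.0, 50)
ys = forward(xs)

# Stage 2 (inverse): fit a surrogate to (X, Y), then invert it for an
# observed output to recover the input that plausibly produced it
surrogate = np.polynomial.Polynomial.fit(xs, ys, deg=3)
y_obs = forward(4.0)                       # pretend this came from a real image
grid = np.linspace(0.1, 10.0, 10001)
x_hat = grid[np.argmin((surrogate(grid) - y_obs) ** 2)]
print(f"true x = 4.0, estimated x = {x_hat:.2f}")
```

With the estimated input in hand, the forward model can then be re-run to extrapolate future corrosion, which is the payoff the abstract describes.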
  • 4. Afzal, Muhammad Hassan Bin The Legislative Politics and Public Attitude on Immigrants and Immigration Policies Amid Health Crises

    PHD, Kent State University, 2023, College of Arts and Sciences / Department of Political Science

    By thoroughly analyzing 910 U.S. House immigration bills from the 113th, 114th, 115th, and 116th Congressional sessions, my Ph.D. research delves into the impact of health crises on introduced U.S. House immigration bills in the United States. My research fills a crucial gap in the literature by examining the influence of legislative policy entrepreneurs (LPEs) on agenda-setting and socio-political discourse in health crises. By using Kingdon's policy entrepreneur theory and the inductive qualitative method of relational content analysis, I explore the general theme, underlying tone, rhetoric, and proposed measures of House immigration bills during health crisis versus non-health crisis periods. The findings reveal that elected House representatives are more likely to introduce restrictive immigration bills during health crises and that geographical location and political affiliation play a significant role in shaping these bills' rhetoric and proposed measures. Using the cumulative ANES dataset from 1948 to 2020, I demonstrate that the general population tends to be less welcoming towards immigrants and favors more restrictive immigration policies during health crises. Political ideology, education, income scale, and gender significantly determine public attitudes toward immigration policies and immigrants. My research sheds light on economic conditions, political environment, and legal frameworks that influence the legislative activity of elected House representatives and the public's attitudes toward immigration policy. The findings provide valuable insights and directions for future research, policy, and practice efforts toward a more equitable and just society.

    Committee: Ryan L. Claassen Ph.D. (Advisor); Daniel E. Chand Ph.D. (Committee Co-Chair); Oindrila Roy Ph.D. (Committee Member); Anthony D. Molina Ph.D. (Committee Member); Elizabeth M. Smith-Pryor Ph.D. (Other) Subjects: American Studies; Climate Change; Economics; Health; Health Care Management; Political Science; Public Health; Public Policy; Rhetoric; Social Research; Statistics; Sustainability
  • 5. Hafez, Mhd Ammar AN IMPROVED POLYNOMIAL CHAOS EXPANSION BASED RESPONSE SURFACE METHOD AND ITS APPLICATIONS ON FRAME AND SPRING ENGINEERING BASED STRUCTURES

    Doctor of Philosophy in Engineering, Cleveland State University, 2022, Washkewicz College of Engineering

    In engineering fields, computational models provide a tool that can simulate a real-world response and enhance our understanding of physical phenomena. However, such models are often computationally expensive, with multiple sources of uncertainty related to the model's inputs and assumptions. For example, the literature indicates that ligament material properties and insertion site locations have a significant effect on the performance of knee joint models, which makes addressing the uncertainty related to them a crucial step in making a computational model more representative of reality. However, previous sensitivity studies were limited by the computational expense of the models. The high computational expense of sensitivity analysis can be addressed by performing the analysis with a reduced number of model runs or by creating an inexpensive surrogate model. Both approaches are addressed in this work through the use of polynomial chaos expansion (PCE)-based surrogate models and design of experiments (DoE). Therefore, the objectives of this dissertation were to: (1) provide guidelines for the use of PCE-based models and investigate their efficiency for non-linear problems; and (2) utilize PCE- and DoE-based tools to introduce efficient sensitivity analysis approaches to the field of knee mechanics. To achieve these objectives, a frame structure was used for the first aim, and a rigid body computational model of two knee specimens was used for the second aim. Our results showed that, for PCE-based surrogate models, once the recommended number of samples is used, increasing the PCE order produces more accurate surrogate models. This conclusion was reflected in the R2 values realized for three highly non-linear functions (0.9998, 0.9996, and 0.9125, respectively). 
Our results also showed that the PCE- and DoE-based sensitivity analyses produced practically identical results with significant savings in the computational cost of sensitivity an (open full item for complete abstract)

    Committee: Jason Halloran (Advisor); Lutful Khan (Committee Member); Daniel Munther (Committee Member); Josiah Sam Owusu-Danquah (Committee Member); Stephen Duffy (Committee Member) Subjects: Biomechanics; Biomedical Engineering; Biomedical Research; Civil Engineering
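
A one-dimensional PCE surrogate of the kind the abstract describes can be sketched with probabilists' Hermite polynomials, which are orthogonal with respect to a standard normal input. The response function, order, and sample count below are illustrative assumptions, not the dissertation's frame or knee models:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander

def response(xi):
    # Toy nonlinear model response in a standard-normal input
    return np.exp(0.5 * xi) + 0.3 * xi ** 2

order, n_samples = 6, 200            # order and sample count are illustrative
rng = np.random.default_rng(1)
xi = rng.standard_normal(n_samples)
Psi = hermevander(xi, order)         # probabilists' Hermite basis, orthogonal w.r.t. N(0,1)
coef, *_ = np.linalg.lstsq(Psi, response(xi), rcond=None)

# Validate the surrogate on fresh samples
xi_test = rng.standard_normal(1000)
resid = response(xi_test) - hermevander(xi_test, order) @ coef
r2 = 1 - np.sum(resid ** 2) / np.sum((response(xi_test) - response(xi_test).mean()) ** 2)
print(f"surrogate R^2 at order {order}: {r2:.4f}")
```

Once fitted, evaluating the surrogate costs a handful of polynomial evaluations instead of a full model run, which is exactly the saving a sensitivity analysis exploits.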
  • 6. Pruitt, Marie Consider the Big Picture: A Quantitative Analysis of Readability and the Novel Genre, 1800-1922

    Master of Arts, Miami University, 2022, English

    What can readability studies tell us about the novel genre? By tracing both the history of readability studies, a partially abandoned field located at the intersection of education and literacy studies, and the history of the English language novel, this project makes a case for the validity of conversations around readability within literary circles. One of the primary outcomes of readability studies is a number of formulas that measure various elements of a text, such as vocabulary and sentence structure. However, few formulas were created with fiction, or more specifically, the novel genre, in mind. To determine the possible applications of classic readability formulas for the novel genre, this project uses a digital readability formula to measure the readability of a corpus of 127 English language novels from 1800 to 1922. However, the resulting data highlights the difficulty of measuring such a wide-ranging, unique literary genre. Finally, this project proposes a framework for using a statistical analysis of novels to identify potential lines of inquiry favorable to close reading. By approaching novels through a quantitative lens, this project highlights how considering the bigger picture can help us determine which specific elements may lead to a richer understanding of the text.

    Committee: Collin Jennings (Committee Chair); Tim Lockridge (Committee Member); Mary Jean Corbett (Committee Member) Subjects: American Literature; British and Irish Literature; Literacy; Literature
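
Classic readability formulas of the kind the thesis surveys combine sentence length and syllable counts. A rough sketch of the Flesch Reading Ease formula (the syllable counter is a crude vowel-group approximation, not the project's actual digital formula):

```python
import re

def count_syllables(word):
    # Crude approximation: count vowel groups, at least 1 per word
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    # Classic Flesch formula: 206.835 - 1.015*(words/sentence) - 84.6*(syllables/word)
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / sentences) - 84.6 * (syllables / len(words))

sample = "The cat sat on the mat. It was a sunny day."
score = flesch_reading_ease(sample)
print(f"Flesch Reading Ease: {score:.1f}")
```

Higher scores indicate easier text; as the thesis notes, formulas like this were built for educational prose, which is part of why applying them to novels is not straightforward.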
  • 7. Sadeqi, Sara Effect of Whole-Body Kinematics on ACL Strain and Knee Joint Loads and Stresses during Single-Leg Cross Drop and Single-Leg Landing from a Jump

    Doctor of Philosophy, University of Toledo, 2022, Engineering

    Anterior cruciate ligament (ACL) injury is quite common among young athletes, with the number of injury cases exceeding 120,000 annually in the United States alone, over 70% of which are non-contact injuries. Forces and moments acting on the knee joint play essential roles in these injuries, and motions of the other body segments can increase or decrease these loads. In this study, the effect of whole-body (WB) kinematics on knee joint biomechanics was investigated using in vivo and in silico methods. Motion analysis experiments were done on 14 able-bodied young participants wearing a full-body marker set and performing two variations of single-leg landings using their left and right limbs (4 tasks). Marker trajectories and force plate data were recorded from the in vivo experiments of these participants. The in silico investigations consisted of two separate parts. First, musculoskeletal simulations were done to obtain whole-body kinematics, kinetics, and muscle forces, using inverse kinematics, inverse dynamics, and static optimization techniques. The next part was non-linear dynamic finite element (FE) analysis. A dynamic/explicit FE knee model was developed from medical images of a healthy young female and validated against in vitro experiments for knee joint kinematics and ligament strains. Material properties for the knee cruciate and collateral ligaments were obtained through optimization against experimental tensile test data in the literature. Then, the participants' data from musculoskeletal simulations were used as the input to the FE analyses. The FE outputs included ACL strain, knee joint contact forces, contact pressures, and soft tissue stresses. In order to find the relationship between WB kinematics and knee joint biomechanics, correlation analysis was used. 
Using Spearman correlation coefficients and P-values, the correlation between WB modifiable parameters and knee biomechanics along with their stat (open full item for complete abstract)

    Committee: Vijay Goel Dr. (Advisor) Subjects: Biomechanics; Biomedical Engineering
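
Spearman correlation, used in the abstract to relate whole-body parameters to knee biomechanics, is simply the Pearson correlation of ranks. A minimal sketch on invented data (the trunk-flexion/ACL-strain relationship below is hypothetical, and the rank computation assumes no ties):

```python
import numpy as np

def spearman_rho(x, y):
    # Spearman's rho = Pearson correlation of the ranks (no-ties case;
    # tied values would need average ranks)
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    rx -= rx.mean()
    ry -= ry.mean()
    return float((rx @ ry) / np.sqrt((rx @ rx) * (ry @ ry)))

# Hypothetical data: trunk flexion angle vs. peak ACL strain, 14 participants
rng = np.random.default_rng(2)
trunk_flexion = rng.uniform(5, 40, size=14)
acl_strain = 6.0 - 0.08 * trunk_flexion + rng.normal(scale=0.3, size=14)
rho = spearman_rho(trunk_flexion, acl_strain)
print(f"Spearman rho: {rho:.2f}")
```

Because it works on ranks, Spearman's rho captures any monotone relationship without assuming linearity, which suits biomechanical variables measured on different scales.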
  • 8. Parise, Charles The Population Status and Diet of the North American River Otter in Ohio

    Master of Science, The Ohio State University, 2021, Environment and Natural Resources

    The North American river otter (Lontra canadensis) is a vitally important species, both biologically as an apex predator in riverine ecosystems, and economically as a furbearer species. River otters were once present throughout much of North America but were extirpated throughout much of the central United States by the 1970s due to excessive harvest, habitat loss, and other factors. Several reintroductions occurred throughout the 1980s and 1990s, including an effort in Ohio between 1986-1993. The reintroduction effort in Ohio was considered successful enough that legal harvest was reimplemented in 2005. Since 2005, Ohio Division of Wildlife data have suggested that harvest is largely successful and sustainable, but little attention is currently being paid to population demographics and spatiotemporal variation of the populations in the state, especially as they relate to survival probability and harvest vulnerability. Additionally, the diet of river otters in Ohio and how it varies with age group, sex, and region are not well understood. Several studies have examined river otter diet in other locations using scat analysis, and the diets of other otter species have also been studied with stable isotope analysis. Previous research has indicated that river otters eat smaller fish in shallower waters or larger, less mobile, bottom-feeding fish, in addition to some invertebrates that help supplement their diet. This study documented the current demographics of the river otter population in Ohio; modeled population trends, survival probability, harvest vulnerability, and recruitment trends for the river otter population in Ohio; determined the contribution of several prey items or prey item groups to river otter diet via stable isotope analysis; and determined how the contributions of these prey items or prey item groups varied with age group, sex, and location. 
Population demographics, specifically the larger share of older individuals and older reproductive females, (open full item for complete abstract)

    Committee: Stanley Gehrt (Advisor); Hance Ellington (Committee Member); Jeremy Bruskotter (Committee Member); Mažeika Sullivan (Committee Member) Subjects: Ecology; Environmental Science
  • 9. Matuk, James Bayesian Modelling Frameworks for Simultaneous Estimation, Registration, and Inference for Functions and Planar Curves

    Doctor of Philosophy, The Ohio State University, 2021, Statistics

    Functional Data Analysis (FDA) and Statistical Shape Analysis (SA) are fields in which the data objects of interest vary over a continuum, such as univariate functions and planar curves. While observations are typically measured and stored discretely, there are inherent benefits in acknowledging the infinite-dimensional processes from which the data arise. The typical statistical goals in FDA and SA are summarization, visualization, inference, and prediction. However, the geometric structure of the data presents unique challenges. In FDA, the observations exhibit two distinct forms of variability: amplitude, which describes the magnitude of features, and phase, which describes the relative timing of amplitude features. In SA, objects are analyzed through their shape, which is a quantity that remains unchanged if the object is scaled, translated, rotated in space, or reparametrized (referred to as shape-preserving transformations). Within both fields, analysis usually follows unrelated sequential steps. First, an estimation step is used to obtain an infinite-dimensional representation of the discretely measured observations. Then, a registration step is used to decouple amplitude and phase variability in the FDA setting, and to remove variability in the observations associated with shape-preserving transformations in the SA setting. Finally, inference can be performed based on the registration results. There are two well-documented drawbacks to this sequential pipeline. (1) There is no formal uncertainty propagation between steps, which leads to overconfidence in inferential results. (2) There is a lack of flexibility under realistic observation regimes, such as sparsely sampled or fragmented observations. Previous methods that have attempted to overcome these drawbacks suffer from being too rigid or fail to account for misregistration of observations. 
In this thesis, we develop flexible modelling frameworks for FDA and SA that simultaneously perform t (open full item for complete abstract)

    Committee: Oksana Chkrebtii (Advisor); Sebastian Kurtek (Advisor); Peter Craigmile (Committee Member); Radu Herbei (Committee Member) Subjects: Statistics
  • 10. He, Karen DETECTING LOW FREQUENCY AND RARE VARIANTS ASSOCIATED WITH BLOOD PRESSURE

    Doctor of Philosophy, Case Western Reserve University, 2020, Epidemiology and Biostatistics

    Hypertension (HTN) or elevated blood pressure (BP) affects 1 in 3 adults in the US. Across ethnicities, BP levels have been consistently higher for African Americans (AA), with an earlier onset of HTN. Many studies have investigated racial differences in HTN, especially genetic factors contributing to disease progression. While genome-wide association studies (GWAS) have identified over 900 loci associated with BP variation, these variants together explain only a small portion of the heritability. Several studies have shown that rare variants could explain a portion of the “missing heritability.” Linkage analysis of families is a promising approach for detecting genetic signals because it is insensitive to allelic heterogeneity and facilitates the discovery of missing heritability due to rare variants. This dissertation includes two of the first studies to leverage linkage evidence from family-based studies and search for BP-associated rare variants in trans-ethnic whole genome sequencing (WGS) data. Because only a small number of GWAS variants fall within linkage regions, combining linkage and association analyses would yield a powerful and robust approach for detecting rare variants. Given the increasing availability of WGS data, efficient approaches are needed to interrogate a large number of genetic variants involved in disease etiology. Although directly searching the whole genome using window-based or gene-based approaches is commonly implemented, these methods may suffer from a loss of statistical power due to the large number of statistical tests. In contrast, confining the search to genomic regions with linkage evidence helps to reduce the multiple testing burden. Furthermore, because variant annotation is independent of linkage information, it can be incorporated into rare variant association analysis. By leveraging linkage evidence from European American (EA) families, SLX4 was shown to be associated with pulse pressure in EA. 
Linkage evidence in AA families led to the (open full item for complete abstract)

    Committee: Scott Williams (Committee Chair); Xiaofeng Zhu (Advisor); Fredrick Schumacher (Committee Member); Jing Li (Committee Member) Subjects: Biostatistics; Epidemiology; Genetics
  • 11. Park, Joonsuk Using Sequential Sampling Models to Detect Selective Influences: Pitfalls and Recommendations.

    Doctor of Philosophy, The Ohio State University, 2019, Psychology

    Sequential sampling models such as the Diffusion Decision Model (DDM) and the Linear Ballistic Accumulator (LBA) are often used as measurement tools in psychology. However, two practical issues regarding their use have received limited attention: the identifiability of the models and the appropriateness of the follow-up testing procedures in terms of statistical power. In the present research, I address these problems to fill the gap in the literature. Specifically, I do the following two things. First, I formally conduct identifiability analyses of the DDM and LBA. As a result, I argue that some version of the DDM, namely the "full DDM," is unidentifiable, even when multiple experimental conditions are employed. I show that this problem arises due to the excess flexibility of the model, and it can only be solved by reducing the number of free parameters to be estimated. Second, I demonstrate that the use of t-tests while comparing parameter estimates cannot be justified because such a practice assumes an over-simplified, single-level hierarchical model. As such, the statistical power is shown to be suboptimal. Instead, it is recommended that one employ an alternative procedure that explicitly models uncertainties about the parameter estimates, such as meta-regression or Hierarchical Bayes (HB). It is shown that such solutions are better theoretically grounded, exhibit larger statistical power, or yield more precise parameter estimates. Recommendations for substantive researchers are provided based on these considerations.

    Committee: Trish Van Zandt (Advisor); Brandon Turner (Advisor); Jolynn Pek (Committee Member) Subjects: Quantitative Psychology
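
A basic two-boundary diffusion process of the kind the DDM describes can be simulated directly. This stripped-down sketch omits the across-trial variability parameters whose inclusion (the "full DDM") the dissertation argues makes the model unidentifiable; all parameter values are illustrative:

```python
import numpy as np

def simulate_ddm(drift, boundary, n_trials, dt=0.001, noise=1.0, seed=0):
    # Euler simulation of a two-boundary diffusion process starting at 0;
    # evidence accumulates until it crosses +boundary or -boundary
    rng = np.random.default_rng(seed)
    rts, choices = [], []
    for _ in range(n_trials):
        x, t = 0.0, 0.0
        while abs(x) < boundary:
            x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
            t += dt
        rts.append(t)
        choices.append(1 if x > 0 else 0)
    return np.array(rts), np.array(choices)

rts, choices = simulate_ddm(drift=1.5, boundary=1.0, n_trials=500)
print(f"upper-boundary proportion: {choices.mean():.2f}, mean RT: {rts.mean():.3f} s")
```

The drift and boundary jointly shape both choice proportions and response-time distributions, which is why fitted parameters, rather than raw RTs, are used as the measurements in this literature.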
  • 12. Humienny, Raymond Content Analysis of Video Game Loot Boxes in the Media

    Master of Science (MS), Ohio University, 2019, Journalism (Communication)

    Throughout the relatively nascent course of games media scholarship, representation of video games within popular mainstream media has tended to suggest an antagonistic relationship between those familiar and unfamiliar with video games. Yet this outlook fails to acknowledge the content of popular gaming media that can be equally critical of the representation of games in reporting. For instance, within the past two years, reports pertaining to video game “loot boxes” have not only shown that reward systems in certain games can structurally mimic online gambling, but that games and mainstream media can cohabitate in this reporting arena. Given our nascent understanding of gaming media from a mainstream perspective, this study examines how gaming and mainstream news outlets comparatively frame this “loot box” rewards practice. A content analysis of 274 articles containing the term “loot box(es)” revealed similarities wherein both types of media outlets framed “loot boxes” with political messages, references to gambling, and some form of normative judgment. Traditional news writing provided fewer overall frames than more opinionated types of writing. Political intervention was the greatest predictor of frames assigned by both media. Overall, the internal regulation of “loot boxes” and the games industry's opposition to government-assisted regulation are the strongest implications warranting future study of this controversy.

    Committee: Hans Meyer (Committee Chair); Aimee Edmondson (Committee Member); Michael Sweeney (Committee Member) Subjects: Communication; Journalism; Mass Communications; Mass Media
  • 13. Mori, Hiroko Environmental and Other Factors Contributing to the Spatio-Temporal Variability of West Nile Virus in the United States

    Doctor of Philosophy, The Ohio State University, 2018, Environmental Science

    The West Nile Virus (WNV) was introduced into the U.S. in the summer of 1999 and caused outbreaks of West Nile encephalitis. The virus is responsible for more than 45,000 cases in the U.S., including 2,017 fatalities. The virus is passed back and forth between infected birds and mosquitoes, but humans can be infected by the bite of an infected mosquito, and no vaccine is currently available for humans. WNV is difficult to eradicate because of its complex transmission behaviors, and the transmission dynamics can also be altered by a variety of factors. Key factors include host (bird) and vector (mosquito) abundances, the number of susceptible humans, weather patterns, land use, and land cover. Furthermore, small water bodies are the source of breeding mosquitoes and are promising targets for mosquito management. The ultimate goal of this dissertation was to better understand how WNV occurs in the continental U.S. by linking hydrological frameworks and other environmental and socio-economic factors for disease transmission. In the first study, statistical models were designed to identify the factors leading to human infection in a local area of North Dakota. This study addressed how variability in meteorological data and river management can affect disease transmission through its association with mosquitoes. The developed models also allowed prediction of the onset of virus infections, which can contribute to mosquito control or lead to a preemptive warning for protection. In addition, the findings and conceptual framework of my statistical approach could potentially be applied to prediction analysis of other mosquito-borne diseases. In the next study, a network analysis was applied to clarify how multiple factors affect WNV incidence rates in humans in the continental U.S. 
The study identified which factors are associated with the conditions that are susceptible to the virus, and when the surge of disease (open full item for complete abstract)

    Committee: Motomu Ibaraki (Advisor); Franklin Schwartz (Committee Member); Jiyoung Lee (Committee Member); C.K. Shum (Committee Member) Subjects: Environmental Science
  • 14. Madaris, Aaron Characterization of Peripheral Lung Lesions by Statistical Image Processing of Endobronchial Ultrasound Images

    Master of Science in Biomedical Engineering (MSBME), Wright State University, 2016, Biomedical Engineering

    This thesis introduces the concept of implementing greyscale analysis, also known as intensity analysis, on endobronchial ultrasound (EBUS) images for the purpose of diagnosing peripheral lung tumors. The statistical methodology of using greyscale and histogram analysis allows the characterization of lung tissue in EBUS images. Regions of interest (ROI) are analyzed in MATLAB, and a feature vector of first-order, second-order, and histogram greyscale features is created and used for the classification of malignant vs. benign peripheral lung tumors. The tools implemented were MedCalc for the initial statistical analysis of receiver operating characteristic (ROC) curves and multiple regression, and MATLAB for the machine learning and ROI collection. Feature analysis, multiple regression, and machine learning methods were used to better classify the malignant and benign EBUS images. The classification is assessed with a confusion matrix, ROC curve, accuracy, sensitivity, and specificity. It was found that minimum pixel value, contrast, and energy are the best determining factors for discriminating between benign and malignant EBUS images.

    Committee: Ulas Sunar Ph.D. (Advisor); Jason Parker Ph.D. (Committee Member); Jaime Ramirez-Vick Ph.D. (Committee Member) Subjects: Biomedical Engineering; Biomedical Research; Biostatistics; Computer Engineering; Engineering; Health Care; Medical Imaging
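
First-order greyscale statistics and second-order co-occurrence features such as the contrast and energy measures the thesis found discriminative can be sketched as follows. The thesis used MATLAB; this is an illustrative numpy version on a random stand-in ROI, with a single horizontal pixel offset assumed for the co-occurrence matrix:

```python
import numpy as np

def first_order_features(roi):
    # First-order greyscale statistics over the region of interest
    return {"mean": roi.mean(), "std": roi.std(),
            "min": int(roi.min()), "max": int(roi.max())}

def glcm_contrast_energy(roi, levels=8):
    # Second-order features from a grey-level co-occurrence matrix,
    # assuming a single horizontal (0 degree, distance 1) pixel offset
    q = (roi.astype(float) / 256 * levels).astype(int).clip(0, levels - 1)
    glcm = np.zeros((levels, levels))
    for i, j in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[i, j] += 1
    p = glcm / glcm.sum()                 # normalize to joint probabilities
    idx = np.arange(levels)
    contrast = float(np.sum(p * (idx[:, None] - idx[None, :]) ** 2))
    energy = float(np.sum(p ** 2))
    return contrast, energy

rng = np.random.default_rng(3)
roi = rng.integers(0, 256, size=(64, 64))   # random stand-in for an EBUS ROI
feats = first_order_features(roi)
contrast, energy = glcm_contrast_energy(roi)
print(feats, f"contrast={contrast:.2f}", f"energy={energy:.4f}")
```

Smooth, homogeneous tissue yields low contrast and high energy; speckled or heterogeneous tissue the reverse, which is the intuition behind using these features to separate tumor types.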
  • 15. Panozzo, Kimberly A Validation of Nass Crop Data Layer in the Maumee River Watershed

    Master of Arts, University of Toledo, 2016, Geography

    It is suspected that corn and soybean production in the Maumee Watershed has contributed to nutrient loading into Lake Erie, thereby affecting the frequency and duration of toxic algal blooms (Dolan 1993; Michalak et al. 2012). Accurate crop type estimation is important in order to determine the potential impact on the lake and to assess methods to reduce excess nutrient loading. Increasingly, the National Agricultural Statistics Service (NASS) Cropland Data Layer (CDL) is being used as a primary input for agricultural research and crop estimation modeling; assessing the accuracy of the CDL is therefore imperative. This study aims to validate the CDL, assess accuracy differences at multiple spatial scales, and examine the efficiency of using the CDL for future research in the region. Results of CDL validation using in situ field observations from 2011 and 2012 indicate an overall accuracy of 94% and 92%, respectively, and khat accuracies of 90% (2011) and 86% (2012). Crop-specific accuracy for corn, soy, and wheat also yielded considerably high user accuracy values, with slight differences between years. Accuracy measures vary by region and by year; however, in each circumstance analyzed, the differences were not significant. Of these measurable differences, the 2012 comparison contained a higher degree of difference, which may be attributed to drought in the region that year. It is concluded that NASS's CDL is an effective and efficient product for agricultural research.

    Committee: Kevin Czajkowski Ph.D. (Committee Chair); P.L. Lawrence Ph.D. (Committee Member); Dan Hammel Ph.D. (Committee Member) Subjects: Agriculture; Geographic Information Science; Geography; Land Use Planning; Remote Sensing
  • 16. Aksu, Alper BENCH-TOP VALIDATION OF INTELLIGENT MOUTH GUARD

    Master of Science in Biomedical Engineering, Cleveland State University, 2013, Fenn College of Engineering

    Concussion is the signature athletics injury of the 21st century. Scientists are hard at work monitoring the effects of hard impacts on the human brain; however, existing tools and devices are inadequate to screen these effects. Hence, a new approach is required to accurately quantify peak values of head impacts or concussions and relate these values to clinical brain health outcomes. A new head impact dosimeter, the Intelligent Mouth Guard (IMG), has been developed that can be conveniently located inside the mouth. In this study, the IMG printed circuit board (PCB), including four (4) high-quality, shock-resistant sensors, was developed and implemented as a tri-axial impact analyzer in a mouthpiece. The bench-top validation process of the IMG was divided into theoretical uncertainty analysis of linear accelerometers, theoretical uncertainty analysis of angular rate sensors, bench-top uniaxial impact testing of linear accelerometers, and bench-top uniaxial static testing of angular rate sensors. More specifically, this study also presents a method, based on National Bureau of Standards (NBS) guidelines, for analyzing measurement error for any component of a specialized electrical circuit and any type of data acquisition system. In the current application, the IMG PCB, utilized for linear acceleration, angular acceleration, and angular velocity measurements, has its sensor uncertainties quantified. The uncertainty model is branched into two parts: the bias error (B) and the random error (R). In this paper, expected measurement error types for PCB components (ADXL001 linear accelerometer, L3G4200D gyroscope) are quantified and their effects on the IMG system are computed. The uncertainty analysis presented here can be a guide for future in vitro and in vivo IMG validation tests. During bench-top testing, IMG linear accelerometers quantified peak linear acceleration with 98.2% accuracy and 98.0% precision. 
The IMG gyroscope quantified peak angular velocity with 97.0 (open full item for complete abstract)
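    The bias/random split described above is conventionally combined into a total uncertainty by root-sum-square, as in NBS-style uncertainty frameworks. A minimal sketch, with invented component magnitudes and a t-multiplier of 2 assumed for the random term (the thesis's exact formulation may differ):

```python
# Combining bias (B) and random (R) error sources into a total measurement
# uncertainty via root-sum-square. All component values are illustrative.
import math

def total_uncertainty(bias_terms, random_terms, t=2.0):
    """U = sqrt(B^2 + (t*S)^2): RSS of the pooled bias and t-scaled random error."""
    B = math.sqrt(sum(b * b for b in bias_terms))   # pooled bias limit
    S = math.sqrt(sum(r * r for r in random_terms))  # pooled random (precision) index
    return math.sqrt(B ** 2 + (t * S) ** 2)

# hypothetical accelerometer error sources, in g
bias = [0.5, 0.3]        # e.g. calibration, alignment
random_err = [0.2, 0.1]  # e.g. noise, repeatability
print(total_uncertainty(bias, random_err))
```

Propagating each sensor's bias and random terms this way is what lets the per-component error budget be rolled up into system-level accuracy and precision figures like those reported above.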

    Committee: Adam Bartsch Ph.D. (Advisor); Murad Hizlan Ph.D. (Committee Member); Sridhar Ungarala Ph.D. (Committee Member); Majid Rashidi Ph.D. (Committee Member) Subjects: Automotive Engineering; Biomedical Engineering; Electrical Engineering; Engineering; Mechanical Engineering
  • 17. Fisher, James Use of Remote Sensing in the Collection of Discontinuity Data for the Analysis and Design of Cut Slopes

    MS, Kent State University, 2011, College of Arts and Sciences / Department of Earth Sciences

    This study was conducted to examine the use of remote sensing techniques in the collection of discontinuity data for statistical and slope stability analyses. Two study areas were selected in Pulaski and Montgomery counties in central Virginia. Terrestrial LiDAR (light detection and ranging) and a transit compass were used to collect data at an abandoned quarry in the vicinity of Claytor Dam and Interstate 81 southwest of Christiansburg, Virginia. These data were used in a statistical analysis to compare both datasets and in a slope stability analysis for the adjacent section of Interstate 81. Digital photogrammetry was used to collect data on slopes along Interstate 81 northeast of Christiansburg. The digital photogrammetry dataset was qualitatively compared with the LiDAR dataset to illustrate differences and possible limitations of these remote sensing methods for the collection of discontinuity data. The objectives of this study were as follows: 1) compare the use of LiDAR and transit compass methods in collecting discontinuity orientation data through graphical and statistical analyses; 2) compare the kinematic analyses for both LiDAR and transit compass methods to determine the differences in the results; 3) compare LiDAR and photogrammetry methods to evaluate any limitations therein; and 4) compare the use of LiDAR and transit compass methods in the design of cut slopes along a portion of Interstate 81. For the comparison of the LiDAR and transit compass datasets, results show that the two datasets have similar mean orientation values for the corresponding discontinuity sets and are graphically similar when plotted on stereonet plots. However, the two datasets are not statistically derived from the same population. More importantly, a joint set was identified in the transit compass dataset that was either not detected or has a different mean orientation in the LiDAR dataset. 
These differences affected the kinematic analysis results and, therefore, the cut sl (open full item for complete abstract)
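    The mean orientation values compared above are typically computed as vector means of the individual measurements rather than arithmetic averages of the angles. A minimal sketch, treating each measurement as a (trend, plunge) line in degrees and averaging its direction cosines; the three input measurements are invented, and conventions (dip vector vs. pole) may differ from the thesis:

```python
# Vector-mean orientation of a discontinuity set from (trend, plunge) pairs,
# e.g. dip vectors measured by LiDAR or transit compass. Toy values only.
import math

def mean_orientation(measurements):
    """measurements: list of (trend_deg, plunge_deg); returns the vector mean."""
    n = e = d = 0.0
    for trend, plunge in measurements:
        t, p = math.radians(trend), math.radians(plunge)
        n += math.cos(p) * math.cos(t)  # north component
        e += math.cos(p) * math.sin(t)  # east component
        d += math.sin(p)                # down component
    mean_trend = math.degrees(math.atan2(e, n)) % 360
    mean_plunge = math.degrees(math.atan2(d, math.hypot(n, e)))
    return mean_trend, mean_plunge

lidar_set = [(118, 44), (122, 46), (120, 45)]
print(mean_orientation(lidar_set))
```

Comparing the vector means (and the dispersion about them) for corresponding LiDAR and compass sets is the kind of statistical check that revealed the datasets are graphically similar yet not drawn from the same population.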

    Committee: Abdul Shakoor PhD (Advisor); Donna Witter PhD (Committee Member); Dahl Peter PhD (Committee Member) Subjects:
  • 18. Duggal, Niti Retail Location Analysis: A Case Study of Burger King & McDonald's in Portage & Summit Counties, Ohio

    MA, Kent State University, 2007, College of Arts and Sciences / Department of Geography

    There has been growing interest among academia and the private sector in the use of GIS techniques in the analysis and planning of retail store networks. Over the past few decades, the methodologies used for research on the siting of retail outlets have become more sophisticated as a result of applicable modeling procedures being developed with GIS. This study conducts a retail location analysis of the relationship between the fast-food store performance of McDonald's and Burger King and the various spatial and socio-economic factors of their respective catchment areas. Analytical procedures in GIS and statistical techniques have been applied to carry out the analysis in this study. Study areas have been partitioned into a set of Thiessen polygons and into various spatial configurations using variable buffer polygons to emulate the catchment areas (i.e., trade areas) associated with each fast-food store. The socio-economic profiles in the partitioned polygons have been analyzed with a series of regression models. The results of the study provide a better understanding of how location factors influence the performance of the stores as well as how the socio-economic attributes of the catchment areas affect store revenues.
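    Thiessen (Voronoi) partitioning of the kind described above amounts to assigning every demand location to its nearest store. A minimal sketch with invented store and demand-point coordinates (a GIS implementation would build the polygon boundaries themselves):

```python
# Thiessen-polygon style trade-area assignment: each demand point belongs to
# the catchment of whichever store is closest. Coordinates are illustrative.

def assign_to_nearest(stores, points):
    """Map each demand point to its closest store (the Thiessen/Voronoi rule)."""
    def dist2(a, b):
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    return {p: min(stores, key=lambda s: dist2(stores[s], p)) for p in points}

stores = {"BK_1": (0.0, 0.0), "McD_1": (4.0, 0.0)}
points = [(1.0, 1.0), (3.5, 0.5)]
print(assign_to_nearest(stores, points))
```

Aggregating socio-economic attributes of the points assigned to each store yields the catchment profiles that the study's regression models relate to store performance.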

    Committee: Jay Lee (Advisor) Subjects:
  • 19. Thomas, Jaelynn Humanitarian Intervention: Motivations and Norms in Cases of Genocide

    Master of Arts (MA), Wright State University, 2024, International and Comparative Politics

    In 1948, the international community came together and promised to "prevent and punish" genocides under the Convention on the Prevention and Punishment of the Crime of Genocide (Genocide Convention). Despite this commitment, states are selective and inconsistent in intervention. While there are many case studies of state motivation for intervention, statistical studies are rare, and a system for predicting which variables are likely to produce humanitarian intervention on a wide scale has not been explored. This study uses a cross-sectional time series estimator model to track whether states became more likely to intervene in genocides over time after signing the Genocide Convention. It also tested whether valuable goods and shared borders made states more prone to intervention. The results indicate that the Genocide Convention has no correlation with states' willingness to intervene in genocides, and they provide no evidence that valuable goods are a factor in humanitarian intervention. The test did support the hypothesis that shared borders make states more likely to intervene in genocides. Future studies should focus on increasing the data pool to include more genocides and testing more variables in hopes of creating a system to predict humanitarian intervention in genocides.

    Committee: Liam Anderson Ph.D. (Advisor); Vaughn Shannon Ph.D. (Committee Member); Carlos Costa Ph.D. (Committee Member) Subjects: International Relations; Political Science
  • 20. Arts, Amanda An HPLC-ESI-QTOF Method to Analyze Polar Heteroatomic Species in Aviation Turbine Fuel via Hydrophilic Interaction Chromatography through Statistical Analysis of Mass Spectral Data

    Doctor of Philosophy (Ph.D.), University of Dayton, 2024, Mechanical Engineering

    Aviation turbine fuel is a complex mixture comprised of thousands of compounds. While organo-oxygen, nitrogen, and sulfur heteroatomic compounds are present in minute quantities (<0.1% by mass), their presence significantly influences fuel thermal stability. In response to the limitations of existing analytical methods, this study developed and validated a novel analytical approach employing hydrophilic interaction liquid chromatography (HILIC) in conjunction with high performance liquid chromatography (HPLC) with electrospray ionization and quadrupole time-of-flight mass spectrometry (ESI-QTOF). The HILIC method offers numerous advantages, including rapid and straightforward sample preparation without the need for extraction, thereby preserving compounds of interest. Moreover, it offers a way to capture precise compound data, enabling chemometric analysis for predicting the behavior of the complex mixture that is aviation turbine fuel. Development of the HILIC method found column configuration, mobile phase composition, solvent gradient, re-equilibration time, injection volume, dilution factor, and sample solvent to be significant factors affecting separation efficiency and repeatability. For a sample dataset, optimized using a single aviation turbine fuel, retention time shift was reduced from 0.4 minutes and 2.0% relative standard deviation (RSD) to approximately 0.1 minutes with an RSD of 0.4% using the newly developed method. In addition, a high number of untargeted molecular features (944) and targeted amines (121) were able to be identified when using optimal method conditions. The optimized HILIC method was used to measure the heteroatom makeup of a set of aviation turbine fuels; the resulting data were then subjected to a rigorous statistical analysis using multiple techniques. 
Statistical analysis tools including principal component analysis (PCA) and fold change (FC) analysis offer a look inside the chemically complex composition of (open full item for complete abstract)
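    Of the techniques named above, fold change (FC) analysis is the simplest: each molecular feature's abundance in one fuel is compared against another, usually on a log2 scale. A minimal sketch with invented feature names and abundance values (the thesis's actual feature tables and thresholds are not reproduced here):

```python
# Per-feature log2 fold change between two fuel samples' feature abundances.
# Positive values mean the feature is enriched in sample B. Toy data only.
import math

def log2_fold_changes(sample_a, sample_b):
    """log2(b/a) for every feature present in both samples."""
    return {
        f: math.log2(sample_b[f] / sample_a[f])
        for f in sample_a.keys() & sample_b.keys()
    }

fuel_a = {"amine_1": 100.0, "phenol_1": 50.0}
fuel_b = {"amine_1": 400.0, "phenol_1": 25.0}
print(log2_fold_changes(fuel_a, fuel_b))
```

Features with large absolute fold changes are the ones flagged for closer inspection, complementing the variance-driven view that PCA gives of the same dataset.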

    Committee: Zachary West (Advisor) Subjects: Analytical Chemistry; Chemical Engineering; Chemistry; Engineering; Mechanical Engineering