Search Results

(Total results 152)


  • 1. Rader, Kara Talking about Narrative Messages: The Interaction between Elaboration and Interpersonal Validation

    Doctor of Philosophy, The Ohio State University, 2020, Communication

    Past research has found that having a discussion with other people about a health message after exposure can generally help increase the effectiveness of the message. While certain factors, such as conversational valence and the relationship between conversational partners, can impact the effectiveness of such a conversation, there is little research into the causal mechanisms that drive the impact of a discussion on attitudinal outcomes. To investigate the potential mechanisms by which a discussion of a health message can lead to more positive outcomes than no discussion, this dissertation turns to the elaboration likelihood model (Petty & Cacioppo, 1986) and self-validation theory (Petty et al., 2002). These theories suggest that discussion of a health message leads to more elaboration about the message. This higher level of elaboration leads to greater thought confidence, which results in more positive attitudes toward the health topic. Additionally, it is theorized that interpersonal discussion of a health message leads to more perceived validation of thoughts, which has also been shown to positively influence thought confidence. Dissertation hypotheses were tested within the context of a narrative about BRCA mutation testing for women in their 20s. To test whether elaboration was higher in interpersonal discussions than in other situations, participants were asked either to discuss a health message or to think carefully and write about the message, or were not directly asked to elaborate on the health message. Additionally, those who were asked to discuss the health message were given exclusively positive feedback by a confederate (whom participants thought was another participant in the study). Results indicate that those who had a discussion did not engage in more elaboration than those who were asked to write about the message or were not given any elaboration instructions, nor was condition related to differences in perceived validation. (open full item for complete abstract)

    Committee: Shelly Hovick PhD (Advisor); Emily Moyer-Guse PhD (Advisor); Jesse Fox PhD (Committee Member) Subjects: Communication
  • 2. Bunch, Nathan Oral Fluid Method Validation for Bowling Green State University

    Master of Science (MS), Bowling Green State University, 2020, Forensic Science

    Oral fluid (OF) is rapidly becoming a new medium for assisting law enforcement in determining whether a subject is driving under the influence of drugs (DUID). Preliminary research shows that drugs can be identified in OF in conjunction with blood, and that drug concentrations in OF and blood correlate. Despite the availability of several roadside devices to test for drugs in OF, these devices are considered presumptive tests. The results from these roadside tests must be confirmed with validated liquid chromatography-mass spectrometry (LC-MS) instrumentation. A validated instrument for confirmation of OF results is important, and this study validates BGSU's Shimadzu 8050 LC-MS. For validation, the instrument must pass various guidelines set by the Scientific Working Group for Forensic Toxicology (SWGTOX) Standard Practices for Method Validation in Forensic Toxicology and others for accuracy, precision, linearity, Limit of Detection (LOD) and Limit of Quantitation (LOQ), carryover, interference, stability, and matrix effects. Due to the COVID-19 global pandemic, only accuracy, precision, linearity, and LOD and LOQ were assessed. The validation studies were conducted over five days (not consecutive) with two runs conducted during each 24-hour period, for a total of 10 runs. A total of 83 different analytes were assessed, covering a broad range of drugs with abuse potential. The results of the validation study showed that the instrument is highly precise for the vast majority of analytes, but the cannabinoids, particularly delta-9-tetrahydrocannabinol (THC), were troublesome. Linearity for all analytes was assessed using the R^2 of the calibration curve, and all analytes were above the 0.95 limit. The LOD and LOQ study showed that the cutoff for each analyte is correctly higher than the factor-of-2 limit for cutoff/LOQ.

    Committee: Jon Sprague RPh, PhD (Advisor); Travis Worst PhD (Committee Member); Phillip Gibbs PhD (Committee Member) Subjects: Biology; Chemistry; Molecular Chemistry; Organic Chemistry; Pharmaceuticals; Pharmacology; Pharmacy Sciences; Physical Chemistry; Statistics; Toxicology
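
    As a rough illustration of the linearity and LOD/LOQ checks described in the abstract above, the sketch below fits a linear calibration curve and applies the common 3.3·σ/slope and 10·σ/slope estimates. The concentrations, responses, and blank standard deviation are hypothetical, not values from the BGSU validation.

      # Illustrative linearity and LOD/LOQ check for one analyte, assuming a simple
      # linear calibration model; all numbers below are made up for demonstration.
      import numpy as np

      conc = np.array([5, 10, 25, 50, 100, 250], dtype=float)            # ng/mL (hypothetical)
      area = np.array([1.1e3, 2.0e3, 5.2e3, 1.01e4, 2.05e4, 5.1e4])      # detector response

      slope, intercept = np.polyfit(conc, area, 1)
      pred = slope * conc + intercept
      ss_res = np.sum((area - pred) ** 2)
      ss_tot = np.sum((area - area.mean()) ** 2)
      r_squared = 1 - ss_res / ss_tot                       # linearity check against the 0.95 limit

      # Common LOD/LOQ estimates from the standard deviation of blank responses
      # (3.3*sigma/slope and 10*sigma/slope); sigma_blank here is hypothetical.
      sigma_blank = 40.0
      lod = 3.3 * sigma_blank / slope
      loq = 10.0 * sigma_blank / slope

      print(f"R^2 = {r_squared:.4f} (acceptance limit 0.95)")
      print(f"LOD ~ {lod:.2f} ng/mL, LOQ ~ {loq:.2f} ng/mL")
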
  • 3. Collingwood, Megan Generalized Shopping Innovativeness Scale: A Cross-Cultural Validation

    Master of Arts in Psychology, Cleveland State University, 2017, College of Sciences and Health Professions

    A new scale, the Generalized Shopping Innovativeness (GSI) scale by Blake and Neuendorf (in Hodges, 2009), was tested for its reliability and validity among two culturally different samples: the United States and China. College students in both countries were administered a survey (either online or on paper, depending on the sample) that contained the GSI as well as other measures, including scales of social desirability bias, frequency of usage, and familiarity with different online shopping vehicles, to name a few. Results of different analyses indicate that the GSI scale is reliable in both samples. The scale's validity is promising in both samples but should be tested further in future research.

    Committee: Brian F. Blake Ph.D. (Advisor); Kimberly Neuendorf Ph.D. (Committee Member); Michael Horvath Ph.D. (Committee Member) Subjects: Psychology
  • 4. Panozzo, Kimberly A Validation of NASS Crop Data Layer in the Maumee River Watershed

    Master of Arts, University of Toledo, 2016, Geography

    It is suspected that corn and soybean production in the Maumee Watershed has contributed to nutrient loading into Lake Erie, thereby affecting the frequency and duration of toxic algal blooms (Dolan 1993; Michalak et al. 2012). Accurate crop type estimation is important in order to determine the potential impact on the lake and assess methods to reduce excess nutrient loading. Increasingly, the National Agricultural Statistics Service (NASS) Crop Data Layer (CDL) is being used as a primary input for agricultural research and crop estimation modeling; therefore, assessing the accuracy of the CDL is imperative. This study aims to validate the CDL, assess accuracy differences at multiple spatial scales, and examine the efficiency of using the CDL for future research in the region. Results of CDL validation using in situ field observations from 2011 and 2012 indicate an overall accuracy of 94% and 92%, respectively, and KHAT (kappa) accuracy of 90% (2011) and 86% (2012). Crop-specific accuracy for corn, soy, and wheat also resulted in considerably high user accuracy values, with slight differences between years. Accuracy measures vary by region and by year; however, in each circumstance analyzed, the differences were not significant. Of these measurable differences, the 2012 comparison contained a higher degree of difference, which may be attributed to drought in the region that year. It is concluded that NASS's CDL is an effective and efficient product for agricultural research.

    Committee: Kevin Czajkowski PHD (Committee Chair); P.L. Lawrence PHD (Committee Member); Dan Hammel PHD (Committee Member) Subjects: Agriculture; Geographic Information Science; Geography; Land Use Planning; Remote Sensing
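
    The overall and KHAT (kappa) accuracies quoted in the abstract above come from an error (confusion) matrix of reference field observations versus CDL classifications. A minimal sketch of both statistics, using a hypothetical three-class matrix rather than the Maumee Watershed data:

      # Overall accuracy and Cohen's kappa (khat) from a crop-type confusion matrix.
      # The matrix below is hypothetical, not the study's field data.
      import numpy as np

      # rows = reference (field observations), columns = CDL classification
      # classes: corn, soy, wheat
      cm = np.array([[120,   5,  2],
                     [  6, 140,  4],
                     [  1,   3, 60]], dtype=float)

      n = cm.sum()
      overall_accuracy = np.trace(cm) / n

      # Expected chance agreement from the row and column marginals
      expected = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2
      khat = (overall_accuracy - expected) / (1 - expected)

      print(f"overall accuracy = {overall_accuracy:.3f}, khat = {khat:.3f}")
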
  • 5. Kolluri, Murali Mohan Developing a validation metric using image classification techniques

    MS, University of Cincinnati, 2014, Engineering and Applied Science: Mechanical Engineering

    The main objective of this thesis work was to investigate different image classification and pattern recognition methods to try to develop a validation metric. A validation metric is a means of comparison between two sets of numerical information. The numerical information could represent a set of measurements made on a system or its internal characteristics derived from such measurements. A validation metric (v-metric) is used to determine the correctness with which one of the data-sets is able to describe the other and to quantify the extent of this correctness. A moment descriptor method has been identified from among the most widely used image classification and pattern recognition methods as the approach most likely to yield an effective validation metric, for reasons discussed in subsequent chapters. Different sets of Orthogonal Polynomials have been investigated as kernel functions for the aforementioned method to generate descriptors that depict the most significant features of the data-sets being compared. The algorithms developed as such have been verified using standard gray-scale and color images to establish their ability to reconstruct the image intensity function using a subset of the features extracted. The above Orthogonal Polynomials have then been used to extract features from two measured data-sets, and means to develop a v-metric from these descriptors have been explored. A study of the algorithms thus developed using different Orthogonal Polynomials has been made to compare their effectiveness, as well as their shortcomings, as kernel functions for developing a v-metric. An alternate form of the existing two-dimensional moments has been proposed to generate features that are more conveniently compared against each other. This method has been examined to determine its efficiency in reducing the amount of information that needs to be used in the final comparison for multiple pairs of data-sets. A way to effect such a comparison using singular (open full item for complete abstract)

    Committee: Randall Allemang Ph.D. (Committee Chair); David L. Brown Ph.D. (Committee Member); Allyn Phillips Ph.D. (Committee Member) Subjects: Mechanics
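
    The moment-descriptor idea in the abstract above amounts to projecting a 2D intensity function onto products of orthogonal polynomials and keeping the low-order coefficients as features. A minimal sketch using Legendre polynomials on a synthetic field; the thesis's actual kernels, normalization, and data differ:

      # Low-order Legendre moments of a 2D intensity field as a compact descriptor.
      # The "image" is synthetic; kernel choice and normalization are illustrative.
      import numpy as np
      from numpy.polynomial import legendre

      def legendre_moments(img, order=4):
          ny, nx = img.shape
          x = np.linspace(-1, 1, nx)
          y = np.linspace(-1, 1, ny)
          # P_k evaluated on the grid, k = 0..order
          Px = np.array([legendre.legval(x, [0] * k + [1]) for k in range(order + 1)])
          Py = np.array([legendre.legval(y, [0] * k + [1]) for k in range(order + 1)])
          dx, dy = x[1] - x[0], y[1] - y[0]
          moments = np.zeros((order + 1, order + 1))
          for p in range(order + 1):
              for q in range(order + 1):
                  norm = (2 * p + 1) * (2 * q + 1) / 4.0        # Legendre normalization
                  moments[p, q] = norm * np.sum(img * np.outer(Py[q], Px[p])) * dx * dy
          return moments

      img = np.outer(np.hanning(64), np.hanning(64))            # synthetic intensity field
      print(legendre_moments(img, order=3).round(4))
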
  • 6. Kohli, Karan Structural Dynamics Model Calibration and Validation of a Rectangular Steel Plate Structure

    MS, University of Cincinnati, 2014, Engineering and Applied Science: Mechanical Engineering

    In this fast-growing world, where technological advancements are influenced by more efficient and less expensive designs, there is a significant need to adopt methods and techniques that can support this process. To address this requirement, Finite Element (FE) modeling techniques were introduced as a viable solution. Specifically, in the case of mechanical structures, where prototype building and physical testing require substantial time and cost, developing an FE model that can accurately predict a structure's dynamic characteristics is key. However, with more reliance on FE models as true representatives of physical structures, there is a significant need to evaluate such models. This evaluation involves comparing the results obtained from simulating FE models with the results achieved from mathematical models and real-world test data. In other words, such models need to undergo 'verification' and 'validation' before they can be used to predict reliable results for the future. In this research, a model verification, calibration, and validation case study is performed on a rectangular steel plate structure. The case study was based on a systematic approach that was used in an attempt to follow the Guidelines for Verification and Validation (V&V) published recently by the American Society of Mechanical Engineers (ASME) and the American Institute of Aeronautics and Astronautics (AIAA). The accuracy of the model and the validation process was confirmed by modal correlation when the structure and the calibrated model were subjected to perturbed mass and constrained boundary conditions. The validation criteria were achieved using the calibrated model, and the results obtained through the validation criteria helped quantify the accuracy of the developed model under different boundary conditions.

    Committee: Randall Allemang Ph.D. (Committee Chair); Allyn Phillips Ph.D. (Committee Member); Kumar Vemaganti Ph.D. (Committee Member) Subjects: Mechanical Engineering
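
    Modal correlation between test and calibrated FE mode shapes, as mentioned in the abstract above, is commonly quantified with the Modal Assurance Criterion (MAC); the abstract does not name the specific metric, so treating it as MAC is an assumption. A minimal sketch with placeholder mode shapes:

      # Modal Assurance Criterion (MAC) between two sets of mode shapes.
      # Mode-shape vectors here are random placeholders, not plate test data.
      import numpy as np

      def mac(phi_test, phi_fe):
          """MAC matrix between columns of two mode-shape matrices (dof x modes)."""
          num = np.abs(phi_test.conj().T @ phi_fe) ** 2
          den = np.outer(np.sum(np.abs(phi_test) ** 2, axis=0),
                         np.sum(np.abs(phi_fe) ** 2, axis=0))
          return num / den

      rng = np.random.default_rng(0)
      phi_test = rng.normal(size=(24, 3))                       # 24 measurement DOFs, 3 test modes
      phi_fe = phi_test + 0.05 * rng.normal(size=(24, 3))       # slightly perturbed "FE" modes
      print(mac(phi_test, phi_fe).round(3))                     # diagonal near 1.0 = good correlation
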
  • 7. YU, CHENGGANG A SUB-GROUPING METHODOLOGY AND NON-PARAMETRIC SEQUENTIAL RATIO TEST FOR SIGNAL VALIDATION

    PhD, University of Cincinnati, 2002, Engineering : Nuclear and Radiological Engineering

    On-line signal validation is essential for the safe and economic operation of a complicated industrial system such as a nuclear power plant. Various signal validation methods based on empirical signal estimation have been developed and successfully used. The first part of the thesis addresses a common and unavoidable problem for these methods - fault propagation, which causes false identification of healthy signals as faulty ones because of faults existing in other signals. This effect is especially serious when faults occur in multiple signals and/or during system transients. A sub-grouping technique is presented in the thesis to prevent the effect of fault propagation in general signal validation methods. Specifically, two methods, Subgroups Consistency Check (SCC) and Subgroups Voting (SV), are developed. Their effectiveness is demonstrated by using the well-known Multivariate State Estimation Technique (MSET) as a general method of signal estimation. To further improve the performance of MSET estimation, a procedure called Feedback Once (FBO) is also developed. All these new methods are tested and compared with MSET by using real transient data from a reactor startup process in a nuclear power plant. The results show that false identification of signals caused by fault propagation is significantly reduced by the two sub-grouping methods, and the FBO method is able to improve the performance of MSET estimation to some extent. The results demonstrate that implementation of these new methods can lead to an improved signal validation technique that remains effective even when faults occur in multiple signals during system transients. The other major contribution is the improvement of the statistical test used for signal validation. The Sequential Probability Ratio Test (SPRT) is a popular method that has been widely used in many signal validation methods. However, the assumption of SPRT is too stringent to satisfy in practice, which may cause the false identification rate ex (open full item for complete abstract)

    Committee: Dr. Bingjing Su (Advisor) Subjects: Engineering, Nuclear
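
    The classical Sequential Probability Ratio Test that the abstract above builds on can be written in a few lines for the textbook case of detecting a mean shift in a Gaussian residual. The thresholds, error rates, and data below are illustrative only, not the thesis's non-parametric variant:

      # Wald SPRT for a shift in the mean of a Gaussian signal residual.
      # Parameters and data are illustrative placeholders.
      import numpy as np

      def sprt(residuals, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.01, beta=0.01):
          """Return decision, number of samples used, and final log-likelihood ratio."""
          upper = np.log((1 - beta) / alpha)
          lower = np.log(beta / (1 - alpha))
          llr = 0.0
          for i, x in enumerate(residuals, start=1):
              llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
              if llr >= upper:
                  return "H1 (fault)", i, llr
              if llr <= lower:
                  return "H0 (healthy)", i, llr
          return "undecided", len(residuals), llr

      rng = np.random.default_rng(1)
      print(sprt(rng.normal(0.0, 1.0, size=200)))   # healthy residuals: expect H0
      print(sprt(rng.normal(1.0, 1.0, size=200)))   # shifted residuals: expect H1
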
  • 8. Gurram, Mani Rupak Meta-Learning-Based Model Stacking Framework for Hardware Trojan Detection in FPGA Systems

    Master of Science (MS), Wright State University, 2024, Computer Science

    In today's technological landscape, hardware devices are integral to critical applications such as industrial automation, autonomous vehicles, and medical equipment, relying on advanced platforms like FPGAs for core functionalities. However, the multi-stage manufacturing process, often distributed across various foundries, introduces substantial security risks, notably the potential for hardware Trojan insertion. These malicious modifications compromise the reliability and safety of hardware systems. This research addresses the detection of hardware Trojans through side-channel analysis, utilizing power and electromagnetic signal data, combined with meta-learning techniques, specifically model stacking. By employing diverse base models and a meta-model to consolidate predictions, this non-invasive approach effectively identifies Trojans without requiring direct access to internal circuitry. The methodology demonstrates robust classification capabilities, achieving an accuracy of 88.0%, precision of 81.0%, and recall of 95.0%, even on previously unseen data. The results highlight the superior performance of meta-learning over traditional detection methods, offering an efficient and reliable solution to enhance hardware security.

    Committee: Fathi Amsaad Ph.D. (Advisor); Junjie Zhang Ph.D. (Committee Member); Huaining Cheng Ph.D. (Committee Member); Nitin Pundir Ph.D. (Committee Member); Thomas Wischgoll Ph.D. (Other); Subhashini Ganapathy Ph.D. (Other) Subjects: Computer Engineering; Computer Science; Electrical Engineering
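
    A minimal model-stacking sketch in the spirit of the abstract above: several base learners feed a meta-model that consolidates their predictions, and the same metrics (accuracy, precision, recall) are reported. Synthetic features stand in for the power/EM side-channel data, and the particular estimators are assumptions:

      # Model stacking: base learners + a meta-model, evaluated on held-out data.
      # Synthetic features stand in for side-channel traces.
      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier, StackingClassifier
      from sklearn.linear_model import LogisticRegression
      from sklearn.svm import SVC
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import accuracy_score, precision_score, recall_score

      X, y = make_classification(n_samples=1000, n_features=40, n_informative=10,
                                 random_state=0)                # 1 = Trojan-infected (hypothetical)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

      stack = StackingClassifier(
          estimators=[("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
                      ("svm", SVC(probability=True, random_state=0))],
          final_estimator=LogisticRegression(),                 # meta-model
      )
      stack.fit(X_tr, y_tr)
      pred = stack.predict(X_te)
      print(f"accuracy={accuracy_score(y_te, pred):.2f}, "
            f"precision={precision_score(y_te, pred):.2f}, "
            f"recall={recall_score(y_te, pred):.2f}")
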
  • 9. Ghimire, Saugat Design, Optimization, Validation, and Detailed Flow Physics Analysis of a CO2 Axial Compressor

    PhD, University of Cincinnati, 2024, Engineering and Applied Science: Aerospace Engineering

    The move towards renewable energy has highlighted the need for large-scale, environmentally friendly energy storage solutions. Among these, the Supercritical Carbon Dioxide (sCO2) power cycle is emerging as a promising technology for advanced energy conversion. The effectiveness of such systems depends heavily on the compressor's performance. Using optimization-based methods, a multistage axial compressor has been designed, and its first stage has been built and tested experimentally. Through a series of detailed design iterations, optimization strategies, and 3D CFD analyses, the compressor's geometric and operational parameters were fine-tuned to address the unique challenges posed by operation with CO2. Key findings highlight the successful implementation of design optimization that significantly reduces aerodynamic losses and improves the overall efficiency of the compressor stages. The optimized compressor demonstrates robust performance across a range of operating conditions, with a particular focus on improved stall margin, which emphasizes the potential of sCO2 technology in contributing to efficient and sustainable energy systems. Further detailed CFD studies analyzing cavity effects in shrouded configurations, tip clearance effects, real gas effects, and Reynolds number effects were performed. Experimental validations, conducted at the University of Notre Dame Turbomachinery Laboratory, confirm the CFD predictions and showcase the practical viability of the compressor design and the approach used. This work not only advances the state of the art in turbomachinery design for supercritical fluids but also lays a foundation for future research into the integration of sCO2 and real-gas compressors in renewable energy systems and industrial applications. The insights gained from this study underscore the critical importance of tailored design and optimization strategies in overcoming the thermophysical challenges associated (open full item for complete abstract)

    Committee: Mark Turner Sc.D. (Committee Chair); Daniel Cuppoletti Ph.D. (Committee Member); Kelly Cohen Ph.D. (Committee Member); Jeong-Seek Kang Ph.D. (Committee Member); Shaaban Abdallah Ph.D. (Committee Member) Subjects: Aerospace Engineering
  • 10. Rickman, William Surrogate Markov Models for Validation and Comparative Analysis of Proper Orthogonal Decomposition and Dynamic Mode Decomposition Reduced Order Models

    Master of Science, Miami University, 2025, Mechanical and Manufacturing Engineering

    Reduced order modeling (ROM) methods, such as those based upon Proper Orthogonal Decomposition (POD) and Dynamic Mode Decomposition (DMD), offer data-based turbulence modeling with potential applications for flow control. While these models are often cheaper than numerical approaches, their results require validation against source data. Within the literature, the metrics and standards used to validate these models are often inconsistent. Chabot (2014) produced a data-driven framework for validating these ROMs that used surrogate Markov models (SMMs) to compare how the system dynamics evolved rather than how any single metric evolved. These SMMs were constructed by clustering the flow data into states of suitably similar flow fields, and the Markov model then mapped how likely each state was to transition into another. While this method was successful, some uncertainty remained in how the outlier states within this clustering scheme were determined. Additionally, the study only examined the application of this procedure to POD-Galerkin ROMs. This study aims to tie the outlier state determination directly to the models' parent data. The study will also apply this procedure to ROMs generated from DMD to investigate how this framework's effectiveness carries over to different classes of ROMs.

    Committee: Edgar Caraballo (Advisor); Andrew Sommers (Committee Member); Mehdi Zanjani (Committee Member) Subjects: Aerospace Engineering; Fluid Dynamics; Mathematics; Mechanical Engineering; Statistics
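
    The surrogate Markov model construction described in the abstract above can be sketched as: cluster time-ordered flow snapshots into discrete states, then estimate the state-to-state transition matrix from consecutive cluster labels. The snapshot data and clustering settings below are synthetic placeholders:

      # Surrogate Markov model sketch: cluster snapshots into states, then build
      # the empirical transition matrix from the time-ordered labels.
      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(2)
      snapshots = rng.normal(size=(500, 20))     # 500 time steps x 20 POD/DMD coefficients (synthetic)

      n_states = 5
      labels = KMeans(n_clusters=n_states, n_init=10, random_state=0).fit_predict(snapshots)

      # Row-stochastic transition matrix: P[i, j] = Pr(next state j | current state i)
      counts = np.zeros((n_states, n_states))
      for a, b in zip(labels[:-1], labels[1:]):
          counts[a, b] += 1
      P = counts / counts.sum(axis=1, keepdims=True)
      print(P.round(2))
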
  • 11. Gray, Justin Development of a GC-MS Method to Quantify Fecal Short and Branched Chain Fatty Acids in Case-control Study of Inflammatory Bowel Disease Patients

    Doctor of Philosophy in Clinical-Bioanalytical Chemistry, Cleveland State University, 2024, College of Arts and Sciences

    Inflammatory bowel disease (IBD) is a non-contagious, chronic inflammation of the gastrointestinal (GI) tract classified into two subgroups, Crohn's Disease (CD) and Ulcerative Colitis (UC). IBD is a disease of the industrialized world, and its incidence and prevalence have increased worldwide. Short and branched chain fatty acids (SCFAs, BCFAs) produced by the gut microbiome are implicated in the immune system's inflammatory response. Chapter I summarizes our current understanding of SCFAs and BCFAs in the GI tract, fermentative pathways, etiology, inflammatory pathways relevant to the GI tract, and the beneficial impacts of SCFAs and BCFAs. Chapter II discusses the development and complete validation of a high-throughput, fast, and reliable gas chromatography-mass spectrometry (GC-MS) method with a simplified pre-treatment to quantify SCFAs and BCFAs in human stool. Chapter III summarizes a case-control study of 74 stool samples (21 healthy; 24 UC; 29 CD) measuring acetic, propionic, isobutyric, butyric, isovaleric, valeric, and caproic acid (μg/g stool) using the GC-MS method developed. Significant differences were observed for propionic, butyric, and valeric acid (p < 0.05; all p values < 0.001) between healthy and IBD groups. Receiver operating characteristic (ROC) analysis resulted in an area under the curve (AUC) of 0.96 (95% CI: 0.89 – 0.98, p < 0.001). Significant differences were observed for propionic (p < 0.05; p = 0.018) and isobutyric acid (p < 0.05; p = 0.002) between UC and CD subgroups. ROC analysis resulted in an AUC of 0.83 (95% CI: 0.66 – 0.92, p < 0.001). Acetic acid served as an endogenous internal standard to normalize for watery stools because of its abundance and non-significant difference between groups. Chapter IV discusses a literature review of 11 published case-control studies quantifying SCFAs and BCFAs in stool between healthy, IBD, UC, and CD subgroups. Valeric and butyric acid were increased in the stool of healthy groups when compared to IBD groups. (open full item for complete abstract)

    Committee: Baochuan Guo (Advisor); Aimin Zhou (Committee Member); Chandrasekhar Kothapalli (Committee Member); John Turner (Committee Member); Xue-Long Sun (Committee Member) Subjects: Biochemistry; Experiments; Health Care; Immunology; Medicine; Microbiology; Pathology
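
    The ROC/AUC analysis reported in the abstract above can be reproduced in miniature with a single hypothetical marker (e.g., one fecal SCFA concentration) separating healthy from IBD samples; the simulated values below are not study data:

      # ROC/AUC for a single hypothetical marker separating healthy from IBD samples.
      import numpy as np
      from sklearn.metrics import roc_auc_score, roc_curve

      rng = np.random.default_rng(3)
      healthy = rng.normal(25, 6, size=21)      # ug/g stool, simulated
      ibd = rng.normal(14, 6, size=53)

      y_true = np.r_[np.zeros(len(ibd)), np.ones(len(healthy))]   # 1 = healthy
      scores = np.r_[ibd, healthy]

      auc = roc_auc_score(y_true, scores)
      fpr, tpr, _ = roc_curve(y_true, scores)
      print(f"AUC = {auc:.2f} over {len(fpr)} threshold points")
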
  • 12. Folger, Timothy Conceptualizing Validity and Validation in Educational and Psychological Testing: A Three Article Dissertation

    Doctor of Philosophy (Ph.D.), Bowling Green State University, 2024, Leadership Studies

    This dissertation aims to bridge the gap between validity theory and the practice of validation. The dissertation employs a three-article approach. Following the introduction in Chapter I, three independent manuscripts representing three empirical studies are presented (i.e., Chapters II – IV). Each chapter is a stand-alone publishable manuscript, consisting of an introduction, review of literature, methods, results or findings, and discussion and implications. The Chapter II study used the Delphi method to explore measurement experts' conceptions of validity and validation. The purpose of the study was to define and reify three key aspects of validity and validation, namely test-score interpretation, test-score use, and the claims supporting interpretation and use. Definitions were developed through multiple iterations of data collection and analysis. Clarifying the language used when conducting validation may make validation more accessible to a broader audience, including but not limited to test developers, test users, and test consumers. The Chapter III study used a phenomenological research design to explore K-12 teachers' perceptions of compassion and compassionate leadership. The purpose of this study was to conceptualize and operationalize compassionate leadership from the perspective of K-12 teachers. Data were collected through semi-structured interviews and analyzed using thematic analysis. Study findings highlight how social interaction, supportive relationships, and workplace culture mediate compassion and compassionate leadership in the context of K-12 education. The Chapter IV validation study used a quantitative approach to examine the Compassionate Leader Behavioral Index for Educators (CLBI-E), an instrument designed to measure compassionate leadership in PreK-12 education. CLBI-E development is described in the Chapter IV study, and Rasch (1960) measurement was used to evaluate validity claims related to the psychometric properties of (open full item for complete abstract)

    Committee: Judy May Ph.D. (Committee Chair); Kristina LaVenia Ph.D. (Committee Member); Erin Krupa Ph.D. (Committee Member); Paul Johnson Ph.D. (Committee Member); Hee Soon Lee Ph.D. (Committee Member) Subjects: Educational Leadership; Educational Tests and Measurements
  • 13. Ansari, Mohd Sohib Hydrologic Monitoring to Simulate Water Quality in Mill Creek Watershed Using Personal Computer Storm Water Management Model (PCSWMM)

    Master of Science in Engineering, Youngstown State University, 2024, Department of Civil/Environmental and Chemical Engineering

    The Mill Creek watershed is located in Northeast Ohio and covers an area of 78.3 square miles within the Mahoning River basin. The river has been experiencing significant water quality problems due to pollution from point and nonpoint source contributions from its tributaries. Before joining the Mahoning River, the river flows through several areas, including the City of Columbiana, Beaver Township, Boardman Township, and Youngstown. Mill Creek comprises seven major tributaries, namely Anderson Run, Cranberry Run, Indian Run, Bears Run, Ax Factory Run, Sawmill Run, and Turkey Run, all of which contribute to the degradation of the river's water quality in terms of algal blooms, turbidity, and bacterial contamination. The water quality of rivers is significantly affected by several sources of contamination, such as combined sewer overflows, failing septic systems, animal waste, and runoff from agricultural and urban areas. Despite several studies conducted in the past, a hydrologic and hydraulic investigation in the context of water quality modeling has not yet been conducted for Mill Creek. To address this concern, monitoring stations were established at different locations along the river to record real-time flow depth data using HOBO loggers. In addition, sporadic water quality data from the past and the recent data collected by the Environmental Science Program at YSU have been used for water quality calibration and validation. The hydrologic and hydraulic model was developed using the Personal Computer Storm Water Management Model (PCSWMM). Data sourced from the National Oceanic and Atmospheric Administration (NOAA) National Climatic Data Center (NCDC), the Digital Elevation Model (DEM) from the United States Geological Survey (USGS), land cover data from the National Land Cover Datasets (NLCD), and soil data from the United States Department of Agriculture (USDA) were utilized to construct the model. The calibration and validation of the model were carrie (open full item for complete abstract)

    Committee: Suresh Sharma PhD (Advisor); Felicia Armstrong PhD (Committee Member); Sahar Ehsani PhD (Committee Member) Subjects: Civil Engineering; Environmental Engineering; Hydrology
  • 14. Heard, Cherish Preliminary Psychometric Properties of the Experience of Cognitive Intrusion of Pain (ECIP-A) Scale in Pediatric Patients with Chronic Pain

    MA, University of Cincinnati, 2024, Arts and Sciences: Psychology

    Chronic pain can disrupt adolescents' attention. Such interruptions, in turn, may negatively impact one's overall functioning, causing frustration and distress when trying to engage in important tasks (e.g., schoolwork). This experience has been referred to as cognitive intrusion of pain (Attridge et al., 2015). To date, only one adult self-report measure of cognitive intrusion of pain exists: the Experience of Cognitive Intrusion of Pain Scale (ECIP). This is a critical gap in the literature, as there is currently no known measure of experienced cognitive intrusion of pain for pediatric chronic pain patients. The current study examined the psychometric properties of an adapted version of the ECIP (ECIP-A) among children and adolescents (ages 11-18) with pediatric chronic pain. Data were collected from pediatric chronic pain patients (N = 194) presenting for treatment at a tertiary pain clinic at a large Midwestern children's hospital. Exploratory analyses were conducted for a deeper understanding of the current sample, as this is the first study to assess the ECIP-A in pediatric patients with chronic pain. The current sample of chronic pain patients was 81.9% self-identified Non-Hispanic White and 77.5% female. There were no significant differences in ECIP-A scores between males and females, across patient age, or across primary pain condition. ECIP-A scores and Pain Frequency-Severity-Duration composite scores were significantly correlated, indicating that as pain symptoms increase, so does cognitive intrusion of pain (r = 0.23, p = 0.002). Confirmatory factor analysis (CFA) results supported a one-factor model for the ECIP-A, with excellent model fit (χ2 = 30.24, df = 23, p = 0.14; CFI = 0.99, TLI = 0.99, RMSEA = 0.042 (90% CI 0.00 - 0.078), and SRMR = 0.021). Results suggest excellent internal consistency of ECIP-A scores (Cronbach's alpha = 0.94). Pearson correlations indicated good convergent and discriminant validity, as the ECIP-A was moderately and p (open full item for complete abstract)

    Committee: Kristen Jastrowski Mano Ph.D. (Committee Chair); Quintino Mano Ph.D. (Committee Member); Cathleen Stough Ph.D. (Committee Member) Subjects: Psychology
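
    Cronbach's alpha, the internal-consistency statistic reported for the ECIP-A above, is computed from the item variances and the variance of the total score. A minimal sketch on a simulated respondents-by-items matrix (not the study data):

      # Cronbach's alpha from an item-response matrix (rows = respondents, cols = items).
      # The responses are simulated, not the ECIP-A data.
      import numpy as np

      def cronbach_alpha(items):
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_vars = items.var(axis=0, ddof=1).sum()
          total_var = items.sum(axis=1).var(ddof=1)
          return (k / (k - 1)) * (1 - item_vars / total_var)

      rng = np.random.default_rng(4)
      latent = rng.normal(size=(194, 1))                         # shared trait
      responses = latent + 0.5 * rng.normal(size=(194, 10))      # 10 correlated items
      print(f"alpha = {cronbach_alpha(responses):.2f}")
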
  • 15. Hopkins, Erin Implicit Pitch-Height Cross-Modal Correspondence and Music Reading: Validation of the Pitch-Height Stroop Test

    Doctor of Philosophy, Case Western Reserve University, 2024, Music Education

    The purpose of this study was to determine the validity of the Pitch-Height Stroop Test, a novel measure of implicit psychological association between pitch and multiple dimensions of the construct of height, and to explore potential relationships between its linguistic dimension, its perceptual dimension, and music reading ability. The Pitch-Height Stroop Test (PHST) is a response time measure that includes four tasks: a baseline pitch classification task, an auditory pitch-word Stroop-like task, an auditory-visual pitch-location Stroop-like task, and an auditory-visual pitch-text Stroop-like task. In each of the Stroop-like tasks, participants indicate the pitch of a tone by pressing a corresponding button while ignoring a simultaneously presented indicator of linguistic or visuospatial height, which may or may not be congruent with the pitch. English-speaking adult singers (n = 50) completed the PHST as well as a demographic and musical background questionnaire, the Profile of Music Perception Skills (PROMS) pitch and melody subtests, an Eriksen flanker task, and the Vocal Sight-Reading Inventory. This battery of measures enabled determination of the PHST's validity and reliability and examination of relationships between variables. Results indicated that the PHST was a valid and reliable measure for this population for purposes of group-level analysis. Magnitudes of pitch-height cross-modal correspondence as measured by the three Stroop-like tasks appeared to reflect differences between unisensory and cross-sensory processing and showed moderate correlation between its linguistic and perceptual forms. Regression analysis indicated that perceptual pitch-height cross-modal correspondence was a positive predictor of sight-singing fluency, contributing approximately 10% of the sight-singing score variance. Meanwhile, linguistic pitch-height cross-modal correspondence did not appear to contribute to sight-singing fluency. Correlation with musical background variab (open full item for complete abstract)

    Committee: Ryan Scherber (Committee Chair); Nathan Kruse (Committee Member); Lisa Koops (Committee Member); Robert Greene (Committee Member) Subjects: Cognitive Psychology; Music; Music Education
  • 16. Shockley McCarthy, Karla School Social Work: Promoting Teacher Occupational Well-Being Through Teacher-Student Relationships: The Teacher Teacher-Student Relationship Motivation Scale

    Doctor of Philosophy, The Ohio State University, 2023, Social Work

    The general role of the school social worker is to provide services and supports that address barriers to the academic, social, emotional, and physical well-being of all students. This role includes all aspects of the school environment and climate. Teachers play an essential role in creating a culture where students believe they are capable and belong. Teacher well-being is critical to optimizing student well-being and outcomes, and promoting teacher well-being should be a consideration in fostering healthy schools. The positive interactions and relationships that teachers cultivate with students have a positive impact on the well-being of both parties. While research has acknowledged the significance of teacher-student relationships, the majority of studies have primarily focused on the student. There is a notable gap in understanding the mechanisms behind developing teacher-student relationships and the individual and ecological factors that either foster or hinder them. Studying this phenomenon with the social work person-in-environment perspective provides a comprehensive examination that includes individual, school, and system-level factors. School social workers' training in systems, mental health, and psychology positions them to assess and intervene to support teachers' relational efficacy. This study had two specific aims: (a) explore teachers' perspectives of teacher-student relationships and the factors affecting building and maintaining positive teacher-student relationships in the school environment; (b) utilize the teachers' perspective to design and validate a scale to measure K-12 teachers' feelings of capability and motivation to establish and maintain positive teacher-student relationships. This dissertation applied an exploratory sequential mixed methods approach to research, develop, and validate a measure of facilitators and barriers to teachers' motivation to establish and maintain positive relationships with students. Fo (open full item for complete abstract)

    Committee: Natasha Bowen (Committee Chair); Kisha Radliff (Committee Member); Bridget Freisthler (Committee Member) Subjects: Education; Occupational Health; Social Work; Teacher Education; Teaching
  • 17. Phillips, Andrew Contextualizing and Validating Five-Factor Model Scales to Measure Personality Behaviors of Teaching Assistants in First-Year Engineering Classroom Contexts

    Doctor of Philosophy, The Ohio State University, 2023, Engineering Education

    Teaching Assistants (TAs) play significant roles in many first-year engineering programs, including interacting with students in the classroom as the students learn engineering fundamentals and navigate their first year of college. An important element of any social interaction is the personality of the people involved, and the way personality manifests as behaviors may change in different social contexts. Although there have been studies about the personality of teachers and students, a Systematic Literature Review provides motivation for investigating the personality behaviors of TAs in first-year engineering classroom contexts specifically. The Five-Factor Model (FFM) of personality is selected as the theoretical framework because of its rigorous structural validity, the availability of FFM scales online, and previous successful contextualization of FFM scales. The dimensions of personality in the FFM are: Extraversion, Agreeableness, Conscientiousness, Neuroticism, and Openness. Two sets of FFM scales (a 50-item version and a 100-item version) are contextualized for TAs teaching in first-year engineering by assessing face validity and content validity, resulting in one 90-item version (18 items per scale). TAs from four different first-year engineering programs respond to these 90 items to collect data for further validation. Item reduction analysis is conducted to remove 8 items from each scale so that each scale retains a final 10 items. Construct validity and reliability are assessed, and Classical Test Theory (CTT) item analysis and Item Response Theory (IRT) Rasch analysis are performed to identify items for removal. The final 10-item reduced scales are evaluated again for construct validity, reliability, and criterion validity, and CTT item analysis and IRT Rasch analysis are performed again to evaluate the validity of the final instrument. Overall, the properties of the final 50-item (10 items per personality factor scale) First-Year Engineering Teaching Ass (open full item for complete abstract)

    Committee: Krista Kecskemety (Advisor); Rachel Kajfez (Committee Member); Ann Christy (Advisor) Subjects: Education; Engineering; Personality Psychology; Teaching
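
    One standard piece of the Classical Test Theory item analysis mentioned in the abstract above is the corrected item-total correlation, often used to flag items for removal. The sketch below uses simulated Likert-style responses, not the TA survey data, and omits the IRT/Rasch side of the analysis:

      # Corrected item-total correlations for a simulated 18-item Likert scale.
      import numpy as np

      rng = np.random.default_rng(5)
      trait = rng.normal(size=300)
      items = np.clip(np.rint(3 + trait[:, None] + rng.normal(scale=1.0, size=(300, 18))),
                      1, 5)                                      # responses on a 1-5 scale

      def corrected_item_total(items):
          """Correlation of each item with the total score excluding that item."""
          total = items.sum(axis=1)
          return np.array([np.corrcoef(items[:, j], total - items[:, j])[0, 1]
                           for j in range(items.shape[1])])

      r_it = corrected_item_total(items)
      print("lowest corrected item-total correlations:", np.sort(r_it)[:3].round(2))
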
  • 18. Bijukshe, Shuvra Monitoring, Modeling and Implementation of Best Management Practices to Reduce Nutrient Loadings in the Atwood and Tappan Lake Watersheds in Tuscarawas Basin, Ohio

    Master of Science in Engineering, Youngstown State University, 2023, Department of Civil/Environmental and Chemical Engineering

    Water quality in lakes and reservoirs has been significantly degraded due to anthropogenic activities, including point and non-point source pollution. Agricultural practices, particularly excessive fertilizer application, have been consistently identified as a major contributor to water contamination in lakes and reservoirs. The escalation of nutrient loading into water bodies has raised serious concerns regarding eutrophication in lakes, as well as the potability of drinking water and other consumptive uses of water. In order to address these problems, Best Management Practices (BMPs) have been implemented globally to reduce nutrient loadings and improve water quality in lakes and reservoirs. This study aims to assess the effectiveness of BMPs in reducing nutrient levels in the Atwood and Tappan Lakes of the Tuscarawas basin in Ohio by monitoring sites for water quality sampling and using the Soil and Water Assessment Tool (SWAT) for watershed modeling. Stream flow data from five USGS gauge stations were gathered for multi-site calibration and validation of the model, whereas water quality data from five representative stations within the watersheds were monitored to calibrate the model for nutrients. The performance of the model for streamflow calibration at various USGS gauging stations was satisfactory, with Nash-Sutcliffe Efficiency (NSE) values ranging from 0.54 to 0.79 during calibration and 0.50 to 0.89 during validation. However, due to the limited availability of water quality data, the nutrient calibration was not as good as the hydrological calibration. Subsequently, a scenario analysis was carried out using the calibrated SWAT model to assess the effectiveness of different management practices in reducing nutrient levels from the sub-watersheds. Management practices such as filter strips, grass waterways, fertilizer reduction, crop rotation, and cover crops were considered for analysis based on active consultation with local stake (open full item for complete abstract)

    Committee: Suresh Sharma PhD (Advisor); Sahar Ehsani PhD (Committee Member); Peter Kimosop PhD (Committee Member) Subjects: Civil Engineering
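
    The Nash-Sutcliffe Efficiency used to judge the SWAT streamflow calibration above is a one-line statistic comparing model error against the variance of the observations. A minimal sketch with hypothetical observed and simulated flows:

      # Nash-Sutcliffe Efficiency (NSE) for observed vs. simulated streamflow.
      # Flow series are synthetic placeholders, not the Tuscarawas basin data.
      import numpy as np

      def nse(observed, simulated):
          observed, simulated = np.asarray(observed), np.asarray(simulated)
          return 1 - np.sum((observed - simulated) ** 2) / np.sum(
              (observed - observed.mean()) ** 2)

      rng = np.random.default_rng(6)
      obs = np.exp(rng.normal(2.0, 0.6, size=365))        # synthetic daily flow
      sim = obs * rng.normal(1.0, 0.15, size=365)         # imperfect model output
      print(f"NSE = {nse(obs, sim):.2f}  (1.0 is a perfect fit; below 0 is worse than the mean)")
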
  • 19. Frazer, Rebecca Measuring and Predicting Character Depth in Media Narratives: Testing Implications for Moral Evaluations and Dispositions

    Doctor of Philosophy, The Ohio State University, 2023, Communication

    Perceived character depth is a concept relevant for understanding and predicting audience responses to narrative media, yet it has been largely unexplored in the field of media psychology. Through a careful review of diverse literatures, the current work offers a formal conceptualization of character depth as the extent to which a character's textual exposition evokes a detailed and multi-faceted mental conception of a character's psyche, behavior, and experience. After devising a series of items to measure character depth, this work then presents a series of experimental studies designed to test various aspects of validity of the proposed measurement scale and to test a causal path model of the relationship between character depth and processes specified by affective disposition theory (see Zillmann, 2000). Study 1 uses a known-groups approach and confirmatory factor analysis to test the predictive validity and measurement model of a 20-item proposed perceived character depth scale. Selective item retention results in a 6-item scale with excellent model fit. Studies 2 and 3 lend additional support to the validity of this 6-item scale's measurement model through tests of the scale in two different narrative contexts, both of which result in excellent model fit. Across Studies 1-3, evidence emerges of the convergent and discriminant validity of the scale in relation to other character perception variables. Study 4 applies this new measure in a 2 X 3 between-subjects experimental design that manipulates both character depth and character moral behavior independently. Results show that character depth impacts disposition formation and anticipatory responses above and beyond audience reactions to moral behavior. This finding has important theoretical implications for affective disposition theory (Zillmann, 2000), indicating that perceived character depth may serve as an additional predictor of disposition formation not specified in the original theory. Future research d (open full item for complete abstract)

    Committee: Matthew Grizzard (Advisor); Emily Moyer-Guse (Advisor); Nicholas Matthews (Committee Member) Subjects: Communication; Mass Communications; Mass Media; Psychology
  • 20. Guo, Feng Revisiting Item Semantics in Measurement: A New Perspective Using Modern Natural Language Processing Embedding Techniques

    Doctor of Philosophy (Ph.D.), Bowling Green State University, 2023, Psychology/Industrial-Organizational

    Language understanding plays a crucial role in psychological measurement, so semantic cues should be studied to support more effective and accurate measurement practices. With advancements in computer science, natural language processing (NLP) techniques have emerged as efficient methods for analyzing textual data and have been used to improve psychological measurement. This dissertation investigates the application of NLP embeddings to address fundamental methodological challenges in psychological measurement, specifically scale development and validation. In Study 1, a word embedding-based approach was used to develop a corporate personality measure, which resulted in a three-factor solution closely mirroring three dimensions of the Big Five framework (i.e., Extraversion, Agreeableness, and Conscientiousness). This research furthers our conceptual understanding of corporate personality by identifying similarities and differences between human and organizational personality traits. In Study 2, a sentence-based embedding model was applied to predict empirical pairwise item response relationships, comparing its performance with human ratings. This study also demonstrated the effectiveness of fine-tuned NLP models for classifying item pair relationships into trivial/low or moderate/high empirical relationships, which provides preliminary validity evidence without collecting human responses. The research seeks to enhance psychological measurement practices by leveraging NLP techniques, fostering innovation and improved understanding in the social sciences.

    Committee: Michael Zickar Ph.D. (Committee Chair); Neil Baird Ph.D. (Other); Richard Anderson Ph.D. (Committee Member); Samuel McAbee Ph.D. (Committee Member) Subjects: Psychological Tests; Psychology; Quantitative Psychology
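
    The sentence-embedding approach in the abstract above can be sketched by embedding scale items and using cosine similarity as a proxy for their semantic (and, by hypothesis, empirical) relationship. The model name and example items below are assumptions, not the dissertation's materials:

      # Cosine similarity between sentence embeddings of scale items as a proxy
      # for their semantic relationship. Model and items are illustrative only.
      import numpy as np
      from sentence_transformers import SentenceTransformer

      model = SentenceTransformer("all-MiniLM-L6-v2")   # any sentence-embedding model works

      items = [
          "I enjoy meeting new people.",
          "I am the life of the party.",
          "I pay attention to details.",
      ]
      emb = model.encode(items, normalize_embeddings=True)
      similarity = emb @ emb.T                          # cosine similarity for unit vectors
      print(np.round(similarity, 2))                    # higher value expected for the two Extraversion-like items
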