Search Results

(Total results 14)

  • 1. Arya, Pulkit Creating Conversational Systems with Temporal Reasoning in Zero Resource Domains with Synthetic Data and Large Language Models

    Master of Science, The Ohio State University, 2023, Computer Science and Engineering

    Creating conversational systems for niche domains is a challenging task, further exacerbated by a lack of quality datasets. In this work, we created a data generation pipeline that can be adapted to new domains to generate a minimum viable dataset for bootstrapping a semantic parser. We experimented with methods for automatic paraphrasing and tested the ability of language models to answer questions that require temporal reasoning. Based on our findings, large language models with emergent capabilities (GPT-3.5 and GPT-4) present a viable alternative to crowd-sourced paraphrasing. We determined that conversational systems that rely on language models' ability to do temporal reasoning struggle to provide accurate responses. Our proposed system outperforms such language models by performing temporal reasoning over an intermediate representation of the user query.

    Committee: Michael White (Committee Member); Eric Fosler-Lussier (Advisor) Subjects: Computer Science
  • 2. Salari, Elahheh Using Machine Learning to Predict Gamma Passing Rate Values and to Differentiate Radiation Necrosis from Tumor Recurrence in Brain

    Doctor of Philosophy, University of Toledo, 2023, Physics

    A major concern in radiation therapy has always been to deliver the prescribed dose to a tumor volume while keeping the surrounding organs at risk (OAR) safe. This is especially important when using Intensity Modulated Radiosurgery (IMRS) with a single isocenter to treat multiple brain lesions. In these cases, small and widely spread-out tumors in the brain are irradiated with larger doses, and sparing OAR is critical but significantly more challenging. It is common to perform pre-treatment verification to make sure the accurate treatment dose is delivered. Numerous applications, including predictive modeling of treatment outcomes in radiation oncology, treatment optimization, error detection, and prevention, have been developed and are now accessible. Typically, patient-specific QA compares the dose distribution generated by the treatment planning system (TPS) with the delivery of that patient's treatment plan to an array of detectors. In other words, patient-specific QA (pre-treatment verification) compares measured and predicted dose distributions. This process compounds numerous potential sources of error, including dose calculation, data transfer, linac performance, device setup, and dosimeter response, among others. Consequently, the accumulation of errors may cause the results to fail. Several reports show that common pre-treatment verification measurements are insensitive to delivery errors and unable to predict the acceptability of plan delivery. Therefore, by using machine learning, we plan to devise an algorithm to achieve a higher level of understanding and insight to improve each patient's treatment plan and safely deliver precise radiation to tumors while minimizing the radiation dose to the surrounding normal tissues. Another crucial task is the differentiation of Radiation Necrosis (RN) from recurrent tumors.
    RN is one of the common adverse effects resulting from irradiation to the brain; nevertheless, RN is hard to diagnose and m (open full item for complete abstract)

    Committee: Aniruddha Ray Ph.D. (Committee Chair) Subjects: Medical Imaging; Oncology; Radiation
  • 3. Aqeel, Aya EVIDENCE BASED MEDICAL QUESTION ANSWERING SYSTEM USING KNOWLEDGE GRAPH PARADIGM

    Master of Science in Software Engineering, Cleveland State University, 2022, Washkewicz College of Engineering

    Evidence Based Medicine (EBM) is the process of systematically finding, judging, and using research findings as the basis for clinical decisions and has become the standard of medical practice. Countless new studies and research papers are published daily; keeping track of each of them is impossible, not to mention reading and comprehending them. While search engines can help healthcare professionals search for a topic by suggesting relevant papers, healthcare professionals still need to go through the papers and extract relevant information themselves. This is a very time-consuming task: one study on the Information Retrieval (IR) practices of healthcare information professionals found that it takes them, on average, 4 hours to finish a search task. Moreover, a systematic review of the barriers to medical residents' practice of evidence-based medicine revealed that two of the most frequently mentioned barriers for residents were limitations in available time and in knowledge and skills. In this project, we address both problems by building a Medical Question Answering (QA) system that employs semi-supervised information extraction methods from Natural Language Processing (NLP) to construct a large-scale Knowledge Graph (KG) from facts extracted from a large repository of medical research publications. The system then efficiently translates a user's natural-language question into a query over the KG to extract relevant, evidence-based answers and present them in a user-friendly manner. The system returns a compilation of summaries for the related evidence, with a one-sentence summary for each piece of evidence relevant to the user's question and a reference to the full publication. The system can help address the barriers of knowledge and skills by providing a comprehensive summary of the evidence for a question posed in natural language, eliminating the need to formulate complex structured queries.
The system was evalu (open full item for complete abstract)

    Committee: Sunnie Chung (Committee Chair); Satish Kumar (Committee Member); Yongjian Fu (Committee Member); Sunnie Chung (Advisor) Subjects: Artificial Intelligence; Biomedical Research; Medicine
  • 4. Macey, Nathaniel Evaluation of a MapCHECK2™ Diode Array for High Dose Rate Brachytherapy Quality Assurance

    Master of Science (MS), University of Toledo, 2015, Biomedical Sciences (Medical Physics: Diagnostic Radiology)

    Despite continuous improvements in the design of HDR brachytherapy delivery systems, the position of the source is still verified only through the HDR afterloader hardware/software, based on the length of wire reeled out relative to the parked position. The position of the source is not independently monitored during the treatment, potentially opening the door to misadministrations. We investigate the feasibility of using dose maps acquired with a two-dimensional diode array to verify the relative source locations and confirm delivered dose during an HDR treatment. Dose maps for the source located at selected distances in air and depths in solid water were acquired with a MapCHECK2™ diode array and the Varian VariSource™ Ir-192 HDR afterloader. The peak location of the measured dose profile for each dwell position provided the X and Y coordinates, while the full width at half maximum (FWHM) of each peak was used to calibrate the source distance along the Z axis. Two treatment orientations were considered, with the source moving along 1) the rectangular surface of a solid water phantom and 2) the inclined plane of a paraffin wax wedge, to verify the method for both coplanar and non-coplanar source and detector geometries for three source positions. Acquired dose maps were used to restore coordinates of dwell positions. Although the spatial resolution is 10 mm along a row or column of diode detectors, the accuracy in determining dwell position coordinates was found to be within ±2 mm in the X and Y directions of the diode plane using a polynomial fit to interpolate the values at distances between the measured points. The FWHM was found to increase linearly with source depth/distance, making it a suitable parameter for determining the source Z coordinate. Our studies have verified that the dose maps can be used as a routine QA tool for HDR treatment delivery verification.

    Committee: Diana Shvydka Ph.D. (Committee Chair); E. Ishmael Parsai Ph.D. (Committee Member); David Pearson Ph.D. (Committee Member) Subjects: Physics
  • 5. Tirabassi, Dana Effects of the qa-1F Activator Protein on the Expression of Quinic Acid Induced Genes in Neurospora crassa

    Master of Science in Biological Sciences, Youngstown State University, 2013, Department of Biological Sciences and Chemistry

    Neurospora crassa, like most fungi, is very flexible metabolically. When a preferred carbon source, such as dextrose, is unavailable, N. crassa has the ability to metabolize quinic acid. To do this, the quinic acid gene cluster is up-regulated by the qa-1F activator. This study uses a mutant form of N. crassa in which the qa-1F gene is knocked out. The protein profiles of wild-type N. crassa and the qa-1F knockout when grown on both dextrose and quinic acid were analyzed and compared. The wild-type and knockout strains were first grown on 2% dextrose and then shifted either to fresh dextrose or to 0.3% quinic acid. The proteins from these tissues were then extracted, quantitated, and separated using two-dimensional gel electrophoresis (2DGE). The 2DGE gels were then analyzed using PDQUEST™. Cross-conditional comparisons were made, and protein spots unique to each condition were identified. These gel comparisons show that, when grown on a preferred carbon source, nearly twice as many proteins are up-regulated as when grown on quinic acid. Also, the qa-1F knockout protein profiles had far fewer protein spots than their wild-type counterparts for both carbon sources. Protein spots were then selected, excised, and sent to The Ohio State University for mass spectrometry and bioinformatic analysis. Two proteins affected by the presence of qa-1F when grown on quinic acid were identified as hypothetical proteins NCU 04072 and NCU 08332, likely to be a catechol dioxygenase and an SH3-like translational protein, respectively.

    Committee: David Asch Ph.D. (Advisor); Gary Walker Ph.D. (Committee Member); Chet Cooper Ph.D. (Committee Member) Subjects: Biology
  • 6. Allen, Katie Protein Profiling of Wild-type Neurospora crassa Grown on Various Carbon Sources

    Master of Science in Biological Sciences, Youngstown State University, 2011, Department of Biological Sciences and Chemistry

    Neurospora crassa possesses characteristics that make it an ideal model for eukaryotic organisms. N. crassa metabolizes preferred carbon sources such as dextrose, but has the ability to metabolize less preferred carbon sources such as quinic acid or glycerol. This study analyzes the protein profiles of wild-type N. crassa grown on 2% dextrose, 2% glycerol, and 0.3% quinic acid. To perform the study, N. crassa was grown on Vogel's minimal media and shifted to the various carbon sources. Protein was extracted from N. crassa tissue and run on two-dimensional gel electrophoresis (2-DGE). The 2-D gels were imaged and analyzed utilizing PDQuest™. Protein spots of interest were excised from the 2-D gels and sequenced by capillary liquid chromatography-nanospray tandem mass spectrometry. Protein identifications were determined by searching the SwissProt and NCBI databases for homologous fungal sequences. The study revealed that more protein was expressed on the preferred carbon source, dextrose, compared to the less preferred carbon sources, quinic acid and glycerol. Unique protein expression patterns were also generated for each of the carbon sources. The identified proteins found to be unique to dextrose included an ATP-dependent RNA helicase, an enolase, and a cytochrome c peroxidase. A probable pyridoxine biosynthesis protein was established to be unique to glycerol, while a peptidyl-prolyl cis-trans isomerase and a Cu-Zn superoxide dismutase were determined to be unique to quinic acid.

    Committee: David Asch PhD (Advisor); Gary Walker PhD (Committee Member); Xiangjia Min PhD (Committee Member) Subjects: Biochemistry; Biology; Molecular Biology
  • 7. Massie, Michael Respiratory-Gated IMRT Quality Assurance with Motion in Two Dimensions

    Master of Science (MS), Wright State University, 2010, Physics

    Intensity modulated radiation therapy (IMRT) plans can be further customized to each patient with the use of a four-dimensional (4D) respiratory-gated computed tomography (CT) scan, with time being the fourth dimension. The 4D respiratory-gated CT allows the internal margin (IM), the expansion of the tumor volume that accounts for physiologic motion, to be addressed in the treatment planning process, no longer assuming that the treatments will be delivered to a fixed or rigid patient anatomy. Delivering the IMRT plan with a gated technique limits the treatment to a duty cycle when the target motion is at a minimum. The goal of this project is to study respiratory-gated IMRT quality assurance (QA) results for tumor motion in two dimensions and develop a guideline for acceptable limits on tumor motion and field size. Respiratory-gated IMRT QA was performed for four field sizes and varying amounts of motion with a fixed duty cycle using Sun Nuclear's MapCHECK and MotionSIM XY/4D products. The treated and planned dose planes were compared and errors were evaluated using standard acceptance conditions.

    Committee: Brent Foy PhD (Advisor); Gary Farlow PhD (Committee Member); Matthew Daniels PhD (Committee Member) Subjects: Physics
  • 8. Scharfe, Patrick Portrayals of the Later Abbasid Caliphs: The Role of the Caliphate in Buyid and Saljuq-era Chronicles, 936-1180

    Master of Arts, The Ohio State University, 2010, History

    Decline paradigms have long dominated the modern historiography of the pre-modern Middle East. In particular, the alleged decadence of the Abbasid caliphate after its loss of military power in the middle of the 10th century has been seen as an index of the “decline” of Islamic civilization generally. This judgment, however, has usually been taken without much actual reference to the later history of the Abbasids. A thorough examination of the primary sources of medieval Islamic history – Arabic chronicles – reveals a much more nuanced picture of the later Abbasid caliphate. While the caliphs lacked military power during the Buyid and Saljuq eras, they were not mere hostages of the secular powers in the eyes of the chroniclers. A close reading of each chronicler against his political background is necessary to understand this fully, however. The caliphs' authority allowed them to bestow titles upon the rulers that they chose, and sultans were only legitimate when the caliphs had their names recited in the Friday prayer (khutba). The caliphs also exercised practical power, especially with the weakening of the Buyid amirate after 1000 C.E. With the caliph al-Qadir (d. 1030), the caliphs controlled judgeships, intervened in urban politics and led the struggle for religious orthodoxy. They were neither saved nor held hostage by the Saljuq sultan Tughril Beg who arrived in Baghdad in 1055. When the Saljuq sultanate fragmented in the 12th century, the caliphs re-emerged as regional military leaders. Whereas previous caliphs had held authority but not military power, the caliph al-Muqtafi (d. 1160) united power and authority again through his victories in battle against the Saljuqs. Thus, the story of the later Abbasids is not a simple tale of decline.

    Committee: Jane Hathaway PhD (Committee Chair); Stephen Dale PhD (Committee Member); Parvaneh Pourshariati PhD (Committee Member) Subjects: History
  • 9. Chen, Yachuan Episodic Perspectives of Wireless Network Dependability

    Master of Science (MS), Ohio University, 2006, Telecommunications (Communication)

    Wireless networks have become critical telecommunication infrastructure as millions of people depend on these networks for daily communication. Additionally, thousands of new customers are subscribing to wireless service every day. In order to obtain larger market share, wireless carriers are expanding their networks to accommodate more customers. As networks grow, carriers face tremendous challenges to not compromise network dependability. How the dependability of a wireless network might change as it expands over time becomes an important issue. The dependability we are discussing in this thesis includes network reliability, availability, maintainability and survivability. This thesis addresses the dependability of a wireless infrastructure capable of serving 100,000 to 1,000,000 customers. A discrete time-event driven simulation is used to investigate a network's dependability as a function of network size, component Mean Time To Failure (MTTF) and component Mean Time To Restore (MTR). As the network expands in size, the number of concurrent outages can also be expected to grow. In order to assess this phenomenon, the notion of a disturbance called an “impact episode” is introduced in this thesis. Impact episodes are defined here to be either single or concurrent outages, resulting in new assessment parameters, namely, Mean Time To Episode (MTTE), Mean Time To Restore Network (MTRN), Quiescent Availability (AQ), Peak Customers Impacted (PCI), and Wireless Prime Lost Line Hours (WPLLH). The latter parameter uses a time-based traffic profile, derived from empirical wireless traffic statistics, to assess unmet demand because of episodes. The purpose of this research is to understand the characteristics of concurrent network outages and how they provide perspectives on network dependability useful to network operators of large network infrastructures. 
Such understanding offers network operators valuable insights about predicting the frequency with which network episod (open full item for complete abstract)

    Committee: Snow Andrew (Advisor) Subjects: Engineering, System Science
  • 10. Bismack, Brian Implementation of the Dosimetry Check Software Package in Computing 3D Patient Exit Dose Through Generation of a Deconvolution Kernel to be Used for Patients' IMRT Treatment Plan QA

    Master of Science in Biomedical Sciences (MSBS), University of Toledo, 2010, College of Medicine

    Using the Dosimetry Check IMRT QA package, a deconvolution kernel to be used in exit-image dose calculations was created. This kernel modeled the Electronic Portal Imaging Device (EPID) response by incorporating the various machine characteristics along with varying patient thickness and composition. To achieve this properly, Dosimetry Check first had to accurately model the beams for the machines being used. This was done by taking a series of in-air and in-water measurements, including central axis depth dose values at various field sizes and in-air off-center ratios to model beam flatness and symmetry. A deconvolution kernel for patient CT dose computation using pre-treatment (in-air) EPID images was created. This baseline helped establish the necessary measurements for the exit-image kernel, which included EPID images of various field sizes with various thicknesses of water in the beam, as well as a characterization of off-axis narrow-beam transmission. The fit was performed and a report generated for its variance. The kernel was then successfully used in the evaluation of an IMRT prostate plan created for an anthropomorphic phantom, compared to a baseline evaluation of the same plan using in-air EPID images. Volumetric, planar, and point dose comparisons between measured and computed dose distributions agreed favorably, indicating the validity of the technique used for IMRT QA.

    Committee: E. Ishmael Parsai PhD (Committee Chair); David Pearson PhD (Committee Member); Michael Dennis PhD (Committee Member) Subjects: Physics; Radiation
  • 11. Pichler, Joseph IMRT Plan Delivery Verification Utilizing a Spiral Phantom with Radiochromic Film Dosimetry

    Master of Science in Biomedical Sciences (MSBS), University of Toledo, 2010, College of Medicine

    The purpose of this study was to develop and report on the implementation of IMRT quality assurance plan delivery verification using a spiral phantom. The phantom utilizes a cylindrical solid water system with a machined spiral trajectory for insertion of film. Several analyses were performed on various IMRT treatment plans comparing predicted planar fluence dose and measured dose using radiochromic film. A solid water cylindrical IMRT phantom manufactured by GAMMEX, machined to create a spiral cavity for placement of radiochromic film (GAFCHROMIC® EBT2), was employed for IMRT plan QA. This spiral phantom is used to measure data in a three-dimensional (3D) subspace, which had not previously been demonstrated. The patient treatment plan with predicted planar and volumetric isodose distributions was obtained in the Pinnacle treatment planning system (TPS). The patient treatment planning data, using a CT data set, was then projected onto the spiral phantom in the TPS, where planar dose files were generated using a scripting file. The measured data was obtained upon successful delivery of the intended treatment plan with a linear accelerator on the spiral phantom with radiochromic film in place. Comparison of the predicted and measured data provides a quantitative and qualitative assessment and validation of the intended treatment when delivered as planned. The films were subsequently scanned, and the measured dose data from the delivered plan were compared with the planar fluence dose maps generated in Pinnacle using RIT113 software. Schematic isodose overlays, vertical and horizontal dose profiles, as well as IMRT distance-to-agreement (DTA) analysis of the predicted and measured dose distributions were shown to be in great accord with one another. One can easily see the value of this phantom as a quantitative and qualitative IMRT plan analysis tool.

    Committee: E. Ishamael Parsai PhD (Committee Chair); David Pearson PhD (Committee Member); Diana Shvydka PhD (Committee Member) Subjects: Physics
  • 12. Cheng, Wu Corrupted Image Quality Assessment

    Master of Science (M.S.), University of Dayton, 2012, Electrical Engineering

    We propose a foundation for assessing visual quality with a "corrupted reference" (CR-QA) - a new quality assessment (QA) paradigm for reasoning about human vision and image restoration problems jointly. The visual quality of a processed image signal is assessed relative to an ideal reference image (not provided) with the help of the observed image. This is in contrast to today's QA methods, which are optimized for "post-hoc" usage (process first, assess quality second) and are unequipped to handle assessment of the processed data relative to an ideal reference that exists only in theory and not in practice.

    Committee: Keigo Hirakawa (Committee Chair); K. Asari Vijayan (Committee Member); H. Brian Tsou (Committee Member) Subjects: Engineering; Statistics
  • 13. Walker, Justin The Use of an On-Board MV Imager for Plan Verification of Intensity Modulated Radiation Therapy and Volumetrically Modulated Arc Therapy.

    Master of Science in Biomedical Sciences (MSBS), University of Toledo, 2013, College of Medicine

    The introduction of complex treatment modalities such as IMRT and VMAT has led to the development of many devices for plan verification. One such innovation in this field is the repurposing of the portal imager not only for tumor localization but for recording dose distributions as well. Several advantages make portal imagers attractive options for this purpose. Very high spatial resolution allows for better verification of small-field plans than may be possible with commercially available devices. Because the portal imager is attached to the gantry, setup is simpler than with any other method available, requiring no additional accessories, and often can be accomplished from outside the treatment room. Dose images captured by the portal imager are in digital format and make permanent records that can be analyzed immediately. Portal imaging suffers from a few limitations, however, that must be overcome. Captured images contain dose information, and a calibration must be maintained for image-to-dose conversion. Dose images can only be taken perpendicular to the treatment beam, allowing only for planar dose comparison. Planar dose files are themselves difficult to obtain for VMAT treatments, and an in-house script had to be developed to create such a file before analysis could be performed. Using the methods described in this study, excellent agreement between the planar dose files generated and the dose images taken was found. The average agreement for the IMRT fields analyzed was greater than 97% for non-normalized images at 3 mm and 3%. Comparable agreement for VMAT plans was found as well, with the average agreement being greater than 98%.

    Committee: E. Ishmael Parsai PhD (Committee Chair); David Pearson PhD (Committee Member); Diana Shvydka PhD (Committee Member) Subjects: Biophysics; Medical Imaging; Physics
  • 14. Davenport, David Development of a Quality Assurance Procedure for Dose Volume Histogram Analysis

    Master of Science in Biomedical Sciences (MSBS), University of Toledo, 2013, College of Medicine

    The role of the dose-volume histogram (DVH) is rapidly expanding in radiation oncology treatment planning. DVHs are already relied upon to differentiate between two similar plans and to evaluate organ-at-risk dosage. Their role will become even more important as progress continues toward implementing biologically based treatment planning systems. Therefore, it is imperative that the accuracy of DVHs be evaluated and reappraised after any major software or hardware upgrade affecting a treatment planning system (TPS). The purpose of this work is to create and implement a comprehensive quality assurance procedure evaluating dose-volume histograms to ensure their accuracy while satisfying American College of Radiology guidelines. Virtual phantoms of known volumes were created in the Pinnacle TPS and exposed to different beam arrangements. Variables including grid size and slice thickness were varied and their effects analyzed. The resulting DVHs were evaluated by comparison to the commissioned percent depth dose values using a custom Excel spreadsheet. After determining the uncertainty of the DVH based on these variables, multiple second-check calculations were performed using the MIM Maestro and Matlab software packages. The uncertainties of the DVHs were shown to be less than ±3%. The average uncertainty was shown to be less than ±1%. The second-check procedures resulted in mean percent differences of less than 1%, which confirms the accuracy of DVH calculation in Pinnacle and the effectiveness of the quality assurance template. The importance of knowing the limits of accuracy of the DVHs, which are routinely used to assess the quality of clinical treatment plans, cannot be overestimated. The developed comprehensive QA procedure evaluating the accuracy of DVH statistical analysis will become part of our clinical arsenal for periodic tests of the treatment planning system. 
It will also be performed at the time of commissioning and after any maj (open full item for complete abstract)

    Committee: Diana Shvydka Ph.D. (Committee Chair); E. Parsai Ph.D. (Committee Member); David Pearson Ph.D. (Committee Member) Subjects: Physics; Radiation; Radiology