Search Results

(Total results 8)


  • 1. Saleh, Mariam Encryption and Compression Classification of Internet of Things Traffic

    Doctor of Philosophy (PhD), Wright State University, 2023, Computer Science and Engineering PhD

    The Internet of Things (IoT) is used in many fields that generate sensitive data, such as healthcare and surveillance. Increased reliance on the IoT has raised serious information security concerns. This dissertation presents three systems for analyzing and classifying IoT traffic using Deep Learning (DL) models, and a large dataset is built for system training and evaluation. The first system studies the effect of combining raw data and engineered features to optimize the classification of encrypted and compressed IoT traffic using Engineered Features Classification (EFC), Raw Data Classification (RDC), and combined Raw Data and Engineered Features Classification (RDEFC) approaches. Our results demonstrate that the EFC, RDC, and RDEFC models achieve high classification accuracies of 80.94%, 86.45%, and 90.55%, respectively, outperforming systems reported in the literature with similar configurations. The second system uses three density-estimation approaches, namely histogram, Kernel Density Estimation (KDE), and Cumulative Distribution Function (CDF), to enhance encrypted and compressed variable-size IoT traffic classification. The results demonstrate that the KDE approach attains a significantly higher accuracy of 90.92%, compared to 86.66% and 82.6% for the histogram and CDF, respectively. Furthermore, the KDE approach outperforms our RDEFC model in three aspects: variable file length, dataset complexity, and dimensionality reduction. The third system proposes a novel approach for file type classification of fragments in a compressed archive file for digital forensic investigation. Existing research in the literature classifies these files as archive file formats, such as .zip, with no further investigation of the compressed file types. In this system, an optimized modification of the Inception network is implemented. Two sets of filter sizes are implemented, and the attained accuracies are 73.18% and 75.24%, respectively. 
For future work, we suggest including (open full item for complete abstract)
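    The three density estimators named in the abstract can be sketched for a single flow's byte-value distribution. This is a minimal illustration only; the function name, grid, and parameters are hypothetical, not taken from the dissertation.

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    def density_features(byte_values, bins=64):
        """Fixed-length feature vectors from a variable-length byte
        sequence via the three estimators named in the abstract
        (illustrative; not the dissertation's actual pipeline)."""
        x = np.asarray(byte_values, dtype=float)
        grid = np.linspace(0, 255, bins)

        # Histogram: normalized counts over fixed bins.
        hist, _ = np.histogram(x, bins=bins, range=(0, 255), density=True)

        # KDE: smooth density evaluated on the same grid.
        kde = gaussian_kde(x)(grid)

        # CDF: empirical cumulative distribution sampled on the grid.
        cdf = np.searchsorted(np.sort(x), grid, side="right") / x.size
        return hist, kde, cdf
    ```

    All three map a variable-size input to a fixed-size vector, which is what makes them usable as inputs to a DL classifier over variable-size traffic.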

    Committee: Bin Wang Ph.D. (Advisor); Soon M. Chung Ph.D. (Committee Member); Liu Meilin Ph.D. (Committee Member); Wu Zhiqiang Ph.D. (Committee Member) Subjects: Artificial Intelligence; Computer Engineering; Computer Science; Engineering; Information Science; Information Technology
  • 2. Datar, Archit Characterization of Nanoporous Materials and Computational Study for Water Adsorption-Related Applications

    Doctor of Philosophy, The Ohio State University, 2021, Chemical Engineering

    Nanoporous materials have the potential to be at the heart of several energy- and environment-related technologies that could be deployed to mitigate global challenges such as global warming and water shortages. Progress in synthesis techniques has allowed the development of high-performing materials with interesting properties such as large surface areas and high tunability, among others. This progress has also resulted in a large materials space with thousands of potential candidates for any given application. Consequently, experimental synthesis and testing of each material for a given application can become impracticable, and computational approaches to study these materials can efficiently provide answers and insights. In this work, we have focused on two such areas where our computational studies have enabled useful insights for accelerating materials discovery. First, we have employed computational approaches to characterize the surface area of materials, which is a critical property for predicting adsorption performance in separations and storage applications. We have thoroughly investigated the current state-of-the-art method, the BET method, to systematically identify its strengths and limitations, and proposed physics-based and surrogate models to improve the characterization accuracy, especially for materials that are potentially important to adsorption applications. Second, we have developed efficient methods for screening materials for water adsorption-related applications such as water harvesting from air, which can be an important tool to tackle global problems such as water scarcity. With the goal of recommending optimal materials for a particular adsorption application (water harvesting in this case), we studied their water adsorption behavior. We found that the conventional method, the grand canonical Monte Carlo (GCMC) method, could be rather unreliable for studying water adsorption. 
We employed biased sampling methods such as flat histogram methods (open full item for complete abstract)

    Committee: Aravind Asthagiri (Advisor); Isamu Kusaka (Committee Member); Li-Chiang Lin (Advisor) Subjects: Chemical Engineering
  • 3. Thapa, Mandira Optimal Feature Selection for Spatial Histogram Classifiers

    Master of Science in Electrical Engineering (MSEE), Wright State University, 2017, Electrical Engineering

    Point set classification methods are used to identify targets described by a spatial collection of points, each represented by a set of attributes. Relative to traditional classification methods based on fixed and ordered feature vectors, point set methods require additional robustness to obscured and missing features, thus necessitating a complex correspondence process between testing and training data. The correspondence problem is efficiently solved via spatial pyramid histograms and associated matching algorithms; however, the storage requirements and classification complexity grow linearly with the number of training data points. In this thesis, we develop optimal methods of identifying salient point-features that are most discriminative in a given classification problem. We build upon a logistic regression framework and incorporate a sparsifying prior to both prune non-salient features and prevent overfitting. We present results on synthetic data and measured data from a fingerprint database where point-features are identified with minutia locations. We demonstrate that by identifying salient minutia, the training database may be reduced by 94% without sacrificing fingerprint identification performance. Additionally, we demonstrate that the regularization provided by saliency determination provides improved robustness over traditional pyramid histogram methods in the presence of point migration in noisy data.
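    The idea of pruning non-salient features with a sparsifying prior can be sketched with an L1-penalized logistic regression, as commonly implemented in scikit-learn. This is a generic sketch on synthetic data, not the thesis's actual framework; all data and parameters here are invented for illustration.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Synthetic problem: 50 candidate point-features, only 5 salient.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 50))            # 200 samples, 50 features
    w_true = np.zeros(50)
    w_true[:5] = 2.0                          # the 5 salient features
    y = (X @ w_true + rng.normal(size=200) > 0).astype(int)

    # L1 penalty plays the role of the sparsifying prior: non-salient
    # feature weights are driven exactly to zero and can be pruned.
    clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
    clf.fit(X, y)
    salient = np.flatnonzero(clf.coef_[0])    # surviving feature indices
    print(f"kept {salient.size} of 50 features")
    ```

    Pruning the zero-weight features is what allows the training database to shrink without sacrificing identification performance.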

    Committee: Joshua Ash Ph.D. (Advisor); Arnab Shaw Ph.D. (Committee Member); Steve Gorman Ph.D. (Committee Member) Subjects: Electrical Engineering
  • 4. Madaris, Aaron Characterization of Peripheral Lung Lesions by Statistical Image Processing of Endobronchial Ultrasound Images

    Master of Science in Biomedical Engineering (MSBME), Wright State University, 2016, Biomedical Engineering

    This thesis introduces the concept of implementing greyscale analysis, also known as intensity analysis, on endobronchial ultrasound (EBUS) images for the purpose of diagnosing peripheral lung tumors. The statistical methodology of using greyscale and histogram analysis allows the characterization of lung tissue in EBUS images. Regions of interest (ROI) are analyzed in MATLAB, and a feature vector of first-order, second-order, and histogram greyscale features is created and used for the classification of malignant versus benign peripheral lung tumors. The tools implemented were MedCalc, for the initial statistical analysis of receiver operating characteristic (ROC) curves and multiple regression, and MATLAB, for the machine learning and ROI collection. Feature analysis, multiple regression, and machine learning methods were used to better classify the malignant and benign EBUS images. The classification is assessed with a confusion matrix, ROC curve, accuracy, sensitivity, and specificity. It was found that minimum pixel value, contrast, and energy are the most discriminative features for distinguishing between benign and malignant EBUS images.
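    First-order and histogram greyscale features of the kind described can be computed directly from an ROI's pixel values. The sketch below (NumPy rather than the thesis's MATLAB, with an invented feature set) shows the general shape of such a feature vector.

    ```python
    import numpy as np

    def first_order_features(roi):
        """First-order greyscale statistics of an 8-bit ROI.
        Illustrative only; the thesis's exact feature set is not
        reproduced here."""
        roi = np.asarray(roi, dtype=float)
        # Normalized 256-bin intensity histogram (bin width 1).
        hist, _ = np.histogram(roi, bins=256, range=(0, 256), density=True)
        p = hist[hist > 0]
        return {
            "min": roi.min(),                      # minimum pixel value
            "mean": roi.mean(),
            "variance": roi.var(),
            "energy": np.sum(hist ** 2),           # histogram uniformity
            "entropy": -np.sum(p * np.log2(p)),    # histogram entropy
        }
    ```

    Second-order features such as contrast would instead be derived from a grey-level co-occurrence matrix, which captures spatial relationships that a plain histogram discards.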

    Committee: Ulas Sunar Ph.D. (Advisor); Jason Parker Ph.D. (Committee Member); Jaime Ramirez-Vick Ph.D. (Committee Member) Subjects: Biomedical Engineering; Biomedical Research; Biostatistics; Computer Engineering; Engineering; Health Care; Medical Imaging
  • 5. Sun, Yawei The Development of a Bedside Display for the ICU

    Master of Sciences, Case Western Reserve University, 2014, EECS - Electrical Engineering

    In this thesis, an innovative Graphical User Interface (GUI) for the next generation of bedside decision-support systems for the Intensive Care Unit (ICU) is developed. Functions of existing monitors in the ICU are integrated into a single bedside system. The bedside monitor is capable of visualizing real-time data streaming from a patient monitor, performing routine and novel signal analytics, and reading archived patient waveform data in European Data File (EDF) format. Signals such as the Electrocardiogram (ECG) waveform, Heart Rate (HR), Oxygen Saturation (SaO2), etc. are available for visualization and analysis. Selection among three different time periods of data with flexible data lengths is also provided. Novel analytics including Poincaré and histogram plots are available to investigate variations in and connections between different physiological signals.

    Committee: Kenneth Loparo (Advisor); Farhad Kaffashi (Committee Member); Frank Jacono (Committee Member) Subjects: Electrical Engineering; Health Care
  • 6. Zhuang, Yuwen Metric Based Automatic Event Segmentation and Network Properties Of Experience Graphs

    Master of Science, The Ohio State University, 2012, Computer Science and Engineering

    Lifelogging, a growing area of interest, refers to people digitally capturing all the information they produce in daily life. A lifelog is a collection of records of an individual's daily activities in one or more media forms. In this thesis, we collect lifelog data by using a mobile phone or a Microsoft Research SenseCam worn around subjects' necks during their daily life. We then propose a way to organize the lifelog data: a metric-based model for event segmentation. Furthermore, we analyse the data properties by constructing experience graphs from the recorded images. This thesis involves two parts; the details are as follows. Firstly, we describe a metric-based model for event segmentation of sensor data recorded by a mobile phone worn around subjects' necks during their daily life. More specifically, we aim at detecting human daily event boundaries by analysing the recorded triaxial accelerometer signals and image sequences (lifelog data). In the experiments, different signal representations and three boundary detection models are evaluated on a corpus of 2 subjects over a total of 24 days. The contribution of this work is three-fold. First, we find that using accelerometer signals can provide much more reliable and significantly better performance than using image signals with MPEG-7 low level features. Second, the models using the accelerometer data based on the world's coordinate system can provide equal or even much better performance than using the accelerometer data based on the device's coordinate system. Finally, our proposed model has better performance than the state-of-the-art system. Secondly, we investigate data obtained from subjects wearing a Microsoft Research SenseCam as they engaged in their everyday activities. We construct experience graphs for each subject from their corresponding images by using two different image representation methods: color histogram and color correlogram. 
The statistical analyses of thes (open full item for complete abstract)
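    A color histogram image representation of the kind mentioned reduces each image to a fixed-length vector of color frequencies. The sketch below is a generic version; the bin count and normalization are illustrative choices, not the thesis's.

    ```python
    import numpy as np

    def color_histogram(rgb_image, bins_per_channel=8):
        """Joint RGB color histogram as an image representation
        (illustrative parameters, not the thesis's). Returns a
        normalized vector of length bins_per_channel**3."""
        pixels = np.asarray(rgb_image).reshape(-1, 3)
        hist, _ = np.histogramdd(pixels, bins=bins_per_channel,
                                 range=[(0, 256)] * 3)
        return hist.ravel() / pixels.shape[0]
    ```

    A color correlogram additionally records how often color pairs co-occur at given pixel distances, so it retains spatial information that the plain histogram discards.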

    Committee: Mikhail Belkin (Advisor); Simon Dennis (Committee Member); Deliang Wang (Committee Member) Subjects: Computer Science
  • 7. Kirsch, Matthew Signal Processing Algorithms for Analysis of Categorical and Numerical Time Series: Application to Sleep Study Data

    Master of Sciences (Engineering), Case Western Reserve University, 2010, EECS - Electrical Engineering

    In this thesis, novel entropy-based measures are developed to quantify the fragmentation of an individual categorical time series and the coupling strength of two categorical time series. Existing entropy-based measures are also shown to be well suited for the same task. These measures are applied to the analysis of categorical time series derived from sleep study data. Specifically, fragmentation of the hypnogram categorical time series of sleep stages, fragmentation of the breathing categorical time series of oxygen desaturation events, and coupling between the hypnogram and breathing categorical time series are quantified, and the relationship between these measures and the diagnostic outcomes of hypertension and obstructive sleep apnea is investigated. Additionally, electroencephalogram (EEG) activity during sleep is explored by analyzing the distribution of power in various frequency bands throughout the night using summary statistics and a histogram entropy measure computed using the maximum-entropy histogram binning procedure developed in this thesis.
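    One standard way to realize maximum-entropy histogram binning is to place bin edges at quantiles, so that counts spread evenly across bins and the resulting histogram entropy is maximized. The sketch below illustrates that idea generically; it is not the thesis's exact procedure, and the function names are invented.

    ```python
    import numpy as np

    def max_entropy_bins(x, k):
        """Equal-probability (quantile) bin edges: spreading counts
        evenly across k bins maximizes histogram entropy. A generic
        sketch, not the thesis's exact procedure."""
        return np.quantile(np.asarray(x, dtype=float),
                           np.linspace(0, 1, k + 1))

    def histogram_entropy(x, edges):
        """Shannon entropy (in bits) of the histogram of x over edges."""
        counts, _ = np.histogram(x, bins=edges)
        p = counts / counts.sum()
        p = p[p > 0]
        return -np.sum(p * np.log2(p))
    ```

    For k equal-probability bins the entropy approaches its ceiling of log2(k) bits, which makes entropies comparable across signals with very different amplitude distributions.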

    Committee: Kenneth Loparo PhD (Advisor); Susan Redline MD, MPH (Committee Member); Reena Mehra MD, MS (Committee Member) Subjects: Biomedical Research; Engineering
  • 8. Davenport, David Development of a Quality Assurance Procedure for Dose Volume Histogram Analysis

    Master of Science in Biomedical Sciences (MSBS), University of Toledo, 2013, College of Medicine

    The role of the dose-volume histogram (DVH) is rapidly expanding in radiation oncology treatment planning. DVHs are already relied upon to differentiate between two similar plans and to evaluate organ-at-risk dosage. Their role will become even more important as progress continues towards implementing biologically based treatment planning systems. Therefore, it is imperative that the accuracy of DVHs be evaluated and reappraised after any major software or hardware upgrade affecting a treatment planning system (TPS). The purpose of this work is to create and implement a comprehensive quality assurance procedure evaluating dose-volume histograms to ensure their accuracy while satisfying American College of Radiology guidelines. Virtual phantoms of known volumes were created in the Pinnacle TPS and exposed to different beam arrangements. Variables including grid size and slice thickness were varied and their effects were analyzed. The resulting DVHs were evaluated by comparison to the commissioned percent depth dose values using a custom Excel spreadsheet. After determining the uncertainty of the DVH based on these variables, multiple second-check calculations were performed using the MIM Maestro and Matlab software packages. The uncertainties of the DVHs were shown to be less than ± 3%. The average uncertainty was shown to be less than ± 1%. The second-check procedures resulted in mean percent differences less than 1%, which confirms the accuracy of DVH calculation in Pinnacle and the effectiveness of the quality assurance template. The importance of knowing the limits of accuracy of the DVHs, which are routinely used to assess the quality of clinical treatment plans, cannot be overstated. The developed comprehensive QA procedure evaluating the accuracy of the DVH statistical analysis will become a part of our clinical arsenal for periodic tests of the treatment planning system. 
It will also be performed at the time of commissioning and after any maj (open full item for complete abstract)
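    The cumulative DVH being checked here has a simple definition: for each dose level, the fraction of the structure's volume receiving at least that dose. A minimal sketch (hypothetical voxel data; not the thesis's Pinnacle/Excel workflow):

    ```python
    import numpy as np

    def cumulative_dvh(dose_voxels, dose_axis):
        """Cumulative DVH: fraction of the structure's volume receiving
        at least each dose level. Generic sketch; a QA procedure would
        compare such curves against commissioned reference dose data."""
        d = np.asarray(dose_voxels, dtype=float)
        return np.array([(d >= level).mean() for level in dose_axis])

    # Hypothetical phantom: uniform 2 Gy dose with a 5% cold spot at 1 Gy.
    dose = np.array([2.0] * 95 + [1.0] * 5)
    axis = np.array([0.0, 1.0, 2.0, 3.0])
    dvh = cumulative_dvh(dose, axis)   # volume fractions at each dose level
    ```

    Because the curve is built by counting voxels against the dose grid, the dose grid size and slice thickness varied in this work directly set the resolution, and hence the uncertainty, of the DVH.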

    Committee: Diana Shvydka Ph.D. (Committee Chair); E. Parsai Ph.D. (Committee Member); David Pearson Ph.D. (Committee Member) Subjects: Physics; Radiation; Radiology