Search Results

(Total results 81)

Search Report

  • 1. Achyutha, Shanmukha Murali SoC Security Verification Using Assertion-Based and Information Flow Tracking Techniques

    MS, University of Cincinnati, 2021, Engineering and Applied Science: Electrical Engineering

    A System-on-Chip (SoC) is an integrated circuit embedded in most electronic devices. It typically consists of a central processing unit (CPU) containing multiple cores, memory (RAM), input and output ports, and a communication fabric. Due to their wide range of applications, huge demand, and time-to-market constraints, the SoC development process (design, verification, fabrication, and testing) is often distributed over various companies and countries. Throughout the SoC development process, several security vulnerabilities can be introduced. These vulnerabilities can arise from design bugs in the functional blocks, malicious intrusions in third-party intellectual property (IP) cores, and manufacturing defects in the fabrication process. When triggered during operation, vulnerabilities can lead to several undesirable outcomes, such as leaking sensitive information and denial of service. A triggered vulnerability is difficult to identify in both pre-silicon verification and post-silicon validation, because it can take many clock cycles for its effect on functionality to manifest at an observable port. Hence, it is necessary to verify the security policies that an SoC should enforce to mitigate vulnerabilities. These policies should be represented in formal notations and verified using static methods to ensure that the design adheres to them. This thesis presents three different approaches to identifying vulnerabilities in hardware (SoC) designs. The first is to develop a library of parameterized assertions for a catalog of security policies identified from the literature. For a given SoC design, assertions are instantiated from the library with the actual signals of the architecture. Certain security policies related to information flow, such as confidentiality, integrity, and availability, cannot be verified directly by the assertion-based methodology. So, two novel Information Flow Tracking Models (IFTMs) are developed to verify such security policies (open full item for complete abstract)

    Committee: Ranganadha Vemuri Ph.D. (Committee Chair); Carla Purdy (Committee Member); Wen-Ben Jone Ph.D. (Committee Member) Subjects: Electrical Engineering
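The taint-style intuition behind information flow tracking can be sketched in Python. This is a toy model with invented names, not the thesis's IFTMs, which instrument RTL signals rather than program values:

```python
# Toy information-flow tracking: each value carries a taint bit that
# propagates through operations; a policy check flags any tainted
# (secret-derived) value reaching an observable port.

class Tainted:
    def __init__(self, value, tainted=False):
        self.value = value
        self.tainted = tainted

    def __add__(self, other):
        # The result is tainted if either operand is tainted.
        return Tainted(self.value + other.value,
                       self.tainted or other.tainted)

    def __and__(self, other):
        return Tainted(self.value & other.value,
                       self.tainted or other.tainted)

def check_confidentiality(port_value):
    """Policy: no secret-derived data on an observable port."""
    return not port_value.tainted

secret_key = Tainted(0xDEADBEEF, tainted=True)
public_in  = Tainted(0x1234)

mixed = secret_key & public_in   # derived from the secret -> tainted
clean = public_in + Tainted(1)   # independent of the secret

print(check_confidentiality(clean))   # True: safe to expose
print(check_confidentiality(mixed))   # False: policy violation
```

A hardware IFT model does the same bookkeeping per signal and clock cycle, which is what lets it verify confidentiality and integrity policies that plain functional assertions cannot express.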
  • 2. Qiang, Qiang FORMAL: A SEQUENTIAL ATPG-BASED BOUNDED MODEL CHECKING SYSTEM FOR VLSI CIRCUITS

    Doctor of Philosophy, Case Western Reserve University, 2006, Computer Engineering

    Bounded Model Checking (BMC) is a formal method for verifying Very Large Scale Integrated (VLSI) circuits. It shows violation of a given circuit property by finding a counter-example to the property along bounded state paths of the circuit. The BMC problem is inherently NP-complete and is traditionally formulated as a Boolean SATisfiability (SAT) problem, which is subsequently solved by a SAT solver. Automatic Test Pattern Generation (ATPG), as an alternative to SAT, has already been shown to be an effective solution to NP-complete problems in many computer-aided design areas. In the field of BMC, ATPG has achieved promising results for simple properties; its effectiveness for more complicated nested properties, however, remains unknown. This thesis presents the first systematic framework for ATPG-based BMC capable of checking properties in all nested forms at the gate level. The negation of a property is mapped into a structural monitor, which is tailored to a flattened model of the input circuit. A target fault is then injected at the monitor output, and a modified ATPG-based state justification algorithm is used to search for a test for this fault. Finding such a test corresponds to formally establishing the property. The framework can easily incorporate any existing ATPG tool with little modification. The proposed framework has been implemented in a computer program called FORMAL and has been used to check a comprehensive set of properties of the GL85 microprocessor and USB 2.0 circuits. Experimental results show that the ATPG-based approach performs better in both capacity and efficiency than SAT-based techniques, especially for large bounds and for properties that require a large search space. ATPG-based BMC has therefore been demonstrated to be an effective supplement to SAT-based BMC in VLSI circuit verification.

    Committee: Daniel Saab (Advisor) Subjects:
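The core BMC idea, searching bounded state paths for a counterexample, can be shown with a minimal explicit-state sketch in Python. This is only the unrolling concept; FORMAL performs ATPG-based search on gate-level circuits, and SAT-based BMC encodes the same unrolling as CNF:

```python
# Minimal bounded model checking by explicit path unrolling: search for
# a counterexample to a safety property within k transition steps.

def bmc(init, transitions, is_bad, k):
    """Return a counterexample path of length <= k, or None."""
    frontier = [[init]]
    for _ in range(k + 1):
        next_frontier = []
        for path in frontier:
            state = path[-1]
            if is_bad(state):
                return path            # property violated on this path
            for nxt in transitions(state):
                next_frontier.append(path + [nxt])
        frontier = next_frontier
    return None                        # no violation within the bound

# Tiny example: a 3-bit counter; the safety property "never reach 5"
# is violated, and BMC returns the path that proves it.
trans = lambda s: [(s + 1) % 8]
cex = bmc(0, trans, lambda s: s == 5, k=10)
print(cex)  # [0, 1, 2, 3, 4, 5] -- counterexample found at depth 5
```

Note that a `None` result only shows the property holds up to the bound k, which is exactly the "bounded" caveat of BMC.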
  • 3. Johnston, Geoffrey Comparison of Vertical Misfit Between Pattern Resin and Welded Titanium Used to Fabricate Complete-Arch Implant Verification Jigs

    Master of Science, The Ohio State University, 2017, Dentistry

    Purpose: The aim of this study was to compare marginal fit and 3-dimensional distortion of verification jigs fabricated on a four-implant model using either pattern resin (GC Pattern Resin) or a novel welded titanium framework (Dentweld). Materials and Methods: A single master edentulous model simulating four implants in an edentulous mandible at sites 21, 23, 26, and 28 (Zimmer Biomet) was used to fabricate each sample. Verification jigs were fabricated either from reluted, sectioned pattern resin (GC Pattern Resin) or from titanium with use of the Titanium Welder (Dentweld). Five samples (n=5) were fabricated for each material and method. After a single screw was torqued to 15 Ncm at implant site #28, all samples were scanned with an industrial computed tomography (CT) scanner (Nikon / X-Tek XT H 225kV MCT Micro-Focus). Polygonal mesh models (STL) were reconstructed from the direct CT dataset and transferred to volume and graphics analysis software (Polyworks, Innovmetric), from which all measurements were extracted prior to statistical analysis. 3D gap measurements were generated for all coping/abutment mating surfaces, and color maps were created to show gap size between mating surfaces using +/- 0.250 mm color scale ranges. 3D gap measurements were reported as both the mean and the maximum gap at each abutment/coping interface. The LIFEREG procedure (SAS) was used to fit the gap distance values to a normal distribution model for the experimental design and a scale parameter. Interval-censored data were reported in tables as no measurable gap (NMG). Results: When mean or maximum vertical gap data were analyzed, a statistically significant difference in mean vertical gap was found between the methods used to create the verification jigs (p < 0.0001). No statistical significance was shown for other trends examined in the data. Conclusions: Within the limitations of this in vitro study, the following conclusion (open full item for complete abstract)

    Committee: Edwin McGlumphy DDS, MS (Advisor); William Johnston PhD (Committee Member); Burak Yilmaz DDS, PhD (Committee Member) Subjects: Dentistry; Health Care
  • 4. Matias, Matthew Program Verification of FreeRTOS using Microsoft Dafny

    Master of Science in Software Engineering, Cleveland State University, 2014, Washkewicz College of Engineering

    FreeRTOS is a popular real-time and embedded operating system. Real-time software requires code reviews, software tests, and other quality assurance activities to ensure minimal defects. This free and open-source operating system has claims of robustness and quality [26]. Real-time and embedded software is commonly found in systems directly impacting human life and requires a low defect rate. In such critical software, traditional quality assurance may not suffice to minimize software defects. When traditional software quality assurance is not enough for defect removal, formal methods from software engineering may help. A formal method such as program verification is useful for proving correctness in real-time software. Microsoft Research created Dafny for proving program correctness. It contains a programming language with specification constructs. A program verification tool such as Dafny allows for proving the correctness of FreeRTOS's modules. We propose using Dafny to verify the correctness of FreeRTOS's scheduler and supporting API.

    Committee: Nigamanth Sridhar PhD (Committee Chair); Yongjian Fu PhD (Committee Member); Chansu Yu PhD (Committee Member) Subjects: Computer Science
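Dafny proves pre- and postconditions statically at compile time; as a loose runtime analogue in Python, the same contract style for a scheduler's "pick next task" operation can be written with assertions. The task model here is hypothetical, not FreeRTOS's actual data structures:

```python
# Contract-style sketch of a priority scheduler's selection step.
# A Dafny version would state these as `requires` / `ensures` clauses
# and discharge them statically; here they are runtime assertions.

def pick_next(ready):
    # Precondition: at least one task is ready to run.
    assert len(ready) > 0, "precondition: ready set must be non-empty"
    chosen = max(ready, key=lambda t: t["priority"])
    # Postcondition: the chosen task has maximal priority among ready tasks.
    assert all(chosen["priority"] >= t["priority"] for t in ready)
    return chosen

tasks = [{"name": "idle",   "priority": 0},
         {"name": "sensor", "priority": 2},
         {"name": "uart",   "priority": 1}]
print(pick_next(tasks)["name"])  # sensor
```

The value of the static approach is that such contracts are proved for all inputs, not merely checked on the inputs a test happens to exercise.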
  • 5. Ingraham, Daniel Verification of a Computational Aeroacoustics Code Using External Verification Analysis (EVA)

    Master of Science in Mechanical Engineering, University of Toledo, 2010, College of Engineering

    As Computational Aeroacoustics (CAA) codes become more complex and widely used, robust Verification of such codes becomes more and more important. Recently, Hixon et al. proposed a variation of the Method of Manufactured Solutions of Roache especially suited for Verifying unsteady CFD and CAA codes that does not require the generation of source terms or any modification of the code being Verified. This work presents the development of the External Verification Analysis (EVA) method and the results of its application to some popular model equations of CFD/CAA and a high-order nonlinear CAA code.

    Committee: Ray Hixon PhD (Committee Chair); Douglas Oliver PhD (Committee Member); Chunhua Sheng PhD (Committee Member) Subjects: Acoustics; Mechanical Engineering
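A staple of code Verification in this vein, though not the EVA method itself, is checking the observed order of accuracy of a discretization against an exact solution. A minimal Python sketch:

```python
import math

# Code-verification check: measure the observed order of accuracy of a
# second-order central difference against the exact derivative of sin,
# and confirm it matches the scheme's formal order as the grid refines.

def central_diff_error(n):
    """Max error of the central difference of sin on (0, pi), n intervals."""
    h = math.pi / n
    xs = [i * h for i in range(1, n)]   # interior points only
    return max(abs((math.sin(x + h) - math.sin(x - h)) / (2 * h)
                   - math.cos(x)) for x in xs)

e1, e2 = central_diff_error(100), central_diff_error(200)
order = math.log(e1 / e2) / math.log(2)   # halving h should quarter the error
print(round(order, 2))  # ~2.0 for a second-order scheme
```

If the observed order falls short of the formal order, the discretization (or its implementation) has a bug, which is the kind of defect Verification is meant to expose.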
  • 6. Srinivasan, Suriya Efficient Trojan and Error Detection and Localization Techniques using Assertion Based Verification

    PhD, University of Cincinnati, 2024, Engineering and Applied Science: Computer Science and Engineering

    Hardware security threats can be introduced at various stages of the System-on-Chip design life cycle. These threats manifest in the design as unwarranted design modifications and malicious backdoor insertions, which result in critical information leakage or denial of service. Thus, it is essential to verify these complex SoC designs at various stages of their life cycles. Verifying such complex designs is a critical and laborious task and consumes 70% of the design effort. Formal verification techniques offer significant advantages over traditional functional verification strategies by using mathematical principles to identify critical corner cases that functional methods often miss. When a formal property fails for a given design, a counterexample is produced, providing evidence of the failure. This counterexample includes a sequence of inputs that can be converted into test vectors for use in post-silicon verification. Despite offering an automated framework for bug detection, formal methods do not automatically pinpoint the exact location of these bugs or security vulnerabilities within the design. Formal verification engineers still need to manually analyze the counterexample to identify the precise location of the vulnerability and the reason behind the assertion failure. Some SAT-based techniques attempt to localize bugs at the gate level of the System-on-Chip design by introducing additional test point hardware, such as multiplexers. However, these techniques are inefficient for complex SoC designs because the designs must be represented in conjunctive normal form, leading to significant compatibility issues. Thus, we gravitate toward model checkers, which offer a simpler approach to property verification. This research intends to develop tools that leverage model checkers to localize a variety of security violations that fail security assertions and develop a wide range of test vectors to det (open full item for complete abstract)

    Committee: Ranganadha Vemuri Ph.D. (Committee Chair); Wen-Ben Jone Ph.D. (Committee Member); Suyuan Chen Ph.D. (Committee Member); Boyang Wang Ph.D. (Committee Member); John Emmert Ph.D. (Committee Member) Subjects: Engineering
  • 7. Bhamidipati, Padmaja Security Assurance In SoCs and NoCs: Analysis, Specification, and Quantification

    PhD, University of Cincinnati, 2024, Engineering and Applied Science: Electrical Engineering

    Modern heterogeneous architectures contain multiple cores that perform a variety of processing, storage, and communication tasks. The complexity of interactions among the cores and of the cores themselves introduces potential security vulnerabilities that can be exploited by malicious actors to mount a variety of attacks. To address these vulnerabilities, it is critical to conduct systematic security analysis, enforce relevant security policies, and verify designs through formal methods before fabrication. However, current SoC designs require a time-consuming and resource-intensive process to identify and verify security assets against applicable security policies. This gap highlights the need for efficient abstraction techniques that streamline the specification and verification of security policies, reducing both the verification cost and design overhead. As these complex architectures rely on information transfer between the cores, the significance of a well-established interconnect such as Network-on-Chip (NoC) is paramount. NoC architectures have gained prominence in modern computing systems for their scalability and efficiency. However, the globalization of NoC design and fabrication exposes them to security threats. The hardware resources shared between secure and malicious IPs in a NoC create vulnerabilities that attackers exploit to implement explicit and implicit data leakages. Quantitative analysis plays an important role in exposing vulnerabilities by quantifying packet information and traffic flow across components and pathways. It uses numerical data and mathematical models to understand complex systems, revealing patterns and anomalies through quantitative methods. This dissertation introduces a comprehensive methodology to address the challenges associated with SoC and NoC security. 
First, we propose a systematic approach for security analysis using Assertion-Based Verification (ABV), focusing on cataloging SoC vulnerabilities and d (open full item for complete abstract)

    Committee: Ranganadha Vemuri Ph.D. (Committee Chair); Wen-Ben Jone Ph.D. (Committee Member); Suyuan Chen Ph.D. (Committee Member); Mike Borowczak Ph.D. (Committee Member); John Emmert Ph.D. (Committee Member) Subjects: Electrical Engineering
  • 8. Alatoun, Khitam Efficient Techniques for SoC Security Validation and Monitoring

    PhD, University of Cincinnati, 2024, Engineering and Applied Science: Computer Science and Engineering

    Embedded systems are commonly described as the powerhouses that drive a wide range of modern devices. The landscape of embedded systems has undergone significant transformation since the advent of System-on-Chip (SoC) technology. Throughout the life cycle of modern SoC designs, various hardware security threats can be introduced. These security vulnerabilities, when exercised during SoC operation, can lead to denial of service or disclosure of protected data. Hence, it is essential to undertake security validation before and after SoC fabrication and make provisions for continuous security assessment during operation. Assertion-based formal methods have been used to verify hardware designs. These methods use temporal logic to mathematically verify design properties. However, various security policies such as confidentiality, integrity, timing side-channels, and power side-channels cannot be verified directly with traditional formal techniques. Additionally, current specification languages primarily focus on functional correctness and do not provide formal assurances for a wide range of security policies. Furthermore, critical design policies not only necessitate robust verification methods but also require continuous assessment during operation. Assertion-based runtime monitoring is a commonly used method to convert assertions into synthesizable hardware monitors. These monitors can be embedded into the design to detect property violations during runtime. However, an efficient method is required to generate optimized hardware runtime monitors from a security specification language. To design a trusted SoC, it is important to provide a methodology to specify, verify, and monitor security policies. This research intends to develop tools that help to identify potential security vulnerabilities during SoC design and operation. 
We developed several methods to verify information flow, timing side-channel and power side-channel properties, they collabora (open full item for complete abstract)

    Committee: Ranganadha Vemuri Ph.D. (Committee Chair); John Emmert Ph.D. (Committee Member); Wen-Ben Jone Ph.D. (Committee Member); Rashmi Jha Ph.D. (Committee Member); Mike Borowczak Ph.D M.A B.A. (Committee Member) Subjects: Computer Engineering
  • 9. Taylor, Max Trustworthy UAS: A Holistic Approach

    Doctor of Philosophy, The Ohio State University, 2024, Computer Science and Engineering

    Unmanned Aerial Systems (UAS) are increasingly important. Farmers monitor crops and apply pesticides with UAS. First responders use UAS in applications ranging from firefighting to search and rescue operations. There is potential for rapid shopping delivery by UAS. In all these applications, UAS work closely alongside humans. Onboard firmware controls the behavior of UAS. This dissertation studies ways to improve the quality of firmware. We start by presenting the first large-scale analysis of software defects ("bugs") reported in open-source UAS firmware. We examine nearly 300 reported bugs in the two most popular open-source systems (ArduPilot and PX4) and categorize the defects. Motivated by our findings, we propose three technologies to automate the detection and repair of UAS bugs. First, Avis seeks to automatically diagnose sensor bugs caused by misusing onboard sensors. Second, SA4U identifies unit type errors caused by incorrectly mixing values with different physical unit types (e.g., meters and minutes) in a computation. Finally, Scalpel automatically repairs bugs found by SA4U. Deep learning is increasingly used to provide advanced autonomous behavior for UAS. To support higher-quality deep learning systems, we propose checkd. Checkd automates checkpoint/restore policy configurations. Underlying checkd's contribution is the thesis that better-tuned models yield better behavior. Checkd helps practitioners fine-tune models by reducing the overall cost to train.

    Committee: Feng Qin (Advisor); Michael Bond (Committee Member); Christopher Stewart (Committee Member) Subjects: Computer Science
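The unit-type errors SA4U targets can be illustrated with a small runtime sketch in Python, where values carry physical units and incompatible mixes raise an error. This is a toy model with invented names; SA4U itself detects such errors statically in firmware code:

```python
# Toy unit-type checking: a value tagged with its physical unit.
# Adding quantities with mismatched units (e.g., meters + minutes)
# raises an error instead of silently producing a nonsense number.

class Quantity:
    def __init__(self, value, unit):
        self.value, self.unit = value, unit

    def __add__(self, other):
        if self.unit != other.unit:
            raise TypeError(f"cannot add {self.unit} to {other.unit}")
        return Quantity(self.value + other.value, self.unit)

altitude = Quantity(120.0, "m")
climb    = Quantity(5.0, "m")
print((altitude + climb).value)      # 125.0

try:
    altitude + Quantity(3.0, "min")  # meters + minutes: unit error
except TypeError as e:
    print(e)
```

Static detection is preferable for firmware since the bug is caught before flight rather than mid-computation, but the failure mode being prevented is the same.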
  • 10. Sedari Mudiyanselage, Achira Identity Instability: An Exploration of Self-Verification Failure

    PhD, University of Cincinnati, 2023, Business: Business Administration

    In this study, I examined the effect of exogenous 'shocks' on role identity by analyzing the experiences of teachers during the COVID-19 pandemic. Using grounded theory, I conducted and analyzed 52 one-on-one interviews with teachers in multiple settings to understand the lived experiences of individuals directly affected by the event. Through my findings, I introduce the idea of a relational web to illustrate the relational co-construction of a role identity and how it contributes to its maintenance. The findings show how the disruption of identity-sustaining relationships can lead to "identity instability", a construct that represents the state in which the meanings ascribed to an identity by the identity holder are no longer verified through reflected appraisals from society. I also highlight how identity instability can trigger a process of appraisal, and how extended exposure to the precipitating event can affect the consistency of said appraisals.

    Committee: Elaine Hollensbe Ph.D. (Committee Chair); Heather Vough Ph.D. (Committee Member); Linda Plevyak Ph.D. (Committee Member); Eli Awtrey (Committee Member) Subjects: Management
  • 11. Bagnall, Alexander Formally Verified Samplers From Discrete Probabilistic Programs

    Doctor of Philosophy (PhD), Ohio University, 2023, Electrical Engineering & Computer Science (Engineering and Technology)

    This dissertation presents Zar: a formally verified compilation pipeline from discrete probabilistic programs in the conditional probabilistic guarded command language (cpGCL) to proved-correct executable samplers in the random bit model. Zar exploits the key idea that discrete probability distributions can be reduced to unbiased coin-flipping schemes. The compiler pipeline first translates cpGCL programs into choice-fix trees, an intermediate representation suitable for reduction of biased probabilistic choices. Choice-fix trees are then translated to coinductive interaction trees for execution within the random bit model. The correctness of the composed translations establishes the sampling equidistribution theorem: compiled samplers are correct with respect to the conditional weakest pre-expectation (cwp) semantics of their cpGCL source programs. Zar is implemented and fully verified in the Coq proof assistant. We extract verified samplers to OCaml and Python and empirically validate them on a number of illustrative examples. We additionally present AlgCo (Algebraic Coinductives), a practical framework for inductive reasoning over coinductive types such as conats, streams, and infinitary trees with finite branching factor, developed during the course of this work to enable convenient formal reasoning for coinductive samplers generated by Zar. The key idea is to exploit the notion of algebraic CPO from domain theory to define continuous operations over coinductive types via primitive recursion on "dense" collections of their elements, enabling a convenient strategy for reasoning about algebraic coinductives by straightforward proofs by induction. 
We implement the AlgCo library in Coq and demonstrate its utility by verifying a stream variant of the sieve of Eratosthenes, a regular expression library based on coinductive tries, and weakest pre-expectation semantics for potentially nonterminating sampling processes over discrete probability distributions in the r (open full item for complete abstract)

    Committee: David Juedes (Advisor); James Stewart (Committee Member); Vladimir Uspenskiy (Committee Member); Jundong Liu (Committee Member); Anindya Banerjee (Committee Member); David Chelberg (Committee Member) Subjects: Computer Science
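Zar's key idea, reducing discrete distributions to unbiased coin flips, has a classic concrete instance that can be sketched in Python: sampling Bernoulli(p) from a stream of fair bits by lazily comparing a uniform draw's binary digits to the binary expansion of p. This illustrates the random bit model only; it is not Zar's verified compiler:

```python
import random

# Sample Bernoulli(p) using only fair coin flips: generate the bits of
# a uniform number lazily and compare them digit-by-digit against the
# binary expansion of p. Return 1 iff the uniform draw falls below p.
# Terminates with probability 1 (each step halts with probability 1/2).

def bernoulli_from_fair_coin(p, flip=lambda: random.getrandbits(1)):
    """Return 1 with probability p, consuming one fair bit per digit."""
    while True:
        p *= 2
        digit, p = (1, p - 1) if p >= 1 else (0, p)  # next digit of p
        bit = flip()
        if bit < digit:
            return 1          # uniform draw < p: success
        if bit > digit:
            return 0          # uniform draw > p: failure
        # bits equal so far: recurse on the next binary digit

random.seed(0)
draws = [bernoulli_from_fair_coin(0.25) for _ in range(10000)]
print(sum(draws) / 10000)     # empirical frequency, close to 0.25
```

Zar generalizes this reduction to whole cpGCL programs and proves, rather than tests, that the compiled sampler's output distribution matches the source semantics.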
  • 12. Daughety, Nathan Design and analysis of a trustworthy, Cross Domain Solution architecture

    PhD, University of Cincinnati, 2022, Engineering and Applied Science: Computer Science and Engineering

    With the paradigm shift to cloud-based operations, reliable and secure access to and transfer of data between differing security domains has never been more essential. A Cross Domain Solution (CDS) is a guarded interface which serves to execute the secure access and/or transfer of data between isolated and/or differing security domains defined by an administrative security policy. Cross domain security requires trustworthiness at the confluence of the hardware and software components which implement a security policy. Security components must be relied upon to defend information assurance against widely encompassing threats -- consider insider threats and nation-state threat actors, which can be both onsite and offsite. Current implementations of CDS systems use sub-optimal Trusted Computing Bases (TCB) without any formal verification proofs, confirming the gap between blind trust and trustworthiness. Moreover, most CDSs are exclusively operated by Department of Defense agencies and are not readily available to the commercial sectors, nor are they available for independent security verification. Still other CDSs are only usable in physically isolated environments such as Sensitive Compartmented Information Facilities and are inconsistent with the paradigm shift to cloud environments. Our purpose is to address the question of how trustworthiness can be implemented in a remotely deployable CDS that also supports availability and accessibility to all sectors. In this paper, we present a novel CDS system architecture which is the first to use a formally verified TCB. Additionally, our CDS model is the first of its kind to utilize a computation-isolation approach which allows our CDS to be remotely deployable for use in cloud-based solutions.

    Committee: John Franco Ph.D. (Committee Member); John Emmert Ph.D. (Committee Member); Marcus Dwan Pendleton Ph.D. (Committee Member); Nan Niu Ph.D. (Committee Member); Rashmi Jha Ph.D. (Committee Member) Subjects: Computer Science
  • 13. Karumanchi, Aditya Comparing Dynamic System Models with Additive Uncertainty

    Doctor of Philosophy, The Ohio State University, 2022, Mechanical Engineering

    Due to the complexity of the operational design domain of Automated Driving Systems, the industry is trending toward simulation-based methods for their verification and validation (V&V), which rely on models of the vehicles, sensors, vehicle environments, etc. Depending on testing requirements, computational capabilities, modeling effort, and other such factors, these models can vary in fidelity. However, this variation in fidelity affects the excitation of the control systems under test and can therefore affect the results of the tests themselves. Moreover, since every model is an approximation of the actual physical system it represents, there is uncertainty associated with its output. Therefore, we need to be able to compare uncertain system models in order to understand the effect of model fidelity variation on test accuracy. Existing metrics such as Hankel singular values compare the asymptotic behavior of models, whereas simulation studies run over finite time. Although some of these metrics may be applied over finite time, they rely on hyper-parameters like weights on time or frequency. In this study, we propose an approach, based on the literature, for computing a (pseudo)metric for comparing the predictive performance of two models, called the finite-time Kullback-Leibler (KL) rate. For any two general state space models with general additive uncertainty, we first discuss an approach for propagating an additive uncertainty (represented as a Gaussian Mixture Model (GMM)) through a linear time-invariant system. We then apply this propagation approach to linear representations of nonlinear systems obtained through Dynamic Mode Decomposition (DMD). We illustrate this combined approach by comparing two lateral vehicle dynamics models over an obstacle avoidance maneuver to measure the effect of fidelity on the predictive performance of each model. 
    We also apply this to a V&V problem, wherein we compare (open full item for complete abstract)

    Committee: Punit Tulpule (Advisor); Shawn Midlam-Mohler (Advisor); Marcello Canova (Committee Member) Subjects: Mechanical Engineering
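A drastically simplified scalar version of the comparison pipeline can be sketched in Python: propagate a Gaussian through two slightly different linear models and compute the KL divergence between the predicted distributions. This is my own one-dimensional simplification, not the dissertation's GMM/DMD machinery or its finite-time KL rate:

```python
import math

# Propagate N(mean, var) through the scalar LTI system x' = a*x + w,
# w ~ N(0, q), then compare two models' predictions via the closed-form
# KL divergence between 1-D Gaussians.

def step(mean, var, a, q):
    """One step of x' = a*x + w, w ~ N(0, q)."""
    return a * mean, a * a * var + q

def kl_gauss(m0, v0, m1, v1):
    """KL( N(m0, v0) || N(m1, v1) ) in closed form."""
    return 0.5 * (v0 / v1 + (m1 - m0) ** 2 / v1 - 1.0 + math.log(v1 / v0))

# Two models of the same plant, differing slightly in their dynamics.
mA, vA = 1.0, 0.1
mB, vB = 1.0, 0.1
for _ in range(5):
    mA, vA = step(mA, vA, a=0.9,  q=0.01)
    mB, vB = step(mB, vB, a=0.85, q=0.01)

print(round(kl_gauss(mA, vA, mB, vB), 4))  # divergence after 5 steps
```

The divergence grows with the horizon, which is the intuition behind comparing models over a finite simulation window rather than asymptotically.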
  • 14. Lou, Shuangsheng Deployment and Integrity Verification of Streaming IoT Applications on Edge Computing

    Master of Science, The Ohio State University, 2022, Computer Science and Engineering

    In recent years, there has been considerable interest in developing streaming applications for IoT (or Edge Computing) environments. In this context, several studies have (manually) deployed application components on different nodes in the path from the extreme edge to the cloud. It is desirable to automate this mapping process. However, when considering this problem in the context of heterogeneous multi-layer wireless networks, we see challenges like limited computing and battery power at the extreme edge, modest transmission bandwidth, and different processing powers for different nodes. Automatic deployment or partitioning for streaming applications under these challenges has not been addressed in previous work. In the first chapter, a framework for automated deployment is presented with an emphasis on optimizing latency in the presence of resource constraints. A dynamic programming-based deployment algorithm is developed to make deployment decisions. With battery power being a key constraint, a major component of our work is a power model to help assess the power consumption of the edge devices at runtime. Using three applications, we show large reductions in both power consumption and response latency with our framework, as compared to a baseline involving cloud-only execution. At the same time, such an edge or fog processing model is increasingly being used for critical applications, often in environments where devices can be compromised. The second chapter considers a number of attacks that can negatively impact streaming IoT applications and develops solutions to verify the integrity of the applications. We focus on premature code execution and overwriting of device-generated outputs, and propose a minimalist logging scheme for periodic verification with the use of hash chains and a Merkle Hash Tree (MHT).

    Committee: Feng Qin (Committee Member); Mircea-Radu Teodorescu (Advisor) Subjects: Computer Science
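The integrity-logging idea rests on hash chaining, which is easy to sketch in Python. This is a minimal illustration of the mechanism, not the paper's exact scheme:

```python
import hashlib

# Minimal hash-chain log: each entry's hash commits to all previous
# entries, so tampering with any earlier output changes the chain head
# and is detected at verification time.

def chain(entries):
    h = b"\x00" * 32                      # genesis value
    for e in entries:
        h = hashlib.sha256(h + e).digest()
    return h

log = [b"reading:21.5", b"reading:21.7", b"reading:22.0"]
head = chain(log)                         # stored/reported head

tampered = [b"reading:21.5", b"reading:99.9", b"reading:22.0"]
print(chain(tampered) == head)  # False: tampering detected
print(chain(log) == head)       # True: untampered log verifies
```

A Merkle Hash Tree arranges the same commitments as a tree, so a verifier can check any single entry with only a logarithmic number of hashes instead of replaying the whole chain.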
  • 15. Weide, Alan Contributions to Formal Specification and Modular Verification of Parallel and Sequential Software

    Doctor of Philosophy, The Ohio State University, 2021, Computer Science and Engineering

    Modular verification of parallel and concurrent software built from reusable data abstractions is a challenging problem. Reasoning about sequential software can be modularized using the specifications of data abstractions, but the need to consider implementation details complicates reasoning about parallel execution. Addressing this challenge requires advancing the state of the art in several ways, beginning with a theoretical foundation. The A/P Calculus for describing the effects of program actions is developed in this dissertation to enable sound modular reasoning about parallel programs with non-interfering parallel sections of operation calls on abstract data types. Building on the calculus and a programming language with clean semantics, a methodology for designing decomposable data abstractions is presented to produce fork-join parallel programs that are manifestly data race free and readily amenable to modular reasoning. A new specification construct, the non-interference contract, is proposed to enhance the specification of data abstractions to hide implementation details and yet facilitate modular reasoning about parallel programs that share objects among processes. As a key first step to transition these results to practice, this dissertation describes Clean++, a discipline for writing software in C++ that leverages move semantics to make ownership transfer the primary data movement operation (as opposed to either deep or shallow copying) and produce programs that are amenable to formal verification with only minimal scaffolding related to pointer manipulation.

    Committee: Paolo A.G. Sivilotti (Advisor); Murali Sitaraman (Committee Member); Neelam Soundarajan (Committee Member); Michael Bond (Committee Member) Subjects: Computer Science
  • 16. Bentley, John Implementing a Resolve Online Prover Using Z3

    Master of Science, The Ohio State University, 2021, Computer Science and Engineering

    The Resolve Online development environment currently supports multiple provers: Dafny, SplitDecision, and variations and combinations thereof. The existing provers used by Resolve Online have room for improvement. One downside of the Dafny prover is that it generates Dafny programs to discharge verification conditions, tying it closely to the specific version of the Dafny compiler used. This approach is brittle because new releases of Dafny tend to break the backend prover, making it difficult for the Resolve Online tool to leverage improvements in the Dafny language. Other areas of potential improvement include the number of verification conditions that are automatically discharged and the execution time of the provers. Z3 is a modern theorem prover developed at Microsoft Research. First released in 2012, Z3 is under active development, with 3 releases in the last year at the time of writing and frequent commits. Z3 is used by Boogie, F*, Pex, and more, as well as by tools that use Z3 indirectly through Boogie. This ongoing development could lead to future improvements that will allow for better verification of Resolve programs. To leverage Z3 in Resolve Online, we have extended Resolve Online by creating a new prover backend that uses Z3. A benchmark of Resolve code consisting of 30 examples was used. Each example was run with the three provers: the new Z3 backend, Dafny, and SplitDecision. In all examples, Z3 proved at least as many VCs as Dafny, and in 28 examples, Z3 proved at least as many VCs as SplitDecision. There were 20 examples where Z3 proved more VCs than Dafny, and 11 where it proved more than SplitDecision. A smaller benchmark consisting of 8 complex examples from the full benchmark was selected for analyzing the execution time of the provers. In most cases, Dafny takes substantially longer than SplitDecision and Z3. SplitDecision and Z3 have similar execution times. 
    Either on (open full item for complete abstract)
Either on (open full item for complete abstract)
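
    A prover backend along these lines typically encodes each verification condition as an SMT query for Z3. The sketch below is purely illustrative (the function name `emit_vc` and the encoding are assumptions, not Resolve Online's actual API): it builds an SMT-LIB script that asserts the negation of a VC, so a Z3 result of `unsat` would mean the VC is valid.

```python
# Hypothetical sketch of a prover backend: encode a verification
# condition (pre => wp) as an SMT-LIB script for Z3 to discharge.

def emit_vc(pre: str, wp: str, decls: list) -> str:
    """Build an SMT-LIB script asserting the negation of pre => wp.

    If Z3 reports `unsat`, the verification condition is valid.
    """
    lines = ["(set-logic QF_LIA)"]
    lines += ["(declare-const %s Int)" % name for name in decls]
    # Validity of (pre => wp) is checked by showing its negation unsatisfiable.
    lines.append("(assert (and %s (not %s)))" % (pre, wp))
    lines.append("(check-sat)")
    return "\n".join(lines)

# VC for `x := x + 1` with precondition x >= 0 and postcondition x >= 1:
script = emit_vc("(>= x 0)", "(>= (+ x 1) 1)", ["x"])
print(script)
```

    The generated script could then be piped to a `z3` process; decoupling the backend behind a stable textual format like SMT-LIB is exactly what avoids the version-coupling problem described for the Dafny backend.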

    Committee: Neelam Soundarajan (Committee Member); Paolo Sivilotti (Advisor) Subjects: Computer Science
  • 17. Shankaranarayanan, Bharath Assertion-Based Monitors for Run-time Security Validation

    MS, University of Cincinnati, 2021, Engineering and Applied Science: Electrical Engineering

    A modern hardware processor consists of many modules integrated into a System-on-Chip (SoC). Sensitive modules must be protected against malicious attacks. Most high-performance processing machines used in security assurance systems are produced and assembled abroad. With enough resources, an attacker could maliciously modify a general-purpose processor at many stages of the acquisition chain, from design and manufacturing to assembly and transport. Such altered processors might find their way into high-security systems. Security validation is one method to ensure that a future attack can be thwarted before it manifests. Pre-silicon verification validates designs in a virtual environment using simulation, emulation, and formal verification methods at the design level. Assertion-based verification is a widely used formal verification technique; assertions ensure functional correctness during the design and verification phases. Extending assertion-based verification to functional testing during silicon bring-up requires translating these assertions into post-fabrication run-time security monitors. Such a security monitor, a hardware translation of the pre-fabrication security assertions, proves helpful in observing security vulnerabilities when embedded with the Design Under Test (DUT). This thesis presents a compiler for post-fabrication monitoring of assertions, implemented for SystemVerilog Assertions (SVA). The compiler's core purpose is to translate SVA into run-time Verilog monitors added to the DUT for post-silicon validation. The compiler has been developed in Python using the PLY (Python Lex-Yacc) framework. The generated monitors are verified for their ability to catch errors at run time. We have checked the compiler's robustness by applying numerous tests in a systematic progression from basic to complex input assertions. 
We have also analyzed the monitors' performance to show (open full item for complete abstract)
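
    To make the translation idea concrete, the toy sketch below (an assumption for illustration, not the thesis compiler, which uses a full PLY-based parser) turns one simple SVA pattern, `antecedent |-> ##1 consequent`, into a synthesizable Verilog monitor that raises an error flag when the consequent fails to hold one cycle after the antecedent.

```python
import re

# Illustrative SVA-to-monitor translation for a single assertion shape:
#   a |-> ##1 c   ("whenever a holds, c must hold one cycle later")

TEMPLATE = """module monitor(input clk, input {a}, input {c}, output reg error);
  reg pending;
  always @(posedge clk) begin
    if (pending && !{c}) error <= 1'b1;  // consequent missed one cycle later
    pending <= {a};                      // remember antecedent for next cycle
  end
endmodule"""

def sva_to_monitor(assertion: str) -> str:
    m = re.fullmatch(r"\s*(\w+)\s*\|->\s*##1\s*(\w+)\s*", assertion)
    if not m:
        raise ValueError("unsupported SVA form")
    return TEMPLATE.format(a=m.group(1), c=m.group(2))

verilog = sva_to_monitor("req |-> ##1 ack")
print(verilog)
```

    A real compiler of this kind must of course handle arbitrary sequence operators, repetition, and local variables, which is where a proper lexer/parser framework such as PLY earns its keep.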

    Committee: Ranganadha Vemuri Ph.D. (Committee Chair); Wen-Ben Jone Ph.D. (Committee Member); Carla Purdy Ph.D. (Committee Member) Subjects: Electrical Engineering
  • 18. Ndikom, Kyrian Use of Electronic Visit Verification System to reduce Time Banditry for Optimized Quality of Care in Home Health Care by Certified Nursing Assistants

    Doctor of Nursing Practice Degree Program in Population Health Leadership DNP, Xavier University, 2021, Nursing

    Time banditry is defined as the misuse of time on non-task-related activities during paid work; it is also considered "stealing time" (Martin et al., 2010). Notably, time banditry may happen in all employment sectors and in various forms. Each time bandit steals time for a marginally different reason; although the motives may differ between individuals, the concept is the same. This DNP project implemented an Electronic Visit Verification (EVV) system to replace the existing time management methods used by organizations. The goal was to facilitate consistency in the duties of CNAs who cared for homebound elderly residents and to reduce time banditry. EVV is computer-based software that enables real-time touchscreen smartphone contact between workers (CNAs) and managers (nursing supervisors). EVV functions as a medium to monitor home care workers' attendance, current client situation, data entry, and care plan coordination. CNAs log in upon arrival at the work location, indicating the start of work, and log out at the end of work upon completing their tasks. Clients are expected to countersign with the CNAs to confirm agreement with the time entry. Findings from patient surveys suggest that customers are satisfied with EVV use because CNAs arrive to work on time, stay for the entire scheduled shift, and complete their tasks. Stakeholders state that billing is more accurate and organizational growth looks promising.

    Committee: Susie R. Allen Ph.D., RN-BC (Advisor) Subjects: Health; Health Care; Health Care Management; Health Sciences; Nursing
  • 19. Obeidat, Nawar Using Formal Methods to Build and Validate Reliable and Secure Smart Systems via TLA+

    PhD, University of Cincinnati, 2021, Engineering and Applied Science: Computer Science and Engineering

    Verification and validation are very important in the design of any system; for safety-critical systems especially, the validation level is crucial. In this work, our eventual goal is to develop a robust procedure for designing smart, safe, secure systems. There are different methods for verification and validation, but formal methods are widely used to validate such systems due to the high level of confidence they provide in guaranteeing correctness, security, and safety properties. TLA+ is a formal specification language, and its TLC model checker is used in this work to formally verify and validate several example systems and to find security flaws. Designing smart systems has recently received much attention because of the ability to connect heterogeneous sub-systems into a fully controlled smart system. Here we propose models for three systems: a smart school building system, an ADS-B system for avionics, and a SCADA system. We model the systems using UML diagrams to describe objects, states, and sequences of actions, and we use formal methods techniques to verify the properties of the systems. TLA+ is used for formal modeling. For the smart school building system, the ADS-B system, and the SCADA system, we verify their properties and correctness using the TLA+ toolbox and the TLC model checker. We also use the TLC model checker to find example security bugs that we introduce into the designs.
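
    What TLC does under the hood is explicit-state model checking: enumerate all reachable states and check an invariant in each. The Python sketch below is an illustrative analogue of that search (it is not TLA+ or TLC, and the toy door/alarm model is an assumption, loosely echoing a smart-building property). A deliberately missing guard on the arming action lets the checker surface the bad state, much as TLC returns a counterexample for an injected security bug.

```python
from collections import deque

# Toy state: (door_open, armed). Safety invariant: never open while armed.

def next_states(state):
    door, armed = state
    yield (door, not armed)      # toggle arming (bug: no guard on door_open)
    if not armed:                # controller only opens/closes when disarmed
        yield (not door, armed)

def invariant(state):
    door, armed = state
    return not (door and armed)

def check(init):
    """Breadth-first search over reachable states; return a violating
    state if one exists, else None."""
    seen, frontier = {init}, deque([init])
    while frontier:
        s = frontier.popleft()
        if not invariant(s):
            return s             # counterexample state
        for t in next_states(s):
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return None

print(check((False, False)))     # reports the reachable bad state
```

    Guarding the arming action with `if not door` would make `check` return `None`; TLC additionally reconstructs the trace of actions leading to the violation, which is what makes it useful for diagnosing injected bugs.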

    Committee: Carla Purdy Ph.D. (Committee Chair); Shaaban Abdallah Ph.D. (Committee Member); Nan Niu Ph.D. (Committee Member); Massoud (Max) Rabiee Ph.D. (Committee Member); Boyang Wang (Committee Member) Subjects: Computer Engineering
  • 20. Carman, Benjamin Translating LaTeX to Coq: A Recurrent Neural Network Approach to Formalizing Natural Language Proofs

    Bachelor of Science (BS), Ohio University, 2021, Computer Science

    There is a strong desire to more easily formalize existing mathematical statements and develop machine-checked proofs that verify their validity. Doing this by hand can be a painstaking process with a steep learning curve. In this paper, we propose a model that automatically parses natural language proofs written in LaTeX into the language of the interactive theorem prover Coq, using a recurrent neural network. We aim to show that such a model can work well within a very limited domain of proofs about even and odd expressions and can exhibit generalization at test time. We demonstrate the model's ability to generalize well given small variations in natural language, and even present early promising results for generalizing to expressions of intermediate lengths unseen at training time.
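
    A sequence-to-sequence setup of this kind consumes the LaTeX source as a token stream paired with a Coq target. The sketch below illustrates only the assumed preprocessing step (the tokenizer and the example pair are illustrative, not the thesis code or dataset): splitting a LaTeX statement into the tokens an encoder RNN would read.

```python
import re

# Illustrative preprocessing for LaTeX-to-Coq translation: split a LaTeX
# source statement into \commands, words, numbers, and single symbols.

def tokenize_latex(src: str) -> list:
    return re.findall(r"\\[A-Za-z]+|[A-Za-z]+|\d+|\S", src)

# A hypothetical training pair from the even/odd domain:
latex_src = r"Prove that $n + n$ is even."
coq_tgt = "Theorem even_sum : forall n : nat, Nat.Even (n + n)."

print(tokenize_latex(latex_src))
```

    Character- or subword-level variants of this tokenization are common design choices for such models, trading vocabulary size against sequence length.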

    Committee: David Juedes (Advisor) Subjects: Computer Science