Search Results

(Total results 17)
  • 1. Tendulkar, Vaibhav Behavioral Signature-based Framework for Identifying Unsatisfiable Variable Mappings between Digital Designs

    Doctor of Philosophy (PhD), Wright State University, 2012, Computer Science and Engineering PhD

    Throughout its design process (from specification to implementation), a digital circuit goes through a variety of structural changes, introduced primarily by the automated tools used in the design process. Checking whether the Boolean functions representing the two designs are equivalent is thus necessary to verify that a design implementation adheres to its specification. Combinational Equivalence Checking (CEC), the process of determining whether two combinational design functions are equivalent, has been one of the most researched Boolean matching problems. The well-known CEC techniques that have been proposed adopt a formal approach such as canonical-form isomorphism (ROBDDs, BMDs, etc.) or Boolean Satisfiability in order to prove or disprove equivalence. Hybrid techniques that combine these two approaches have also been proposed. Knowing the exact variable mappings/correspondences between the two designs a priori is advantageous during the CEC process. However, situations may arise wherein these mappings are unknown or lost. Not knowing the variable mappings between the two designs a priori increases the computational complexity of CEC techniques. The CEC problem under unknown variable mappings is a more complex Boolean matching problem: determining whether an input and an output variable mapping/permutation exists under which the two designs are functionally equivalent. The use of signatures/filters is a well-known approach taken by the design verification community to quickly detect and prune those variable mappings that cannot make the two designs equivalent. In our work we propose and implement three novel output-behavior-based signatures known as Behavioral signatures. Behavioral signatures are computed solely from the binary output behavior exhibited by the designs and thus distance themselves from relying on Boolean equations, cano (open full item for complete abstract)

    Committee: Travis Doom PhD (Committee Chair); John Gallagher PhD (Committee Member); Meilin Liu PhD (Committee Member); Henry Chen PhD (Committee Member); Sridhar Ramachandran PhD (Committee Member) Subjects: Computer Engineering; Computer Science; Electrical Engineering
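
To make the pruning idea in entry 1 concrete, here is a minimal Python sketch of signature-based filtering for Boolean matching. It uses onset size (the number of satisfying input assignments), a simple permutation-invariant signature; the dissertation's Behavioral signatures are its own distinct constructions, and the two-output designs below are hypothetical examples.

```python
from itertools import product

def onset_size(f, n_inputs):
    """Count input assignments for which single-output function f is 1.
    Onset size is invariant under input permutation, so mismatched sizes
    rule out a candidate output correspondence without any SAT/BDD work."""
    return sum(f(*bits) for bits in product((0, 1), repeat=n_inputs))

# Two 3-input designs, each with two outputs (hypothetical examples).
design_a = {"y0": lambda a, b, c: a & b, "y1": lambda a, b, c: a ^ b ^ c}
design_b = {"z0": lambda a, b, c: b ^ c ^ a, "z1": lambda a, b, c: c & a}

sig_a = {name: onset_size(f, 3) for name, f in design_a.items()}
sig_b = {name: onset_size(f, 3) for name, f in design_b.items()}

# Keep only output pairings whose signatures agree; all others are pruned.
candidates = [(oa, ob) for oa, sa in sig_a.items()
              for ob, sb in sig_b.items() if sa == sb]
print(candidates)   # [('y0', 'z1'), ('y1', 'z0')]
```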
  • 2. Kong, Cindy Formal Methods Applied to the Specification of an Active Network Node

    MS, University of Cincinnati, 2001, Engineering : Computer Engineering

    An active network describes a new generation of network where all network nodes are programmable. Nodes can be configured according to user requirements at boot time or on the fly, and each node can also provide computational capabilities to process regular, data-carrying packets. The flexibility gained in the dynamics of active networks gives rise to significant security and safety issues. In this thesis, we apply Formal Methods to the specification of an active node. The specification process consists of designing a formal framework and then evaluating it. Constructive and axiomatic specifications are used during the design. Evaluation of the framework is achieved by building a working node model and verifying that the model demonstrates the critical properties of an active node. The formal framework and model are both written in the Prototype Verification System (PVS) language. The PVS theorem prover is then used for verifying and validating the proposed model.

    Committee: Perry Alexander (Advisor) Subjects: Computer Science
  • 3. Bhamidipati, Padmaja Security Assurance In SoCs and NoCs: Analysis, Specification, and Quantification

    PhD, University of Cincinnati, 2024, Engineering and Applied Science: Electrical Engineering

    Modern heterogeneous architectures contain multiple cores that perform a variety of processing, storage, and communication tasks. The complexity of the cores, and of the interactions among them, introduces potential security vulnerabilities that can be exploited by malicious actors to mount a variety of attacks. To address these vulnerabilities, it is critical to conduct systematic security analysis, enforce relevant security policies, and verify designs through formal methods before fabrication. However, current SoC designs require a time-consuming and resource-intensive process to identify and verify security assets against applicable security policies. This gap highlights the need for efficient abstraction techniques that streamline the specification and verification of security policies, reducing both verification cost and design overhead. As these complex architectures rely on information transfer between the cores, the significance of a well-established interconnect such as a Network-on-Chip (NoC) is paramount. NoC architectures have gained prominence in modern computing systems for their scalability and efficiency. However, the globalization of NoC design and fabrication exposes them to security threats. The hardware resources shared between secure and malicious IPs in a NoC create vulnerabilities that attackers exploit to implement explicit and implicit data leakage. Quantitative analysis plays an important role in exposing such vulnerabilities by quantifying packet information and traffic flow across components and pathways, using numerical data and mathematical models to reveal patterns and anomalies in complex systems. This dissertation introduces a comprehensive methodology to address the challenges associated with SoC and NoC security. First, we propose a systematic approach for security analysis using Assertion-Based Verification (ABV), focusing on cataloging SoC vulnerabilities and d (open full item for complete abstract)

    Committee: Ranganadha Vemuri Ph.D. (Committee Chair); Wen-Ben Jone Ph.D. (Committee Member); Suyuan Chen Ph.D. (Committee Member); Mike Borowczak Ph.D. (Committee Member); John Emmert Ph.D. (Committee Member) Subjects: Electrical Engineering
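
As one illustration of the quantitative traffic analysis mentioned in entry 3, the sketch below compares the Shannon entropy of per-link packet counts against a baseline window. Everything here is an assumption for illustration (the link names, counts, and the 0.2-bit threshold); the dissertation's actual quantification methodology is not reproduced.

```python
import math
from collections import Counter

def entropy(counts):
    """Shannon entropy (bits) of a packet-count distribution; a sudden
    shift versus a baseline can flag implicit-channel activity."""
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total)
                for c in counts.values() if c)

# Hypothetical per-link packet counts observed over one monitoring window.
baseline = Counter({"r0->r1": 120, "r1->r2": 115, "r2->r3": 125})
window   = Counter({"r0->r1": 300, "r1->r2": 20,  "r2->r3": 40})

drift = abs(entropy(baseline) - entropy(window))
print(f"entropy drift: {drift:.3f} bits")
if drift > 0.2:   # threshold is an assumption, tuned per design
    print("traffic anomaly: possible data leakage pathway")
```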
  • 4. Alatoun, Khitam Efficient Techniques for SoC Security Validation and Monitoring

    PhD, University of Cincinnati, 2024, Engineering and Applied Science: Computer Science and Engineering

    Embedded systems are commonly described as the powerhouses that drive a wide range of modern devices. The landscape of embedded systems has undergone significant transformation since the advent of System-on-Chip (SoC) technology. Throughout the life cycle of modern SoC designs, various hardware security threats can be introduced. These security vulnerabilities, when exercised during SoC operation, can lead to denial of service or disclosure of protected data. Hence, it is essential to undertake security validation before and after SoC fabrication and to make provisions for continuous security assessment during operation. Assertion-based formal methods have been used to verify hardware designs; these methods use temporal logic to mathematically verify design properties. However, various security policies, such as confidentiality, integrity, and timing and power side-channel properties, cannot be verified directly with traditional formal techniques. Additionally, current specification languages primarily focus on functional correctness and do not provide formal assurances for a wide range of security policies. Furthermore, critical design policies not only necessitate robust verification methods but also require continuous assessment during operation. Assertion-based runtime monitoring is a commonly used method to convert assertions into synthesizable hardware monitors. These monitors can be embedded into the design to detect property violations during runtime. However, an efficient method is required to generate optimized hardware runtime monitors from a security specification language. To design a trusted SoC, it is important to provide a methodology to specify, verify, and monitor security policies. This research intends to develop tools that help identify potential security vulnerabilities during SoC design and operation. We developed several methods to verify information flow, timing side-channel, and power side-channel properties; they collabora (open full item for complete abstract)

    Committee: Ranganadha Vemuri Ph.D. (Committee Chair); John Emmert Ph.D. (Committee Member); Wen-Ben Jone Ph.D. (Committee Member); Rashmi Jha Ph.D. (Committee Member); Mike Borowczak Ph.D M.A B.A. (Committee Member) Subjects: Computer Engineering
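
Entry 4 centers on turning assertions into runtime monitors. The Python sketch below shows the general shape of such a monitor for a classic bounded-response property ("every req is granted within 3 cycles"); the property, signal names, and bound are illustrative assumptions, and a real flow would emit a synthesizable hardware monitor rather than a Python class.

```python
class ReqGrantMonitor:
    """Monitor for the bounded-response property: every 'req' must be
    answered by a 'grant' within MAX_WAIT cycles (illustrative property;
    a grant is assumed to serve the oldest outstanding request)."""
    MAX_WAIT = 3

    def __init__(self):
        self.pending = []  # cycles waited by each outstanding request

    def step(self, req, grant):
        self.pending = [w + 1 for w in self.pending]
        ok = all(w <= self.MAX_WAIT for w in self.pending)  # check first
        if grant and self.pending:
            self.pending.pop(0)        # oldest request is served
        if req:
            self.pending.append(0)
        return ok

mon = ReqGrantMonitor()
trace = [(1, 0), (0, 0), (0, 0), (0, 0), (0, 1)]  # (req, grant) per cycle
for cycle, (req, grant) in enumerate(trace):
    if not mon.step(req, grant):
        print(f"assertion violated at cycle {cycle}")  # fires at cycle 4
```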
  • 5. Taylor, Max Trustworthy UAS: A Holistic Approach

    Doctor of Philosophy, The Ohio State University, 2024, Computer Science and Engineering

    Unmanned Aerial Systems (UAS) are increasingly important. Farmers monitor crops and apply pesticides with UAS. First responders use UAS in applications ranging from fire fighting to search and rescue operations. There is potential for rapid shopping delivery by UAS. In all these applications, UAS work closely alongside humans. Onboard firmware controls the behavior of UAS. This dissertation studies ways to improve the quality of that firmware. We start by presenting the first large-scale analysis of software defects ("bugs") reported in open-source UAS firmware. We examine nearly 300 reported bugs in the two most popular open-source systems (ArduPilot and PX4) and categorize the defects. Motivated by our findings, we propose three technologies to automate the detection and repair of UAS bugs. First, Avis seeks to automatically diagnose bugs caused by misusing onboard sensors. Second, SA4U identifies unit type errors caused by incorrectly mixing values with different physical unit types (e.g., meters and minutes) in a computation. Finally, Scalpel automatically repairs bugs found by SA4U. Deep learning is increasingly used to provide advanced autonomous behavior for UAS. To support higher-quality deep learning systems, we propose checkd, which automates checkpoint/restore policy configuration. Underlying checkd's contribution is the thesis that better-tuned models yield better behavior; checkd helps practitioners fine-tune models by reducing the overall cost to train.

    Committee: Feng Qin (Advisor); Michael Bond (Committee Member); Christopher Stewart (Committee Member) Subjects: Computer Science
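
SA4U, described in entry 5, targets unit type errors. The sketch below shows the underlying idea of dimensional checking: tag each value with base-unit exponents and reject operations that mix incompatible units. The Quantity class and its two base units are illustrative assumptions, not SA4U's implementation (which analyzes existing firmware rather than requiring tagged values).

```python
class Quantity:
    """Value tagged with base-unit exponents (meters, seconds); mixing
    incompatible units raises immediately instead of corrupting state."""
    def __init__(self, value, m=0, s=0):
        self.value, self.dim = value, (m, s)

    def __add__(self, other):
        if self.dim != other.dim:      # meters + seconds is meaningless
            raise TypeError(f"unit mismatch: {self.dim} + {other.dim}")
        return Quantity(self.value + other.value, *self.dim)

    def __truediv__(self, other):      # division subtracts exponents
        dim = tuple(a - b for a, b in zip(self.dim, other.dim))
        return Quantity(self.value / other.value, *dim)

altitude = Quantity(120.0, m=1)        # meters
flight_time = Quantity(2.0, s=1)       # seconds
print((altitude / flight_time).value)  # m/s is fine: 60.0
try:
    altitude + flight_time             # meters + seconds
except TypeError as e:
    print("caught:", e)                # caught: unit mismatch: (1, 0) + (0, 1)
```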
  • 6. Daughety, Nathan Design and analysis of a trustworthy, Cross Domain Solution architecture

    PhD, University of Cincinnati, 2022, Engineering and Applied Science: Computer Science and Engineering

    With the paradigm shift to cloud-based operations, reliable and secure access to and transfer of data between differing security domains has never been more essential. A Cross Domain Solution (CDS) is a guarded interface which serves to execute the secure access and/or transfer of data between isolated and/or differing security domains defined by an administrative security policy. Cross domain security requires trustworthiness at the confluence of the hardware and software components which implement a security policy. Security components must be relied upon to defend against widely encompassing threats to information assurance -- consider insider threats and nation-state threat actors, which can be both onsite and offsite. Current implementations of CDS systems use sub-optimal Trusted Computing Bases (TCB) without any formal verification proofs, confirming the gap between blind trust and trustworthiness. Moreover, most CDSs are exclusively operated by Department of Defense agencies and are not readily available to the commercial sectors, nor are they available for independent security verification. Still others are usable only in physically isolated environments such as Sensitive Compartmented Information Facilities, which is inconsistent with the paradigm shift to cloud environments. Our purpose is to address the question of how trustworthiness can be implemented in a remotely deployable CDS that also supports availability and accessibility to all sectors. In this work, we present a novel CDS system architecture which is the first to use a formally verified TCB. Additionally, our CDS model is the first of its kind to utilize a computation-isolation approach which allows our CDS to be remotely deployable for use in cloud-based solutions.

    Committee: John Franco Ph.D. (Committee Member); John Emmert Ph.D. (Committee Member); Marcus Dwan Pendleton Ph.D. (Committee Member); Nan Niu Ph.D. (Committee Member); Rashmi Jha Ph.D. (Committee Member) Subjects: Computer Science
  • 7. Obeidat, Nawar Using Formal Methods to Build and Validate Reliable and Secure Smart Systems via TLA+

    PhD, University of Cincinnati, 2021, Engineering and Applied Science: Computer Science and Engineering

    Verification and validation are very important in the design of any system; for safety-critical systems especially, the validation level is crucial. In this work, our eventual goal is to develop a robust procedure for designing smart, safe, secure systems. There are different methods for verification and validation, but formal methods are widely used to validate such systems due to the high level of confidence they give for guaranteeing correctness, security, and safety properties. TLA+ is a formal specification language; in this work its TLC model checker is used to formally verify and validate several example systems and to find security flaws. Designing smart systems has recently received much attention because of the ability to connect heterogeneous sub-systems together to build a fully controlled smart system. Here we propose models for three systems: a smart school building system, an ADS-B system for avionics, and a SCADA system. We model the systems using UML diagrams to describe objects, states, and sequences of actions, and we use formal methods techniques to verify the properties of the systems. TLA+ is used for formal modeling. For the smart school building system, the ADS-B system, and the SCADA system, we verify their properties and correctness using the TLA+ toolbox and the TLC model checker. We also use the TLC model checker to find example security bugs that we introduce into the designs.

    Committee: Carla Purdy Ph.D. (Committee Chair); Shaaban Abdallah Ph.D. (Committee Member); Nan Niu Ph.D. (Committee Member); Massoud (Max) Rabiee Ph.D. (Committee Member); Boyang Wang (Committee Member) Subjects: Computer Engineering
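
The TLC model checker used in entry 7 is, at heart, an explicit-state reachability search that checks an invariant in every reachable state. The Python sketch below reproduces that loop on a deliberately broken two-process lock; the model, its state encoding, and the bug are illustrative assumptions, not one of the thesis's systems.

```python
from collections import deque

def replace(t, i, v):
    """Return tuple t with index i replaced by v."""
    return t[:i] + (v,) + t[i + 1:]

def next_states(state):
    """Successor relation for a deliberately broken two-process lock:
    the lock test ('trying') and the lock set ('entering') are separate
    steps, so both processes can pass the test before either sets it."""
    locked, pcs = state
    for i in range(2):
        pc = pcs[i]
        if pc == "idle":
            yield (locked, replace(pcs, i, "trying"))
        elif pc == "trying" and not locked:
            yield (locked, replace(pcs, i, "entering"))
        elif pc == "entering":
            yield (True, replace(pcs, i, "critical"))
        elif pc == "critical":
            yield (False, replace(pcs, i, "idle"))

def mutual_exclusion(state):
    return state[1].count("critical") <= 1

# Breadth-first search over reachable states, TLC-style: check the
# invariant everywhere and report the shortest counterexample trace.
init = (False, ("idle", "idle"))
seen, queue = {init}, deque([(init, [init])])
while queue:
    state, trace = queue.popleft()
    if not mutual_exclusion(state):
        print("invariant violated; trace:", *trace, sep="\n  ")
        break
    for nxt in next_states(state):
        if nxt not in seen:
            seen.add(nxt)
            queue.append((nxt, trace + [nxt]))
```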
  • 8. Carman, Benjamin Translating LaTeX to Coq: A Recurrent Neural Network Approach to Formalizing Natural Language Proofs

    Bachelor of Science (BS), Ohio University, 2021, Computer Science

    There is a strong desire to be able to more easily formalize existing mathematical statements and develop machine-checked proofs to verify their validity. Doing this by hand can be a painstaking process with a steep learning curve. In this paper, we propose a model that automatically parses natural language proofs written in LaTeX into the language of the interactive theorem prover Coq, using a recurrent neural network. We aim to show the ability of such a model to work well within a very limited domain of proofs about even and odd expressions and to exhibit generalization at test time. We demonstrate the model's ability to generalize well given small variations in natural language, and even demonstrate early promising results for generalizing to expressions of intermediate lengths unseen at training time.

    Committee: David Juedes (Advisor) Subjects: Computer Science
  • 9. Rumreich, Laine The Binary Decision Diagram: Formal Verification of a Reference Implementation

    Master of Science, The Ohio State University, 2021, Computer Science and Engineering

    Formal verification is a method of proving program correctness mathematically, based on formal specifications. The goal of this study is to formally verify, by mathematical proof, the correctness of a Java implementation of a Binary Decision Diagram (BDD). Specifically, verifying the implementation proves its correctness and makes it significantly less susceptible to errors, crashes, and undiscovered bugs that could be exploited. This verified BDD implementation can then be used to solve a wide variety of problems with a higher level of confidence than would be possible with an unverified implementation. The formal specification of the Java-based BDD component verified in this project was used to prove the correctness of a reference implementation. Each method in the reference implementation was represented in a reasoning table, and a rigorous set of proofs was written for each verification condition in the reasoning tables. These proofs and the method reasoning tables, taken together, form a single proof that establishes the correctness of the BDD reference implementation as a whole. In developing the formal correctness proofs, several errors in the implementation and specifications were discovered and corrected. These methods had been tested using a comprehensive set of test cases, yet errors were still discovered during the formal verification process, demonstrating the value of the proofs. An additional limitation related to the testing capabilities of the component design pattern used in the Java-based BDD component was also discovered in this work. The results of this thesis mean users can be confident in using the implementation based on its proven correctness.

    Committee: Paul Sivilotti (Advisor); Neelam Soundarajan (Committee Member) Subjects: Computer Science
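
For readers unfamiliar with the data structure verified in entry 9, here is a minimal reduced, ordered BDD in Python. It shows the two properties that make BDDs attractive for verification: reduction and hash-consing yield a canonical form, so equivalence checking becomes an identity test. This is an illustrative sketch, not the Java component that was verified.

```python
# Minimal reduced, ordered BDD with a unique table (hash-consing).
unique = {}                   # (var, lo, hi) -> node id
nodes = {0: None, 1: None}    # terminal ids 0 (false) and 1 (true)

def mk(var, lo, hi):
    if lo == hi:              # redundant test: skip the node entirely
        return lo
    key = (var, lo, hi)
    if key not in unique:     # hash-consing gives canonicity
        unique[key] = len(nodes)
        nodes[unique[key]] = key
    return unique[key]

def build(f, var_order, env=(), i=0):
    """Shannon expansion f = (~x & f|x=0) | (x & f|x=1), one variable
    at a time; exponential in inputs, fine for a small demo."""
    if i == len(var_order):
        return 1 if f(*env) else 0
    lo = build(f, var_order, env + (0,), i + 1)
    hi = build(f, var_order, env + (1,), i + 1)
    return mk(var_order[i], lo, hi)

# Canonicity means equivalence checking is just id comparison:
f = build(lambda a, b, c: (a & b) | c, "abc")
g = build(lambda a, b, c: c | (b & a), "abc")
print(f == g)   # True: same function, same node
```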
  • 10. Sisco, Zachary Verifying Data-Oriented Gadgets in Binary Programs to Build Data-Only Exploits

    Master of Science (MS), Wright State University, 2018, Computer Science

    Data-Oriented Programming (DOP) is a data-only code-reuse exploit technique that “stitches” together sequences of instructions to alter a program's data flow to cause harm. DOP attacks are difficult to mitigate because they respect the legitimate control flow of a program and bypass memory protection schemes such as Address Space Layout Randomization, Data Execution Prevention, and Control Flow Integrity. Existing techniques for building DOP payloads rely on a program's source code. This research explores the feasibility of constructing DOP exploits without source code—that is, using only binary representations of programs. The lack of semantic and type information introduces difficulties in identifying data-oriented gadgets and their properties. This research uses binary program analysis techniques and formal methods to identify and verify data-oriented gadgets and to determine whether they are reachable and executable from a given memory corruption vulnerability. This information guides the construction of DOP attacks without the need for source code, showing that commercial off-the-shelf programs are also vulnerable to this class of exploit.

    Committee: Adam Bryant Ph.D. (Committee Co-Chair); John Emmert Ph.D. (Committee Co-Chair); Meilin Liu Ph.D. (Committee Member); Krishnaprasad Thirunarayan Ph.D. (Committee Member) Subjects: Computer Science
  • 11. Kimura, Adam Development of Trust Metrics for Quantifying Design Integrity and Error Implementation Cost

    Doctor of Philosophy, The Ohio State University, 2017, Electrical and Computer Engineering

    One of the major concerns in the Integrated Circuit (IC) industry today is the issue of Hardware Trust. This problem has arisen as a result of increased outsourcing and the integration of more third-party Intellectual Property (IP) into designs. Trusted Microelectronics is a new field of research that has emerged to address these hardware assurance concerns. Trojan Detection, Design for Security and Trust, Trusted Supply Chain Management, Anti-Counterfeiting, Trusted Design Verification, and Vulnerability & Attack Mitigation are the major sub-fields of Trusted Microelectronics where research progress is being made. There is, however, currently a lack of well-defined metrics for quantifying Hardware Trust. As such, developing a portfolio of Trust Metrics is a needed contribution in the Trusted Microelectronics space that will also bring value to the other sub-fields of Trust. In this work, a Trust Metric Solution Space is defined in order to establish a roadmap for developing Trust Metrics; the Solution Space also creates a coalescing point into which the metrics work being conducted in other communities can integrate. An Error Implementation Cost (EIC) measure is developed as a technique to quantify errors and to allow error ranking and rating. Four MIPS Processor test cases containing embedded errors are utilized in order to show that EIC scoring can create quantifiable differentiation between errors of varying severity. Errors 1 and 2 were shown to be the least severe, with System Payloads of 0.0181 and 0.0010 respectively; Errors 3 and 4 were shown to be more severe, with System Payloads of 0.5140 and 0.3216 respectively. The EIC scoring is then used to assist in developing Test Articles (TA) for example case scenarios that contain embedded errors. A 32-bit Floating Point Adder, a Fixed to Floating Point Converter, a MIPS Processor, and a Full System TA were developed in order to apply the techniques for evaluating h (open full item for complete abstract)

    Committee: Steve Bibyk Dr. (Advisor); Lisa Fiorentini Dr. (Committee Member); Ayman Fayed Dr. (Committee Member) Subjects: Electrical Engineering
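
Entry 11's EIC measure folds error characteristics into a single severity score. The sketch below shows the general shape of such a weighted scoring scheme; the factor names and weights are invented for illustration (only the System Payload values 0.0181 and 0.5140 come from the abstract), and the real EIC definition is in the dissertation.

```python
# Hypothetical weighted scoring in the spirit of an Error Implementation
# Cost; factor names and weights below are illustrative assumptions.
WEIGHTS = {"system_payload": 0.6, "trigger_rarity": 0.25, "stealth": 0.15}

def eic_score(factors):
    """Fold normalized per-error factors (each in [0, 1]) into one score
    so errors of varying severity can be ranked and rated."""
    return sum(WEIGHTS[k] * factors[k] for k in WEIGHTS)

errors = {
    "error_1": {"system_payload": 0.0181, "trigger_rarity": 0.3, "stealth": 0.5},
    "error_3": {"system_payload": 0.5140, "trigger_rarity": 0.7, "stealth": 0.8},
}
for name, f in sorted(errors.items(), key=lambda kv: -eic_score(kv[1])):
    print(f"{name}: {eic_score(f):.4f}")   # error_3 ranks as more severe
```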
  • 12. Zaccai, Diego A Balanced Verification Effort for the Java Language

    Doctor of Philosophy, The Ohio State University, 2016, Computer Science and Engineering

    Current tools used to verify software in languages with reference semantics, such as Java, are hampered by the possibility of aliases. Existing approaches to addressing this long-standing verification problem try not to sacrifice modularity because modular reasoning is what makes verification tractable. To achieve this, these approaches treat the value of a reference variable as a memory address in the heap. A serious problem with this decision is that it severely limits the usefulness of generic collections because they must be specified as collections of references, and components of this kind are fundamentally flawed in design and implementation. The limitations become clear when attempting to verify clients of generic collections. The first step in rectifying the situation is to redefine the "value" of a reference variable in terms of the abstract value of the object it references. A careful analysis of what the "value" of a reference variable might mean leads inevitably to this conclusion, which is consistent with the denotation of a variable in languages with value semantics, such as RESOLVE. Verification in languages with value semantics is straightforward compared to verification in languages with reference semantics precisely because of the lack of concern with heap properties. However, there is still a nagging problem: aliasing can occur in legal programs in languages with reference semantics, such as Java, and it must be handled when it does occur. The crux of the issue is not in-your-face assignment statements that copy references but rather aliasing arising within (hidden) method bodies. The reason is that the effects of calls to these methods in client code must be summarized in their specifications in order to preserve modularity. So, the second step is the introduction of a discipline restricting what a client can do with a reference that is aliased within a method. The discipline advertises the creation of such aliases in method speci (open full item for complete abstract)

    Committee: Bruce W. Weide (Advisor); Neelam Soundarajan (Committee Member); Paul A. G. Sivilotti (Committee Member) Subjects: Computer Science
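
The aliasing hazard at the heart of entry 12 is easy to reproduce in Python, which shares Java's reference semantics. The sketch below shows an alias created inside a (hidden) method body breaking a client's modular reasoning, and a value-semantics-style fix; the class and its methods are invented for illustration.

```python
class Roster:
    def __init__(self):
        self.members = []

    def adopt(self, members):
        # Hidden alias: the parameter list is captured, not copied.
        self.members = members

team = ["ada", "grace"]
r = Roster()
r.adopt(team)
team.append("edsger")   # client mutates its "own" list...
print(r.members)        # ...and Roster changes too: ['ada', 'grace', 'edsger']

# Value-semantics-style fix: the method advertises no alias by copying,
# so its specification can be stated in terms of abstract values.
class SafeRoster(Roster):
    def adopt(self, members):
        self.members = list(members)   # abstract value, not the reference
```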
  • 13. Matias, Matthew Program Verification of FreeRTOS using Microsoft Dafny

    Master of Science in Software Engineering, Cleveland State University, 2014, Washkewicz College of Engineering

    FreeRTOS is a popular real-time and embedded operating system. Real-time software requires code reviews, software tests, and various other quality assurance activities to ensure minimal defects. This free and open-source operating system has claims of robustness and quality [26]. Real-time and embedded software is commonly found in systems directly impacting human life and requires a low defect rate. In such critical software, traditional quality assurance may not suffice to minimize software defects. When traditional software quality assurance is not enough for defect removal, software engineering formal methods may help. A formal method such as program verification is useful for proving correctness in real-time software. Microsoft Research created Dafny for proving program correctness; it contains a programming language with specification constructs. A program verification tool such as Dafny allows for proving the correctness of FreeRTOS's modules. We propose using Dafny to verify the correctness of FreeRTOS's scheduler and supporting API.

    Committee: Nigamanth Sridhar PhD (Committee Chair); Yongjian Fu PhD (Committee Member); Chansu Yu PhD (Committee Member) Subjects: Computer Science
  • 14. Sarpangala, Kishan Arbitration Techniques for SoC Bus Interconnect with Optimized Verification Methodology

    MS, University of Cincinnati, 2013, Engineering and Applied Science: Computer Engineering

    The die size of the VLSI chip has been drastically decreasing over the years while density has been increasing exponentially, with more functionality squeezed into a smaller space. The need for manufacturers to have an edge in the market and to keep up with Moore's law are the driving factors that motivated the industry to move towards the SoC (System on Chip): more on-chip integration of different IP (intellectual property) cores, in a smaller space, with more functionality included. This leads to problems in designing and verifying the entire design. In particular, verifying the SoC bus interconnects is difficult and presents major challenges, especially with respect to time to market. One aspect of this verification involves power usage: as the complexity of the SoC's communication architecture increases, the power consumed becomes a major concern. In terms of power use, the most important component in any communication architecture is the bus interconnect, and the arbiter is the major component of the bus interconnect. In this research, an effective arbiter for bus communication which supports six priority policies is designed; it is named Blue-Jay. Arbitration policies are applied using one of three approaches to data multiplexing: transfer, transaction, and desired transfer length. Hence, there are a total of eighteen possible arbitration schemes. To verify the functioning of our design we have developed an optimized verification methodology based on formal methods and SystemVerilog. Our results show higher reliability than the commonly used AMBA arbitration scheme.

    Committee: Carla Purdy Ph.D. (Committee Chair); Raj Bhatnagar Ph.D. (Committee Member); George Purdy Ph.D. (Committee Member) Subjects: Electrical Engineering
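
Entry 14 concerns bus arbitration policies. The sketch below implements two textbook policies, fixed priority and round robin, to make the design space concrete; these are generic illustrations, not Blue-Jay's six policies, which are defined in the thesis.

```python
def fixed_priority(requests):
    """Grant the lowest-numbered requesting master (one of many possible
    policies; simple, but it can starve low-priority masters)."""
    for master, req in enumerate(requests):
        if req:
            return master
    return None

def make_round_robin(n_masters):
    """Rotating priority: the master after the last grant is searched
    first, which prevents one master from starving the others."""
    state = {"last": n_masters - 1}
    def arbiter(requests):
        for offset in range(1, n_masters + 1):
            master = (state["last"] + offset) % n_masters
            if requests[master]:
                state["last"] = master
                return master
        return None
    return arbiter

rr = make_round_robin(3)
print(fixed_priority([1, 1, 0]))     # 0: master 0 always wins ties
print(rr([1, 1, 0]), rr([1, 1, 0]))  # 0 then 1: the grant rotates
```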
  • 15. Ford, Gregory Hardware Emulation of Sequential ATPG-Based Bounded Model Checking

    Master of Sciences (Engineering), Case Western Reserve University, 2014, EECS - Computer Engineering

    The size and complexity of integrated circuits is continually increasing, in accordance with Moore's law. Along with this growth comes expanded exposure to subtle design errors, leaving a greater burden on the process of formal verification. Existing methods for formal verification, including Automatic Test Pattern Generation (ATPG), are susceptible to exploding model sizes and run times for larger and more complex circuits. In this paper, a method is presented for emulating the process of sequential ATPG-based Bounded Model Checking on reconfigurable hardware. This achieves a speed-up over software-based methods, due to the fine-grained massive parallelism inherent in hardware.

    Committee: Daniel Saab (Committee Chair); Francis Merat (Committee Member); Christos Papachristou (Committee Member) Subjects: Computer Engineering
  • 16. MANSOURI, NAZANIN AUTOMATED CORRECTNESS CONDITION GENERATION FOR FORMAL VERIFICATION OF SYNTHESIZED RTL DESIGNS

    PhD, University of Cincinnati, 2001, Engineering : Computer Engineering

    This work presents a formal methodology for verifying the functional correctness of synthesized register transfer level (RTL) designs generated by a high-level synthesis system. The verification is conducted by proving the observation equivalence of the RTL design with a description of its desired behavior. High-level synthesis tools generate register transfer level designs from algorithmic behavioral specifications; the high-level synthesis process consists of dependency graph scheduling, function unit allocation, register allocation, interconnect allocation, and controller generation tasks. Widely used algorithms for these tasks retain the overall control flow structure of the behavioral specification, allowing code motion only within basic blocks. Further, high-level synthesis algorithms are in general oblivious to the mathematical properties of arithmetic and logic operators, selecting and sharing RTL library modules solely based on matching uninterpreted function symbols and constants. Many researchers have noted that these features of high-level synthesis algorithms can be exploited to develop efficient verification strategies for synthesized designs. This dissertation reports a verification methodology that effectively exploits these features to achieve efficient and fully automated verification of synthesized designs. Contributions of this research include the formalization and formulation, in higher-order logic within a theorem-proving environment, of mathematical models for the synthesized register transfer level designs and their behavioral specifications, together with a set of sufficient correctness conditions for these designs. It presents an in-depth study of pipelining in design synthesis, and identifies the complete set of correctness conditions for RTL designs generated through synthesis processes that employ pipelining techniques. This method has been implemented in a verification tool (the correctness condition generator, CCG) and is integrated with a high-level synthe (open full item for complete abstract)

    Committee: Dr. Ranga Vemuri (Advisor) Subjects:
  • 17. Qiang, Qiang FORMAL: A SEQUENTIAL ATPG-BASED BOUNDED MODEL CHECKING SYSTEM FOR VLSI CIRCUITS

    Doctor of Philosophy, Case Western Reserve University, 2006, Computer Engineering

    Bounded Model Checking (BMC) is a formal method of verifying Very Large Scale Integrated (VLSI) circuits. It shows violation of a given circuit property by finding a counter-example to the property along bounded state paths of the circuit. The BMC problem is inherently NP-complete and is traditionally formulated as a Boolean SATisfiability (SAT) problem, which is subsequently solved by a SAT solver. Automatic Test Pattern Generation (ATPG), as an alternative to SAT, has already been shown to be an effective solution to NP-complete problems in many computer-aided design areas. In the field of BMC, ATPG has achieved promising results for simple properties; its effectiveness for more complicated nested properties, however, remains unknown. This thesis presents the first systematic framework of ATPG-based BMC capable of checking properties in all nested forms at the gate level. The negation counterpart of a property is mapped into a structural monitor, which is tailored to a flattened model of the input circuit. A target fault is then injected at the monitor output, and a modified ATPG-based state justification algorithm is used to search for a test for this fault; finding such a test corresponds to formally establishing the property. The framework can easily incorporate any existing ATPG tool with little modification. The proposed framework has been implemented in a computer program called FORMAL, and has been used to check a comprehensive set of properties of the GL85 microprocessor and USB 2.0 circuits. Experimental results show that the ATPG-based approach performs better in both capacity and efficiency than SAT-based techniques, especially for large bounds and for properties that require a large search space. ATPG-based BMC is therefore demonstrated to be an effective supplement to SAT-based BMC in VLSI circuit verification.

    Committee: Daniel Saab (Advisor) Subjects:
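
Finally, the bounded search that entries 15 and 17 build on can be stated in a few lines. The sketch below does BMC explicitly: enumerate state paths up to a bound k from the initial states and look for a property violation. Real engines do this symbolically with SAT or, as in FORMAL, with sequential ATPG on gate-level circuits; the two-bit counter model is an illustrative assumption.

```python
def bmc(init, transition, bad, k):
    """Search all state paths of length <= k for a state satisfying
    'bad'; returns the counterexample path, or None if the property
    holds within the bound (explicit-state stand-in for SAT/ATPG)."""
    frontier = [(s,) for s in init]
    for _ in range(k + 1):
        for path in frontier:
            if bad(path[-1]):
                return path          # counterexample within the bound
        frontier = [p + (n,) for p in frontier for n in transition(p[-1])]
    return None

# Toy 2-bit counter that is supposed never to reach 3 (illustrative).
init = [0]
transition = lambda s: [(s + 1) % 4]
bad = lambda s: s == 3

print(bmc(init, transition, bad, 2))   # None: no violation within 2 steps
print(bmc(init, transition, bad, 5))   # (0, 1, 2, 3): counterexample found
```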