Search Results (1 - 5 of 5 Results)


Ford, Gregory Fick. Hardware Emulation of Sequential ATPG-Based Bounded Model Checking
Master of Sciences (Engineering), Case Western Reserve University, 2014, EECS - Computer Engineering
The size and complexity of integrated circuits are continually increasing, in accordance with Moore's law. Along with this growth comes an expanded exposure to subtle design errors, placing a greater burden on the process of formal verification. Existing methods for formal verification, including Automatic Test Pattern Generation (ATPG), are susceptible to exploding model sizes and run times for larger and more complex circuits. In this paper, a method is presented for emulating the process of sequential ATPG-based Bounded Model Checking on reconfigurable hardware. This achieves a speedup over software-based methods, due to the fine-grained massive parallelism inherent to hardware.

Committee:

Daniel Saab (Committee Chair); Francis Merat (Committee Member); Christos Papachristou (Committee Member)

Subjects:

Computer Engineering

Keywords:

Formal verification, ATPG, BMC, PODEM

Qiang, Qiang. FORMAL: A Sequential ATPG-Based Bounded Model Checking System for VLSI Circuits
Doctor of Philosophy, Case Western Reserve University, 2006, Computer Engineering
Bounded Model Checking (BMC) is a formal method of verifying Very Large Scale Integrated (VLSI) circuits. It shows violation of a given circuit property by finding a counter-example to the property along bounded state paths of the circuit. The BMC problem is inherently NP-complete and is traditionally formulated as a Boolean SATisfiability (SAT) problem, which is subsequently solved by a SAT solver. Automatic Test Pattern Generation (ATPG), as an alternative to SAT, has already been shown to be an effective solution to NP-complete problems in many computer-aided design areas. In the field of BMC, ATPG has already achieved promising results for simple properties; its effectiveness for more complicated nested properties, however, remains unknown. This thesis presents the first systematic framework of ATPG-based BMC capable of checking properties in all nested forms at the gate level. The negated counterpart of a property is mapped into a structural monitor, which is tailored to a flattened model of the input circuit. A target fault is then injected at the monitor output, and a modified ATPG-based state justification algorithm is used to search for a test for this fault. Finding such a test corresponds to formally establishing the property. The framework can easily incorporate any existing ATPG tool with little modification. The proposed framework has been implemented in a computer program called FORMAL, and has been used to check a comprehensive set of properties of the GL85 microprocessor and USB 2.0 circuits. Experimental results show that the ATPG-based approach performs better in both capacity and efficiency than SAT-based techniques, especially for large bounds and for properties that require a large search space. ATPG-based BMC has therefore been demonstrated to be an effective supplement to SAT-based BMC in VLSI circuit verification.
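The bounded counterexample search that both the SAT-based and ATPG-based formulations solve can be illustrated with a minimal sketch. This is not the FORMAL tool itself, only the generic BMC idea: unroll a toy transition system up to a bound k and look for a path that reaches a property-violating state (the system, property, and bound below are all invented for illustration).

```python
# Minimal bounded model checking sketch: enumerate all execution paths
# of up to k transitions and return the first one ending in a state
# that violates the property (i.e., a counterexample), or None.

def bmc(init, transition, violates, k):
    """Return a counterexample path within bound k, or None."""
    frontier = [[init]]                     # paths explored so far
    for _ in range(k + 1):
        next_frontier = []
        for path in frontier:
            state = path[-1]
            if violates(state):             # negated property holds: done
                return path
            for succ in transition(state):
                next_frontier.append(path + [succ])
        frontier = next_frontier
    return None                             # no violation within bound k

# Toy system: a mod-8 counter that may hold or increment each cycle.
# Property under check: "the counter never reaches 6".
init = 0
transition = lambda s: [s, (s + 1) % 8]
violates = lambda s: s == 6

path = bmc(init, transition, violates, k=6)
print(path)   # [0, 1, 2, 3, 4, 5, 6]
```

Note the bound matters: with k=5 the violating state is unreachable within the unrolling depth, so the search reports no counterexample even though one exists at depth 6.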

Committee:

Daniel Saab (Advisor)

Keywords:

Bounded Model Checking; BMC; Automatic Test Pattern Generation; ATPG; Sequential ATPG; Formal Verification; Functional Verification; VLSI Design and Verification

Sarpangala, Kishan. Arbitration Techniques for SoC Bus Interconnect with Optimized Verification Methodology
MS, University of Cincinnati, 2013, Engineering and Applied Science: Computer Engineering
The die size of VLSI chips has been drastically decreasing over the years, while density has been increasing exponentially as more functionality is squeezed into a smaller space. The need for manufacturers to have an edge in the market and to keep up with Moore's law are the driving factors that motivated the industry to move towards the SoC (System on Chip). There is more on-chip integration of different IP (intellectual property) cores, in a smaller space, with more functionality included. This leads to problems in designing and verifying the entire design. In particular, verifying the SoC bus interconnects is difficult and presents major challenges, especially with respect to time to market. One aspect of this verification involves power usage. As the complexity of the SoC's communication architecture increases, the power consumed becomes a major concern. In terms of power use, the most important component in any communication architecture is the bus interconnect, and the arbiter is the major component in the bus interconnect. In this research, an effective arbiter for bus communication which supports six priority policies will be designed. It is named Blue-Jay. Arbitration policies are applied using one of three approaches to data multiplexing: transfer, transaction, and desired transfer length. Hence, there are a total of eighteen possible arbitration schemes. To verify the functioning of our design we have developed an optimized verification methodology, based on formal methods and SystemVerilog. Our results show higher reliability than the commonly used AMBA arbitration scheme.
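Two classic bus-arbitration policies of the kind such an arbiter chooses among can be sketched in software. This is purely illustrative: the six Blue-Jay policies and the transfer/transaction/desired-transfer-length granularities are defined in the thesis itself, and the functions below are invented stand-ins for two textbook schemes.

```python
# Two textbook bus-arbitration policies, modeled in software.
# requests[i] is True when master i is requesting the bus.

def fixed_priority(requests):
    """Grant the lowest-numbered requester (master 0 = highest priority)."""
    for master, req in enumerate(requests):
        if req:
            return master
    return None                         # no master is requesting

def round_robin(requests, last_grant):
    """Grant the first requester after last_grant, wrapping around."""
    n = len(requests)
    for offset in range(1, n + 1):
        master = (last_grant + offset) % n
        if requests[master]:
            return master
    return None

reqs = [False, True, True, False]
print(fixed_priority(reqs))             # 1 (lowest-numbered requester)
print(round_robin(reqs, 1))             # 2 (next requester after 1)
print(round_robin(reqs, 2))             # 1 (wraps past idle masters)
```

Fixed priority can starve low-priority masters under heavy load, while round-robin guarantees every requester is eventually granted, which is one reason real arbiters offer several selectable policies.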

Committee:

Carla Purdy, Ph.D. (Committee Chair); Raj Bhatnagar, Ph.D. (Committee Member); George Purdy, Ph.D. (Committee Member)

Subjects:

Electrical Engineering

Keywords:

Bus arbitration; arbiter; Bus protocol; formal verification

Tendulkar, Vaibhav Uday. Behavioral Signature-based Framework for Identifying Unsatisfiable Variable Mappings between Digital Designs
Doctor of Philosophy (PhD), Wright State University, 2012, Computer Science and Engineering PhD

Throughout its design process (from specification to implementation) a digital circuit goes through a variety of structural changes. These changes are introduced primarily due to the use of automated tools in the design process. Checking whether the Boolean functions representing the two designs are equivalent is thus necessary to verify whether a design implementation adheres to its specification. Combinational Equivalence Checking (CEC), the process of determining whether two combinational design functions are equivalent, has been one of the most researched Boolean matching problems. The well-known CEC techniques that have been proposed adopt some kind of formal approach, such as Canonical Form (ROBDD, BMDs, etc.) isomorphism or Boolean Satisfiability, in order to prove or disprove equivalence. Hybrid techniques that adopt a combination of the two above-mentioned techniques have also been proposed.
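The semantic question CEC answers can be stated as a brute-force check: two combinational functions are equivalent exactly when they agree on every input assignment. Real checkers use ROBDDs or SAT to avoid this exponential enumeration; the sketch below (with invented example functions) is only the specification of the problem, not a practical technique.

```python
from itertools import product

# Brute-force combinational equivalence check: compare the two
# functions' outputs on every one of the 2^n input vectors.

def equivalent(f, g, n_inputs):
    """True iff f and g agree on every Boolean input vector."""
    return all(f(*bits) == g(*bits)
               for bits in product([0, 1], repeat=n_inputs))

# Two structurally different implementations of x XOR y.
f = lambda x, y: (x and not y) or (not x and y)     # sum-of-products
g = lambda x, y: (x + y) % 2                        # arithmetic form

print(equivalent(f, g, 2))                          # True
print(equivalent(f, lambda x, y: x and y, 2))       # False
```

The exponential cost of this enumeration is precisely why canonical-form and SAT-based methods dominate in practice.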

Knowing the exact nature of the variable mappings/correspondences between the two designs a priori is advantageous during the CEC process. However, situations may arise wherein these mappings are unknown or lost. Not knowing the variable mappings between the two designs a priori increases the computational complexity of CEC techniques. The CEC problem under unknown variable mappings is a more complex Boolean matching problem: the problem of determining whether an input and an output variable mapping/permutation exists under which the two designs are functionally equivalent. The use of signatures/filters has proven to be a well-known approach taken by the design verification community to quickly detect and prune those variable mappings that do not make the two designs equivalent.

In our work we propose and implement three novel output-behavior-based signatures known as Behavioral signatures. Behavioral signatures are computed solely from the binary output behavior exhibited by the designs and thus do not rely on Boolean equations, canonical forms or any other functional representations for their computation. This property makes Behavioral signatures useful to all digital design domains that face the problem of determining design equivalence under unknown variable correspondences, including those that might not possess any knowledge of the Boolean functions representing the designs. The Grouped Row Sum and the Row Difference signatures that we propose are used to identify unsatisfiable input variable mappings between the two designs. Our Grouped Column Sum signature, on the other hand, is used to identify the unsatisfiable output variable mappings.

The success of a Behavioral signature lies in the fineness with which it summarizes the design behavior. However, there exists a tradeoff between the degree of fineness achieved and the execution time. For instance, amongst the input variable signatures that we propose, the Row Difference signature is more adept at summarizing the design behavior than its counterpart, the Grouped Row Sum signature. However, the computation cost associated with the Row Difference signature is higher than that of the Grouped Row Sum signature. We thus define a framework for using these signatures based on the nature of the designs under consideration.
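The pruning idea behind such signatures can be sketched with a simplified analogue. The exact Grouped Row Sum, Row Difference, and Grouped Column Sum definitions are the author's; the invented signature below merely shows the principle: compute, for each input variable, a count derived only from the design's binary output behavior, so that inputs with different counts cannot map to each other under any satisfiable correspondence.

```python
from itertools import product

# Simplified input-variable signature (illustrative analogue, not the
# thesis's definitions): for each input, count the minterms where that
# input is 1 and the output is 1. Candidate input mappings between two
# designs must pair inputs with equal counts, pruning the rest.

def input_signature(f, n):
    counts = [0] * n
    for bits in product([0, 1], repeat=n):
        if f(*bits):                    # output behavior only
            for i, b in enumerate(bits):
                counts[i] += b
    return counts

f = lambda a, b, c: a and (b or c)      # original design
g = lambda x, y, z: (y or z) and x      # same function, inputs renamed

print(input_signature(f, 3))            # [3, 2, 2]
print(input_signature(g, 3))            # [3, 2, 2]
```

Here input a's count (3) differs from b's and c's (2), so a can only map to x; the signature cannot distinguish b from c, which is exactly the kind of residual ambiguity a finer (and costlier) signature would resolve.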

Committee:

Travis Doom, PhD (Committee Chair); John Gallagher, PhD (Committee Member); Meilin Liu, PhD (Committee Member); Henry Chen, PhD (Committee Member); Sridhar Ramachandran, PhD (Committee Member)

Subjects:

Computer Engineering; Computer Science; Electrical Engineering

Keywords:

Boolean Matching; Formal Equivalence Checking; Boolean Signatures; Behavioral Signatures; Formal Verification

Mansouri, Nazanin. Automated Correctness Condition Generation for Formal Verification of Synthesized RTL Designs
PhD, University of Cincinnati, 2001, Engineering: Computer Engineering
This work presents a formal methodology for verifying the functional correctness of synthesized register transfer level (RTL) designs generated by a high-level synthesis system. The verification is conducted by proving the observation equivalence of the RTL design with a description of its desired behavior. High-level synthesis tools generate register transfer level designs from algorithmic behavioral specifications. The high-level synthesis process consists of dependency graph scheduling, function unit allocation, register allocation, interconnect allocation and controller generation tasks. Widely used algorithms for these tasks retain the overall control flow structure of the behavioral specification, allowing code motion only within basic blocks. Further, high-level synthesis algorithms are in general oblivious to the mathematical properties of arithmetic and logic operators, selecting and sharing RTL library modules solely based on matching uninterpreted function symbols and constants. Many researchers have noted that these features of high-level synthesis algorithms can be exploited to develop efficient verification strategies for synthesized designs. This dissertation reports a verification methodology that effectively exploits these features to achieve efficient and fully automated verification of synthesized designs. Contributions of this research include the formalization and formulation, in higher-order logic within a theorem-proving environment, of mathematical models for the synthesized register transfer level designs and their behavioral specifications, and of a set of sufficient correctness conditions for these designs. It presents an in-depth study of pipelining in design synthesis, and identifies the complete set of correctness conditions for RTL designs generated through synthesis processes that employ pipelining techniques.
This method has been implemented in a verification tool (the correctness condition generator, CCG) and is integrated with a high-level synthesis system. CCG generates (1) formal specifications of the behavior and the RTL design including the data path and the controller, (2) the correctness lemmas establishing equivalence between the synthesized RTL design and its behavioral specification, and (3) their proof scripts that can be submitted to a higher-order logic proof checker. The tool performs model extraction, correctness condition generation and proof generation automatically and without human interaction. This approach is based on the identification, by the synthesis tool during the synthesis process, of the binding between critical specification variables and critical registers in the RTL design, and between the critical states in the behavior and the corresponding states in the RTL design. CCG is capable of handling a broad range of behavior constructs that may be used for specifying the behavior and a wide variety of algorithms that may be employed during the synthesis process. Also, the verification algorithms of CCG have the appealing feature of relying on symbolic analysis of the uninterpreted values of the variables and registers. This has resulted in a considerable reduction in verification time compared to other post-synthesis verification systems, which are often restrained by this factor.
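The uninterpreted-value view that makes this symbolic analysis cheap can be sketched as follows. This is an illustrative model, not the CCG tool: values are symbolic terms, operators like "add" and "mul" are uninterpreted symbols with no arithmetic meaning, and a behavior and its scheduled RTL counterpart are judged equivalent when they build identical terms (all function and variable names below are invented).

```python
# Uninterpreted-function sketch of post-synthesis equivalence: build
# symbolic terms instead of computing values, so equivalence reduces
# to structural term comparison with no operator interpretation.

def op(name, *args):
    """Apply an uninterpreted operator: just construct a term."""
    return (name, *args)

# Behavioral specification: r = (a + b) * c.
def behavior(a, b, c):
    t = op("add", a, b)
    return op("mul", t, c)

# Scheduled RTL version: the same operations spread over "cycles",
# with the adder result held in an intermediate register.
def rtl(a, b, c):
    reg1 = op("add", a, b)      # cycle 1: adder output latched
    reg2 = op("mul", reg1, c)   # cycle 2: multiplier output latched
    return reg2

a, b, c = "a", "b", "c"         # symbolic inputs, never concrete numbers
print(behavior(a, b, c) == rtl(a, b, c))   # True: identical terms
```

Because synthesis shares modules by matching uninterpreted symbols rather than by algebraic reasoning, this purely structural comparison suffices, which is the property the dissertation exploits for its speed.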

Committee:

Dr. Ranga Vemuri (Advisor)

Keywords:

formal verification; register transfer level (RTL); High-level synthesis; correctness conditions