Industrial Application of Formal Methods
an ECSI Workshop
October 27, 2005
This event, organised by the ECSI Institute, is intended to give an overview of the current state of development and of the commercial EDA offering in the domain of formal design and verification methods.
In the first session a tutorial will introduce the main formal verification techniques: equivalence checking, property (model) checking and theorem proving. The tutorial will discuss the types of verification problems to which each of these methods can be applied, as well as their performance and limitations. It will also present a complete verification approach in which traditional simulation-based methods coexist coherently with formal techniques to provide the highest possible level of confidence in verification results. The combination of formal and traditional techniques may be a solution to the verification productivity problem, and can thus offer cost and time-to-market advantages.
The following session will provide an industry point of view on verification methodology needs and concrete requirements, as well as the experience industrial companies have gained in applying advanced formal methods to real verification problems.
The EDA companies will in turn present their solutions and the methods on which their tools are based. They will also give an overview of the design and verification environments in which formal methods are integrated into design flows. These presentations will be followed by a demonstration session to give the participants a better understanding of the EDA solutions. During the demonstrations participants will have the opportunity to talk directly with the presenters and exchange experiences.
Infineon, Cadence, Mentor Graphics, Synopsys, Real Intent, KeesDA
SESSION 1: FORMAL METHODS TUTORIAL
Presentation 1: Tutorial on Formal Methods and their Applications
Presenter: Dr. Claudia Blank, Infineon Technologies
Even if modern testbench concepts allow for flexible and efficient modelling and sophisticated coverage analysis, functional verification by simulation is still incomplete, requires considerable effort in testbench design and consumes a great deal of simulator runtime. Formal methods overcome these shortcomings of simulation by proving a design’s correct behaviour and functionality instead of observing selected traces and hunting for bugs. In addition, experience has shown that formal techniques not only improve verification quality but can also reduce the verification effort.
The tutorial on formal methods and their applications will give designers and managers an introduction to the world of formal hardware verification. Even though formal verification has outgrown its infancy and become capable of solving industrially relevant problems, formal methods still have the reputation of being heavyweight mathematics that requires special education to use properly, whereas simulation is considered somehow native to a hardware designer. This tutorial aims at dispelling the myth that a hardware designer’s world and formal verification are incompatible.
The presentation will provide a short introduction to theorem proving, which is mostly used for the verification of algorithms; equivalence checking, which has almost replaced gate-level simulation in implementation verification; and property checking, which is used for specification verification. The tutorial will discuss the types of problems that can be solved by employing these methods and will also show the limitations one has to face.
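As a rough illustration of the idea behind equivalence checking (not part of the tutorial material, and with all names hypothetical), two combinational descriptions can be compared over every input vector. Real equivalence checkers use SAT/BDD engines rather than enumeration, but the question they answer is the same:

```python
from itertools import product

def equivalent(f, g, n_inputs):
    """Compare two combinational functions over all 2**n input vectors.

    Returns (True, None) if equivalent, else (False, counterexample).
    Industrial tools use BDD/SAT engines instead of enumeration.
    """
    for bits in product([0, 1], repeat=n_inputs):
        if f(*bits) != g(*bits):
            return False, bits   # differing input vector = counterexample
    return True, None

# RTL-style spec: sum output of a 1-bit full adder
spec = lambda a, b, cin: a ^ b ^ cin
# Gate-level implementation: same function, different structure
impl = lambda a, b, cin: (a ^ b) ^ cin

ok, cex = equivalent(spec, impl, 3)
```

The exhaustive loop makes the key point: unlike simulation with selected vectors, the check covers the entire input space, so a `True` result is a proof rather than an observation.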
Since the uncertainty about formal methods concerns mainly property checking, the application part of the tutorial will focus on this topic. The tutorial will introduce the methodology of property checking, illustrate the specification formalism, i.e. today’s property languages, show how property checking reveals the root cause of an error, and give hands-on examples for selected problems. There will also be a discussion of what a verification engineer has to take care of in order to obtain the completeness formal methods promise. Since formal property checking and simulation-based verification are complementary approaches, the tutorial will also give an idea of how formal methods and the traditional approach can coexist and, moreover, cooperate.
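The way a property checker pinpoints the root cause of an error can be sketched in miniature (an illustrative toy, assuming an explicit-state design model; real checkers work symbolically): explore all reachable states and, on a violation, report the shortest trace from reset, which is exactly the counterexample handed to the designer.

```python
from collections import deque

def check_invariant(init, step, prop):
    """Explicit-state check of an invariant property.

    Explores all reachable states breadth-first; on a violation it
    returns the shortest trace from the initial state (the
    counterexample a property checker would report), or None if the
    property holds on every reachable state.
    """
    queue = deque([(init, [init])])
    seen = {init}
    while queue:
        state, trace = queue.popleft()
        if not prop(state):
            return trace                       # counterexample trace
        for nxt in step(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, trace + [nxt]))
    return None                                # property proven

# Toy design: a counter that must stay below 4
step = lambda s: [(s + 1) % 4]                 # correct: wraps around
trace = check_invariant(0, step, lambda s: s < 4)       # -> None, holds

buggy_step = lambda s: [s + 1]                 # bug: missing wrap-around
cex = check_invariant(0, buggy_step, lambda s: s < 4)   # -> [0, 1, 2, 3, 4]
```

The returned trace is the analogue of the waveform a property checker produces: the shortest sequence of steps from reset to the failing state.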
SESSION 2: INDUSTRIAL NEEDS AND EXPERIENCE
Presentation 1: Lessons learned from formal methods application at Infineon
Presenter: Tim Blackmore, Infineon Technologies
Duration: 30 min
The Infineon Bristol Design Centre has been using formal methods successfully for verification for the last five years. The methods used include automated, semi-formal and formal property checking. The presentation covers the range of projects to which formal methods have been applied in Bristol and the lessons learned from this experience.
Presentation 2: Leveraging Assertion Based Verification by using Magellan
Presenter: Jacob Andersen, SyoSil, Denmark
Duration: 30 min
Verification IP (VIP) has emerged to support verification of protocol compliance of design blocks. As an important part of VIP, assertions describe the temporal actions on block-level interfaces. Based on the Wishbone standardized on-chip interface protocol, SystemVerilog Assertions (SVAs) were written to constitute the VIP of the Wishbone protocol. With the VIP as a basis, we then show how we have utilized the state-of-the-art hybrid verification tool Magellan to prove the interface protocol compliance of two different design blocks attachable to the interface, without needing to write any testbenches for the design blocks. Furthermore, we show how the hybrid engine within Magellan can be employed to perform coverage-based analysis by performing reachability searches. We also show how to perform design bring-up of new design modules.
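The coverage-based analysis mentioned above rests on reachability: a coverage goal that no input sequence can ever reach is wasted simulation effort. A minimal sketch of the idea (hypothetical FSM and state names; tools like Magellan do this symbolically on the real design):

```python
def reachable_states(init, step, limit=10_000):
    """Compute the set of states reachable from init (explicit, bounded)."""
    seen, frontier = {init}, [init]
    while frontier:
        state = frontier.pop()
        for nxt in step(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
        if len(seen) > limit:
            raise RuntimeError("state space too large for explicit search")
    return seen

# Toy bus-interface FSM; ERR has no incoming transition from the rest
transitions = {
    "IDLE": ["REQ"],
    "REQ":  ["ACK"],
    "ACK":  ["IDLE"],
    "ERR":  ["IDLE"],
}
reach = reachable_states("IDLE", lambda s: transitions[s])

coverage_goals = {"ACK", "ERR"}
unreachable = coverage_goals - reach   # goals no simulation can ever hit
```

Reporting `unreachable` to the verification team lets them either drop the dead goal or discover a missing transition in the design, which is the practical payoff of reachability analysis.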
SESSION 3: FORMAL VERIFICATION TOOLS
Presentation 1: How to increase Functional Coverage at the block level with Hybrid-Formal Verification
Presenter: Markus Willems, Synopsys
Duration: 40 min
One crucial metric for ensuring design compliance is functional coverage. Functional coverage ensures that the critical functionality within a design block is adequately tested. Earlier techniques, mainly based on simulation, relied solely on code coverage metrics to guide the verification process.
With the standardization of the SystemVerilog and SystemVerilog Assertions languages, the industry can now rely on an efficient way to express functional coverage events comprising complex temporal scenarios.
This presentation introduces the concept of Hybrid-Formal Verification and illustrates how Hybrid-Formal Verification, coupled with SystemVerilog, can be successfully used to increase functional coverage and to report unreachable coverage goals at the block level.
The presentation will conclude with a real-life application of Synopsys' Magellan, showing how Hybrid-Formal Verification was used to verify a complex PCI-Express block.
Presentation 2: Successful Adoption of Formal Verification
Presenter: Staffan Berg, Mentor Graphics
Duration: 40 min
Time spent on verification has surpassed that spent on design; industry reports indicate it accounts for as much as 70% of total development time. The total cost of failure has also increased, making it more critical that designs are adequately verified. This presentation addresses this verification challenge and shows how some of the world's leading semiconductor companies have successfully adopted advanced verification methodologies to counter it.
Presentation 3: Getting the Most Out of Formal Analysis
Presenter: Joerg Mueller, Cadence
Duration: 40 min
Using the right methodology for applying and using products within a design project is critical to getting a high return on investment (ROI). This is especially true in emerging areas such as formal analysis.
Formal analysis holds the promise of reducing testbench development and simulation cycles while improving design verification. Formal analysis uses sophisticated algorithms to conclusively prove or disprove that a design behaves as desired for all possible operating states. Desired behavior is not expressed in a traditional testbench, but rather as a set of assertions. Formal analysis does not require user-developed test vectors, but instead analyzes all legal input sequences concurrently and automatically.
When applied correctly, formal analysis improves productivity and reduces time to market. With formal analysis, many bugs can be found quickly and very early in the design process without the need to develop large sets of test vectors or random test generators. Furthermore, because of its exhaustive nature, formal analysis improves quality by finding corner-case bugs often missed by traditional testbench driven verification. The dual benefits of increased productivity and increased quality are driving the adoption of formal analysis into the design flows of many companies.
The adoption of formal analysis is a relatively recent phenomenon, enabled by the confluence of several factors. These factors include improvements in usability, the recent availability of industry-standard assertion languages, and advancements in performance and capacity. While these factors have enabled formal analysis to transition from the academic into the commercial realm, full mainstream adoption has been limited primarily due to the lack of an adequate methodology.
Presentation 4: Real Intent: Formal Methods for Verification from Spec to Sign-off
Presenter: Pranav Ashar, CTO, Real Intent
Duration: 40 min
Judicious application of formal methods to various verification needs is one of the must-have capabilities in an EDA tool suite today. Real Intent believes that the key to an efficient verification methodology is to catch design errors at the earliest possible stage as the design is transformed from its functional specification to the final taped-out form. In practical terms, this means that there must be a significant level of confidence in the design even before chip-level simulation is started; the almost total reliance thus far on simulation to catch all sorts of design errors is the primary reason for ballooning verification costs and multiple re-spins. Formal methods enable solutions to this early-verification requirement in significant ways.
Real Intent has been very successful in exploiting formal methods to provide practical solutions for catching functional RTL errors, errors in the implementation of clock-domain crossings, and errors in the specification of timing constraints. This has been possible through a combination of powerful formal engines, a thorough understanding of IC design principles, and a software architecture based on a clear understanding of use-model requirements for an effective deployment of formal methods.
Presentation 5: Theorem Proven Refinement with SystemB®
Presenter: Jean Mermet, KeesDA
Duration: 40 min
KeesDA believes that the key to a correct design is to elaborate first a good abstract specification. This can be done interactively with the stakeholders of a new system, with the help of the B method used by KeesDA’s SystemB®. After the specification is accepted by all parties, a refinement process can start that continues until the implementation of the system, either in silicon or in embedded software. SystemB, using an efficient predicate transformer and theorem prover, can check that each refinement step is a logical implication of the previous one and preserves the important properties that should remain invariant in the system. Along the refinement road, correct-by-construction SystemC Transaction-Level Models can be generated automatically from BTLM®. These models can be used, in a true HW/SW concurrent design, by the embedded software team. The refinement can continue until the implementation levels are reached: for software this means C, Ada or Java, generated automatically using AtelierB®; for hardware it means BHDL®, from which KeesDA’s tools automatically produce synthesisable VHDL or SystemC-RTL. Simulation, as a verification method, is left to software developers, who use the “correct-by-construction” SystemC-TLM to test their software, or to performance evaluation tasks. The functionality, being proven, need not be verified further.
As a further benefit, the implementable refinement of the system is proven to imply the upper levels, among them the TLM, giving the guarantee that the embedded software developed concurrently using this TLM will not have to be modified due to the hardware implementation. No other approach can provide this guarantee today.
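The claim that each refinement step implies the level above it can be made concrete with a schematic proof obligation, simplified from the full B-method obligations discharged by tools such as AtelierB (symbol names here are illustrative, not KeesDA's notation): every concrete transition must be matched by an abstract one under a gluing invariant linking the two state spaces.

```latex
% I      : abstract invariant
% J      : gluing invariant relating abstract state a to concrete state c
% BA_A   : before-after predicate of the abstract operation
% BA_C   : before-after predicate of the concrete (refined) operation
\[
  I(a) \;\wedge\; J(a, c) \;\wedge\; \mathrm{BA}_C(c, c')
  \;\Rightarrow\;
  \exists a'.\; \mathrm{BA}_A(a, a') \;\wedge\; J(a', c')
\]
```

Discharging this obligation at every step is what yields the correct-by-construction chain from abstract specification down to generated VHDL or software.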
SESSION 4: PARALLEL EDA DEMONSTRATIONS / DISCUSSION WITH COCKTAIL
Presenters: Cadence, KeesDA, Mentor Graphics, Real Intent, Synopsys