CSE Verification Plan


CSE 45493-3 Verification Plan 1

Verification Plan This is the specification for the verification effort. It indicates what we are verifying and how we are going to do it! 2

Role of the Verification Plan Traditional approach: do as you wish! New approaches: want metrics to determine when verification is done. The plan specifies the verification effort and defines first-time success (ideally). 3

Specifying the Verification When are you done? Metrics The specification determines what has to be done! The specification does NOT determine the following: How many will it take? How long will it take? Are you doing what needs to be done? Verification plans help define these! 4

Specifying the Verification (continued) The plan starts from the design specification. Why? The reconvergence model! The plan must exist in written form. A verbal "the same thing as before, but at twice the speed, with these additions" is not an acceptable specification. The specification is a golden document; it is the law. It is the common source for both the verification effort and the implementation effort. 5

Defining first-time success The plan determines what is to be verified: if, and only if, it is in the plan, will it be verified. The plan provides the forum for the entire design team to define first-time success. For first-time success, every feature must be identified, along with the conditions under which it applies and its expected response. The plan documents these features (as well as optional ones) and prioritizes them. This way, risk can be consciously mitigated to pull in the schedule or reduce cost. 6

Defining first-time success (cont.) The plan draws a well-defined line that, when crossed, can endanger the whole project and its success in the market. It defines: How many test scenarios (testbenches/testcases) must be written How complex they need to be Their dependencies From the plan, a detailed schedule can be produced: Number of resources it will take (people, machines, etc.) Number of tasks that need to be performed Time it will take with current resources 7

Verification plan The team owns it! Everyone involved in the project has a stake in it. Anybody who identifies deficiencies should have input so they are fixed. The process used to write the plan is not new: it has been used in industry for decades and is just now finding its way to hardware. Others who use it: NASA, the FAA, software teams. 8

Hierarchical design System Chip... Unit Macro Allows the design team to break a system down into logical and comprehensible components. Also allows for repeatable components. 9

Verification levels Verification can be performed at various granularities: Designer/Unit/sub-subunit ASIC/FPGA/Reusable component System/sub-system/SoC board 10

Verification levels (continued) How to decide which levels? There are trade-offs: Smaller pieces offer more control and observability, but take more effort because more environments must be constructed. With larger pieces, the integration of the smaller partitions is implicitly verified, at the cost of lower control and observability, but with less effort because there are fewer environments. Whatever levels are decided, those pieces should remain stable. Each verifiable piece should have its own specification document. 11

Designer level Usually pretty small: one design by one design engineer. Interfaces and functionality tend to change often. Usually doesn't have an independent specification associated with it. The environment (if one exists) is usually ad hoc. Not intended to gain full coverage, just enough to verify basic operation (no boundary conditions). 12

Unit level Typically the designs assigned to one design engineer. Each unit-level piece is stressfully verified, unlike the ad hoc designer-level approach. If the unit level is fully verified, then you only need to worry about the boundary conditions. A high number of units in a design usually means unit-level verification is not reasonable, due to the high effort of building the environments. 13

Reusable component level Independent of any particular use; intended to be used as-is in many different parts of a design. Saves verification time. BFMs can be coded for these pieces for use by anyone who communicates with the reusable component. They necessitate a regression suite: components will not be reused if users do not have confidence in them, and that confidence is gained by a well-documented and well-executed process. 14

ASIC/FPGA level The physical partition provides a natural boundary for verification. Once specified, very little changes, due to the ramifications downstream. FPGAs used to get debugged at integration; now they are so dense that a more ASIC-like flow is required. 15

System level What is a system? Everyone's definition is a little different. Verification at this level focuses on interaction, not on function buried in one piece. A large ASIC may take this approach (assuming that all pieces that make up the ASIC are specified, documented, and fully verified). Assume that individual components are functionally correct. The testcase defines the system. 16

Board level Main objective: confirm the connectivity produced by the board design tool. Perhaps better suited to formal verification. Can also be used as a system to verify its functionality. When doing board-level verification, one must ensure that what is in the simulated system is what will be manufactured. Things to consider: Problem: getting models for all components (3rd party?). Many board-level components don't fit into a digital simulation: analog! Board-level parasitics. 17

Current practices for verifying a system Designer Level Sim: verification of a macro (or a few small macros). Unit Level Sim: verification of a group of macros. Chip (Element) Sim: verification of an entire logical function (i.e. processor, storage controller, Ethernet MAC, etc.). System Level Sim: multiple-chip verification; sometimes done at a board level; can utilize a mini operating system. 18

In an Ideal World... Every macro has had a perfect verification: all permutations are verified based on legal inputs, and all outputs are checked on the small chunks of the design. Unit, chip, and system level would then only need to check interconnections and ensure that macros use correct input/output assumptions and protocols. 19

Reality Macro verification across a system is not feasible: there may be over 100 macros on a chip, which would require 200 verification engineers. There is a shortage of skilled verification engineers, and the business can't support the development expense. Verification leaders must make reasonable trade-offs: concentrate on the unit level, with designer-level verification on the riskiest macros. 20

Verification Strategies Level of granularity Types of testcases for each granularity Test cases: white box, black box, grey box Level of abstraction Processor example How to verify the response How random will it be 21

Verifying the Response Deciding how to apply stimulus is usually the easy part; deciding the response is the hard part. Must plan how to determine the expected response, and how to verify that the design provided it. Self-checking testbenches are recommended. Detect errors and stop simulation as early as possible. Why? An error identified when it takes place is easier to debug, and no simulation cycles are wasted. 22
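The self-checking idea above can be sketched as a minimal comparison loop. This is a hypothetical illustration (the adder, the reference model, and the function names are invented): the design-under-verification's output is compared against a reference model every cycle, and simulation stops at the first mismatch.

```python
# Hypothetical self-checking comparison loop: drive stimulus, compare the
# design-under-verification (DUV) output against a reference model, and
# stop at the first mismatch so the failure is caught where it occurs.
def check_adder(duv_add, ref_add, stimulus):
    for cycle, (a, b) in enumerate(stimulus):
        expected = ref_add(a, b)
        actual = duv_add(a, b)
        # Stop simulation as early as possible: a mismatch now is far
        # easier to debug than one discovered thousands of cycles later.
        assert actual == expected, (
            f"cycle {cycle}: add({a},{b}) = {actual}, expected {expected}")
    return True

ref = lambda a, b: (a + b) & 0xFFFF   # 16-bit reference model
duv = lambda a, b: (a + b) & 0xFFFF   # stand-in for the real design
print(check_adder(duv, ref, [(1, 2), (0xFFFF, 1)]))  # True
```

A broken DUV (say, one that drops the carry) is flagged on the exact cycle it misbehaves, with no wasted cycles after the failure.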

Random Verification Does not mean randomly applying 0s and 1s: inputs must produce valid stimulus. The sequence of operations and the content of the data transferred are random! This creates conditions that one does not think of, hits corner cases, and reduces bias from the engineer. Very complex to specify and code: it must be a fair representation of operating conditions, and it is more complicated to determine the expected response. With proper constraints, a random environment can produce directed tests. 23
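A small sketch of the constrained-random idea, with an invented bus transaction (the opcodes, address range, and field names are assumptions for illustration): the operation sequence and data are random, but constraints keep every transaction legal.

```python
import random

# Hypothetical constrained-random generator: the *sequence* of operations
# and the data payloads are random, but constraints keep every transaction
# legal (valid opcode, word-aligned address, bounded burst length).
def random_transaction(rng):
    op = rng.choice(["READ", "WRITE"])       # only legal operations
    addr = rng.randrange(0, 0x1000) & ~0x3   # constraint: word-aligned
    length = rng.randint(1, 16)              # constraint: 1..16 beats
    data = [rng.randrange(256) for _ in range(length)] if op == "WRITE" else None
    return {"op": op, "addr": addr, "len": length, "data": data}

rng = random.Random(42)                      # seeded for reproducibility
txns = [random_transaction(rng) for _ in range(100)]
# Every generated transaction satisfies the constraints.
assert all(t["addr"] % 4 == 0 and 1 <= t["len"] <= 16 for t in txns)
```

Tightening the constraints (e.g., forcing op to "WRITE" at a fixed address) turns the same generator into a directed test, which is the last point on the slide.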

Specifications to Features The first step in planning is to identify verifiable features. Label each feature with a short description, cross-referenced between the specification and the verification. Unit-level features are the bulk of verification. System-level features concentrate on functional errors, not tool-induced errors. Specify features at the proper level of verification. 24

Features to testcases Prioritize the features: must-have features gate tape-out; less important features receive less attention. This helps project managers make informed decisions when schedule pressures arise. Group features into testcases and assign each to an engineer. Describe testcases and cross-reference them to the features list: if a feature does not have a testcase, it is not being verified. Define dependencies; this helps prioritize. Specify testcase stimulus (or constraints for random). Also specify how each feature will be checked, and inject errors to ensure that the checks are working. 25
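The feature-to-testcase cross-reference can be kept as simple structured data. A minimal sketch (feature IDs, descriptions, and testcase names are invented for illustration) that flags the rule from the slide: a feature with no testcase is, by definition, not being verified.

```python
# Hypothetical feature list and testcase cross-reference.
features = {
    "F1": {"desc": "reset clears all state", "priority": "must-have"},
    "F2": {"desc": "back-to-back writes",    "priority": "must-have"},
    "F3": {"desc": "parity error injection", "priority": "optional"},
}
testcases = {
    "tc_reset": ["F1"],   # each testcase lists the features it covers
    "tc_burst": ["F2"],
}

covered = {f for fs in testcases.values() for f in fs}
unverified = sorted(set(features) - covered)
print(unverified)   # ['F3'] -- flag it in planning, not after tape-out
```

Because the mapping is explicit, priorities can be queried the same way, e.g. to confirm that every must-have feature is covered before tape-out.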

Testcases to testbenches Group like testcases into testbenches (similar configuration or same abstraction level). Cross-reference testbenches with testcases. Assign each group of testcases to a verification engineer. Verifying the testbench: use a broken model for each feature to be verified; hold peer reviews; provide logging messages in the testbench. 26
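The "broken model" technique above is worth a concrete sketch (the models and checker here are invented stand-ins): deliberately inject a bug and confirm the testbench checker flags it, because a check that passes on a broken model is not checking anything.

```python
# Hypothetical error-injection check for verifying the testbench itself.
def checker(result, expected):
    return result == expected

def run(model, a, b):
    # The checker compares the model's output against the expected sum.
    return checker(model(a, b), a + b)

good_model   = lambda a, b: a + b
broken_model = lambda a, b: a + b + 1     # deliberately injected bug

assert run(good_model, 2, 3) is True      # checker passes a good design
assert run(broken_model, 2, 3) is False   # checker catches the injected bug
print("checker verified against a broken model")
```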

Verification Do's and Don'ts Do: Talk to designers about the function. Spend time thinking about all the pieces that need to be verified and situations the designer might have missed. Talk to other designers about the signals that interface to the DUV. Don't: Rely on the designer's word for the input/output specification. Allow a chip to go out without being properly verified: pay now, or pay later (with inflation!). 27

Terminology Facilities: a general term for named wires (or signals) and latches. Facilities feed gates (and/or/nand/nor/invert, etc.), which feed other facilities. EDA: Engineering Design Automation: tool vendors. IBM has an internal EDA organization that supplies tools; we also procure tools from external companies. Behavioral: code written to perform the function of logic on the interface of the design-under-test. Macro: a behavioral; a piece of logic. Driver: code written to manipulate the inputs of the design-under-test. The driver understands the interface protocols. 28

Terminology (Cont.) Checker: code written to verify the outputs of the design-under-test. A checker may have some knowledge of what the driver has done. A checker must also verify interface protocol compliance. Snoop/Monitor: code that watches interfaces or internal signals to help the checkers perform correctly; also used to help drivers be more devious. Architecture: the design criteria as seen by the customer. The design's architecture is specified in documents, and the design must be compliant with this specification. Microarchitecture: the design's implementation. Microarchitecture refers to the constructs used in the design, such as pipelines, caches, etc. 29

Escape analysis Escape analysis is a critical part of the verification process. Fully understand the bug! Reproduce it in simulation if possible: without a reproduction, the fix cannot be verified, and the bug could be misunderstood. Why did the bug escape simulation? Update the process to avoid similar escapes in the future (plug the hole!). 30

Verification Cycle Create Testplan -> Develop environment -> Debug hardware (this class covers only these steps) -> Regression -> Fabrication -> Hardware debug -> Escape Analysis 31


Testplan summary The testplan includes: Schedule Specific tests and methods by simulation level Required tools Input criteria Completion criteria What is expected to be found with each test/level What's not covered by each test/level 33

Verification Methodology Evolution (over time, delivering more stress per cycle): Test Patterns: hand generated, hand checked, hardcoded. Testcases: hand generated, self checking, hardcoded (AVPs, IVPs). Testcase Generators: tool generated, self checking, hardcoded (AVPGEN, GENIE/GENESYS, SAK). Testcase Drivers: interactive on-the-fly generation, on-the-fly checking, random (SMP, C/C++). Plus coverage tools and formal verification. 34

Escape Analysis: Classification We currently classify all escapes under two views. Verification view: what are the complexities that allowed the escape? Cache set-up, cycle dependency, configuration dependency, sequence complexity, and expected results. Design view: what was wrong with the logic? Logic hole, data/logic out of sync, bad control reset, wrong spec, bad logic. 35

Basic Testcase/Model Interface: Clocking Clocking cycles: a simulator has the concept of time. Event sim uses the smallest increment of time in the target technology; all other sim environments use a single cycle. A testcase controls the clocking of cycles (the movement of time). All APIs include a clock statement. Example: "Clock(n)", where n is the increment to clock (usually '1'). 36

Basic Testcase/Model Interface: Setfac/Putfac Setting facilities: a simulator API allows you to alter the value of facilities. Used most often for driving inputs; can also be used to alter internal latches or signals. Can set a single bit or a multi-bit facility; values can be 0, 1, or possibly X, high impedance, etc. Example syntax: "Setfac facility_name value", e.g. Setfac address_bus(0:31) "0F3D7249"x 37

Basic Testcase/Model Interface: Getfac Reading facility values: a simulator API allows you to read the value of a facility. Used most often for checking outputs; can also be used to read internal latches or signals. Example syntax: "Getfac facility_name varname", e.g. Getfac adder_sum checksum 38

Basic Testcase/Model Interface: Putting it together Clocking, setfacs, and getfacs occur at set times during a cycle. Setting of facilities must be done at the beginning of the cycle; getfacs must occur at the end of a cycle. In between, control goes to the simulation engine, where the logic under test is "run" (evaluated). Example: Setfac address_bus(0:31) "0F3D7249"x ... Getfac adder_sum checksum 39
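The setfac/clock/getfac protocol above can be sketched in a few lines. This is a toy model, not a real simulator API: the facility names (a_bus, b_bus, adder_sum) and the adder logic are invented to show the ordering within a cycle.

```python
# Toy sketch of the testcase/model interface: Setfac at the start of a
# cycle, the logic is evaluated when the clock advances, Getfac at the end.
class TinySim:
    def __init__(self):
        self.facs = {}          # named facilities (wires/latches)
        self.cycle = 0

    def setfac(self, name, value):   # drive an input facility
        self.facs[name] = value

    def getfac(self, name):          # read an output facility
        return self.facs.get(name)

    def clock(self, n=1):            # advance time; "run" the logic
        for _ in range(n):
            a = self.facs.get("a_bus", 0)
            b = self.facs.get("b_bus", 0)
            self.facs["adder_sum"] = (a + b) & 0xFFFFFFFF
            self.cycle += 1

sim = TinySim()
sim.setfac("a_bus", 0x0F3D7249)      # beginning of cycle: set inputs
sim.setfac("b_bus", 0x00000001)
sim.clock(1)                          # control passes to the sim engine
checksum = sim.getfac("adder_sum")    # end of cycle: read outputs
print(hex(checksum))                  # 0xf3d724a
```

The ordering is the whole point: swapping the getfac before the clock call would read a stale value, which is exactly the mistake the slide warns against.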