Functional Verification Advancements and UVM


Functional verification, in the VLSI industry, is the task of verifying that a logic design conforms to its specification. In simpler words, it is the task of answering the question: is this design working correctly? That question is quite hard to answer when we are dealing with SoCs containing on the order of 100 million gates. Functional verification is nonetheless a crucial task in the digital VLSI design flow. Because it is performed in the early stages of the project, as shown in Figure 1, it helps reduce the problems that appear later in the final stages of the flow and reduces the probability of silicon failure upon fabrication.

Figure 1: Standard Digital VLSI Design Cycle (Source: http://www.ece.unm.edu/)

To tackle the emerging verification challenges, a shift in the mindset around functional verification began about 15 years ago, which led to the appearance of the Universal Verification Methodology (UVM), the subject of this article. Before discussing it, we need to clear up a common confusion between verification and two related tasks: validation and testing. Validation is the task of checking whether the design meets the customer's requirements, while verification is concerned with the design specification: it is done in software, pre-silicon, to make sure the design's functionality behaves according to the specification. Testing, on the other hand, is performed on the hardware, in a lab, post-silicon, to make sure the design was taped out correctly.

The Design-Verification Gap

During the past decade, the time spent by system-on-chip developers in functional verification has risen to 60% or more of development time on some projects. Even developers of smaller chips and FPGAs are having problems with past verification approaches.
In 2001, the International Technology Roadmap for Semiconductors (ITRS) predicted that verification capability would fail to keep pace with design capability, as shown in Figure 2. To enhance functional verification, many proven and promising technologies have been developed, such as simulation, emulation, object-oriented programming (OOP), constrained-random stimulus, coverage-based verification and formal verification.

Figure 2: The verification gap leaves design potential unrealized. The potential for something to go wrong is greater, and the verification task has become exponentially more complex. (Source: SIA Roadmap, 2001)
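Two of these techniques, constrained-random stimulus and coverage-based verification, can be made concrete with a short SystemVerilog sketch. The constraint solver generates legal-but-varied stimulus, and a covergroup measures which cases have actually been exercised; the transaction fields, constraints and bins here are invented for illustration:

```systemverilog
// Illustrative sketch only: fields, constraints and bins are hypothetical.
class bus_txn;
  rand bit        write;                                 // 0: read, 1: write
  rand bit [15:0] addr;
  rand bit [7:0]  len;

  constraint c_align { addr % 4 == 0; }                  // word-aligned addresses only
  constraint c_len   { len inside {[1:64]}; }            // legal burst lengths
  constraint c_mix   { write dist {0 := 60, 1 := 40}; }  // bias toward reads
endclass

module demo;
  bus_txn txn = new();

  // Coverage model: has the random stimulus actually hit the cases we care about?
  covergroup bus_cg;
    cp_kind : coverpoint txn.write { bins rd = {0}; bins wr = {1}; }
    cp_addr : coverpoint txn.addr  { bins low  = {[16'h0000:16'h7FFF]};
                                     bins high = {[16'h8000:16'hFFFF]}; }
    kind_x_addr : cross cp_kind, cp_addr;                // all four combinations
  endgroup

  bus_cg cg = new();

  initial begin
    repeat (1000) begin
      if (!txn.randomize()) $fatal(1, "randomization failed");
      cg.sample();                                       // record this transaction
      // ... drive txn onto the DUT interface here ...
    end
    $display("functional coverage = %0.1f%%", cg.get_coverage());
  end
endmodule
```

Compared with directed testing, the test writer specifies what stimulus is legal and what must be covered, then lets randomization explore the space.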

It is worth mentioning that many of these techniques build on the capabilities of the SystemVerilog language, which combined the RTL capabilities of Verilog with the verification capabilities of the OpenVera language developed by Synopsys. SystemVerilog was adopted as an IEEE standard in 2005.

"Methodologies and tools for constructing and implementing hardware have dramatically improved, while verification processes appear to have not kept pace with the same improvements. As hardware construction is simplified, there is a trend to have fewer resources building hardware but the same or more resources performing verification. Design teams with 3X verification to hardware design are not unrealistic, and that ratio is trending higher." - Bill Grundmann, Fellow at Xilinx, DVCon 2014

Underlying this trend is the fact that there are too many disparate verification techniques, which has led to several problems:

1) Miscommunication: Different teams use different verification techniques, so communication between teams is difficult, and it is hard to bring a new team member up to speed on the techniques a particular team has adopted.

2) Poor reusability: With no standard way of doing things, it is hard to reuse parts of a project, whether horizontally in other projects or vertically in the same project.

This is where a methodology comes in. We need teams and engineers to do the same things in the same ways. A methodology provides guidance on when, where, why and how each technique should be applied for maximum efficiency. It also provides building-block libraries, documentation, coding guidelines and plenty of examples. A methodology lets the verification engineer focus on verification planning and the test effort rather than on complex test-bench architecture creation.

The Universal Verification Methodology (UVM) was announced by Accellera, a standards organization specializing in electronic design automation and IC design and manufacturing.
UVM is a complete methodology that includes best practices for efficient and exhaustive verification.

Separating Functional Verification from Design

To understand how functional verification became a field separate from design, we must go back about 15 years. In 2000, Verisity Design Inc. introduced a collection of best-known verification practices, targeted at the e language user community. In 2002, Verisity introduced the first verification library, the e Reuse Methodology (eRM). In 2004, the nine-year-old company was featured at edaforum04, held in Dresden, Germany, with a talk titled "Improving Shareholder Value by Separating Verification from Design", in which Verisity argued for the unique value that is created when the concerns of functional verification are separated from those of design. In 2005, Cadence Design Systems acquired Verisity in a deal estimated to be worth $285 million.

Verisity's were not the only efforts toward separating functional verification from design, or toward a unified verification methodology. In 2003, Synopsys announced its Reuse Verification Methodology (RVM) library for the Vera verification language. It did not include architecture guidelines and was considered a subset of eRM. Over time, it evolved into the SystemVerilog Verification Methodology Manual (VMM), supporting the evolving SystemVerilog standard. Later, in 2006, Mentor introduced its Advanced Verification Methodology (AVM), the first open-source methodology and the first to adopt the SystemC Transaction-Level Modeling (TLM) standard. After acquiring Verisity, Cadence began converting eRM to SystemVerilog, introducing the Universal Reuse Methodology (URM), which not only retained the proven capabilities of eRM but also used TLM and was likewise released as open source.

"Verisity is strong in verification automation and hardware acceleration. Add that to our strengths in simulation and in-circuit emulation, and we can offer a more complete solution for customers." - Adolph Hunter, group director of corporate communications at Cadence

In 2008, Cadence and Mentor collaborated to release the Open Verification Methodology (OVM). Its impact was great: it was the first multi-vendor methodology, tested against different vendors' simulators. This mattered because SystemVerilog was still in its early stages and many of its constructs lacked clarity. The OVM collaboration proved so successful that Synopsys joined Cadence and Mentor to introduce a unified methodology. In 2010, OVM 2.1.1 was chosen as the basis for the UVM standard, which is tested by all vendors, so technical comparisons between VMM and OVM are no longer needed. UVM is currently an Accellera standard.
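Before turning to UVM's features, a minimal sketch of what UVM-based code looks like may help. This uses the standard `uvm_pkg` base classes and macros, but the transaction fields and the sequence itself are invented for illustration:

```systemverilog
// Hypothetical sketch of a UVM transaction and sequence; names are illustrative.
import uvm_pkg::*;
`include "uvm_macros.svh"

class bus_txn extends uvm_sequence_item;
  rand bit [15:0] addr;
  rand bit [7:0]  data;
  rand bit        write;

  // Field automation: copy(), compare(), print() and pack() are generated for us.
  `uvm_object_utils_begin(bus_txn)
    `uvm_field_int(addr,  UVM_ALL_ON)
    `uvm_field_int(data,  UVM_ALL_ON)
    `uvm_field_int(write, UVM_ALL_ON)
  `uvm_object_utils_end

  function new(string name = "bus_txn");
    super.new(name);
  endfunction
endclass

// A sequence: reusable, randomizable stimulus handed to a driver via a sequencer.
class bus_write_seq extends uvm_sequence #(bus_txn);
  `uvm_object_utils(bus_write_seq)

  function new(string name = "bus_write_seq");
    super.new(name);
  endfunction

  task body();
    repeat (5) begin
      bus_txn txn = bus_txn::type_id::create("txn");
      start_item(txn);
      if (!txn.randomize() with { write == 1; })
        `uvm_error("SEQ", "randomize failed")
      finish_item(txn);
    end
  endtask
endclass
```

A test then simply starts `bus_write_seq` on a verification component's sequencer; the driver converts each transaction into pin-level activity, while the monitor and scoreboard check the results.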
UVM represents an alignment on verification methodology across the industry, supported by the major EDA suppliers and their ecosystems.

Universal Verification Methodology

The UVM is a complete methodology that codifies the best-known verification practices. One of its key principles is to produce reusable verification components, called Universal Verification Components (UVCs). It targets both small designs and large IP-based SoCs. The key features of UVM are:

1) Data structures: The UVM provides the ability to clearly partition your verification environment into a set of data objects and components. Moreover, it provides means for setting and getting data values hierarchically, textually printing and graphically viewing objects, and automating commonplace activities such as copying, comparing and packing items (which we will refer to later as transactions). This allows engineers to focus on what objects contain and how they work, instead of on the supporting code.

Figure 3: The UVM environment's different development levels - the test writer selects sequences, configures the environment(s) and runs the test; the UVC user integrates UVCs into environments (test-benches) to test different designs; the UVC developer designs the UVCs themselves, which is where most of the complexity lives.

2) Stimulus generation: The UVM provides infrastructure and built-in stimulus generation that can be customized to include user-defined transactions and transaction sequences. These sequences can be randomized and

controlled based on the current state of the design under test (DUT), its interfaces, or previously generated data.

3) Building and running reusable test-benches (test/test-bench separation): The UVM includes well-defined build flows for creating reusable verification environments. Moreover, it includes configuration mechanisms that allow customizing run-time behavior without modifying the original implementation. This is beneficial when creating a test-bench for a design with different IPs, interfaces or processors.

4) Coverage model design and checking practices: The UVM includes the best-known practices for incorporating functional coverage, in addition to protocol and data checks, into a reusable Universal Verification Component (UVC).

5) User example: The UVM library and user guide include a golden example based on an understandable, yet complete, protocol called the UBus.

A UVM test-bench is composed of UVCs. Each UVC is an encapsulated, ready-to-use, configurable verification component for an interface protocol, a sub-module or a full system. A UVC consists of a sequencer and a driver for stimulating the design, a monitor for observing the pin-level activity, and a scoreboard for checking; it can optionally contain a coverage collector. Consequently, UVM enables the verification process to be divided into three different levels, as shown in Figure 3. Moreover, UVM provides a framework for achieving coverage-driven verification (CDV), as shown in Figure 4. It combines automatic test generation, self-checking test-benches and coverage metrics, eliminating the effort and time spent creating hundreds of directed tests and ensuring thorough verification through up-front goal setting.

Figure 4: Constrained Random Verification Flow (Source: Mentor Graphics Verification Academy)

Conclusion

The advancements in VLSI design techniques and methodologies have created a huge gap between design and verification capabilities.
This gap has increased product cost and time-to-market while limiting design capability. Consequently, efforts have been exerted over the past 15 years to create and enhance new verification

methodologies and techniques that narrow it. These efforts led to the development of the Universal Verification Methodology (UVM). UVM provides many useful utilities and capabilities to verification engineers, but the question remains: will UVM be sufficient to face ever-growing design complexity? The fact that it was developed through the collaboration of the three EDA giants, and has been adopted and supported by Accellera as an open-source standard, suggests it will be supported for a long time to come.

References

1. "A new vision of 'scalable' verification", EE Times, 2013
2. SIA Roadmap, 2001
3. "Improving Shareholder Value by Separating Verification from Design", edaforum04, 2004
4. UVM Community, Accellera
5. Mentor Graphics Verification Academy
6. "Cadence buys Verisity for $285 million", EE Times, 2005

Mustafa Khairallah is a Verification Engineer at Boost Valley for Engineering Services. He is currently a Master's student at Ain Shams University, Electronics and Communications Department. Mustafa is a graduate of Alexandria University, Electronics and Communications Department, class of 2013, with a grade of Distinction with Honors, and has one published research paper.