A Glimpse of Metric-Driven Verification Methodology

As design complexity increases, traditional verification methodologies become less and less adequate for verifying hardware designs. Directed tests were used long ago; later, Coverage-Driven Verification (CDV) methodology emerged. In the directed-test approach, the verification engineer states exactly what stimulus should be applied to the Design Under Test (DUT). This works only for small designs with a very limited set of features.

As designs became more complex, verification engineers started looking for ways to check the effectiveness of verification, or in other words, which features were actually covered during verification. This is the whole idea behind CDV: cover-groups are set up for the features to be verified and are used to drive coverage closure. Stimulus generation in CDV is random (using the constrained-random generation method), so this approach is much more effective than directed tests. CDV improves both productivity and quality, but planning and estimating verification completion remains difficult. For complex designs there can be thousands of cover-groups, and mapping them back to the specification is hard.
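As a minimal SystemVerilog sketch of the CDV idea (the transaction class, field names and constraints here are invented for illustration, not taken from any particular DUT):

```systemverilog
// Hypothetical ALU-style transaction: constrained-random stimulus.
class alu_txn;
  rand bit [3:0] opcode;
  rand bit [7:0] a, b;
  // Constrain generation so only legal opcodes are produced.
  constraint legal_op_c { opcode inside {[0:9]}; }
endclass

// Cover-group recording which planned features were actually exercised.
covergroup alu_cg with function sample (alu_txn t);
  cp_op  : coverpoint t.opcode;              // has every legal opcode been hit?
  cp_a   : coverpoint t.a {
             bins zero = {0};
             bins max  = {8'hFF};
             bins mid  = default;
           }
  op_x_a : cross cp_op, cp_a;                // opcode x operand corner cases
endgroup
```

During simulation each randomized transaction is passed to `alu_cg.sample()`; the accumulated bins are what the later "measure" stage reports against the plan.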

Metric-Driven Verification (MDV) is a proven methodology for verifying hardware designs, introduced by Cadence. It is based on the CDV approach but overcomes its pitfalls. In the MDV flow, features are captured in an executable verification plan. This is the first phase of verification, and the plan is later correlated with the actual cover-groups. MDV uses constrained-random stimulus generation, which yields better coverage than the traditional directed-simulation method.

Different stages in the Metric-Driven Verification flow:
The MDV flow has five stages: plan, construct, execute, measure and analyze. The coverage information from the “measure” stage is mapped back to the verification plan, and the analysis shows which features are already verified by the existing tests and the given seeds. Having this information upfront helps improve the verification environment and greatly reduces the chance of missing a planned feature.

The verification plan is a living document whose goal is to verify the functionality of the design completely. It needs to correlate the functional specification, the designers’ intent and the test-bench implementation. The plan can be an XML file, a spreadsheet, a document or a text file, and it defines exactly what needs to be verified. The plan can be organized into sections such as interested features, co-features, interface features, etc. A good, meaningful verification plan helps verification engineers reach their final goal by correlating the different coverage results to each feature. It also helps measure the progress of verification at different stages, and the estimated effort can be re-evaluated if required.
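One possible text-file layout for such a plan (the section names follow the article; the features and cover-group names are invented for illustration):

```
Section 1: Interested features
  1.1  FIFO full/empty flag behaviour   -> maps to covergroup fifo_cg.cp_flags
  1.2  Back-to-back write bursts        -> maps to covergroup fifo_cg.cp_b2b
Section 2: Co-features
  2.1  Reset asserted mid-transfer      -> maps to cover property p_rst_mid_xfer
Section 3: Interface features
  3.1  Bus burst types on read channel  -> maps to covergroup bus_cg.cp_burst
```

Each planned feature points at a concrete coverage item, which is what lets the later stages roll raw coverage numbers up into per-feature progress.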

Without a plan it is difficult to distinguish high-priority from low-priority features, and all coverage information appears flat. Verification engineers will not have a clear picture of progress or of verification closure.


The next step is to construct the verification environment. Verification engineers build it by reusing existing verification IPs, reusing available UVM/OVM libraries, and/or developing parts of the environment from scratch, depending on what was decided in the planning stage. The test-bench and some of the test cases will be ready by the end of this stage.
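A minimal UVM sketch of that mix of reuse and new development (all class names here are placeholders, not real library components):

```systemverilog
import uvm_pkg::*;
`include "uvm_macros.svh"

class my_env extends uvm_env;
  `uvm_component_utils(my_env)

  existing_vip_agent agent;  // reused verification IP (placeholder name)
  my_scoreboard      scbd;   // new component written from scratch

  function new (string name, uvm_component parent);
    super.new(name, parent);
  endfunction

  function void build_phase (uvm_phase phase);
    super.build_phase(phase);
    // Factory creation keeps both components overridable from tests.
    agent = existing_vip_agent::type_id::create("agent", this);
    scbd  = my_scoreboard::type_id::create("scbd", this);
  endfunction
endclass
```

The factory-based `create()` calls are what make the reuse practical: a derived test can swap in a different agent or scoreboard without editing the environment.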

Once the verification environment is ready, test cases can be executed and the results checked. Cadence’s vManager tool can launch the regression, capture the results, and correlate them with the verification plan, provided the v-plan feature information is specified while defining the coverage in your code. Incisive Metrics Center is now the default unified coverage browser, which clearly shows which parts of the design have been exercised.
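A regression of this kind is typically described to vManager in a .vsif file; the sketch below shows the general shape only (attribute names and values are from memory and may differ across tool versions):

```
session mdv_demo_session {
  top_dir : ./regression_results;
};

group smoke_tests {
  test base_sanity {
    count : 20;          // launch 20 runs, each with a different random seed
    seed  : random;
  };
};
```

Running many seeds of the same constrained-random test is what lets one test definition contribute coverage for many planned features.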

Once the coverage information is available, it should be analysed against the v-plan. The Cadence Incisive tool suite gives a clear picture of the v-plan-to-feature mapping against the coverage results. It also provides coverage-based ranking to show which tests are most effective and which are redundant. A test with a ranking id of -1 is redundant and can be filtered out, while a ranking id of 0 marks the most effective test. The ranking of the other tests, and the incremental coverage gained by executing them, can be examined in the same way.

With better verification planning and management, correlated with coverage, the MDV flow significantly improves the productivity of your verification.

Comments

  2. Ashwath Hegde

    Hello ma’am,

    I’m currently in the final semester of my engineering degree and learning the basics of verification.
    I often came across the word traceability in IP development, but I have not been able to grasp its exact meaning.

    Could you explain this? Also, how does Cadence’s vManager help to achieve traceability?

    Thank you.

    1. Sini Balakrishnan Post author

      Traceability is a general term for the ability to trace back through all the stages that led to a particular point in a process.

      In ASIC verification, the steps followed are:

      1. Define the verification plan – verification scenarios, methodology, goals, coverage plans and targets, etc.
      vManager is an efficient tool for developing the verification plan using v-Planner/e-Planner.
      (Here the tool can capture, classify and track specification data.)

      2. Develop the verification environment, test cases and coverage code, and run them using a simulation tool.

      3. Run the regression (vManager) to include all the tests with multiple seeds.
      vManager produces a report that helps find the missing parts of the verification plan and hence the specification holes.
      Here we can trace back from the final vManager report to the specification to check the completeness of our verification. This is spec-to-test traceability.

      Another area we call traceability is debugging:
      if a test fails in vManager, it gives enough information to find the root cause of the problem. This is called debug traceability.

      Thanks
      Sini

