Reverse Coverage Analysis

  • Ariel Birnbaum
  • Laurent Fournier
  • Steve Mittermaier
  • Avi Ziv
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7261)


Commonly used approaches for accumulating coverage data do not properly track events that have been covered in the past but not recently (stale events). They either treat stale events as covered events (global approach) or as uncovered events (window approach). We propose a new approach, called reverse coverage analysis, that is based on tracking the last time each coverage event was hit and examining the coverage data backward in time from the present. With this approach, we can easily identify stale events and determine when the ability to cover them was lost. The reverse coverage approach was successfully used in the verification of two high-end IBM microprocessors, where it improved the treatment of stale events and their causes.
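The classification the abstract describes can be sketched with last-hit timestamps: an event hit within a recent window is covered, an event hit only before that window is stale, and an event never hit is uncovered. The sketch below is a minimal illustration of this idea, not the authors' tool; the `classify_events` function, its parameters, and the example timestamps are hypothetical.

```python
def classify_events(last_hit, all_events, now, window):
    """Classify coverage events by their last-hit time.

    last_hit:   dict mapping event name -> timestamp of its most recent hit
                (events never hit are absent from the dict)
    all_events: the full coverage space to classify
    now:        the present time
    window:     how far back a hit still counts as "recent"
    """
    recent, stale, uncovered = [], [], []
    for event in all_events:
        t = last_hit.get(event)
        if t is None:
            uncovered.append(event)          # never covered
        elif now - t <= window:
            recent.append(event)             # covered recently
        else:
            stale.append(event)              # covered in the past, not recently
    return recent, stale, uncovered

# Hypothetical example: event "b" was last hit long ago, so it is stale.
recent, stale, uncovered = classify_events(
    last_hit={"a": 95, "b": 40}, all_events=["a", "b", "c"],
    now=100, window=10)
# recent == ["a"], stale == ["b"], uncovered == ["c"]
```

A global approach would lump "b" in with "a" as covered, while a window approach would lump it in with "c" as uncovered; tracking last-hit times keeps the distinction, and scanning the timestamps backward from `now` shows when coverage of each stale event was lost.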


Keywords: Coverage Data · Coverage Model · Coverage Analysis · Covered Event · Coverage Space





Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Ariel Birnbaum (1)
  • Laurent Fournier (1)
  • Steve Mittermaier (2)
  • Avi Ziv (1)

  1. IBM Research - Haifa, Israel
  2. IBM Server and Technology Group, Poughkeepsie, USA
