Building Faithful Embedded Systems Models: Challenges and Opportunities

Abstract

In this chapter, we give an overview of the main challenges in building faithful embedded systems models. We highlight the growing demand for formal models, especially for reasoning about performance. The chapter illustrates the impact of the hardware part of the system on performance and suggests a probabilistic interpretation for building appropriately abstract models that support trustworthy analysis. We believe this view is worth investigating as a way to faithfully characterize system performance, since it provides a formal and parsimonious framework. In this context, we survey probabilistic models and techniques that we consider promising for building such faithful representations.
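To make the probabilistic view concrete, the sketch below illustrates one elementary instance of it: instead of summarizing execution time by a single worst-case number, measured samples are treated as draws from a distribution, from which quantities such as the probability of exceeding a deadline can be estimated empirically. The data here is synthetic and the numbers (a fast common path and a slower, e.g. cache-miss-heavy, path) are purely illustrative assumptions, not taken from the chapter.

```python
import random
import statistics

def empirical_exceedance(samples, deadline):
    """Fraction of observed execution times that exceed the deadline."""
    return sum(1 for t in samples if t > deadline) / len(samples)

random.seed(0)
# Hypothetical measured execution times (ms): a two-mode mixture
# standing in for a fast common path and a rarer slow path.
samples = [random.gauss(10.0, 1.0) if random.random() < 0.9
           else random.gauss(25.0, 3.0)
           for _ in range(10000)]

mean_t = statistics.mean(samples)
p_miss = empirical_exceedance(samples, deadline=20.0)
print(f"mean = {mean_t:.1f} ms, P(time > 20 ms) ~ {p_miss:.3f}")
```

A deterministic worst-case bound would report only the maximum observed time; the distributional view additionally quantifies how likely that worst case is, which is the kind of parsimonious abstraction the chapter argues for.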


Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. Université Grenoble Alpes, Grenoble, France
  2. CEA, LETI, MINATEC Campus, Grenoble, France
  3. VERIMAG, Université Grenoble Alpes, Grenoble, France
  4. VERIMAG, CNRS, Grenoble, France