
Verification of Operating System Monolithic Kernels Without Extensions

  • Evgeny Novikov
  • Ilja Zakharov
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11247)

Abstract

Most widely used, general-purpose operating systems are built on top of monolithic kernels to achieve maximum performance. These kernels are written primarily in C or C++ and may exceed one million lines of code even without optional extensions or loadable kernel modules such as device drivers and file systems. In addition, they evolve rapidly to support new functionality and to accommodate continuous optimization and defect fixing. Since operating systems and, in turn, applications strongly depend on monolithic kernels, requirements for their functionality, security, reliability, and performance are among the most stringent. Current approaches to software quality assurance reveal many defects in monolithic kernels, but none of them aims at detecting all violations of checked requirements while also guaranteeing that target programs always operate correctly. This paper presents a new method, based on software verification, that enables thorough checking of various versions of monolithic kernels and finding hard-to-detect faults in them. One of its key features is that valuable verification results can be obtained without considerable effort on configuring tools and developing specifications, while the quality of those results can still be steadily improved. We implemented the suggested method within the Klever software verification framework and evaluated it on subsystems of the Linux monolithic kernel.
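
As a rough illustration of the kind of check the method enables, the following minimal C sketch pairs a decomposed kernel fragment with an environment model and a requirement expressed as a reachability property; a software verifier such as CPAchecker can then explore all paths to the error location. The interface names (subsys_init, subsys_exit) and the requirement are hypothetical examples invented for this sketch, not the paper's actual verification tasks.

    /* Declarations conventionally understood by software verifiers. */
    extern void __VERIFIER_error(void);      /* error location to be proven unreachable */
    extern int  __VERIFIER_nondet_int(void); /* nondeterministic input from environment */

    static int initialized;

    static int subsys_init(void)
    {
        /* Initialization may fail; the environment decides nondeterministically. */
        if (__VERIFIER_nondet_int())
            return -1;
        initialized = 1;
        return 0;
    }

    static void subsys_exit(void)
    {
        /* Checked requirement: exit must run only after successful initialization. */
        if (!initialized)
            __VERIFIER_error();
        initialized = 0;
    }

    /* Environment model: invoke the subsystem interface in the order the kernel would. */
    int main(void)
    {
        if (subsys_init() == 0)
            subsys_exit();
        return 0;
    }

In this sketch the verification task is self-contained: the environment model plays the role of the rest of the kernel, so the verifier can analyze the fragment exhaustively instead of the whole monolithic kernel at once.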

Keywords

Formal verification, Software verification, Deductive verification, Formal specification, Program decomposition, Environment model, Operating system, Monolithic kernel


Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  1. Ivannikov Institute for System Programming of the Russian Academy of Sciences, Moscow, Russia
