Mobile Networks and Applications, Volume 24, Issue 1, pp 100–114

Modeling and Verifying Basic Modules of Floodlight

  • Shuangqing Xiang
  • Xi Wu
  • Huibiao Zhu
  • Wanling Xie
  • Lili Xiao
  • Phan Cong Vinh
Article

Abstract

Software-Defined Networking (SDN) is an emerging networking paradigm that provides flexible network programmability and reduces the complexity of network management and control. The OpenFlow protocol is the best-known southbound interface of SDN. As one of the most popular OpenFlow controllers, Floodlight is known for its well-designed basic modules, which are also used by many other mainstream controllers. Given the popularity and widespread use of Floodlight's modules, it is important to analyze them in a formal framework. In this paper, we focus on six modules of Floodlight that are mainly responsible for pushing decisions and discovering the network topology. In order to verify whether these modules meet two important requirements, namely Decision Pushed Correctly and Rule Installed Once, we propose a system model that formalizes these modules, together with hosts and switches, using Communicating Sequential Processes (CSP). Moreover, we construct a second system model by adding two types of attacks, Link Fabrication and Host Hijacking, both of which aim to poison the topology information collected by the controller. Finally, we implement the two system models in the Process Analysis Toolkit (PAT), a model checker, to verify whether our models satisfy four assertions derived from the two requirements, and whether Floodlight is vulnerable to these two attacks.
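
To give a flavor of the kind of model the abstract describes, the sketch below is a toy CSP# model in PAT syntax of a controller pushing a decision to a switch, followed by an assertion in the spirit of the Decision Pushed Correctly requirement. The process names, events, channel, and assertion are illustrative assumptions for this summary, not the processes or assertions of the paper's actual model.

// Synchronous channel (buffer size 0) from controller to switch -- hypothetical.
channel decision 0;

// Hypothetical controller: computes a forwarding decision and pushes one rule.
Controller() = computeDecision -> decision!1 -> Controller();

// Hypothetical switch: receives the decision and installs the corresponding rule.
Switch() = decision?r -> installRule.r -> Switch();

// The components run interleaved and interact only through the channel.
System() = Controller() ||| Switch();

// In the spirit of "Decision Pushed Correctly": whenever a decision is
// computed, the corresponding rule is eventually installed.
#assert System() |= [](computeDecision -> <> installRule.1);
#assert System() deadlockfree;

In the full models, PAT explores the state space of such process compositions to check whether every trace satisfies the stated assertions, which is how the four assertions and the two attack scenarios would be examined.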

Keywords

Modeling · Formal verification · SDN · Floodlight

Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2018

Authors and Affiliations

  1. Shanghai Key Laboratory of Trustworthy Computing, East China Normal University, Shanghai, China
  2. School of Information Technology and Electrical Engineering, The University of Queensland, Brisbane, Australia
  3. College of Computer Science and Software Engineering, Shenzhen University, Shenzhen, China
  4. Nguyen Tat Thanh University, Ho Chi Minh City, Vietnam
