
Designing Repeatable Experiments on an Emulab Testbed

  • Andres Perez-Garcia
  • Christos Siaterlis
  • Marcelo Masera
Part of the Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering book series (LNICST, volume 66)

Abstract

Emulation testbeds are increasingly used in an effort to promote repeatable experiments in the area of distributed systems and networking. In this paper we study how different design choices, e.g., the use of specific tools, can affect the repeatability of experiments on an emulation testbed (e.g., one based on the Emulab software).

Our study is based on multiple experiments that are checked for stability and consistency (e.g., repeating the same experiment and measuring the mean and standard deviation of our metrics). The results indicate that repeatability of quantitative results is possible, within a degree of expected statistical variation. The event scheduling mechanism of Emulab proves accurate down to sub-second granularity. On the other hand, we demonstrate that traffic generation tools differ significantly in how consistently they recreate a predefined traffic pattern, and hence in experiment repeatability.
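The stability check described above can be sketched as follows: repeat an experiment, collect a metric from each run, and summarize it with mean, standard deviation, and coefficient of variation. This is a minimal illustration of the idea, not the paper's actual analysis; the throughput values below are made-up placeholders.

```python
# Minimal sketch of a repeatability check: summarize a metric
# collected over repeated runs of the same experiment.
from statistics import mean, stdev

def summarize(samples):
    """Return (mean, standard deviation, coefficient of variation)."""
    m = mean(samples)
    s = stdev(samples)
    return m, s, s / m

# Hypothetical throughput measurements (Mbit/s) from five repetitions.
throughput_mbps = [94.1, 93.8, 94.3, 94.0, 93.9]
m, s, cv = summarize(throughput_mbps)
print(f"mean={m:.2f} Mbit/s, std={s:.3f}, cv={cv:.4f}")
```

A small coefficient of variation across repetitions is then taken as evidence that the result is repeatable within expected statistical variation.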

The main contribution of this study is that, based on experimental results, we provide scientific evidence that Emulab as a platform can be used for scientifically rigorous networking experiments. New users of Emulab can benefit from this study by understanding that Emulab's scheduling mechanism, its built-in packet generators and Iperf can sufficiently support repeatable experiments, while TCPreplay cannot; an alternative tool, i.e. TCPivo, should be used instead.

Keywords

emulation, network test-bed, repeatability, traffic generators


References

  1. Pawlikowski, K., Jeong, H.-D.J., Lee, J.-S.R.: On credibility of simulation studies of telecommunication networks. IEEE Communications Magazine 40, 132–139 (2002)
  2. Benzel, T., Braden, R., Kim, D., Neuman, C., Joseph, A., Sklower, K., Ostrenga, R., Schwab, S.: Design, deployment, and use of the DETER testbed. In: Proceedings of the DETER Community Workshop on Cyber Security Experimentation and Test, p. 1. USENIX Association, Berkeley (2007)
  3. Neville, S.W., Li, K.F.: The rational for developing larger-scale 1000+ machine emulation-based research test beds. In: International Conference on Advanced Information Networking and Applications Workshops, pp. 1092–1099 (2009)
  4. Emulab Bibliography, http://www.emulab.net/expubs.php/
  5. White, B., Lepreau, J., Stoller, L., Ricci, R., Guruprasad, S., Newbold, M., Hibler, M., Barb, C., Joglekar, A.: An integrated experimental environment for distributed systems and networks. In: Proc. of the Fifth Symposium on Operating Systems Design and Implementation, pp. 255–270. USENIX Association, Boston (2002)
  6. DETER: cyber-DEfense Technology Experimental Research laboratory Testbed, http://www.isi.edu/deter/
  7. Mirkovic, J., Hussain, A., Fahmy, S., Reiher, P.L., Thomas, R.K.: Accurately measuring denial of service in simulation and testbed experiments. IEEE Trans. Dependable Sec. Comput. 6(2), 81–95 (2009)
  8. Anderson, D.S., Hibler, M., Stoller, L., Stack, T., Lepreau, J.: Automatic online validation of network configuration in the Emulab network testbed. In: ICAC 2006: Proceedings of the 2006 IEEE International Conference on Autonomic Computing, pp. 134–142. IEEE Computer Society, Washington, DC (2006)
  9. Chertov, R., Fahmy, S., Shroff, N.B.: Fidelity of network simulation and emulation: A case study of TCP-targeted denial of service attacks. ACM Trans. Model. Comput. Simul. 19(1), 1–29 (2008)
  10. Guglielmi, M., Fovino, I.N., Garcia, A.P., Siaterlis, C.: A preliminary study of a wireless process control network using emulation testbed. In: Proc. of the 2nd International Conference on Mobile Lightweight Wireless Systems. ICST, Barcelona (2010)
  11. ISI: Network simulator ns-2, http://www.isi.edu/nsnam/ns/
  12. Rizzo, L.: Dummynet: a simple approach to the evaluation of network protocols. SIGCOMM Comput. Commun. Rev. 27(1), 31–41 (1997)
  13. Perez Garcia, A., Masera, M., Siaterlis, C.: Testing the fidelity of an Emulab testbed. In: Proc. of the 2nd Workshop on Sharing Field Data and Experiment Measurements on Resilience of Distributed Computing Systems, Genova, Italy (June 2010)
  14. Turner, A.: Tcpreplay tool, http://tcpreplay.synfin.net/trac/
  15. Feng, W.-C., Goel, A., Bezzaz, A., Feng, W.-C., Walpole, J.: TCPivo: a high-performance packet replay engine. In: MoMeTools 2003: Proceedings of the ACM SIGCOMM Workshop on Models, Methods and Tools for Reproducible Network Research, pp. 57–64. ACM, New York (2003)
  16. NLANR/DAST: Iperf: The TCP/UDP bandwidth measurement tool, http://sourceforge.net/projects/iperf/
  17. Cho, K.: WIDE-TRANSIT 150 Megabit Ethernet Trace 2008-03-18 (Anonymized) (collection), http://imdc.datcat.org/collection/1-05L8-9=WIDE-TRANSIT+150+Megabit+Ethernet+Trace+2008-03-18+%28Anonymized%29

Copyright information

© ICST Institute for Computer Science, Social Informatics and Telecommunications Engineering 2012

Authors and Affiliations

  • Andres Perez-Garcia (1)
  • Christos Siaterlis (1)
  • Marcelo Masera (1)

  1. Institute for the Protection and Security of the Citizen, Joint Research Centre, Ispra, Italy
