Abstract
The scientific community needs experimental practices to be reproducible or, at least, replicable whenever possible. Reproducible research has gained considerable attention in recent years, including in computer science [1, 2]. Reproducible research in computer networking calls for the public availability of code (from real prototypes or simulation environments), data, and experimental practices. Even when code and data are published in public repositories, it remains essential to share the methodology used in the experiments. Moreover, the time required to reproduce some experiments is often discouraging. This chapter tackles the design of experiments for computer networking research, aiming to make experiments effective and efficient while encouraging reproducibility.
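A core idea behind designing experiments this way can be illustrated with a minimal full-factorial plan: enumerate every combination of factor levels, replicate each run, and randomize the execution order with a fixed seed so the plan itself is reproducible. This is only a sketch; the factor names and levels below are illustrative assumptions, not taken from the chapter.

```python
import itertools
import random

# Hypothetical factors for a networking experiment; the names and
# levels here are illustrative assumptions.
factors = {
    "latency_ms": [5, 50, 200],
    "loss_rate": [0.0, 0.01],
    "tcp_variant": ["cubic", "reno"],
}

def full_factorial(factors, replications=3, seed=42):
    """Enumerate every factor-level combination, replicate each run,
    and randomize the execution order to guard against systematic drift."""
    names = list(factors)
    runs = [dict(zip(names, levels))
            for levels in itertools.product(*factors.values())]
    plan = [dict(run, rep=r) for run in runs for r in range(replications)]
    # A fixed seed makes the randomized order itself reproducible.
    random.Random(seed).shuffle(plan)
    return plan

plan = full_factorial(factors)
print(len(plan))  # 3 x 2 x 2 levels, 3 replications each = 36 runs
```

Publishing a plan like this alongside the code and data lets others re-execute exactly the same sequence of runs.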
The original version of this chapter was revised. An erratum to this chapter can be found at DOI 10.1007/978-3-319-54521-9_6
Notes
- 1.
For example, it might make sense to restrict the range of latency to a few milliseconds in experiments for cloud environments.
- 2.
Don’t waste your time doing this again. It has been proven a gazillion times that TCP is kind of fair, in most scenarios.
References
Peng, Roger D. 2011. Reproducible research in computational science. Science 334 (6060): 1226–1227.
Hanson, Brooks, Andrew Sugden, and Bruce Alberts. 2011. Making data maximally available. Science 331 (6018): 649.
Škrjanc, Igor. 2015. Evolving fuzzy-model-based design of experiments with supervised hierarchical clustering. IEEE Transactions on Fuzzy Systems 23 (4): 861–871.
Teran-Somohano, Alejandro, et al. 2015. A model-driven engineering approach to simulation experiment design and execution. Proceedings of the 2015 Winter Simulation Conference. IEEE Press.
Lucas, Thomas W., et al. 2015. Changing the paradigm: Simulation, now a method of first resort. Naval Research Logistics (NRL) 62 (4): 293–303.
Sanchez, Susan M., and Hong Wan. 2015. Work smarter, not harder: A tutorial on designing and conducting simulation experiments. 2015 Winter Simulation Conference (WSC). IEEE.
Keene, Samuel. 2012. Six sigma approach to requirements development. In Design for reliability, 121–135.
Mukerjee, Rahul, and C.F. Jeff Wu. 2007. A modern theory of factorial design. New York: Springer Science & Business Media.
Seufert, Michael, et al. 2015. A survey on quality of experience of http adaptive streaming. IEEE Communications Surveys & Tutorials 17 (1): 469–492.
Tanco, Martín, Elisabeth Viles, and Lourdes Pozueta. 2009. Comparing different approaches for design of experiments (DoE). In Advances in electrical engineering and computational science, 611–621. Dordrecht: Springer.
Janevski, Nikola, and Katerina Goseva-Popstojanova. 2013. Session reliability of web systems under heavy-tailed workloads: An approach based on design and analysis of experiments. IEEE Transactions on Software Engineering 39 (8): 1157–1178.
Constantine, Barry, et al. 2011. Framework for TCP throughput testing. No. RFC 6349.
Alexopoulos, Christos, and Andrew F. Seila. 1996. Implementing the batch means method in simulation experiments. Proceedings of the 28th conference on Winter simulation. IEEE Computer Society.
Shao, Jun, and Dongsheng Tu. 2012. The jackknife and bootstrap. New York: Springer Science & Business Media.
Pronzato, Luc, and Andrej Pázman. 2013. Design of experiments in nonlinear models. Lecture notes in statistics 212.
Law, Averill M. 2014. A tutorial on design of experiments for simulation modeling. Proceedings of the 2014 Winter Simulation Conference. IEEE Press.
Gatti, Christopher. 2014. Design of experiments for reinforcement learning. Cham: Springer.
NIST/SEMATECH e-Handbook of Statistical Methods, Chapter 5: Process improvement. http://www.itl.nist.gov/div898/handbook/. Accessed Aug 2016.
Kurbel, Karl E. 2008. Developing information systems. In The making of information systems: Software engineering and management in a globalized world, 155–234. Berlin: Springer.
Copyright information
© 2017 Springer International Publishing AG
Cite this chapter
Fernandes, S. (2017). Designing and Executing Experimental Plans. In: Performance Evaluation for Network Services, Systems and Protocols. Springer, Cham. https://doi.org/10.1007/978-3-319-54521-9_5
Print ISBN: 978-3-319-54519-6
Online ISBN: 978-3-319-54521-9