Designing Benchmarks for P2P Systems

  • Max Lehn
  • Tonio Triebel
  • Christian Gross
  • Dominik Stingl
  • Karsten Saller
  • Wolfgang Effelsberg
  • Alexandra Kovacevic
  • Ralf Steinmetz
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6462)

Abstract

In this paper we discuss requirements for peer-to-peer (P2P) benchmarking, and we present two exemplary benchmark approaches, one for Distributed Hash Tables (DHTs) and one for P2P gaming overlays. We point out the characteristics of benchmarks for P2P systems, focusing on the challenges compared to conventional benchmarks. Although the two benchmarks target very different types of P2P systems, both are designed using a common methodology. This includes the definition of the system under test (SUT), particularly its interfaces, as well as the workloads and metrics. A set of common P2P quality metrics helps to achieve a comprehensive selection of workloads and metrics for each scenario.
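The abstract itself contains no code; as a purely illustrative sketch (all class and function names below are hypothetical, not from the paper), a DHT benchmark along the described lines would pin the SUT behind a narrow interface, drive it with a workload, and collect metrics such as the success ratio and query response time named in the keywords:

```python
import time
from typing import Optional, Protocol


class DHT(Protocol):
    """Hypothetical SUT interface: any DHT exposing put/get can be benchmarked."""

    def put(self, key: str, value: bytes) -> None: ...
    def get(self, key: str) -> Optional[bytes]: ...


class LocalDHT:
    """Trivial in-process stand-in for a real P2P overlay, to exercise the harness."""

    def __init__(self) -> None:
        self._store: dict[str, bytes] = {}

    def put(self, key: str, value: bytes) -> None:
        self._store[key] = value

    def get(self, key: str) -> Optional[bytes]:
        return self._store.get(key)


def run_benchmark(sut: DHT, n_keys: int = 1000) -> dict:
    """Drive the SUT with a simple put/get workload and collect two metrics."""
    # Workload phase 1: populate the DHT.
    for i in range(n_keys):
        sut.put(f"key-{i}", f"value-{i}".encode())
    # Workload phase 2: query every key, timing each lookup.
    hits, latencies = 0, []
    for i in range(n_keys):
        start = time.perf_counter()
        result = sut.get(f"key-{i}")
        latencies.append(time.perf_counter() - start)
        if result is not None:
            hits += 1
    return {
        "success_ratio": hits / n_keys,
        "mean_response_time_s": sum(latencies) / len(latencies),
    }


metrics = run_benchmark(LocalDHT())
```

Because the workload and metric collection depend only on the `DHT` protocol, the same harness could in principle be pointed at different overlay implementations, which is the comparability that a common SUT interface definition is meant to enable.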

Keywords

Success Ratio · System Under Test · Query Response Time · Game World · Interest Management



Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • Max Lehn (1)
  • Tonio Triebel (2)
  • Christian Gross (3)
  • Dominik Stingl (3)
  • Karsten Saller (4)
  • Wolfgang Effelsberg (2)
  • Alexandra Kovacevic (3)
  • Ralf Steinmetz (3)
  1. Databases and Distributed Systems, Technische Universität Darmstadt, Germany
  2. Praktische Informatik IV, Universität Mannheim, Germany
  3. KOM – Multimedia Communications Lab, Technische Universität Darmstadt, Germany
  4. Real-Time Systems Lab, Technische Universität Darmstadt, Germany