
CAGE: Customizable Large-Scale SOA Testbeds in the Cloud

  • Lukasz Juszczyk
  • Daniel Schall
  • Ralph Mietzner
  • Schahram Dustdar
  • Frank Leymann
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6568)

Abstract

Large-scale and complex distributed systems are increasingly implemented as SOAs. These comprise diverse types of components, e.g., Web services, registries, workflow engines, and service buses, that interact with each other to establish composite functionality. The drawback of this trend is that testing complex SOAs becomes a challenging task. During the development phase, testers must verify the system's correct functionality, but often do not have access to adequate testbeds. In this paper, we present an approach for solving this issue. We combine the Genesis2 testbed generator, which emulates SOA environments, with Cafe, a framework for provisioning component-based applications in the cloud. Our approach allows testers to model large-scale service-based testbed infrastructures, to specify their behavior, and to deploy them automatically in the cloud. As a result, testers can emulate required environments on demand for evaluating SOAs at runtime.
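The workflow sketched in the abstract, modeling a service-based testbed, attaching simulated behavior to its components, and deploying it as a whole, can be illustrated with a minimal sketch. All class and method names below are illustrative assumptions for exposition; they are not the actual Genesis2 or Cafe APIs, and the "deployment" step merely returns mock endpoints instead of provisioning cloud resources.

```python
# Minimal sketch of the model-then-deploy idea from the abstract.
# Names (ServiceModel, Testbed, etc.) are hypothetical, not the real APIs.
from dataclasses import dataclass, field

@dataclass
class ServiceModel:
    name: str
    kind: str                                     # e.g. "web-service", "registry", "service-bus"
    behavior: dict = field(default_factory=dict)  # operation -> simulated response

@dataclass
class Testbed:
    services: list = field(default_factory=list)

    def add(self, svc: ServiceModel) -> ServiceModel:
        self.services.append(svc)
        return svc

    def deploy(self) -> dict:
        # A real provisioning framework would instantiate each modeled
        # component in the cloud; here we only map names to fake endpoints.
        return {s.name: f"http://testbed.example/{s.name}" for s in self.services}

tb = Testbed()
tb.add(ServiceModel("orders", "web-service",
                    behavior={"getOrder": {"status": "OK", "delayMs": 120}}))
tb.add(ServiceModel("uddi", "registry"))
endpoints = tb.deploy()
print(endpoints["orders"])  # → http://testbed.example/orders
```

The point of the sketch is the separation the paper describes: the testbed is first specified as a model (topology plus per-service behavior) and only then materialized, which is what makes on-demand, repeatable emulation of large environments feasible.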

Keywords

Software Product Line · Component Type · Variability Model · Software Product Line Engineering · BPEL Process

References

  1. Papazoglou, M.P., Traverso, P., Dustdar, S., Leymann, F.: Service-oriented computing: a research roadmap. Int. J. Cooperative Inf. Syst. 17(2), 223–255 (2008)
  2. Huhns, M.N., Singh, M.P.: Service-oriented computing: key concepts and principles. IEEE Internet Computing 9(1), 75–81 (2005)
  3. Juszczyk, L., Dustdar, S.: Script-based generation of dynamic testbeds for SOA. In: ICWS 2010, pp. 195–202. IEEE Computer Society, Los Alamitos (2010)
  4. Mietzner, R., Unger, T., Leymann, F.: Cafe: a generic configurable customizable composite cloud application framework. In: Meersman, R., Dillon, T., Herrero, P. (eds.) OTM 2009. LNCS, vol. 5870, pp. 357–364. Springer, Heidelberg (2009)
  5. Juszczyk, L., Dustdar, S.: Programmable fault injection testbeds for complex SOA. In: Maglio, P.P., Weske, M., Yang, J., Fantinato, M. (eds.) ICSOC 2010. LNCS, vol. 6470, pp. 411–425. Springer, Heidelberg (2010)
  6. Psaier, H., Juszczyk, L., Skopik, F., Schall, D., Dustdar, S.: Runtime behavior monitoring and self-adaptation in service-oriented systems. In: SASO, pp. 164–174. IEEE, Los Alamitos (2010)
  7. Mietzner, R., Leymann, F.: Generation of BPEL customization processes for SaaS applications from variability descriptors. In: IEEE SCC (2), pp. 359–366. IEEE Computer Society, Los Alamitos (2008)
  8. Mietzner, R., Leymann, F.: Towards provisioning the cloud: on the usage of multi-granularity flows and services to realize a unified provisioning infrastructure for SaaS applications. In: SERVICES I, pp. 3–10. IEEE Computer Society, Los Alamitos (2008)
  9. Varia, J.: Cloud architectures. Amazon white paper (2008)
  10. Ciortea, L., Zamfir, C., Bucur, S., Chipounov, V., Candea, G.: Cloud9: a software testing service. SIGOPS Oper. Syst. Rev. 43(4), 5–10 (2010)
  11. Arnold, W., Eilam, T., Kalantar, M.H., Konstantinou, A.V., Totok, A.: Pattern based SOA deployment. In: Krämer, B.J., Lin, K.-J., Narasimhan, P. (eds.) ICSOC 2007. LNCS, vol. 4749, pp. 1–12. Springer, Heidelberg (2007)
  12. Bosch, J.: Design and Use of Software Architectures: Adopting and Evolving a Product-Line Approach. Addison-Wesley Professional, Reading (2000)
  13. Pohl, K., Böckle, G., Van Der Linden, F.: Software Product Line Engineering: Foundations, Principles, and Techniques. Springer, Heidelberg (2005)
  14. Bianculli, D., Binder, W., Drago, M.L.: Automated performance assessment for service-oriented middleware. Technical Report 2009/07, Faculty of Informatics, University of Lugano (November 2009)
  15. Bertolino, A., De Angelis, G., Frantzen, L., Polini, A.: Model-based generation of testbeds for web services. In: Suzuki, K., Higashino, T., Ulrich, A., Hasegawa, T. (eds.) TestCom/FATES 2008. LNCS, vol. 5047, pp. 266–282. Springer, Heidelberg (2008)

Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Lukasz Juszczyk (1)
  • Daniel Schall (1)
  • Ralph Mietzner (2)
  • Schahram Dustdar (1)
  • Frank Leymann (2)
  1. Distributed Systems Group, Vienna University of Technology, Austria
  2. Institute of Architecture of Application Systems, University of Stuttgart, Germany
