
Refactoring Scientific Applications for Massive Parallelism

Part of the book series: Lecture Notes in Computational Science and Engineering (LNCSE, volume 80)

Abstract

We describe several common problems that we discovered while refactoring a number of large geofluid applications that are components of the Community Climate System Model (CCSM) developed at the National Center for Atmospheric Research (NCAR). We stress tested the weak scalability of these applications by studying the impact of increasing both the resolution and core counts by factors of 10–100. Several common code design and implementation issues emerged that prevented the efficient execution of these applications on very large processor counts. We found that these problems arise as a direct result of the disparity between the initial design assumptions made for low-resolution models running on a few dozen processors and today's requirement that applications run in massively parallel computing environments. The issues discussed include non-scalable memory usage and execution time in both the applications themselves and the supporting scientific data tool chains.
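The non-scalable memory usage noted in the abstract typically arises when every MPI task allocates or gathers a full global field, so per-task memory grows with resolution instead of staying constant as tasks are added. The sketch below is illustrative only and is not taken from CCSM source code; the grid dimensions, variable names, and block decomposition are hypothetical assumptions chosen to make the contrast concrete.

/* Illustrative sketch (not CCSM code): a replicated global field versus
 * a distributed field under weak scaling.  Grid sizes and names are
 * hypothetical. */
#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank, nprocs;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

    /* Hypothetical 0.1-degree global grid: 3600 x 2400 points. */
    const long nx = 3600, ny = 2400;
    const long nglobal = nx * ny;

    /* Non-scalable pattern: every task holds the whole field, so
     * per-task memory is O(nx*ny) regardless of nprocs and grows
     * roughly 100x when the resolution increases 10x in each
     * horizontal direction. */
    double *global_field = malloc(nglobal * sizeof(double));

    /* Scalable pattern: each task holds only its share of the grid,
     * so per-task memory is O(nx*ny / nprocs) and stays roughly
     * constant under weak scaling. */
    long nlocal = (nglobal + nprocs - 1) / nprocs;
    double *local_field = malloc(nlocal * sizeof(double));

    if (rank == 0)
        printf("per-task doubles: replicated=%ld distributed=%ld\n",
               nglobal, nlocal);

    free(global_field);
    free(local_field);
    MPI_Finalize();
    return 0;
}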

Acknowledgements

We would like to thank our colleagues Mariana Vertenstein and Tony Craig for all their work addressing the many code design issues discovered during this study. We would like to thank Dr. Mark Taylor for running several of the applications on compute platforms at Sandia National Laboratories and Lawrence Livermore National Laboratory. We also thank Brookhaven National Laboratory and Oak Ridge National Laboratory for access to their large compute platforms. We thank Fred Mintzer for access to the Thomas J. Watson Research facility through the 2nd and 3rd Blue Gene Watson Consortium Days events. Significant computational resources were provided through grants by the LLNL 2nd and 3rd Institutional Grand Challenge program. Code development would not have been possible without access to the Blue Gene system at NCAR, which is funded through NSF MRI Grants CNS-0421498, CNS-0420873, and CNS-0420985 and through the IBM Shared University Research (SUR) Program with the University of Colorado. The work of these authors was supported through National Science Foundation Cooperative Grant NSF01, which funds the National Center for Atmospheric Research (NCAR), and through grants #OCI-0749206 and #OCE-0825754. Additional funding was provided through the Department of Energy CCPP Program Grant #DE-PS02-07ER07-06.

Author information


Correspondence to John M. Dennis.

Copyright information

© 2011 Springer-Verlag Berlin Heidelberg

Cite this chapter

Dennis, J.M., Loft, R.D. (2011). Refactoring Scientific Applications for Massive Parallelism. In: Lauritzen, P., Jablonowski, C., Taylor, M., Nair, R. (eds) Numerical Techniques for Global Atmospheric Models. Lecture Notes in Computational Science and Engineering, vol 80. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-11640-7_16
