A Performance Study of Applications in the Australian Community Climate and Earth System Simulator

  • Mark Cheeseman
  • Ben Evans
  • Dale Roberts
  • Marshall Ward
Part of the IFIP Advances in Information and Communication Technology book series (IFIPAICT, volume 448)

Abstract

A 3-year investigation is underway into the performance of applications used in the Australian Community Climate and Earth System Simulator (ACCESS) on Raijin, the petascale supercomputer hosted at the National Computational Infrastructure. Several applications have been identified as candidates for this investigation, including the UK Met Office's Unified Model (UM) atmospheric model and Princeton University's Modular Ocean Model (MOM). In this paper we present initial results on the performance and scalability of the UM and MOM on Raijin. We also present initial results of a performance study of the data assimilation package (VAR) developed by the UK Met Office and used by the Australian Bureau of Meteorology in its operational weather forecasting suite. Further investigation and optimization are planned for each application and are discussed.

Keywords

climate simulation · Unified Model · performance evaluation

Copyright information

© IFIP International Federation for Information Processing 2015

Authors and Affiliations

  • Mark Cheeseman 1
  • Ben Evans 1
  • Dale Roberts 1
  • Marshall Ward 1

  1. National Computational Infrastructure, Canberra, Australia
