Performance optimization and evaluation for parallel processing of big data in earth system models
Big data and high-performance computing in Earth System Models (ESMs) are receiving increased attention in earth science research. When scaling to large multi-core systems, efficient parallelization of an ESM, which demands fast parallel computing for long-term integration or climate simulation, becomes extremely challenging because of time-consuming internal big-data communication. In this paper, an optimization algorithm is proposed for the massive data communication between the Weather Research and Forecasting (WRF) model and Coupler version 7 in the Chinese Academy of Sciences Earth System Model (CAS-ESM). The optimization strategy is to aggregate many small data packets into larger packets before transmission, reducing the number of communication calls. Experiments on a multi-core cluster confirm the efficiency of the algorithm, and the parallel performance of the CAS-ESM is then evaluated comprehensively. Results show that the parallel efficiency of the CAS-ESM on 1024 CPU cores reaches nearly 70%, indicating that the CAS-ESM has desirable parallel performance and strong scalability. In addition, a generic performance evaluation method for ESMs, considering both optimal load balance and efficiency, is proposed; the results identify the core count at which the CAS-ESM achieves its highest computing speed and computational efficiency.
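The aggregation strategy described above can be sketched as follows. This is a minimal illustration of the general idea of packing many small per-field messages into one contiguous buffer before a single send, not the actual CAS-ESM or coupler implementation; the field names and the serialization format are assumptions made for the example.

```python
import struct

def pack_fields(fields):
    """Pack a dict of {field name: list of floats} into one contiguous
    byte buffer, so one large message can replace many small sends."""
    buf = bytearray()
    buf += struct.pack("I", len(fields))          # number of fields
    for name, values in fields.items():
        name_b = name.encode()
        buf += struct.pack("I", len(name_b)) + name_b   # field name
        buf += struct.pack("I", len(values))            # value count
        buf += struct.pack(f"{len(values)}d", *values)  # payload
    return bytes(buf)

def unpack_fields(buf):
    """Inverse of pack_fields: recover the original field dict
    on the receiving side."""
    fields, off = {}, 0
    (n,) = struct.unpack_from("I", buf, off); off += 4
    for _ in range(n):
        (ln,) = struct.unpack_from("I", buf, off); off += 4
        name = buf[off:off + ln].decode(); off += ln
        (cnt,) = struct.unpack_from("I", buf, off); off += 4
        values = list(struct.unpack_from(f"{cnt}d", buf, off)); off += 8 * cnt
        fields[name] = values
    return fields
```

In a real coupled-model setting the single packed buffer would be handed to one communication call (e.g. an MPI send) instead of issuing one call per field, which is the source of the latency savings the paper targets.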
Keywords: Big data · High performance computing · Performance optimization · Earth system model
This work is supported by the National Key Research and Development Program of China (No. 2016YFB0200800), National Natural Science Foundation of China (No. 61602477, No. 41401512), China Postdoctoral Science Foundation (No. 2016M601158), Youth Innovation Promotion Association of CAS (No. Y6YR0300QM), and the Fundamental Research Funds for the Central Universities (No. 2652017113).