Comparison of Tree Based Strategies for Parallel Simulation of Self-gravity in Agglomerates

  • Nestor Rocchetti
  • Sergio Nesmachnow
  • Gonzalo Tancredi
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 979)


Abstract

This article presents an algorithm conceived to improve the computational efficiency of ESyS-Particle simulations that involve a large number of particles. ESyS-Particle applies the Discrete Element Method to simulate the interaction of agglomerates of particles. The proposed algorithm is based on the Barnes & Hut method, which divides the simulation domain and organizes it in an octal tree. The algorithm is compared against a variant that uses a binary tree instead. Experimental evaluation is performed on two scenarios: a collapsing cube and two agglomerates orbiting each other. The evaluation comprises a performance analysis of both algorithms on each scenario, a comparison of the results obtained, and an analysis of numerical accuracy. Results indicate that the octal tree version is both faster and more accurate than the binary tree version.
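The Barnes & Hut method mentioned above approximates gravity in O(N log N) time: the domain is recursively subdivided into an octal tree, and a distant cell is treated as a single point mass at its center of mass whenever its angular size falls below an opening parameter θ. The following is a minimal illustrative sketch of that idea, not the authors' ESyS-Particle implementation; the names (`build`, `accel`), the value θ = 0.5, the softening `eps`, and the normalized units G = 1 are all assumptions made for the example.

```python
import math
from dataclasses import dataclass, field

THETA = 0.5  # opening-angle parameter (assumed value for illustration)
G = 1.0      # gravitational constant in normalized units

@dataclass
class Node:
    center: tuple                 # geometric center of the cubic cell
    half: float                   # half of the cell's side length
    mass: float = 0.0             # total mass inside the cell
    com: tuple = (0.0, 0.0, 0.0)  # center of mass of the cell
    children: list = field(default_factory=list)  # occupied sub-octants

def build(bodies, center, half):
    """Recursively build an octal tree for bodies (position, mass) inside a cube.
    Assumes all particle positions are distinct."""
    node = Node(center, half)
    node.mass = sum(m for _, m in bodies)
    if node.mass > 0.0:
        node.com = tuple(sum(p[i] * m for p, m in bodies) / node.mass
                         for i in range(3))
    if len(bodies) <= 1:          # leaf: at most one particle
        return node
    # partition the bodies into the 8 octants of the cell
    octants = [[] for _ in range(8)]
    for p, m in bodies:
        idx = ((p[0] > center[0]) |
               ((p[1] > center[1]) << 1) |
               ((p[2] > center[2]) << 2))
        octants[idx].append((p, m))
    h = half / 2.0
    for idx, group in enumerate(octants):
        if group:
            off = tuple(center[i] + (h if (idx >> i) & 1 else -h)
                        for i in range(3))
            node.children.append(build(group, off, h))
    return node

def accel(node, p, eps=1e-9):
    """Gravitational acceleration at point p, opening cells by the theta criterion."""
    if node.mass == 0.0:
        return (0.0, 0.0, 0.0)
    d = tuple(node.com[i] - p[i] for i in range(3))
    r = math.sqrt(sum(x * x for x in d)) + eps
    # leaf, or cell far enough away: treat it as a single point mass
    if not node.children or (2.0 * node.half) / r < THETA:
        if r < 1e-6:              # skip self-interaction
            return (0.0, 0.0, 0.0)
        f = G * node.mass / r**3
        return tuple(f * x for x in d)
    # otherwise open the cell and recurse into its occupied octants
    ax = ay = az = 0.0
    for c in node.children:
        a = accel(c, p, eps)
        ax += a[0]; ay += a[1]; az += a[2]
    return (ax, ay, az)
```

For two unit masses separated by distance 2, `accel` recovers the direct Newtonian value G·m/r² = 0.25 on the line joining them, since a two-particle tree is always fully opened; the θ approximation only takes effect for well-separated groups of particles.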


Keywords: Multithreading · Self-gravity · DEM



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Nestor Rocchetti (1)
  • Sergio Nesmachnow (1)
  • Gonzalo Tancredi (2)
  1. Facultad de Ingeniería, Universidad de la República, Montevideo, Uruguay
  2. Facultad de Ciencias, Universidad de la República, Montevideo, Uruguay
