A parallel implementation of the confined–unconfined aquifer system model for subglacial hydrology: design, verification, and performance analysis (CUAS-MPI v0.1.0)


Bibliographic Details
Published in: Geoscientific Model Development
Main Authors: Fischler, Yannic, Kleiner, Thomas, Bischof, Christian, Schmiedel, Jeremie, Sayag, Roiy, Emunds, Raban, Oestreich, Lennart Frederik, Humbert, Angelika
Format: Article in Journal/Newspaper
Language: English
Published: Copernicus Publications 2023
Online Access: https://doi.org/10.5194/gmd-16-5305-2023
https://noa.gwlb.de/receive/cop_mods_00068874
https://noa.gwlb.de/servlets/MCRFileNodeServlet/cop_derivate_00067286/gmd-16-5305-2023.pdf
https://gmd.copernicus.org/articles/16/5305/2023/gmd-16-5305-2023.pdf
Description
Summary: The subglacial hydrological system affects (i) the motion of ice sheets through sliding, (ii) the location of lakes at the ice margin, and (iii) the ocean circulation by freshwater discharge directly at the grounding line or (iv) via rivers flowing over land. For modeling this hydrological system, a previously developed porous-media concept called the confined–unconfined aquifer system (CUAS) is used. To allow for realistic simulations at the ice sheet scale, we developed CUAS-MPI, an MPI-parallel C/C++ implementation of CUAS (MPI: Message Passing Interface), which employs the Portable, Extensible Toolkit for Scientific Computation (PETSc) infrastructure for handling grids and equation systems. We validate the accuracy of the numerical results by comparing them with a set of analytical solutions to the model equations, which involve two types of boundary conditions. We then investigate the scaling behavior of CUAS-MPI and show that CUAS-MPI scales up to 3840 MPI processes running a realistic Greenland setup on the Lichtenberg HPC system. Our measurements also show that CUAS-MPI reaches a throughput comparable to that of ice sheet simulations, e.g., the Ice-sheet and Sea-level System Model (ISSM). Lastly, we discuss opportunities for ice sheet modeling, explore future coupling possibilities of CUAS-MPI with other simulations, and consider throughput bottlenecks and limits of further scaling.
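To illustrate what "handling grids and equation systems" with PETSc typically involves, the following is a minimal, hypothetical C sketch of a distributed structured grid (DMDA) with a KSP linear solve. The 64x64 grid size, the stand-in 5-point operator, and the solver settings are illustrative assumptions for exposition only and are not taken from CUAS-MPI itself.

```c
#include <petscdmda.h>
#include <petscksp.h>

int main(int argc, char **argv)
{
  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));

  /* Distributed 2D structured grid (hypothetical 64x64 field, 1 dof per node) */
  DM da;
  PetscCall(DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                         DMDA_STENCIL_STAR, 64, 64, PETSC_DECIDE, PETSC_DECIDE,
                         1, 1, NULL, NULL, &da));
  PetscCall(DMSetFromOptions(da));
  PetscCall(DMSetUp(da));

  Mat A;
  Vec b, x;
  PetscCall(DMCreateMatrix(da, &A));
  PetscCall(DMCreateGlobalVector(da, &b));
  PetscCall(VecDuplicate(b, &x));

  /* Stand-in assembly: 5-point Laplacian with Dirichlet boundary rows,
     a placeholder for the actual aquifer-head operator of the model. */
  PetscInt xs, ys, xm, ym, Mx, My;
  PetscCall(DMDAGetInfo(da, NULL, &Mx, &My, NULL, NULL, NULL, NULL,
                        NULL, NULL, NULL, NULL, NULL, NULL));
  PetscCall(DMDAGetCorners(da, &xs, &ys, NULL, &xm, &ym, NULL));
  for (PetscInt j = ys; j < ys + ym; ++j) {
    for (PetscInt i = xs; i < xs + xm; ++i) {
      MatStencil row = {.i = i, .j = j};
      if (i == 0 || j == 0 || i == Mx - 1 || j == My - 1) {
        PetscScalar one = 1.0; /* identity row on the boundary */
        PetscCall(MatSetValuesStencil(A, 1, &row, 1, &row, &one, INSERT_VALUES));
      } else {
        MatStencil  cols[5] = {{.i = i, .j = j - 1}, {.i = i - 1, .j = j},
                               {.i = i, .j = j},     {.i = i + 1, .j = j},
                               {.i = i, .j = j + 1}};
        PetscScalar v[5] = {-1.0, -1.0, 4.0, -1.0, -1.0};
        PetscCall(MatSetValuesStencil(A, 1, &row, 5, cols, v, INSERT_VALUES));
      }
    }
  }
  PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));
  PetscCall(VecSet(b, 1.0)); /* placeholder right-hand side */

  /* Solve the linear system; solver and preconditioner are chosen at run time */
  KSP ksp;
  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPSetOperators(ksp, A, A));
  PetscCall(KSPSetFromOptions(ksp));
  PetscCall(KSPSolve(ksp, b, x));

  PetscCall(KSPDestroy(&ksp));
  PetscCall(VecDestroy(&x));
  PetscCall(VecDestroy(&b));
  PetscCall(MatDestroy(&A));
  PetscCall(DMDestroy(&da));
  PetscCall(PetscFinalize());
  return 0;
}
```

In a PETSc-based code, solver and preconditioner choices are typically supplied as command-line options at launch, e.g. mpirun -n 4 ./sketch -ksp_type cg -pc_type gamg; the concrete solver configuration used by CUAS-MPI is described in the paper, not here.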