This event is a part of the "Best Practices for HPC Software Developers" webinar series, produced by the IDEAS Productivity Project. The HPC Best Practices webinars address issues faced by developers of computational science and engineering (CSE) software on high-performance computers (HPC) and occur approximately monthly.
| Webinar Title | Taking HACC into the Exascale Era: New Code Capabilities, and Challenges |
| --- | --- |
| Date and Time | 2023-10-11, 1:00 pm EDT |
| Presenter | Esteban Rangel (Argonne National Laboratory) |
| Registration, Information, and Archives | https://ideas-productivity.org/events/hpc-best-practices-webinars/#webinar079 |
Webinars are free and open to the public, but advance registration is required through the Event website. Archives (recording, slides, Q&A) will be posted at the same link soon after the event.
HACC (Hardware/Hybrid Accelerated Cosmology Code) is a well-established code within the US Department of Energy community, with a long history of running on every flagship computing system for over a decade. Because HACC often participates in early-access programs for upcoming systems, its developers must contend not only with state-of-the-art architectures but also with their initially supported, and often novel, programming models. The increased computing power of today's exascale systems has allowed HACC to support additional baryonic physics through a newly developed Smoothed Particle Hydrodynamics (SPH) formalism called Conservative Reproducing Kernel (CRK). This webinar will discuss the challenges of preparing HACC for multiple exascale systems while simultaneously adding new code capabilities under ongoing development, all with a central focus on performance.
Esteban Rangel is a member of the HACC development team. He joined the Computational Science (CPS) division at Argonne National Laboratory as an Assistant Computational Scientist in 2021. Prior to joining CPS, he was a postdoctoral researcher at the Argonne Leadership Computing Facility (ALCF), working on porting HACC's hydrodynamics solvers to the Aurora supercomputer. He began contributing to the HACC codebase as a Ph.D. student at Northwestern University, where much of his thesis work involved designing and implementing scalable analysis software for N-body cosmological simulations.