
Pre-exascale Workloads of DPD Fluid Simulations Using OBMD on EuroHPC Vega System – Achieving Another Milestone in MultiXscale

We are excited to announce the successful completion of Milestone 5 in the MultiXscale project: “WP4 Pre-exascale workload executed on EuroHPC architecture”. This marks a significant achievement in our ongoing efforts to push the boundaries of computational fluid dynamics. For this milestone, we performed a comprehensive strong-scalability study of a DPD fluid in equilibrium simulated with the open boundary molecular dynamics (OBMD) plugin, and compared it to simulations without OBMD, in which periodic boundary conditions were applied.

One of the standout results of the study is that including OBMD did not affect the strong scalability of the LAMMPS code, apart from a constant prefactor. In other words, OBMD introduces a small, fixed computational overhead, but it does not degrade the efficiency or the scaling behavior of the code as computational resources are added. We observed excellent strong scalability up to at least 80 A100 GPUs, roughly one third of the EuroHPC Vega GPU partition. To put this into perspective, each A100 GPU delivers approximately 10 TFlops in double precision and 20 TFlops in single precision, so the 80-GPU workload ran on hardware with a raw compute capability of roughly 0.8 Petaflops in double precision and 1.6 Petaflops in single precision (a quick check of this arithmetic is given at the end of this post). This highlights the immense computational power being utilized.

Such scalability is crucial for large-scale simulations, as it ensures our methods remain efficient even when deployed on advanced supercomputing infrastructures. It is also vital for researchers aiming to model complex fluid dynamics at large scales: having confirmed that OBMD scales effectively alongside LAMMPS, we can confidently apply the method to larger, more detailed simulations without worrying about performance degradation. This opens up new possibilities for highly accurate simulations in fields such as materials science, biophysics, and chemical engineering.

With Milestone 5 now complete, we are moving closer to our overarching goal of enabling multiscale simulations that leverage the power of modern supercomputing resources. This achievement highlights the robustness and efficiency of our OBMD approach, and we are excited to continue exploring its applications in even more complex scenarios.
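For the curious, here is the back-of-the-envelope check of the aggregate peak-performance figures quoted above, using the approximate per-GPU numbers from the text:

    80 \times 10\,\mathrm{TFlops} = 800\,\mathrm{TFlops} = 0.8\,\mathrm{PFlops} \quad \text{(double precision)}
    80 \times 20\,\mathrm{TFlops} = 1600\,\mathrm{TFlops} = 1.6\,\mathrm{PFlops} \quad \text{(single precision)}

These are theoretical peak numbers; sustained application performance is, of course, lower.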

Book of Abstracts of the 2024 ESPResSo Summer School “Simulating soft matter across scales”

By Jean-Noël Grad

The ESPResSo summer school is a CECAM flagship school organized every year by the Institute for Computational Physics at the University of Stuttgart to train students and scientists in simulation software for soft matter physics and to foster synergies between simulation experts and experimentalists. The 2024 edition focused on multiscale simulation methods. The event attracted 45 attendees and featured 14 talks and 11 posters.

Lectures introduced the audience to particle-based simulations, long-range solvers for electrostatics, machine-learned interatomic potentials, coarse-graining techniques, and continuum methods, with a focus on the lattice-Boltzmann method. In hands-on sessions, participants learned to use ESPResSo, waLBerla, lbmpy and pystencils to model coarse-grained particle-based systems (noble gases, polymer diffusion, electrolytic capacitors), particle-fluid interactions at the mesoscale (sedimentation dynamics), and solvent-solute interactions at the continuum level (diffusion-advection-reaction methods). Field experts shared their experience in using coarse-graining and machine-learning techniques to automatically transform atomistic descriptions of molecules into coarse-grained descriptions and vice versa, in improving sampling of the conformational space of disordered polymers, in combining molecular dynamics and Monte Carlo algorithms to titrate polyelectrolytes, and in studying protein and DNA motion at different levels of resolution using multiscale and multiphysics software.

The conference contributions have been collected in this book of abstracts. The talk slides can be obtained from the event website in the “Documents” tab, and recorded lectures are available on the ESPResSo Simulation Package YouTube channel.

Code for Thought podcast: “HPC software installations made EESSI”

Code for Thought is a podcast for the growing number of researchers and scientists who code, with episodes in English, German and French. In the latest episode of the podcast, released on October 15, 2024, MultiXscale project members Kenneth Hoste and Alan O’Cais discuss the European Environment for Scientific Software Installations (EESSI) and how it can help make scientists more productive, and reduce the technical burden of working with HPC resources. Click here to access the episode.

CI workflow leveraging EESSI

EESSI’s CI workflows are available on GitHub Actions and as a GitLab CI/CD component. Enabling them is as simple as adding EESSI’s CI to your workflow of choice, which gives you access to the entire EESSI software stack, optimized for the relevant CPU architecture(s), in your runner’s environment. If you are developing an application on top of the EESSI software stack, for example, this means you don’t need to invest heavily in configuring and maintaining a CI setup: EESSI does that for you, so you can focus on your code. With the EESSI CI workflows you also don’t have to worry about figuring out how to optimize build and runtime dependencies, as these are streamed seamlessly to your runner’s environment.

Using the CI component in GitLab

To showcase this, let’s create a simple R package that outputs a map of the European Union and Norway and colours the countries participating in the MultiXscale CoE. We’ll make a package eessirmaps that relies on the popular R packages ggplot2, sf, and rnaturalearth to render and save this map. Installing GIS tools for R can be somewhat cumbersome, and doing so in a CI environment is trickier still: sf requires the system packages libgdal-dev and libproj-dev, which would add yet another step and complicate our CI workflow. Thankfully, EESSI makes many of these dependencies available to us from the start, as well as a fully functioning version of R and the necessary R package dependencies to boot.

As far as setup goes, this results in a simple CI workflow, sketched below. Note how we simply include the EESSI GitLab CI component and set up a blank directory for our user R libraries. Remember: because of EESSI, the environment you develop in is exactly the same as the one the CI runs in. Apart from the rnaturalearthdata R package, all other dependencies, both system packages and R packages, are taken care of by the R-bundle-CRAN/2023.12-foss-2023a EESSI module. Then we simply install our package into the CI environment and call the multixscale_map() function to produce the plot, which is uploaded as an artifact of the CI job. We can then retrieve the artifact archive, unpack it, and obtain the map.
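A minimal sketch of what such a .gitlab-ci.yml could look like is given below. The component reference, job name, install commands, and output filename are illustrative assumptions rather than the exact workflow from this post; consult the EESSI documentation for the precise include syntax.

    # .gitlab-ci.yml -- illustrative sketch; the component path is an assumption
    include:
      - component: gitlab.com/eessi/gitlab-eessi/eessi@main

    render-map:
      variables:
        # blank directory for user-installed R libraries
        R_LIBS_USER: "$CI_PROJECT_DIR/Rlibs"
      script:
        - mkdir -p "$R_LIBS_USER"
        # R, ggplot2, sf, rnaturalearth and their system libraries all come
        # from a single EESSI module
        - module load R-bundle-CRAN/2023.12-foss-2023a
        # the one dependency not covered by the module
        - Rscript -e 'install.packages("rnaturalearthdata", repos = "https://cloud.r-project.org")'
        # install the eessirmaps package itself, then render and save the map
        - Rscript -e 'install.packages(".", repos = NULL, type = "source")'
        - Rscript -e 'eessirmaps::multixscale_map()'
      artifacts:
        paths:
          - multixscale_map.png   # assumed output filename

Once the job finishes, the artifact archive containing the map can be downloaded from the GitLab web interface or retrieved via the GitLab API.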

OpenModel Exploitation Workshop in Hamburg (Germany)

Our coordinator, Matej Praprotnik from the Kemijski inštitut – National Institute of Chemistry, was invited to present the MultiXscale CoE at the OpenModel Exploitation Workshop on 18 September in Hamburg, Germany. During the event, OpenModel partners presented the various project components developed for running workflows, including interactive demos and real-world success stories. OpenModel is an integrated open access materials modelling innovation platform for Europe. Its primary goal is to design, create, provide, and maintain a sustainable platform that seamlessly integrates third-party physics-based models, solvers, post-processors, and databases.

MultiXscale – CECAM webinar: Supporting the Development of Multiscale Methods via the European Environment for Scientific Software Installations (EESSI)

17 October 2024 | 09:00–13:00 CEST

The goal of the MultiXscale EuroHPC Centre of Excellence is to enable the simulation of hydrodynamics at different length scales, from atomistic to continuum models, on large-scale HPC resources like those provided by EuroHPC systems. It will do this via three scientific showcases built around its key application codes: LAMMPS, waLBerla, and ESPResSo.

This webinar provides an introduction to EESSI and demonstrates how it supports the development of these application codes. Discover how EESSI accelerates scientific software installations and development, enabling cutting-edge research in multiscale modeling across various scientific domains. It is perfect for researchers, developers, and engineers looking to enhance their software efficiency. The webinar is a joint effort between the MultiXscale EuroHPC CoE and CECAM.

Moderators: Matej Praprotnik (NIC), Sara Bonella (CECAM)

Zoom link: https://epfl.zoom.us/s/63031772254

Program:
09:00 – 09:15 Welcome message and introduction (Matej Praprotnik, National Institute of Chemistry, and Sara Bonella, CECAM)
09:15 – 09:45 Introduction to EESSI (Kenneth Hoste, Ghent University)
09:45 – 10:20 Improving the Scalability of Energy Materials Simulations in ESPResSo (Jean-Noël Grad, University of Stuttgart)
10:20 – 10:25 EESSI CI/CD services for ESPResSo and pyMBE (Alan O’Cais, University of Barcelona)
10:25 – 10:40 Q&A
10:40 – 11:00 Coffee break
11:00 – 11:20 Digital twin for ultrasound through OBMD plugin for LAMMPS (Tilen Potisk, National Institute of Chemistry)
11:20 – 11:40 Performance Portability and Scalability of Codes: Kokkos and ALL (Rodrigo Bartolomeu, Jülich Supercomputing Centre)
11:40 – 12:05 Usage of waLBerla for simulation of turbulent flows (Matteo Zanfrognini, LEONARDO)
12:05 – 12:10 Supporting waLBerla applications in EESSI (Alan O’Cais, University of Barcelona)
12:10 – 12:30 Mesoscopic simulations of full supercapacitors using pystencils in EESSI (Céline Merlet, University of Toulouse)
12:30 – 13:00 Q&A

MultiXscale at the Nordic Industry Days

Alan O’Cais presented MultiXscale and EESSI at the Nordic Industry Days 2024 – Supercomputing: the gateway to AI, which took place on 2–3 September 2024 in Copenhagen. The event was a collaboration between the EuroCC competence centres in Denmark, Norway, Finland, Iceland, and Sweden, together with Dansk Industri. More on the agenda and topics can be found at https://enccs.se/events/nordic-indust….

EESSI nominated for HPCwire Readers’ Choice Awards 2024

EESSI has been nominated for the HPCwire Readers’ Choice Awards 2024, in the “Best HPC Programming Tool or Technology” category. You can help us win the award by joining the vote. To vote, you should:

1) Fill out and submit the form to register yourself as an HPCwire reader and gain access to your ballot;
2) Access your ballot here;
3) Select your favorite in one or more categories;
4) Submit your vote by filling in your name, organisation, and email address (to avoid ballot stuffing), and hitting the Done button.

Note that you are not required to vote in all categories: you can vote for a single nominee in a single category. For example, you could vote for the European Environment for Scientific Software Installations (EESSI) in category 13: Best HPC Programming Tool or Technology.

The poster abstract submission deadline for CECAM Flagship School “Simulating soft matter across scales” has been extended to Sept. 27th

The poster session of the ESPResSo summer school is a great opportunity to present your research and engage in meaningful discussions with soft matter experts and ESPResSo/waLBerla/pyMBE developers. It serves not only as a platform to present your work to your peers, but also as a chance to network, gather feedback, and foster collaborations that can extend well beyond the duration of the event. Everyone bringing a poster is invited to present it in a 1-minute lightning talk during the poster session. The poster boards will remain up for the entire duration of the school.

The school will focus on coarse-grained and lattice-based simulation methods to model soft matter systems at mesoscopic length and time scales. We will simulate coarse-grained ionic liquids in electrolytic capacitors, coarse-grained liquids with machine-learned effective potentials, polymer diffusion, hydrodynamic interactions via the lattice-Boltzmann method, and electrokinetics and catalysis with diffusion-advection-reaction solvers. Lectures will provide an introduction to the physics and to simulation model building, as well as an overview of the simulation algorithms needed to resolve physical processes at different time scales. In the afternoons, students will practice running their own simulations in hands-on sessions using ESPResSo and waLBerla. Time will also be dedicated to research talks and poster sessions.

Invited speakers:

We invite all who are interested to attend the ESPResSo summer school “Simulating soft matter across scales” on October 7-11, 2024, at the University of Stuttgart, Germany. Attendance is free of charge. To register, go to https://www.cecam.org/workshop-details/1324 and provide a short motivation and CV. You can submit a poster abstract until September 27th, 2024.
