
Video of the first MultiXscale Ambassador event online!

The video of the first Ambassador event, organized in collaboration with NCC Austria and NCC Slovenia on 4 October 2024, is now available on our YouTube channel. The European Environment for Scientific Software Installations (EESSI – pronounced “easy”) is a common stack of scientific software for HPC systems and beyond, including laptops, personal workstations, and cloud infrastructure. In many ways it works like a streaming service for scientific software: it instantly gives you the software you need, when you need it, compiled to work efficiently on the architecture you have access to. In this online workshop, we explain what EESSI is, how it is being designed, how to get access to it, and how to use it. We give a number of demonstrations, and you can try EESSI out yourself.

MultiXscale CECAM Webinar Video Now Available!

We’re excited to announce that the recording of our recent MultiXscale CECAM webinar, “Supporting the Development of Multiscale Methods via the European Environment for Scientific Software Installations (EESSI)”, is now available to view! This session dives into the innovative aspects of the MultiXscale CoE, covering key research applications, methodologies, and collaborative efforts that drive our work. Missed the live session? The full webinar is available on our YouTube channel.

EuroHPC User Day (22-23 Oct 2024, Amsterdam)

We had a great time at the EuroHPC User Day 2024 in Amsterdam earlier this week. Both MultiXscale and EESSI were strongly represented, and the work we have been doing was clearly appreciated. Find more details about MultiXscale’s participation in this event on the EESSI blog here.

Exascale Day: Inspiring Future Scientists

To celebrate Exascale Day, the MultiXscale CoE, in partnership with NCC Slovenia and SLING, opened the doors of the National Institute of Chemistry (NIC) to students from local primary and high schools. The event gave young students a unique chance to explore the world of supercomputers and understand how these powerful machines drive scientific breakthroughs. The day began with a short presentation on the significance of exascale computing in science, highlighting its role in solving complex problems in areas such as climate research, drug discovery, and advanced simulations. Following this, students toured the NIC facility, where they saw the impressive infrastructure behind high-performance computing. By showcasing the potential of exascale computing, the event aimed to spark curiosity and inspire the next generation of scientists and engineers. This collaboration between MultiXscale, NIC, NCC Slovenia, and SLING demonstrated the importance of education and outreach in shaping the future of computing.

Pre-exascale Workloads of DPD Fluid Simulations Using OBMD on EuroHPC Vega System – Achieving Another Milestone in MultiXscale

We are excited to announce the successful completion of Milestone 5 in the MultiXscale project: “WP4 Pre-exascale workload executed on EuroHPC architecture”. This marks a significant achievement in our ongoing efforts to push the boundaries of computational fluid dynamics. For this milestone, we performed a comprehensive strong-scalability study of a DPD fluid in equilibrium using the open boundary molecular dynamics (OBMD) plugin, and compared it to simulations without OBMD, where periodic boundary conditions were applied.

One of the standout results from our study was that the inclusion of OBMD did not impact the strong scalability of the LAMMPS code, except for a constant prefactor. This means that while OBMD introduces a small fixed computational overhead, it does not affect the efficiency or the scaling behavior of the code as we increase the computational resources.

We observed excellent strong scalability up to at least 80 A100 GPUs, which represents about one-third of the EuroHPC Vega GPU partition. To put this into perspective, each A100 GPU delivers approximately 10 TFLOPS in double precision and 20 TFLOPS in single precision, so the 80-GPU workload we tested is running on hardware with roughly 0.8 to 1.6 petaflops of raw compute capability, highlighting the immense computational power being utilized. Such scalability is crucial for large-scale simulations, as it ensures our methods remain efficient even when deployed on advanced supercomputing infrastructures.

The strong scalability of our simulations is vital for researchers aiming to model complex fluid dynamics at large scales. By confirming that OBMD scales effectively alongside LAMMPS, we can confidently apply this method to larger, more detailed simulations without worrying about performance degradation. This opens up new possibilities for highly accurate simulations in fields such as materials science, biophysics, and chemical engineering.
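The aggregate-throughput figures above follow from simple arithmetic; as a rough sketch (using the A100’s commonly cited peak rates of ~9.7 TFLOPS FP64 and ~19.5 TFLOPS FP32, which are assumptions based on public specifications rather than measured values from our study):

```python
# Back-of-the-envelope aggregate peak throughput for the 80-GPU workload.
# Per-GPU peak rates are assumed from published A100 specifications.
n_gpus = 80
fp64_tflops_per_gpu = 9.7   # double precision, TFLOPS
fp32_tflops_per_gpu = 19.5  # single precision, TFLOPS

fp64_pflops = n_gpus * fp64_tflops_per_gpu / 1000  # convert TFLOPS -> PFLOPS
fp32_pflops = n_gpus * fp32_tflops_per_gpu / 1000

print(f"FP64: {fp64_pflops:.2f} PFLOPS, FP32: {fp32_pflops:.2f} PFLOPS")
# → FP64: 0.78 PFLOPS, FP32: 1.56 PFLOPS
```

Note that these are theoretical peak numbers; sustained application performance is always lower.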
With Milestone 5 now complete, we are moving closer to our overarching goal of enabling multi-scale simulations that leverage the power of modern supercomputing resources. This achievement highlights the robustness and efficiency of our OBMD approach, and we are excited to continue exploring its applications in even more complex scenarios.

Book of Abstracts of the 2024 ESPResSo Summer School “Simulating soft matter across scales”

By Jean-Noël Grad

The ESPResSo summer school is a CECAM flagship school organized every year by the Institute for Computational Physics at the University of Stuttgart to train students and scientists in simulation software for soft matter physics and foster synergies between simulation experts and experimentalists. The 2024 edition focused on multiscale simulation methods. The event attracted 45 attendees and featured 14 talks and 11 posters.

Lectures introduced the audience to particle-based simulations, long-range solvers for electrostatics, machine-learned inter-atomic potentials, coarse-graining techniques, and continuum methods, with a focus on the lattice-Boltzmann method. In hands-on sessions, participants learned to use ESPResSo, waLBerla, lbmpy and pystencils to model coarse-grained particle-based systems (noble gases, polymer diffusion, electrolytic capacitors), particle-fluid interactions at the mesoscale (sedimentation dynamics), and solvent-solute interaction at the continuum level (diffusion-advection-reaction methods). Field experts shared their experience in coarse-graining and machine-learning techniques to automatically transform atomistic descriptions of molecules into coarse-grained descriptions and vice versa, improving sampling of the conformational space of disordered polymers, combining molecular dynamics and Monte Carlo algorithms to titrate polyelectrolytes, and studying protein and DNA motion at different levels of resolution using multiscale and multiphysics software.

The conference contributions have been collected into this book of abstracts. The talk slides can be obtained from the event website in the “Documents” tab, and recorded lectures are available on the YouTube channel ESPResSo Simulation Package.

Code for Thought podcast: “HPC software installations made EESSI”

Code for Thought is a podcast for the growing number of researchers and scientists who code, with episodes in English, German and French. In the latest episode of the podcast, released October 15, 2024, MultiXscale project members Kenneth Hoste and Alan O’Cais discuss the European Environment for Scientific Software Installations (EESSI) and how it can help make scientists more productive…and reduce the technical burden of working with HPC resources. Click here to access the episode.

CI workflow leveraging EESSI

EESSI’s CI workflows are available on GitHub Actions and as a GitLab CI/CD component. Enabling this is as simple as adding EESSI’s CI to your workflow of choice, giving you access to the entire EESSI software stack, optimized for the relevant CPU architecture(s), in your runner’s environment. If you are developing an application on top of the EESSI software stack, for example, this means you don’t need to invest heavily in configuring and maintaining a CI setup: EESSI does that for you, so you can focus on your code. With the EESSI CI workflows you don’t have to worry about figuring out how to optimize build and runtime dependencies, as these are streamed seamlessly to your runner’s environment.

Using the CI component in GitLab

To showcase this, let’s create a simple R package that just outputs a map of the European Union and Norway, and colours the countries participating in the MultiXscale CoE. We’ll make a package eessirmaps that relies on the popular R packages ggplot2, sf, and rnaturalearth to render and save this map.

Installing GIS tools for R can be somewhat cumbersome, and trickier still when it has to be done in a CI environment: sf requires the system packages libgdal-dev and libproj-dev, which would add yet another step, complicating our CI workflow. Thankfully, EESSI makes many of the package’s dependencies available to us from the start, as well as a fully functioning version of R and the necessary R package dependencies to boot! As far as setup goes, this results in a simple CI workflow.

Note how we simply include the EESSI GitLab CI component and set up a blank directory for our user R libraries. Remember: because of EESSI, the environment that you develop in is exactly the same as the one the CI runs in. Apart from the rnaturalearthdata R package, all the other dependencies, both system packages and R packages, are taken care of by the R-bundle-CRAN/2023.12-foss-2023a EESSI module.
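A workflow along these lines might look roughly as follows. This is a hypothetical sketch: the component reference, job name, and output filename are illustrative assumptions, not the exact published configuration; only the module name R-bundle-CRAN/2023.12-foss-2023a and the multixscale_map() function come from the text above.

```yaml
# Hypothetical .gitlab-ci.yml sketch (component path and job details are assumed)
include:
  - component: gitlab.com/eessi/ci/eessi@main   # assumed EESSI CI component reference

build-map:
  script:
    # Blank directory for user-installed R libraries
    - mkdir -p "$CI_PROJECT_DIR/Rlibs"
    - export R_LIBS_USER="$CI_PROJECT_DIR/Rlibs"
    # Load R and the CRAN bundle from the EESSI software stack
    - module load R-bundle-CRAN/2023.12-foss-2023a
    # Install the one dependency EESSI does not provide, then our package
    - Rscript -e 'install.packages("rnaturalearthdata", repos = "https://cloud.r-project.org")'
    - R CMD INSTALL .
    # Render the map so it can be collected as an artifact
    - Rscript -e 'eessirmaps::multixscale_map()'
  artifacts:
    paths:
      - multixscale_map.png   # assumed output filename
```

The key point is the include: everything below it runs in an environment where the EESSI stack is already mounted, so no apt-get or container image maintenance is needed.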
Then we simply install our package in the CI environment and call the multixscale_map() function to produce the plot, which is uploaded as an artifact. We can then retrieve the artifact archive, unpack it, and obtain the map.

OpenModel Exploitation Workshop in Hamburg (Germany)

Our coordinator, Matej Praprotnik from Kemijski inštitut – National Institute of Chemistry, was invited to present the MultiXscale CoE at the OpenModel Exploitation Workshop on 18 September in Hamburg, Germany. During the event, OpenModel partners presented various project components developed for running workflows, including interactive demos and real-world success stories. OpenModel is an integrated open-access materials modelling innovation platform for Europe. Its primary goal is to design, create, provide, and maintain a sustainable platform that seamlessly integrates third-party physics-based models, solvers, post-processors, and databases.
