
EESSI has been recognized with an HPCwire Award!

We are thrilled to share that EESSI (European Environment for Scientific Software Installations) has received the HPCwire Readers’ Choice Award for the Best HPC Programming Tool or Technology! This prestigious recognition celebrates the groundbreaking work of EESSI and MultiXscale, showcasing their significant impact on the HPC community. A heartfelt thank you to everyone who contributed their dedication and expertise to push the boundaries of scientific software installations. This award is a true testament to what we can achieve together! Thank you very much to EuroHPC JU for the ongoing support for MultiXscale and EESSI, and thanks to everyone who voted for EESSI! Here’s to continued innovation and excellence in HPC. More information available here

NRIS Talks: How to build on top of EESSI using EasyBuild

NRIS (Norwegian Research Infrastructure Services) is organizing webinars as part of its outreach program NRIS Talks. Join us for an enlightening webinar on How to build on top of EESSI using EasyBuild. This webinar aims to provide a comprehensive overview of how to build software on top of the scientific software stack provided by the European Environment for Scientific Software Installations (EESSI). About the webinar: We will explain and demonstrate how everyone can easily build and install software on top of EESSI. While EESSI already provides ~500 software packages, a specific package you need may not be available through EESSI yet. The webinar will show how to use EasyBuild to build (scientific) software on top of EESSI for all users, for a specific project account, or for individual users. More information and registration here
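As a rough illustration of the workflow the webinar covers, the shell sketch below builds an extra package on top of EESSI with EasyBuild and installs it into a personal prefix. The easyconfig name and install location are placeholders, and the EESSI version shown is just an example; it assumes the EESSI CernVM-FS repository is already available on the system.

    # Make the EESSI software stack available in the current shell
    source /cvmfs/software.eessi.io/versions/2023.06/init/bash

    # Load EasyBuild from the EESSI stack and point it at a writable install prefix
    module load EasyBuild
    export EASYBUILD_INSTALLPATH=$HOME/eessi-extra

    # Build the missing package (hypothetical easyconfig name), resolving dependencies against EESSI
    eb ExamplePackage-1.0.eb --robot

    # Make the newly built modules visible alongside the EESSI-provided ones
    module use $HOME/eessi-extra/modules/all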

Webinar: Streaming Optimised Scientific Software: an Introduction to EESSI

Date: 15 November from 14h to 15h30 – Location: Online
About the talk: What if you could avoid having to install a broad range of scientific software from scratch on every HPC cluster, laptop, or cloud instance you use or maintain, without compromising on performance? Installing scientific software for supercomputers is known to be a tedious and time-consuming task. Especially as the HPC user community becomes more diverse, computational science expands rapidly, and the diversity of system architectures increases, the application software stack continues to deepen. At the same time, we see a surge of interest in cloud computing for scientific workloads. Delivering optimised software installations and providing access to them in a reliable, user-friendly, and reproducible way is a highly non-trivial task that affects application developers, HPC user support teams, and the users themselves. This webinar aims to address these challenges by providing attendees with the knowledge to stream optimised scientific software installations. For this, we will introduce the European Environment for Scientific Software Installations (EESSI), a collaboration between various European HPC sites & industry partners with the common goal of creating a shared repository of scientific software installations that can be used on a variety of systems, regardless of which flavor/version of Linux distribution or processor architecture is used, or whether it’s a full-size HPC cluster, a cloud environment, or a personal workstation. We will cover the design and usage of EESSI, different ways of accessing EESSI, and how to add software to EESSI, and highlight some more advanced features, including support for NVIDIA GPUs and facilitating the deployment of pre-release builds of scientific software. We will also show how to engage with the community and contribute to the project. More info and registration here
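To give a concrete idea of what “streaming” software means in practice, this is roughly what first access to EESSI looks like on a machine where the CernVM-FS client and the EESSI repository are already configured; the stack version and the module name are examples and may differ on your system.

    # Initialise the EESSI environment for the current shell
    source /cvmfs/software.eessi.io/versions/2023.06/init/bash

    # List and load optimised modules, matched to the local CPU architecture
    module avail GROMACS
    module load GROMACS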

Video of the first MultiXscale Ambassador event online!

The video of the first Ambassador event organized in collaboration with NCC Austria and Slovenia, on 4 October 2024, is already available on our YouTube channel. The European Environment for Scientific Software Installations (EESSI – pronounced “easy”) is a common stack of scientific software for HPC systems and beyond, including laptops, personal workstations, and cloud infrastructure. In many ways it works like a streaming service for scientific software, instantly giving you the software you need, when you need it, and compiled to work efficiently for the architecture you have access to. In this online workshop, we explain what EESSI is, how it is being designed, how to get access to it, and how to use it. We give a number of demonstrations and you can try EESSI out yourself.

MultiXscale CECAM Webinar Video Now Available!

We’re excited to announce that the recording of our recent MultiXscale CECAM webinar: Supporting the Development of Multiscale Methods via the European Environment for Scientific Software Installations (EESSI) is now available to view! This session dives into the innovative aspects of the MultiXscale CoE, covering key research applications, methodologies, and collaborative efforts that drive our work. Missed the live session? Full webinar available on our YouTube channel.

EuroHPC User Day (22-23 Oct 2024, Amsterdam)

We had a great time at the EuroHPC User Day 2024 in Amsterdam earlier this week. Both MultiXscale and EESSI were strongly represented, and the work we have been doing was clearly appreciated. Find more details about MultiXscale’s participation in this event on the EESSI blog here

Exascale Day: Inspiring Future Scientists

To celebrate Exascale Day, the MultiXscale CoE, in partnership with NCC Slovenia and SLING, opened the doors of the National Institute of Chemistry (NIC) to students from local primary and high schools. The event gave young students a unique chance to explore the world of supercomputers and understand how these powerful machines drive scientific breakthroughs. The day began with a short presentation on the significance of exascale computing in science, highlighting its role in solving complex problems such as climate research, drug discovery, and advanced simulations. Following this, students toured the NIC facility, where they saw the impressive infrastructure behind high-performance computing. By showcasing the potential of exascale computing, the event aimed to spark curiosity and inspire the next generation of scientists and engineers. This collaboration between MultiXscale, NIC, NCC Slovenia, and SLING demonstrated the importance of education and outreach in shaping the future of computing.

Pre-exascale Workloads of DPD Fluid Simulations Using OBMD on EuroHPC Vega System – Achieving Another Milestone in MultiXscale

We are excited to announce the successful completion of Milestone 5 in the MultiXscale project: “WP4 Pre-exascale workload executed on EuroHPC architecture”. This marks a significant achievement in our ongoing efforts to push the boundaries of computational fluid dynamics. In this milestone, we performed a comprehensive strong scalability study of a DPD fluid in equilibrium using the open boundary molecular dynamics (OBMD) plugin, and compared it to simulations without OBMD, where periodic boundary conditions were applied. One of the standout results from our study was that the inclusion of OBMD did not impact the strong scalability of the LAMMPS code, except for a constant prefactor. This means that while OBMD introduces a small fixed computational overhead, it does not affect the efficiency or the scaling behavior of the code as we increase the computational resources. We observed excellent strong scalability up to at least 80 A100 GPUs, which represents about one-third of the EuroHPC Vega GPU partition. To put this into perspective, each A100 GPU delivers approximately 10 TFlops in double precision and 20 TFlops in single precision, so the 80-GPU workload we tested ran on hardware with between 0.75 and 1.5 Petaflops of raw compute capability, highlighting the immense computational power being utilized. Such scalability is crucial for large-scale simulations, as it ensures our methods remain efficient even when deployed on advanced supercomputing infrastructures. The strong scalability of our simulations is vital for researchers aiming to model complex fluid dynamics at large scales. By confirming that OBMD scales effectively alongside LAMMPS, we can confidently apply this method to larger, more detailed simulations without worrying about performance degradation. This opens up new possibilities for highly accurate simulations in fields such as materials science, biophysics, and chemical engineering. With Milestone 5 now complete, we are moving closer to our overarching goal of enabling multi-scale simulations that leverage the power of modern supercomputing resources. This achievement highlights the robustness and efficiency of our OBMD approach, and we are excited to continue exploring its applications in even more complex scenarios.

Book of Abstracts of the 2024 ESPResSo Summer School “Simulating soft matter across scales”

By Jean-Noël Grad
The ESPResSo summer school is a CECAM flagship school organized every year by the Institute for Computational Physics at the University of Stuttgart to train students and scientists in simulation software for soft matter physics and foster synergies between simulation experts and experimentalists. The 2024 edition focused on multiscale simulation methods. The event attracted 45 attendees and featured 14 talks and 11 posters. Lectures introduced the audience to particle-based simulations, long-range solvers for electrostatics, machine-learned inter-atomic potentials, coarse-graining techniques, and continuum methods, with a focus on the lattice-Boltzmann method. In hands-on sessions, participants learned to use ESPResSo, waLBerla, lbmpy and pystencils to model coarse-grained particle-based systems (noble gases, polymer diffusion, electrolytic capacitors), particle-fluid interactions at the mesoscale (sedimentation dynamics), and solvent-solute interaction at the continuum level (diffusion-advection-reaction methods). Field experts shared their experience in coarse-graining and machine-learning techniques to automatically transform atomistic descriptions of molecules into coarse-grained descriptions and vice-versa, improving sampling of the conformational space of disordered polymers, combining molecular dynamics and Monte Carlo algorithms to titrate polyelectrolytes, and studying protein and DNA motion at different levels of resolution using multiscale and multiphysics software. The conference contributions have been collected into this book of abstracts. The talk slides can be obtained from the event website in the “Documents” tab, and recorded lectures are available on the YouTube channel ESPResSo Simulation Package.
