
Using EESSI in GitHub Action workflows

By Jean-Noël Grad

GitHub continuous integration and continuous delivery (CI/CD) pipelines can leverage EESSI to download pre-built scientific software using the EESSI GitHub Action. GitHub workflows are routinely used to execute test suites, generate and deploy software documentation, and run executable papers.

As a real-world example, we will explore pyMBE, a molecular builder that simplifies and automates the creation of complex molecular models in the molecular dynamics engine ESPResSo. As part of pyMBE’s software quality assurance, every code contribution is automatically tested against stable ESPResSo releases. This is achieved by a workflow called testsuite.yml, which loads ESPResSo 4.2.1 from EESSI and installs the subset of Python dependencies not already provided by EESSI. In a subsequent stage, the test suite is executed to check that the software behavior meets our specifications and reproduces published results. The software user guide is also generated to verify compliance with the Sphinx specifications, and uploaded as an artifact that can be downloaded by human reviewers to confirm that any new feature is properly documented.

After a contribution is merged to the main branch, and upon successful completion of the test suite on the main branch, another workflow called deploy.yml automatically reads and uploads the documentation artifact to the pyMBE online user guide, which is hosted on GitHub Pages.
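To make this more concrete, here is a minimal sketch of the shell steps such a CI job runs once the EESSI GitHub Action has mounted the EESSI CernVM-FS repository. The module version suffix, dependency file and test commands below are illustrative assumptions, not copied from pyMBE’s actual testsuite.yml.

```bash
# Sketch of the core CI steps once EESSI is available; names below are assumptions.

# Initialize the EESSI environment (the EESSI GitHub Action does this for you).
source /cvmfs/software.eessi.io/versions/2023.06/init/bash

# Load the pre-built ESPResSo installation provided by EESSI.
module load ESPResSo/4.2.1-foss-2023a   # exact version/toolchain suffix may differ

# Install the subset of Python dependencies not already provided by EESSI.
python3 -m pip install --user -r requirements.txt   # hypothetical file name

# Run the test suite and build the Sphinx user guide.
python3 -m pytest testsuite/                        # hypothetical test layout
sphinx-build -b html docs/ docs/_build/html         # hypothetical docs layout
```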

pyMBE: The Python-based molecule builder for ESPResSo

By Jean-Noël Grad

We are happy to announce the first release of pyMBE, an open-source Python package designed to facilitate the creation of custom coarse-grained models of polyelectrolytes, peptides and proteins in ESPResSo (https://doi.org/10.5281/zenodo.12102635). pyMBE extends the ESPResSo API with methods that automate repetitive and error-prone tasks, such as setting up chemical bonds, non-bonded interactions and reaction methods. pyMBE is maintained by an active community of soft matter researchers with a shared interest in the modeling of weak polyelectrolytes and biomacromolecules. We welcome new users and developers to join the project and contribute new features! Learn more about pyMBE in our recent publication in The Journal of Chemical Physics (https://doi.org/10.1063/5.0216389), where we outline the main features of pyMBE and show how it can be leveraged in computational soft matter research.

HPC Knowledge Meeting – HPCKP Barcelona, May 2024

The recording and presentation of the talk “Streaming scientific software has never been so EESSI”, given by Alan O’Cais at HPCKP’24 in Barcelona, are already available online.

Abstract: Have you ever wished that all the scientific software you use was available on all the resources you had access to, without having to go through the pain of getting it installed the way you want or need? The European Environment for Scientific Software Installations (EESSI – pronounced “easy”) is a common stack of scientific software installations for HPC systems and beyond, including laptops, personal workstations and cloud infrastructure. In many ways it works like a streaming service for scientific software, instantly giving you the software you need, when you need it, compiled to work efficiently for the architecture you have access to. In this talk, we’ll explain what EESSI is, how it is being designed, how to get access to it, and how to use it. We’ll include a number of demonstrations and review significant developments of the last 12 months (including support for NVIDIA GPUs and active development for RISC-V systems).
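As a small illustration of the “streaming service” idea, the following sketch shows how EESSI is typically initialized on a machine where the EESSI CernVM-FS repository is mounted; the stack version and module shown are assumptions and may differ on your system.

```bash
# Sketch: using software from EESSI on a machine where the
# /cvmfs/software.eessi.io repository is mounted via CernVM-FS.
# Stack version and module name are assumptions; adjust to your system.

# Initialize the EESSI environment for the CPU architecture detected at runtime.
source /cvmfs/software.eessi.io/versions/2023.06/init/bash

# List the software built for this architecture...
module avail

# ...and load what you need, streamed on demand from the EESSI repository.
module load GROMACS
```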

MultiXscale at InPEx Workshop 2024

The International Post-Exascale Project (InPEx) is a pioneering initiative bringing together the brightest minds in the field of high-performance computing, from researchers and engineers to policy organizations and funding bodies. Its workshops, accompanied by workgroups of experts dedicated to critical topics for Exascale, are designed to foster international collaboration and co-design, essential on the journey towards and beyond Exascale computing. The technical manager of MultiXscale (Alan O’Cais) was invited to participate in the 2024 InPEx Workshop, and in particular contributed to the “Software production and management” topic of the workshop. Lively discussions were held as part of that topic, with general agreement that creating a software installation description format could be very useful to encourage collaboration between the various software installation tools being used “in the wild” (at this particular workshop, EasyBuild/EESSI, Spack and Guix were represented).

First portable test run on two systems with different architectures

One of the milestones that we have in MultiXscale is to be able to run the EESSI test suite on at least two different architectures. In the context of EuroHPC, that means running on different partitions of the available EuroHPC supercomputers. Our initial effort focused on getting the test suite portable between two different supercomputers: Karolina and Vega (the CPU partitions of both use the Zen2 architecture). More recently we have spent time getting the same test suite working on a more “exotic” architecture, the Arm A64FX architecture of Deucalion (currently in pre-production). This has some additional complications for us, as CernVM-FS is not yet natively available there. The performance/scalability plots we measured for the ESPResSo application of MultiXscale are shown below. For full technical details of how we carried this out (and how you can repeat it for yourself!), please take a look at the EESSI blog post on this milestone. We look forward to reporting increased performance for ESPResSo in the future as we implement some of the ideas suggested in our recent deliverable on this topic.
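For readers who want to try something similar, below is a rough sketch of how a ReFrame-driven test suite such as the EESSI test suite is typically launched on an HPC system; the package name, test path and configuration file are assumptions, so please follow the EESSI blog post and the test suite documentation for the authoritative steps.

```bash
# Rough sketch of launching the (ReFrame-based) EESSI test suite on one system;
# the package name, test path and config file below are assumptions.

# Install ReFrame and the EESSI test suite into a user environment.
python3 -m pip install --user reframe-hpc eessi-testsuite

# Point ReFrame at a site configuration describing this system's partitions
# (e.g. Zen2 CPU nodes or A64FX nodes) and its batch scheduler.
export RFM_CONFIG_FILES=$HOME/reframe_config.py    # hypothetical per-site file

# Run the ESPResSo tests from the suite and collect a performance report.
reframe --checkpath eessi/testsuite/tests --recursive \
        --name ESPResSo --run --performance-report
```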

New Paper available at Faraday Discussions Journal

A new paper, “Investigating the effect of particle size distribution and complex exchange dynamics on NMR spectra of ions diffusing in disordered porous carbons through a mesoscopic model”, is now available in the Faraday Discussions journal. The document is accessible for download here. The work was also presented as a poster at the CECAM workshop “Electrochemical Interfaces in Energy Storage: Advances in Simulations, Methods and Models”, held at CECAM-HQ-EPFL in Lausanne (Switzerland) from 18 to 21 June 2024.

Austrian-Slovenian HPC Meeting 2024 – ASHPC24

Our coordinator, Matej Praprotnik, gave a talk and presented a MultiXscale poster at the Austrian-Slovenian HPC Meeting 2024 – ASHPC24. He also chaired the program committee. The event was held at the Seeblickhotel Grundlsee (Austria) from 11 to 13 June 2024. ASHPC24 was organized by EuroCC Austria (the National Competence Centre for Supercomputing, Big Data and Artificial Intelligence, Austria) and EuroCC Slovenia, in cooperation with the Vienna Scientific Cluster (VSC), Austria; the Research Area Scientific Computing in Innsbruck, Austria; and the Slovenian consortium for high-performance computing (SLING). Further information is available online here: https://www.ashpc.at/

MultiXscale at ISC Hamburg, May 2024

Several members of the MultiXscale CoE participated in ISC High Performance 2024, held from 12 to 16 May in Hamburg (Germany). This event connects public and industry users with technology developers in the fields of High Performance Computing, Machine Learning, Data Analytics & Quantum Computing. Read further details about MultiXscale’s participation in this event on the brand-new EESSI blog here.

ESPResSo Summer School, “Simulating soft matter across scales”, October 7-11, 2024, Stuttgart, Germany

We invite all interested to attend the ESPResSo summer school “Simulating soft matter across scales” on October 7-11, 2024, at the University of Stuttgart, Germany. The school will focus on coarse-grained and lattice-based simulation methods to model soft matter systems at mesoscopic length and time scales. We will simulate coarse-grained ionic liquids in electrolytic capacitors, coarse-grained liquids with machine-learned effective potentials, polymer diffusion, hydrodynamic interactions via the lattice-Boltzmann method, and electrokinetics and catalysis with diffusion-advection-reaction solvers. Lectures will provide an introduction to the physics and simulation model building, as well as an overview of the necessary simulation algorithms to resolve physical processes at different time scales. During the afternoon, students will practice running their own simulations in hands-on sessions using ESPResSo and waLBerla. Time will also be dedicated to research talks and poster sessions.

Attendance at the summer school is free of charge. To register, go to https://www.cecam.org/workshop-details/1324 and provide a short motivation and CV. You can submit a poster abstract until September 1, 2024. Feel free to forward this announcement to interested colleagues. A flyer is available at: https://members.cecam.org/storage/workshop_files/flyer-1715854439.pdf

All-Hands Meeting in Slovakia, 2024

The All-Hands Meeting held in Slovakia from April 22-24, 2024, was a resounding success, bringing together members from CASTIEL 2, EuroCC 2, and the EuroHPC CoE consortia. This productive event fostered important discussions and set a clear path forward for our collaborative efforts. Discussions focused on enhancing collaboration to attract new users and organize effective training sessions; identifying and promoting mutual benefits for the CoEs’ target groups and disseminating knowledge across NCCs and CoEs were also explored.

Copyright ©2024 EuroCC 2, CASTIEL 2, and EuroHPC CoE Consortia
