Feb 25 – 27, 2026
Technical University of Braunschweig
Europe/Berlin timezone

JUBE: An Environment for Systematic Benchmarking and Scientific Workflows

Not scheduled
30m
SN 20.2 (Technical University of Braunschweig)

Developer Talk (Contributed Sessions)

Speaker

Pit Steinbach (Forschungszentrum Juelich)

Description

A key aspect of developing research software is testing the installation and the expected results on various configurations, as well as benchmarking performance, preferably continuously. This applies especially to software that targets high-performance computing (HPC) installations across Europe and around the world. For these applications, performance, scalability, and efficiency are key metrics that need to be monitored and compared across systems. Due to the complexity of these technical installations, individual scripts written for a specific system lack portability, reusability, and reproducibility.

These challenges are addressed by the Jülich Benchmarking Environment (JUBE) [1], developed at the Jülich Supercomputing Centre (JSC). JUBE is a generic, lightweight framework that automates the systematic execution, monitoring, and analysis of applications. It is free, open-source software [2] implemented in Python that follows a declarative-configuration paradigm: experiments are defined in human-readable YAML or XML files, from which JUBE automates script generation, job submission, and result analysis. Its standardized configuration format simplifies collaboration and improves the usability of research software. JUBE integrates seamlessly with CI/CD pipelines, enabling automated regression testing, performance tracking, and benchmarking as part of HPC software development workflows.
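As a minimal illustration of the declarative style, the following sketch is adapted from the introductory JUBE hello-world example; the benchmark, parameterset, and step names are illustrative:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<jube>
  <benchmark name="hello_world" outpath="bench_run">
    <!-- A parameterset defines variables that steps can substitute -->
    <parameterset name="param_set">
      <parameter name="hello_str">Hello World</parameter>
    </parameterset>
    <!-- A step is a unit of work; JUBE runs it once per parameter combination -->
    <step name="say_hello">
      <use>param_set</use>
      <do>echo $hello_str</do>
    </step>
  </benchmark>
</jube>
```

Running such a file with `jube run` creates a workspace under the configured outpath and executes each step; the same experiment could equally be expressed in YAML.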

The entry barrier for JUBE is relatively low, as it builds on basic knowledge of the Linux shell and either XML or YAML, and extensive documentation including tutorials and advanced examples is available [2]. Offering a high degree of flexibility, JUBE can be used in every phase of the HPC software development pipeline. Example use cases include standard benchmarks that track a project's performance over the course of its development, and systematic studies that explore parameter combinations, including orchestrated scaling experiments, which have already been shown to streamline the application process for HPC compute resources [3]. JUBE has been used to successfully automate a large variety of scientific codes and standard HPC benchmarks, with the corresponding configurations available as open source [4]. The software is easy to install, and configurations also exist for the package managers EasyBuild [5] and Spack [6]. Further projects have been built on top of JUBE [7,8].
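A sketch of typical installation routes, assuming a Python environment or a Spack installation is already set up:

```shell
# Install from PyPI (the package is published under the name "JUBE")
pip install JUBE

# ...or via the Spack package manager [6]
spack install jube

# Verify the installation and list available subcommands
jube --version
jube --help
```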

In conclusion, JUBE is well-established software that has already been used in several national and international projects and on numerous, diverse HPC systems [9-16]. Given its broad scope and range of applications, JUBE is likely to be of interest to anyone working in the HPC software sector.

This presentation will provide an overview of JUBE, covering its core principles, current status, and future development roadmap. Additionally, two illustrative use cases will demonstrate JUBE's practical applications: first, benchmarking as part of the procurement of JUPITER, Europe's first exascale supercomputer [3]; second, gaining continuous insight into HPC system health through the regular execution of applications and the graphical presentation of their results.

[1] JUBE documentation: https://apps.fz-juelich.de/jsc/jube/docu/index.html
[2] JUBE source repository: https://github.com/FZJ-JSC/JUBE
[3] JUREAP: https://www.fz-juelich.de/en/jsc/jupiter/jureap
[4] JUBE benchmark configurations: https://github.com/FZJ-JSC/jubench
[5] EasyBuild: https://github.com/easybuilders/easybuild-easyconfigs/tree/develop/easybuild/easyconfigs/j/JUBE
[6] Spack: https://packages.spack.io/package.html?name=jube
[7] UncleBench: https://github.com/edf-hpc/unclebench
[8] https://dl.acm.org/doi/10.1145/3733723.3733740
[9] MAX CoE: https://max-centre.eu/impact-outcomes/key-achievements/benchmarking-and-profiling/
[10] RISC2: https://risc2-project.eu/?p=2251
[11] EoCoE: https://www.eocoe.eu/technical-challenges/programming-models/
[12] DEEP: https://deep-projects.eu/modular-supercomputing/software/benchmarking-and-tools/
[13] DEEP-EST: https://cordis.europa.eu/project/id/754304/reporting
[14] IO-SEA: https://cordis.europa.eu/project/id/955811/results
[15] EPICURE: https://epicure-hpc.eu/wp-content/uploads/2025/07/EPICURE-BEST-PRACTICE-GUIDE-Power-measurements-in-EuroHPC-machines_v1.0.pdf
[16] UNSEEN: https://juser.fz-juelich.de/record/1007796/files/UNSEEN_ISC_2023_Poster.pdf

Authors

Pit Steinbach (Forschungszentrum Juelich), Thomas Breuer (Forschungszentrum Juelich)

Co-authors

Dr Filipe Guimaraes (Forschungszentrum Juelich), Dr Jan-Oliver Mirus (Forschungszentrum Juelich), Dr Wolfgang Frings (Forschungszentrum Juelich)
