ARCH-COMP25 Repeatability Evaluation Report

5 pages • Published: December 22, 2025

Abstract

The repeatability evaluation for the 9th International Competition on Verifying Continuous and Hybrid Systems (ARCH-COMP'25) is summarized in this report. The competition was held as part of the Applied Verification for Continuous and Hybrid Systems (ARCH) workshop in 2025. In its 9th edition, participants submitted their tools via an automated evaluation system developed over recent years. Each submission includes a Dockerfile and the scripts needed to run the tool, enabling consistent execution in a containerized environment with all dependencies preinstalled. This setup improves comparability, since all tools run on the same hardware. Submissions and results are automatically synchronized with a Git repository for repeatability evaluation and long-term archiving. We plan to further extend the evaluation system by refining the submission pipeline, aiming to enable automated evaluation across all competition categories.

Keyphrases: closed loop, dynamic system, formal verification, neural networks, reachability analysis, safe AI

In: Goran Frehse and Matthias Althoff (editors). Proceedings of the 12th Int. Workshop on Applied Verification for Continuous and Hybrid Systems, vol. 108, pages 190-194.
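For illustration, the sketch below shows how a Dockerfile-based submission of this kind might be built and executed from a driver script. It is a minimal example only: the image tag, the assumption that each submission ships a Dockerfile at its root, the /results mount point, and the directory names are hypothetical and not taken from the actual ARCH-COMP evaluation system.

```python
import subprocess
from pathlib import Path


def evaluate_submission(submission_dir: Path, results_dir: Path,
                        tag: str = "arch-comp/tool:latest") -> None:
    """Build a submission's Docker image and run it in a container.

    The tag, the mount point /results, and the layout of submission_dir are
    illustrative assumptions, not the competition's actual conventions.
    """
    # Build the image from the Dockerfile included with the submission,
    # so all dependencies are preinstalled in the container.
    subprocess.run(["docker", "build", "-t", tag, str(submission_dir)],
                   check=True)

    # Run the tool; results are written to a mounted host directory so they
    # can later be synchronized with the archiving Git repository.
    results_dir.mkdir(parents=True, exist_ok=True)
    subprocess.run(
        ["docker", "run", "--rm",
         "-v", f"{results_dir.resolve()}:/results",
         tag],
        check=True,
    )


if __name__ == "__main__":
    # Hypothetical paths for a single submission.
    evaluate_submission(Path("submissions/example_tool"),
                        Path("results/example_tool"))
```

Running every submission through the same build-then-run sequence on shared hardware is what makes the resulting measurements comparable across tools.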

