Ludger Paehler


Sessions

02-22
15:00
15min
A Cross-Language Probabilistic Programming Protocol for Physics and Beyond
Tim Gymnich, Ludger Paehler

In this short technical talk we will present a first prototype which, in the spirit of Etalumis, seeks to enable seamless composability between high-performance computation and probabilistic programming by leveraging the compiler-based infrastructure of Enzyme. To this end we will introduce a new mode in Enzyme that enables the integration of classical high-performance-computing simulators into probabilistic programming systems: the simulator is compiled ahead of time against the newly introduced compiler-based probabilistic programming protocol, which relies on Enzyme's activity and type analyses. We will conclude by presenting proof-of-concept frontends to commonly used probabilistic programming systems such as Gen, Pyro, and BlackJAX.
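To give a rough flavor of what such a frontend could look like, the sketch below wraps an ahead-of-time-compiled simulator in a Pyro model via ctypes. The shared-library name `libsimulator.so` and its `simulate` entry point are illustrative assumptions, not the protocol's actual interface.

```python
import ctypes

import pyro
import pyro.distributions as dist
import torch

# Assumption for illustration: a simulator compiled ahead of time with the
# Enzyme-based protocol, exposing a C entry point `double simulate(double)`.
lib = ctypes.CDLL("./libsimulator.so")
lib.simulate.argtypes = [ctypes.c_double]
lib.simulate.restype = ctypes.c_double


def model(observation):
    # Draw the simulator's latent parameter from a prior.
    theta = pyro.sample("theta", dist.Normal(0.0, 1.0))
    # Hand the sampled parameter to the compiled high-performance simulator.
    mean = lib.simulate(theta.item())
    # Score the observation against the simulator output.
    pyro.sample("obs", dist.Normal(torch.tensor(mean), 0.1), obs=observation)
```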

Technical Talk
ECNT 312
02-22
15:15
15min
Numba-Enzyme: A Differentiable JIT-ed Python
Ludger Paehler, Jan Hueckelheim, Nikolaus A. Adams, Lukas Heinrich

In this short technical talk we will present a first prototype, as well as a forward-looking roadmap, for Numba-Enzyme, a gradient-providing just-in-time (JIT) compiler for Python that relies on Numba to JIT-compile Python kernels and on Enzyme to provide their gradients. Incorporating recent advances in Enzyme's forward mode, we will present the JIT pipeline for both forward- and reverse-mode differentiation, along with first performance comparisons to C++ code differentiated with Enzyme and to gradient computation in JAX. We will conclude with an outlook on future extensions exposing Enzyme's vectorization, as well as kernel composability between Numba-Enzyme and JAX, to enable users to leverage the strengths of both ecosystems.
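As a hedged sketch of the intended user experience, the example below writes a plain Numba kernel and asks for its gradient; the `numba_enzyme` module and its `grad` helper are hypothetical placeholders, since the prototype's actual API may differ.

```python
import numba

import numba_enzyme  # hypothetical module name, used here for illustration only


@numba.njit
def loss(x):
    # An ordinary Numba-JIT-ed scalar kernel.
    return x * x + 3.0 * x


# Hypothetical call: request the reverse-mode derivative of the kernel,
# with Enzyme differentiating the LLVM IR that Numba produces.
dloss = numba_enzyme.grad(loss)

print(loss(2.0))   # 10.0
print(dloss(2.0))  # 7.0, since d/dx (x^2 + 3x) = 2x + 3
```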

Technical Talk
ECNT 312
02-22
16:00
60min
Enzyme Tutorial
William Moses, Ludger Paehler

Derivatives are key to algorithms in scientific computing and machine learning, such as training neural networks, optimization, uncertainty quantification, and stability analysis. While machine learning frameworks rely on differentiable domain-specific languages (DSLs), the automatic differentiation (AD) of general-purpose programs poses significant challenges with regard to the programming languages to be differentiated and the differentiability of language features such as parallelism and heterogeneity. Enzyme is an LLVM-based compiler plugin for automatic differentiation of statically analyzable programs expressed in the LLVM intermediate representation (IR), and thus generates fast gradients of programs in a variety of languages (C/C++, Fortran, Julia, Rust, Swift, etc.) and for a variety of architectures (CPU, CUDA, ROCm). While existing tools operate at the source level, Enzyme differentiates after the application of compile-time optimizations, which allows for asymptotically faster gradients. The need to optimize first is especially pronounced when differentiating parallel, and specifically GPU, programs, where data races and complex memory hierarchies can dramatically alter runtimes.

This tutorial is aimed at both potential users of automatic differentiation and compiler writers who may want to enable automatic differentiation in their compiler. Participants will be given an interactive introduction to automatic differentiation with Enzyme. Along the way, we will cover the foundations of automatic differentiation, how to use Enzyme to differentiate programs, parallel and GPU-specific differentiation, and all the tools necessary to enable Enzyme for your compiler of choice.
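As a taste of the foundations covered in the tutorial, the self-contained sketch below implements forward-mode AD with dual numbers in Python; it illustrates the underlying mathematics only and is not how Enzyme operates on LLVM IR.

```python
class Dual:
    """A dual number a + b*eps with eps**2 = 0; b carries the derivative."""

    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (a + a'eps)(b + b'eps) = ab + (a'b + ab')eps.
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)

    __rmul__ = __mul__


def f(x):
    return x * x * x + 2.0 * x  # f(x) = x^3 + 2x, so f'(x) = 3x^2 + 2


y = f(Dual(3.0, 1.0))  # seed dx/dx = 1 to propagate the derivative forward
print(y.val, y.dot)    # 33.0 29.0
```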

Tutorial
ECNT 312
02-22
13:15
15min
Opening
William Moses, Oleksandr Zinenko, Ludger Paehler, Jed Brown, Leila Ghaffari, Tim Gymnich, Patrick Heimbach

Opening remarks and PC report

Other Format
ECNT 312