December 7, 2023
While formal reasoning in the machine learning domain has achieved remarkable success in certifying properties specified over highly complex, non-linear functions (e.g., DNNs), lifting this formal reasoning to these functions’ derivatives has been severely understudied. Further, because gradient computations, and more broadly differentiable programming, form the backbone of modern machine learning and are ubiquitous in other applications like Scientific Computing and Optimization, the immediate needs of practitioners have rapidly outpaced formal development on the programming languages side.
To address these challenges, I present a framework for abstract interpretation of Differentiable Programming and Automatic Differentiation (AD), and show how this idea lets us cleanly, formally, and compositionally reason about both a function (e.g., a DNN, scientific model, or optimization objective) and its derivatives in a sound manner, even in the face of points of non-differentiability. I also show how this framework generalizes to arbitrary orders of derivatives and can be instantiated with more expressive abstract domains to further improve generality and precision. Lastly, I conclude by showing how to leverage the structure of AD computations to optimally synthesize abstractions of the chain rule, product rule, and quotient rule, thus maximizing the precision of the abstract interpretation.
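To give a rough flavor of the idea (this is a minimal illustrative sketch, not the talk’s actual framework; the names Interval, IntervalDual, and relu are hypothetical), the Python code below pairs an interval enclosure of a function’s value with an interval enclosure of its derivative and propagates both through forward-mode AD. At a non-differentiable point such as ReLU at 0, soundness is preserved by joining the slopes of all branches the input interval touches:

from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # Standard interval product: take the extremes over all endpoint pairs.
        ps = [self.lo * other.lo, self.lo * other.hi,
              self.hi * other.lo, self.hi * other.hi]
        return Interval(min(ps), max(ps))

@dataclass
class IntervalDual:
    val: Interval  # enclosure of f(x) for all x in the input interval
    dot: Interval  # enclosure of the (generalized) derivative f'(x)

    def __add__(self, other):
        return IntervalDual(self.val + other.val, self.dot + other.dot)

    def __mul__(self, other):
        # Abstract product rule: (f * g)' = f' * g + f * g'.
        return IntervalDual(self.val * other.val,
                            self.dot * other.val + self.val * other.dot)

def relu(x: IntervalDual) -> IntervalDual:
    # ReLU is non-differentiable at 0; if the input interval straddles 0,
    # soundly enclose the slope with the join of both branches, [0, 1].
    val = Interval(max(x.val.lo, 0.0), max(x.val.hi, 0.0))
    if x.val.lo >= 0.0:
        slope = Interval(1.0, 1.0)
    elif x.val.hi <= 0.0:
        slope = Interval(0.0, 0.0)
    else:
        slope = Interval(0.0, 1.0)
    return IntervalDual(val, slope * x.dot)

# Example: enclose f(x) = relu(x) * x and f'(x) over x in [-1, 2].
x = IntervalDual(Interval(-1.0, 2.0), Interval(1.0, 1.0))  # dx/dx = 1
y = relu(x) * x
print(y.val, y.dot)  # sound (possibly loose) bounds on f and f'

The bounds such a sketch produces are sound but can be loose, which is exactly what motivates more expressive abstract domains and optimally synthesized abstractions of the AD rules.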
With this work, we take the first step toward unlocking the potential to define, analyze, and verify a brand-new set of formal properties expressed over derivatives, for a broad and general class of programs.
About Jacob Laurel (UIUC)
Jacob Laurel is a final-year Computer Science PhD candidate at the University of Illinois Urbana-Champaign, advised by Prof. Sasa Misailovic. Jacob’s research interests center on applying insights from continuous mathematics to build program analyses for differentiable and probabilistic programming. His current research focuses on building precise, general, and scalable static analyses for Automatic Differentiation. He has published in conferences including POPL, OOPSLA, ESOP, DAC, DATE, CVPR, and ICLR. He received a Sloan UCEM Scholarship as well as a Mavis Future Faculty Fellowship for his contributions. He earned bachelor’s degrees in both Electrical Engineering and Applied Mathematics (Scientific Computation Track) at the University of Alabama at Birmingham.