POPL 2024
Sun 14 - Sat 20 January 2024 London, United Kingdom
Fri 19 Jan 2024 15:30 - 15:50 at Kelvin Lecture - Machine and Automata Learning Chair(s): Steven Holtzen

The non-linearity of activation functions presents a major challenge in the verification of neural networks. Because complete reasoning over exact activation functions is computationally expensive, various studies have been devoted to over-approximating them efficiently and tightly (soundly, though often incompletely). One commonly used approach is to employ convex hulls, which reduce the activation functions to linear constraints. However, computing the exact convex hull per se is costly, and striking a balance between efficiency and precision in its approximation remains a challenging task.
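
To make the idea of reducing an activation function to linear constraints concrete, the following minimal sketch (not the paper's method) encodes the classical single-neuron relaxation: for an unstable neuron with input bounds l < 0 < u, the convex hull of the graph of y = ReLU(x) is exactly the "triangle" described by three linear constraints.

```python
import numpy as np

def single_relu_hull(l: float, u: float):
    """Linear constraints describing the convex hull of y = ReLU(x)
    for x in [l, u] with l < 0 < u (the classic triangle relaxation).

    Returns (A, b) such that the hull is {(x, y) : A @ [x, y] <= b}.
    """
    assert l < 0 < u, "the hull is only non-trivial when the interval straddles 0"
    slope = u / (u - l)  # slope of the single upper face
    # Lower faces: y >= 0 and y >= x, i.e.  -y <= 0  and  x - y <= 0.
    # Upper face:  y <= slope * (x - l), i.e.  -slope*x + y <= -slope*l.
    A = np.array([
        [0.0, -1.0],
        [1.0, -1.0],
        [-slope, 1.0],
    ])
    b = np.array([0.0, 0.0, -slope * l])
    return A, b

if __name__ == "__main__":
    A, b = single_relu_hull(-1.0, 2.0)
    # Every point on the true ReLU graph must satisfy the constraints.
    for x in (-1.0, 0.0, 2.0):
        point = np.array([x, max(x, 0.0)])
        assert np.all(A @ point <= b + 1e-9)
    print(A, b)
```

In the multi-neuron setting addressed by the paper, the analogous object lives in a higher-dimensional input-output space, which is where computing or approximating the exact hull becomes expensive.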

In this work, we propose a fast and precise over-approximation approach for the convex hull of the ReLU function (referred to as the ReLU hull), one of the most widely used activation functions. Our key insight is to formulate a convex polyhedron that "wraps" the ReLU hull by reusing the linear pieces of the ReLU function as the lower faces and constructing upper faces adjacent to them. Relying on the edges and vertices of the lower faces, the upper faces can be constructed efficiently, since an $n$-dimensional (or simply $n$d hereafter) hyperplane is determined by an $(n-1)$d hyperplane and a point outside of it. This approximation method takes full advantage of a unique property of the ReLU hull, namely that it shares the linear pieces of the original ReLU function as its lower faces, enabling efficient determination of the forms of the adjacent faces and vertices. Additionally, the adjacency of the newly generated upper faces to the exact lower faces, together with the inclusion of the exact vertices, yields a tight approximation.
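
As an illustration of the geometric fact the construction relies on (a sketch, not the authors' implementation), the snippet below recovers a hyperplane from $n$ affinely independent points in $\mathbb{R}^n$, e.g. the vertices of an $(n-1)$d face of a lower piece plus one additional vertex that the new upper face must pass through. The 2d usage example reproduces the upper face of the single-neuron triangle relaxation.

```python
import numpy as np

def hyperplane_through(points):
    """Given n affinely independent points in R^n, return (a, b) with
    a @ x == b for every input point, i.e. the unique hyperplane through them.

    This mirrors the fact that an n-d hyperplane is fixed by an (n-1)-d
    face (its vertices) plus one point outside of that face.
    """
    pts = np.asarray(points, dtype=float)
    n = pts.shape[1]
    assert pts.shape[0] == n, "need exactly n points in R^n"
    # Homogeneous system: [x_i | -1] @ [a; b] = 0, so the coefficient vector
    # spans the null space, which we read off from the SVD.
    M = np.hstack([pts, -np.ones((n, 1))])
    _, _, vh = np.linalg.svd(M)
    coeffs = vh[-1]                # null-space vector, up to scale
    return coeffs[:n], coeffs[n]   # (a, b)

if __name__ == "__main__":
    # 2d example with l = -1, u = 2: the vertex (l, 0) of the lower face y = 0
    # and the vertex (u, u) on the ReLU graph fix the upper face of the hull.
    a, b = hyperplane_through([[-1.0, 0.0], [2.0, 2.0]])
    # The midpoint of that face, (0.5, 1.0), must also satisfy a @ x == b.
    assert abs(a @ np.array([0.5, 1.0]) - b) < 1e-9
    print(a, b)
```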

We have implemented our method as WraLU and evaluated its performance in terms of precision, efficiency, constraint complexity, and scalability in approximating ReLU hulls. WraLU outperforms existing advanced methods by generating fewer constraints while achieving a tighter approximation in less time. It is also versatile: it handles arbitrary input polyhedra and higher-dimensional cases, which are beyond the capabilities of existing methods. We integrate WraLU into PRIMA, a state-of-the-art neural network verifier based on ReLU hull approximation, and apply it to verify large-scale ReLU-based neural networks. Our experimental results demonstrate that WraLU achieves high efficiency without compromising precision: it reduces the number of constraints passed to the linear programming solver by up to half while delivering comparable or even superior results to state-of-the-art verifiers.

Fri 19 Jan

Displayed time zone: London

15:10 - 16:30
Machine and Automata Learning (POPL) at Kelvin Lecture
Chair(s): Steven Holtzen (Northeastern University)
15:10
20m
Talk
Efficient CHAD
POPL
Tom Smeding (Utrecht University), Matthijs Vákár (Utrecht University)
15:30
20m
Talk
ReLU Hull Approximation
POPL
Zhongkui Ma (The University of Queensland), Jiaying Li (Microsoft), Guangdong Bai (The University of Queensland)
15:50
20m
Talk
On Learning Polynomial Recursive Programs
POPL
Alex Buna-Marginean (University of Oxford), Vincent Cheval (Inria Paris), Mahsa Shirmohammadi (CNRS & IRIF, Paris), James Worrell (University of Oxford)
16:10
20m
Talk
Programming-by-Demonstration for Long-Horizon Robot Tasks
POPL
Noah Patton (The University of Texas at Austin), Kia Rahmani (The University of Texas at Austin), Meghana Missula (The University of Texas at Austin), Joydeep Biswas (The University of Texas at Austin), Işıl Dillig (The University of Texas at Austin)