“Lipschitz Function Approximation using DeepSpline Neural Networks”
September 8, 2022 | Time 12:00 CET
In this talk, we investigate neural networks (NNs) with prescribed bounds on their Lipschitz constant. One way to obtain Lipschitz-constrained NNs is to impose constraints on the architecture. It turns out that this significantly limits expressivity if we use the popular ReLU activation function: such networks cannot represent even simple continuous piecewise-linear functions. Using learnable linear splines instead fixes this problem and yields maximal expressivity among all component-wise activation functions. Among the many possible applications of Lipschitz-constrained NNs, we discuss one in more detail and show that the theoretical observations also translate into improved performance.
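As an illustration of the idea (not taken from the talk itself), here is a minimal NumPy sketch of a linear spline activation: a continuous piecewise-linear function defined by learnable knot values, whose Lipschitz constant is simply the largest absolute slope between consecutive knots and can therefore be constrained directly. The absolute-value example below is a standard one from the Lipschitz-approximation literature; the function names and knot choices are my own for this sketch.

```python
import numpy as np

def linear_spline(x, knots, values):
    """Evaluate a continuous piecewise-linear (linear spline) activation
    interpolating the points (knots[i], values[i]). Outside the outermost
    knots, np.interp extends with the endpoint values (constant
    extrapolation), so the activation remains Lipschitz."""
    return np.interp(x, knots, values)

def lipschitz_constant(knots, values):
    """Lipschitz constant of the spline: the largest absolute slope
    between consecutive knots."""
    slopes = np.diff(values) / np.diff(knots)
    return np.max(np.abs(slopes))

# Three knots suffice to reproduce the absolute value on [-1, 1],
# with Lipschitz constant exactly 1 -- a simple piecewise-linear
# target that norm-constrained 1-Lipschitz ReLU networks cannot
# represent with componentwise activations.
knots = np.array([-1.0, 0.0, 1.0])
values = np.array([1.0, 0.0, 1.0])

x = np.linspace(-1.0, 1.0, 9)
print(np.allclose(linear_spline(x, knots, values), np.abs(x)))  # True
print(lipschitz_constant(knots, values))  # 1.0
```

In a trainable version, the `values` (or the slopes between knots) would be the learnable parameters, projected or penalized so that the maximal slope stays below the prescribed Lipschitz bound.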
Sebastian Neumayer studied mathematics at TU Kaiserslautern and received his PhD in 2020 from TU Berlin under the supervision of Gabriele Steidl. Currently, he is a postdoctoral researcher in the Biomedical Imaging Group at EPFL. His main research interests center on convex analysis, inverse problems, and theoretical aspects of neural networks. In recent months, he has focused on the stability properties of neural networks and on designing new network architectures.