TITLE: Global Convergence of Frank Wolfe on One Hidden Layer Networks.
AUTHORS: Alexandre d'Aspremont, Mert Pilanci.
ABSTRACT: We derive global convergence bounds for the Frank Wolfe algorithm when
training one hidden layer neural networks. When using the ReLU activation
function, and under tractable preconditioning assumptions on the sample data
set, the linear minimization oracle used to incrementally form the solution can
be solved explicitly as a second-order cone program. The classical Frank Wolfe
algorithm then converges with rate O(1/T), where T is both the number of
neurons and the number of calls to the oracle.
STATUS: Preprint.
ARXIV PREPRINT: 2002.02208
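
For illustration, the classical Frank-Wolfe iteration described in the abstract can be sketched on a simple convex problem. The code below is a minimal, hypothetical example (not the paper's neural-network training setup): it minimizes a least-squares objective over an l1 ball, where the linear minimization oracle has a simple closed form, and uses the standard 2/(t+2) step size that yields the O(1/T) rate.

```python
# Minimal sketch of the classical Frank-Wolfe method (illustrative only;
# not the paper's one-hidden-layer ReLU setup, where the oracle is a
# second-order cone program). Minimizes f(x) = ||Ax - b||^2 over the
# l1 ball {x : ||x||_1 <= r}. All problem data here is hypothetical.

def frank_wolfe_l1(A, b, r, T):
    m, n = len(A), len(A[0])
    x = [0.0] * n                      # feasible start: 0 is in the l1 ball
    for t in range(T):
        # gradient of ||Ax - b||^2 is 2 * A^T (Ax - b)
        resid = [sum(A[i][j] * x[j] for j in range(n)) - b[i]
                 for i in range(m)]
        grad = [2.0 * sum(A[i][j] * resid[i] for i in range(m))
                for j in range(n)]
        # linear minimization oracle over the l1 ball: the minimizer of
        # <grad, s> is the vertex -r * sign(grad_k) * e_k at the
        # coordinate k with largest |grad_k|
        k = max(range(n), key=lambda j: abs(grad[j]))
        s = [0.0] * n
        s[k] = -r if grad[k] > 0 else r
        gamma = 2.0 / (t + 2)          # classical step size, gives O(1/T)
        x = [(1 - gamma) * xj + gamma * sj for xj, sj in zip(x, s)]
    return x

# toy instance: A is the identity, so the unconstrained optimum b lies
# strictly inside the l1 ball and the iterates approach it
A = [[1.0, 0.0], [0.0, 1.0]]
b = [0.5, -0.25]
x = frank_wolfe_l1(A, b, r=1.0, T=200)
```

Each iterate is a convex combination of at most T oracle vertices, which is what lets the paper identify T with both the iteration count and the number of neurons in the incrementally built network.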