
P. Ochs, Y. Chen, T. Brox, T. Pock. iPiano: Inertial Proximal Algorithm for Non-Convex Optimization. Computing Research Repository, abs/1404.4805, 2014.

Stay tuned for a C++ implementation and a seminar paper on this topic!

Ochs et al. combine two streams of research in optimization: proximal algorithms, in particular forward-backward splitting, and inertial methods in the spirit of the heavy-ball scheme. Specifically, Ochs et al. consider problems of the form

$\min_{x \in \mathbb{R}^n} h(x) = \min_{x \in \mathbb{R}^n}(f(x) + g(x))$,

where $f$ is smooth and $g$ is convex. The proposed algorithm is termed iPiano and is based on the following iteration scheme:

$x^{(n + 1)} = \text{prox}_{\alpha_n g} \left(x^{(n)} - \alpha_n \nabla f(x^{(n)}) + \beta_n(x^{(n)} - x^{(n - 1)})\right)$

where $x^{(n)}$ denotes the $n$-th iterate and $\alpha_n$ and $\beta_n$ are the step size and momentum parameters, respectively. For $\beta_n = 0$, the scheme reduces to classical forward-backward splitting; the inertial term $\beta_n(x^{(n)} - x^{(n - 1)})$ is what distinguishes iPiano. The proximal mapping is defined as usual, i.e.

$\text{prox}_{\alpha g}(\hat{x}) := \arg\min_{x \in \mathbb{R}^n} \frac{\|x - \hat{x}\|_2^2}{2} + \alpha g(x)$.
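
For example, for $g(x) = \lambda \|x\|_1$ with $\lambda > 0$ (a common choice, used here purely for illustration), the proximal mapping reduces to component-wise soft-thresholding:

$\text{prox}_{\alpha g}(\hat{x})_i = \text{sign}(\hat{x}_i) \max(|\hat{x}_i| - \alpha \lambda, 0)$.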

The proximal mapping is well-defined because $g$ is convex; the minimization problem above is strongly convex and therefore has a unique minimizer. The algorithm can be summarized as in Algorithm 1, where the interesting part, namely choosing $\alpha_n$ and $\beta_n$ in order to guarantee convergence, is omitted. Ochs et al. prove convergence for coercive $h$ that is bounded from below, where $f$ is smooth (but may be non-convex) and $g$ is proper, closed, and convex (but may be non-smooth). Additional requirements include the Lipschitz continuity of $\nabla f$ as well as the lower semi-continuity of $g$.
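
As a concrete instance of the omitted part: if I read the constant-parameter case of the paper correctly, it suffices to fix $\beta \in [0, 1)$ and $\alpha < 2(1 - \beta)/L$, with $L$ the Lipschitz constant of $\nabla f$; for $\beta = 0$ this recovers the familiar bound $\alpha < 2/L$ for forward-backward splitting.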

function ipiano(
        $f$, $g$,
        $N$ // Maximum number of iterations
    )
    initialize $x^{(0)}$
    $x^{(-1)} := x^{(0)}$
    for $n = 0,\ldots,N - 1$
        choose $\alpha_n$
        choose $\beta_n$
        $x^{(n + 1)} := \text{prox}_{\alpha_n g} \left(x^{(n)} - \alpha_n \nabla f(x^{(n)}) + \beta_n(x^{(n)} - x^{(n - 1)})\right)$
    return $x^{(N)}$
Algorithm 1: Simplified iPiano.
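
To make the scheme concrete, the following is a minimal C++ sketch of Algorithm 1, not the announced implementation, for the toy problem $f(x) = \frac{1}{2} \|x - b\|_2^2$ and $g(x) = \lambda \|x\|_1$. The problem instance, all names, and the constant parameter choice $\beta = 0.5$ with $\alpha$ slightly below $2(1 - \beta)/L$ (here $L = 1$ for this $f$) are my own illustrative assumptions.

// Minimal iPiano sketch for f(x) = 0.5*||x - b||_2^2 (smooth, L = 1)
// and g(x) = lambda*||x||_1 (convex, non-smooth); toy problem only.
#include <cmath>
#include <cstdio>
#include <vector>

using Vec = std::vector<double>;

// Gradient of the smooth part f(x) = 0.5*||x - b||_2^2, i.e. x - b.
Vec gradF(const Vec &x, const Vec &b) {
    Vec g(x.size());
    for (size_t i = 0; i < x.size(); i++) {
        g[i] = x[i] - b[i];
    }
    return g;
}

// Proximal mapping of t*||.||_1: component-wise soft-thresholding.
Vec proxG(const Vec &x, double t) {
    Vec p(x.size());
    for (size_t i = 0; i < x.size(); i++) {
        p[i] = (x[i] > 0.0 ? 1.0 : -1.0) * std::fmax(std::fabs(x[i]) - t, 0.0);
    }
    return p;
}

int main() {
    const Vec b = {2.0, -0.5, 0.1}; // data for the toy problem
    const double lambda = 0.25;     // l1 regularization weight
    const double L = 1.0;           // Lipschitz constant of grad f
    const double beta = 0.5;        // momentum parameter in [0, 1)
    const double alpha = 0.9 * 2.0 * (1.0 - beta) / L; // step size < 2(1 - beta)/L
    const int N = 100;              // number of iterations

    Vec x(b.size(), 0.0); // x^(0)
    Vec xPrev = x;        // x^(-1) := x^(0)

    for (int n = 0; n < N; n++) {
        Vec grad = gradF(x, b);
        Vec y(x.size());
        for (size_t i = 0; i < x.size(); i++) {
            // Forward (gradient) step plus inertial term beta*(x^(n) - x^(n-1)).
            y[i] = x[i] - alpha * grad[i] + beta * (x[i] - xPrev[i]);
        }
        xPrev = x;
        x = proxG(y, alpha * lambda); // backward (proximal) step
    }

    for (size_t i = 0; i < x.size(); i++) {
        std::printf("x[%zu] = %f\n", i, x[i]);
    }
    return 0;
}

For this objective the minimizer is the soft-thresholded data, $x^* = \text{sign}(b) \max(|b| - \lambda, 0)$, so the printed iterate is easy to verify.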
What is your opinion on this article? Let me know your thoughts on Twitter @davidstutz92 or LinkedIn in/davidstutz92.