Learning with Submodular Functions: A Convex Optimization Perspective

Book by Bach
Submodular functions are relevant to machine learning for at least two reasons: (1) some problems may be expressed directly as the optimization of submodular functions, and (2) the Lovász extension of submodular functions provides a useful set of regularization functions for supervised and unsupervised learning. In Learning with Submodular Functions, the theory of submodular functions is presented in a self-contained way from a convex analysis perspective, establishing tight links between certain polyhedra, combinatorial optimization, and convex optimization problems. In particular, the book describes how submodular function minimization is equivalent to solving a wide variety of convex optimization problems. This equivalence allows the derivation of new efficient algorithms for approximate and exact submodular function minimization with theoretical guarantees and good practical performance. Through many examples of submodular functions, the book reviews various applications to machine learning, such as clustering, experimental design, sensor placement, graphical model structure learning, and subset selection, as well as a family of structured sparsity-inducing norms that can be derived from submodular functions and used as regularizers. This is an ideal reference for researchers, scientists, or engineers with an interest in applying submodular functions to machine learning problems.
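To make the Lovász extension mentioned above concrete, here is a minimal Python sketch (not taken from the book) of its standard greedy evaluation formula: the components of w are sorted in decreasing order and weighted by the marginal gains of F. The graph cut function used below is a hypothetical example of a submodular set function; the fact that minimizing the extension over [0,1]^n recovers the minimum of F is what links the discrete and convex problems.

```python
import numpy as np

def lovasz_extension(F, w):
    """Evaluate the Lovász extension of a set function F (with F(set()) == 0)
    at a point w, using the greedy/sorting formula."""
    order = np.argsort(-w)          # indices of w in decreasing order
    value, prev, S = 0.0, 0.0, set()
    for k in order:
        S.add(int(k))               # grow the level set one element at a time
        Fk = F(S)
        value += w[k] * (Fk - prev) # weight the marginal gain F(S_k) - F(S_{k-1})
        prev = Fk
    return value

# Hypothetical example: the cut function of a small undirected graph is submodular.
edges = [(0, 1), (1, 2), (0, 2), (2, 3)]
def cut(S):
    return sum((i in S) != (j in S) for i, j in edges)

print(lovasz_extension(cut, np.array([0.9, 0.1, 0.4, 0.7])))
```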