One-day pre-conference workshop
at the 46th IEEE Conference on Decision and Control, New Orleans, LA, USA

Optimization on manifolds

Tuesday, December 11th, 2007, prior to the 46th IEEE CDC


Organizers

P-A Absil and Rodolphe Sepulchre

Additional presenters

Uwe Helmke, Knut Hüper, Michel Journée, Jochen Trumpf, and Rene Vidal

Main Goal

The goal is to provide participants with the background necessary to design and analyze optimization methods on manifolds, and to turn these methods into concrete numerical algorithms for several problems in control, computer vision, and signal processing.

Synopsis

Recent years have witnessed increasing interest in the development of efficient optimization algorithms defined on manifolds. Applications abound in numerical linear algebra (eigenproblems), statistical analysis (principal and independent component analysis), signal processing (blind source separation, subspace tracking), machine learning (clustering), and computer vision (pose estimation), to name a few.

Good algorithms result from the combination of insights from differential geometry, optimization, and numerical analysis. The purpose of the workshop is to provide a tutorial introduction to this rich field of applied mathematics, with a parsimonious selection of topics in differential geometry and numerical algebra, and with illustrations of engineering problems where the theory is currently applied.

Introductory talks provide the participants with the basic concepts of differential geometry instrumental to algorithmic development. The other talks illustrate why differential geometry provides a natural foundation for the development of efficient numerical algorithms for many equality-constrained optimization problems. Several well-known optimization techniques, such as steepest descent, conjugate gradients, trust-region and Newton-type methods, are generalized to the manifold setting. A generic development of each of these methods is provided, building upon the geometric material. The participants are then guided through the constructions and computations that turn these geometrically formulated methods into concrete numerical algorithms. The techniques are general and are illustrated on several problems in linear algebra, signal processing, data mining, computer vision, and statistical analysis.
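
As a taste of how these geometrically formulated methods become numerical algorithms, here is a minimal sketch (in Python/NumPy; the function names are illustrative, not from the workshop material) of Riemannian steepest descent on the unit sphere, applied to the Rayleigh quotient f(x) = x'Ax. The Euclidean gradient is projected onto the tangent space at the current iterate, and renormalization serves as the retraction; the minimizer is an eigenvector for the smallest eigenvalue of A.

    import numpy as np

    def sphere_steepest_descent(A, x0, steps=500, tol=1e-10):
        x = x0 / np.linalg.norm(x0)
        for _ in range(steps):
            egrad = 2.0 * A @ x                  # Euclidean gradient of f(x) = x'Ax
            rgrad = egrad - (x @ egrad) * x      # Riemannian gradient: project onto tangent space
            if np.linalg.norm(rgrad) < tol:
                break
            t, fx = 1.0, x @ A @ x
            while t > 1e-12:                     # Armijo backtracking line search
                y = x - t * rgrad
                y = y / np.linalg.norm(y)        # retraction: pull the step back onto the sphere
                if y @ A @ y <= fx - 1e-4 * t * (rgrad @ rgrad):
                    break
                t *= 0.5
            x = y
        return x

    A = np.diag([3.0, 1.0, 5.0])
    print(sphere_steepest_descent(A, np.ones(3)))   # ~ (0, +-1, 0): eigenvector of eigenvalue 1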

Schedule

This pre-CDC workshop will take place in the Ascot Room (3rd floor), from 8:30am to 5:00pm.

8:30am - 8:45am (15 min) Welcome and overview
Speakers: P-A Absil, Rodolphe Sepulchre

PART I : Theory and Illustrations

  1. 8:45am - 9:30am (45 min) Introduction and motivation. Collection of computational problems on manifolds: blind source separation, face recognition, time series clustering.
    Speaker: Jochen Trumpf
  2. 9:30am - 10:15am (45 min) Manifolds, submanifolds, and quotient manifolds. A tour of the manifolds that appear in computational problems.
    Speaker: P-A Absil

    10:15am - 10:45am: coffee break

  3. 10:45am - 11:30am (45 min) First-order geometry and algorithms. Tangent vectors, Riemannian metric, retractions, steepest-descent, line-search algorithms. (A concrete retraction example follows this list.)
    Speaker: Rodolphe Sepulchre
  4. 11:30am - 12:15pm (45 min) Second-order geometry and algorithms. Covariant derivative, vector transport, Newton's method, conjugate gradients, trust-region methods.
    Speaker: P-A Absil
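
As announced in talk 3, retractions are what turn tangent-space updates into feasible iterates. Below is a minimal sketch (Python/NumPy; the helper names are ours, not the workshop's) of the standard QR-based retraction on the Stiefel manifold St(n,p) = {X : X'X = I}, together with the tangent-space projection used to produce admissible directions.

    import numpy as np

    def project_tangent(X, Z):
        # orthogonal projection of an ambient matrix Z onto the tangent space at X
        sym = 0.5 * (X.T @ Z + Z.T @ X)
        return Z - X @ sym

    def qr_retraction(X, Xi):
        # the Q factor of X + Xi is again a Stiefel point (assumes X + Xi has full rank);
        # the sign fix makes the factorization, hence the retraction, unique
        Q, R = np.linalg.qr(X + Xi)
        return Q * np.sign(np.diag(R))

    n, p = 5, 2
    X, _ = np.linalg.qr(np.random.randn(n, p))
    Xi = project_tangent(X, 0.1 * np.random.randn(n, p))
    Y = qr_retraction(X, Xi)
    print(np.allclose(Y.T @ Y, np.eye(p)))   # True: the iterate stays on the manifold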

PART II: Applications

  1. 1:30pm - 2:15pm (45 min) Adaptive Tracking on Grassmann Manifolds

    Based on the differential geometry of the Grassmann manifold, we propose a new class of algorithms for the adaptive tracking of zeros of vector fields on a Grassmann manifold. This has many applications to signal processing tasks, such as eigenvalue computation, antenna array processing, and principal and minor component analysis. Using arbitrary local parameterizations of the Grassmann manifold, we introduce a new class of quasi-Newton methods on Grassmannians with resulting simple expressions for the subspace tracking schemes. Key benefits of the algorithms are (a) reduced computational complexity due to efficient parameterizations of the Grassmannian and (b) guaranteed accuracy at all iterates. (A simplified first-order sketch follows below.)

    Speaker: Uwe Helmke
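
    The sketch below (Python/NumPy; our illustrative code, not the quasi-Newton scheme of the talk) shows the first-order skeleton such trackers share: one Riemannian gradient step per time instant for f(U) = tr(U'AU) on the Grassmann manifold, with a QR retraction, applied to a slowly drifting covariance matrix.

        import numpy as np

        def track_step(U, A):
            grad = A @ U - U @ (U.T @ (A @ U))    # Grassmann gradient of f(U) = tr(U'AU)
            step = 1.0 / np.linalg.norm(A, 2)     # conservative step length
            Q, _ = np.linalg.qr(U + step * grad)  # QR retraction back to orthonormal frames
            return Q

        rng = np.random.default_rng(0)
        n, p = 6, 2
        B = rng.standard_normal((n, n))
        A = B @ B.T                               # data covariance at time 0
        U, _ = np.linalg.qr(rng.standard_normal((n, p)))
        for t in range(300):
            E = 0.01 * rng.standard_normal((n, n))
            A = A + 0.5 * (E + E.T)               # slow symmetric drift of the data
            U = track_step(U, A)

        w, V = np.linalg.eigh(A)
        print(np.linalg.svd(U.T @ V[:, -p:])[1])  # cosines of principal angles: near 1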

  2. 2:15pm - 3:00pm (45 min) Independent component analysis and applications in genomics

    The transcriptome is the set of all mRNA molecules in a given cell. Unlike the genome, which is roughly the same for all the cells of an organism, the transcriptome may vary from one cell to another according to the biological functions of the cell and to external stimuli. The transcriptome reflects the activity of all the genes within the cell. The quantity of a given mRNA is determined by a complex interaction between cooperative and counteracting biological processes. Understanding the intricate mechanism that induces mRNA expression is an important step in elucidating the relation between the transcriptome of a cell and its phenotype.

    Microarray technology provides a quantitative measure of the concentration of mRNA molecules in a cell for the whole transcriptome in a systematic way. This technology yields a huge amount of data, typically related to several thousand genes over a few hundred experiments, which correspond, e.g., to different patients, tissues or environmental conditions. Gene expression data sets are so large that direct interpretation is usually infeasible. Unsupervised methods are required to reduce the dimension of the data set and to provide some biological insight in an automatic way.

    ICA provides a means to tackle this task. Most ICA algorithms can be viewed as the combination of two elements: (i) a contrast function, which can be thought of as an estimator of the level of statistical dependence between signals; (ii) an optimization method that attempts to compute an optimizer of the contrast function and thereby extract components that are "as independent as possible".

    In this talk, we will show that the underlying optimization problems in ICA can be naturally phrased as optimization problems on differentiable manifolds. A key reason lies in the inherent invariance properties of contrast functions: the level of dependence between signals must not be altered by scaling or permutation of these signals. This geometric formulation opens avenues for exploiting recently developed optimization techniques on manifolds. In particular, a new trust-region method on the oblique manifold will be briefly discussed. We will also comment on the adequacy of various contrast functions in the context of gene expression analysis. (A toy one-unit sketch follows below.)

    Speaker: Michel Journée
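
    As a toy version of the two ingredients above, the following sketch (Python/NumPy; illustrative only, using a plain kurtosis contrast and projected gradient ascent rather than the talk's trust-region method) extracts one source from a two-channel mixture: the contrast is the excess kurtosis of w'Z, and the constraint set is the unit sphere, i.e., the one-column oblique manifold.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 20000
        S = np.vstack([rng.uniform(-1, 1, n),              # two independent,
                       np.sign(rng.standard_normal(n))])   # non-Gaussian sources
        X = np.array([[2.0, 1.0], [1.0, 1.0]]) @ S         # observed mixtures
        X = X - X.mean(axis=1, keepdims=True)

        d, E = np.linalg.eigh(np.cov(X))                   # whiten so that cov(Z) = I
        Z = (E / np.sqrt(d)) @ E.T @ X

        w = rng.standard_normal(2)
        w = w / np.linalg.norm(w)
        for _ in range(200):
            y = w @ Z
            kurt = np.mean(y**4) - 3.0                     # excess kurtosis (the contrast)
            egrad = 4.0 * np.sign(kurt) * (Z * y**3).mean(axis=1)  # ascend |kurt|
            rgrad = egrad - (w @ egrad) * w                # tangent projection at w
            w = w + 0.1 * rgrad
            w = w / np.linalg.norm(w)                      # retract back to the sphere
        # w @ Z now approximates one source up to sign and scale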

    3:00pm - 3:30pm: coffee break

  3. 3:30pm - 4:15pm (45 min) Optimal motion recovery in computer vision: Essential matrix and trifocal tensor estimation

    Extracting the motion parameters of a moving stereo rig is an important issue in computer vision and vision-based robotics, driven by emerging applications such as telepresence and robot navigation. In the context of telepresence, we consider a robot equipped with a stereo camera whose pose has to be determined in a fast and robust way. The key issue is to compute robust estimates of the compatible $3\times 3$ essential matrices of the stereo cameras, since the motion can then be uniquely estimated.

    The estimation of essential matrices and of their generalization, the trifocal tensor, can be reformulated as an optimization problem on a manifold. We will discuss the mathematical background and present Newton-type methods in an intrinsic way. (A small projection sketch follows below.)

    Speaker: Knut Hüper
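
    One ingredient of this problem is easy to make concrete: an essential matrix is characterized by its two nonzero singular values being equal. The sketch below (Python/NumPy; a single Frobenius-norm projection, whereas the talk develops intrinsic Newton-type iterations on the manifold) projects an arbitrary 3x3 matrix onto the essential manifold by replacing its singular values (s1, s2, s3) with ((s1+s2)/2, (s1+s2)/2, 0).

        import numpy as np

        def project_to_essential(M):
            # Frobenius-closest essential matrix: average the two largest
            # singular values and zero out the smallest one
            U, s, Vt = np.linalg.svd(M)
            sbar = 0.5 * (s[0] + s[1])
            return U @ np.diag([sbar, sbar, 0.0]) @ Vt

        M = np.random.randn(3, 3)
        E = project_to_essential(M)
        print(np.round(np.linalg.svd(E)[1], 6))   # two equal singular values and a zero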

  4. 4:15pm - 5:00pm (45 min) Generalized Principal Component Analysis

    In many scientific and engineering problems, the data of interest can be viewed as drawn from a mixture of geometric or statistical models instead of a single one. Such data are often referred to in different contexts as "mixed," "multi-modal," "multi-model," "heterogeneous," or "hybrid." For instance, a natural image normally consists of multiple regions of different texture, a video sequence may contain multiple independently moving objects, and a hybrid dynamical system may arbitrarily switch among different subsystems.

    Generalized Principal Component Analysis (GPCA) is a general method for modeling and segmenting such mixed data using a collection of subspaces, also known in mathematics as a subspace arrangement. By introducing certain new algebraic models and techniques into data clustering, traditionally a statistical problem, GPCA offers a new spectrum of algorithms for data modeling and clustering that are in many respects more efficient and effective than (or complementary to) traditional methods such as Expectation Maximization and K-Means. (A toy algebraic sketch follows below.)

    Speaker: Rene Vidal
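
    The algebraic idea can be illustrated in the smallest nontrivial case (Python/NumPy; our toy example, not the full GPCA algorithm): points drawn from two lines through the origin in R^2 all satisfy a single quadratic (b1'x)(b2'x) = 0, which is linear in the coefficients of the degree-2 Veronese embedding (x1^2, x1*x2, x2^2); fitting that vanishing polynomial and differentiating it at a sample point recovers the normal of the line containing that point.

        import numpy as np

        rng = np.random.default_rng(2)
        b1, b2 = np.array([1.0, -1.0]), np.array([1.0, 2.0])  # ground-truth line normals

        def sample_line(b, n):                    # random points on the line {x : b'x = 0}
            d = np.array([b[1], -b[0]])
            return np.outer(rng.standard_normal(n), d)

        X = np.vstack([sample_line(b1, 50), sample_line(b2, 50)])  # mixed, unlabeled data

        V = np.column_stack([X[:, 0]**2, X[:, 0]*X[:, 1], X[:, 1]**2])  # Veronese map
        c = np.linalg.svd(V)[2][-1]               # null direction = vanishing polynomial

        def grad_q(x):                            # gradient of q(x) = c0 x1^2 + c1 x1 x2 + c2 x2^2
            return np.array([2*c[0]*x[0] + c[1]*x[1], c[1]*x[0] + 2*c[2]*x[1]])

        for x in (X[0], X[50]):                   # one sample from each line
            nhat = grad_q(x)
            print(nhat / np.linalg.norm(nhat))    # proportional to b1, resp. b2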

Target Audience

The workshop is aimed at a broad audience of graduate students and researchers, including applied mathematicians, engineers, and computer scientists, with little or no background in differential geometry and optimization.

Provided material

Copies of the slides and supplementary material will be provided to the registered participants.

Workshop web page

http://www.inma.ucl.ac.be/~absil/oom/cdc07/

Registration

Pre-registration for the CDC pre-conference workshops is strongly encouraged. Please register through the conference web site.

Questions

For further information on the workshop, please feel free to contact any of the organizers.
Pierre-Antoine Absil