
Convex Optimization for Computer Vision

Semester: 
Summer Term 2019
Lecturer: 
Place/Time: 
Lecture: Monday 12:15 - 14:00 and Tuesday 12:15 - 14:00 in room H-F 115. Exercises: Monday 14:15 - 16:00 in room H-F 115 (see details below)
SWS/LP: 
4+2 SWS / 10 LP
Recommended for: 
Master's students in informatics interested in optimization, mathematics, and computer vision
News: 

If you are interested in attending this lecture, write an email to Jonas Geiping or Michael Möller.

Being able to efficiently compute a minimizer of a (possibly nonsmooth) convex cost function is of great practical relevance. For example, convex variational methods are among the most powerful techniques for many computer vision and image processing problems, e.g. denoising, deblurring, inpainting, stereo matching, optical flow computation, segmentation, or super-resolution. Furthermore, a clear understanding of convex optimization provides a baseline for the further study of advanced non-convex or stochastic optimization techniques as encountered in deep learning, design, or control problems.
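To make this concrete, one classical convex variational model (reproduced here purely as an illustration) is Rudin-Osher-Fatemi total variation denoising: given a noisy image f, a denoised image u is obtained by solving

```latex
\min_{u} \; \frac{1}{2}\|u - f\|_2^2 + \lambda \|\nabla u\|_1
```

The quadratic data term is smooth and convex, while the total variation regularizer is convex but non-differentiable; handling such non-smooth terms efficiently is exactly what this lecture is about.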

In this lecture we will discuss first-order convex optimization methods that allow us to implement and solve the aforementioned problems efficiently. Particular attention will be paid to problems including constraints and non-differentiable terms, giving rise to methods that exploit the concept of duality, such as the primal-dual hybrid gradient method or the alternating direction method of multipliers. The lecture will cover both the mathematical background, i.e. proving why the investigated methods converge, and their efficient practical implementation.
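As a brief preview (standard notation, added here for illustration only): many of the above problems can be cast as a convex-concave saddle-point problem with a linear operator K and convex functions G and F,

```latex
\min_{x} \max_{y} \; \langle Kx, y \rangle + G(x) - F^*(y),
```

and the primal-dual hybrid gradient method of Chambolle and Pock alternates proximal steps on the two variables,

```latex
\begin{aligned}
y^{k+1} &= \operatorname{prox}_{\sigma F^*}\!\big(y^k + \sigma K \bar{x}^k\big), \\
x^{k+1} &= \operatorname{prox}_{\tau G}\!\big(x^k - \tau K^* y^{k+1}\big), \\
\bar{x}^{k+1} &= x^{k+1} + \theta \,\big(x^{k+1} - x^k\big),
\end{aligned}
```

which converges for $\theta = 1$ and step sizes satisfying $\tau \sigma \|K\|^2 < 1$.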

Convex Optimization

We will cover the following topics:

Mathematical background

  • Convex sets and functions
  • Existence and uniqueness of minimizers
  • Subdifferentials
  • Convex conjugates (standard definitions of these two notions are sketched after this list)
  • Saddle point problems and duality
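For orientation, the standard definitions of the subdifferential and the convex conjugate of a convex function f read (added here as a reminder, not part of the official outline):

```latex
\partial f(x) = \left\{\, p \;:\; f(y) \ge f(x) + \langle p,\, y - x \rangle \ \text{ for all } y \,\right\},
\qquad
f^*(y) = \sup_{x} \; \langle x, y \rangle - f(x).
```

For example, for $f(x) = |x|$ on the real line, $\partial f(0) = [-1, 1]$, and $f^*$ is the indicator function of the interval $[-1, 1]$.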

Numerical methods

  • (Sub-)Gradient descent schemes (a minimal code sketch follows this list)
  • Proximal point algorithm
  • Primal-dual hybrid gradient method
  • Augmented Lagrangian methods
  • Acceleration schemes, adaptive step sizes, and heavy ball methods
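To give a flavor of these schemes, here is a minimal Python sketch (illustrative only, not official course code; all data below is synthetic) of proximal gradient descent with a fixed step size, applied to the l1-regularized least-squares problem min_x ½‖Ax − b‖² + λ‖x‖₁. It combines a gradient step on the smooth data term with a proximal (soft-thresholding) step on the non-smooth l1 term:

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||.||_1 (componentwise soft thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def proximal_gradient(A, b, lam, num_iters=500):
    """Minimize 0.5 * ||A x - b||^2 + lam * ||x||_1 by forward-backward splitting."""
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the smooth gradient
    tau = 1.0 / L                      # classical fixed step size
    x = np.zeros(A.shape[1])
    for _ in range(num_iters):
        grad = A.T @ (A @ x - b)                       # gradient of the smooth part
        x = soft_threshold(x - tau * grad, tau * lam)  # proximal step on the l1 part
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 100))
    x_true = np.zeros(100)
    x_true[:5] = 1.0                   # sparse ground-truth signal
    b = A @ x_true + 0.01 * rng.standard_normal(40)
    x_hat = proximal_gradient(A, b, lam=0.1)
    print("nonzero entries recovered:", int(np.sum(np.abs(x_hat) > 1e-3)))
```

The fixed step size 1/L guarantees convergence at an O(1/k) rate; the acceleration and adaptive step size schemes listed above improve on this.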

Example applications in computer vision and signal processing, including

  • Image denoising, deblurring, inpainting, segmentation
  • (Multinomial) logistic regression (a minimal example follows this list)
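As a small application example (again illustrative, with synthetic data), binary logistic regression is a smooth convex problem that can already be minimized with plain gradient descent at a fixed step size:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_loss_grad(w, X, y):
    """Gradient of the average logistic loss (1/n) * sum_i log(1 + exp(-y_i x_i^T w))."""
    margins = y * (X @ w)
    return -(X.T @ (y * sigmoid(-margins))) / len(y)

def gradient_descent(X, y, step=1.0, num_iters=1000):
    """Plain gradient descent with a fixed step size."""
    w = np.zeros(X.shape[1])
    for _ in range(num_iters):
        w -= step * logistic_loss_grad(w, X, y)
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.standard_normal((200, 2))                  # synthetic features
    y = np.sign(X @ np.array([2.0, -1.0])              # labels from a linear rule
                + 0.1 * rng.standard_normal(200))      # plus label noise
    w = gradient_descent(X, y)
    acc = np.mean(np.sign(X @ w) == y)
    print(f"training accuracy: {acc:.2f}")
```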

Lecture

Location: Room H-F 115, Hölderlinstraße 3

Time and Date: Monday 12:15 - 14:00, Tuesday 12:15 - 14:00

Start: April 1st, 2019, 12:15

Unisono Lecture: Unisono Link

Unisono Exercise: Unisono Link

The lecture is held in English. 

Exercises

Location: Room H-F 115, Hölderlinstraße 3

Time and Date: Monday 14:15 - 16:00

Start: April 8th, 2019
Exercise Webpage: Link
The lecture is accompanied by weekly exercises to solidify understanding of the material. Each exercise sheet consists of two parts: theoretical exercises and programming exercises.
A new exercise sheet is posted on the exercise webpage each Monday, and you have one week to solve it. The submission deadline is the following Friday at 18:00, either in the letterbox in front of H-A 7116 or via email. The solutions are discussed in the exercise session on the following Monday.

Fast Optimization Challenge

During the course of the lecture, we will pose a challenge to solve an optimization problem as quickly as possible. The challenge ends on Friday, June 21, at 23:59. The best solution will receive a prize. The challenge is also good preparation for the final exam!

Submission instructions: The source code should be sent via e-mail to michael.moeller@uni-siegen.de.

Challenge: To be announced in the lecture

Leaderboard

Name               Runtime   Method
Michael Moeller    604 s     Gradient descent (fixed step size)
Exam

The exam will be oral.

Exercise materials: Please refer to the exercise webpage. For a password, please contact jonas.geiping@uni-siegen.de.