Convex Optimization for Computer Vision
- Time and place of the exercise class have been changed to Thursday, 16:00 - 17:30, in room H-F 104/05!
- The submission deadline for the exercise sheets has been postponed to Tuesday.
Being able to efficiently determine the argument that minimizes a (possibly nonsmooth) convex cost function is of great practical relevance. For example, convex variational methods are among the most powerful techniques for many computer vision and image processing problems, e.g. denoising, deblurring, inpainting, stereo matching, optical flow computation, segmentation, or super-resolution. In this lecture we will discuss first-order convex optimization methods to implement and solve the aforementioned problems efficiently. Particular attention will be paid to problems including constraints and non-differentiable terms, giving rise to methods that exploit the concept of duality, such as the primal-dual hybrid gradient method or the alternating direction method of multipliers. This lecture will cover the mathematical background for proving why the investigated methods converge, as well as their efficient practical implementation.
We will cover the following topics:
Mathematical background
- Convex sets and functions
- Existence and uniqueness of minimizers
- Subdifferentials
- Convex conjugates
- Saddle point problems and duality
Numerical methods
- (Sub-)Gradient descent schemes
- Proximal point algorithm
- Primal-dual hybrid gradient method
- Augmented Lagrangian methods
- Acceleration schemes, adaptive step sizes, and heavy ball methods
Example applications in computer vision and signal processing, including
- Image denoising, deblurring, inpainting, segmentation
- Implementation in MATLAB
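To give a flavor of how the proximal methods listed above handle non-differentiable terms: the sketch below (in Python rather than the course's MATLAB, and not part of the official course material) runs proximal gradient descent on the toy denoising problem min_x 0.5||x - b||^2 + lam*||x||_1, whose exact minimizer is given by soft thresholding. All function names here are illustrative, not from the lecture.

```python
import numpy as np

def soft_threshold(x, tau):
    """Proximal operator of tau * ||.||_1 (soft thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def prox_grad_l1_denoise(b, lam, step=0.5, iters=200):
    """Proximal gradient descent for min_x 0.5*||x - b||^2 + lam*||x||_1.

    Each iteration takes a gradient step on the smooth quadratic term,
    then applies the prox of the nonsmooth l1 term. For this toy problem
    the exact minimizer is soft_threshold(b, lam).
    """
    x = np.zeros_like(b)
    for _ in range(iters):
        grad = x - b                               # gradient of 0.5*||x - b||^2
        x = soft_threshold(x - step * grad, step * lam)
    return x

b = np.array([3.0, -0.5, 1.2, 0.1])
x = prox_grad_l1_denoise(b, lam=1.0)
# x approaches the closed-form solution soft_threshold(b, 1.0)
```

The same gradient-then-prox pattern underlies the more advanced schemes on the list, with acceleration or dual variables added on top.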
Lecture
Location: Room H-F 115, Hölderlinstraße 3
Time and Date: Monday 12:15 - 14:00, Tuesday 12:15 - 14:00
Start: April 9th, 2018, 12:15
Unisono: Unisono Link
The lecture is held in English.
Exercises
Location: Room H-F 104/05 , Hölderlinstraße 3
Time and Date: Thursday 16:00 - 17:30
Start: April 9th, 2018
The lecture is accompanied by weekly exercises to solidify understanding of the material. Each exercise sheet consists of two parts: theoretical and programming exercises.
The exercise sheets will be posted on the exercise page each Tuesday, and you have one week to solve them. The submission deadline is usually the following Tuesday at 18:00, either in the letterbox in front of H-A 7116 or via email. The solutions will be discussed in the exercise class on the following Thursday.
Fast Optimization Challenge
During the course of the lecture, we will pose a challenge to solve an optimization problem as quickly as possible. The challenge ends on Monday, June 18, at 23:59. The best solution will receive a prize. The challenge will be good preparation for the final exam!
Submission instructions: The source code should be sent via e-mail to michael.moeller@uni-siegen.de
Challenge: To be announced
Leaderboard
| Name | Runtime | Method |
|---|---|---|
| Michael Moeller | 604 s | Gradient descent (fixed step size) |
Exam
The exam will be oral.
Exercise sheets and lecture materials can be found under Exercise and Materials. If you need a password, please contact hartmut.bauermeister@uni-siegen.de.