Exercise supervision:

Machine Learning (ML), and in particular Deep Learning (DL), has achieved many breakthroughs in recent years. Unfortunately, applying ML/DL successfully requires not only data but also a great deal of expert knowledge---even ML/DL experts still need a lot of time to do it. Major challenges for new ML/DL applications include the choice of algorithm (SVM, random forest, deep neural network) and its hyper-parameter settings (e.g., the kernel coefficient of an RBF-SVM). These design decisions are crucial for obtaining accurate predictions and have to be made anew for each dataset. This is particularly hard for deep learning, where we have to choose a well-performing network architecture and, for example, set the hyper-parameters of the optimizer (e.g., the learning rate). Since training deep neural networks often takes considerable time (minutes, hours or even weeks), we cannot exhaustively try many network architectures and hyper-parameter configurations, but have to find more efficient approaches. Overall, these design decisions require a lot of expert knowledge, the process takes considerable time, and manual tuning is a tedious and error-prone task. We will discuss approaches and meta-systems that automate the process of obtaining well-performing machine learning systems, so-called Automated Machine Learning (AutoML). These AutoML systems allow for faster development of new ML/DL applications, require far less expert knowledge than building everything from scratch, and often even outperform human developers.
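To make the tuning problem above concrete, here is a minimal sketch of the simplest automated approach, random search over two hyper-parameters. The objective function is a hypothetical stand-in for validation error; in a real setup, each evaluation would train and validate a model (e.g., an RBF-SVM with the sampled `C` and `gamma`), which is exactly why sample efficiency matters.

```python
import random

def validation_error(c, gamma):
    # Toy stand-in for a model's validation error as a function of two
    # hyper-parameters (a hypothetical surface, not a real model; a real
    # run would train and evaluate a learner for each configuration).
    return (c - 1.0) ** 2 + (gamma - 0.1) ** 2

random.seed(0)  # fixed seed for reproducibility
best_cfg, best_err = None, float("inf")
for _ in range(200):
    # Sample hyper-parameters log-uniformly, as is common for scale
    # parameters such as C and gamma.
    c = 10 ** random.uniform(-2, 2)
    gamma = 10 ** random.uniform(-3, 1)
    err = validation_error(c, gamma)
    if err < best_err:
        best_cfg, best_err = (c, gamma), err

print("best configuration:", best_cfg, "error:", best_err)
```

Methods discussed in this seminar, such as Bayesian optimization, replace the blind sampling loop with a model-guided search so that far fewer (expensive) evaluations are needed.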

In this seminar we will focus on the analysis and automatic optimization of an algorithm's hyper-parameters. We will discuss methods for assessing the importance of individual hyper-parameters, various Algorithm Configuration algorithms, as well as more advanced approaches for Per-Instance Algorithm Configuration and Dynamic Algorithm Configuration.

We strongly recommend that you know the foundations of machine learning before attending this course. You should have attended at least one other ML course in the past.

Preferably, you are also familiar with at least one of:

- Bayesian Optimization
- Evolutionary Algorithms
- Deep Learning
- Reinforcement Learning

We will focus on Hyperparameter Optimization, including:

- Parameter Importance evaluation
- Algorithm Configuration
- Per-Instance Algorithm Configuration
- Dynamic Algorithm Configuration

The full list of papers can be found on the StudentIP course page.

The course, including your presentation, will be held in English. We will have weekly sessions in which one or two students will each present a paper of their choice. You are expected to attend every session and to have read all papers in advance. Additionally, you will prepare summaries of some of the other students' papers. Your grade will be based on your presentation and a written discussion of the paper you chose.