Zuse Research Seminar

What? The Zuse Research Seminar serves as an interdisciplinary forum for researchers in the field of Applied Mathematics and Computer Science.

How? Talks are plenary-style and last 45 minutes plus time for questions. They should provide both an accessible overview of the research field and some broadly understandable insights into the speaker’s own research and how it relates to the field. Active discussion and questions are encouraged.

Who? Talks are predominantly given by researchers from the institute with the aim to showcase their department’s work in an accessible way. We also invite a select number of external speakers each semester to present their research. The seminar is open to the public and we encourage everyone interested to attend.

When? The seminar includes about five talks each semester during the lecture period, and talks are usually scheduled in the morning from 11:30 to 12:30 or in the afternoon from 13:30 to 14:30. Coffee is sometimes available in the lobby half an hour before or after the talk.

Where? The venue is usually the institute’s large lecture hall.

Please contact Tim Conrad and Christoph Spiegel by email for any organisational questions.

Upcoming Talks

There are no upcoming talks scheduled.

Past Talks

Nicholas Charron ZIB (SC)
Machine Learned Force Fields, Coarse Graining, HPC, & Beyond

Machine learned force fields (MLFFs), particularly those using deep neural networks to model interaction potentials, are quickly becoming a powerful tool for modelling complex molecular systems at scale with both classical and quantum accuracy. In this talk, we demonstrate the development of transferable coarse grained (CG) MLFFs for proteins using HPC resources, showing how machine learning can be used easily and effectively on compute clusters to solve relevant chemical/physical problems. We furthermore discuss growing trends in the usage of MLFFs with HPC resources, including emerging datasets, hardware demands, and the integration of machine learned potentials with existing simulation software.
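
A minimal sketch of the force-matching idea behind bottom-up coarse graining: forces from a reference model are regressed onto a cheaper parametrized force field. The harmonic reference force and the polynomial basis below are illustrative stand-ins; actual MLFFs replace the linear basis with a deep network trained on all-atom data.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Reference" pair force, standing in for expensive all-atom forces:
# a harmonic bond f(r) = -k * (r - r0) with hypothetical constants.
k, r0 = 5.0, 1.2
def reference_force(r):
    return -k * (r - r0)

# Force matching: fit a linear model f(r) ~ w . phi(r) on sampled pair
# distances, the variational principle behind bottom-up coarse graining.
r = rng.uniform(0.8, 1.6, size=200)              # sampled pair distances
phi = np.stack([np.ones_like(r), r, r * r], 1)   # polynomial basis
w, *_ = np.linalg.lstsq(phi, reference_force(r), rcond=None)
# the fitted coefficients recover k*r0, -k, and a vanishing quadratic term
```

Because the reference force is linear in the basis, the least-squares fit recovers it exactly; in practice the interesting questions are transferability and the cost of generating reference data, which is where HPC resources enter.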

Backpropagation and Nonsmooth Optimization for Machine Learning

Backpropagation is the major workhorse for many machine learning algorithms. In this presentation, we will examine the theory behind backpropagation as provided by the technique of algorithmic differentiation. Subsequently, we will discuss how this classic derivative information can be used for nonsmooth optimization. Real-world examples will illustrate the application of the proposed nonsmooth optimization algorithm.
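
The core mechanism can be illustrated with a minimal reverse-mode algorithmic differentiation sketch (a toy implementation, not a specific tool from the talk): each operation records its local partial derivatives, and a reverse sweep accumulates exact gradients via the chain rule.

```python
import math

class Var:
    """A scalar that records its computation graph for reverse-mode AD."""
    def __init__(self, value, parents=()):
        self.value = value      # forward value
        self.parents = parents  # pairs (parent Var, local partial derivative)
        self.grad = 0.0         # adjoint, filled in by the reverse sweep

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

def sin(x):
    return Var(math.sin(x.value), [(x, math.cos(x.value))])

def backward(out):
    """Backpropagation: topological sort, then reverse chain-rule sweep."""
    order, seen = [], set()
    def visit(node):
        if id(node) not in seen:
            seen.add(id(node))
            for parent, _ in node.parents:
                visit(parent)
            order.append(node)
    visit(out)
    out.grad = 1.0
    for node in reversed(order):
        for parent, local in node.parents:
            parent.grad += node.grad * local  # chain rule

x, y = Var(2.0), Var(3.0)
f = x * y + sin(x)  # f(x, y) = x*y + sin(x)
backward(f)
# x.grad equals y + cos(x) and y.grad equals x, the exact partial derivatives
```

The same reverse sweep applies when the elementary operations are only piecewise differentiable, which is where the nonsmooth theory discussed in the talk takes over.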

Andrea Walther
Christoph von Tycowicz ZIB (VDCC)
Geometric Deep Learning

The increasing success of deep learning techniques during the last decade expresses a paradigm shift in machine learning and data science. While learning generic functions in high dimensions is a cursed estimation problem, many challenging tasks, such as protein folding or image-based diagnosis, have now been shown to be achievable with appropriate computational resources.

These breakthroughs can be attributed to the fact that most tasks of interest aren’t actually generic; they possess inherent regularities derived from the effective low-dimensionality and structure of the physical world.

In this talk, we will see how geometric concepts allow us to expose these regularities and how we can use them to incorporate prior (physical) knowledge into neural architectures.
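
As a toy illustration of building such a prior into an architecture (the weights and sizes are arbitrary), the following permutation-invariant set embedding in the style of Deep Sets is exactly unchanged under reordering of its input points, by construction rather than by training:

```python
import numpy as np

rng = np.random.default_rng(0)

def set_embed(points, W1, W2):
    """An elementwise feature map followed by sum pooling.

    The sum makes the output invariant to the ordering of the input
    points: a geometric prior baked into the architecture."""
    h = np.tanh(points @ W1)      # per-point features
    pooled = h.sum(axis=0)        # permutation-invariant aggregation
    return np.tanh(pooled @ W2)   # readout

W1 = rng.normal(size=(3, 8))
W2 = rng.normal(size=(8, 4))
pts = rng.normal(size=(5, 3))         # a set of five 3-d points
perm = rng.permutation(5)

out1 = set_embed(pts, W1, W2)
out2 = set_embed(pts[perm], W1, W2)   # same set, reordered
# out1 and out2 agree up to floating point: the symmetry is exact
```

Equivariance to richer symmetry groups (rotations, graph isomorphisms, curved data domains) follows the same recipe with more elaborate feature maps and aggregations.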

Christoph von Tycowicz
Sebastian Pokutta ZIB (AIS2T)
An Introduction to Conditional Gradients

Conditional Gradient methods are an important class of methods for minimizing (non-)smooth convex functions over (combinatorial) polytopes. Recently, these methods have received a lot of attention as they allow for structured optimization, and hence learning, incorporating the underlying polyhedral structure into solutions. In this talk, I will give a broad overview of these methods and their applications, and present some recent results both in traditional optimization and learning as well as in deep learning.
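
A minimal sketch of the vanilla Frank-Wolfe iteration over the probability simplex (the objective and step-size rule below are the textbook choices, not results from the talk): the linear minimization oracle returns a polytope vertex, so iterates are sparse convex combinations of vertices.

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, steps=200):
    """Conditional gradient (Frank-Wolfe) over the probability simplex."""
    x = x0.copy()
    for k in range(steps):
        g = grad(x)
        i = int(np.argmin(g))            # LMO: the vertex e_i minimizing <g, v>
        v = np.zeros_like(x)
        v[i] = 1.0
        gamma = 2.0 / (k + 2.0)          # classic agnostic step-size rule
        x = (1 - gamma) * x + gamma * v  # convex combination stays feasible
    return x

# minimize ||x - b||^2 over the simplex; since b lies in the simplex,
# the optimum is b itself
b = np.array([0.1, 0.5, 0.4])
x = frank_wolfe_simplex(lambda x: 2 * (x - b), np.array([1.0, 0.0, 0.0]))
```

Each iterate is a convex combination of at most k + 1 vertices, which is the structured, projection-free sparsity alluded to above.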

Sebastian Pokutta
Pedro Maristany de las Casas ZIB (NEO)
Multiobjective Shortest Path Problems

In this talk we discuss new algorithms for the Multiobjective Shortest Path (MOSP) problem. The baseline algorithm, the Multiobjective Dijkstra Algorithm (MDA), has already been introduced in seminars at ZIB. New aspects discussed in this talk are its output-sensitive running time bound and how this bound compares to those derived for previously existing MOSP algorithms, a version of the MDA for One-to-One MOSP instances, and the usage of the MDA as a subroutine. The discussed applications in which the MDA acts as a subroutine are the Multiobjective Minimum Spanning Tree problem and the K-Shortest Simple Path problem.
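
A hypothetical, stripped-down illustration of the label-setting principle underlying multiobjective Dijkstra-type algorithms (not the MDA itself, which organizes its priority queue more carefully to obtain the output-sensitive bound): labels are cost vectors, and a label survives only if no other label at the same node dominates it componentwise.

```python
import heapq

def biobjective_pareto_paths(graph, source, target):
    """Enumerate Pareto-optimal (cost1, cost2) vectors from source to target.

    graph: dict mapping node -> list of (neighbor, (c1, c2)) edges.
    """
    def dominates(a, b):
        return a[0] <= b[0] and a[1] <= b[1] and a != b

    labels = {v: [] for v in graph}   # permanent labels per node
    heap = [((0, 0), source)]         # lexicographically ordered queue
    while heap:
        cost, v = heapq.heappop(heap)
        if cost in labels[v] or any(dominates(c, cost) for c in labels[v]):
            continue                  # dominated or duplicate -> discard
        labels[v].append(cost)
        for w, (c1, c2) in graph[v]:
            new = (cost[0] + c1, cost[1] + c2)
            if not any(dominates(c, new) for c in labels[w]):
                heapq.heappush(heap, (new, w))
    return sorted(labels[target])

# small example: two incomparable routes from A to D survive
G = {
    "A": [("B", (1, 4)), ("C", (3, 1))],
    "B": [("D", (1, 4))],
    "C": [("D", (3, 1))],
    "D": [],
}
print(biobjective_pareto_paths(G, "A", "D"))  # [(2, 8), (6, 2)]
```

Unlike the single-objective case, the output is a whole Pareto front, which is why running times are naturally stated relative to the output size.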

Pedro Maristany de las Casas
Thoughts on Machine Learning

Techniques of machine learning (ML) and what is called “artificial intelligence” (AI) today find a rapidly increasing range of applications touching upon social, economic, and technological aspects of everyday life. They are also being used increasingly, and with great enthusiasm, to fill in gaps in our scientific knowledge with data-based modelling approaches. I have followed these developments over the past almost 20 years with interest and concern, and with mounting disappointment. This leaves me sufficiently worried to offer a couple of pointed remarks here.

Rupert Klein
Thorsten Koch ZIB (AAIM)
On the state of QUBO solving

It is regularly claimed that quantum computers will bring breakthrough progress in solving challenging combinatorial optimization problems relevant in practice. In particular, Quadratic Unconstrained Binary Optimization (QUBO) problems are said to be the model of choice for use in (adiabatic) quantum systems during the NISQ era. Even the first commercial quantum-based systems are advertised to solve such problems, and QUBOs are certainly an interesting way of modeling combinatorial optimization problems. Theoretically, any Integer Program can be converted into a QUBO. In practice, however, there are some caveats. Furthermore, even for problems that can be nicely modeled as a QUBO, this might not be the most effective way to solve them. We review the state of QUBO solving on digital and quantum computers and give some insights regarding current benchmark instances and modeling.
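
The conversion mentioned above can be illustrated on a toy instance: an equality-constrained binary program is folded into a QUBO by adding a quadratic penalty, after which the problem is an unconstrained minimization of x' Q x. The instance and penalty weight below are illustrative.

```python
import itertools
import numpy as np

def ip_to_qubo(c, A, b, penalty):
    """Convert  min c.x  s.t.  A x = b, x binary  into a QUBO matrix Q.

    Equality constraints become a quadratic penalty c.x + penalty*||Ax - b||^2;
    using x_i^2 = x_i for binaries, everything collapses into one matrix Q
    plus a constant offset."""
    Q = np.diag(np.asarray(c, dtype=float))
    for row, rhs in zip(A, b):
        row = np.asarray(row, dtype=float)
        Q += penalty * np.outer(row, row)        # quadratic penalty terms
        Q -= penalty * 2.0 * rhs * np.diag(row)  # linear terms (x_i^2 = x_i)
    offset = penalty * float(np.dot(b, b))       # constant, tracked separately
    return Q, offset

def brute_force_qubo(Q, offset):
    """Enumerate all binary vectors; only viable for tiny instances."""
    n = Q.shape[0]
    best = min(itertools.product((0, 1), repeat=n),
               key=lambda x: np.asarray(x) @ Q @ np.asarray(x))
    z = np.asarray(best)
    return best, float(z @ Q @ z + offset)

# pick exactly two of three items at minimum cost:
# min 3*x1 + 2*x2 + 4*x3  s.t.  x1 + x2 + x3 = 2
Q, offset = ip_to_qubo(c=[3, 2, 4], A=[[1, 1, 1]], b=[2], penalty=10.0)
x, value = brute_force_qubo(Q, offset)
# optimal selection (1, 1, 0) with cost 5.0
```

The caveat hinted at in the abstract is visible even here: the penalty weight must dominate the objective, which tends to blow up coefficient ranges and makes the resulting QUBO harder for both digital and quantum solvers.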

Thorsten Koch
The few and the many

The talk will give a short introduction to the complex dynamics of interacting systems of individual units, which can be particles (molecules, …) or agents (individual humans, media agents, …). We are interested in systems with at least two types of such units: one type of which only a “few” individual units are present, and another of which there are “many”. For such systems we will review mathematical models on different levels: from the micro-level, in which all particles/agents are described individually, to the macro-level, where the “many” are modelled in an aggregated way. The effective dynamics given by these models will be illustrated by examples from cellular systems (neurotransmission processes) and opinion dynamics in social networks. You will be able to follow the talk even if you do not have any detailed knowledge about particles/agents or cellular/social processes (at least I hope).
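
A micro-level toy model in this spirit (all constants are illustrative, not from the talk): two influential agents, the “few”, hold fixed opinions, while the “many” repeatedly average towards nearby opinions in a bounded-confidence fashion.

```python
import numpy as np

rng = np.random.default_rng(0)

# 2 influencers with fixed opinions at the extremes, 200 ordinary agents
n_many = 200
opinions_few = np.array([-1.0, 1.0])
x = rng.uniform(-1, 1, size=n_many)   # initial opinions of the many
eps, mu = 0.4, 0.1                    # confidence radius, adaptation rate

for _ in range(500):
    targets = np.concatenate([x, opinions_few])
    for i in range(n_many):
        # bounded confidence: only opinions within eps are taken into account
        near = targets[np.abs(targets - x[i]) < eps]
        x[i] += mu * (near.mean() - x[i])  # move towards the local mean

# the many settle into a handful of clusters, some pinned by the few
```

A macro-level description of the same system would track only the opinion density of the “many”, coupled to the fixed opinions of the “few”, which is the micro-to-macro passage reviewed in the talk.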

Christof Schütte
Sparse Personalized PageRank: New results on the 25 billion dollar eigenvector problem

This talk will go over the basics of the PageRank problem, studied initially by the founders of Google, who created their search engine by applying it to the internet graph with links defining edges. Then, we will explain some of our results on the problem for undirected graphs, whose main application is finding local clusters in networks, a task that appears in many branches of science. We can now find local clusters in a time that depends not on the whole graph but only on the local cluster itself.

This is joint work with Elias Wirth and Sebastian Pokutta.
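
For reference, the object in question can be defined with a dense power iteration (a naive baseline; the point of the results above is precisely to avoid work proportional to the whole graph):

```python
import numpy as np

def personalized_pagerank(adj, seed, alpha=0.15, iters=100):
    """Personalized PageRank by power iteration on an undirected graph.

    adj: symmetric 0/1 adjacency matrix; seed: index of the seed node.
    With probability 1 - alpha the walker follows a uniformly random edge,
    with probability alpha it teleports back to the seed."""
    n = adj.shape[0]
    d = adj.sum(axis=1)
    P = adj / d[:, None]              # row-stochastic transition matrix
    e = np.zeros(n)
    e[seed] = 1.0                     # teleportation distribution
    x = e.copy()
    for _ in range(iters):
        x = alpha * e + (1 - alpha) * (P.T @ x)
    return x

# path graph 0-1-2-3, seeded at node 0: probability mass stays near the seed
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
ppr = personalized_pagerank(A, seed=0)
```

Local clustering methods threshold a (degree-normalized) sparse approximation of this vector; the recent results concern computing such approximations in time depending only on the cluster.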

David Martínez-Rubio