# Probability Day

Date: January 25, 2013
Venue: Mathematik-Zentrum, Lipschitz Lecture Hall (Room 1.016), Endenicher Allee 60, Bonn

## Program:

- 12:00-14:00 Lunch at ‘Fellinis’, Clemens-August-Straße 8, 53113 Bonn
- 14:00-15:00 Walter Schachermayer (University of Vienna): From Doob's inequality to robust super-replication
- 15:10-16:10 Steffen Dereich (University of Münster): Complex networks with preferential attachment: percolation and typical distances
- 16:10-16:45 Coffee break
- 16:45-17:45 Terry Lyons (University of Oxford): Cubature rough paths and the patched particle filter

## Abstracts

Steffen Dereich: Complex networks with preferential attachment: percolation and typical distances

Since the publication of the highly influential paper of Barabási and Albert in 1999, the preferential attachment paradigm has captured the imagination of scientists across the disciplines. The underlying idea is that structural properties of large networks, such as the World-Wide-Web, social interaction or citation networks, can be explained by the principle that these networks are built dynamically, and new vertices prefer to be attached to vertices that already have a high degree in the existing network.

Mathematically, we consider a dynamic random network model in which at every construction step a new vertex is introduced and attached to every existing vertex independently with a probability proportional to a concave function $f$ of its current degree. The qualitative analysis of the complex network now comprises the derivation of limit theorems as the size of the random graph tends to infinity. In this talk, I will discuss recent progress on percolation properties and typical distances.
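The growth mechanism described above can be sketched in a few lines of code. The following is a minimal illustration, not the talk's exact model: the attachment function `f` (here $f(k)=\sqrt{k+1}$), the normalisation by the current step, and the degree convention are all illustrative assumptions.

```python
import math
import random

def grow_network(n, f=lambda k: math.sqrt(k + 1)):
    """Grow a preferential-attachment graph with n vertices.

    At step t a new vertex t arrives and connects to each existing
    vertex v independently with probability min(1, f(deg(v)) / t),
    where f is a concave function of v's current degree. The concave
    f and the 1/t normalisation are illustrative choices.
    """
    degree = [0]          # degree[v] counts edges attached to v by later vertices
    edges = []
    for t in range(1, n):
        for v in range(t):
            if random.random() < min(1.0, f(degree[v]) / t):
                edges.append((t, v))
                degree[v] += 1
        degree.append(0)
    return degree, edges
```

Because `f` is concave, older vertices accumulate edges faster than newcomers, but less explosively than under linear preferential attachment; varying `f` is what drives the percolation and distance phenomena discussed in the talk.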

Technically, the crucial point is the description of the local neighbourhood of a randomly chosen vertex by a (truncated) branching random walk. We compare our findings with previous results on alternative network models and illustrate the impact of the preferential attachment paradigm.

Terry Lyons: Cubature rough paths and the patched particle filter

Many important algorithms involve transporting measures forward, and it is an empirical fact that methods that approximate the measure by an empirical measure work effectively. In this talk we explain why Monte Carlo works badly in high dimensions (like 2 or 3) and explain other algorithms that outperform it.
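To make "transporting an empirical measure forward" concrete, here is a plain bootstrap particle filter for a toy one-dimensional state-space model. This is the baseline sequential Monte Carlo method, not the cubature-based patched filter of the talk; the model $x_t = x_{t-1} + \mathcal{N}(0,\sigma_x^2)$, $y_t = x_t + \mathcal{N}(0,\sigma_y^2)$ and all parameters are illustrative assumptions.

```python
import math
import random

def bootstrap_filter(observations, n_particles=500,
                     sigma_x=1.0, sigma_y=0.5, seed=0):
    """Propagate an empirical measure (a cloud of particles) through a
    toy linear-Gaussian state-space model and return the sequence of
    filtering means. Plain multinomial resampling; illustrative only."""
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    means = []
    for y in observations:
        # propagate: push each particle forward through the dynamics
        particles = [x + rng.gauss(0.0, sigma_x) for x in particles]
        # weight: observation likelihood of each particle
        weights = [math.exp(-0.5 * ((y - x) / sigma_y) ** 2)
                   for x in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        # filtering mean under the weighted empirical measure
        means.append(sum(w * x for w, x in zip(particles, weights)))
        # resample back to equal weights
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return means
```

The weighted particle cloud is exactly the empirical measure of the abstract; the talk's point is that in higher dimensions this plain Monte Carlo approximation degrades, motivating cubature-on-rough-paths alternatives.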

Walter Schachermayer: From Doob's inequality to robust super-replication

Our starting point is Doob's classical maximal inequality for martingales. We give a pathwise proof of this well-known result. This yields versions of the inequality which are slightly sharper than previously known ones. More importantly, it allows for a new interpretation: we can view it as a robust super-replication result for an exotic option, namely the square of the maximal function. We then extend this finance point of view to other results in stochastic analysis, e.g., the Bichteler-Dellacherie theorem. This leads to a general duality theory for robust, i.e. model-free, super-replication. We present some recent results obtained in collaboration with B. Acciaio, M. Beiglböck, F. Penkner, and J. Temme.
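For reference, the classical inequality in question, together with a schematic pathwise form of the kind the abstract alludes to (the precise integrand in the talk's sharper versions may differ), reads:

```latex
% Doob's L^2 maximal inequality: for a nonnegative martingale (M_t)_{0 \le t \le T},
\mathbb{E}\Big[\big(\sup_{0 \le t \le T} M_t\big)^2\Big] \;\le\; 4\,\mathbb{E}\big[M_T^2\big].
% A pathwise version bounds the running maximum \bar M_T := \sup_{t \le T} M_t
% trajectory by trajectory:
\big(\bar M_T\big)^2 \;\le\; 4\,M_T^2 \;-\; 4\int_0^T \bar M_s\,\mathrm{d}M_s.
% Taking expectations annihilates the stochastic integral and recovers Doob's
% bound. Financially: the payoff (\bar M_T)^2 is super-replicated by a static
% position in M_T^2 plus dynamic trading in M, with no model assumptions.
```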