Digital society is built heavily on information and communication technology (ICT). However, programs running on computers are invisible, so it is difficult to observe when these technologies are misused to harm individuals and society.
Cryptography studies mechanisms to control information flow or restrict certain procedures within a network of systems. Moreover, it provides means of verification to ensure that each entity behaves according to a predetermined set of rules. These mechanisms are therefore important tools for designing secure and fair systems and for bringing transparency to digital society. Mathematics is necessary to ensure that cryptographic protocols and primitives meet their design criteria.
To receive regular updates and Zoom links for the lecture series, register at https://forms.gle/YfPt578EpjAoA9x48
In Bernoulli bond percolation, each edge of a graph is chosen to be either deleted or retained independently at random, with retention probability p. For many large finite graphs, there is a phase transition such that if p is sufficiently large then there exists a giant cluster whose volume is proportional to that of the graph with high probability. We prove that in this phase the giant cluster is unique with high probability; this was previously known only for tori and expander graphs, via methods specific to those cases. Joint work with Philip Easo.
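The setup above is easy to experiment with numerically. The sketch below (our own illustration, not the speakers' proof technique) runs Bernoulli bond percolation on an n x n torus, an example of a large finite graph where the phase transition is visible, and reports cluster sizes via union-find:

```python
import random
from collections import Counter

def giant_cluster_sizes(n, p, seed=0):
    """Bernoulli bond percolation on the n x n torus (one illustrative
    choice of large finite graph): retain each edge independently with
    probability p, then return cluster sizes, largest first."""
    rng = random.Random(seed)
    parent = list(range(n * n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    for i in range(n):
        for j in range(n):
            v = i * n + j
            # right and down neighbours, with torus wrap-around
            for ni, nj in (((i + 1) % n, j), (i, (j + 1) % n)):
                if rng.random() < p:  # edge retained
                    ra, rb = find(v), find(ni * n + nj)
                    if ra != rb:
                        parent[ra] = rb

    sizes = Counter(find(v) for v in range(n * n))
    return sorted(sizes.values(), reverse=True)
```

For the 2D torus the critical value is p = 1/2, so at p = 0.7 the largest cluster occupies a constant fraction of the n^2 vertices while the second largest is comparatively tiny, illustrating uniqueness of the giant cluster; at p = 0.3 no giant cluster appears.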
The survival of living cells depends on many influences, such as nutrient saturation, oxygen level, drug concentration, or mechanical forces. Data-supported mathematical modeling can be a powerful tool for gaining a better understanding of cell behavior in different settings. However, when numerous environmental factors must be considered, mathematical modeling becomes challenging. We present an approach that models the separate influences of each environmental quantity on the cells in a collective manner by introducing the "environmental stress level". It is an artificial, immeasurable variable that quantifies to what extent viable cells would enter a stressed state if exposed to certain conditions. A high stress level can inhibit cell growth, promote cell death, and influence cell movement. As a proof of concept, we compare two systems of ordinary differential equations that model tumor cell dynamics under various nutrient saturations, with and without an environmental stress level. Particle-based Bayesian inversion methods are used to calibrate unknown model parameters with time-resolved measurements of in vitro populations of liver cancer cells.
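To make the "environmental stress level" idea concrete, here is a deliberately minimal two-ODE sketch. All parameter names, functional forms, and values are our own illustration, not the speakers' model: the stress level s relaxes toward the current nutrient deficit, and a high s inhibits growth and promotes death, as described in the abstract.

```python
def simulate(days=10.0, dt=0.01, nutrient=0.2,
             growth=0.8, death=0.5, alpha=2.0):
    """Toy stress-level model (illustrative only):
        ds/dt = alpha * ((1 - nutrient) - s)   # stress tracks nutrient deficit
        dV/dt = growth * (1 - s) * V - death * s * V
    V is the viable cell volume; forward Euler integration."""
    s, V = 0.0, 1.0
    t = 0.0
    while t < days:
        ds = alpha * ((1.0 - nutrient) - s)
        dV = growth * (1.0 - s) * V - death * s * V
        s += dt * ds
        V += dt * dV
        t += dt
    return s, V
```

Running the sketch with rich versus scarce nutrients reproduces the qualitative behavior the abstract describes: low stress and net growth under good conditions, high stress and population decline under poor ones.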
Link and Passcode: https://tum-conf.zoom.us/j/96536097137 Code 101816
Abstract: New sources of human behavior data can empower humanitarian projects, but they need to be carefully handled, properly anonymized, and aggregated. In this talk, I will discuss the potential benefits and risks of data collaboratives for social good. Data collaboratives are public-private partnerships for data sharing, and both legal and ethical aspects are very important for these initiatives. I will give examples from the Data for Refugees (D4R) Challenge, a non-profit challenge initiated to improve the conditions of Syrian refugees in Turkey by providing a special database to the scientific community, enabling research on urgent problems concerning refugees, including health, education, unemployment, safety, and social integration. Collected from 1 million telecommunications customers over a one-year period, the mobile CDR database shows the activity and movement of refugees and citizens over the entire country. I will also briefly describe the Hummingbird Horizon2020 project that started last year, which uses similar mobile data to investigate irregular migration.
We introduce the weight-dependent random connection model, which is a general class of geometric random graphs. The vertices are given by a marked Poisson process on Euclidean space, and the probability of an edge between two marked Poisson points is given through a connectivity function. We consider a specific choice of connectivity function and derive a random graph model that corresponds to a continuum version of the scale-free percolation model introduced by Deijfen, van der Hofstad, and Hooghiemstra (2013). We sketch how one can transfer results about the degree distribution and about the graph distances from the scale-free percolation model to the random connection model.
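A small simulation makes the construction concrete. The sketch below (our own illustration) samples a Poisson point process in a box in d = 2, attaches i.i.d. heavy-tailed marks, and uses one representative connectivity function, mirroring the continuum scale-free percolation model of Deijfen, van der Hofstad, and Hooghiemstra: P(x ~ y) = 1 - exp(-beta * w_x * w_y / |x - y|^alpha).

```python
import math
import random

def poisson_sample(rng, lam):
    """Knuth's method for a Poisson variate; adequate for moderate lam."""
    threshold = math.exp(-lam)
    k, prod = 0, 1.0
    while True:
        prod *= rng.random()
        if prod <= threshold:
            return k
        k += 1

def random_connection_graph(intensity=0.5, box=10.0, beta=1.0,
                            alpha=3.0, tau=2.5, seed=1):
    """Sketch of a weight-dependent random connection model in d = 2.
    Vertices: Poisson points in [0, box]^2 with i.i.d. Pareto marks
    (tail exponent tau - 1). Each pair is connected independently with
    probability 1 - exp(-beta * w_i * w_j / dist^alpha)."""
    rng = random.Random(seed)
    n = poisson_sample(rng, intensity * box * box)
    points = [(rng.uniform(0, box), rng.uniform(0, box)) for _ in range(n)]
    weights = [rng.random() ** (-1.0 / (tau - 1.0)) for _ in range(n)]
    edges = []
    for i in range(n):
        for j in range(i + 1, n):
            dist = math.hypot(points[i][0] - points[j][0],
                              points[i][1] - points[j][1])
            p_edge = 1.0 - math.exp(-beta * weights[i] * weights[j]
                                    / dist ** alpha)
            if rng.random() < p_edge:
                edges.append((i, j))
    return points, weights, edges
```

With a power-law mark distribution, high-weight vertices connect over long distances, which is the mechanism behind the heavy-tailed degrees and short graph distances mentioned in the abstract.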
A kinetic description of a plasma in external and self-consistent fields is given by the Vlasov equation for the particle distribution functions coupled to Maxwell's equations. The model is computationally challenging due to its multiscale structure and its relatively high dimensionality. This talk will give an overview of numerical solution methods and discuss structure-preserving particle methods as well as low-rank tensor discretizations in particular. A framework of structure-preserving particle-in-cell methods will be presented that allows for a long-time stable solution of the system. On the other hand, low-rank tensor methods are an efficient tool to compress multidimensional functions. We will demonstrate for benchmark problems that the essential features of the solution can be captured with a relatively low rank.
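The compression idea can be illustrated in the simplest order-2 case: a function f(x, v) sampled on a grid is a matrix, and a low-rank approximation stores only a few rank-1 factors instead of the full grid. The pure-Python sketch below (an illustration of the general principle, not the solvers from the talk) extracts rank-1 terms greedily by power iteration:

```python
import math
import random

def matvec(A, v):
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def mat_t_vec(A, u):
    return [sum(A[i][j] * u[i] for i in range(len(A)))
            for j in range(len(A[0]))]

def norm(v):
    return math.sqrt(sum(x * x for x in v))

def top_rank1(A, iters=300, seed=0):
    """Dominant singular triple (sigma, u, v) via power iteration on A^T A."""
    rng = random.Random(seed)
    v = [rng.random() for _ in A[0]]
    for _ in range(iters):
        w = mat_t_vec(A, matvec(A, v))
        nw = norm(w)
        v = [x / nw for x in w]
    u = matvec(A, v)
    sigma = norm(u)
    return sigma, [x / sigma for x in u], v

def low_rank_compress(A, rank):
    """Greedy rank-1 deflation: A ~ sum_k sigma_k * u_k v_k^T."""
    residual = [row[:] for row in A]
    factors = []
    for k in range(rank):
        sigma, u, v = top_rank1(residual, seed=k)
        factors.append((sigma, u, v))
        for i in range(len(A)):
            for j in range(len(A[0])):
                residual[i][j] -= sigma * u[i] * v[j]
    return factors, residual
```

For a separable (rank-2) function such as f(x, v) = sin(x)cos(v) + 0.3 cos(2x)sin(2v) on a 30 x 30 grid, two rank-1 terms store 2(30 + 30 + 1) numbers instead of 900 and reproduce the matrix to near machine precision, which is the sense in which "essential features can be captured with a relatively low rank".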
The linearly edge-reinforced random walk (ERRW) was introduced in 1986 by Coppersmith and Diaconis and is one of the first examples of reinforced random walks. Recently, a link has been found between this model, the vertex-reinforced jump process, and a random spin model. Thanks to these links, it was possible to show that in dimension 3 and above the ERRW is recurrent for large reinforcements and transient for small ones, and thus exhibits a phase transition. We will present the links between these models and show that the model has a monotonicity property (the larger the reinforcements, the more recurrent it is) and that its phase transition is unique.
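The reinforcement dynamics are simple to state in code. The sketch below (our own 1D toy; the recurrence/transience transition discussed in the talk concerns dimension 3 and above) runs a linearly edge-reinforced random walk on Z: every edge starts at weight a, gains weight 1 each time it is crossed, and the walk steps across an incident edge with probability proportional to its current weight.

```python
import random

def errw(steps=20000, a=1.0, seed=0):
    """Linearly edge-reinforced random walk on Z. From position x the walk
    moves right with probability w(x, x+1) / (w(x-1, x) + w(x, x+1)).
    Small a means strong reinforcement (initial weights are quickly
    dominated by the +1 crossing increments)."""
    rng = random.Random(seed)
    w = {}            # edge {x, x+1} stored under key x
    x = 0
    visited = {0}
    for _ in range(steps):
        wl = w.get(x - 1, a)   # weight of the edge to the left
        wr = w.get(x, a)       # weight of the edge to the right
        if rng.random() < wr / (wl + wr):
            w[x] = wr + 1.0
            x += 1
        else:
            w[x - 1] = wl + 1.0
            x -= 1
        visited.add(x)
    return x, visited, w
```

Each step adds exactly 1 to exactly one edge weight, so the total added weight always equals the number of steps; varying a shows how strong reinforcement localises the walk on a few heavily-crossed edges.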
Host-parasite coevolution is a well-known example of interspecies interaction with evolutionary consequences for both partners. Detecting the genomic regions under coevolution is of great interest for disease control and drug design. Different genomic regions are expected to contribute differently to the coevolution, depending on their functionality. Regions expected to impact the process substantially are called major genes; regions with milder effects are called minor genes; and regions that do not influence the coevolution process are called neutral genes. This study aims to develop simple statistical quantities that measure the level of association between genomic regions and identify major, minor, and neutral genomic regions in the host and parasite genomes.
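As a purely illustrative stand-in for the statistical quantities the study develops (the statistic, thresholds, and classification below are our own toy construction, not the authors' method), one could score each host-parasite locus pair by the absolute correlation of their allele frequencies across samples and bin the score into major/minor/neutral:

```python
import math
import random

def pearson(x, y):
    """Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    if sx == 0.0 or sy == 0.0:
        return 0.0
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

def classify_locus(host_freqs, parasite_freqs,
                   major_cut=0.6, minor_cut=0.3):
    """Toy classifier: strong association -> major, weak -> minor,
    negligible -> neutral. Cutoffs are arbitrary illustration values."""
    r = abs(pearson(host_freqs, parasite_freqs))
    if r >= major_cut:
        return "major"
    if r >= minor_cut:
        return "minor"
    return "neutral"
```

On simulated data where a "major" host locus co-varies with a parasite locus while a "neutral" locus is independent, the statistic separates the two cases cleanly.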
The development of new classification and regression algorithms based on deep neural networks, coined deep learning, has had a dramatic impact in the areas of artificial intelligence, machine learning, and data analysis. More recently, these methods have been applied successfully to the numerical solution of partial differential equations (PDEs). However, a rigorous analysis of their potential and limitations is still largely open. In this talk we will survey recent results contributing to such an analysis. In particular, I will present recent empirical and theoretical results supporting the capability of deep learning based methods to break the curse of dimensionality for several high-dimensional PDEs, including nonlinear Black-Scholes equations used in computational finance, Hamilton-Jacobi-Bellman equations used in optimal control, and stationary Schrödinger equations used in quantum chemistry. Despite these encouraging results, it is still largely unclear for which problem classes a deep learning based ansatz can be beneficial. To this end, I will, in a second part, present recent work establishing fundamental limitations on the computational efficiency of deep learning based numerical algorithms that, in particular, confirm a previously empirically observed "theory-to-practice gap".
ZOOM
Meeting ID: 999 4690 2916 Passcode: 695211
https://lmu-munich.zoom.us/j/99946902916?pwd=UWM5SGtIL091NmdjU3BHVVpOU0lEdz09
Classical umbral calculus is the theory of Sheffer polynomial sequences, which are characterised by the exponential form of their generating function. Meixner in 1934 found all Sheffer sequences that are orthogonal with respect to a probability measure on the real line. The class of such probability measures consists of Gaussian, Poisson, gamma, negative binomial and Meixner distributions. Note that all these measures are infinitely divisible, hence they give rise to a corresponding Lévy process. Let $\mathcal D$ denote the space of all smooth functions on the real line with compact support, and let $\mathcal D'$ be its dual space, i.e., the space of all generalized functions on the real line. We will introduce the notion of a polynomial sequence on $\mathcal D'$ and a Sheffer sequence on $\mathcal D'$. A Lévy white noise measure is a probability measure on $\mathcal D'$ which is the law of a generalised stochastic process obtained as the (generalised) derivative of a Lévy process. We will find the class of all Lévy white noises for which there exists an orthogonal polynomial sequence on $\mathcal D'$. This class will be in one-to-one correspondence with the Meixner class of probability measures on the real line, and the corresponding orthogonal polynomial sequences on $\mathcal D'$ are all Sheffer sequences. Extending Grabiner's result related to the one-dimensional umbral calculus, we will construct a class of spaces of entire functions on the complexification of $\mathcal D'$ that is spanned by Sheffer polynomial sequences. This will, in particular, extend the well-known characterisation of the Hida test space of Gaussian white noise as a space of entire functions.
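The Gaussian member of the Meixner class can be checked numerically: its orthogonal Sheffer sequence consists of the probabilists' Hermite polynomials, with $\mathbb{E}[\mathrm{He}_m(X)\mathrm{He}_n(X)] = n!\,\delta_{mn}$ for $X \sim N(0,1)$. The sketch below (our own illustration of the one-dimensional case discussed in the abstract) builds $\mathrm{He}_n$ from the three-term recurrence and verifies the orthogonality relation by quadrature:

```python
import math

def hermite(n, x):
    """Probabilists' Hermite polynomial He_n(x), the orthogonal Sheffer
    sequence of the standard Gaussian, via the recurrence
    He_{n+1}(x) = x * He_n(x) - n * He_{n-1}(x)."""
    prev, cur = 1.0, x
    if n == 0:
        return prev
    for k in range(1, n):
        prev, cur = cur, x * cur - k * prev
    return cur

def gaussian_inner(m, n, half_width=8.0, steps=16000):
    """Trapezoidal approximation of E[He_m(X) He_n(X)] for X ~ N(0, 1);
    the Gaussian tail beyond |x| = 8 is negligible for small m, n."""
    h = 2.0 * half_width / steps
    total = 0.0
    for i in range(steps + 1):
        x = -half_width + i * h
        phi = math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)
        weight = 0.5 if i in (0, steps) else 1.0
        total += weight * hermite(m, x) * hermite(n, x) * phi
    return total * h
```

Numerically, distinct indices give an inner product near zero while equal indices give $n!$, e.g. $\mathbb{E}[\mathrm{He}_3^2] = 3! = 6$; the same orthogonality structure is what the talk lifts to Sheffer sequences on $\mathcal D'$.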