We propose a continuous-time stochastic model to analyze the dynamics of impermanent loss in liquidity pools in decentralized finance (DeFi) protocols. We replicate the impermanent loss using option portfolios for the individual tokens. We estimate the risk-neutral joint distribution of the tokens by minimizing the Hansen–Jagannathan bound, which we then use for the valuation of options on relative prices and for the calculation of implied correlations. In our analyses, we investigate implied volatilities and implied correlations as possible drivers of the impermanent loss and show that they explain the cross-sectional returns of liquidity pools. We test our hypothesis on options data from a major centralized derivative exchange.
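The abstract does not specify the pool mechanics. As background for the quantity being replicated, the sketch below (assumptions mine, not the authors') uses the standard constant-product pool, for which impermanent loss at a token-price ratio $r$ has the closed form $2\sqrt{r}/(1+r) - 1$; the second function re-derives the same number from a direct LP-versus-hold comparison.

```python
import math

def impermanent_loss(r: float) -> float:
    """Closed-form impermanent loss of a constant-product AMM when the
    token price ratio moves by a factor r (r = p_new / p_old)."""
    return 2.0 * math.sqrt(r) / (1.0 + r) - 1.0

def impermanent_loss_direct(r: float) -> float:
    """Same quantity from first principles: value of the rebalanced LP
    position versus simply holding the initial 50/50 deposit."""
    # Deposit 1 unit of token A (price 1) and 1 unit of the numeraire token B,
    # so the constant product is x * y = 1.  After the price of A moves to r,
    # arbitrage rebalances the pool to x = 1/sqrt(r), y = sqrt(r).
    x, y = 1.0 / math.sqrt(r), math.sqrt(r)
    lp_value = x * r + y          # pool holdings marked at the new price
    hold_value = 1.0 * r + 1.0    # value of just holding the deposit
    return lp_value / hold_value - 1.0
```

For example, a fourfold price move gives `impermanent_loss(4.0) == -0.2`: the LP position is worth 20% less than the buy-and-hold portfolio, which is the loss the authors replicate with option portfolios.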
In this talk, we introduce a novel mesh-free, direct method for computing the shape derivative in PDE-constrained shape optimization problems. Our approach is based on a probabilistic representation of the shape derivative and applies to second-order semilinear elliptic PDEs with Dirichlet boundary conditions and a general class of target functions. The probabilistic representation derives from a boundary sensitivity result for diffusion processes due to Costantini, Gobet and El Karoui. We verify the numerical accuracy of our methodology via so-called Taylor tests.
In this talk, we shall discuss the long-time behavior of solutions to parabolic stochastic partial differential equations with singular nonlinear divergence-type diffusivity. As these equations usually lack good coercivity estimates in higher spatial dimensions, we address the general well-posedness question by variational weak energy methods. Examples include the stochastic singular $p$-Laplace equation and the stochastic curve shortening flow with additive Gaussian noise. We shall present improved pathwise regularity results as well as improved moment and decay estimates for a general class of singular divergence-type PDEs.
Based on joint works with Benjamin Gess (Leipzig and Bielefeld), Wei Liu (Xuzhou), Florian Seib (Berlin), and Wilhelm Stannat (Berlin).
The theory we present aims at expanding the classical Arbitrage Pricing Theory to a setting where N agents invest in stochastic security markets while also engaging in zero-sum risk exchange mechanisms. We introduce in this setting the notions of Collective Arbitrage and of Collective Super-replication and accordingly establish versions of the fundamental theorem of asset pricing and of the pricing-hedging duality. When computing the Collective Super-replication price for a given vector of contingent claims, one for each agent in the system, allowing additional exchanges among the agents reduces the overall cost compared to classical individual super-replication. The positive difference between the aggregation (sum) of individual superhedging prices and the Collective Super-replication price represents the value of cooperation. Finally, we explain how these collective features can be associated with a broader class of risk measurement or cost assessment procedures beyond the superhedging framework. This leads to the notion of Collective Risk Measures, which generalize the idea of risk sharing and inf-convolution of risk measures.
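In symbols (notation mine, not necessarily the authors'): writing $\rho_i$ for agent $i$'s classical individual super-replication price and $\rho^{\mathrm{coll}}$ for the Collective Super-replication price of a claim vector $(g_1,\dots,g_N)$, in which the agents may additionally engage in zero-sum exchanges, the value of cooperation described above is
$$ V(g_1,\dots,g_N) \;=\; \sum_{i=1}^{N} \rho_i(g_i) \;-\; \rho^{\mathrm{coll}}(g_1,\dots,g_N) \;\ge\; 0, $$
where the inequality holds because each agent can always abstain from the exchange mechanism, so the collective problem is a relaxation of the $N$ individual ones.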
Abstract: Audiences often think of music as primarily a product of the heart, but pianist / composer / coder Dan Tepfer argues that algorithms - rules that are followed consistently - are just as important. Without constraints underlying creativity, whether conscious or not, music tends to lack the deep structure that makes it timeless. In his newest project, Natural Machines, he has taken this idea to the limit, programming rules into his computer that enable it to respond in real time to the music he improvises. The computer creates immediate structure around whatever he plays at the Yamaha Disklavier player piano, which in turn guides him to improvise in certain ways, for an unprecedented melding of natural and mechanical processes. The idea of music living at the intersection of the algorithmic and the spiritual is far from new. It was Pythagoras who first codified the logic behind harmonic consonance. Renaissance composers such as Ockeghem created music that followed strict mathematical procedures. And Bach seemed to gain endless creative results from imposing constraints on himself; Tepfer has been performing Bach's Goldberg Variations worldwide since the 2011 release of his album Goldberg Variations / Variations, in which he follows each of Bach's variations with an improvised variation of his own. Join Tepfer as he explains the deep connections between the high-tech Natural Machines, the timeless music of Bach, and the algorithms that support it all.
About the speaker: Dan Tepfer is an internationally renowned pianist and composer based in New York City who has performed and recorded around the world with leading musicians in both jazz and classical music, such as Lee Konitz, Paul Motian and Renée Fleming. Tepfer earned global acclaim for his 2011 release Goldberg Variations / Variations, in which he performs J.S. Bach's masterpiece and improvises upon it to "elegant, thoughtful and thrilling" effect (New York magazine). His 2019 video album Natural Machines stands as one of his most ingeniously forward-minded projects yet, finding him exploring in real time the intersection between science and art, coding and improvisation, digital algorithms and the rhythms of the heart. His 2023 return to Bach, Inventions / Reinventions, an exploration of the narrative processes behind Bach's beloved Inventions, became a best-seller, spending two weeks in the #1 spot on the Billboard Classical Charts. In addition to his musical training, which includes a degree in jazz piano from the New England Conservatory in Boston, Tepfer holds a Bachelor's degree in astrophysics from the University of Edinburgh. From a young age he has been interested in coding, which he now uses in highly creative ways to make music, as in Natural Machines. During the pandemic, his belief that music brings people together in times of crisis led him to dive into live-streaming, performing close to two hundred online concerts. As part of this effort, he pioneered ultra-low-latency audio technology that enables him to perform live over the internet with musicians in separate locations, culminating in the development of his own app, FarPlay, which is now distributed by a company of which he is the CEO.
We introduce the Rigged Dynamic Mode Decomposition (Rigged DMD) algorithm, which computes generalized eigenfunction decompositions of Koopman operators. By considering the evolution of observables, Koopman operators transform complex nonlinear dynamics into a linear framework suitable for spectral analysis. While powerful, traditional Dynamic Mode Decomposition (DMD) techniques often struggle with continuous spectra. Rigged DMD addresses these challenges with a data-driven methodology that approximates the Koopman operator's resolvent and its generalized eigenfunctions using snapshot data from the system's evolution. At its core, Rigged DMD builds wave-packet approximations for generalized Koopman eigenfunctions and modes by integrating Measure-Preserving Extended Dynamic Mode Decomposition with high-order kernels for smoothing. This provides a robust decomposition encompassing both discrete and continuous spectral elements. We derive explicit high-order convergence theorems for generalized eigenfunctions and spectral measures. Additionally, we propose a novel framework for constructing rigged Hilbert spaces using time-delay embedding, significantly extending the algorithm's applicability. We provide examples, including systems with a Lebesgue spectrum, integrable Hamiltonian systems, the Lorenz system, and a high-Reynolds number lid-driven flow in a two-dimensional square cavity, demonstrating Rigged DMD's convergence, efficiency, and versatility. This work paves the way for future research and applications of decompositions with continuous spectra. This talk is based on joint work with Catherine Drysdale (University of Birmingham) and Andrew Horning (MIT).
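The Rigged DMD algorithm itself is not reproduced here; as a point of reference for the snapshot setup it builds on, the following minimal sketch (toy system and variable names mine) implements classical exact DMD, which fits a linear map to snapshot pairs and recovers its point spectrum. Rigged DMD, per the abstract, extends this to generalized eigenfunctions and continuous spectra.

```python
import numpy as np

# True (in practice unknown) linear dynamics x_{k+1} = A x_k.
A = np.array([[0.9, 0.2],
              [0.0, 0.5]])

# Collect a short trajectory of snapshots.
x = np.array([1.0, 1.0])
snapshots = [x]
for _ in range(10):
    x = A @ x
    snapshots.append(x)

X = np.column_stack(snapshots[:-1])   # states at times 0..9
Y = np.column_stack(snapshots[1:])    # states at times 1..10

# Exact DMD: least-squares fit of the linear map sending X to Y.
A_dmd = Y @ np.linalg.pinv(X)
eigvals = np.sort(np.linalg.eigvals(A_dmd).real)
```

Because the data here are generated by a finite-dimensional linear map, `eigvals` recovers the spectrum of `A` (0.5 and 0.9) essentially to machine precision; the continuous-spectrum systems named in the abstract (e.g. the Lorenz system) are exactly where this baseline breaks down and the rigged construction is needed.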
Insurance claims are often not paid out immediately. In long-tail lines such as liability or motor liability, it can take years or even decades until a claim is settled. In order to set up adequate reserves, so-called IBNR methods are used to predict future payments. Chain ladder is probably the most popular IBNR method worldwide. Since large losses behave quite differently from attritional losses, it is advisable to separate the two loss categories in the IBNR calculation. We introduce a stochastic model for the development of attritional and large claims in long-tail lines of business and present a corresponding chain ladder-like IBNR method that predicts attritional and large losses in a consistent way.
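The classical chain ladder step referenced above (though not the paper's attritional/large split) fits in a few lines: estimate a development factor per column from rows where consecutive columns are observed, then roll each incomplete row forward. The toy triangle below is illustrative and not taken from the paper.

```python
import numpy as np

def chain_ladder(triangle: np.ndarray) -> np.ndarray:
    """Complete a cumulative run-off triangle (np.nan below the
    diagonal) with the classical chain ladder method."""
    filled = triangle.astype(float).copy()
    n_dev = filled.shape[1]
    for j in range(n_dev - 1):
        # Development factor f_j: ratio of column sums over the rows
        # where both development years j and j+1 are observed.
        known = ~np.isnan(triangle[:, j + 1])
        f = filled[known, j + 1].sum() / filled[known, j].sum()
        # Project the missing entries of development year j+1.
        missing = np.isnan(filled[:, j + 1])
        filled[missing, j + 1] = filled[missing, j] * f
    return filled

# Toy cumulative triangle: 3 accident years, 3 development years.
tri = np.array([[100.0, 150.0, 160.0],
                [110.0, 165.0, np.nan],
                [120.0, np.nan, np.nan]])
full = chain_ladder(tri)
```

Here the first development factor is (150 + 165) / (100 + 110) = 1.5 and the second is 160 / 150, so the last accident year is projected to 120 · 1.5 = 180 and then 180 · 160/150 = 192; the rightmost column holds the predicted ultimate claims from which reserves are set.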