Ruin Theoreticised
Around 334 years ago, Edward Lloyd, owner of a London coffee house, recognized the need for transport risk insurance. Today Lloyd's of London has a premium income of over $20 million each day[1]. The existence of risks to humans, property, and the environment has led to a global industry offering financial coverage for losses from risk exposure. A key ingredient is the inherent randomness, which is currently handled within a stochastic framework.
An insurance company is, in essence, a collection of contracts, each of which is an obligation. Insurers focus on large outcomes, which mean large obligations, and aim to keep their Solvency Capital Requirement (SCR) coverage ratio high. Ruin Theory models the evolution of the insurer's surplus, the quantity behind the SCR coverage ratio, using tools such as compound Poisson models, martingales, and Markov processes. It computes the asymptotic probability of ruin: the chance that the insurer's surplus ever falls below zero.
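In the standard Cramér-Lundberg formulation, a textbook model consistent with, though not spelled out in, the paragraph above, the surplus at time t and the probability of ruin read

    U(t) = u + c\,t - \sum_{i=1}^{N(t)} X_i , \qquad
    \psi(u) = \Pr\!\Big( \inf_{t \ge 0} U(t) < 0 \Big),

where u is the initial capital, c the premium rate, N(t) a Poisson claim-count process, and X_1, X_2, \ldots the i.i.d. claim sizes.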
Lundberg proposed an exponent that measures the riskiness of an insurance portfolio. His inequality states that the probability of ruin (bankruptcy) decays at least exponentially in the product of the initial capital of the insurance company and the Lundberg exponent. A smaller exponent therefore means a riskier portfolio. A typical result is that replacing the Poisson claims-occurrence process with a general renewal process increases the probability of ruin, other factors held constant.
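In the same notation, Lundberg's inequality and the defining equation of the Lundberg (adjustment) exponent R take their standard textbook form

    \psi(u) \le e^{-R u}, \qquad \lambda + c\,R = \lambda\, M_X(R),

where \lambda is the claim intensity and M_X the moment generating function of the claim sizes; the bound makes the exponential decay in the initial capital u precise.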
Standard risk models are based on fat- or light-tailed claim-size distributions with delayed claim occurrences[2]. A risk process consists of initial capital plus premium income minus claims. Various numerical approximations exist for the aggregate claims distribution, such as the normal-power, orthogonal-polynomial, Bowers gamma, Esscher, and Fourier-transform methods. All offer refined but complicated models. However, I find these neo-classical approaches fundamentally flawed.
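As a rough, simulation-based complement to these analytic approximations, the finite-horizon probability of ruin can also be estimated by plain Monte Carlo. The sketch below assumes a compound Poisson model with exponential claim sizes; the function name and all parameter values are illustrative rather than taken from the text.

    import numpy as np

    rng = np.random.default_rng(seed=1)

    def estimate_ruin_probability(u0, c, lam, claim_mean, horizon, n_paths=20_000):
        """Monte Carlo estimate of the probability that the compound Poisson
        surplus u0 + c*t - S(t) drops below zero before `horizon`."""
        ruined = 0
        for _ in range(n_paths):
            t, total_claims = 0.0, 0.0
            while True:
                t += rng.exponential(1.0 / lam)              # next claim arrival
                if t > horizon:
                    break
                total_claims += rng.exponential(claim_mean)  # exponential claim size
                if u0 + c * t - total_claims < 0.0:          # ruin can only occur at a claim
                    ruined += 1
                    break
        return ruined / n_paths

    # Illustrative parameters: initial capital 10, premium rate 1.2,
    # unit claim intensity and mean claim size (a 20% safety loading).
    print(estimate_ruin_probability(u0=10.0, c=1.2, lam=1.0, claim_mean=1.0, horizon=200.0))

Crude as it is, this kind of estimate serves as a sanity check on the closed-form approximations named above.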
Economics, and risk modeling in particular, can learn from meteorology. If accepted, arguments about indeterminacy, information, and equilibrium drawn from physics and meteorology can contribute to Ruin Theory. Jenny Harrison's chainlet theory can bridge discrete and continuous modeling in economics and the sciences[3]. Though the distributions of the stochastic processes in our case can be large, they are small in the sense of category. The infinite sequences of point processes form a subset of Cantor space.
We can achieve a complete (though not necessarily compact) topological characterization of the random processes and risk models for a specific topology (the relativization of a larger space) and a suitable distance measure. This characterization can be computed efficiently and accurately with differential-geometric computational tools.
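To make "a suitable distance measure" concrete in at least one simple-minded way, two realizations of a claim-arrival process can be compared through the sup-norm distance between their counting functions on a finite grid. The sketch below is purely illustrative: it implements neither the specific topology nor the differential-geometric machinery alluded to above.

    import numpy as np

    def counting_function(arrival_times, grid):
        """N(t) on a grid of time points: the number of arrivals at or before each t."""
        return np.searchsorted(np.sort(arrival_times), grid, side="right")

    def sup_distance(arrivals_a, arrivals_b, horizon=200.0, resolution=2_000):
        """Sup-norm distance between the counting functions of two point-process
        realizations on [0, horizon]; a crude stand-in for a metric on path space."""
        grid = np.linspace(0.0, horizon, resolution)
        diff = counting_function(arrivals_a, grid) - counting_function(arrivals_b, grid)
        return int(np.max(np.abs(diff)))

    # Two independent unit-rate Poisson arrival streams compared on [0, 200].
    rng = np.random.default_rng(seed=2)
    a = np.cumsum(rng.exponential(1.0, size=250))
    b = np.cumsum(rng.exponential(1.0, size=250))
    print(sup_distance(a, b))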