Found: 102
Cyber insurance-linked securities
Braun A., Eling M., Jaenicke C.
Q1
Cambridge University Press
ASTIN Bulletin, 2023, citations: 6, doi.org, Abstract
Abstract We investigate the feasibility of cyber risk transfer through insurance-linked securities (ILS). On the investor side, we elicit the preferred characteristics of cyber ILS and the corresponding return expectations. We then estimate the cost of equity of insurers and compare it to the Rate on Line expected by investors to match demand and supply in the cyber ILS market. Our results show that cyber ILS will work for both cedents and investors if the cyber risk is sufficiently well understood. Thus, challenges related to cyber risk modeling need to be overcome before a meaningful cyber ILS market may emerge.
Ermanno Pitacco (1947–2022)
Embrechts P., Wüthrich M.
Q1
Cambridge University Press
ASTIN Bulletin, 2022, citations: 0, doi.org
MORTALITY CREDITS WITHIN LARGE SURVIVOR FUNDS
Denuit M., Hieber P., Robert C.Y.
Q1
Cambridge University Press
ASTIN Bulletin, 2022, citations: 9, doi.org, Abstract
Abstract Survivor funds are financial arrangements where participants agree to share the proceeds of a collective investment pool in a predescribed way depending on their survival. This offers investors a way to benefit from mortality credits, boosting financial returns. Following Denuit (2019, ASTIN Bulletin, 49, 591–617), participants are assumed to adopt the conditional mean risk sharing rule introduced in Denuit and Dhaene (2012, Insurance: Mathematics and Economics, 51, 265–270) to assess their respective shares in mortality credits. This paper looks at pools of individuals that are heterogeneous in terms of their survival probability and their contributions. Imposing mild conditions, we show that individual risk can be fully diversified if the size of the group tends to infinity. For large groups, we derive simple, hierarchical approximations of the conditional mean risk sharing rule.
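The conditional mean risk sharing rule mentioned in this abstract allocates to participant i the amount E[X_i | S] of each realised pool total S. A minimal sketch follows, assuming a toy pool with made-up contributions and death probabilities; exact enumeration over all 2^n death patterns is only feasible for tiny pools and is not the paper's hierarchical approximation.

```python
from itertools import product

# Toy pool for the conditional mean risk sharing rule: participant i's share
# of the pooled mortality credits S is E[X_i | S].  Contributions c and death
# probabilities q are made-up values.
c = [100.0, 100.0, 200.0]            # contributions released at death
q = [0.02, 0.05, 0.03]               # independent one-period death probs
n = len(c)

joint = {}                           # total s -> P(S = s)
num = [{} for _ in range(n)]         # total s -> E[X_i ; S = s]
for pattern in product([0, 1], repeat=n):
    p = 1.0
    for i, d in enumerate(pattern):
        p *= q[i] if d else 1.0 - q[i]
    s = sum(c[i] * pattern[i] for i in range(n))
    joint[s] = joint.get(s, 0.0) + p
    for i in range(n):
        num[i][s] = num[i].get(s, 0.0) + p * c[i] * pattern[i]

# The rule is fully allocating: for every realised total s, the individual
# shares E[X_i | S = s] add back up to s.
for s in joint:
    assert abs(sum(num[i][s] / joint[s] for i in range(n)) - s) < 1e-9
```

The full-allocation property checked in the last loop is exactly what makes the rule attractive for survivor funds: mortality credits are distributed without surplus or shortfall.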
MODERN LIFE-CARE TONTINES
Hieber P., Lucas N.
Q1
Cambridge University Press
ASTIN Bulletin, 2022, citations: 12, doi.org, Abstract
Abstract The tendency of insurance providers to refrain from offering long-term guarantees on investment or mortality risk has shifted attention to mutual risk pooling schemes like (modern) tontines, pooled annuities or group self-annuitization schemes. While the literature has focused on mortality risk pooling schemes, this paper builds on the advantage of pooling mortality and morbidity risks, and their inherent natural hedge. We introduce a modern “life-care tontine”, which, in addition to retirement income, targets the needs of long-term care (LTC) coverage for an ageing population. In contrast to a classical life-care annuity, both mortality and LTC risks are shared within the policyholder pool by mortality and morbidity credits, respectively. Technically, we rely on a backward iteration to deduce the smoothed cash-flow pattern and the separation of cash flows into a fixed withdrawal and a surplus from the two types of risks. We illustrate our results using real-life data, demonstrating the adequacy of the proposed tontine scheme.
PHASE-TYPE DISTRIBUTIONS FOR CLAIM SEVERITY REGRESSION MODELING
Bladt M.
Q1
Cambridge University Press
ASTIN Bulletin, 2022, citations: 6, doi.org, Abstract
Abstract This paper addresses the task of modeling severity losses using segmentation when the data distribution does not fall into the usual regression frameworks. This situation is not uncommon in lines of business such as third-party liability insurance, where heavy tails and multimodality often hamper a direct statistical analysis. We propose to use regression models based on phase-type distributions, regressing on their underlying inhomogeneous Markov intensity and using an extension of the expectation–maximization algorithm. These models are interpretable and tractable in terms of multistate processes and generalize the proportional hazards specification when the dimension of the state space is larger than 1. We show that the combination of matrix parameters, inhomogeneity transforms, and covariate information provides flexible regression models that effectively capture the entire distribution of loss severities.
DISCRIMINATION-FREE INSURANCE PRICING
Lindholm M., Richman R., Tsanakas A., Wüthrich M.V.
Q1
Cambridge University Press
ASTIN Bulletin, 2021, citations: 32, doi.org, Abstract
Abstract We consider the following question: given information on individual policyholder characteristics, how can we ensure that insurance prices do not discriminate with respect to protected characteristics, such as gender? We address the issues of direct and indirect discrimination, the latter resulting from implicit learning of protected characteristics from nonprotected ones. We provide rigorous mathematical definitions for direct and indirect discrimination, and we introduce a simple formula for discrimination-free pricing that avoids both direct and indirect discrimination. Our formula works in any statistical model. We demonstrate its application on a health insurance example, using a state-of-the-art generalized linear model and a neural network regression model. An important conclusion is that discrimination-free pricing in general requires collection of policyholders' discriminatory characteristics, posing potential challenges in relation to policyholders' privacy concerns.
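The discrimination-free price of Lindholm, Richman, Tsanakas and Wüthrich averages the best-estimate price over the marginal (not conditional) distribution of the protected attribute. A minimal sketch, where the fitted prices `mu` and the portfolio weights `p_d` are made-up illustration values rather than anything from the paper:

```python
# Sketch of the discrimination-free price h*(x) = sum_d mu(x, d) * P(D = d):
# the best-estimate price mu(x, d) is averaged over the *marginal*
# distribution of the protected attribute D, which removes both direct and
# indirect discrimination.  All numbers are made up for illustration.
mu = {("young", "F"): 90.0, ("young", "M"): 110.0,
      ("old", "F"): 140.0, ("old", "M"): 160.0}   # fitted prices E[Y | x, d]
p_d = {"F": 0.5, "M": 0.5}                        # marginal P(D = d)

def discrimination_free_price(x):
    return sum(mu[(x, d)] * p_d[d] for d in p_d)

print(discrimination_free_price("young"))  # → 100.0
```

Using the marginal weights P(D = d) instead of P(D = d | x) is the key design choice: conditioning on x would let the nonprotected covariates proxy for the protected ones, which is exactly the indirect discrimination the formula avoids.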
TEMPERED PARETO-TYPE MODELLING USING WEIBULL DISTRIBUTIONS
Albrecher H., Araujo-Acuna J.C., Beirlant J.
Q1
Cambridge University Press
ASTIN Bulletin, 2021, citations: 4, doi.org, Abstract
Abstract In various applications of heavy-tail modelling, the assumed Pareto behaviour is tempered ultimately in the range of the largest data. In insurance applications, claim payments are influenced by claim management and claims may, for instance, be subject to a higher level of inspection at the highest damage levels, leading to weaker tails than apparent from modal claims. Generalizing earlier results of Meerschaert et al. (2012) and Raschke (2020), in this paper we consider tempering of a Pareto-type distribution with a general Weibull distribution in a peaks-over-threshold approach. This requires modulating the tempering parameters as a function of the chosen threshold. Modelling such a tempering effect is important in order to avoid overestimation of risk measures such as the value-at-risk at high quantiles. We use a pseudo maximum likelihood approach to estimate the model parameters and consider the estimation of extreme quantiles. We derive basic asymptotic results for the estimators, give illustrations with simulation experiments and apply the developed techniques to fire and liability insurance data, providing insight into the relevance of the tempering component in heavy-tail modelling.
A NEURAL NETWORK BOOSTED DOUBLE OVERDISPERSED POISSON CLAIMS RESERVING MODEL
Gabrielli A.
Q1
Cambridge University Press
ASTIN Bulletin, 2019, citations: 19, doi.org, Abstract
Abstract We present an actuarial claims reserving technique that takes into account both claim counts and claim amounts. Separate (overdispersed) Poisson models for the claim counts and the claim amounts are combined by a joint embedding into a neural network architecture. As the starting point of the neural network calibration, we use exactly these two separate (overdispersed) Poisson models. Such a nested model can be interpreted as a boosting machine. It allows for joint modeling and mutual learning of claim counts and claim amounts beyond the two individual (overdispersed) Poisson models.
ON MARINE LIABILITY PORTFOLIO MODELING
Guevara-Alarcón W., Albrecher H., Chowdhury P.
Q1
Cambridge University Press
ASTIN Bulletin, 2019, citations: 0, doi.org, Abstract
Abstract Marine is the oldest type of insurance coverage. Nevertheless, unlike cargo and hull covers, marine liability is a rather young line of business with claims that can have very heavy and long tails. For reinsurers, the accumulation of losses from an event insured by various Protection and Indemnity clubs is an additional source of very large claims in the portfolio. In this paper, we first describe some recent developments of the marine liability market and then statistically analyze a data set of large losses for this line of business in a detailed manner, both in terms of frequency and severity, including censoring techniques and tests for stationarity over time. We further formalize and examine an optimization problem that occurs for reinsurers participating in XL on XL coverages in this line of business and give illustrations of its solution.
THE RESERVE UNCERTAINTIES IN THE CHAIN LADDER MODEL OF MACK REVISITED
Gisler A.
Q1
Cambridge University Press
ASTIN Bulletin, 2019, citations: 11, doi.org, Abstract
Abstract We revisit the “full picture” of the claims development uncertainty in Mack's (1993) distribution-free stochastic chain ladder model. We derive the uncertainty estimators in a new and easily understandable way, which is much simpler than the derivations found so far in the literature, and compare them with the well-known estimators of Mack and of Merz–Wüthrich. Our uncertainty estimators of the one-year run-off risks are new and different from the Merz–Wüthrich formulas. But if we approximate our estimators by a first-order Taylor expansion, we obtain equivalent but simpler formulas. As regards the ultimate run-off risk, we obtain the same formulas as Mack for single accident years and an equivalent but more easily interpretable formula for the total over all accident years.
ON INTEGRATED CHANCE CONSTRAINTS IN ALM FOR PENSION FUNDS
Toukourou Y.A., Dufresne F.
Q1
Cambridge University Press
ASTIN Bulletin, 2018, citations: 3, doi.org, Abstract
Abstract We discuss the role of integrated chance constraints (ICC) as quantitative risk constraints in asset and liability management (ALM) for pension funds. We define two types of ICC: the one-period integrated chance constraint (OICC) and the multiperiod integrated chance constraint (MICC). As their names suggest, the OICC covers only one period, whereas several periods are taken into account with the MICC. A multistage stochastic linear programming model is therefore developed for this purpose, and special attention is paid to the modeling of the MICC. Based on a numerical example, we first analyze the effects of the OICC and the MICC on the optimal decisions (asset allocation and contribution rate) of a pension fund. By definition, the MICC is more restrictive and safer compared to the OICC. Second, we quantify this MICC safety increase. The results show that although the optimal decisions from the OICC and the MICC differ, the total costs are very close, showing that the MICC is definitely a better approach since it is more prudent.
ON THE EVALUATION OF MULTIVARIATE COMPOUND DISTRIBUTIONS WITH CONTINUOUS SEVERITY DISTRIBUTIONS AND SARMANOV'S COUNTING DISTRIBUTION
Tamraz M., Vernic R.
Q1
Cambridge University Press
ASTIN Bulletin, 2018, citations: 2, doi.org, Abstract
Abstract In this paper, we present closed-type formulas for some multivariate compound distributions with multivariate Sarmanov counting distribution and independent Erlang distributed claim sizes. Further on, we also consider a type-II Pareto dependency between the claim sizes of a certain type. The resulting densities rely on the special hypergeometric function, which has the advantage of being implemented in the usual software. We numerically illustrate the applicability and efficiency of such formulas by evaluating a bivariate cumulative distribution function, which is also compared with the similar function obtained by the classical recursion-discretization approach.
PROBABILITY OF SUFFICIENCY OF SOLVENCY II RESERVE RISK MARGINS: PRACTICAL APPROXIMATIONS
Moro E.D., Krvavych Y.
Q1
Cambridge University Press
ASTIN Bulletin, 2017, citations: 9, doi.org, Abstract
Abstract The new Solvency II Directive and the upcoming IFRS 17 regime bring significant changes to the current reporting of insurance entities, particularly in relation to the valuation of insurance liabilities. Insurers will be required to value their insurance liabilities on a risk-adjusted basis to allow for the uncertainty inherent in the cash flows that arise from the liability of insurance contracts. Whilst most European-based insurers are expected to adopt the Cost of Capital approach to calculate the reserve risk margin (the risk adjustment method commonly agreed under Solvency II and IFRS 17), there is one additional requirement of IFRS 17 to also disclose the confidence level of the risk margin. Given there is no specific guidance on the calculation of the confidence level, the purpose of this paper is to explore and examine practical ways of estimating the risk margin confidence level measured by Probability of Sufficiency (PoS). The paper provides some practical approximation formulae that would allow one to quickly estimate the implied PoS of the Solvency II risk margin for a given non-life insurance liability, the risk profile of which is specified by the type and characteristics of the liability (e.g. type/nature of business, liability duration and convexity, etc.), which, in turn, are associated with:
• the level of variability, measured by the Coefficient of Variation (CoV);
• the degree of skewness per unit of CoV; and
• the degree of kurtosis per unit of CoV².
The approximation formulae of PoS are derived for both the standalone class risk margin and the diversified risk margin at the portfolio level.
CHAIN LADDER AND ERROR PROPAGATION
Röhr A.
Q1
Cambridge University Press
ASTIN Bulletin, 2016, citations: 18, doi.org, Abstract
Abstract We show how estimators for the chain ladder prediction error in Mack's (1993) distribution-free stochastic model can be derived using the error propagation formula. Our method allows for the treatment of the general case of the prediction error of the loss development result between two arbitrary future horizons. In the well-known special cases considered previously by Mack (1993) and Merz and Wüthrich (2008), our estimators coincide with theirs. However, the algebraic form in which we cast them is new, considerably more compact and more intuitive to understand. For example, in the classical case treated by Mack (1993), we show that the mean squared prediction error divided by the squared estimated ultimate loss can be written as ∑j ûj², where ûj measures the (relative) uncertainty around the jth development factor and the proportion of the estimated ultimate loss that it affects. The error propagation method also provides a natural split into process error and parameter error. Our proofs identify and exploit symmetries of “chain ladder processes” in a novel way. For the sake of wider practical applicability of the formulae derived, we allow for incomplete historical data and the exclusion of outliers in the triangles.
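The relative prediction error discussed in this abstract is, for a single accident year, algebraically equivalent to Mack's classical estimator. A sketch of that classical computation follows; the triangle is made-up illustration data, and the code implements the textbook Mack (1993) formulas rather than Röhr's error-propagation derivation.

```python
# Illustration of Mack's (1993) distribution-free chain ladder estimators.
# The mean squared error of prediction (msep) of the ultimate for one
# accident year is accumulated here relative to the squared ultimate.
# The triangle is made-up data.
tri = [
    [100.0, 150.0, 160.0, 165.0],
    [110.0, 168.0, 178.0],
    [115.0, 170.0],
    [125.0],
]
J = len(tri[0])                      # number of development periods

f, sigma2, S = [], [], []            # factors, variance params, column sums
for j in range(J - 1):
    rows = [r for r in tri if len(r) > j + 1]
    s_j = sum(r[j] for r in rows)
    f_j = sum(r[j + 1] for r in rows) / s_j
    S.append(s_j)
    f.append(f_j)
    if len(rows) > 1:
        sigma2.append(sum(r[j] * (r[j + 1] / r[j] - f_j) ** 2
                          for r in rows) / (len(rows) - 1))
    else:                            # Mack's extrapolation for the last column
        sigma2.append(min(sigma2[-1] ** 2 / sigma2[-2],
                          min(sigma2[-1], sigma2[-2])))

# Latest accident year: project to ultimate; the relative msep is
# msep / ultimate^2 = sum_j (sigma2_j / f_j^2) * (1 / C_hat_ij + 1 / S_j),
# whose two summands give the natural process / parameter error split.
i = len(tri) - 1
k = len(tri[i]) - 1                  # last known development index
proj = [tri[i][k]]
for j in range(k, J - 1):
    proj.append(proj[-1] * f[j])
rel_msep = sum((sigma2[j] / f[j] ** 2) * (1.0 / proj[j - k] + 1.0 / S[j])
               for j in range(k, J - 1))
ultimate = proj[-1]
```

Each summand in `rel_msep` plays the role of one ûj² term: the uncertainty of development factor j weighted by the part of the projection it affects.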
CONSISTENT YIELD CURVE PREDICTION
Teichmann J., Wüthrich M.V.
Q1
Cambridge University Press
ASTIN Bulletin, 2016, citations: 7, doi.org, Abstract
Abstract We present an arbitrage-free non-parametric yield curve prediction model which takes the full discretized yield curve data as input state variable. Absence of arbitrage is a particularly important model feature for prediction models in the case of highly correlated data such as interest rates. Furthermore, the model structure allows one to separate constructing the daily yield curve from estimating its volatility structure and from calibrating the market prices of risk. The empirical part includes tests on modeling assumptions, out-of-sample back-testing and a comparison with the Vasiček (1977) short-rate model.
INTERNATIONAL CAUSE-SPECIFIC MORTALITY RATES: NEW INSIGHTS FROM A COINTEGRATION ANALYSIS
Arnold-Gaille S., Sherris M.
Q1
Cambridge University Press
ASTIN Bulletin, 2015, citations: 11, doi.org, Abstract
Abstract This paper applies cointegration techniques, developed in econometrics to model long-run relationships, to cause-of-death data. We analyze the five main causes of death across five major countries: the USA, Japan, France, England & Wales and Australia. Our analysis provides a better understanding of the long-run equilibrium relationships between the five main causes of death, providing new insights into similarities and differences in trends. The results identify for the first time similarities between countries and genders that are consistent with past studies on aging processes by biologists and demographers. The insights from biological theory on aging are found to be reflected in the cointegrating relations in all of the countries included in the study.
CREDIBILITY CLAIMS RESERVING WITH STOCHASTIC DIAGONAL EFFECTS
Bühlmann H., Moriconi F.
Q1
Cambridge University Press
ASTIN Bulletin, 2015, citations: 9, doi.org, Abstract
Abstract An interesting class of stochastic claims reserving methods is given by the models with conditionally independent loss increments (CILI), where the incremental losses are conditionally independent given a risk parameter Θi,j depending on both the accident year i and the development year j. The Bühlmann–Straub credibility reserving (BSCR) model is a particular case of a CILI model where the risk parameter depends only on i. We consider CILI models with additive diagonal risk (ADR), where the risk parameter is given by the sum of two components, one depending on the accident year i and the other depending on the calendar year t = i + j. The model can be viewed as an extension of the BSCR model including random diagonal effects, which are often declared to be important in loss reserving but are rarely modeled explicitly. We show that the ADR model is tractable in closed form, providing credibility formulae for the reserve and the mean square error of prediction (MSEP). We also derive unbiased estimators for the variance parameters which extend the classical Bühlmann–Straub estimators. The results are illustrated by a numerical example and the estimators are tested by simulation. We find that the inclusion of random diagonal effects can be significant for the reserve estimates and, especially, for the evaluation of the MSEP. The paper is written with the purpose of illustrating the role of stochastic diagonal effects. To isolate these effects, we assume that the development pattern is given. In particular, our MSEP values do not include the uncertainty due to the estimation of the development pattern.
ON SARMANOV MIXED ERLANG RISKS IN INSURANCE APPLICATIONS
Hashorva E., Ratovomirija G.
Q1
Cambridge University Press
ASTIN Bulletin, 2014, citations: 17, doi.org, Abstract
Abstract In this paper we consider an extension to the aggregation of the FGM mixed Erlang risks, proposed by Cossette et al. (2013, Insurance: Mathematics and Economics, 52, 560–572), in which we introduce the Sarmanov distribution to model the dependence structure. For our framework, we demonstrate that the aggregated risk belongs to the class of Erlang mixtures. Following results from S. C. K. Lee and X. S. Lin (2010, North American Actuarial Journal, 14(1), 107–130) and G. E. Willmot and X. S. Lin (2011, Applied Stochastic Models in Business and Industry, 27(1), 8–22), analytical expressions of the contribution of each individual risk to the economic capital for the entire portfolio are derived under both the TVaR and the covariance capital allocation principles. By analysing the commonly used dependence measures, we also show that the dependence structure is wide and flexible. Numerical examples and simulation studies illustrate the tractability of our approach.
Approximations to Risk Theory's F(x, t) by Means of the Gamma Distribution
Seal H.L.
Q1
Cambridge University Press
ASTIN Bulletin, 2014, citations: 18, doi.org, Abstract
It seems that there are people who are prepared to accept what the numerical analyst would regard as a shockingly poor approximation to F(x, t), the distribution function of aggregate claims in the interval of time (0, t), provided it can be quickly produced on a desk or pocket computer with the use of standard statistical tables. The so-called NP (Normal Power) approximation has acquired an undeserved reputation for accuracy among the various possibilities, and we propose to show why it should be abandoned in favour of a simple gamma function approximation. Discounting encomiums on the NP method such as Bühlmann's (1974): “Everybody known to me who has worked with it has been surprised by its unexpectedly good accuracy”, we believe there are only three sources of original published material on the approximation, namely Kauppi et al. (1969), Pesonen (1969) and Berger (1972). Only the last two authors calculated values of F(x, t) by the NP method and compared them with “true” four or five decimal values obtained by inverting the characteristic function of F(x, t) on an electronic computer.
From Aggregate Claims Distribution to Probability of Ruin
Seal H.L.
Q1
Cambridge University Press
ASTIN Bulletin, 2014, citations: 11, doi.org, Abstract
When the distribution of the number of claims in an interval of time of length t is mixed Poisson and the moments of the independent distribution of individual claim amounts are known, the moments of the distribution of aggregate claims through epoch t can be calculated (O. Lundberg, 1940, ch. VI). Several approximations to the corresponding distribution function, F(·, t), are available (see, e.g., Seal, 1969, ch. 2) and, in particular, a simple gamma (Pearson Type III) based on the first three moments has proved definitely superior to the widely accepted “Normal Power” approximation (Seal, 1976). Briefly, F(x, t) ≈ P(α, α + (x − κ₁)√(α/κ₂)), where the P-notation for the incomplete gamma ratio is now standard and α, a function of t, is to be found from α = 4κ₂³/κ₃², the kappas being the cumulants of F(·, t). An excellent table of the incomplete gamma ratio is that of Khamis (1965). The problem that is solved in this paper is the production of an approximation to U(w, t), the probability of non-ruin in an interval of time of length t, by using the above-mentioned gamma approximation to F(·, t).
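The three-moment gamma approximation described in this abstract can be coded directly. The sketch below is a standard moment-matching construction (shape α chosen so the gamma matches the skewness of F(·, t)), not code from the paper; the series implementation of the incomplete gamma ratio stands in for the Khamis tables, and the cumulants passed in are assumed inputs.

```python
from math import exp, lgamma, log, sqrt

def reg_inc_gamma_P(a, x, tol=1e-12):
    """Regularized lower incomplete gamma ratio P(a, x), via its power
    series; a small self-contained stand-in for tabulated values."""
    if x <= 0.0:
        return 0.0
    term = 1.0 / a                   # n = 0 term of sum x^n / (a...(a+n))
    total = term
    n = 0
    while term > tol * total:
        n += 1
        term *= x / (a + n)
        total += term
    return exp(-x + a * log(x) - lgamma(a)) * total

def gamma_approx_F(x, k1, k2, k3):
    """Three-moment gamma approximation to F(x, t):
    F(x, t) ~ P(alpha, alpha + (x - k1) * sqrt(alpha / k2)),
    with alpha = 4 * k2**3 / k3**2 matching the skewness of F(., t);
    k1, k2, k3 are its first three cumulants at the chosen t."""
    alpha = 4.0 * k2 ** 3 / k3 ** 2
    return reg_inc_gamma_P(alpha, alpha + (x - k1) * sqrt(alpha / k2))
```

A quick sanity check is P(1, x) = 1 − e^(−x), the exponential distribution function, which the series reproduces to high accuracy.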
Application of Reliability Theory to Insurance
Straub E.
Q1
Cambridge University Press
ASTIN Bulletin, 2014, citations: 8, doi.org, Abstract
1. There is a general rule applicable to all insurance and reinsurance fields according to which the level of the so-called technical minimum premium should be fixed such that a certain stability criterion is satisfied for the portfolio under consideration. The two best-known such criteria are:
(i) the probability that there is a technical loss in any of the future years should be less than a given percentage;
(ii) the probability that the company gets “ruined”, i.e. initial reserves plus accumulated premiums minus accumulated claims become negative at any time of a given period in the future, should be less than a tolerated percentage.
Confining ourselves to criterion (i) in the present paper, we may then say that the problem of calculating technical minimum premiums is, broadly speaking, equivalent to the problem of estimating loss probabilities. Since an exact calculation of such probabilities is only possible for a few very simple and therefore mostly unrealistic risk models, and since e.g. Esscher's method is not always very easy to apply in practice, it might be worthwhile to describe in the following an alternative approach using results and techniques from Reliability Theory in order to establish bounds for unknown loss probabilities. It would have been impossible for me to write this paper without having had the opportunity of numerous discussions with the Reliability experts R. Barlow and F. Proschan while I was at Stanford University. In particular, I was told the elegant proof of Theorem 3 given below by R. Barlow recently.
Some aspects on reinsurance profits and loadings
Benktander G.
Q1
Cambridge University Press
ASTIN Bulletin, 2014, citations: 4, doi.org, Abstract
In a note on the security loading of excess loss rates I deduce a simple formula intended to replace some tedious calculations. In the beginning of that note I made the point that some authors recommend a loading proportional to the dispersion δ1 of the total claims amount of a treaty, while others tend to favour . I also stated that a loading proportional to δ1 or its estimate δ1* could be deduced from the statistical uncertainty in measuring the risk (section 4). The question has been raised if and to what extent a loading system based on the dispersion is unduly punishing the smaller portfolios. This will be examined below. The pricing concept will be analyzed from the point of view of a big dominating reinsurer who wants to be fair in all directions. The conclusion of this study supports an affirmative answer to the question put above. In a second part, the loading is studied from a different angle, bringing competition into the picture. The pricing or loading becomes a problem of operations research under the simplified assumption that profit is the only purpose of our activity. Not unexpectedly, the loading coming out from this aspect differs from those of part one. Part two also deals with the question of how much of the loadings which we are aiming for gets lost in the competitive process. It is also shown that in most cases the harder the competition is, the higher the loadings that shall be used. Part one and part two thus deal with the loading problem from different aspects, and illustrate the complexity of the problem. It is my hope that this note could stimulate further research in this interesting and important area, also at a moment when some reinsurers are more concerned with the question of surviving than with fixing the loadings which should on the average and in the long run turn up as profits.
Estimation of the number of excess claims by means of the credibility theory
Straub E.
Q1
Cambridge University Press
ASTIN Bulletin, 2014, citations: 2, doi.org, Abstract
The purpose of this paper is to describe a possible application to the rating of excess of loss covers of Mr. Bühlmann's work on Experience Rating and Credibility. One of the most important problems in connection with the rating of such treaties consists in estimating the number of excess claims and the average excess claim amount. Especially in cases where claims data are scarce, there is a temptation to estimate these two quantities by means of the credibility theory. This approach leads, on the one hand, to a relatively complicated formula when considering the average excess claim amount, and, on the other, to a rather simple one for the credibility factor of the number of excess claims.
The Story Of 100 Actuarially Guaranteed No-Ruin Casualty Insurance Companies
Seal H.L.
Q1
Cambridge University Press
ASTIN Bulletin, 2014, citations: 0, doi.org, Abstract
“Most people think that an insurance company's business is to make money out of insuring things. They are wrong. Its business is to take as much money off the public as possible, invest it successfully and hope that the conditions on which it was taken never happen.” The Economist, April 13, 1974 (p. 119)
In order to motivate the series of Monte Carlo simulations we have carried out in the following article, we would like readers to imagine that a small rural casualty insurance company, the Farm Fire and Flood Damage Ins. Co. (FFFDIC), is to be bought by an entrepreneur (whom we shall designate by EP) provided his consulting actuary (the author of this article) can satisfy his requirements, which are as follows:
(i) A 15-year investment is foreseen, at the end of which time EP wishes to be able to sell, hopefully without loss.
(ii) The risk-capital is to be invested and (although some of it must be in easily liquidable securities) should yield a rate of return comparable with that obtainable on the same amount of capital invested in the market.
(iii) The premiums will not have risk-loadings, as such, but will be loaded for profit by 15%.
(iv) The risk-capital should, on the average, be returnable at the end of the 15-year investment.
AN INDUSTRY QUESTION: THE ULTIMATE AND ONE-YEAR RESERVING UNCERTAINTY FOR DIFFERENT NON-LIFE RESERVING METHODOLOGIES
Dal Moro E., Lo J.
Q1
Cambridge University Press
ASTIN Bulletin, 2014, citations: 5, doi.org, Abstract
Abstract In the industry, generally, reserving actuaries use a mix of reserving methods to derive their best estimates. On the basis of the best estimate, Solvency II requires the use of a one-year volatility of the reserves. When internal models are used, such one-year volatility has to be provided by the reserving actuaries. Due to the lack of closed-form formulas for the one-year volatility of Bornhuetter-Ferguson, Cape-Cod and Benktander-Hovinen, reserving actuaries have limited possibilities to estimate such volatility apart from scaling from tractable models, which are based on other reserving methods. However, such scaling is technically difficult to justify cleanly and awkward to interact with. The challenge described in this editorial is therefore to come up with models similar to those of Mack or Merz-Wüthrich for the chain ladder, but applicable to Bornhuetter-Ferguson, a mix of Chain-Ladder and Bornhuetter-Ferguson, potentially Cape-Cod and Benktander-Hovinen, and their mixtures.