Chernobai, Anna and Burnecki, Krzysztof and Rachev, Svetlozar and Trueck, Stefan and Weron, Rafal

Modelling Catastrophe Claims with Left-Truncated Severity Distributions (Extended Version)

SSRN Electronic Journal, 2000

In this paper, we present a procedure for consistent estimation of the severity and frequency distributions based on incomplete insurance data and demonstrate that ignoring the thresholds leads to a serious underestimation of the ruin probabilities. The event frequency is modelled with a non-homogeneous Poisson process with a sinusoidal intensity rate function. The choice of an adequate loss distribution is conducted via the in-sample goodness-of-fit procedures and forecasting, using classical and robust methodologies.

Modelling catastrophe claims with left-truncated severity distributions

Computational Statistics, 2006

In this paper, we present a procedure for consistent estimation of the severity and frequency distributions based on incomplete insurance data and demonstrate that ignoring the thresholds leads to a serious underestimation of the ruin probabilities. The event frequency is modelled with a non-homogeneous Poisson process with a sinusoidal intensity rate function. The choice of an adequate loss distribution is conducted via the in-sample goodness-of-fit procedures and forecasting, using classical and robust methodologies.
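The sinusoidal-intensity frequency model described above can be simulated by Lewis-Shedler thinning: propose candidate times from a homogeneous Poisson process at the peak rate and accept each with probability proportional to the current intensity. A minimal sketch; the intensity parameters `a`, `b` and `period` are illustrative assumptions, not the paper's fitted values.

```python
import numpy as np

def simulate_nhpp(T, lam_fun, lam_max, seed=0):
    """Simulate event times of a non-homogeneous Poisson process on [0, T]
    by Lewis-Shedler thinning: draw candidate times from a homogeneous
    process with rate lam_max and keep each with prob lam_fun(t) / lam_max."""
    rng = np.random.default_rng(seed)
    t, events = 0.0, []
    while True:
        t += rng.exponential(1.0 / lam_max)
        if t > T:
            return np.array(events)
        if rng.uniform() < lam_fun(t) / lam_max:
            events.append(t)

# sinusoidal annual intensity (illustrative parameters, not the paper's fit)
a, b, period = 30.0, 10.0, 1.0
lam = lambda t: a + b * np.sin(2.0 * np.pi * t / period)
times = simulate_nhpp(10.0, lam, lam_max=a + b, seed=42)
```

Over ten full periods the sine term integrates to zero, so the expected event count is simply `a * 10`.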

Modeling catastrophe claims with left-truncated severity distributions (extended version)

2005

In this paper, we present a procedure for consistent estimation of the severity and frequency distributions based on incomplete insurance data and demonstrate that ignoring the thresholds leads to a serious underestimation of the ruin probabilities. The event frequency is modelled with a non-homogeneous Poisson process with a sinusoidal intensity rate function. The choice of an adequate loss distribution is conducted via the in-sample goodness-of-fit procedures and forecasting, using classical and robust methodologies.

Modeling of Claim Severity through the Mixture of Exponential Distribution and Computation of its Probability of Ultimate Ruin

Thailand Statistician, 2017

In this paper we discuss infinite-time ruin probabilities in continuous time for a compound Poisson process with a constant premium rate and mixture-of-exponentials claims. First, we fit mixtures of two and of three exponentials to a set of claim data; we then compute the probability of ultimate ruin, first through a method giving its exact expression and then through a numerical method, namely product integration. The exact expression for the ultimate ruin probability for the mixtures of two and three exponentials is derived through the moment generating function of the maximal aggregate loss random variable. The values of the ultimate ruin probability obtained by the two methods are consistent.
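A quick way to cross-check exact values of this kind is Monte Carlo via the Pollaczeck-Khinchine representation, under which the ladder heights of a mixture-of-exponentials claim law form another mixture of the same exponentials. A minimal sketch, not the paper's method; the function name and parameters are placeholders.

```python
import numpy as np

def ruin_prob_mixture(u, lam, c, probs, rates, n_sim=100_000, seed=0):
    """Monte Carlo estimate of the ultimate ruin probability psi(u) for a
    compound Poisson risk process with premium rate c, claim intensity lam
    and mixture-of-exponentials claim sizes (weights probs, rates rates).

    Pollaczeck-Khinchine: psi(u) = P(L > u), where L is a geometric sum of
    ladder heights drawn from the equilibrium (integrated-tail) law, itself
    a mixture of the same exponentials with weights ~ probs / rates."""
    rng = np.random.default_rng(seed)
    probs, rates = np.asarray(probs, float), np.asarray(rates, float)
    mean_claim = np.sum(probs / rates)
    rho = lam * mean_claim / c
    assert rho < 1, "net profit condition requires lam * E[X] < c"
    q = (probs / rates) / mean_claim      # equilibrium mixture weights
    # number of ladder heights: P(N = n) = (1 - rho) * rho**n, n = 0, 1, ...
    n_ladders = rng.geometric(1.0 - rho, size=n_sim) - 1
    ruined = 0
    for n in n_ladders:
        if n == 0:
            continue
        comp = rng.choice(rates.size, size=n, p=q)
        if rng.exponential(1.0 / rates[comp]).sum() > u:
            ruined += 1
    return ruined / n_sim

# sanity check against the closed form for a single exponential claim law:
# psi(u) = rho * exp(-(beta - lam / c) * u)
est = ruin_prob_mixture(1.0, lam=1.0, c=2.0, probs=[1.0], rates=[1.0])
```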

Modeling the Frequency and Severity of Auto Insurance Claims Using Statistical Distributions

Journal of Mathematical Finance, 2018

Claims experience in non-life insurance is contingent on the random eventualities of claim frequency and claim severity. By design, a single policy may incur more than one claim, so the total number of claims as well as the total size of claims due on any given portfolio is unpredictable. For insurers to be able to settle claims that may occur from existing portfolios of policies at some future time, it is imperative that they adequately model historical and current data on claims experience; this can be used to project the expected future claims experience and to set sufficient reserves. Non-life insurance companies are often faced with two challenges when modeling claims data: selecting appropriate statistical distributions for the claims data and establishing how well the selected distributions fit them. Accurate evaluation of claim frequency and claim severity plays a critical role in determining an adequate premium loading factor, required reserve levels, product profitability and the impact of policy modifications. Whilst the assessment of insurers' actuarial risks in respect of their solvency status is a complex process, the first step toward a solution is the modeling of individual claim frequency and severity. This paper presents a methodical framework for choosing a suitable probability model that best describes automobile claim frequency and loss severity, as well as their application in risk management. Selected statistical distributions are fitted to historical automobile claims data and parameters are estimated using the maximum likelihood method. The Chi-square test is used to check the goodness of fit for claim frequency distributions, whereas the Kolmogorov-Smirnov and Anderson-Darling tests are applied to claim severity distributions. The Akaike information criterion (AIC) is used to choose between competing distributions.
Empirical results indicate that claim severity data is better modeled using heavy-tailed and skewed distributions. The lognormal distribution is selected as the best distribution to model the claim size, while the negative binomial and geometric distributions are selected for the claim frequency.
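The severity side of the selection pipeline described above (MLE fit, Kolmogorov-Smirnov check, AIC ranking) can be sketched as follows; the candidate set and the synthetic data are illustrative assumptions, not the paper's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# hypothetical claim severities; a lognormal ground truth stands in for real data
claims = rng.lognormal(mean=8.0, sigma=1.2, size=500)

candidates = {"lognorm": stats.lognorm, "gamma": stats.gamma, "expon": stats.expon}
results = {}
for name, dist in candidates.items():
    params = dist.fit(claims)                   # maximum likelihood fit
    loglik = np.sum(dist.logpdf(claims, *params))
    aic = 2 * len(params) - 2 * loglik          # Akaike information criterion
    ks = stats.kstest(claims, dist.cdf, args=params)
    results[name] = (aic, ks.pvalue)

best = min(results, key=lambda n: results[n][0])  # lowest AIC wins
```

Because the data here are lognormal by construction, the lognormal candidate should win the AIC comparison; on real claims the ranking is an empirical question.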

Natural Catastrophe Models for Insurance Risk Management

2019

Catastrophic events are characterized by three main features: they are relatively rare, they are statistically unexpected, and they have a huge impact on society as a whole. Insurance or reinsurance is one way of reducing the economic consequences of catastrophic events. The risk management of insurance and reinsurance companies must have relevant information available for estimating and adjusting the premiums that cover these risks. The aim of this article is to present two useful methods: the block maxima method and the peaks-over-threshold method. These methods use information from historical data on insured losses from natural catastrophes to estimate future insured losses. These estimates are very important for actuaries and risk managers as one of the bases for calculating and adjusting premiums of products covering these types of risks.
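Both methods can be sketched with standard tools. The loss series below is synthetic stand-in data, and the block size and threshold are illustrative choices, not recommendations from the article.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# synthetic heavy-tailed daily loss index: 20 "years" of 365 observations
losses = rng.pareto(2.5, size=20 * 365) * 1e6

# --- block maxima: fit a GEV distribution to the annual maxima ---
annual_max = losses.reshape(20, 365).max(axis=1)
shape, loc, scale = stats.genextreme.fit(annual_max)
# 1-in-100-year loss = 99th percentile of the fitted annual-maximum law
ret_level_100 = stats.genextreme.ppf(0.99, shape, loc=loc, scale=scale)

# --- peaks over threshold: fit a GPD to exceedances over a high threshold ---
u = np.quantile(losses, 0.99)          # threshold choice is a modelling decision
exceedances = losses[losses > u] - u
xi, _, beta = stats.genpareto.fit(exceedances, floc=0.0)
# xi > 0 indicates a heavy (Pareto-type) tail
```

With only 20 block maxima the GEV fit is noisy; in practice the threshold for the GPD is chosen with diagnostics such as mean-excess plots.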

A new approach to modelling claims due to natural hazards

United Nations International Strategy for Disaster Reduction defines the risk of a natural disaster as "a potentially damaging phenomenon that may lead to loss of life or injury, property damage, social and economic disruption or environmental degradation". Each hazard is characterized by location, intensity, frequency and probability. It is interesting to study the inter-arrival time between two disasters in a vulnerable geographic area. In this article, a new approach to modelling the inter-arrival time between two disasters, based on the Stoynov distribution and process, is considered. MSC 2010: 60G51, 97K60

Modeling Frequency and Severity of Insurance Claims in an Insurance Portfolio

American Journal of Applied Mathematics and Statistics, 2020

Premium pricing is always a challenging task in general insurance, and the frequency of insurance claims plays a major role in the pricing of premiums. Severity in insurance, on the other hand, can be either the amount paid due to a loss or the size of the loss event. For insurers to be in a position to settle claims that occur from existing portfolios of policies in the future, it is necessary that they adequately model past and current data on claims experience and then use the models to project the expected future claim amounts. In addition, non-life insurance companies face two problems when modeling claims data: selecting an appropriate statistical distribution and establishing how well it fits the claims data. Therefore, the study presents a framework for choosing the most suitable probability distribution and fitting it to past motor claims data, with the parameters estimated using maximum likelihood estimation (MLE). The goodness of fit of frequency...
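The frequency side of such a framework often reduces to a Poisson-versus-negative-binomial comparison. A minimal sketch on synthetic counts; the data and the method-of-moments shortcut for the negative binomial are assumptions of this illustration, not the study's procedure.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# hypothetical annual claim counts; overdispersed stand-in for portfolio data
counts = rng.negative_binomial(n=2.0, p=0.4, size=2000)

mean, var = counts.mean(), counts.var(ddof=1)
dispersion = var / mean        # > 1 signals overdispersion relative to Poisson

# Poisson: the MLE of the rate is the sample mean (1 parameter)
ll_pois = stats.poisson.logpmf(counts, mean).sum()
aic_pois = 2 * 1 - 2 * ll_pois

# negative binomial via method of moments (2 parameters); scipy's discrete
# distributions have no .fit method, so MoM stands in for MLE here
p_hat = mean / var
n_hat = mean**2 / (var - mean)
ll_nb = stats.nbinom.logpmf(counts, n_hat, p_hat).sum()
aic_nb = 2 * 2 - 2 * ll_nb
```

Because the synthetic counts are overdispersed by construction, the negative binomial should achieve the lower AIC.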

Nonparametric Estimation of the Ruin Probability in the Classical Compound Poisson Risk Model

2020

In this paper we study the estimation of the ruin probability, an important problem in insurance. Our work builds on the existing nonparametric estimation method for the ruin probability in the classical risk model, which employs the Fourier transform but requires smoothing of the claim-size density. We propose a nonparametric estimation approach that involves no smoothing and is therefore free of the bandwidth choice. Compared with the Fourier-transform-based estimators, our estimators have simpler forms and are easier to calculate. We establish the asymptotic distributions of our estimators, which allows us to consistently estimate their asymptotic variances with the plug-in principle and enables interval estimates of the ruin probability.
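A smoothing-free plug-in in the same spirit (though not the authors' estimator itself) substitutes the empirical claim distribution into the Pollaczeck-Khinchine formula: a ladder height from the empirical equilibrium distribution can be drawn exactly by size-biased resampling of a claim followed by a uniform fraction of it. The function name and parameters below are placeholders for illustration.

```python
import numpy as np

def np_ruin_pk(u, claims, lam, c, n_sim=100_000, seed=0):
    """Plug-in estimate of psi(u): Pollaczeck-Khinchine with the empirical
    claim-size distribution, so no density smoothing or bandwidth is needed.
    A ladder height with the empirical equilibrium law is drawn by picking a
    claim with probability proportional to its size, then taking a uniform
    fraction of it."""
    rng = np.random.default_rng(seed)
    claims = np.asarray(claims, float)
    rho = lam * claims.mean() / c
    assert rho < 1, "net profit condition requires lam * E[X] < c"
    n_ladders = rng.geometric(1.0 - rho, size=n_sim) - 1
    w = claims / claims.sum()             # size-biased sampling weights
    ruined = 0
    for n in n_ladders:
        if n == 0:
            continue
        picks = rng.choice(claims, size=n, p=w)
        if (rng.uniform(size=n) * picks).sum() > u:
            ruined += 1
    return ruined / n_sim

# demo on synthetic Exp(1) claims, where psi(u) = rho * exp(-(1 - rho) * u)
demo_claims = np.random.default_rng(7).exponential(1.0, size=5000)
est = np_ruin_pk(1.0, demo_claims, lam=1.0, c=2.0)
```

The only inputs are the observed claims and the model constants, so the estimator inherits the bandwidth-free character the abstract emphasizes, at the cost of Monte Carlo error.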