
Gamma mixture density networks and their application to modelling insurance claim amounts. (English) Zbl 1475.91294

Summary: We discuss how mixtures of Gamma distributions, with mixing probabilities, shape and rate parameters all depending on features, can be fitted with neural networks. We develop two versions of the EM algorithm for fitting so-called Gamma mixture density networks, which we call the EM network boosting algorithm and the EM forward network algorithm, and we test their implementation together with the choice of hyperparameters. A simulation study shows that our algorithms perform very well on synthetic data sets. We further illustrate the application of the Gamma mixture density network on a real data set of motor insurance claim amounts and conclude that, compared to classical actuarial techniques, Gamma mixture density networks can improve both the fit of the regression model and the predictions of the claim severities used for rate-making.
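The building blocks behind such a model can be illustrated with a short sketch. The code below is not the authors' implementation; it is a minimal, hedged illustration of the two quantities any EM-type fit of a Gamma mixture relies on: the mixture log-likelihood (computed with log-sum-exp for numerical stability) and the E-step posterior component probabilities ("responsibilities"). The rate parameterization of the Gamma density and all function names are our own assumptions; in the paper the mixing probabilities, shapes and rates would be outputs of neural networks evaluated at the feature vector.

```python
import math

def gamma_logpdf(y, alpha, beta):
    # Log density of Gamma(shape=alpha, rate=beta) at y > 0:
    # log f(y) = alpha*log(beta) - lgamma(alpha) + (alpha-1)*log(y) - beta*y
    return (alpha * math.log(beta) - math.lgamma(alpha)
            + (alpha - 1.0) * math.log(y) - beta * y)

def mixture_loglik(y, pis, alphas, betas):
    # log f(y) = log sum_k pi_k * Gamma(y; alpha_k, beta_k), via log-sum-exp.
    logs = [math.log(p) + gamma_logpdf(y, a, b)
            for p, a, b in zip(pis, alphas, betas)]
    m = max(logs)
    return m + math.log(sum(math.exp(l - m) for l in logs))

def responsibilities(y, pis, alphas, betas):
    # E-step: posterior probability that observation y was generated
    # by component k, proportional to pi_k * Gamma(y; alpha_k, beta_k).
    logs = [math.log(p) + gamma_logpdf(y, a, b)
            for p, a, b in zip(pis, alphas, betas)]
    m = max(logs)
    w = [math.exp(l - m) for l in logs]
    s = sum(w)
    return [x / s for x in w]
```

In an EM network fit, the responsibilities computed in the E-step become the weights of the component-wise (neural network) regression problems solved in the M-step.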

MSC:

91G05 Actuarial mathematics

References:

[1] Abramowitz, M.; Stegun, I. A., Handbook of Mathematical Functions with Formulas, Graphs, and Mathematical Tables (1972), National Bureau of Standards, Applied Mathematics Series · Zbl 0543.33001
[2] Badescu, A. L.; Lin, X. S.; Tang, D.; Valdez, E. A., Multivariate Pascal mixture regression models for correlated claim frequencies (2015)
[3] Bermúdez, L.; Karlis, D., A finite mixture of bivariate Poisson regression models with an application to insurance ratemaking, Computational Statistics & Data Analysis, 56, 3988-3999 (2012) · Zbl 1254.91264
[4] Bishop, C., Mixture density networks (1994)
[5] Bishop, C., Pattern Recognition and Machine Learning (2006), Springer Science+Business Media · Zbl 1107.68072
[6] Davis, C. N.; Hollingsworth, T. D.; Caudron, Q.; Irvine, M. A., The use of mixture density networks in the emulation of complex epidemiological individual-based models, PLoS Computational Biology, 16, 1-16 (2020)
[7] Blostein, M.; Miljkovic, T., On modeling left-truncated loss data using mixtures of distributions, Insurance. Mathematics & Economics, 85, 35-46 (2019) · Zbl 1415.62076
[8] Bonnett, C., Mixture density networks for galaxy distance determination in TensorFlow (2016)
[9] CASdataset, R package by C. Dutang and A. Charpentier (2020)
[10] Delong, Ł.; Szatkowski, M., One-year and ultimate reserve risk in Mack Chain Ladder model (2021)
[11] Dozat, T., Incorporating Nesterov momentum into Adam, (ICLR Workshop 1 (2016)), 2013-2016
[12] Dunn, P. K.; Smyth, G. K., Randomized quantile residuals, Journal of Computational and Graphical Statistics, 5, 236-244 (1996)
[13] Fung, T. C.; Badescu, A.; Lin, X. S., A class of mixture of expert models for general insurance: theoretical developments, Insurance. Mathematics & Economics, 89, 111-127 (2019) · Zbl 1427.91228
[14] Fung, T. C.; Badescu, A.; Lin, X. S., A class of mixture of expert models for general insurance: application to correlated claim frequencies, ASTIN Bulletin, 49, 647-688 (2019) · Zbl 1427.91227
[15] Fung, T. C.; Tzougas, G.; Wüthrich, M. V., Mixture composite regression models with multi-type feature selection (2021)
[16] Glorot, X.; Bengio, Y., Understanding the difficulty of training deep feedforward neural networks, (Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics. JMLR Workshop and Conference Proceedings (2010))
[17] Grün, B.; Leisch, F., Finite mixtures of generalized linear regression models, (Shalabh, S.; Heumann, C., Recent Advances in Linear Models and Related Areas: Essays in Honour of Helge Toutenburg (2008), Springer Verlag)
[18] Gui, W.; Huang, R.; Lin, X. S., Fitting the Erlang mixture model to data via a GEM-CMM algorithm, Journal of Computational and Applied Mathematics, 343, 189-205 (2018) · Zbl 06892263
[19] Guo, C.; Berkhahn, F., Entity embeddings of categorical variables (2016)
[20] Hu, S.; Murphy, T. B.; O’Hagan, A., Bivariate Gamma mixture of experts models for joint insurance claims modeling (2019)
[21] Jacobs, R. A.; Jordan, M. I.; Nowlan, S. J.; Hinton, G. E., Adaptive mixtures of local experts, Neural Computation, 3, 79-87 (1991)
[22] Jordan, M. I.; Jacobs, R. A., Hierarchical mixtures of experts and the EM algorithm, Neural Computation, 6, 181-214 (1994)
[23] Jørgensen, B., Exponential dispersion models, Journal of the Royal Statistical Society, Series B, Methodological, 49, 127-162 (1987) · Zbl 0662.62078
[24] Lee, S. C.; Lin, X. S., Modeling and evaluating insurance losses via mixtures of Erlang distributions, North American Actuarial Journal, 14, 107-130 (2010)
[25] McLachlan, G. J.; Krishnan, T., The EM Algorithm and Extensions (2008), Wiley
[26] Miljkovic, T.; Fernández, D., On two mixture-based clustering approaches used in modeling an insurance portfolio, Risks, 6, 57-75 (2018)
[27] Miljkovic, T.; Grün, B., Modeling loss data using mixtures of distributions, Insurance. Mathematics & Economics, 70, 387-396 (2016) · Zbl 1373.62527
[28] Parodi, P., A generalised property exposure rating framework that incorporates scale-independent losses and maximum possible loss uncertainty, ASTIN Bulletin, 50, 513-553 (2020) · Zbl 1447.91144
[29] Počuča, N.; Jevtić, P.; McNicholas, P. D.; Miljkovic, T., Modeling frequency and severity of claims with the zero-inflated generalized cluster-weighted models, Insurance. Mathematics & Economics, 94, 79-93 (2020) · Zbl 1452.91280
[30] Tseung, S. C.; Badescu, A.; Fung, T. C.; Lin, X. S., LRMoE: an R package for flexible actuarial loss modelling using mixture of experts regression model (2021)
[31] Tzougas, G.; Karlis, D., An EM algorithm for fitting a new class of mixed exponential regression models with varying dispersion, ASTIN Bulletin, 50, 555-583 (2020) · Zbl 1447.91149
[32] Wedel, M.; DeSarbo, W. S., A mixture likelihood approach for generalized linear models, Journal of Classification, 12, 21-55 (1995) · Zbl 0825.62611
[33] Venturini, S.; Dominici, F.; Parmigiani, G., Gamma shape mixtures for heavy-tailed distributions, The Annals of Applied Statistics, 2, 756-776 (2008) · Zbl 1400.62292
[34] Verbelen, R.; Gong, L.; Antonio, K.; Badescu, A., Fitting mixtures of Erlangs to censored and truncated data using the EM algorithm, ASTIN Bulletin, 45, 729-758 (2015) · Zbl 1390.62227
[35] Yin, C.; Lin, X. S., Efficient estimation of Erlang mixtures using iSCAD penalty with insurance application, ASTIN Bulletin, 46, 779-799 (2016) · Zbl 1390.62030
[36] Young, D. S.; Chen, X.; Hewage, D. C.; Nilo-Poyanco, R., Finite mixture-of-gamma distributions: estimation, inference, and model-based clustering, Advances in Data Analysis and Classification, 13, 1053-1082 (2019) · Zbl 1474.62043