Multivariate time series analysis from a Bayesian machine learning perspective. (English) Zbl 07342710
Summary: In this paper, we perform multivariate time series analysis from a Bayesian machine learning perspective through the proposed multivariate Bayesian time series (MBTS) model. The multivariate structure and the Bayesian framework allow the model to exploit the association structure among the target series, select important features, and train the data-driven model simultaneously. Extensive analyses on both simulated and empirical data indicate that the MBTS model covers the true values of the regression coefficients in \(90\%\) credible intervals, selects the most important predictors, and improves prediction accuracy as the absolute correlation among the target series increases; it consistently outperforms the univariate Bayesian structural time series (BSTS) model, the autoregressive integrated moving average with regression (ARIMAX) model, and the multivariate ARIMAX (MARIMAX) model in both one-step-ahead and ten-step-ahead forecasts.
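As a point of reference, the univariate BSTS baseline in the sense of Scott and Varian combines a structural (state-space) time series with a regression component and spike-and-slab variable selection. A minimal sketch of such a model with a local linear trend and static regressors is given below; it only illustrates the type of model the MBTS generalizes to several target series, and the exact multivariate coupling and component choices of the paper are not reproduced here:
\[
\begin{aligned}
y_t &= \mu_t + \boldsymbol{\beta}^{\top}\mathbf{x}_t + \varepsilon_t, & \varepsilon_t &\sim \mathcal{N}(0,\sigma_\varepsilon^2),\\
\mu_{t+1} &= \mu_t + \delta_t + u_t, & u_t &\sim \mathcal{N}(0,\sigma_u^2),\\
\delta_{t+1} &= \delta_t + v_t, & v_t &\sim \mathcal{N}(0,\sigma_v^2),
\end{aligned}
\]
where \(\mu_t\) is the local level, \(\delta_t\) the local slope, and a spike-and-slab prior on \(\boldsymbol{\beta}\) performs the feature selection; the posterior of \(\boldsymbol{\beta}\) yields credible intervals of the kind referred to in the summary.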
MSC:
62H12 Estimation in multivariate analysis
62M10 Time series, auto-correlation, regression, etc. in statistics (GARCH)
62F15 Bayesian inference
62F07 Statistical ranking and selection procedures
68T05 Learning and adaptive systems in artificial intelligence