
Intrinsic dimension estimation: relevant techniques and a benchmark framework. (English) Zbl 1395.68244

Summary: When dealing with datasets comprising high-dimensional points, it is usually advantageous to discover some structure in the data. A fundamental piece of information needed for this aim is the minimum number of parameters required to describe the data while minimizing the information loss. This number, usually called the intrinsic dimension, can be interpreted as the dimension of the manifold from which the input data are assumed to be drawn. Owing to its usefulness in many theoretical and practical problems, over the last decades the concept of intrinsic dimension has gained considerable attention in the scientific community, motivating the large number of intrinsic dimensionality estimators proposed in the literature. However, the problem remains open, since most techniques cannot efficiently deal with datasets drawn from manifolds of high intrinsic dimension that are nonlinearly embedded in higher-dimensional spaces. This paper surveys some of the most interesting, widely used, and advanced state-of-the-art methodologies. Unfortunately, since no benchmark database exists in this research field, an objective comparison among different techniques is not possible. Consequently, we suggest a benchmark framework and apply it to comparatively evaluate relevant state-of-the-art estimators.
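As an illustration of the kind of estimator surveyed here, the sketch below implements one classical nearest-neighbor approach, the Levina-Bickel maximum-likelihood estimator of intrinsic dimension, in plain NumPy. It is a minimal sketch under our own assumptions: the function name, the default neighborhood size `k`, and the brute-force distance computation are illustrative choices, not details taken from the paper.

```python
import numpy as np

def mle_intrinsic_dimension(X, k=10):
    """Levina-Bickel maximum-likelihood intrinsic dimension estimate.

    X : (n, D) array of points; k : number of nearest neighbors used.
    Returns the local ML estimates averaged over all points.
    """
    # Brute-force pairwise Euclidean distances (O(n^2) memory;
    # fine for small n, a k-d tree would be used in practice).
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    dist = np.sqrt(np.maximum(d2, 0.0))
    # Sort each row; column 0 is the point itself (distance 0), skip it.
    knn = np.sort(dist, axis=1)[:, 1:k + 1]        # (n, k) neighbor distances
    # Local estimate: m(x) = (k-1) / sum_j log(T_k(x) / T_j(x)), j = 1..k-1.
    logs = np.log(knn[:, -1:] / knn[:, :-1])       # (n, k-1)
    m_local = (k - 1) / np.sum(logs, axis=1)
    return float(np.mean(m_local))

rng = np.random.default_rng(0)
# A 2-D plane linearly embedded in R^5: the true intrinsic dimension is 2.
Z = rng.uniform(size=(1000, 2))
A = rng.normal(size=(2, 5))
X = Z @ A
print(mle_intrinsic_dimension(X, k=10))  # close to 2
```

Note that this simple average of local estimates is known to be biased for small `k`; bias-corrected variants and many alternative estimators (fractal, graph-theoretic, projection-based) are exactly what the surveyed benchmark framework is meant to compare.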

MSC:

68T10 Pattern recognition, speech recognition
94A12 Signal theory (characterization, reconstruction, filtering, etc.)
