# zbMATH — the first resource for mathematics

Incremental concept learning for bounded data mining. (English) Zbl 1045.68572
Summary: Important refinements of concept learning in the limit from positive data, which considerably restrict the accessibility of input data, are studied. Let $$c$$ be any concept; every infinite sequence of elements exhausting $$c$$ is called a positive presentation of $$c$$. In all the learning models considered, the learning machine computes a sequence of hypotheses about the target concept from a positive presentation of it. With iterative learning, the learning machine, in making a conjecture, has access to its previous conjecture and the latest data item coming in. In $$k$$-bounded example-memory inference (where $$k$$ is a priori fixed), the learner is allowed to access, in making a conjecture, its previous hypothesis, its memory of up to $$k$$ data items it has already seen, and the next element coming in.
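The two incremental models just described can be sketched as simple driver loops. Below is a minimal illustration in Python; the function names `iterative_learn` and `bounded_memory_learn` and the toy update rules are hypothetical, not from the paper. An iterative learner conjectures from only its previous hypothesis and the current item, while a $$k$$-bounded example-memory learner additionally maintains at most $$k$$ stored examples.

```python
def iterative_learn(update, h0, presentation):
    """Iterative learning: each conjecture depends only on the
    previous conjecture and the latest data item."""
    h = h0
    conjectures = []
    for x in presentation:
        h = update(h, x)
        conjectures.append(h)
    return conjectures


def bounded_memory_learn(update, h0, k, presentation):
    """k-bounded example-memory inference: the learner may also consult
    and rewrite a memory holding at most k previously seen items."""
    h, mem = h0, []
    conjectures = []
    for x in presentation:
        h, mem = update(h, mem, x)
        assert len(mem) <= k, "example-memory bound exceeded"
        conjectures.append(h)
    return conjectures
```

For example, `iterative_learn(lambda h, x: max(h, x), 0, data)` converges on the largest element presented so far, while a 2-bounded example-memory learner can additionally track the two largest items seen.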
In the case of $$k$$-feedback identification, the learning machine, in making a conjecture, has access to its previous conjecture and the latest data item coming in; on the basis of this information, it can compute $$k$$ items and query the database of previous data to find out, for each of the $$k$$ items, whether or not it is in the database (again, $$k$$ is a priori fixed). In all cases, the sequence of conjectures has to converge to a hypothesis correctly describing the target concept. Our results are manifold. An infinite hierarchy of increasingly powerful feedback learners, depending on the number $$k$$ of queries allowed to be asked, is established.
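A $$k$$-feedback learner can be sketched analogously (a sketch only; `feedback_learn` and the counting example are hypothetical): from the current hypothesis and data item it computes up to $$k$$ query items and asks, for each, whether it already occurred in the presentation.

```python
def feedback_learn(update, queries, h0, k, presentation):
    """k-feedback identification: before conjecturing, the learner
    computes up to k items from (h, x) and queries the database of
    previously seen data for each of them."""
    h = h0
    seen = set()  # the "database" of previous data items
    conjectures = []
    for x in presentation:
        qs = queries(h, x)[:k]             # at most k membership queries
        answers = [q in seen for q in qs]
        seen.add(x)                        # x joins the database afterwards
        h = update(h, x, answers)
        conjectures.append(h)
    return conjectures
```

A 1-feedback learner can, for instance, count distinct items by querying whether the current item was seen before, which illustrates the extra power that even a single query can provide over a purely iterative learner.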
However, the hierarchy collapses to 1-feedback inference if only indexed families of infinite concepts are considered, and, moreover, its learning power is then equal to that of learning in the limit. It remains infinite, though, for classes containing only infinite r.e. concepts. Both $$k$$-feedback inference and $$k$$-bounded example-memory identification are more powerful than iterative learning but incomparable to one another. Furthermore, there are cases where redundancy in the hypothesis space is shown to be a resource that increases the learning power of iterative learners. Finally, the union of at most $$k$$ pattern languages is shown to be iteratively inferable.
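For concreteness on the final result: a pattern (in Angluin's sense) is a string of constants and variables, and its language consists of all strings obtained by substituting nonempty constant strings for the variables, with equal variables receiving equal substitutions. The following naive backtracking membership test is illustrative only (`matches` is not from the paper, and general pattern-language membership is NP-complete):

```python
def matches(pattern, word):
    """Does `word` belong to the language of `pattern`?  Variables are
    uppercase letters; substitutions must be nonempty and consistent.
    Brute-force backtracking, suitable for small inputs only."""
    def go(p, w, subst):
        if not p:
            return not w
        c = p[0]
        if c.isupper():                      # variable position
            if c in subst:                   # already bound: must reuse
                s = subst[c]
                return w.startswith(s) and go(p[1:], w[len(s):], subst)
            for i in range(1, len(w) + 1):   # try every nonempty prefix
                subst[c] = w[:i]
                if go(p[1:], w[i:], subst):
                    return True
                del subst[c]
            return False
        return bool(w) and w[0] == c and go(p[1:], w[1:], subst)
    return go(pattern, word, {})
```

Membership in a union of at most $$k$$ pattern languages then reduces to testing each of the $$k$$ patterns in turn.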

##### MSC:
- 68Q32 Computational learning theory
- 68T05 Learning and adaptive systems in artificial intelligence
##### Keywords:
learning machine