Abstract
Computability-theoretic learning theory (machine inductive inference) typically concerns learning programs for languages or functions from a stream of complete data about them and, importantly, allows the learner to change its mind about its conjectured programs. This theory takes algorithmicity into account but typically does not take into account the feasibility of computational resources. This paper provides example results and open problems for three ways the theory can be constrained by computational feasibility: the learner has memory limitations, the learned programs are required to be optimal, and there are feasibility constraints both on obtaining each output program and on the number of mind changes.
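The learning model sketched in the abstract (Gold-style identification in the limit) can be illustrated informally: a learner sees the graph of a target function one value at a time and, after each datum, conjectures a hypothesis; on any target it can learn, the conjectures converge after finitely many mind changes. The following toy sketch, with a hypothetical finite hypothesis class, is an illustration only and not the paper's formal model (which conjectures programs over all computable functions).

```python
# Toy illustration of learning in the limit with mind changes.
# The hypothesis class below is a hypothetical stand-in; the actual
# theory ranges over programs for arbitrary computable functions.

HYPOTHESES = [            # assumed finite hypothesis class
    lambda x: 0,          # h0: constantly zero
    lambda x: x,          # h1: identity
    lambda x: x * x,      # h2: squaring
]

def learner(data):
    """Conjecture the index of the first hypothesis consistent with
    all (x, f(x)) pairs seen so far; None if none is consistent."""
    for i, h in enumerate(HYPOTHESES):
        if all(h(x) == y for x, y in data):
            return i
    return None

def run(target, n):
    """Feed n data points of `target` to the learner, recording the
    conjecture made after each datum."""
    data, conjectures = [], []
    for x in range(n):
        data.append((x, target(x)))
        conjectures.append(learner(data))
    return conjectures
```

For the squaring target, `run(lambda x: x * x, 5)` yields the conjecture sequence `[0, 1, 2, 2, 2]`: two mind changes, then convergence to the correct hypothesis. The memory-limited and mind-change-bounded settings discussed in the paper restrict, respectively, how much of `data` the learner may retain and how often the conjecture may change.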
Copyright information
© 2007 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Case, J. (2007). Resource Restricted Computability Theoretic Learning: Illustrative Topics and Problems. In: Cooper, S.B., Löwe, B., Sorbi, A. (eds) Computation and Logic in the Real World. CiE 2007. Lecture Notes in Computer Science, vol 4497. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-73001-9_12
Print ISBN: 978-3-540-73000-2
Online ISBN: 978-3-540-73001-9