Complexity and Information Theory

  • Aldo De Luca
Part of the International Centre for Mechanical Sciences book series (CISM, volume 216)


The concept of “information” appeared in physics in connection with the concept of “entropy”. It was observed (Boltzmann, 1896), in the framework of statistical thermodynamics, that the entropy is proportional to the logarithm of the number of alternatives (or microscopic states) that remain possible for a physical system once all the macroscopic information about it is known. The entropy gives, in other words, a measure of the total amount of missing information about the system.
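Boltzmann's observation can be written compactly. With W the number of microscopic states compatible with the macroscopic description, the standard rendering (not a formula taken from this chapter) is:

```latex
% Boltzmann's relation: entropy of a macrostate compatible with
% W equally probable microscopic states (k is Boltzmann's constant)
S = k \ln W .
% Read informationally: singling out the actual microstate among
% W equally likely alternatives requires
H = \log_2 W \quad \text{bits of missing information,}
% so S and H differ only by the constant factor k \ln 2.
```

The proportionality between entropy and missing information is thus just the change of logarithm base between thermodynamic and binary units.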


Keywords: Turing Machine · Complexity Measure · Binary String · Recursive Function · Computable Function




Algorithms and Computability

  1. [1] Church, A., An unsolvable problem of elementary number theory, Amer. J. of Math., 58, 345–363, 1936.
  2. [2] Church, A., “The Calculi of Lambda-Conversion”, Annals of Mathematics Studies, no. 6, Princeton University Press, Princeton, N.J., 1941.
  3. [3] Davis, M., “Computability and Unsolvability”, McGraw-Hill, New York, 1958.
  4. [4] Gödel, K., On undecidable propositions of formal mathematical systems, mimeographed lecture notes, Institute for Advanced Study, Princeton, N.J., 1934.
  5. [5] Kleene, S.C., General recursive functions of natural numbers, Mathematische Annalen, 112, 727–742, 1936.
  6. [6] Kleene, S.C., “Introduction to Metamathematics”, Van Nostrand Company, Inc., Princeton, N.J., 1952.
  7. [7] Markov, A.A., The theory of algorithms (Russian), Trudy Matematicheskogo Instituta imeni V.A. Steklova, 38, 176–189, 1951.
  8. [8] Markov, A.A., “Theory of Algorithms” (English transl.), Israel Program for Scientific Translations, Jerusalem, 1962.
  9. [9] Péter, R., “Rekursive Funktionen” [Recursive Functions], Akadémiai Kiadó, Budapest, 1951.
  10. [10] Post, E.L., Finite combinatory processes - Formulation I, J. of Symbolic Logic, 1, 103–105, 1936.
  11. [11] Post, E.L., Formal reductions of the general combinatorial decision problem, Amer. J. Math., 65, 197–215, 1943.
  12. [12] Rice, H.G., Classes of recursively enumerable sets and their decision problems, Trans. of Amer. Math. Soc., 74, 358–366, 1953.
  13. [13] Rogers, H. Jr., Gödel numberings of partial recursive functions, J. of Symbolic Logic, 23, 331–341, 1958.
  14. [14] Rogers, H. Jr., “Theory of Recursive Functions and Effective Computability”, McGraw-Hill, New York, 1967.
  15. [15] Smullyan, R.M., “Theory of Formal Systems”, Annals of Mathematics Studies, no. 47, Princeton University Press, Princeton, N.J., 1961.
  16. [16] Turing, A.M., On computable numbers, with an application to the Entscheidungsproblem, Proc. of the London Math. Soc., ser. 2, 42, 230–265, 1936–1937.

Complexity of Algorithms

  1. [17] Adrianopoli, F. and A. De Luca, Closure operations on measures of computational complexity, Calcolo, 2, 1–13, 1974; (abstract) in the Notices of AMS, April 1972.
  2. [18] Arbib, M. and M. Blum, Machine dependence of degrees of difficulty, Proc. of the Amer. Math. Soc., 16, 442–447, 1965.
  3. [19] Arbib, M., Speed-up theorems and incompleteness theorems, in “Automata Theory” (Caianiello, E.R., ed.), Academic Press, New York, 1966.
  4. [20] Ausiello, G., Teorie della complessità di calcolo [Theories of computational complexity], Calcolo, 7, 387–408, 1970.
  5. [21] Ausiello, G., Abstract computational complexity and cycling computations, J. Comp. and Sys. Sci., 5, 118–128, 1971.
  6. [22] Axt, P., Enumeration and the Grzegorczyk hierarchy, Zeit. Math. Logik und Grundlagen Math., 9, 53–65, 1963.
  7. [23] Blum, M., A machine-independent theory of the complexity of recursive functions, J. of ACM, 14, 322–336, 1967a.
  8. [24] Blum, M., On the size of machines, Information and Control, 11, 257–265, 1967b.
  9. [25] Borodin, A., On effective procedures for speeding up algorithms, J. of ACM, 18, 290–305, 1971.
  10. [26] Borodin, A., Complexity classes of recursive functions and the existence of complexity gaps, ACM Symp. on Theory of Computing, Marina del Rey, Calif., 1969.
  11. [27] Borodin, A., Horner's rule is uniquely optimal, in “Theory of Machines and Computations” (Z. Kohavi and A. Paz, eds.), Academic Press, New York and London, 1971.
  12. [28] Borodin, A., Computational complexity and the existence of complexity gaps, J. of ACM, 19, 158–174, 1972.
  13. [29] Borodin, A., Computational complexity: theory and practice, in “Currents in the Theory of Computing” (A.V. Aho, ed.), Prentice-Hall Series in Automatic Computation, 1973.
  14. [30] Cobham, A., The intrinsic computational difficulty of functions, Proc. Congress for Logic, Mathematics, and Philosophy of Science, North-Holland, Amsterdam, 1964.
  15. [31] Grzegorczyk, A., Some classes of recursive functions, Rozprawy Mat., 4, Warsaw, 1–45, 1953.
  16. [32] Hartmanis, J. and R.E. Stearns, On the computational complexity of algorithms, Trans. of the Amer. Math. Soc., 117, 285–306, 1965.
  17. [33] Hartmanis, J., P.M. Lewis II and R.E. Stearns, Classifications of computations by time and memory requirements, IFIP Congress 1965, Vol. 1, Spartan Books, Washington, D.C., 1965.
  18. [34] Hartmanis, J., Tape-reversal bounded Turing machine computations, J. of Comp. and Syst. Sci., 2, 117–135, 1968.
  19. [35] Hartmanis, J. and R.E. Stearns, Automata-based computational complexity, Information Sciences, 1, 173–184, 1969.
  20. [36] Hartmanis, J. and J.E. Hopcroft, An overview of the theory of computational complexity, J. of ACM, 18, 444–475, 1971.
  21. [37] Hennie, F.C., One-tape off-line Turing machine computations, Information and Control, 8, 553–578, 1965.
  22. [38] Hennie, F.C. and R.E. Stearns, Two-tape simulation of multitape Turing machines, J. of ACM, 13, 533–546, 1966.
  23. [39] Hopcroft, J.E. and J.D. Ullman, “Formal Languages and Their Relation to Automata”, Addison-Wesley Publ. Co., 1969.
  24. [40] Knuth, D.E., “The Art of Computer Programming”, Addison-Wesley Publ. Co., 1969.
  25. [41] McCreight, E.M. and A.R. Meyer, Classes of computable functions defined by bounds on computation, ACM Symp. on Theory of Computing, Marina del Rey, Calif., 1969.
  26. [42] Meyer, A.R. and P.C. Fischer, On computational speed-up, IEEE Conf. Rec. 9th Ann. Symp. on Switching and Automata Theory, October 1968.
  27. [43] Rabin, M.O., Degrees of difficulty of computing a function and a partial ordering of recursive sets, Tech. Report 2, Hebrew University, Jerusalem, Israel, 1960.
  28. [44] Rabin, M.O. and D. Scott, Finite automata and their decision problems, IBM J. Res. Develop., 3, 114–125, 1959.
  29. [45] Ritchie, R.W., Classes of recursive functions based on Ackermann's function, Pacific J. of Math., 15, 3, 1965.
  30. [46] Schnorr, C.P., Does the computational speed-up concern programming?, in “Automata, Languages and Programming” (Nivat, M., ed.), North-Holland/American Elsevier, 1972.
  31. [47] Trakhtenbrot, B.A., Complexity of algorithms and computations, Course Notes, Novosibirsk University, 1967.

Information Theory and Complexity

  1. [48] Barzdin, J., Complexity of programs to determine whether natural numbers not greater than n belong to a recursively enumerable set, Soviet Math. Dokl., 9, 5, 1251–1254, 1968.
  2. [49] Boltzmann, L., “Vorlesungen über Gastheorie” [Lectures on Gas Theory], Vol. 1, J.A. Barth, Leipzig, 1896; English transl.: “Lectures on Gas Theory”, Berkeley, Calif., 1964.
  3. [50] Brillouin, L., “Science and Information Theory”, Academic Press Inc., New York, 1956.
  4. [51] Carnap, R. and Y. Bar-Hillel, An outline of a theory of semantic information, Tech. Rep. 247, M.I.T., Research Laboratory of Electronics, 1952; reprinted in Bar-Hillel, Y., “Language and Information”, Addison-Wesley, Reading, Mass., 1962.
  5. [52] Chaitin, G.J., On the length of programs for computing finite binary sequences, J. of ACM, 13, 547–569, 1966.
  6. [53] Chaitin, G.J., Information-theoretic computational complexity, IEEE Trans. on Information Theory, IT-20, 10–15, 1974.
  7. [54] Church, A., On the concept of a random sequence, Bull. Amer. Math. Soc., 46, 130–135, 1940.
  8. [55] Cramér, H., “Mathematical Methods of Statistics”, Princeton University Press, Princeton, 1945.
  9. [56] Daley, R., Minimal-program complexity of pseudo-recursive and pseudo-random sequences, Dept. of Math., Carnegie Mellon University, Report 71–28, 1971a.
  10. [57] Daley, R., Minimal program complexity with restricted resources, University of Chicago ICR Report no. 30, 1971b.
  11. [58] Daley, R., An example of information and computation resource trade-off, J. of ACM, 20, 687–695, 1973.
  12. [59] De Luca, A. and E. Fischetti, Outline of a new logical approach to information theory, Proc. of NATO Summer School on “New Concepts and Technologies in Parallel Processing”, Capri, 17–30 June 1973; published by Noordhoff, Series E, n. 9, Leyden, 1975.
  13. [60] De Luca, A. and S. Termini, Algorithmic aspects in complex systems analysis, Scientia, 106, 659–671, 1972a.
  14. [61] De Luca, A. and S. Termini, A definition of a non-probabilistic entropy in the setting of fuzzy sets theory, Information and Control, 20, 301–312, 1972b.
  15. [62] Fano, R.M., “Transmission of Information”, M.I.T. Press and J. Wiley Sons, Inc., New York and London, 1961.
  16. [63] Guccione, S. and P. Lo Sardo, Casualità ed effettività [Randomness and effectiveness], preprint, Istituto di Fisica Teorica dell'Università di Napoli, 1972.
  17. [64] Hartley, R.V.L., Transmission of information, Bell Syst. Tech. J., 7, 535–563, 1928.
  18. [65] Kanovich, M.I. and N.V. Petri, Some theorems on the complexity of normal algorithms and computations, Soviet Math. Dokl., 10, 1, 233–234, 1969.
  19. [66] Kolmogorov, A.N., “Grundbegriffe der Wahrscheinlichkeitsrechnung”, Ergebnisse der Mathematik, Berlin, 1933; English transl.: “Foundations of the Theory of Probability”, 2nd ed., Chelsea, New York, 1956.
  20. [67] Kolmogorov, A.N., On tables of random numbers, Sankhya, 25, 369–374, 1963.
  21. [68] Kolmogorov, A.N., Three approaches to the quantitative definition of information, Problemy Peredachi Informatsii, 1, 3–11, 1965.
  22. [69] Kolmogorov, A.N., Logical basis for information theory and probability theory, IEEE Trans. on Information Theory, IT-14, 662–664, 1968.
  23. [70] Loveland, D.W., The Kleene hierarchy classification of recursively random sequences, Trans. Amer. Math. Soc., 125, 497–510, 1966.
  24. [71] Loveland, D.W., A variant of the Kolmogorov concept of complexity, Information and Control, 15, 510–526, 1969a.
  25. [72] Loveland, D.W., On minimal program complexity measures, ACM Symp. on Theory of Computing, Marina del Rey, Calif., 1969b.
  26. [73] MacKay, D.M., Quantal aspects of scientific information theory, Philosophical Magazine, 41, 289–311, 1950.
  27. [74] Martin-Löf, P., The definition of random sequences, Information and Control, 9, 602–619, 1966.
  28. [75] Martin-Löf, P., The literature on von Mises' Kollektivs revisited, Theoria, 1, 12–37, 1969.
  29. [76] Martin-Löf, P., Algorithms and randomness, Rev. Inter. Stat. Inst., 37, 265–272, 1969.
  30. [77] Martin-Löf, P., Complexity oscillations in infinite binary sequences, Z. Wahrscheinlich. verw. Geb., 19, 225–230, 1971.
  31. [78] Nyquist, H., Certain factors affecting telegraph speed, Bell Syst. Tech. J., 3, 324, 1924.
  32. [79] Petri, N.V., The complexity of algorithms and their operating time, Soviet Math. Dokl., 10, 3, 547–549, 1969.
  33. [80] Rose, G.F. and J.S. Ullian, Approximation of functions of the integers, Pacific J. of Math., 13, 693–701, 1963.
  34. [81] Shannon, C.E., A mathematical theory of communication, Bell Syst. Tech. J., 27, 379–423, 623–656, 1948.
  35. [82] Shannon, C.E. and W. Weaver, “The Mathematical Theory of Communication”, University of Illinois Press, Urbana, 1949.
  36. [83] Schnorr, C.P., A unified approach to the definition of random sequences, Math. Syst. Theory, 5, 246–258, 1970.
  37. [84] Schnorr, C.P., “Zufälligkeit und Wahrscheinlichkeit” [Randomness and Probability], Lecture Notes in Mathematics, vol. 218, Springer-Verlag, Berlin-Heidelberg-New York, 1971.
  38. [85] Solomonoff, R.J., A formal theory of inductive inference, Part I, Information and Control, 7, 1–22, 1964.
  39. [86] Szilard, L., Über die Entropieverminderung in einem thermodynamischen System bei Eingriffen intelligenter Wesen, Zeitschrift f. Physik, 53, 840–856, 1929; English transl.: “On the decrease of entropy in a thermodynamic system by the intervention of intelligent beings”, Behavioral Science, 9, 301–310, 1964.
  40. [87] Varadarajan, V.S., Probability in physics and a theorem on simultaneous observability, Comm. Pure Appl. Math., 15, 189–217, 1962.
  41. [88] Ville, J., “Étude Critique de la Notion de Collectif” [Critical Study of the Notion of Collective], Gauthier-Villars, Paris, 1939.
  42. [89] von Mises, R., Grundlagen der Wahrscheinlichkeitsrechnung [Foundations of the calculus of probability], Mathematische Zeitschrift, 5, 52–99, 1919.
  43. [90] Wald, A., Sur la notion de collectif dans le calcul des probabilités [On the notion of collective in the calculus of probabilities], C.R. Acad. Sci. Paris, 202, 180–183, 1936.
  44. [91] Wiener, N., “Cybernetics”, Hermann, Paris, 1948; 2nd ed., The MIT Press and J. Wiley and Sons, Inc., New York, 1961.
  45. [92] Wiener, N., “The Extrapolation, Interpolation and Smoothing of Stationary Time Series”, John Wiley and Sons, New York, 1949.

Copyright information

© Springer-Verlag Wien 1975

Authors and Affiliations

  • Aldo De Luca
    1. Laboratorio di Cibernetica del C.N.R., Arco Felice, Napoli, Italy
    2. Istituto di Scienze dell'Informazione, Università di Salerno, Italy
