Journal of Statistical Physics, Volume 162, Issue 5, pp 1294–1308

On the Sufficiency of Pairwise Interactions in Maximum Entropy Models of Networks

  • Lina Merchan
  • Ilya Nemenman


Biological information processing networks consist of many components, which are coupled by an even larger number of complex multivariate interactions. However, analyses of data sets from fields as diverse as neuroscience, molecular biology, and behavior have reported that observed statistics of states of some biological networks can be approximated well by maximum entropy models with only pairwise interactions among the components. Based on simulations of random Ising spin networks with p-spin (\(p>2\)) interactions, here we argue that this reduction in complexity can be thought of as a natural property of densely interacting networks in certain regimes, and not necessarily as a special property of living systems. By connecting our analysis to the theory of random constraint satisfaction problems, we suggest a reason for why some biological systems may operate in this regime.
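The simulation setup described above can be illustrated with a minimal sketch: draw random couplings for triplet (p = 3) interactions among Ising spins and sample network states with a Metropolis Monte Carlo chain. This is an illustrative reconstruction under assumed Gaussian couplings and a brute-force energy recomputation, not the authors' actual code.

```python
import itertools
import math
import random

def pspin_energy(spins, couplings):
    # Energy of a 3-spin Ising model: E = -sum_{(i,j,k)} J_ijk * s_i * s_j * s_k
    return -sum(J * spins[i] * spins[j] * spins[k]
                for (i, j, k), J in couplings.items())

def metropolis_sample(n, couplings, beta=1.0, steps=5000, seed=0):
    # Standard Metropolis dynamics: propose a single-spin flip, accept with
    # probability min(1, exp(-beta * dE)). Energy is recomputed from scratch
    # for clarity; a real simulation would use the local field instead.
    rng = random.Random(seed)
    spins = [rng.choice([-1, 1]) for _ in range(n)]
    for _ in range(steps):
        i = rng.randrange(n)
        e_old = pspin_energy(spins, couplings)
        spins[i] = -spins[i]
        d_e = pspin_energy(spins, couplings) - e_old
        if d_e > 0 and rng.random() >= math.exp(-beta * d_e):
            spins[i] = -spins[i]  # reject the flip, restore the old state
    return spins

# Random Gaussian couplings over all spin triplets (an assumption for this sketch).
n = 8
rng = random.Random(1)
couplings = {t: rng.gauss(0.0, 1.0)
             for t in itertools.combinations(range(n), 3)}
state = metropolis_sample(n, couplings)
```

States sampled this way can then be summarized by their means and pairwise correlations, to which a pairwise maximum entropy model would be fit.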


Keywords: Collective dynamics · p-spin models · Numerical simulations



We thank Aly Pesic and Daniel Holz, who helped during the early stages of this project; Arthur Lander and Chris Myers, who suggested a possible link to evolution; and Thierry Mora, Aleksandra Walczak, and Gasper Tkacik for useful discussions. We also thank the anonymous referees for their insightful comments. We are grateful to the Emory College Emerson Center for Scientific Computing and its funders for help with the numerical simulations. The authors were partially supported by the James S. McDonnell Foundation Complex Systems award, by the Human Frontiers Science Program, and by the National Science Foundation.



Copyright information

© Springer Science+Business Media New York 2016

Authors and Affiliations

  1. Department of Engineering Technology, Savannah State University, Savannah, USA
  2. Department of Physics, Emory University, Atlanta, USA
  3. Departments of Physics and Biology, Emory University, Atlanta, USA
