Phenomenology-first versus third-person approaches in the science of consciousness: the case of the integrated information theory and the unfolding argument

Abstract

Assessing the scientific status of theories of consciousness is often a difficult task. In this paper, I explore the dialectic between the Integrated Information Theory (Oizumi et al. PLoS Comput Biol, 10(5), e1003588, 2014; Tononi et al. Nat Rev Neurosci, 17(7), 450–461, 2016) and a recently proposed criticism of that theory: the ‘unfolding argument’ (Doerig et al. Consciousness and Cognition, 72, 49–59, 2019). I show that the phenomenology-first approach in consciousness research can lead to valid scientific theories of consciousness. I do this by highlighting the two reasons why the unfolding argument fails: first, phenomenology-first theories are grounded, not circular; second, falsificationism does not provide an adequate demarcation criterion in philosophy of science. I conclude that this specific debate has significance for how, in general, consciousness researchers test and criticize theories of consciousness, and how dismissing the phenomenology-first methodology in favour of a third-person-based methodology means endorsing a position in philosophy of mind that has already been challenged.

Data availability

Not applicable.

Notes

1.

In this paper, I will use the term ‘phenomenology’ as a synonym for experience, or phenomenal consciousness.

2.

Importantly, this does not mean that consciousness cannot be associated with specific functions (or reports). Rather, the idea is that there is no necessary connection between consciousness and functions.

3.

    For the original version of the argument, see (Doerig et al. 2019, p. 53).

4.

Note that the original version has the form of a dilemma (causal structure theories are either false or unfalsifiable). To apply this dilemma here, P2 can be expressed as follows: “IIT is true iff two systems, one feedforward, the other recurrent, can have the same input-output behaviour despite differing with respect to consciousness in virtue of their different internal causal structure”. If P2 is accepted, IIT is unfalsifiable; if it is denied, IIT is false.

5.

    This is not quite right, since IIT does not deny that behavioural functions can be indicative of consciousness in paradigmatic cases.

6.

The paradigm described here can be thought of as a no-report paradigm. Note that there is an underlying assumption here: under normal conditions, healthy subjects are conscious of unmasked suprathreshold stimuli. This assumption is commonly made in consciousness science and it seems to be made by IIT proponents as well, for example in Haun et al. (2017); as such, while this assumption may be problematic, it is not a problem for IIT alone. Under this assumption, the similarity between stimuli can be considered a proxy for the similarity between conscious experiences. Note that if IIT did not accept this methodological assumption, it would indeed be quite unclear how to test the theory.

7.

    For this point, see (Bayne 2018) and (Tononi and Koch, 2015).

8.

As per the previous discussion, such a move, in order to be progressive, would require a clear way to distinguish and practically test these lower-level properties.

9.

    For a textbook explanation of stimulus-response functionalism, see Braddon-Mitchell and Jackson (2007), p. 114.

10.

    https://www.sciencemag.org/news/2019/10/outlandish-competition-seeks-brain-s-source-consciousness

References

1. Aaronson, S. (2014). Why I am not an integrated information theorist (or, the unconscious expander). Shtetl-Optimized, May 21, 2014. https://www.scottaaronson.com/blog/?p=1799.

  2. Albantakis, L., Hintze, A., Koch, C., Adami, C., & Tononi, G. (2014). Evolution of integrated causal structures in Animats exposed to environments of increasing complexity. PLoS Computational Biology, 10(12), e1003966. https://doi.org/10.1371/journal.pcbi.1003966.

  3. Baars, B. J. (2009). History of consciousness science. In W. P. Banks (Ed.), Encyclopedia of consciousness (pp. 329–338). Oxford: Academic Press. https://doi.org/10.1016/B978-012373873-8.00037-2.

4. Barrett, A. B., & Mediano, P. A. M. (2019). The phi measure of integrated information is not well-defined for general physical systems. Journal of Consciousness Studies, 26(1–2), 11–20.

5. Bayne, T. (2018). On the axiomatic foundations of the integrated information theory of consciousness. Neuroscience of Consciousness, 2018(1), niy007. https://doi.org/10.1093/nc/niy007.

  6. Block, N. (1978). Troubles with functionalism. Minnesota Studies in the Philosophy of Science, 9, 261–325.

  7. Block, N. (1981). Psychologism and behaviorism. The Philosophical Review, 90(1), 5–43. https://doi.org/10.2307/2184371.

  8. Braddon-Mitchell, D., & Jackson, F. (2007). The philosophy of mind and cognition. Blackwell.

  9. Brown, R., Lau, H., & LeDoux, J. E. (2019). Understanding the higher-order approach to consciousness. Trends in Cognitive Sciences, 23(9), 754–768. https://doi.org/10.1016/j.tics.2019.06.009.

  10. Dehaene, S., & Changeux, J.-P. (2011). Experimental and theoretical approaches to conscious processing. Neuron, 70(2), 200–227. https://doi.org/10.1016/j.neuron.2011.03.018.

  11. Dehaene, S., & Naccache, L. (2001). Towards a cognitive neuroscience of consciousness: Basic evidence and a workspace framework. Cognition, 79(1), 1–37. https://doi.org/10.1016/S0010-0277(00)00123-2.

  12. Doerig, A., Schurger, A., Hess, K., & Herzog, M. H. (2019). The unfolding argument: Why IIT and other causal structure theories cannot explain consciousness. Consciousness and Cognition, 72, 49–59. https://doi.org/10.1016/j.concog.2019.04.002.

13. Haun, A. M., Oizumi, M., Kovach, C. K., Kawasaki, H., Oya, H., Howard, M. A., Adolphs, R., & Tsuchiya, N. (2017). Conscious perception as integrated information patterns in human electrocorticography. eNeuro, 4(5), ENEURO.0085-17.2017. https://doi.org/10.1523/ENEURO.0085-17.2017.

  14. Kim, J. (1992). Multiple realization and the metaphysics of reduction. Philosophy and Phenomenological Research, 52(1), 1–26. https://doi.org/10.2307/2107741.

15. Lakatos, I. (1968). Criticism and the methodology of scientific research programmes. Proceedings of the Aristotelian Society, 69, 149–186. https://www.jstor.org/stable/4544774.

  16. Lamme, V. A. F. (2006). Towards a true neural stance on consciousness. Trends in Cognitive Sciences, 10(11), 494–501. https://doi.org/10.1016/j.tics.2006.09.001.

17. Lau, H., & Michel, M. (2019). On the dangers of conflating strong and weak versions of a theory of consciousness. PsyArXiv. https://doi.org/10.31234/osf.io/hjp3s.

18. McQueen, K. J. (2019). Interpretation-neutral integrated information theory. Journal of Consciousness Studies, 26(1–2), 76–106. https://www.ingentaconnect.com/content/imp/jcs/2019/00000026/f0020001/art00005.

19. Oizumi, M., Albantakis, L., & Tononi, G. (2014). From the phenomenology to the mechanisms of consciousness: Integrated information theory 3.0. PLoS Computational Biology, 10(5), e1003588. https://doi.org/10.1371/journal.pcbi.1003588.

  20. Popper, K. R. (1959). The logic of scientific discovery. Oxford, England: Basic Books.

21. Putnam, H. (1967). The nature of mental states. In W. H. Capitan & D. D. Merrill (Eds.), Art, mind, and religion (pp. 1–223). Pittsburgh: University of Pittsburgh Press.

  22. Searle, J. R. (1980). Minds, brains, and programs. Behavioral and Brain Sciences, 3(3), 417–424. https://doi.org/10.1017/S0140525X00005756.

23. Schäfer, A. M., & Zimmermann, H.-G. (2007). Recurrent neural networks are universal approximators. International Journal of Neural Systems, 17(4), 253–263. https://doi.org/10.1142/s0129065707001111.

  24. Tononi, G. (2012). Integrated information theory of consciousness: An updated account. Archives Italiennes de Biologie, 150(2–3), 56–90. https://doi.org/10.4449/aib.v149i5.1388.

25. Tononi, G., Boly, M., Massimini, M., & Koch, C. (2016). Integrated information theory: From consciousness to its physical substrate. Nature Reviews Neuroscience, 17(7), 450–461. https://doi.org/10.1038/nrn.2016.44.

  26. Tsuchiya, N., Andrillon, T., & Haun, A. (2020). A reply to “the unfolding argument”: Beyond functionalism/behaviorism and towards a science of causal structure theories of consciousness. Consciousness and Cognition, 79, 102877. https://doi.org/10.1016/j.concog.2020.102877.


Acknowledgments

The author thanks Jakob Hohwy, Thomas Andrillon, Tim Bayne, Naotsugu Tsuchiya, and all the members of the Cognition & Philosophy Laboratory and the Melbourne Monash Consciousness Research group for their excellent comments and discussions on earlier drafts of this paper.

Code availability

Not applicable.

Author information

Corresponding author

Correspondence to Niccolò Negro.

Ethics declarations

Conflict of interest

The author declares that he has no conflict of interest.

Cite this article

Negro, N. Phenomenology-first versus third-person approaches in the science of consciousness: the case of the integrated information theory and the unfolding argument. Phenom Cogn Sci (2020). https://doi.org/10.1007/s11097-020-09681-3

Keywords

  • Integrated information theory
  • Consciousness
  • Phenomenology-first
  • Unfolding argument
  • Demarcation problem