Directed Information Flow and Causality in Neural Systems
In human experience, information typically flows in a particular direction, from a source to a destination. By contrast, the notion of mutual information introduced by Shannon (1948) is perfectly symmetric in its arguments and does not distinguish between "input" and "output." In this sense, it is perhaps surprising that this very measure of information captures the capacity of any communication channel – though we should recall that the proof of this fundamental fact is not a simple consequence of Shannon's definition.
In spite of Shannon's strong and fundamental results, it has been tempting to define a notion of directed information. Such a notion was first proposed for stationary processes by Marko (1973). The more general and useful definition was given in the brief and beautiful note by Massey (1990). Moreover, Massey (1990), Kramer (1998), and subsequent work revealed that directed information has a natural place in the study of information transmission with feedback from the output to the input.
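For concreteness, Massey's definition can be stated as follows (a standard formulation, included here for reference; the notation $X^n = (X_1, \ldots, X_n)$ denotes a sequence up to time $n$):

```latex
% Directed information from X^N to Y^N (Massey 1990):
\[
  I(X^N \to Y^N) \;=\; \sum_{n=1}^{N} I(X^n;\, Y_n \mid Y^{n-1}),
\]
% compared with the chain-rule decomposition of mutual information:
\[
  I(X^N ;\, Y^N) \;=\; \sum_{n=1}^{N} I(X^N;\, Y_n \mid Y^{n-1}).
\]
```

The only difference between the two sums is that directed information conditions each term on the *causal* input sequence $X^n$ rather than the entire sequence $X^N$; this breaks the symmetry of mutual information, so that in general $I(X^N \to Y^N) \neq I(Y^N \to X^N)$.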
- Granger CWJ (1969) Investigating causal relations by econometric models and cross-spectral methods. Econometrica 37(3):424–438
- Kramer G (1998) Directed information for channels with feedback. ETH series in information processing, vol 11. Hartung-Gorre, Konstanz
- Marko H (1973) The bidirectional communication theory – a generalization of information theory. IEEE Trans Commun 21:1345–1351
- Massey JL (1990) Causality, feedback and directed information. In: Proceedings of the 1990 international symposium on information theory and its applications, Hawaii, pp 303–305
- Massey JL, Massey PC (2005) Conservation of mutual and directed information. In: Proceedings of the 2005 international symposium on information theory, Adelaide, pp 157–158
- Shannon CE (1948) A mathematical theory of communication. Bell Syst Tech J 27:379–423 and 623–656