Climate Action

Living Edition
| Editors: Walter Leal Filho, Anabela Marisa Azul, Luciana Brandli, Pinar Gökcin Özuyar, Tony Wall

Artificial Intelligence and Global Changes

  • Philip Garnett
Living reference work entry


Although there are definitions for many of the terms used in this entry, readers should keep in mind that there is not always absolute agreement on definitions, which can vary to a degree. These should therefore be considered more as short explanations with appropriate references.

Term or phrase

Short explanation



System

A system is a number of interrelated elements or constituents that can be thought of as parts of a whole. For example, the parts of a cell, or a society

Von Bertalanffy (1968)

System boundary

The limits of a system, which may either be fairly recognizable as such or defined pragmatically. The boundary of a cell is recognizable; the boundaries of a society, however, may be highly debatable/contestable and may be drawn pragmatically

Richardson and Lissack (2001)

Complex system

A system where the whole is greater than the sum of its parts, suggesting that the system’s observed behavior(s) could be emergent properties

Kauffman (1996)

Complex network

A system composed of a large number of highly interconnected dynamic units, for example, a social network. Complex systems often map to complex networks

Boccaletti et al. (2006)

Systemic failure

Large-scale failure of a system, or of systems of systems, perhaps because they are unable to sustain their dynamics after the loss of relationships or parts

Garnett (2018)

Cascade failure

Failure that spreads through and across systems

Buldyrev et al. (2010)

Artificial intelligence

Intelligence demonstrated by machines rather than natural intelligence that is displayed by humans or other animals

Russell and Norvig (2016)

Machine learning

Algorithms and statistical models used by machines to effectively perform a task without explicit instructions, often relying instead on patterns and inference

Bishop (2006)

The Problem of Global Change

Processes of change at a global level (the most obvious example of which is climate change) present both a scientific and a political challenge. Our scientific understanding of the systemic nature of global systems (their complexity, which is at least in part due to their size and connectivity), though improving, does not afford the ability to predict how change will progress or how one system might affect others. Be that as it may, political failings are currently perhaps the more significant problem, especially as scientific progress would likely follow political progress if it could be made. This entry highlights the difficulties of global systemic change through the lens of complexity theory and proposes that tools in the form of ways of thinking and the application of technology (such as artificial intelligence) might form part of a solution to global change, or at the very least, part of a toolkit that can be deployed to work toward solutions.

The application of systems sciences to the problem of climate change has been proposed, and revisited, on a number of occasions. Helbing in his 2013 paper proposed the establishment of a Global Systems Science, in response to the problems of instability in our increasingly connected world (Helbing 2013). The intention was to direct research toward the consequences of the increasing connectivity between (and perhaps within) systems, and the associated stability of global systems. Understanding change processes and failure in global systems, or coupled global systems of systems, could benefit from a complex systems theory-based approach; however, significant research still remains to be done (Garnett 2018).

The need for additional research comes in part from the features of global systems. Complex adaptive systems often display resilience to, and the ability to adapt to, internal change and external drivers (Gunderson 2000; Moulonguet and Bouche 1949). However, understanding how systems will respond to different internal and external pressures (drivers) and potentially then understanding how to intervene in systems are currently largely beyond reach. We do not know what changes in dynamics, how far a system can be pushed, or how much pressure applied will cause a system to shift into a radically different (potentially unrecognizable) state. How systems ultimately fail, and the process of failure itself, is also not well understood. There is, however, work being undertaken in this area, as well as work that seeks to highlight the problem that systemic failure poses (Carpenter et al. 2011; Brook et al. 2013; Hughes et al. 2013; Lenton and Williams 2013; Bentley et al. 2014; Garnett 2018).

Where knowledge is particularly lacking is in a few key areas. The first is the connectivity between global systems, and therefore the extent to which they are interdependent: a problem central to understanding how change in one system will produce effects in others, or how failure may cascade through connected systems. The second is that we lack a systemic understanding of global systems, and there remains a tendency to study the world as discrete parts (perhaps out of a need to make the problems tractable).

New technological tools, methods, and theories need to be developed, and existing ones improved, to assist with mapping the nature and character of the connections between and within systems. This would provide a basis from which a more detailed understanding of the significance of the loss of connections, or of entire systems, can be developed: a basis on which to build a systemic view.

To accept a systemic view of global systems is to also accept that feedbacks exist within and between all systems that have a significant role in the security of human civilization, abandoning any possibility that the global environment is a set of discrete systems. It follows that changes or failures in one system will feed back into other systems and that all systems are intimately connected in ways that we currently do not fully understand (Buldyrev et al. 2010; Helbing 2013). The Anthropocene marks a period in which all global systems are increasingly under pressure from human activity. We need to consider the different potential consequences of connectivity and feedback in and between systems for system stability. We should also work to discover the limits of our own understanding of how systems respond to internal dynamics and external drivers, and what these limits mean for how systems change and fail in the real world.

This entry seeks to promote discussion of these problems and explores processes of change and failure of global systems and their consequences for human civilization. The entry will also discuss the significance of the temporal nature of change processes and failure. For example, our current lack of understanding of systemic processes means that it is not known if failure is inevitable and already happening. The changes could be occurring on a temporal scale which we are not sensitive to. Finally, our future security as a species will require significant changes in our behavior and ability to make deliberate interventions in global systems (or our ability to intervene to guide the evolution of system states). Knowing how and where to make interventions is essential. This entry will propose ways that modes of thinking, and the application of technologies, could provide insights into these difficult problems, not as simple solutions but as part of a toolkit which can be deployed to address systemic change and failure.

Complex Adaptive Systems

It is useful to characterize what it is we mean by a complex adaptive system and also make an argument as to why global systems should be considered as complex adaptive systems. A useful definition of a system is a set of elements or objects that act together as part of a process or mechanism (Turcotte and Rundle 2002). Often systems can be thought of as a network (or a complex network), where the nodes in the network are the individual parts (or objects) and the edges are the relationships that connect the parts (Boccaletti et al. 2006). Many global systems can be mapped to complex networks and are often referred to as such. For example, the financial system is the result of the interactions of a set of financial organizations, such as banks, hedge funds, and regulators, and we sometimes hear people speak of the “financial network.”

In complexity theory, the complexity of a system comes from the difficulty (or potential impossibility) of predicting how that system will behave by looking at the interactions (or relationships) between the parts alone. It may be possible to understand and document the types of interactions occurring between the banks, hedge funds, and regulators of the financial system. However, the theory would tell us that this is not enough to predict the behavior of a complex system due to one of the features of complex systems, emergence. Simple interactions between the parts of a complex system can result in emergent behaviors (Stepney 2018), a property elegantly demonstrated by Conway’s Game of Life (for more information see Gardner 1970; Schulman and Seiden 1978). The observed behavior of a system is often referred to as the system state, and we talk about systems changing their state and therefore changing their observed behavior. However, the underlying parts and the interactions between those parts may not have changed. Rather new behaviors are observed due to the environment (or context) that the parts and interactions are operating within.
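The emergence described above can be made concrete in a few lines of code. The following is a minimal sketch of Conway's Game of Life (only the rules described by Gardner 1970): purely local birth and survival rules produce the well-known "glider," a coherent structure that travels across the grid even though no rule mentions movement.

```python
from collections import Counter

def step(live):
    """Advance one generation; `live` is a set of (x, y) live cells."""
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # Birth on exactly 3 live neighbours; survival on 2 or 3.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# The classic glider pattern.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
state = glider
for _ in range(4):
    state = step(state)

# After four generations the glider has moved diagonally by (1, 1):
# global, directed movement emerges from purely local rules.
assert state == {(x + 1, y + 1) for (x, y) in glider}
```

Nothing in the update rule encodes "move diagonally"; that behavior exists only at the level of the whole pattern, which is the sense in which the system state is more than its parts and interactions.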

Another feature of complex systems is that they are able to adapt to changing external inputs from their environment without the system state significantly changing; they are resilient to a changing environment. Complex systems are also resilient to internal changes. The parts of the system can change the way they interact with each other, and/or the parts themselves can also change, without there necessarily being a significant observable change in the emergent global behavior or system state. However, complex systems are nonlinear, and therefore, how a system will respond to environmental change or perturbation is not always predictable. Therefore, a perturbation might produce no change in the system, a proportional change, or a disproportionately large change. Causality in complex systems is therefore not a simple linear process of action and then reaction, in such a way that we can then easily close the gap between a current state and a future desired state. As any changes made to the system may not have easily predictable effects.
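A standard toy model, the logistic map (not part of this entry's argument, but a common illustration in complexity theory), makes the nonlinearity point concrete: the very same small perturbation can be absorbed in one regime and disproportionately amplified in another.

```python
def trajectory(r, x0, steps=60):
    """Iterate the logistic map x' = r * x * (1 - x)."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

def max_gap(r, x0, eps=1e-6):
    """Largest divergence between two trajectories started eps apart."""
    xs, ys = trajectory(r, x0), trajectory(r, x0 + eps)
    return max(abs(a - b) for a, b in zip(xs, ys))

stable_gap = max_gap(2.5, 0.3)   # stable regime: perturbation absorbed
chaotic_gap = max_gap(4.0, 0.3)  # chaotic regime: perturbation amplified

assert stable_gap < 1e-5
assert chaotic_gap > 0.1
```

The perturbation is identical in both runs; the response depends entirely on the system's regime, which is why action and reaction in complex systems cannot be treated as simply proportional.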

System inertia is one further feature of complex systems that should be considered. System inertia is where, subsequent to a shock, the system state does not appear to respond. The lack of response may be due to the system being resilient to the shock; however, it could also be because there is sufficient inertia in the system that change does not occur immediately. A process of change might nevertheless be underway, manifesting as a change of state at some point in the future. This is one reason (but not the only reason) why understanding causality in complex systems is challenging: inertia might separate changes in system state temporally from the perturbation that caused them. The climate system is likely to exhibit inertia, if only due to its size; therefore, only now are we beginning to see the system react to increasing atmospheric CO2 levels as average surface temperatures have started to rise (Mann et al. 2008). Complexity theory would also therefore allow for the possibility that we are only seeing the very start of the reaction to increased atmospheric CO2, and that without any further increase, warming could accelerate significantly in future decades as the full effects manifest in the system state. (For further discussion of the features and nature of complex systems, see the work of Kauffman and others (Kauffman 1996; Kauffman and Clayton 2006; Stepney 2018).)

If we accept that global systems have the properties of complex systems, then it is perhaps not a surprise that as yet we are unable to predict or measure how much change (either to internal dynamics or interactions with the environment) would equate to an observably different system. A system could go through a slow and smooth transition over a period of time long enough that the systems around it adapt in the same way, and to us (with our short memories) that change might fail to register as different. Alternatively a system could shift rapidly, causing a major disruption – a tipping point in system state (Brook et al. 2013; Bentley et al. 2014). Whether change due to a tipping point is more significant than slow evolution of a system is perhaps irrelevant. If the result of that change is a hostile system state, then halting or reversing the change is perhaps as important as the change itself.

Systems have no central evaluator that assigns a value to the current system state. However, it would be advantageous to us to be able to evaluate a system and, if necessary, have the knowledge to be able to steer a system into a state that is more advantageous to our survival. Global systems do not stand in isolation, free from the influence of their environment or other systems (indeed, their environment is other systems). They are highly connected, to the point where with many systems it is difficult to determine the boundaries between systems or the boundary between a system and its environment (Vespignani 2010). (It is often necessary, and advantageous, for researchers to draw arbitrary boundaries around systems to have any hope of understanding their system of study. These simplifications are a requirement of tractability, but could introduce flawed assumptions, possibly rendering the model of the system invalid.)

Systems Failure

Systems can fail from the point of view of the observed system, and from the point of view of some or all of the parts of the system. In the first instance, failure in one part of the system, or parts of the system, would propagate through the whole system resulting in the disappearance of the global system behavior. The system essentially no longer exists. The parts and their interactions can no longer produce the emergent global behavior (or any other emergent behavior). This is not the same as a system changing state; it is the total loss of the systemic behavior. Failure could also cause some parts of the system to disappear, but this failure may manifest merely as a process of adaptation or change at the system level. This form of failure is undoubtedly happening all the time, particularly in natural systems. Parts of ecosystems fail and disappear, but the system as a whole remains or evolves.

How systems fail is difficult to predict (Alexander et al. 2004). Numerous nodes (parts of the system) or edges (relationships between the parts) could be lost or broken, and very little visible change may occur. However, there could come a point where the loss or breakage starts being driven by the internal dynamics of the system itself. Failure then could propagate throughout the entire system (Watts 2002).

Cascade and Systemic Failure

Systemic or cascade failure is concerned with failures that propagate through systems, between systems, or where changes in one system cause the failure of another. Therefore, for global systems, changes or failure in one, perhaps an ecosystem, could trigger the failure of connected systems. This failure could cascade through numerous systems, potentially causing large-scale systemic failure (Garnett 2018). The difficulty is that we do not really know what the likelihood of systemic failure is or how this type of failure would propagate and under what circumstances.
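A minimal threshold model, in the spirit of Watts (2002), sketches how such a cascade can propagate. The six-node ring and the threshold values here are illustrative assumptions, not a model of any real system: a node fails once a sufficient fraction of its neighbours has failed, so a single initial failure can either die out or spread system-wide.

```python
def cascade(adj, seeds, threshold=0.5):
    """Propagate failure: a node fails once the fraction of its failed
    neighbours reaches `threshold`; repeat until nothing changes."""
    failed = set(seeds)
    changed = True
    while changed:
        changed = False
        for node, nbrs in adj.items():
            if node in failed or not nbrs:
                continue
            frac = sum(n in failed for n in nbrs) / len(nbrs)
            if frac >= threshold:
                failed.add(node)
                changed = True
    return failed

# A small ring of tightly coupled systems, each depending on two neighbours.
ring = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}

# One initial failure cascades around the entire ring...
assert cascade(ring, {0}, threshold=0.5) == set(range(6))
# ...while a more tolerant coupling contains the same failure.
assert cascade(ring, {0}, threshold=0.75) == {0}
```

The contrast between the two thresholds captures the core difficulty noted above: whether a given failure remains local or becomes systemic depends on structural details of the coupling that we rarely know for real global systems.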

Real-world examples of systemic failure in social, ecological, and physical systems are thankfully historically rare; however, there are examples (Karabanov et al. 2004; Freed and Samson 2004). It has also been suspected that large complex systems (such as food webs) might in many cases be unstable, rather than the complexity providing stability (May 1972; Cohen and Newman 1985). Perhaps one significant example of systemic failure (although the system did eventually recover, that recovery required a significant input of resources) would be the North American dust bowl of the 1930s (Schubert et al. 2004; Cook et al. 2009; Hornbeck 2012). Rarity is a small comfort, as analysis of global systems would indicate numerous systems are under significant pressure (Fahrig 2003; Guerra et al. 2006; Bradshaw et al. 2007; Tscharntke et al. 2012), and understanding safe operating spaces for social-ecological systems is nontrivial (Dearing et al. 2014). We are currently seeing some clear indications of problems in insect populations, with evidence of global decline (Sánchez-Bayo and Wyckhuys 2019). There is also evidence of shifts in seasons, particularly at higher latitudes (Post et al. 2018). Catastrophic collapse of insect populations would undoubtedly affect numerous other systems and might be in part due to systems becoming unsynchronized (dependences between systems may be temporal as well as spatial, and important relationships can be broken by timings). How close these systems, and others, are to failure is a difficult question to address, as “we do not know exactly where to locate thresholds of irreversibility” (Falk and Stein 1970).

There have been a number of examples of where systemic failure was perhaps narrowly avoided; the global financial system during the crash of 2008 would be one. According to newspaper reports, banks in the UK were perhaps only hours away from running out of cash. The response of nations to the global financial crisis versus the response to climate change makes for an interesting comparison. The world’s nations went to great lengths and expense to avoid the collapse of global finance; we are yet to see this level of sustained and coordinated response to climate change. (For further discussion of this and systemic failure in general we direct readers to Anderson et al. (2008) and Garnett (2018)).

At present the tools (computational or otherwise), language, and understanding do not exist to address the problem of knowing how and when a system is going to react. Capacity even to map complex systems, particularly in the real world, is limited, and the holy grail of complexity science, being able to make precise interventions that produce consistent and desirable changes, remains elusive (Allen 2001). It might even be the case that the level of precision desired is impossible; it is not that we do not know which levers to pull, but more that there are no levers. What is more, intervention might also have its own unintended consequences; knowing when not to act, because a system is in the process of adapting, is just as important. What therefore can be done?

Complex Networks

Complex systems map intuitively to network structures: the parts (objects) of the system map onto network nodes, and the relationships between the parts map onto the edges of the network (Mitchell 2006). Complex network analysis is therefore an intuitive method by which complex systems can be characterized and perhaps understood. The theory is that the network structure, as defined by connections between the objects (which in the financial system might be banks, for example), captures something about that system, e.g., interdependence between species or bank loans, depending on the system being mapped. Connections form regions in networks, and the arrangement or type of connections (along with other warning signals (Scheffer et al. 2009)) might provide a way of characterizing the state of the larger system, or indeed indicate the relative importance of individual or groups of nodes or relationships. For example, analysis might indicate that particular nodes or relationships are of great significance to the flow of information through the network. Different network structures could be associated with current or future stability or instability. Others still might map to regions of structural dependency and/or the propensity for failure. Complex network analysis is therefore one approach that can be used to model the Anthropocene (Lövbrand et al. 2009).
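As a small illustration of this kind of structural analysis (the network below is hypothetical), even simple reachability analysis can flag a node whose loss fragments the network, the sort of node that might deserve protective attention in a real system.

```python
def reachable(adj, start, removed=frozenset()):
    """Breadth-first reachability over adjacency lists, skipping removed nodes."""
    if start in removed:
        return set()
    seen, frontier = {start}, [start]
    while frontier:
        node = frontier.pop()
        for nbr in adj[node]:
            if nbr not in seen and nbr not in removed:
                seen.add(nbr)
                frontier.append(nbr)
    return seen

# Two clusters of nodes joined only through "hub".
adj = {
    "a": ["b", "hub"], "b": ["a", "hub"],
    "c": ["d", "hub"], "d": ["c", "hub"],
    "hub": ["a", "b", "c", "d"],
}

whole = reachable(adj, "a")
without_hub = reachable(adj, "a", removed={"hub"})

assert whole == {"a", "b", "c", "d", "hub"}
assert without_hub == {"a", "b"}  # the network fragments without the hub
```

Richer measures (betweenness centrality, articulation points, and the early-warning signals cited above) refine the same basic idea: the position of a node in the network structure, not just its own properties, determines its systemic significance.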

This type of analysis is not a precise lever with which interventions can be made, but more a way of potentially assessing the health of a system, which might in turn provide some indication of where deliberate interventions should be made or pressure reduced. It is worth noting that human activity is a constant source of intervention in systems, as we exist as part of systems and not separate from them. Deliberate interventions would be those made in response to an understanding that there is a problem and might take the form of reducing existing human activities. Complex network models of global systems could be built, and tools developed to highlight areas of potential concern. Computer modeling could also be used to simulate the evolution of the networks to predict possible future states of these systems.

The mapping and analysis of complex systems, particularly those on a global scale, remains a challenge. Two significant challenges are achieving an appropriate level of abstraction and collecting the required data. The level of abstraction is particularly significant, as we need to produce models that are not too complicated for scientific analysis. The analysis of dynamic networks, where nodes and edges come and go, is also challenging, as it is difficult to robustly attach significance to any one node or edge over another, or to predict whether any given node or edge might disappear or a new one form (Newman et al. 2011). Not only is the network structure (topology) of significance, but also the character of the components and how the network changes.

Artificial Intelligence and Global Systems

Using the lens of complex systems theory, we have demonstrated the challenges of understanding, and therefore of making intentional interventions in, global systems to effect change. However, we find ourselves in a situation where it is quite likely that we need both to radically reduce our impact on global systems (perhaps especially the climate system, but probably numerous other systems as well) and to be able to make interventions to maximize stability and perhaps repair damage. That is, we are probably in danger of going beyond the point where it would be enough simply to cut emissions to reverse climate change; we will also need to intervene in other ways to avoid global systems moving toward states that are highly disadvantageous to supporting human civilization. Targeted intervention requires both an understanding of the important components of a system and an understanding of how those components can and should be perturbed or altered (perhaps by changing their relationships with other parts of the system) to remain in, or move back toward, a system state that is favorable to civilization. We need to develop methods and tools both to build an understanding of the connectivity and components of global systems (at an appropriate level of abstraction) and to predict the potential consequences of making changes to the system. How, or perhaps even if, that is possible remains to be seen.

As early as 1971, Jay Forrester suggested that computer modeling might provide a tool to help understand World Dynamics (Forrester and Forrester 1971; Jantsch 1971). One group of technologies that might prove useful in this domain are forms of machine learning (ML) and artificial intelligence (AI), and the increasing capacity of ML and AI therefore presents an opportunity. Most advanced analytical platforms in use today which present themselves as AI are more often than not actually one form or another of machine learning. We are yet to see a computer performing something like what we would call natural intelligence, the form of intelligence seen in animals and humans. However, machine learning has proven itself highly adept at analyzing large amounts of data for patterns (Bishop 2006): for example, methods of clustering, where objects with similar attributes are grouped together, or searches through data for objects that share attributes with a previously identified set of objects of interest. This type of data analytics could be useful for the challenge of understanding global systems (Recknagel 2001), if data can be collected.
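As a hedged illustration of the clustering just mentioned, the sketch below runs a minimal k-means in one dimension on made-up measurements; real analyses of global systems data would use far richer features and established ML libraries rather than this toy.

```python
def kmeans_1d(values, centroids, iters=20):
    """Lloyd's algorithm in one dimension."""
    for _ in range(iters):
        # Assign each value to its nearest centroid.
        clusters = {c: [] for c in centroids}
        for v in values:
            nearest = min(centroids, key=lambda c: abs(c - v))
            clusters[nearest].append(v)
        # Move each centroid to the mean of its cluster.
        centroids = [sum(vs) / len(vs) if vs else c
                     for c, vs in clusters.items()]
    return sorted(centroids)

# Two obvious groups of (hypothetical) measurements.
data = [1.0, 1.2, 0.8, 10.0, 10.5, 9.7]
centers = kmeans_1d(data, centroids=[0.0, 5.0])

assert abs(centers[0] - 1.0) < 0.1    # near the mean of the low group
assert abs(centers[1] - 10.0) < 0.1   # near the mean of the high group
```

The algorithm discovers the grouping without being told what the groups are, which is the property that makes this family of methods attractive for finding structure in large system datasets.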

ML or AI tools could be used to search data about global systems for patterns that those systems have in common, where a pattern might be a certain type of network structure (such as a particular arrangement of nodes (objects) and edges (relationships)) or other system features that can then be correlated with system behaviors (Coates et al. 2011). This type of analysis is highly dependent on having meaningful data about systems and some knowledge of the behavior of the systems. In this context, meaningful data would tell us something about the component parts of the system, and the relationships between those parts, from which we can build an abstract model of the system. The difficulty is determining what an appropriate level of abstraction for a system is, as even in the world of big data we cannot collect everything (and doing so might not be helpful, as the constructed model would be too complex to work with).

For example, if you have significant amounts of data about the development of a system through time that ultimately failed, ML or AI could perhaps identify features present in the evolution of that system in other global systems. This in turn might tell us something useful about how those systems are developing and whether we should be concerned about their likelihood of failure. Ideally we would also need data about how systems respond to interventions, which might allow us to make a targeted intervention in a system of concern. The difficulty is that there is no central database of global systems data which contains anything like the level of detail that would likely be required. Another potential problem with this as a methodology is that it might only tell us something about the systems in isolation due to the data collection process essentially discretizing systems. Therefore, we might not be able to capture systemic failure of multiple connected systems.

This leaves a difficult problem: what data do we collect about global systems in order to be able to construct models at a suitable level of abstraction, such that the resulting model is not almost as complex as the real system, but not so simple that it is unrepresentative? We also need a mechanism to capture the systemic nature of global systems. This is perhaps an area where we can speculate about what future ML and AI tools might be able to contribute given additional research. Perhaps AI could be used to design the sensory networks required for data collection, by learning appropriate levels of data collection coverage for systemic problems. AI could also be used to search and integrate current global systems data sets. This would not be significantly different from current machine learning processes and could be facilitated by inexpensive internet-connected sensory devices. AI could also be used to build the appropriately abstracted models required for the monitoring and analysis of global systems. Again, this might be an iterative machine learning process of model creation, prediction, and then comparison with the future system state (as detected by the sensor network or other data collection). The two processes of data collection and model development would feed back into each other.

What is described here is not far-fetched; however, as yet we do not have this level of feedback or evolution in the design and interpretation of data collection and analysis systems, or in the creation and testing of models. What is being described is actually a form of cybernetic control system (Wiener 1948; von Foerster 2003), one that perhaps puts more emphasis on control by the AI/ML, and less in the hands of humans. However, we would envision this as a collaborative process, with domain experts and modelers working alongside AI/ML tools. We ourselves might struggle to determine the significance of the connectivity in systems; however, we might be successful in designing an AI- or ML-mediated (collaborative) process that can address the problem of connectivity in complex systems.
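The iterative loop of model creation, prediction, and comparison with observation can be caricatured in a few lines. The "system," its growth rate, and the learning rate below are illustrative assumptions only: a model predicts the next state, a stand-in sensor reports what actually happened, and the mismatch is fed back to update the model.

```python
def observe(x):
    """Stand-in for the sensor network: the real system grows by 5%."""
    return 1.05 * x

def run_loop(steps=200, lr=0.05):
    """Iterate: model predicts, sensors report, mismatch updates the model."""
    rate_estimate = 1.0  # initial model: assume no growth
    x = 1.0
    for _ in range(steps):
        predicted = rate_estimate * x
        actual = observe(x)               # data collection
        error = actual - predicted        # compare model with reality
        rate_estimate += lr * error / x   # feed the mismatch back (normalised)
        x = actual
    return rate_estimate

# The loop converges on the system's true dynamics.
assert abs(run_loop() - 1.05) < 0.001
```

This is the cybernetic structure in miniature: observation and model improvement feed back into each other, and the envisioned collaborative platforms would embed domain experts and AI/ML tools in exactly this kind of loop, at vastly greater scale.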

It is clear, however, that AI is not a solution on its own; we cannot wait for AIs of sufficient intelligence to arrive that will essentially solve problems such as climate change or energy production for us. However, AI or ML tools used in carefully designed collaborative processes may facilitate the design and building of platforms that are able to address global change. For additional general insight into the building of complex systems models, and an analysis of, and tools to support, the process of understanding the importance of assumptions about the domain of knowledge and the model design process, we suggest the edited volume by Stepney and Polack (2018).

Complexity Thinking

Complex problems may require us to use modes of thinking that are able to deal with the complexity of the problem which we are trying to solve. Complexity thinking recognizes that complex adaptive systems are not causal in the sense that they can be easily changed from a current state to a desirable future state through a linear process of adjustment. Rather there should be a focus on describing the present, and “then acting in that situated now to test and enable the evolutionary potential of the system” (Snowden 2011). The work of Edgar Morin introduced complex thinking as a new mode of thinking that integrates some aspects of the complexity of the world (as highlighted by complex systems theory) into the process of thinking itself (Waddington 1977; Morin 2014). The intention is that complex modes of thinking would support our understanding of processes of change in the world and allow us to better manage that change. However, more work in this area is required to move from a conceptual understanding of complex thinking to an operational definition that can actually be used to develop strategies and tools to support modes of thinking to address complex problems such as global change.


Conclusion

The process of change at a global level presents a significant challenge, not only to those interested in its research but also to society as a whole. The long-term viability of human civilization might be significantly impacted by our ability to develop an understanding of how global systems change and how human interventions affect that change. Complex systems theory and complex networks should form part of the tools and methods deployed to improve our understanding of how global systems are connected, and the significance of those connections. It is unlikely to ever be the case that we will be able to make targeted interventions in global systems to produce desirable and predictable outcomes in those systems. That ability would assume that causation in complex systems can be understood in what would be essentially a deterministic manner, where the reality is unlikely to be that straightforward. However, this entry has demonstrated two possible ways forward.

The emergence of artificial intelligence and machine learning tools capable (when applied with sufficient care) of finding patterns and structures in large complex datasets could provide an opportunity for improving our understanding of global systems. Through a process of co-production, with system domain experts and software and data experts, models, tools, and methods could be developed to tackle change at the global level. Such a process should be rigorous and thoroughly documented, and there are methods being developed to support this form of research (Stepney and Polack 2018). It would need to be a large-scale and sustained application of approaches to analyze data and produce models to understand how global systems can behave and under what circumstances. This data analysis and model building could also be supported by new modes of thinking. Complexity thinking recognizes that in order to address complex problems, the process of thinking itself needs to integrate aspects of the complexity of the systems of interest. This integration is required to evolve the system from one state to another, recognizing that simple causal relationships may not exist in complex systems.

It should be emphasized that complex systems theory, complexity thinking, AI, and ML are not easy, quick solutions (even when combined) to what is a difficult problem. Even if artificial intelligence develops further in sophistication and capability, it will most likely not be a solution in itself. Any capacity to change global systems in response to problems such as climate change will only come from the recognition that a problem exists, followed by a sustained, multifaceted effort to address it. This is perhaps the most significant problem of all: at present, the response to the possibility of systemic failure appears to depend more on ideology than on systemic thinking, and in the case of climate change this could prove catastrophic.

In the case of the global financial crisis there was a global response of sorts, perhaps because it was more in the immediate interests of those in or close to power. To address large-scale systemic failure, regional interests would need to give way to global interests. Addressing climate change will require a response greater and more coordinated than the response to the potential collapse of the financial sector in 2008, and climate change is unlikely to be an isolated problem. Change in global systems is not confined by political regions or borders; however, to date regions and borders have been a barrier to developing and implementing solutions. We therefore need to address change not only in individual global systems such as the climate, but across all global systems, including the political and financial.



References

  1. Alexander R, Hall-May M, Kelly T (2004) Characterisation of systems of systems failures. In: Proceedings of the 22nd international system safety conference, Providence, RI, USA, pp 499–508
  2. Allen P (2001) What is complexity science? Knowledge of the limits to knowledge. Emergence 3(1):24–42
  3. Anderson S, Cavanagh J, Redman J (2008) How the bailouts dwarf other global crisis spending. Institute for Policy Studies, Washington, DC
  4. Bentley RA, Maddison EJ, Ranner PH, Bissell J, Caiado CCS, Bhatanacharoen P, Clark T, Botha M, Akinbami F, Hollow M, Michie R, Huntley B, Curtis SE, Garnett P (2014) Social tipping points and Earth systems dynamics. Front Environ Sci 2:35
  5. Bishop CM (2006) Pattern recognition and machine learning. Springer, Basingstoke, UK
  6. Boccaletti S, Latora V, Moreno Y, Chavez M, Hwang D-U (2006) Complex networks: structure and dynamics. Phys Rep 424(4):175–308
  7. Bradshaw CJA, Sodhi NS, Peh KS-H, Brook BW (2007) Global evidence that deforestation amplifies flood risk and severity in the developing world. Glob Chang Biol 13(11):2379–2395
  8. Brook BW, Ellis EC, Perring MP, Mackay AW, Blomqvist L (2013) Does the terrestrial biosphere have planetary tipping points? Trends Ecol Evol 28(7):396–401
  9. Buldyrev SV, Parshani R, Paul G, Stanley HE, Havlin S (2010) Catastrophic cascade of failures in interdependent networks. Nature 464(7291):1025–1028
  10. Carpenter SR, Cole JJ, Pace ML, Batt R, Brock WA, Cline T, Coloso J et al (2011) Early warnings of regime shifts: a whole-ecosystem experiment. Science 332(6033):1079–1082
  11. Coates A, Ng A, Lee H (2011) An analysis of single-layer networks in unsupervised feature learning. In: Proceedings of the fourteenth international conference on artificial intelligence and statistics, pp 215–223
  12. Cohen JE, Newman CM (1985) When will a large complex system be stable? J Theor Biol 113(1):153–156
  13. Cook BI, Miller RL, Seager R (2009) Amplification of the North American 'dust bowl' drought through human-induced land degradation. Proc Natl Acad Sci U S A 106(13):4997–5001
  14. Dearing JA, Wang R, Zhang K, Dyke JG, Haberl H, Sarwar Hossain M, Langdon PG et al (2014) Safe and just operating spaces for regional social-ecological systems. Glob Environ Chang Hum Policy Dimens 28(Suppl C):227–238
  15. Fahrig L (2003) Effects of habitat fragmentation on biodiversity. Annu Rev Ecol Evol Syst 34(1):487–515
  16. Falk RA, Stein RE (1970) Toward equilibrium in the world order system. Am J Int Law 64(4):217–226
  17. Forrester JW (1971) World dynamics. Wright-Allen Press, Cambridge, MA
  18. Freed CD, Samson M (2004) Native Alaskan dropouts in Western Alaska: systemic failure in native Alaskan schools. J Am Indian Educ 43(2):33–45
  19. Gardner M (1970) Mathematical games – the fantastic combinations of John Conway's new solitaire game 'life'. Sci Am 223:120–123
  20. Garnett P (2018) Total systemic failure? Sci Total Environ 626:684–688
  21. Guerra CA, Snow RW, Hay SI (2006) A global assessment of closed forests, deforestation and malaria risk. Ann Trop Med Parasitol 100(3):189–204
  22. Gunderson LH (2000) Ecological resilience – in theory and application. Annu Rev Ecol Syst 31:425–439
  23. Helbing D (2013) Globally networked risks and how to respond. Nature 497(7447):51–59
  24. Hornbeck R (2012) The enduring impact of the American dust bowl: short- and long-run adjustments to environmental catastrophe. Am Econ Rev 102(4):1477–1507
  25. Hughes TP, Carpenter S, Rockström J, Scheffer M, Walker B (2013) Multiscale regime shifts and planetary boundaries. Trends Ecol Evol 28(7):389–395
  26. Jantsch E (1971) World dynamics. Futures 3(2):162–169
  27. Karabanov E, Williams D, Kuzmin M, Sideleva V, Khursevich G, Prokopenko A, Solotchina E et al (2004) Ecological collapse of Lake Baikal and Lake Hovsgol ecosystems during the last glacial and consequences for aquatic species diversity. Palaeogeogr Palaeoclimatol Palaeoecol 209(1):227–243
  28. Kauffman S (1996) At home in the universe: the search for the laws of self-organization and complexity. Oxford University Press, Middlesex, England
  29. Kauffman S, Clayton P (2006) On emergence, agency, and organization. Biol Philos 21(4):501–521
  30. Lenton TM, Williams HTP (2013) On the origin of planetary-scale tipping points. Trends Ecol Evol 28(7):380–382
  31. Lövbrand E, Stripple J, Wiman B (2009) Earth system governmentality: reflections on science in the Anthropocene. Glob Environ Chang Hum Policy Dimens 19(1):7–13
  32. Mann ME, Zhang Z, Hughes MK, Bradley RS, Miller SK, Rutherford S, Ni F (2008) Proxy-based reconstructions of hemispheric and global surface temperature variations over the past two millennia. Proc Natl Acad Sci U S A 105(36):13252–13257
  33. May RM (1972) Will a large complex system be stable? Nature 238(5364):413–414
  34. Mitchell M (2006) Complex systems: network thinking. Artif Intell 170(18):1194–1212
  35. Morin E (2014) Complex thinking for a complex world – about reductionism, disjunction and systemism. Systema: Connecting Matter Life Cult Technol 2(1):14–22
  36. Newman M, Barabási A-L, Watts DJ (2011) The structure and dynamics of networks. Princeton University Press, Princeton, NJ
  37. Post E, Steinman BA, Mann ME (2018) Acceleration of phenological advance and warming with latitude over the past century. Sci Rep 8(1):3927
  38. Recknagel F (2001) Applications of machine learning to ecological modelling. Ecol Model 146(1):303–310
  39. Richardson KA, Lissack MR (2001) On the status of boundaries, both natural and organizational: a complex systems perspective. Emergence 3(4):32–49
  40. Russell SJ, Norvig P (2016) Artificial intelligence: a modern approach. Pearson Education Limited, Malaysia
  41. Sánchez-Bayo F, Wyckhuys KAG (2019) Worldwide decline of the entomofauna: a review of its drivers. Biol Conserv 232:8–27
  42. Scheffer M, Bascompte J, Brock WA, Brovkin V, Carpenter SR, Dakos V, Held H, van Nes EH, Rietkerk M, Sugihara G (2009) Early-warning signals for critical transitions. Nature 461(7260):53–59
  43. Schubert SD, Suarez MJ, Pegion PJ, Koster RD, Bacmeister JT (2004) On the cause of the 1930s dust bowl. Science 303(5665):1855–1859
  44. Schulman LS, Seiden PE (1978) Statistical mechanics of a dynamical system based on Conway's game of life. J Stat Phys 19(3):293–314
  45. Snowden D (2011) Babies should not be thrown out with bathwater. Cognitive Edge, 16 November 2011
  46. Stepney S (2018) Complex systems for narrative theorists. In: Walsh R, Stepney S (eds) Narrating complexity. Springer International Publishing, Cham, pp 27–36
  47. Stepney S, Polack FAC (2018) Engineering simulations as scientific instruments: a pattern language. With Kieran Alden, Paul S Andrews, James L Bown, Alastair Droop, Richard B Greaves, Mark Read, Adam T Sampson, Jon Timmis, Alan FT Winfield. Springer International Publishing, Switzerland
  48. Tscharntke T, Clough Y, Wanger TC, Jackson L, Motzke I, Perfecto I, Vandermeer J, Whitbread A (2012) Global food security, biodiversity conservation and the future of agricultural intensification. Biol Conserv 151(1):53–59
  49. Turcotte DL, Rundle JB (2002) Self-organized complexity in the physical, biological, and social sciences. Proc Natl Acad Sci U S A 99(Suppl 1):2463–2465
  50. Vespignani A (2010) Complex networks: the fragility of interdependency. Nature 464(7291):984–985
  51. Von Bertalanffy L (1968) General system theory. George Braziller, New York
  52. von Foerster H (2003) Cybernetics of cybernetics. In: von Foerster H (ed) Understanding understanding: essays on cybernetics and cognition. Springer, New York, pp 283–286
  53. Waddington CH (1977) Tools for thought. Cape, London
  54. Watts DJ (2002) A simple model of global cascades on random networks. Proc Natl Acad Sci U S A 99(9):5766–5771
  55. Wiener N (1948) Cybernetics: control and communication in the animal and the machine. Wiley, Oxford

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. York Center for Cross-disciplinary Systems Analysis & School of Management, University of York, York, UK

Section editors and affiliations

  • Federica Doni
  1. University of Milano-Bicocca, Milano, Italy