On system-wide failures in complex, evolving systems
Related papers
Risk Management, 2010
Social systems are evolving, multi-scale, spatio-temporal structures with emergent functions, needs and capabilities. One sensible definition of an 'extreme' event would be one that leads to some qualitative, structural change in the system, as it goes beyond the system's current 'control limits'. Indeed, we can view any current structure and organization that characterize a system as being forged out of the past events and crises that led to change and innovation. Each individual, group, firm, corporation, shareholder and observer experiences 'path-dependent learning', whereby current learning is conditioned by the events and decisions of the past. In reality, the 'new' system or structure that comes into being at a given time, although initially seen as a successful response to some past problem, will move into an unknowable future that will necessarily end in some new crisis: the emergence of unexpected new factors, effects or implications that will have to be dealt with. A simple probabilistic model of event dynamics is presented, showing that discussions of power laws and self-organized criticality assume a stationary situation with fixed event probabilities. We contrast this with examples from human systems that involve qualitative, structural change and path-dependent learning. The first example involves the handling of severe motorway traffic accidents, and the second describes the evolution of supply chains by trial and error, as new practices are adopted or rejected in a changing competitive economic environment. Social and economic systems must evolve, and will do so through the occurrence of seemingly 'extreme' events, requiring adaptation and change. The important lesson, therefore, is to accept that there will always be the potential for unexpected and surprising events, and that we need to build systems that are robust and resilient to their occurrence.
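The abstract stops short of reproducing the model itself. Purely as a hedged illustration of the stationary picture the paper argues against, the sketch below simulates the classic Bak-Tang-Wiesenfeld sandpile (our choice of contrast case, not the paper's model), whose avalanche sizes come out approximately power-law distributed precisely because the system sits at a fixed, self-organized critical state rather than undergoing structural change:

```python
import random
from collections import Counter

# Bak-Tang-Wiesenfeld sandpile on an open-boundary N x N grid: the textbook
# self-organized-criticality system, shown here only as a contrast case.
N = 20
grid = [[0] * N for _ in range(N)]

def topple(grid, i, j):
    """Relax the pile starting from site (i, j); return the avalanche size
    (the total number of toppling events)."""
    size = 0
    unstable = [(i, j)]
    while unstable:
        i, j = unstable.pop()
        if grid[i][j] < 4:
            continue
        grid[i][j] -= 4
        size += 1
        if grid[i][j] >= 4:                    # a tall site may topple again
            unstable.append((i, j))
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < N and 0 <= nj < N:    # grains at the edge fall off
                grid[ni][nj] += 1
                if grid[ni][nj] >= 4:
                    unstable.append((ni, nj))
    return size

sizes = Counter()
for _ in range(50_000):
    i, j = random.randrange(N), random.randrange(N)
    grid[i][j] += 1                     # drop one grain at a random site
    sizes[topple(grid, i, j)] += 1      # record the avalanche it triggers

# After a transient, the non-zero avalanche sizes fall off roughly as a
# power law -- the stationary event-probability assumption the paper
# contrasts with structurally evolving human systems.
for s in (1, 2, 4, 8, 16, 32):
    print(f"avalanches of size {s:2d}: {sizes[s]}")
```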
Innovation in Complex Adaptive Systems
2014
Our society is increasingly beset by a range of interrelated crises, with the financial crisis, the energy crisis, and the global warming crisis as leading examples, forming a "meta-crisis" with its roots in processes deeply entrenched in society (Lane et al., 2011) and emanating from large-scale complex adaptive systems so strongly interlinked that they are hard even to define and delimit. This has made our lack of understanding of such systems simultaneously more obvious and more threatening, an issue further amplified by empirical developments brought about by new information and communication technology. In response, a substrate of semi-congruent critiques and new ideas, the former generally more articulated than the latter, is emerging in a number of major disciplines facing similar challenges, but still without the theoretical foundation needed to align and direct this substrate across disciplinary boundaries. The first part of the thesis attempts to develop...
Cascades of failure and extinction in dynamically evolving complex systems
Noise and Stochastics in Complex Systems and Finance, 2007
There is empirical evidence from a range of disciplines that as the connectivity of a network increases, the average fitness of the system increases. At the same time, however, the proportion of failure/extinction events which are extremely large also increases. The probability of observing an extreme event remains very low, but it is markedly higher than in a system with a lower degree of connectivity. We are therefore concerned with systems whose properties are not static but which evolve dynamically over time. The focus of this paper, motivated by the empirical examples, is on networks in which the robustness or fragility of the vertices is not given, but itself evolves over time. We give examples from complex systems such as outages in the US power grid, the robustness properties of cell biology networks, and trade links and the propagation of both currency crises and disease. We consider systems populated by agents which are heterogeneous in terms of their fitness for survival. The agents are connected on a network, which evolves over time. In each period, agents take self-interested decisions to increase their fitness for survival by forming alliances, which increase the connectivity of the network. The network is subjected to external negative shocks that vary in both size and spatial impact. We examine the size/frequency distribution of extinctions and how this distribution evolves as the connectivity of the network grows. The results are robust with respect to the choice of statistical distribution of the shocks. The model is deliberately kept as parsimonious and simple as possible, and refrains from incorporating features such as increasing returns and externalities arising from preferential attachment, which might bias the model in the direction of having the empirically observed features of many real-world networks. The model still generates results consistent with the empirical evidence: increasing the number of connections increases the average fitness of agents, yet at the same time makes the system as a whole more vulnerable to catastrophic failure/extinction events on a near-global scale.
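The abstract describes the model only in words. A minimal sketch along the lines it describes, in which every numeric parameter, the fitness-boost rule, and the damped shock-propagation rule are our own assumptions rather than the authors' specification, might look like this:

```python
import random

def run(n_agents=200, links_per_period=4, periods=500, damping=0.5, seed=0):
    """Toy version of the model described above: heterogeneous agents form
    alliances that raise fitness and connectivity, and external shocks
    propagate along links. Returns the list of extinction-cascade sizes."""
    rng = random.Random(seed)
    fitness = [rng.random() for _ in range(n_agents)]
    neighbors = [set() for _ in range(n_agents)]
    cascades = []

    for _ in range(periods):
        # Alliance formation: random pairs link up; each new link gives
        # both endpoints a small fitness boost.
        for _ in range(links_per_period):
            a, b = rng.sample(range(n_agents), 2)
            if b not in neighbors[a]:
                neighbors[a].add(b); neighbors[b].add(a)
                fitness[a] += 0.01; fitness[b] += 0.01

        # External shock: hits one random agent, then spreads outward with
        # geometrically damped magnitude, killing agents it overwhelms.
        shock = rng.expovariate(1.0)
        frontier, seen, dead = {rng.randrange(n_agents)}, set(), set()
        while frontier and shock > 0.01:
            nxt = set()
            for i in frontier:
                seen.add(i)
                if fitness[i] < shock:
                    dead.add(i)
                    nxt |= neighbors[i]
            frontier = nxt - seen
            shock *= damping

        # Extinction: dead agents lose their links and are replaced by
        # fresh entrants with random fitness.
        for i in dead:
            for j in neighbors[i]:
                neighbors[j].discard(i)
            neighbors[i] = set()
            fitness[i] = rng.random()
        if dead:
            cascades.append(len(dead))
    return cascades

for k in (1, 4, 16):
    c = run(links_per_period=k)
    print(f"links/period={k:2d}  events={len(c):3d}  "
          f"mean size={sum(c) / max(len(c), 1):5.2f}  max size={max(c, default=0)}")
```

Even in this stripped-down form, raising `links_per_period` tends to raise average fitness while fattening the tail of the cascade-size distribution, which is the qualitative trade-off the abstract reports.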
Some Notions of Complex Adaptive Systems and their Relationship to Our World
Proceedings of InterSymp-99, the 11th International Conference on Systems Research, Informatics and Cybernetics, Baden-Baden, Germany, 1999
This paper assumes that the world we live in consists of many interwoven complex adaptive systems. In the literature such systems are characterized as comprising many moving parts and processes that interact significantly and in a mostly non-linear manner. It is argued that the forces that act on such complex systems are not additive, and that their impact occurs through non-linear interactions within the system. In particular, many of these systems are adaptive in that they change their behavior (through their interactions) over time, so that if they are subjected to a similar force or event a second time they may react in a substantially different manner. Over the past several decades a number of researchers have been intrigued by the abundance of complex adaptive systems in both the natural and human worlds (Holland 1995, Cowan et al. 1994, Kauffman 1992). In the natural world these include the human brain, immune systems, ecologies, cells, and many others. In the human world, where it is well known that the interaction of just two persons can reach a high level of complexity, they include cultural and social systems. The paper describes the behavioral characteristics of complex adaptive systems, as they have been identified in the literature, and briefly discusses how these characteristics appear to manifest themselves in the evolution of technology, biology, and the economy. The influence of complex adaptive system notions on the structure and operation of a business organization is addressed with respect to decentralization, diversification, communication, and organizational flexibility. Attention is drawn to the inadequacies of existing quantitative tools and to the opportunities that exist for leveraging human behavioral characteristics and organizational capabilities in a complex adaptive systems environment.
Contradictions and critical issues during system evolution
2002
In this paper the issue of system evolution is addressed. Activity Theory and the concept of expansive cycles are reviewed as theories to explain systemic evolution. Contradictions often manifest themselves in deviating human behaviour or in modifications to external artefacts; i.e., they result in a form of systemic behaviour which has often been treated as undesirable. It is shown that contradictions within activity systems are both catalysts and opportunities for system change.
2000
A developmental view of evolving systems (ecological, social, economic, organizational) is proposed to clarify 1) the role of selection processes versus collective, non-selective processes, 2) the origins of diversity and its role in system performance and robustness, 3) the precursors of explicit subsystem interactions (cooperation/competition/symbiosis) that enhance system performance, 4) the preconditions necessary for further evolutionary development, and 5) the conditions for the return to a prior stage. Three sequential stages of evolving systems (based on the work of Salthe) are proposed: an Immature or Developing stage dominated by highly decentralized, selective processes with chaotic dynamics (local and global); a Mature or Developed stage dominated by non-selective, self-organizing processes with global robustness but locally chaotic dynamics; and a Senescent stage dominated by rigid interactions with global fragility. A simple model problem, an agent solution to a maze (a sequential problem with many optimal and non-optimal solutions), is constructed to illustrate the entire developmental history and to clarify the above issues. Within the model, the agents evolve their capability from a random approach to an optimized performance by natural selection (agents with long paths die). As the agents develop improved capability (shorter paths), natural selection becomes rare (few agents die), and an emergent collective solution is observed that is better than the performance of an average agent. Natural selection is neither expressed nor important for the higher performance of the system. As the collective, self-organizing structures are incorporated into individual capability within a stable environment, coherence (constraints) arises in the agents' interactions, and the agents lose diversity. The resulting Senescent system exhibits reduced randomness due to the rigid structures and ultimately becomes fragile (all levels are sensitive to noise). With sufficient environmental change the senescent system will return to either the mature or the developing stage.
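As an illustration only, here is a minimal evolutionary sketch of the maze model, with an open grid standing in for the maze and rank-based truncation standing in for "agents with long paths die" (both our assumptions, not the paper's exact setup):

```python
import random

# Agents encode a fixed-length move sequence; the half of the population
# with the longest paths dies each generation and is replaced by mutated
# copies of survivors.
rng = random.Random(1)
SIZE, GOAL, GENOME_LEN = 6, (5, 5), 300
MOVES = [(1, 0), (-1, 0), (0, 1), (0, -1)]

def path_length(genome):
    """Steps taken to reach GOAL from (0, 0); walls clamp the walk, and
    genomes that never arrive score the full GENOME_LEN."""
    x = y = 0
    for step, m in enumerate(genome, start=1):
        dx, dy = MOVES[m]
        x = min(max(x + dx, 0), SIZE - 1)
        y = min(max(y + dy, 0), SIZE - 1)
        if (x, y) == GOAL:
            return step
    return GENOME_LEN

def mutate(genome, k=3):
    g = list(genome)
    for _ in range(k):                       # resample a few random moves
        g[rng.randrange(len(g))] = rng.randrange(4)
    return g

pop = [[rng.randrange(4) for _ in range(GENOME_LEN)] for _ in range(100)]
for gen in range(60):
    scored = sorted((path_length(g), g) for g in pop)
    survivors = [g for _, g in scored[:50]]  # the longer-path half dies
    pop = survivors + [mutate(rng.choice(survivors)) for _ in range(50)]
    if gen % 15 == 0 or gen == 59:
        lengths = [L for L, _ in scored]
        print(f"gen {gen:2d}: best {lengths[0]:3d}, "
              f"mean {sum(lengths) / len(lengths):6.1f}")
```

As the population converges on short paths, genome diversity collapses, which is the precursor of the rigid, fragile Senescent stage described above.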
Complex systems: Why do they need to evolve and how can evolution be supported
We live in a world characterized by evolution, that is, by ongoing processes of development, formation, and growth in both natural and human-created systems. Biology tells us that complex, natural systems are not created all at once but must instead evolve over time. We are becoming increasingly aware that evolutionary processes are ubiquitous and critical for social, educational, and technological innovations as well. The driving forces behind the evolution of these systems are their use by communities of practice in solving real-world problems, as well as the changing nature of the world, specifically as it relates to technology. The seeding, evolutionary growth, and reseeding model is a process description of how this happens. By integrating working and learning in communities of practice, we have created organizational memories that include mechanisms to capture and represent task specifications, work artifacts, and group communications. These memories facilitate organizational learning...
Complex Adaptive Systems, Systems Thinking, and Agent-Based Modeling
Systems thinking and complex adaptive systems theory share a number of components, namely emergence, self-organization, and hierarchies of interacting systems. We seek to integrate these schools of thought, to discuss the similarities and differences of the two models, to introduce system dynamics and agent-based modeling as methods for modeling complex systems, and to show how causal-loop diagrams can be used as a means to clarify the complex interactions among components (agents). We then apply a mixture of these different but similar techniques to a fly ecosystem modeling problem to demonstrate their effectiveness.

1.1 Complex Adaptive Systems

Complex adaptive systems (CAS) are all around us. Common examples are ecosystems, financial markets, the brain, ant colonies, economies, and many other settings where large numbers of constituents independently interact on a local level and yield some unanticipated nonlinear outcome at scale. Despite the ubiquity of these systems, it is generally conceded that there is no single standard definition of CAS. For our purposes, we define a CAS as: a system composed of a large number of independent simple components that interact locally in an independent and nonlinear fashion, exhibit self-organization through interactions that are neither completely random nor completely regular and are not influenced by any central or global mechanism, and yield emergent behavior at large scales that is not predictable from observation of the behavior of the components [1, 2]. The smallest component elements of a CAS are commonly referred to as agents [3]. Agents are the smallest units of organization in the system capable of producing a given response to a specific stimulus. This stimulus/response behavior of an agent is governed by a few very simple rules. In a CAS we see local interactions of groups of agents, both homogeneous and heterogeneous, in a variety of different configurations. In small quantities these interactions can be anticipated, as there is usually a limited set of interactions that each agent can perform, and they generally yield outcomes approximately equal to the sum of the potentials of the individual interactions. In some cases, however, as larger combinations of agents in varying proportions act in different ways, we see complex and potentially novel behaviors that yield significantly greater outcomes than we would expect. When agents combine in such a way as to produce these emergent behaviors, we refer to this as aggregation, and to the specific collection of agents required to produce the effects as aggregate agents [3]. These aggregate agents group together with other aggregate agents to form increasingly larger CAS with richer sets of emergent behaviors and interactions.
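As a concrete toy instance of this definition (ours, not the chapter's fly-ecosystem model), the sketch below puts agents on a ring, each following a single stimulus/response rule; the large uniform domains that form are an aggregate-level feature encoded in no individual rule:

```python
import random

# Minimal CAS illustration: agents on a ring each follow one local rule --
# adopt the majority state among the three neighbours on either side.
# No agent sees the global state, yet coarse uniform domains emerge.
rng = random.Random(42)
N, RADIUS, STEPS = 200, 3, 30
state = [rng.randint(0, 1) for _ in range(N)]

def step(cells):
    """One synchronous update: each cell takes the strict majority of the
    2 * RADIUS + 1 cells centred on it (ring topology)."""
    return [1 if sum(cells[(i + d) % N]
                     for d in range(-RADIUS, RADIUS + 1)) > RADIUS else 0
            for i in range(N)]

def render(cells):
    return "".join("#" if s else "." for s in cells)

print(render(state))        # random initial condition
for _ in range(STEPS):
    state = step(state)
print(render(state))        # coarse domains: emergent, not programmed
```

The domains appear only in the collective pattern, which is the sense of "aggregation" and emergent behavior used above.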