Friday 10 October 2014

COMPLEXITY TIME BOMB: When systems get out of control


 by Dirk Helbing

Photo: Renate Wernli

This is the second in a series of blog posts that form chapters of my forthcoming book Digital Society. Last week's chapter was titled: GENIE OUT OF THE BOTTLE: The digital revolution on its way.

Financial crises, terrorism, conflict, crime: it turns out that the conventional ‘medicines’ used to tackle global problems are often ineffective or even counter-productive. The reason for this is surprisingly simple: we approach these problems with an outdated understanding of our world. While the world might still look much as it has for a long time, I will argue that it has, in fact, inconspicuously but fundamentally changed.

We are used to the idea that societies must be protected from external threats such as earthquakes, volcanic eruptions, hurricanes, and military attacks by enemies. However, we are increasingly threatened by another kind of problem: those that come from within the system, such as financial instabilities, economic crises, social and political unrest, organized crime and cybercrime, environmental change, and spreading diseases. These threats have become some of our greatest worries. According to the World Economic Forum's Risk Map, the largest risks today are of a socio-economic nature, such as inequality or governance failure. These global 21st-century problems cannot be solved with 20th-century wisdom, because they are of a different scale and result from a new level of complexity in today's socio-economic systems. We must therefore better understand what complex systems are and what their properties are. To this end, I will discuss the main reasons why things go wrong: unstable dynamics, cascading failures in networks, and systemic interdependencies. I will illustrate these problems with examples such as traffic jams, crowd disasters, blackouts, financial crises, crime, wars, and revolutions.

Phantom traffic jams


Complex systems include phenomena ranging from turbulent flows and the global weather system to decision-making, opinion formation in groups, financial and economic markets, and the evolution and spread of languages. But we must take care to distinguish complex systems from complicated ones. A car is complicated: it consists of thousands of parts, yet is easy to control (when it works properly). Traffic flow, on the other hand, which depends on the interactions of many cars, is a complex dynamical system that produces counter-intuitive, individually uncontrollable behaviors such as "phantom traffic jams," which seem to have no cause. While many traffic jams do occur for a specific, identifiable reason, such as an accident or a building site, everyone has also encountered situations where a vehicle queue appeared "out of nothing," with no visible cause (see visualisation).

To explore the true reasons for these phantom traffic jams, Yuki Sugiyama of Nagoya University in Japan and his colleagues carried out an experiment in which they asked many people to drive their cars around a circular track (see visualisation). The task sounds simple, and indeed all vehicles moved smoothly for some time. But then a random perturbation in the traffic flow – an unexpected slow-down of one car – triggered the appearance of “stop-and-go” traffic: a traffic jam that travelled backwards around the track, against the driving direction.

While we often blame such "phantom traffic jams" on others' poor driving skills, studies in complexity science have shown that they emerge instead as a collective phenomenon that results unavoidably from the interactions between vehicles. A detailed analysis shows that, if the density of cars exceeds a certain "critical" threshold – that is, if their average separation falls below a certain value – then the smallest perturbation in the speed of any car will be amplified until the entire flow breaks down. Because drivers need some time to respond to such a disturbance, the next driver in line will have to brake harder to avoid an accident. Then the following driver will have to brake even harder, and so on. This chain reaction amplifies the small initial perturbation and eventually produces the jam – which, of course, every individual would prefer to avoid.
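The critical-density argument can be made concrete with a toy car-following simulation. The sketch below uses the optimal-velocity model of Bando et al. – a standard textbook choice in traffic physics, not the specific model of the experiment above – with illustrative parameters. In this model, a uniform flow on a ring becomes linearly unstable once the average headway drops below a critical value, so the very same small perturbation dies out on a sparse ring but grows into stop-and-go waves on a dense one.

```python
import math

def optimal_velocity(headway):
    # desired speed as a function of the gap to the car ahead
    # (Bando et al. form; an illustrative, not empirical, choice)
    return math.tanh(headway - 2.0) + math.tanh(2.0)

def headway_spread(n_cars, ring_length, steps=3000, dt=0.05, sensitivity=1.0):
    """Drive n_cars around a ring, perturb one car slightly, and return the
    standard deviation of the headways at the end (0 = perfectly uniform flow)."""
    h0 = ring_length / n_cars
    x = [i * h0 for i in range(n_cars)]
    v = [optimal_velocity(h0)] * n_cars
    x[0] += 0.1  # a single small perturbation
    for _ in range(steps):
        acc = []
        for i in range(n_cars):
            gap = (x[(i + 1) % n_cars] - x[i]) % ring_length
            acc.append(sensitivity * (optimal_velocity(gap) - v[i]))
        for i in range(n_cars):
            v[i] = max(0.0, v[i] + acc[i] * dt)
            x[i] = (x[i] + v[i] * dt) % ring_length
    gaps = [(x[(i + 1) % n_cars] - x[i]) % ring_length for i in range(n_cars)]
    mean = sum(gaps) / n_cars
    return (sum((g - mean) ** 2 for g in gaps) / n_cars) ** 0.5

# dense ring (average headway 2.0): above the critical density -> jam grows
dense = headway_spread(n_cars=30, ring_length=60)
# sparse ring (average headway 4.0): below the critical density -> flow recovers
sparse = headway_spread(n_cars=15, ring_length=60)
print(dense, sparse)
```

The same perturbation, the same drivers: only the density differs, yet the dense ring ends up with strongly uneven gaps (a travelling jam) while the sparse one relaxes back to uniform flow.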

Recessions - traffic jams in the world economy?


Economic supply chains may exhibit a similar kind of behavior. As known from John Sterman's "beer distribution game," supply chains are also hard to control. Even experienced managers will often end up ordering too much beer, or will run out of it – a situation that is as difficult to avoid as stop-and-go traffic. In fact, our scientific work suggests that economic recessions may be regarded as a kind of traffic jam in the global supply network (see figure below). This is actually somewhat heartening news, since it implies that, just as with traffic flow, engineered solutions may exist that can mitigate economic recessions, provided that we have access to real-time data on the world's supplies and material flows. Such solutions will be discussed later, in the chapter on Socio-Inspired Technologies.
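The ordering dynamics can be illustrated with a minimal sketch of one stage of such a supply chain – a drastically simplified, hypothetical version of the beer game, not Sterman's original setup. A fixed shipping delay plus an ordering policy that ignores stock already on order is enough to turn a one-time step in customer demand into sustained order oscillations (the "bullwhip effect"):

```python
def simulate_chain(periods=60, delay=2):
    """One stage of a supply chain with a shipping delay and a naive
    order-up-to policy -- a toy version of the beer distribution game."""
    target = 20                 # desired inventory level
    inventory = 20
    pipeline = [4] * delay      # shipments already under way
    demands, orders = [], []
    for t in range(periods):
        demand = 4 if t < 10 else 8   # a one-time step in customer demand
        inventory += pipeline.pop(0)  # delayed shipment arrives
        inventory -= demand           # customers are served
        # naive policy: reorder last demand plus the whole inventory gap,
        # ignoring the stock that is already in the pipeline
        order = max(0, demand + (target - inventory))
        pipeline.append(order)
        demands.append(demand)
        orders.append(order)
    return demands, orders

def swing(xs):
    return max(xs) - min(xs)

demands, orders = simulate_chain()
print("demand swing:", swing(demands))  # 4
print("order swing:", swing(orders))    # 16: the demand step is amplified fourfold
```

Demand changes once, by 4 units, yet the orders placed upstream oscillate between 0 and 16 indefinitely – over-ordering and stock-outs alternate, exactly the pattern the beer game produces with human players.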

Instability and self-organization in strongly interacting systems


A shocking example of systemic instabilities, discussed later, is the occurrence of crowd disasters: even when everyone is peacefully minded and tries to avoid harming others, many people might die. What do all these examples tell us? Our experience will often not inform us well, and our intuition may fail, since complex dynamical systems tend to behave in unexpected or even counter-intuitive ways. Such systems are typically made up of many interacting components, which respond to the behavior of other system components. As a consequence of these interactions, complex dynamical systems tend to self-organize, i.e. to develop a collective behavior that is different from what the components would do in separation. The components’ individual properties are then often no longer characteristic of the system. "Chaotic" or "turbulent" dynamics are possible outcomes, but complex systems can show many other phenomena.

When self-organization occurs, one often speaks of emergent phenomena, which are characterized by new system properties that cannot be understood from the properties of the single components. For example, the facts that water feels wet, extinguishes fires, and freezes at a particular temperature are properties that cannot be derived from those of single water molecules.

As a consequence of the above, we have to shift our attention from the components of our world to their interactions. In other words, we need a change from a component-oriented to an interaction-oriented, systemic view, which is at the heart of complexity science. I claim that this change in perspective, once it becomes common wisdom, will be of similar importance as the transition from the geocentric to the heliocentric worldview. The related paradigm shift has fundamental implications for the way in which complex techno-socio-economic systems must be managed and, hence, also for politics and our economy. Focusing on the interactions in a system, and on the multi-level emergent dynamics resulting from them, opens up fundamentally new solutions to long-standing problems.

Instability is one possible behavior of complex dynamical systems, which results when the characteristic system parameters cross certain critical thresholds. If a system is unstable, i.e. if perturbations are amplified, a random, small deviation from the normal system state may trigger a domino effect that cannot be stopped, even if people have the best intentions to do so and have enough information, good technology, and proper training. In such situations of systemic instability, the system will inevitably get out of control sooner or later, no matter how hard we try to avoid this. As a consequence, we need to know the conditions under which systems will behave in an unstable way, in order to avoid such conditions. In many cases, overly strong interactions are a recipe for disaster or other undesirable outcomes.

Group dynamics and mass psychology may be seen as typical examples of collective dynamics. People have often wondered what makes a crowd turn "mad", violent, or cruel. After the London riots in 2011, people asked how it was possible that teachers and daughters of millionaires – people one would not expect to be criminals – took part in the looting. Did they become criminal minds when their demonstrations against police violence suddenly turned into riots? Possibly, but not necessarily so. In the traffic flow example above, people wanted to do one thing – drive continuously at reasonably high speed – but a phantom traffic jam occurred instead. We found that, while individual cars are well controllable, the traffic flow – a result of the interactions of many cars – is often not. The take-home message may be formulated as follows: complex systems cannot be steered like a car. Even if everyone has the latest technology, is well-informed and well-trained, and has the best intentions, an unstable complex system will sooner or later get out of control.

Therefore, while our intuition works well for weakly coupled systems, in which the system properties can be understood as the sum of the component properties, complex dynamical systems often behave in counter-intuitive, hardly predictable ways. Frequently, the collective, macro-level outcome in a complex system can neither be understood from nor controlled through the system components. (Such system components might also be individuals or companies, for example.)

Beware of strongly coupled systems


So what tends to be different in strongly coupled systems as compared to weakly interacting ones? First, the dynamics of strongly connected systems with positive feedbacks is often faster. Second, self-organization and strong correlations tend to dominate the dynamics of the system. Third, the system behavior is often counter-intuitive – unwanted feedback or side effects are common, and conventional wisdom tends to fail. In particular, extreme events occur more often than expected, and they may impact the entire system. Furthermore, the system behavior can be hard to predict, and planning for the future may not be useful. Opportunities for external control are also typically quite limited, as the system-immanent interactions tend to dominate. Finally, the loss of predictability and control may lead to an erosion of trust in private and public institutions, which in turn can create social, political, or economic instabilities.

In spite of all this, many people still hold a component-oriented, individual-centric view, which can be quite misleading. We praise heroes when things run well and search for scapegoats when something goes wrong. But the discussion above has shown how difficult it is for individuals to control the outcome of a complex dynamical system when its components' interactions are strong. This fact may be illustrated by the example of politics. Why do politicians (alongside managers) have among the worst reputations of all professions? Probably because we elect them to pursue the positions they publicly voice, but then often find them doing something else. This, again, is a consequence of the fact that politicians are exposed to many strong interactions with lobbyists and pressure groups holding various points of view, each trying to push the politician in a different direction. In many cases, this will force a politician to take decisions that are not compatible with his or her own views, which is hard for voters to accept. Managers of companies find themselves in similar situations. And not only they: think of the decision dynamics in many families. If these were easy to control, we would not see so many divorces...

Crime is another good example of unwanted outcomes of complex dynamics, even if a controversial one. We must ask ourselves: are we interested in sustaining social order, or in filling prisons? If we choose the first option, we must confront the question: should we really see all crime as the deeds of criminal minds, as we often do? Or should we pay more attention to the circumstances that happen to cause crime? In cases where individuals plan crimes, such as the theft of a famous diamond, the conventional picture of crime is certainly appropriate. But do such cases give a representative picture?

Classically, it is assumed that crimes are committed if the expected gain is larger than the punishment multiplied by the probability of being convicted. Raising punishments and detection rates should therefore, in theory, eliminate all crime: punishment would make crime a losing proposition and, hence, "unattractive." However, empirical evidence questions this simple picture. On the one hand, people usually don't pick pockets, even though they could often get away unpunished. On the other hand, deterrence strategies are surprisingly ineffective in most countries, and high crime rates often recur. For example, even though the USA has 10 times more prisoners than most European countries, rates of various crimes, including homicides, are still much higher. So, what is wrong with our common understanding of crime?
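The classical deterrence calculus just described can be written down in a few lines. The numbers below are purely illustrative, not empirical:

```python
def expected_crime_payoff(gain, punishment, p_conviction):
    """Classical 'rational choice' view of crime: the deed is attractive
    if the gain exceeds the punishment weighted by the conviction probability."""
    return gain - p_conviction * punishment

# a theft worth 1000 with a fine of 5000 and a 10% chance of conviction
payoff = expected_crime_payoff(1000, 5000, 0.10)
print(payoff)  # 500.0: 'attractive' in this model, despite the large fine

# in the model, better detection flips the sign ...
deterred = expected_crime_payoff(1000, 5000, 0.25)
print(deterred)  # -250.0
# ... which is exactly the clean-cut prediction that the evidence questions
```

The model predicts that crime should vanish as soon as the expected payoff turns negative; the empirical puzzle discussed above is precisely that real crime rates do not follow this arithmetic.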

Surprisingly, many crimes, including murders, are committed by average people, not by people with criminal careers. A closer inspection shows that many crimes result from situations over which the individuals involved lose control. Frequently, group dynamics play an important role, and many scientific studies indicate that the socio-economic context is a strong determinant of crime. Therefore, in order to counter crime, it might be more effective to change these socio-economic conditions than to send more people to jail. I say this also with an eye on the price we pay: a single prisoner costs more than the salary of a postdoctoral researcher with a PhD degree – some even more than a professor!

Cascade effects in complex networks


To make things worse, complex systems may show further problems besides dynamic instabilities based on amplification effects. Thanks to globalization and technological progress, we now have a global exchange of people, goods, money, and information. Worldwide trade, air traffic, the Internet, mobile phones, and social media have made everything much more comfortable – and connected. This has created many new opportunities, but everything now depends on a lot more things. What are the implications of this increased interdependency? Today, a single tweet can send stock markets to hell. A YouTube video can trigger a riot that kills dozens of people. Our decisions can have impacts on the other side of the globe more easily than ever – and sometimes unintentionally so. For example, the rapid spread of emerging epidemics today is largely a result of global air traffic, and can seriously affect global health, social welfare, and economic systems.

By networking our world, have we inadvertently built highways for disaster spreading? Within just a few years, three major cascading failures occurred that are changing the face of the world and the global balance of power: the financial crisis, the Arab Spring, and the combined earthquake, tsunami and nuclear disaster in Japan. In the following, I will discuss some examples of cascade effects in more detail.

Large-scale power blackouts


On November 4, 2006, a power line crossing the river Ems in Germany was temporarily switched off to let a Norwegian ship pass. Within minutes, this caused blackouts in many regions all over Europe – from Germany to Portugal! Nobody expected this. Before the line was switched off, of course, a computer simulation was performed to verify that the power grid would still operate well. But the scenario analysis did not check for the coincident, spontaneous failure of another line. In the end, a local overload of the grid caused emergency switch-offs in the neighborhood, creating a cascade effect with pretty astonishing outcomes: some blackouts occurred in regions thousands of kilometers away, while other areas right next to the original failure were not affected at all. Is it possible to understand this strange behavior?

Indeed, a computer-based simulation study of the European power grid recently managed to reproduce such effects. It demonstrated that the failure of a few network nodes in Spain could create a surprising blackout in Eastern Europe, several thousand kilometers away, while the electricity network in Spain itself would keep working (see visualisation).

Furthermore, increasing the capacities of certain parts of the power grid would unexpectedly make things worse: the cascading failure would be even bigger! Weak elements in the system therefore have an important function: they act like circuit breakers, interrupting the failure cascade. This is an important fact to remember.

Bankruptcy cascades


The sudden financial meltdown in 2008 is another example, one that took many companies and people by surprise. In his presidential address to the American Economic Association in 2003, Robert Lucas said:
"[The] central problem of depression-prevention has been solved."
Similarly, Ben Bernanke, as chairman of the Federal Reserve Board, long believed that the economy was well understood and doing well. In September 2007, Frederic "Rick" Mishkin, a professor at Columbia Business School and then a member of the Board of Governors of the US Federal Reserve System, made a statement reflecting widespread beliefs at the time:
"Fortunately, the overall financial system appears to be in good health, and the U.S. banking system is well positioned to withstand stressful market conditions."

As we all know, things turned out very differently. A banking crisis broke out only shortly afterwards. It started locally, with the bursting of a real-estate bubble in the western USA. Because of this locality, most people thought the problem would be easy to contain. But the mortgage crisis spilled over to the stock markets, where certain financial derivatives (now called "toxic assets") could no longer be sold. Eventually, more than 400 banks all over the United States went bankrupt. How could this happen? The video presents an impressive visualisation of the bankruptcies of banks in the USA after Lehman Brothers collapsed. Apparently, one bank's default triggered further defaults, and these triggered even more. In the end, hundreds of billions of dollars were lost.
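The mechanism of one default triggering the next can be sketched with a toy interbank network, in the spirit of contagion models such as Gai and Kapadia's, with made-up numbers: when a bank fails, its creditors write off what they are owed, and whoever loses more than their capital buffer fails next.

```python
def default_cascade(capital, exposure, first_default):
    """Toy interbank contagion: exposure[i][j] is what bank i is owed by
    bank j. When a bank defaults, each creditor writes that amount off
    against its capital; whoever runs out of capital defaults next."""
    n = len(capital)
    losses = [0.0] * n
    defaulted = {first_default}
    frontier = [first_default]
    while frontier:
        next_wave = []
        for d in frontier:
            for i in range(n):
                if i not in defaulted and exposure[i][d] > 0:
                    losses[i] += exposure[i][d]
                    if losses[i] >= capital[i]:
                        defaulted.add(i)
                        next_wave.append(i)
        frontier = next_wave
    return defaulted

capital = [2.0, 4.0, 5.0, 10.0]           # equity buffers of banks 0..3
exposure = [[0.0] * 4 for _ in range(4)]
exposure[1][0] = 5.0   # bank 1 has lent 5 to bank 0
exposure[2][1] = 6.0   # bank 2 has lent 6 to bank 1
exposure[3][2] = 3.0   # bank 3 has lent 3 to bank 2

defaulted = default_cascade(capital, exposure, first_default=0)
print(sorted(defaulted))  # [0, 1, 2] -- bank 3's buffer finally absorbs the loss
```

Each bank in the chain is individually sound as long as its debtors pay; it is the network of exposures, not any single balance sheet, that turns one local default into a wave of failures.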

The above video reminds me of another one that I often use to illustrate cascade effects: it shows an experiment with many table tennis balls placed on top of mousetraps. The experiment demonstrates impressively how a single local perturbation can mess up the entire system. It illustrates chain reactions, which are the basis of atomic bombs and of nuclear fission reactors. As we know, such cascade effects are technologically controllable in principle, if we stay below the critical interaction strength (sometimes called the "critical mass"). Nevertheless, these processes can sometimes get out of control, mostly in unexpected ways. The nuclear disasters in Chernobyl and Fukushima are well-known examples of this. So, we must be extremely careful with systems showing cascade effects.

The financial crisis


As we know, the above-mentioned cascading failure of banks was just the beginning of an even bigger crisis. It subsequently caused an economic crisis and a public spending crisis in major areas of the world. Eventually, the events even threatened the stability of the Euro currency and the European Union. The crisis brought several countries (including Greece, Ireland, Portugal, Spain, Italy and the US) to the verge of bankruptcy. As a consequence, many countries have seen historic highs in unemployment. In some countries, more than 50 percent of young people do not have a job. In many regions, this has caused social unrest, political extremism, and increased crime and violence. Unfortunately, it seems that the cascade effect has not been stopped yet. There is a long way to go until we fully recover from the financial crisis and from the public and private debts accumulated over the past years. If we cannot overcome this problem soon, it even has the potential to endanger peace, democratic principles and cultural values, as I pointed out in a letter to George Soros in 2010. Looking at the situation in Ukraine, we are perhaps already seeing this scenario.

While all of this is plausible in hindsight, the lack of advance understanding offered by conventional wisdom becomes clear from the following quote of November 2010 by Jean-Claude Trichet, the former president of the European Central Bank:

"When the crisis came, the serious limitations of existing economic and financial models immediately became apparent. Arbitrage broke down in many market segments, as markets froze and market participants were gripped by panic. Macro models failed to predict the crisis and seemed incapable of explaining what was happening to the economy in a convincing manner. As a policy-maker during the crisis, I found the available models of limited help. In fact, I would go further: in the face of the crisis, we felt abandoned by conventional tools." Similarly, Ben Bernanke summarized in May 2010: “The brief market plunge was just an example of how complex and chaotic, in a formal sense, these systems have become… What happened in the stock market is just a little example of how things can cascade, or how technology can interact with market panic.”

Leading scientists, too, had problems making sense of the crisis. In a letter to the Queen of England dated 22 July 2009, the British Academy came to the conclusion:

"When Your Majesty visited the London School of Economics last November, you quite rightly asked: why had nobody noticed that the credit crunch was on its way? ... So where was the problem? Everyone seemed to be doing their own job properly on its own merit. And according to standard measures of success, they were often doing it well. The failure was to see how collectively this added up to a series of interconnected imbalances over which no single authority had jurisdiction. ... Individual risks may rightly have been viewed as small, but the risk to the system as a whole was vast. ... So in summary ... the failure to foresee the timing, extent and severity of the crisis … was principally the failure of the collective imagination of many bright people to understand the risks to the systems as a whole."

So, nobody was responsible for the financial mess? I don't want to judge, but we should remember that it is often not possible to point the finger at the exact person who caused a phantom traffic jam. Given that these are collectively produced outcomes, do we have to accept collective responsibility for them? And how should we apportion everyone's share of responsibility? This is certainly an important question worth thinking about.

It is also interesting to ask whether complexity science could have forecast the financial crisis. In fact, before the crash I followed the stock markets pretty closely, as I had noticed strong price fluctuations, which I interpreted as "critical fluctuations," i.e. an advance warning signal of an impending financial crash. I therefore sold my stocks in the business lounge of an airport in 2007, while waiting for the departure of my airplane. In spring 2008, about half a year before the collapse of Lehman Brothers, I wrote an article together with Markus Christen and James Breiding, taking a complexity-science perspective on the financial system. We came to the conclusion that the financial system was in a process of destabilization. Pretty much as Andrew Haldane, Chief Economist and Executive Director at the Bank of England, formulated it later, we believed that the increased level of complexity in the financial system was a major problem: it made the financial system more vulnerable to cascade effects than most experts thought. In spring 2008, we were so worried about this that we felt we had to alert the public, but none of the newspapers we contacted was ready to publish our essay at that time. "Too complicated for our readers" was the response, to which we replied: "if you cannot make this understandable to your readers, then there is nothing that can prevent the financial crisis." And so the financial crisis came! Six months after the crisis, a manager of McKinsey in the United Kingdom commented that our analysis was the best he had ever seen.

But there were much more prominent people who saw the financial crisis coming. For example, legendary investor Warren Buffett warned of mega-catastrophic risks created by large-scale investments in financial derivatives. Back in 2002, he wrote:

"Many people argue that derivatives reduce systemic problems, in that participants who can't bear certain risks are able to transfer them to stronger hands. These people believe that derivatives act to stabilize the economy, facilitate trade, and eliminate bumps for individual participants. On a micro level, what they say is often true. I believe, however, that the macro picture is dangerous and getting more so. ... The derivatives genie is now well out of the bottle, and these instruments will almost certainly multiply in variety and number until some event makes their toxicity clear. Central banks and governments have so far found no effective way to control, or even monitor, the risks posed by these contracts. In my view, derivatives are financial weapons of mass destruction, carrying dangers that, while now latent, are potentially lethal."
As we know, it still took five years until the "investment time bomb" exploded, causing losses of trillions of dollars to our economy.

Fundamental uncertainty


In liquid financial markets and many other hardly predictable systems, such as the weather, we can still determine the probability of events, at least approximately. Thus, we can make a probabilistic forecast of the kind: "there is a 5 percent chance of losing more than half of my money if I sell my stocks in 6 months, but a 70 percent chance that I will make a good profit." It is then possible to determine the expected loss (or gain) implied by the likely actions and events: the damage or gain of each possible event is multiplied by its probability, and the numbers are added up to give the expected damage or gain. In principle, one could do this for all actions we might take, in order to determine the one that minimizes the damage or maximizes the gain. The only problem involved in this exercise seems to be the practical determination of the probabilities and of the likely damages or gains involved. With the increasing availability of data, this problem might, in fact, be tackled, but it will remain difficult or impossible to determine the probabilities of "extreme events," as the empirical basis for rare events is too small.
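The calculation described here is just a probability-weighted sum. A minimal sketch, with invented numbers loosely matching the stock-selling example:

```python
# hypothetical outcome table for "selling my stocks in 6 months":
# (fractional gain or loss, probability)
outcomes = [
    (-0.50, 0.05),   # 5% chance: lose half
    (-0.10, 0.25),   # 25% chance: small loss
    (+0.20, 0.70),   # 70% chance: good profit
]
expected_return = sum(value * prob for value, prob in outcomes)
print(round(expected_return, 4))  # 0.09 -- a 9% expected gain
```

The same weighted sum, computed for every available action, is what lets one pick the action with the smallest expected damage – provided the probabilities and damages can actually be estimated.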

It turns out, however, that there are problems where the expected damage in large (global) systems cannot be determined at all, for fundamental reasons. Such "fundamental" or "radical" uncertainty can occur in the case of cascade effects, where one failure is likely to trigger further failures, and where the damage of subsequent events, multiplied by their likelihood, keeps increasing. In such cases, the sum of expected losses may in principle be unbounded, i.e. it may no longer be possible to compute a meaningful expected loss. In practice, this means that the actual damage can be small, big, or practically unbounded – and the latter might lead to the collapse of the entire system.
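The difference between ordinary and fundamental uncertainty can be demonstrated with heavy-tailed damage distributions. The sketch below draws damages from Pareto distributions (the numbers are illustrative): for a tail exponent above 1 the mean exists and sample averages settle down, while for an exponent below 1 the theoretical mean is infinite, and sample averages are dominated by the single largest event, however much data one collects.

```python
import random

def pareto_damage(alpha, rng):
    """One damage sample from a Pareto tail P(X > x) = x**(-alpha), x >= 1,
    drawn via inverse-transform sampling."""
    return rng.random() ** (-1.0 / alpha)

def mean_damage(alpha, n, seed):
    rng = random.Random(seed)
    return sum(pareto_damage(alpha, rng) for _ in range(n)) / n

# light tail (alpha = 2.5): the mean exists; sample means settle near
# alpha / (alpha - 1) = 5/3, whatever the seed
light = [mean_damage(2.5, 10_000, seed) for seed in range(5)]

# heavy tail (alpha = 0.8): the theoretical mean is infinite; sample means
# are erratic and keep growing as more data comes in
heavy = [mean_damage(0.8, 10_000, seed) for seed in range(5)]

print([round(m, 2) for m in light])
print([round(m, 1) for m in heavy])
```

With the light tail, 10,000 observations pin the expected damage down to two decimals; with the heavy tail, the very notion of an expected damage breaks down – exactly the regime where cost-benefit calculations based on expected losses stop making sense.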

Explosive pandemic outbreaks


The threat from cascade effects may be even worse if the damage occurring in an early phase of the cascade reduces the system's ability to resist failures that are triggered later. A health system in which financial or medical resources are limited may be considered an example of this. How will such a system deal with emergent diseases? A computer-based study that I performed together with Lucas Böttcher, Nuno Araujo, Olivia Woolley Meza and Hans Herrmann shows that the outcome depends very much on the connectivity between people who may infect each other. A few additional airline connections can make the difference between a disease being contained and its turning into a devastating global pandemic. The problem is that crossing a certain connectivity threshold changes the system dynamics dramatically and unexpectedly. So, have we built global networks that behave in unpredictable and uncontrollable ways?
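The connectivity threshold can be seen in a bare-bones epidemic simulation – a generic SIR-style sketch on a random contact network, not the specific model of the study mentioned above. The effective reproduction number is roughly the average number of contacts times the transmission probability; crossing 1 switches the outcome from tiny local outbreaks to a pandemic reaching most of the population.

```python
import random

def outbreak_fraction(n, avg_degree, p_transmit, seed):
    """SIR-type outbreak on a random (Erdos-Renyi) contact network: every
    infected node gets one chance to infect each susceptible neighbour,
    with probability p_transmit. Returns the fraction eventually infected."""
    rng = random.Random(seed)
    p_edge = avg_degree / (n - 1)
    adjacency = [[] for _ in range(n)]
    for i in range(n):                    # build the random contact network
        for j in range(i + 1, n):
            if rng.random() < p_edge:
                adjacency[i].append(j)
                adjacency[j].append(i)
    infected = {0}                        # patient zero
    frontier = [0]
    while frontier:
        next_wave = []
        for u in frontier:
            for v in adjacency[u]:
                if v not in infected and rng.random() < p_transmit:
                    infected.add(v)
                    next_wave.append(v)
        frontier = next_wave
    return len(infected) / n

# reproduction number ~ avg_degree * p_transmit; the critical value is ~1
low  = sum(outbreak_fraction(500, 3.0, 0.2, s) for s in range(10)) / 10  # ~0.6
high = sum(outbreak_fraction(500, 8.0, 0.4, s) for s in range(10)) / 10  # ~3.2
print(low, high)
```

Below the threshold, outbreaks fizzle out after a handful of cases; above it, most of the network is infected. A few extra links can push the system across that line, which is why a handful of additional airline connections can change the outcome so dramatically.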

Systemic interdependencies


Recently, Shlomo Havlin and others made a further important discovery: they revealed that networks of networks can be particularly vulnerable to failures. A typical example is the interdependency between electricity and communication networks. Another example, which illustrates the global interdependencies between natural, energy, climate, financial, and political systems, is the following: in 2011, the Tohoku earthquake in Japan caused a tsunami that triggered chain reactions and nuclear disasters in several reactors at Fukushima. Soon afterwards, Germany and Switzerland decided to exit nuclear power generation over the next decade(s). However, alternative energy scenarios turn out to be problematic as well. European gas deliveries depend on a few regions that we cannot fully rely on. Likewise, Europe’s DESERTEC project – a planned 1000 billion Euro investment in infrastructure to supply solar energy for Europe – has an uncertain future due to another unexpected event, the Arab Spring. The Arab Spring was triggered by high food prices, which many people could no longer afford. These high food prices, in turn, resulted partly from biofuel production, which was intended to improve the global CO2 balance but competed with food production; they were further amplified by financial speculation. Hence, the energy system, the political system, the social system, the food system, the financial system – they have all become closely interdependent, which makes our world ever more vulnerable to perturbations.

Have humans unintentionally created a "complexity time bomb"?


We have seen that, when systems are too strongly connected, they may get out of control sooner or later, despite advanced knowledge and technology and the best intentions to keep things under control. As we have created more and more links and interdependencies in the world, we must therefore ask ourselves: have humans inadvertently produced a "complexity time bomb" – a system that will ultimately get out of control?

For a long time, problems such as crowd disasters and financial crashes have been seen as puzzling, ‘God-given’ phenomena or "black swans" one had to live with. However, problems like these should not be considered "bad luck." They are often the consequence of a flawed understanding of counter-intuitive system behaviors. While conventional thinking can cause fateful decisions and the repetition of previous mistakes, complexity science allows us to understand the mechanisms that cause complex systems to get out of control. Amplification effects can arise and promote failure cascades when the interactions between system components become stronger than the frictional effects, or when the damaging impact of impaired system components on other components spreads faster than those components can recover to their normal state. That is, the time scales of processes largely determine the controllability of a system as well. Delayed adaptation processes are often responsible for systemic instabilities and losses of control (see the related Information Box at the end).
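The stability criterion in this paragraph boils down to a net amplification factor. In the one-line caricature below (purely illustrative), each round of interaction multiplies a disturbance while friction and recovery divide it; only the ratio of the two rates decides between recovery and runaway:

```python
def disturbance_after(rounds, amplification, damping, x0=0.01):
    """Linear caricature of the stability argument: each round, interactions
    multiply a small disturbance by `amplification`, while friction/recovery
    divides it by `damping`. Only the ratio of the two rates matters."""
    x = x0
    for _ in range(rounds):
        x = x * amplification / damping
    return x

controlled = disturbance_after(50, amplification=1.2, damping=1.4)  # net gain < 1
runaway    = disturbance_after(50, amplification=1.4, damping=1.2)  # net gain > 1
print(controlled, runaway)  # the first shrinks towards zero, the second explodes
```

Both runs start from the same tiny disturbance; swapping the two rates flips the outcome from harmless decay to exponential growth, which is the essence of the "interactions stronger than friction" condition.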

For certain kinds of networks, the similarity of such cascade effects to chain reactions in nuclear fission is quite disturbing. Such processes are difficult to control, and catastrophic damage is a realistic scenario. Given the similarity of the cascading mechanisms, is it therefore possible that our worldwide anthropogenic system will get out of control sooner or later? When analyzing this possibility, one must bear in mind that destructive cascade effects may unfold slowly, so that the process does not look like an explosion. Nevertheless, it may be hard to stop and may lead to an ultimate systemic failure. For example, the dynamics underlying crowd disasters is slow, but deadly. So, what kinds of global catastrophic scenarios might we face in complex societies? A collapse of the global information and communication systems or of the world economy? Global pandemics? Unsustainable growth, demographic or environmental change? A global food or energy crisis? A cultural clash? Another global-scale war? A societal shift driven by technological innovations? Or, more likely, a combination of several of these contagious phenomena? The World Economic Forum calls this the "perfect storm," and the OECD has formulated similar concerns.

Unintended wars and revolutions


Last but not least, it is important to realize that large-scale conflicts, revolutions, and wars can also be unintended results of systemic instabilities and interdependencies. Interpreting them as the deeds of historical figures personalizes these phenomena in a way that distracts from their true, systemic nature. It is important to recognize that complex systems such as our economy or our societies usually resist attempts to change them at large, namely when they are close to a stable equilibrium. This is also known as Goodhart's law (1975), the principle of Le Chatelier (1850–1936), or the "illusion of control." Individual factors and randomness can only have a large impact on the path taken by a complex system when the system is driven to a tipping point (also known as a "critical point"). In other words, instability is a precondition for individuals to have a historical impact. For example, the historical sciences increasingly recognize that World War I was largely an unintended, emergent outcome of a chain reaction of events. Moreover, World War II was preceded by a financial crisis and recession, which destabilized the German economic, social, and political system. This ultimately made it possible for an individual to become influential enough to drive the world to the edge.

Unfortunately, civilization is vulnerable, and a large-scale war may happen again – I would say it is even likely. A typical unintended path towards war looks as follows: The resource situation deteriorates, for example because of a serious economic crisis. The resulting fierce competition for limited resources causes violence, crime, and corruption to rise, while solidarity and tolerance decline, so that society fragments into groups. This causes conflict, further dissatisfaction, and social turmoil. People get frustrated with the system and call for leadership and order. Political extremism emerges, scapegoats are sought, and minorities are put under pressure. As a consequence, socio-economic diversity is lost, which further reduces the economic success of the system. Eventually, the well-balanced "socio-economic ecosystem" collapses, such that the resource situation (the apparent "carrying capacity") deteriorates further. This destabilizes the system even more, such that an external enemy is "needed" to re-stabilize the country. Finally, nationalism rises, and war may seem to be the only "solution" to keep the country together.

Note that a revolution, too, can be the result of systemic instability. Hence, it does not need to be initiated by an individual, "revolutionary" leader who challenges an established political system. The breakdown of the former German Democratic Republic (GDR) and some Arab Spring revolutions (for example, in Libya) have shown that revolutions may start even without a clearly identifiable political opponent leading them. On the one hand, this is the reason why such revolutions cannot be stopped by targeting a few individuals and sending them to jail. On the other hand, the absence of revolutionary leaders has puzzled secret services around the world – the Arab Spring took them by surprise. It was also irritating for sympathetic countries, which could not easily provide support for democratic civil movements. Whom should they have talked to or given money to?

A better picture is to imagine such revolutions as the result of situations in which the interests of government representatives and the people (or the interests of different societal groups) have drifted apart. Similar to the tensions created by the drift of the Earth's tectonic plates, this sooner or later leads to an unstable situation and an "earthquake-like" stress release (the "revolution"), resulting in a re-balancing of forces. Again, it is a systemic instability that eventually allows individuals or small groups to become influential, while the conventional picture suggests that the instability of a political regime is caused by a revolutionary leader. Put differently, a revolution is usually not the result of its new political leaders, but of the politics made before, which destabilized the system. So we should ask ourselves: how well are our societies doing at balancing the different interests within them, and at adapting to a world changing quickly through demographic, environmental, and technological change?

Conclusion


It is obvious that there are many problems ahead of us. Most of them result from the complexity of the systems humans have created. But how can we master all these problems? Is it a lost battle against complexity? Or do we have to pursue a new, entirely different strategy? Do we perhaps even need to change our way of thinking? And how can we generate the innovations needed, before it's too late? The next chapters will let you know...


Information Box: How harmless behavior can turn critical

In the traffic flow example and in the case of crowd disasters, we have seen that a system can get out of control when the interaction strength (e.g. the density) becomes too large.

How a change in density can render the harmless behavior of system components uncontrollable is illustrated by the following example: Together with Roman Mani, Lucas Böttcher, and Hans J. Herrmann, I studied collisions in a system of equally sized particles moving in one dimension, similar to Newton's cradle (see video). We assumed that the particles tended to oscillate elastically around equally spaced equilibrium points, while being exposed to random forces generated by the environment. If the distance between the equilibrium points of neighboring particles was large enough, each particle oscillated around its equilibrium point with normally distributed speeds, and all particles had the same small variance in speeds.

However, as the separation of equilibrium points approached the particle diameter, we found a cascade-like transmission of momentum between particles (see video). Surprisingly, the variance of speeds increased rapidly towards the boundary particles. In energy-conserving systems, the speed variance of the outer particles would even tend towards infinity with increasing system size. Due to cascading particle interactions, this makes their speeds unpredictable and uncontrollable, even though every particle follows simple and harmless dynamics.
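A much-simplified numerical sketch of such a particle chain can be written as follows. This is not the actual model of the study – the spring constant, damping, noise strength, and time step below are arbitrary choices – but it combines the same three ingredients: a restoring force toward each particle's equilibrium point, random environmental kicks, and velocity exchange when equal-mass neighbors collide elastically.

```python
import random

def speed_variances(n=10, spacing=1.05, diameter=1.0, steps=5000,
                    dt=0.01, k=1.0, gamma=0.1, noise=0.2, seed=0):
    """1D chain of equal particles, each bound to its own anchor by a
    damped spring and kicked by noise; touching, approaching neighbors
    swap velocities (elastic collision of equal masses in 1D)."""
    rng = random.Random(seed)
    anchors = [i * spacing for i in range(n)]
    x, v = anchors[:], [0.0] * n
    sums, squares = [0.0] * n, [0.0] * n
    for _ in range(steps):
        for i in range(n):
            # damped spring toward the anchor plus a random kick
            v[i] += (-k * (x[i] - anchors[i]) - gamma * v[i]) * dt \
                    + noise * rng.gauss(0, 1) * dt ** 0.5
            x[i] += v[i] * dt
        for i in range(n - 1):
            # contact between approaching equal masses: exchange velocities
            if x[i + 1] - x[i] < diameter and v[i] > v[i + 1]:
                v[i], v[i + 1] = v[i + 1], v[i]
        for i in range(n):
            sums[i] += v[i]
            squares[i] += v[i] ** 2
    # per-particle speed variance over the simulated trajectory
    return [squares[i] / steps - (sums[i] / steps) ** 2 for i in range(n)]
```

Running it once with a spacing close to the particle diameter and once with a much larger spacing shows how contacts let momentum cascade along the chain, while well-separated particles simply jitter independently around their anchors.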

Information Box: Loss of Synchronization

There is another puzzling kind of systemic instability that is highly relevant for our societies, as many socio-economic processes accelerate. It occurs when the separation of time scales gets lost. For example, hierarchical systems in physics and biology are characterized by the fact that adjustment processes on higher hierarchical levels are typically much slower than those on lower levels. Lower-level variables therefore adjust quickly to the constraints set by the higher-level ones, which is why the higher levels basically control the lower ones. For example, groups tend to take decisions more slowly than the individuals forming them, and the organizations and states made up of them change even more slowly (at least this has been so in the past).

Time scale separation implies that the system dynamics is determined by only a few variables, which are typically related to the higher hierarchical levels. Monarchies and oligarchies are good examples of this. In current socio-political and economic systems, however, we observe the trend that higher hierarchical levels show accelerating speeds of adjustment, such that the lower levels can no longer adjust more quickly than the higher ones. This may eventually destroy the time scale separation, such that many more variables start to influence the system dynamics. The result of such mutual adjustment attempts on different hierarchical levels could be turbulence, "chaos," or a breakdown of synchronization. In fact, systems often get out of control if adjustment processes are not quick enough and responses to changed conditions are delayed.
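The loss of time scale separation can be illustrated by a two-level toy model (a sketch with arbitrary adjustment rates, not a model from the literature): a "higher-level" and a "lower-level" variable each adjust toward the other's current state. As long as the higher level is slow, the lower level tracks it and the gap between the two closes; when both adjust aggressively at the same speed, their mutual corrections overshoot each other and the gap grows without bound.

```python
def final_gap(rate_high, rate_low, steps=60):
    """Two hierarchical levels chasing each other:
    high[t+1] = high[t] + rate_high * (low[t]  - high[t])
    low[t+1]  = low[t]  + rate_low  * (high[t] - low[t])
    Returns |high - low| after 'steps' rounds of mutual adjustment."""
    high, low = 1.0, -1.0
    for _ in range(steps):
        high, low = (high + rate_high * (low - high),
                     low + rate_low * (high - low))
    return abs(high - low)

print(final_gap(0.05, 1.2))  # separated time scales: the gap closes
print(final_gap(1.2, 1.2))   # no separation: the gap explodes
```

In this linear sketch the gap between the levels changes by the factor |1 - rate_high - rate_low| each step, so mutual adjustment remains stable only as long as the combined adjustment rate stays below 2.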



