Camarina was a city in southern Sicily, founded by colonists from Syracuse in 598 B.C. A generation or two later, it was threatened by a pestilence—festering, some said, in the adjacent marsh. (While the germ theory of disease was certainly not widely accepted in the ancient world, there were hints—for example, Marcus Varro in the first century B.C. advised explicitly against building cities near swamps “because there are bred certain minute creatures which cannot be seen by the eyes, which float in the air and enter the body through the mouth and nose and there cause serious disease.”) The danger to Camarina was great. Plans were drawn to drain the marsh. When the oracle was consulted, though, it forbade such a course of action, counseling patience instead. But lives were at stake, the oracle was ignored, and the marsh was drained. The pestilence was promptly halted. Too late, it was recognized that the marsh had protected the city from its enemies—among whom there had now to be counted their cousins the Syracusans. As in America 2,300 years later, the colonists had quarreled with the mother country. In 552 B.C., a Syracusan force crossed over the dry land where the marsh had been, slaughtered every man, woman, and child, and razed the city. The marsh of Camarina became proverbial for eliminating a danger in such a way as to usher in another, much worse.
Carl Sagan, Pale Blue Dot
One of the things that make complex systems complex is that the relationships between their various agents and elements are, well, complicated. The relationship between two given variables is not necessarily linear. Changing one thing can affect one or many things that might not, on the surface, have appeared connected. And sometimes, as in the example above, one change can have a completely unexpected result.
These ideas are covered by some of the points in our manifesto:
Complex systems are interconnected
Actions can have unexpected outcomes
and also by two of the cognitive attitudes we hope we can encourage and demonstrate through our performance:
the ability to balance outcomes against expectations.
Unexpected outcomes can also lead to cognitive dissonance, and a questioning of assumptions and biases.
There are several different ways to look at these unexpected outcomes in a system. The one above demonstrates problem solving that causes an unforeseen, bigger problem. This is significant for modeling because a) modeling can help shed light on the areas of a system that are more connected than generally understood, but also b) no model can ever map a system so faithfully, with such perfect accuracy, that a Marsh of Camarina scenario becomes impossible.
For us, in a performance, this kind of relationship is problematic. How could we possibly demonstrate such a scenario without the audience feeling tricked? If the audience cannot know about such a consequence, then punishing them with one is a clear betrayal of their investment.
Our bible, Resilience Practice, has examples of counterintuitive feedback effects, where an action that seems as though it could only make things worse, thanks to the complicated relationships and feedback loops within the system, actually solves the problem. In Papua New Guinea, an introduced species of aquatic fern was growing out of control and causing significant damage to native species in local ecosystems. In other countries the fern had been managed by introducing a small species of weevil that ate it, but in Papua New Guinea the weevil died out. Ecologist Peter Room discovered that the nitrogen levels in the fern were too low for the weevils, and that by fertilising the plants he could raise nitrogen to a point where the weevil population could grow and eat the fern. He also found a threshold in the weevil population beyond which their damage to the fern raised its nitrogen content enough for the weevils to continue breeding.
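As a thought experiment, the threshold dynamic Room found can be sketched as a toy simulation. Every number below (the growth rates, feeding rate, and nitrogen threshold) is invented purely for illustration rather than taken from ecological data, but the shape of the feedback survives: below the nitrogen threshold the weevil population collapses and the fern runs away, while a small push of fertiliser over the threshold lets the weevils breed and hold the fern back.

```python
# Toy model of the fern / weevil / nitrogen feedback loop described above.
# Every constant here is invented for illustration, not ecological data.

NITROGEN_THRESHOLD = 1.0   # level the weevils need in order to breed

def step(fern, weevils, nitrogen, fertilise):
    """Advance the toy system by one time step."""
    if fertilise:
        nitrogen += 0.3                # fertiliser raises plant nitrogen
    if nitrogen >= NITROGEN_THRESHOLD:
        weevils *= 1.5                 # enough nitrogen: population grows
    else:
        weevils *= 0.5                 # too little: population declines
    eaten = min(fern, weevils * 0.1)   # weevils eat some of the fern
    fern = fern - eaten + fern * 0.2   # fern regrows 20% per step
    # Weevil damage raises nitrogen levels; nitrogen also decays over time.
    nitrogen = max(nitrogen + eaten * 0.05 - 0.1, 0.0)
    return fern, weevils, nitrogen

def run(fertilise, steps=12):
    fern, weevils, nitrogen = 100.0, 10.0, 0.5
    for _ in range(steps):
        fern, weevils, nitrogen = step(fern, weevils, nitrogen, fertilise)
    return fern, weevils

fern_no, weevils_no = run(fertilise=False)
fern_yes, weevils_yes = run(fertilise=True)

# Counterintuitively, feeding the pest plant is what keeps it in check:
# the fertilised run ends with a thriving weevil population and less fern.
print(f"no fertiliser: fern {fern_no:.0f}, weevils {weevils_no:.4f}")
print(f"fertilised:    fern {fern_yes:.0f}, weevils {weevils_yes:.0f}")
```

The counterintuitive step lives in the `fertilise` flag: helping the problem plant is the only way to keep its predator alive.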
System interactions like this are difficult for us to include in our model, because without real-world specificity it is hard to sell such a particular relationship. Without an extensive knowledge of the system, how can we reasonably expect an audience to make such sophisticated and counterintuitive decisions?
Emergent properties are another kind of unexpected outcome observable in complex systems. Emergent properties are effects that the system brings about as a whole, and that cannot be produced by altering or managing one isolated component. It is difficult to understand or predict the formation and flying behaviour of a flock by looking at a single bird. Sometimes emergent properties are negative effects that arise from every component in a system running optimally – if every farm in an area produces a bumper crop, the market might devalue each farmer's crop, giving each a lower return. Emergent properties usually won't break a system or tip it into an undesirable state, so perhaps this is a better way for us to demonstrate unexpected outcomes in a performance without leaving the audience feeling betrayed. Hopefully, as we continue our systems description of the music festival, we can discover some interesting emergent properties.
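The bumper-crop example can be made concrete with a little arithmetic. The numbers here (ten farms and a linear demand curve) are invented purely for illustration: a lone farm with a bumper crop does better, but when every farm has one, the price falls far enough that everyone ends up worse off.

```python
# Toy illustration of an emergent market effect.
# The demand curve and the yields are invented numbers, not market data.

FARMS = 10

def price(total_supply):
    """A simple linear demand curve: more supply means a lower price."""
    return 20 - total_supply / 100

def revenue(own_yield, others_yield):
    """One farm's revenue, given its own yield and every other farm's."""
    total = own_yield + (FARMS - 1) * others_yield
    return price(total) * own_yield

ordinary  = revenue(100, 100)  # everyone has an ordinary year
lone_boom = revenue(150, 100)  # only this farm grows 50% more
all_boom  = revenue(150, 150)  # every farm grows 50% more

print(f"ordinary year:    {ordinary:.0f}")   # 1000
print(f"lone bumper crop: {lone_boom:.0f}")  # 1425
print(f"everyone bumper:  {all_boom:.0f}")   # 750
```

The devaluation is emergent in exactly the sense above: no single farm's choice produces it, and from any one farm's perspective growing more is always the better move. Only the whole system, with every component optimising at once, produces the lower return.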
How can we present a counterintuitive relationship in our system that doesn’t make the audience feel as though their investment has been betrayed, but rather encourages a state of cognitive dissonance that leads to a greater understanding of the nature of complex systems?