One of the most important things to grasp about a complex adaptive system is that, at a system level, there is no linear material cause; instead we have a dispositional state, a set of possibilities and plausibilities in which a future state cannot be predicted. Now I’ve always found this easy to grasp, but I know I’m in a minority. I think it links to religious and other traditions which accept that some things just are; there is no reason for them. The fundamental attribution error, a mainly Northern European/US issue, is the assumption that there is always a cause or a reason for things. I’d trace that back to Kant, Newton and the like, and the criticality of cause-and-effect relationships to the scientific revolution. The birth of sociology also falls in that period, and its quasi-determinism, which in a modern age we might deem Seldon Man, permeates popular thought as well as governance. If something has happened then there must be a reason for it, and we attribute blame or credit accordingly.
Our ability to assume cause is uniquely human: we are attributing qualities to the system that are not part of the system per se, but which arise from our need to make meaning in social constructs. This, like most things, is both good and bad. It is disconnected from reality in both cases, and in general the resultant inauthenticity is mostly bad. If we attribute cause, and intentional cause at that, then we get to a position where we start to believe that control is possible. Because we believe that, we set targets; achievement of targets is a survival capability, so they are achieved regardless of consequence and the whole cycle carries on. One of the things I learnt during my IBM time is to recognise the corporate vampire who moves from business to business every year or two, focuses on target achievement regardless of medium- and long-term consequence, and then moves on and repeats. Reality and consequence are left for whoever follows.
The one thing we can say for certain in a complex adaptive system is that whatever we do will have unintended consequences. The bigger the intervention, the higher the risk, hence the need to run smaller experiments in parallel. In general, many complex problems are solved obliquely, not directly. One reason for that is that working obliquely produces unintended consequences that we then observe (anomalies attract) and we sense a new possibility; we innovate. By working around a problem we find possibilities that could not have been discovered directly. In dispositional mapping we not only look at dominant views, we also isolate and identify outlier views: people who are seeing the world differently from the norm. In a complex system it is views on the edge that matter. We incline to the conventional; we need to pay attention to the unconventional, the outliers, the mavericks.
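As a purely illustrative aside, not the actual method behind any dispositional mapping tool, the idea of separating dominant views from outliers can be sketched in a few lines of code. The sketch below assumes a hypothetical two-dimensional map of responses and flags the points that sit unusually far from the dominant cluster; the field names and the simple distance threshold are my own assumptions for illustration only.

```python
# Illustrative sketch only: flag "outlier" views on a hypothetical 2-D
# dispositional map by their distance from the centroid of all responses.
from dataclasses import dataclass
from math import dist
from statistics import mean, stdev

@dataclass
class Response:
    respondent: str
    x: float  # assumed position on the map, e.g. a coordinate between anchors
    y: float

def find_outliers(responses: list[Response], threshold: float = 2.0) -> list[Response]:
    """Return responses unusually far from the dominant cluster of views."""
    centroid = (mean(r.x for r in responses), mean(r.y for r in responses))
    distances = [dist((r.x, r.y), centroid) for r in responses]
    avg_d, spread = mean(distances), stdev(distances)
    return [r for r, d in zip(responses, distances)
            if spread > 0 and (d - avg_d) / spread > threshold]

# Example: most views cluster near the middle of the map; one maverick does not.
views = [Response(f"p{i}", 0.5 + i * 0.01, 0.5 - i * 0.01) for i in range(20)]
views.append(Response("maverick", 0.95, 0.05))
for outlier in find_outliers(views):
    print(outlier.respondent)  # -> maverick
```

The point of the sketch is only that the dominant cluster and the edge cases are identified together, from the same data, rather than the outliers being discarded as noise.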
Dispositional Inclination
37x13x10″ (95x33x26cm), April 2006. Gaffer 45% lead crystal and silicon bronze; lost wax glass casting, formed and welded bronze. Unique work, on display at Brian Russell Studio.
Top picture – sunset seen through the trees on a Neolithic barrow, Avebury 2017