Recently many of my conversations have focused on accountability. Sometimes the context is management of a complex project in an unpredictable landscape. Sometimes it is demonstration of outcomes and impacts for international programs to shift massively complex patterns such as nutrition and livelihood. Sometimes, it is simply a prickly relationship between a supervisor and a technical professional. Sometimes it is a new resident in the White House and his retinue.
Unpredictability makes accountability a problematic concept in complex systems, especially ones that are recognized to be adaptive or self-organizing. If you don’t know what an outcome will be, how do you hold yourself or others accountable to produce a result?
At the HSD Institute, we think of three different kinds of accountability in complex systems.
One is the traditional kind of accountability. It is possible—even advisable—when a system is in a relatively stable state. Examples in my experience include safety regulations, assembly line production, auto repair, logistics, and construction. Even as I write these, I am aware that practitioners in each of these fields would challenge the predictability of their domains. Nevertheless, in such systems:
• Change is slow.
• Parts are tightly coupled to each other.
• Causality is linear or simply nonlinear.
• Boundaries are relatively impermeable and clear.
• Diversity is limited.
• Degrees of freedom are low.
• Part, whole, and greater whole are constrained.
In such situations, one can be relatively certain of outcomes—at least in the short term. It is possible to articulate roles, responsibilities, and objectives and to expect them to remain constant over a specified period of time. In such conditions, one can be held accountable for outcomes.
The second kind of accountability emerges in the process of active self-organizing. From my point of view, examples include functional mergers and acquisitions, service delivery, collaborations, cross-functional teams, and effective governments. In these situations:
• Change is unpredictable—sometimes moving quickly and sometimes moving slowly.
• Parts are loosely coupled to each other.
• Causality is complex and nonlinear.
• Boundaries are acknowledged, but they are permeable and sometimes fuzzy.
• Diversity is acknowledged in a small number of differences that make a difference.
• Degrees of freedom are variable across the system.
• Part, whole, and greater whole are mutually influential as patterns emerge and disappear across the entire system.
Accountability does not disappear in these systems, but traditional accountability isn’t possible. Instead of being held accountable to outcomes, individuals and groups can be held accountable for learning, shared meaning making, and directional movement. Are individuals and groups learning new things? Is shared meaning being constructed and/or maintained? Is the trend-line of processes and products moving toward a desirable goal?
The third kind of accountability is even more problematic. It arises when systems appear totally unorganized or random. Immediate response to disasters, transition times in economics and politics, crowds, new technologies, emerging markets, and many conflicts are examples of random dynamics in human systems. Again, as I list these, I know there are people who see order and predictability in each of these situations. Patterns, and the categories that describe them, are always relative. In such situations:
• Change is so fast that patterns cannot be discerned over time.
• Parts are uncoupled from each other.
• Causality appears to be absent.
• Boundaries are nonexistent or so numerous that they are meaningless.
• Diversity is unbounded.
• Degrees of freedom tend toward the infinite.
• Interaction of the parts is so random that a system-wide pattern of the whole cannot appear and/or be maintained over time.
Though it may be hard to imagine, accountability is important even in human systems tending toward the random. In these situations, people can be held accountable to explore and share. In the same way that an ant colony spreads out in random patterns, finds a juicy spot, then returns to share the news, people hunt and gather. In random systems, people must be held accountable to gather and disseminate information. This behavior over time increases the coherence of system-wide understanding and action.
So, as I work with clients to improve performance, outcomes, and impacts, I try to help them distinguish which kinds of accountability are possible in their environment, which kind fits their mission and vision, and which ones represent a reasonable investment in the continuing improvement of humans and their complex systems.
What can you learn when you apply this three-part definition of accountability to yourself and your colleagues?
Cognitive Edge Ltd. & Cognitive Edge Pte. trading as The Cynefin Company and The Cynefin Centre.
© COPYRIGHT 2023