Acumen Incited 1 of 2

January 28, 2024

I haven’t used a gapingvoid® cartoon in a blog post for a while, but this one could not be more appropriate to today’s theme, which deals with both conflict and the AI issue; hence my playing with the A and the I in the title. The other reason is the pattern-based QuickSense we will launch on the 1st of February, one version of which uses semiotics from Gapingvoid. Empathy is critical to distributed forms of conflict resolution, and it is also, for the moment, a distinguishing feature of human consciousness. I am using it here to mean seeing others in the same light as yourself and your immediate family. But its meaning in aesthetics is “the quality or power of projecting one’s personality into or mentally identifying oneself with an object of contemplation”, which is also relevant. Its use started as a translation of Einfühlung at the turn of the last century, and it has a related meaning concerning “the phenomenon whereby a humanoid robot or computer-generated figure bearing a close but imperfect resemblance to a human being arouses a sense of unease or revulsion in a person viewing it.” That comes from Mori’s Uncanny Valley, in which looking too human-like triggers revulsion; flawlessness disturbs humans in many ways. In a sense, we can’t be empathetic with perfection. For those unfamiliar with Mori, he plotted emotional responses to robots: empathy increased as the robots became more human, but as verisimilitude approached, that went into sudden reversal. We can draw on that in understanding the appropriate use of AI.

Like most things human, proximity matters. Empathy is not a remote, abstract concept but occurs in the moment, in both time and space, and probably involves more than just sight and sound. An increasing body of work is starting to show the profound effect of smell on human sense-making, something with implications for virtual meetings and much else. It also means that empathy is not a cognitive-only process. You might be able to write an algorithm that would give an appearance of sympathy, but that is not the same as empathy. Physical touch, which is seen as empathetic, has an impact on pain. Empathy intensifies with close interaction. Looking at some of the work we have been discussing on end-of-life decisions for children: as the moment approaches, the medical staff start to use increasingly technical language to distance themselves from the emotional impact, while the parents become more emotionally intense, and communication becomes more challenging. Empathy can be both a hindrance and a help if there is a disconnect between the parties.

There is also the question of what we want in a tool. I mentioned the Uncanny Valley, in which we like robots to have some human characteristics but not to imitate humans too closely. Humans like to make distinctions and draw lines between family, clan, tribe and even horde; heterogeneity is our natural mode of operation. Walter Freeman, with whom I had the great privilege to spend time before he died, famously demonstrated that there are chemicals in the brain that ensure we can never remember anything the same way twice; there needs to be a gradient that introduces anomalies into our sense-making. The inherent complexity of human decision-making, particularly its social aspect, is in the early days of scientific discovery, and we need to be very cautious in our assumptions. Requisite ambiguity again; the idea of gradients in perception seems to be a physically evolved characteristic, not something that can be reduced (and I use that word advisedly) to a set of algorithms. That said, I recognise that some people see reducing things to algorithms as a universally good thing, but I feel that their condition is dehumanising. Regrettably, many of them have lived the bulk of their lives, including a significant period around puberty, in a world primarily determined by algorithms, so it may be more than sympathy they need.

I’ve previously covered the question of small-group proximity using empathy to achieve peace and reconciliation, as well as simple conflict resolution. That was in the second of the recent Patterns of Change & Conflict series, which also referred to a more specific post. I’ve also hinted at using a variant of that approach to allow for distributed allocation of resources and decision-making in all organisations. More on that in future posts, and I’ll be up in Sheffield tomorrow working on possible partnerships. Delegating decision-making and resource allocation is not the same as distributing them; the former reduces risk and stress. I feel that we are just shifting to a position in society where we are starting to understand some of the revolutionary possibilities that complexity science offers to do more, with significantly fewer resources, using network intelligence. That includes my earlier post on non-directed foraging patterns.

But I want to move on to Acumen and Inciting change. The Oxford English Dictionary defines acumen as sharpness of wit, quickness or penetration of perception, keenness of discrimination, and (now esp.) the ability to make good judgements and decisions. To some extent, it is linked to the Scottish dialect word canny, defined as cautious and careful in worldly or business matters; worldly-wise, shrewd. The ideas of wit, keenness, and shrewdness all describe complex human properties relating to how judgement is exercised. They are, to a degree, ephemeral: we know when someone has them, but we don’t have a prescribed path to their acquisition and use. To reference an old debate: while nature deals the cards, nurture plays them, and those processes are, in the main, non-replicable in the sense of a recipe.

When we have a quality that we recognise but whose nature we can’t fully describe, and to whose acquisition we can’t prescribe a pathway, then we have a quality which can’t be confined to or by an algorithm or, for that matter, a training data set. This also gives us a pathway to working out where we need to be cautious in automation. ASHEN helps here, in that habits, experience and natural talent in various combinations all provide for the emergence of those qualities. To a degree, the qualities are present in all human beings, but in some more than others. So, how do we incite people to use and develop those innate skills in the face of growing automation? I’ve chosen incite deliberately, as it means to urge or spur on, to stir up, animate, instigate, or stimulate. Const. to do something; to or unto some action. (OED again). It cannot be done to people, although it can be catalysed; they must do it themselves. I’ll develop this in the second post.


The opening illustration is from the inimitable Gaping Void, and the banner picture is cropped from an original by Luna Wang on Unsplash.  Definitions and related phrases have been taken from the Oxford English Dictionary.
