“What did you just call me?!”

Speaker: “Erm, sorry, but I don’t think I ‘called you’ anything. I was just pointing out that, in this particular case, I believe that you are ignorant of what is actually happening….”

Receiver: “How VERY dare you!!!”

Speaker: “No, no, there’s nothing wrong with this – it’s not an accusation…”

When a rather useful word goes bad

If I look up the meaning of the word ‘ignorant’ in, say, the Oxford dictionary, I get a couple of meanings:

1. “Lacking knowledge, information, or awareness about a particular thing”; and

2. “Discourteous or rude”

The example sentence given is “he was told constantly that he was ignorant and stupid”.

Unfortunately, this example sentence ensures that definitions 1 and 2 are tangled together, and this ‘insult’ meaning has become the normal usage of the word – just as implied by the receiver in the introductory conversation.

…but I think the purely factual definition in meaning 1. is REALLY important and shouldn’t be taken negatively.

Pointing out the facts:

We are ALL ignorant, and whilst the nature of our ignorance will change, we will always be so.

This is where the following well-known quote2 fits in:

“The more you know, the more you realise how much you don’t know.”3

This is a good thing, because if we accept this, then it gives us an incredibly valuable platform to embark on a never-ending but ever-interesting journey of discovery and learning.

Trying to reclaim a word:

So, how about embracing the word ‘ignorant’?

I want to know if something I say or do shows that I am ignorant in respect of something important. In fact, I’d hate you to know this and NOT let me in on it!

But of course, in the same spirit, hopefully you might be uncertain as to whether it’s the other way around i.e. that I might know something that you don’t…

…and we have the perfect environment for a collaborative, non-judgemental conversation about our current worldviews.

Who knows what we might learn – we’ll probably find out that we are both ignorant 🙂 ….but we’ll both be the better for it.

(hopefully obvious) Clarification: I’m NOT suggesting that you rush out and start telling people that they are ignorant! Rather, I’m asking you to rethink the word, and what good it could do us all.

To close: You are very welcome to point out my ignorance in the comments section of any post that I publish…and I will (try to) read and consider in the manner that I describe above.

Footnotes

1. This short post comes from my weekly coffee conversation with my good mate Paul. We always talk over stuff and find out new ways of thinking about things.

2. Quote source: attributed to just about anyone and everyone over time!! (From Aristotle through to Einstein)

3. There is an addition to this quote: “The less you know, the more you think you know” – and this takes us directly to the Dunning-Kruger effect.

I often find myself smiling whenever I think about the Dunning-Kruger graph. Here’s how the conversation goes in my head:

“Mmm, I lack confidence as to whether I know….so my doubt must put me towards the ‘expert’ right-hand side of the graph…

…but me thinking this (i.e. being confident) then throws me to the ‘novice’ left-hand side of the graph…

…but then this doubt about whether I actually know anything puts me back over on the….

…oh, never mind where the hell I sit on that bloody graph! Just accept your ignorance, and enjoy continually learning.” 🙂

 


Lights, camera…and ACTION!

My last post explained the thinking behind the softening of systems thinking – to include the reality of human beings into the mix.

I ended by noting that this naturally leads on to the hugely important question of how interventions into social systems (i.e. attempts at improving them) should be approached.

What’s the difference between…?

The word ‘Science’ is a big one! It breaks down into several major branches, which are often set out as the:

  • Natural sciences – the study of natural phenomena;
  • Formal sciences – the study of Mathematics and Logic; and
  • Social sciences – the study of human behaviour, and social patterns.

Natural science can be further broken down into the familiar fields of the Physical sciences (Physics, Chemistry, Earth Science and Astronomy) and the Life sciences (a.k.a Biology).

The aim of scientists working in the natural science domain is to uncover and explain the rules that govern the Universe, and this is done by applying the scientific method (using experimentation1) to their research.

The key to any and every advancement in the Natural sciences is that an experiment that has supposedly added to our ‘body of knowledge’ (i.e. found out something new) must be:

  • Repeatable – you could do it again (and again and again) and get the same result; and
  • Reproducible – someone else could carry out your method and arrive at the same findings.

This explains why all ‘good science’ must have been subjected to peer review – i.e. robust review by several independent and objective experts in the field in question.

“Erm, okay…thanks for the ‘lecture’…but so what?!”

Well, Social science is different. It involves humans and, as such, is complex.

The Natural science approach to learning (e.g. to set up a hypothesis and then test it experimentally) doesn’t transfer well to the immensely rich and varied reality of humanity.

“In [social science] research you accept the great difficulty of ‘scientific’ experimental work in human situations, since each human situation is not only unique, but changes through time and exhibits multiple conflicting worldviews.” (Checkland)

I’ll try to explain the significance of this distinction between natural and social scientific learning with some examples, and these will necessarily return to those repeatability and reproducibility tests:


Unique: I’ll start with Chemistry. If you were to line up two beakers of water and (carefully) drop a small piece of sodium into each then you would observe the same explosive reaction…and, even though you could predict what would happen if you did it a third time, you’d still like to do it again 🙂

I was looking for a ‘social’ comparison and, following a comedy coffee conversation with a fellow Dad, the following observation arose: If you are a parent of two or more children, then you’ll know that consistency along the lines of ‘sodium into water’ is a pipe dream. I’ve got two teenage sons (currently 17 and 15 years old) and whenever I think I’ve learned something from bringing up the first one, it usually (and rather quickly) turns out to be mostly the opposite for the second! They are certainly unique.

The same goes for group dynamics – two different groups of people will act and react in different ways…which you won’t be able to fully determine up-front – it will emerge.


Changing over time: Now, over to Physics. If you were to select a battery, light bulb and resistor combination and then connect them together with cables in a defined pattern (e.g. in series) then you could work out (using good old Ohm’s law) what will happen within the circuit that you’ve just created. Then, you could take it all apart and put it away, safe in the knowledge that it would work in the same predictable way when you got it all out the next time.
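To labour the point, that circuit calculation can be sketched in a few lines of code (the component values below are purely illustrative):

```python
# Ohm's law (V = I * R) for a simple series circuit: the same inputs
# always give the same answer, no matter how often we 'rebuild' it.

def series_current(voltage: float, resistances: list[float]) -> float:
    """Current through a series circuit: I = V / R_total."""
    total_resistance = sum(resistances)  # resistors in series simply add
    return voltage / total_resistance

# A 9 V battery with a 30-ohm bulb and a 15-ohm resistor in series:
current = series_current(9.0, [30.0, 15.0])
print(current)  # 0.2 A -- and it will still be 0.2 A tomorrow
```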

However, in our ‘social’ comparison, you can’t expect to do the same with people…because each time you (attempt to) do something to/with them, they change. They attain new interactions, experiences, knowledge and opinions. This means that it is far too simplistic to suggest that “we can always just undo it if we want to” when we are referring to social situations.
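If you’ll forgive a toy illustration (entirely made up, of course), the contrast might be sketched like this – the circuit is a memoryless function, whereas the ‘person’ accumulates state with every interaction:

```python
# A toy contrast (illustrative only): the circuit is a pure function,
# while the 'person' carries state -- every interaction changes them.

def circuit_response(stimulus: float) -> float:
    """Memoryless: same stimulus, same response, every time."""
    return stimulus * 2

class Person:
    """Stateful: each interaction leaves a residue that alters the next."""
    def __init__(self) -> None:
        self.experiences: list[str] = []

    def respond(self, stimulus: str) -> str:
        self.experiences.append(stimulus)
        # A repeat of an old 'initiative' no longer lands the same way...
        if len(self.experiences) > 2 and stimulus == self.experiences[0]:
            return "change fatigue: heard this one before"
        return f"engaging with '{stimulus}'"

assert circuit_response(3.0) == circuit_response(3.0)  # repeatable

p = Person()
print(p.respond("new initiative"))   # engaging with 'new initiative'
print(p.respond("restructure"))
print(p.respond("new initiative"))   # change fatigue: heard this one before
```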

Just about every sci-fi movie recognises this fact and comes up with some ingenious device to ‘wipe people’s minds’ such that they conveniently forget what just happened to them – their memories are rewound to a defined point earlier in time. The ‘Men in Black’ use a wand with a bright red light on the end (hence their protective sunglasses…or is that also fashion?).

In reality, because such devices don’t exist (that I’m aware of), people in most organisations suffer from (what I refer to as) ‘Change fatigue’ – they’ve become wary of (what they’ve come to think of as) the current corporate ‘silver bullet’, and act accordingly. This understandably frustrates ‘management’ who often don’t want to see/ understand the fatigue2 and respond with speeches along the lines of “Now, just wipe the past from your mind – pretend none of it happened – and this time around, I want you to act like it’s really worth throwing yourself into 110%!”.

Mmmm, if only they could!


Conflicting worldviews: Finally, a Biology example. Let’s suppose that you’ve done a couple of lung dissections – one’s pink and spongy, the other is a black oogy mess3. Everyone agrees which one belonged to the 40-a-day-for-life smoker.

However, for our ‘social’ comparison, get a bunch of people into a room and ask them for their opinions on other people and their actions, and you will get wildly differing points of view – just ask a split jury!

The social phenomena of the ‘facts’ are subject to multiple, and changing, interpretations.

After reading the above you might be thinking…

“….so how on earth can we learn when people are involved?”

This is where I bring in the foundational work of the psychologist Kurt Lewin (1890 – 1947).

Lewin realised the important difference between natural and social science and came up with a prototype for social research, which he labelled as ‘Action research’.

“The method that [Lewin] evolved was of involving his subjects as active, inquiring participants in the conduct of social experiments about themselves.” (Argyris & Schon)

His reasoning was that:

“People are more likely to accept and act on research findings if they helped to design the research and participate in the gathering and analysis of data.” (Lewin & Grabbe)

Yep, as a fellow human being, I’d wholeheartedly agree with that!

Who’s doing the research?

I hope that you can see that the (potentially grand) title of ‘social research’ doesn’t presume a group of people in white lab coats attached to a University or such like. Rather, applied social research can (and should) be happening every minute of every day within your organisation – it does at Toyota!

Argyris and Schon wrote about two (divergent) methods of attempting to intervene in an organisation. They labelled these as:

  • ‘Spectator – Manipulator’: a distant observer who keeps themselves at arm’s length from the worker, yet frequently disturbs the work with ‘experiments’ to manipulate the environment and observe the response;

and

  • ‘Agent – Experient’: an actor who locates themselves within the problematic situation (with the people), to appreciate and be guided by it, to facilitate change (in actions and thinking) by better understanding of the situation.

You can see that the first fits well with natural science whilst the second fits with social.

The ‘spectator – manipulator’ method also describes rather well the reality of commanding and controlling, through attempting to implement (supposed) ‘best practice’ on people, and then rolling out ever wider.

The nice thing about action research is that the researcher (the agent) and the practitioner (the people doing the work) participate together, meaning that:

“The divide between practitioner and researcher is thus closed down. The two roles become one. All involved are co-workers, co-researchers and co-authors…of the output.” (Flood)

Proper4 action research dissolves the barrier between researcher and participant.

And, as such, ‘Action research’ is now often relabelled as ‘Action learning’…. because that is exactly what the participants are doing.

A note on intervention

Any intervention into a social system causes change5. Further, the interventionist cannot be ‘separated from the system’ – they will change too!

Argyris and Schon wrote that:

“An inquiry into an actor’s reasons for acting in a certain way is itself an intervention…[which] can and do have powerful effects on the ways in which both inquirer and informant construe the meaning of their interaction, interpret each other’s messages, act towards each other, and perceive each other’s actions. These effects can complicate and often subvert the inquirer’s quest for valid information.

Organisational inquiry is almost inevitably a political process…the attempt to uncover the causes of a systems failure is inevitably a perceived test of loyalty to one’s subgroup and an opportunity to allocate blame or credit…

[We thus focus on] the problem of creating conditions for collaborative inquiry in which people in organisations function as co-researchers rather than as merely subjects.”

You might think that taking a ‘spectator – manipulator’ approach (i.e. remaining distant) removes the problem of unintended consequences from intervening…but the opposite is true. The more remote you keep yourself, the more concerned the workers will likely be about your motives and intentions….and the less open and expansive their assistance is likely to be.

So, as Argyris and Schon wrote, the best thing for meaningful learning to occur would be to create an appropriate environment – and that would mean gaining people’s trust….and we are back at action learning.

The stages of action research within an organisation

Action research might be described as having three stages6, which are repeated indefinitely. These are:

  1. Discovery;
  2. Measurable action; and
  3. Reflection

Discovery means to study your system, to find out what is really happening, and to drive down to root cause – from events, through patterns of behaviour, to the actual structure of the system (i.e. what fundamentally makes it operate as it does) …and at this point you are likely to be dealing with people’s beliefs.

…and to be crystal clear: the people doing the discovery are not some central corporate function or consultants ‘coming in’ – it’s the people (and perhaps a skilled facilitator) who are working in the system.


Measurable action means to use what you have discovered and, together, take some deliberate experimental action that you (through consensus) believe will move you towards your purpose.

But we aren’t talking about conventional measurement. We’re referring to ‘the right measures, measured right’!


Reflection means to consider what happened, looking from all points of view, and to draw out the learning within…. leading on to the next loop – starting again with discovery.

This requires an environment that ensures that open and honest reflection will occur. That’s an easy sentence to write, but a much harder thing to achieve – it requires the dismantling of many conventional management instruments. I’m not going to list them – you would need to find them for yourselves…which will only happen once you start your discovery journey.

What it isn’t

Action research isn’t ‘a project’; something to be implemented; best practice; something to be ‘standardised’…(carry on with a list of conventional thinking).

If you want individuals, and the organisation itself, to meaningfully learn then ‘commanding and controlling’ won’t deliver what you desire.

…and finally: A big caveat

Peter Checkland adopted action research as the method within his Soft Systems Methodology (SSM) and yet he was highly critical of “the now extensive and rapidly growing literature” on the approach, calling it “poverty stricken”. Here’s why:

“The great issue with action research is obvious: what is its truth criterion? It cannot be the repeatability of natural science, for no human (social) situation ever exactly duplicates another such situation.” (Checkland)

The risk of simply saying “we’re doing action research” is that any account of what you achieved becomes nothing more than plausible storytelling. Whilst social research can never be as solid as the repeatable and reproducible natural science equivalent, there must be an ‘is it reasonable?’ test on the outcomes for it to be meaningful.

There’s already plenty of ‘narrative fallacy’ storytelling done within organisations – where virtually every outcome is explained away in a “didn’t we do well” style.

To be able to judge outcomes from action research, Checkland argues that an advance declaration is required of “what constitutes knowledge about the situation. This helps to draw the distinction between research and novel writing.”

This makes the action research recoverable by anyone interested in subjecting the work to critical scrutiny.

So what does that mean? Well, taking John Seddon’s Vanguard Method7 as an example, the Check stage specifically starts up front with:

  1. Defining the purpose of the system (from the customer’s perspective – ‘outside in’);
  2. Understanding the demands being placed on the system (and so appreciating value from a wide variety of customer points of view); and
  3. Setting out a set of capability measures that would objectively determine whether any subsequent interventions have moved the system towards its purpose.

…and (in meeting Checkland’s point) this is done BEFORE anyone runs off to map any processes etc.

In summary:

We need to appreciate “the role of surprise as a stimulus to new ways of thinking and acting.” (Argyris & Schon).

People should be discovering, doing and seeing for themselves, which will create a learning system.

Footnotes:

1. Experiments: If you’d like a clearer understanding of experiments, and some comment on their validity then I wrote about this in a very early post called Shonky Experiments

2. Not wanting to see the change fatigue: This would happen if a manager is feverishly working towards a ‘SMART’ KPI, where this would be exacerbated if there is a bonus attached.

3. Lung dissection: I was searching around for an image of a healthy and then a smoker’s lungs…but thought that not everyone would like to see it for real…so I’ve put up an image with a collection of surgical scalpels – you can imagine for yourself 🙂

4. Proper: see the ‘big caveat’ at the end of the post.

5. Such changes may or may not be intended, and may be considered as positive, negative or benign.

6. Three stages of action research: If I look at the likes of Toyota’s Improvement/Coaching Katas or John Seddon’s Vanguard Method then these three stages can be seen as existing within.

7. The Vanguard Method is based on the foundation of action learning.

Hard, Soft…or Laminated?

This post is about something that I find very interesting – Systems Thinking as applied to organisations, and society – and about whether there are two different ‘factions’….or not.

I’ve had versions of this post in mind for some time, but have finally ‘put it on paper’3.

In the beginning there was…Biology

Well, not the beginning4. I’m referring to the beginning of modern systems thinking.

Back in the 1920s the Biologist Ludwig von Bertalanffy challenged the ability of 19th Century Physics to explain living things – in particular the dynamics of organisms.

Reductionist Physics back then treated things as ‘closed systems’: reducing them into their parts and, through studying the forces acting on them, establishing principles of their behaviours. Such an approach works well for mechanistic systems.

However, von Bertalanffy’s research showed that:

“A whole organism demonstrably behaves in a way that is more than the sum of its parts. It exhibits synergy. Furthermore, much of an organism’s existence is characterised by increasing, or at least maintaining order.” [Flood5]

He went on to develop ‘Open Systems theory’, which considers an organism’s co-existence with its environment.

The interesting bit (to me at least) is that, rather than just maintaining a steady state (homeostasis) or, worse, declining into disorder (entropy), an organism can continually improve itself (self-organisation).

Whether it will or not, well there’s the thing!

Von Bertalanffy, wanting to realign the sciences through his new understanding, went on to develop ‘General Systems theory’ (1940s) – the derivation of principles applicable to systems in general.

…and so the modern systems movement was born.

Onwards and upwards (a.k.a ‘Hard’ systems thinking)

The study of systems really got moving from the 1940s onwards, with many offshoot disciplines.

Some notable developments include:


  • World War II and Operational Research6 (analytical methods of problem solving and decision making): A team of scientists were brought together to advise the British army. They used mathematical techniques to research strategic and tactical problems associated with military operations. Their work aimed to get the most out of limited resources (the most efficient usage, for greatest effect).

Following the war, much effort was put into translating and developing the OR methods and learnings into (usually large) organisations, and their management.


  • Stafford Beer and Organisational Cybernetics (the scientific study of control and communication within organisations): Beer analysed how the human body is controlled by the brain and nervous system, and then translated this to model how any autonomous system (such as an organisation….or a country) should be organised in such a way as to meet the demands of surviving in the changing environment (ref. Beer’s ‘Viable System Model’)


  • Jay Forrester7 and System Dynamics (understanding the behaviours of complex systems over time): Forrester and his MIT department set about modelling (using computers) how systems behave over time, employing the science of feedback, and thus seeing (often counter-intuitive) patterns within the complexity. The aim being to discern effective levers for change.

Their work grew from ‘industrial dynamics’ (e.g. the study of an organisation over time), to ‘urban dynamics’ (e.g. a society over time) to ‘world dynamics’.

Donella Meadows (a member of Forrester’s team) took up world dynamics, and research regarding the limits of Earth’s capacity to support human economic expansion.

Peter Senge (another MIT team member) wrote the popular management book ‘The Fifth Discipline’, which sets out the disciplines necessary for a ‘learning organisation’8. He identifies systems thinking as the “cornerstone”, though his explanations are heavily based on his System Dynamics heritage.

Those involved with System Dynamics articulated a set of (thought provoking) system archetypes – which are commonly occurring patterns of system behaviour, due to specific combinations of feedback loops (reinforcing and balancing) and delays. For example, you might have heard of ‘The tragedy of the commons’ (see system model diagram above) or ‘Success to the successful’.
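The archetype can even be sketched as a tiny simulation (with entirely made-up parameters): each actor’s reinforcing loop (add more cattle, gain more individually) eventually overwhelms the commons’ balancing regeneration loop:

```python
# A minimal 'Tragedy of the commons' sketch (illustrative numbers only):
# a shared pasture (the stock) is drained by an exponentially reinforcing
# grazing loop, while its balancing regrowth loop weakens as it shrinks.

def simulate_commons(steps: int = 30, herders: int = 4) -> list[float]:
    pasture = 100.0              # the shared resource (stock)
    cattle_per_herder = 1.0
    history = []
    for _ in range(steps):
        grazing = herders * cattle_per_herder * 2.0  # total consumption
        regrowth = 0.05 * pasture                    # balancing loop
        pasture = max(0.0, pasture + regrowth - grazing)
        cattle_per_herder *= 1.1  # reinforcing loop: everyone adds more
        history.append(pasture)
    return history

trajectory = simulate_commons()
# The stock collapses: early abundance, then accelerating decline to zero.
print(round(trajectory[0], 1), round(trajectory[-1], 1))
```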


Note: it is my belief that there are (understandably) huge overlaps between each of the above disciplines.

All of the above is centred around being able to:

  • identify ‘a system’ i.e. the subject of analysis (as if it were a real thing);
  • create a well-defined problem statement;
  • take a scientific approach to problem solving; and thus
  • reach some (presumed) solution to the problem

This has been labelled as the school of hard systems thinking (explained later), where a system is something that, if we studied it together, we would all describe/ articulate in a similar way – as in a ‘thing’ that can be set out, agreed upon….and almost touched!

If we add to this that we can define, model and understand ‘it’ then, hey presto, we should be able to solve ‘it’…as if there is a solution. Excellent! Let’s get modelling and improving.

But there’s a lot more to it – ‘Soft’ systems thinking

So where did that ‘hard’ term come from and why?

It was coined by Peter Checkland in the 1970s to label the then-current approaches, and to propose an alternative ‘soft’ view. Here’s his explanation:

“[hard systems thinking believes that] the world contains interacting systems…[that] can be ‘engineered’ to achieve their objectives

…[however] none of these [hard systems thinking] approaches pays attention to the existence of conflicting worldviews, something which characterises all social interactions…

In order to incorporate the concept of worldviews…it [is] necessary to abandon the idea that the world is a set of systems.

In [soft systems thinking] the (social) world is taken to be very complex, problematical, mysterious, characterised by clashes of worldviews. It is continually being created and recreated by people thinking, talking and taking action. However, our coping with it…can itself be organised as a learning system.”

Now, I’m not saying that understanding everything that Checkland writes is easy – it isn’t (at least not for me) – but whatever you think of his ‘Soft Systems Methodology’ and the various models within, I believe that the fundamentals are substantial…such as his human-centric thinking on:

  • Problematic situations; and
  • Worldviews

I’ve previously touched on the first point in my post titled “what I think is…”, which perhaps can be lightly summarised as ‘problems are in the eye of the beholder’, so I’ll move on to worldviews, nicely explained by Checkland as follows:

“When we interact with real-world situations we make judgements about them: are they ‘good’ or ‘bad’, ‘acceptable’ or ‘unacceptable’, ‘permanent’ or ‘transient’?

Now, to make any judgement we have to appeal to some criteria or standards, these being the characteristics which define ‘good’ or ‘bad’ etc. for us. And where do such criteria come from? They will be formed partially by our genetic inheritance from our parents – the kind of person we are innately – and, most significantly, from our previous experiences of the world.

Over time these criteria and the interpretations they lead to will tend to firm up into a relatively stable outlook through which we then perceive the world. We develop ‘worldviews’, built-in tendencies to see the world in a particular way. It is different worldviews which make one person ‘liberal’, another ‘reactionary’. Such worldviews are relatively stable but can change over time…”

This ‘worldview’ concept is easily understood, and yet incredibly powerful. At its most extreme, it deals efficiently with the often-cited phrase that ‘one man’s terrorist is another man’s freedom fighter’.

I think that Checkland’s worldview explanation is profound (and yet, when thought about, bloody obvious). All worldviews (and hence perceived problems within) are personal, and a proper understanding of them (and why they are held) must be central to any meaningful approach of moving a social group (whether a family, an organisation or a society) to a better place.

It is just too simplistic for someone in a position of power9 to say ‘this is the system, this is the current problem, let’s get on and solve it.’

Checkland talks of getting people to think about their own thinking about the world.

“Many people do that naturally and many people never ever do that – they simply engage with the world in an unreflective way.

If you are going to [really change the world then] you have to become [conscious about] thinking about your own thinking. You have to be able to stop yourself in a situation and ask yourself ‘how am I thinking about this? How else could I be thinking about this?

This is a meta-level of thinking, which is not obvious in everyday life – we don’t normally do it in day-to-day chat.”

Over in America

Whilst Checkland and his colleagues in the UK were questioning 1960s systems thinking (and deriving his ‘Soft Systems Methodology’10), two of his contemporaries were doing similar over in the US.

C. West Churchman and Russell Ackoff were there at the very start of Operational Research (OR) in 1950s America, but by the 1970s they understood the essential missing piece and felt the need for radical change. Ackoff broke away from his OR faculty and initiated a new program called ‘Social Systems Sciences’, whilst Churchman wrote:

“The systems approach begins when first you see the world through the eyes of another. [It] goes on to discover that every world-view is terribly restricted. There are no experts in the systems approach.” 

A side note: Sadly, I expect that Churchman and Ackoff would be ‘turning in their graves’ if they could be made aware of the lack of thinking, particularly of worldviews, by Donald Trump and his band of (ahem) ‘patriotic’ followers. Patriotic seems to have become proudly re-defined by them as ‘closed minded’.

…but, hey, that’s just my worldview speaking 😊.

Laminating the two together

I’m not a champion of ‘soft’ over ‘hard’ or vice versa. Rather, I find real interest in their combined thinking…as in laminating the two together.

I personally like to think about systems in a hard and soft format.

  • ‘hard’ because a logical model to represent a ‘thing’ (as if I can touch it) is incredibly useful for me; yet
  • ‘soft’ because it requires me:
    • to accept that I merely have a perspective…with a need to surface my beliefs and assumptions, and;
    • to understand the relevant worldviews of those around me….and change myself accordingly.

Similarly, some 30 or so years after first deriving the ‘hard’ and ‘soft’ terminology, Peter Checkland ends his last book with the following:

“New approaches (now thought of as ‘soft’), underpinned by a different social theory, have emerged. They do not, however, suggest that the 1960s theory was ‘wrong’ and should be abandoned. Rather the ‘new’ theory sees the ‘old’ one as a special case, perfectly adequate in certain circumstances, but less general than the social theory behind the ‘soft’ outlook.”

Perhaps the modern terminology for Checkland’s ‘Worldviews’ wording is ‘Mental Models’ – our internal pictures of how the world works – and this has become a major area of focus.

The need to surface, test and improve our mental models has, pleasingly, become entwined with systems thinking.

To summarise

Meadows, a giant systems thinker, embraced the need to expose our mental models:

“Remember, always, that everything you know, and everything everyone knows, is only a model. Get your model out there where it can be viewed. Invite others to challenge your assumptions and add their own. Instead of becoming a champion for one possible explanation or hypothesis or model, collect as many as possible.”

Nice!

…and finally, where to from here?

Checkland’s incredibly important softening of systems thinking (i.e. to include the reality of human beings into the mix) leads on to the question of how meaningful interventions into social systems are to be approached…which (I’m hoping) will be the subject of my next post: on ‘Action Research’.

Footnotes

1. Laminated: “Bonding layers of materials together”.

2. Post Image: I was searching for an image that showed a human made up of two complementary materials and found this lovely plywood sculpture.

3. Trigger: I partially wrote this post after reading a ThinkPurpose post way back in Nov. ’16. That post was a light-hearted critique of Peter Checkland’s ‘Soft Systems Methodology’ (SSM) and, whilst I enjoyed reading it (as ever), I had many thoughts going on…which were far too verbose to put into a comments section.

4. In the beginning: My understanding is that, before Biology, there was Chemistry (necessary for life to start), and before that Physics (back to a big bang and, potentially, the multiverse)…and we (human beings) are ‘still working on’ what (if anything) came before that.

Personally, I’m a fan of the never-ending loop (ref. Louis Armstrong Guinness advert). Every time science finds something bigger (as it regularly seems to do)…there’s always another bigger. Every time science finds something smaller (e.g. at CERN using the Large Hadron collider)…there’s always another smaller – surely it must just all wrap back round 🙂 If there’s a name for this proposition/ delusion, let me know.

5. Book reference: Flood, Robert Louis (1999): ‘Rethinking the Fifth Discipline – Learning within the unknowable’. The first half of this book sets out the work and thinking of a number of the main 20th century systems thinking giants.

6. The origin of Operational Research is regularly attributed to Charles Babbage’s study of England’s mail system (the costs of transport and sorting), resulting in the Penny Post (1840).

7. Forrester wrote the original System Dynamics textbook (‘Principles of Systems’, 1968) setting out definitions and system modelling.

8. Senge’s five disciplines are: Personal Mastery, Mental Models, Shared Vision, Team Learning and….drum roll…Systems Thinking, though obviously you’d need to read the book to understand what is meant by each of these phrases.

Senge’s chapter on ‘Mental Models’ is based primarily on the work of Chris Argyris (whom I wrote about in ‘Double Trouble’).

9. Power: It is highly likely (and unsurprising) that a person’s worldview is heavily influenced by where they ‘sit’ within an organisation’s hierarchy. It’s always informative (and often amusing) to compare and contrast the organisational beliefs of a CEO with, say, a front line worker.

10. Misunderstanding SSM: I should note that, probably rather frustratingly for Checkland, people (including many an academic) seem to misinterpret (and/or perhaps misunderstand) what he was putting forward within SSM. He wrote a whole chapter at the end of his last book titled ‘Misunderstanding SSM’.

How good is that one number?

This post is a promised follow-up to the recent ‘Not Particularly Surprising’ post on Net Promoter Score.

I’ll break it into two parts:

  • Relevance; and
  • Reliability

Part 1 – Relevance

A number of earlier posts have explained the importance of keeping data in context:

Donald Wheeler, in his superb book ‘Understanding Variation’, nicely sets out Dr Walter Shewhart’s1 ‘Rule One for the Presentation of Data’:

“Data should always be presented in such a way that preserves the evidence in the data…”

Or, in Wheeler’s words “Data cannot be divorced from their context without the danger of distortion…[and if context is stripped out] are effectively rendered meaningless.”

And so to a key point: The Net Promoter Score (NPS) metric does a most excellent job of stripping out meaning from within. Here’s a reminder from my previous post that, when asking the ‘score us from 0 – 10’ question about “would you recommend us to a friend”:

  • A respondent scoring a 9 or 10 is labelled as a ‘Promoter’;
  • A scorer of 0 to 6 is labelled as a ‘Detractor’; and
  • A 7 or 8 is labelled as being ‘Passive’.

….so this means that:

  • A catastrophic response of 0 gets the same recognition as a casual 6. Wow, I bet two such polar-opposite ‘Detractors’ have got very different stories of what happened to them!

and yet

  • a concrete boundary is placed between responses of 6 and 7 (and between 8 and 9). Such an ‘on the boundary’ respondent may have vaguely pondered which box to tick and metaphorically (or even literally) ‘tossed a coin’ to decide.

Now, you might say “yeah, but Reichheld’s broad-brush NPS metric will do” so I’ve mocked up three (deliberately) extreme comparison cases to illustrate the stripping out of meaning:

First, imagine that I’ve surveyed 100 subjects with my NPS question and that 50 ‘helpful’ people have provided responses. Further, instead of providing management with just a number, I’m furnishing them with a bar chart of the results.

Comparison pair 1: ‘Terrifying vs. Tardy’

Below are two quite different potential ‘NPS question’ response charts. I would describe the first set of results as terrifying, whilst the second is merely tardy.

Chart 1 Terrifying vs Tardy

Both sets of results have the same % of Detractors (below the red line) and Promoters (above the green line)…and so are assigned the same NPS score (which, in this case, would be -100). This comparison illustrates the significant dumbing down of data by lumping responses of 0 – 6 into the one category.

I’d want to clearly see the variation within the responses (such as in the bar charts shown), rather than have it stripped out for the sake of a ‘simple number’.
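To make the lumping-together point concrete, here’s a minimal sketch (in Python, with made-up data) of how two utterly different response sets collapse to the same score. The `nps` helper and the two distributions are hypothetical, purely for illustration:

```python
def nps(scores):
    """Net Promoter Score: % Promoters (9-10) minus % Detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Hypothetical data: 50 responses each, mirroring the charts above
terrifying = [0] * 50   # every respondent gave a catastrophic 0
tardy      = [6] * 50   # every respondent gave a casual 6

print(nps(terrifying), nps(tardy))  # both print -100
```

Fifty stories of catastrophe and fifty mild shrugs arrive at management’s desk as the identical number.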

You might respond with “but we do have that data….we just provide Senior Management with the single NPS figure”….and that would be the problem! I don’t want Senior Management making blinkered decisions2, using a single number.

I’m reminded of a rather good Inspector Guilfoyle poster that fits perfectly with having the data but deliberately not using it.

Comparison pair 2: ‘Polarised vs. Contented’

Below are two more NPS response charts for comparison….and, again, they both derive the same NPS score (-12 in this case) …and yet they tell quite different stories:

Chart 2 Polarised vs Contented

The first set of data uncovers that the organisation is having a polarising effect on its customers – some absolutely love ‘em …whilst many others are really not impressed.

The second set shows quite a warm picture of contentedness.

Whilst the NPS scores may be the same, the diagnosis is unlikely to be. Another example where seeing the variation within the data is key.

Comparison pair 3: ‘No Contest vs. No Show’

And here’s my penultimate pair of comparison charts:

Chart 3 No contest vs No show

Yep, you’ve guessed it – the two sets of response data have the same NPS scores (+30).

The difference this time is that, whilst the first chart reflects 50 respondents (out of the 100 surveyed), only 10 people responded in the second chart.

You might think “what’s the problem, the NPS of +30 was retained – so we keep our KPI inspired bonus!” …but do you think the surveys are comparable? Why might so many people not have responded? Is this likely to be a good sign? Can you honestly compare those NPS numbers? (perhaps see ‘What have the Romans ever done for us?!’)

….which leads me nicely onto the second part of this post:

Part 2 – Reliability

A 2012 article co-authored by Fred Reichheld (creator of NPS) identifies many issues that are highly relevant to compiling that one number:

  • Frequency: that NPS surveys should be frequently performed (e.g. weekly), rather than, say, a quarterly exercise.

The article doesn’t, however, refer to the essential need to always present the results over time, or whether/ how such ‘over time’ charts should (and should not) be interpreted.


  • Consistency: that the survey method should be kept constant because two different methods could produce wildly different scores.

The authors comment that “the consistency principle applies even to seemingly trivial variations in methodologies”, giving an example of the difference between a face-to-face method at the culmination of a restaurant meal (deriving an NPS of +40) and a follow-up email method (NPS of -39).


  • Response rate: that the higher the response rate, the greater the accuracy – which I think we can all understand. Just reference comparison 3 above.

But the article goes on to say that “what counts most, of course, is high response rates from your core or target customers – those who are most profitable…” In choosing these words, the authors reveal a goal of profitability, rather than customer purpose. If you want to understand the significance of this then please read ‘Oxygen isn’t what life is about’.

I’d suggest that there will be huge value in studying those customers who fall outside your current status quo.


  • Freedom from bias: that many types of bias can affect survey data.

The authors are clearly right to worry about the non-trivial issue of bias. They go on to talk about some key issues such as ‘confidentiality bias’, ‘responder bias’ and the whopper of employees ‘gaming the system’ (which they unhelpfully label as unethical behaviour, rather than pondering the motivations that the system causes – see ‘Worse than useless’).


  • Granularity: that of breaking results down to regions, plants/ departments, stores/branches…enabling “individuals and small teams…to be held responsible for results”.

Ouch….and we’d be back at that risk of bias again, with employees playing survival games. There is nothing within the article that recognises what a system is, why this is of fundamental importance, and hence why supreme care would be needed with using such granular NPS feedback. You could cause a great deal of harm.

Wow, that’s a few reliability issues to consider and, as a result, there’s a whole NPS industry being created within organisational customer/ marketing teams3…which is diverting valuable resources from people working together to properly study, measure and improve the customer value stream(s) ‘in operation’, towards each and every customer’s purpose.

Reichheld’s article ends with what it calls “The key”: the advice to “validate [your derived NPS number] with behaviours”, by which he explains that “you must regularly validate the link between individual customers’ scores and those customers’ behaviours over time.”

I find this closing advice amusing, because I see it being completely the wrong way around.

Rather than getting so obsessed with the ‘science’ of compiling frequent, consistent, high response, unbiased and granular Net Promoter Scores, we should be working really hard to:

“use Operational measures to manage, and [lagging4] measures to keep the score.” [John Seddon]

…and so to my last set of comparison charts:

Chart 4 Don’t just stand there, do something

Let’s say that the first chart corresponds to last month’s NPS survey results and the second is this month. Oh sh1t, we’ve dropped by 14 whole points. Quick, don’t just stand there, do something!

But wait…before you run off with action plan in hand, has anything actually changed?

Who knows? It’s just a binary comparison – even if it is dressed up as a fancy bar chart.

To summarise:

  • Net Promoter Score (NPS) has been defined as a customer loyalty metric;
  • There may be interesting data within customer surveys, subject to a heavy caveat around how such data is collected, presented and interpreted;
  • NPS doesn’t explain ‘why’ and any accompanying qualitative survey data is limited, potentially distorting and easily put to bad use;
  • Far better data (for meaningful and sustainable improvement) is to be found from:
    • studying a system in operation (at the points of demand arriving into the system, and by following units of demand through to their customer satisfaction); and
    • using operational capability measures (see ‘Capability what?’) to understand and experiment;
  • If we properly study and redesign an organisational system, then we can expect a healthy leap in the NPS metric – this is the simple operation of cause and effect;

  • NPS is not a system of management.

Footnotes

1. Dr Walter Shewhart (1891 – 1967) was the ‘father’ of statistical quality control. Deming was heavily influenced by Shewhart’s work and they collaborated closely.

2. Blinkered decisions, like setting KPI targets and paying out incentives for ‘hitting it’.

3. I should add that, EVEN IF the (now rather large) NPS team succeeds in creating a ‘reliable’ NPS machine, we should still expect common cause variation within the results over time. Such variation is not a bad thing. Misunderstanding it and tampering would be.

4. Seddon’s original quote is “use operational measures to manage, and financial measures to keep the score” but his ‘keeping the score’ meaning (as demonstrated in other pieces that he has written) can be widened to cover lagging/ outcome/ results measures in general…which would include NPS.

Seddon’s quote mirrors Deming’s ‘Management by Results’ criticism (as explained in the previous post).

Not Particularly Surprising

Have you heard people telling you their NPS number? (perhaps with their chests puffed out…or maybe somewhat quietly – depending on the score). Further, have they been telling you that they must do all they can to retain or increase it?1

NPS – what’s one of those?

‘Net Promoter Score’, or NPS, is a customer loyalty metric that has become much loved by the management of many (most?) large corporations. It was introduced to the management world by Fred Reichheld2 in his 2003 HBR article titled ‘The One Number You Need to Grow’.

So far, so what.

But, as with most things in ‘modern management’ medicine, once introduced, NPS took on a life of its own.

Reichheld designed NPS to be rather simple. You just ask a sample of subjects (usually customers3) one question and give them an 11-point scale of 0 to 10 to answer it. And that question?

‘How likely is it that you would recommend our company/product/ service to a friend or a colleague?’

You then take all your responses (which, incidentally, may be rather low) and boil them down into one number. Marvellous…that will be easy to (ab)use!

But, before you grab your calculators, this number isn’t just an arithmetic average of the responses. Oh no, there’s some magic to take you from your survey results to your rather exciting score…and here’s how:

  • A respondent scoring a 9 or 10 is labelled as a ‘Promoter’;
  • A scorer of 0 to 6 is labelled as a ‘Detractor’; and
  • A 7 or 8 is labelled as being ‘Passive’4.

where the sum of all Promoters, Detractors and Passives = the total number of respondents.

You then work out the % of your total respondents that are Promoters and Detractors, and subtract one from the other.

You’ll get a number between -100 (they are all Detractors) and +100 (all Promoters), with a zero meaning Detractors and Promoters exactly balance each other out.
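The ‘magic’ described above fits in a few lines of code. Here’s a sketch in Python (the function name and the sample responses are my own, purely for illustration):

```python
def net_promoter_score(responses):
    """Boil a list of 0-10 survey responses down to that one number."""
    total = len(responses)
    promoters  = sum(1 for r in responses if r >= 9)   # scored 9 or 10
    detractors = sum(1 for r in responses if r <= 6)   # scored 0 to 6
    passives   = total - promoters - detractors        # scored 7 or 8
    # NPS = % Promoters minus % Detractors: a number from -100 to +100
    return 100 * promoters / total - 100 * detractors / total

# A hypothetical set of eight responses: 3 Promoters, 2 Passives, 3 Detractors
print(net_promoter_score([10, 9, 9, 8, 7, 6, 3, 0]))  # prints 0.0
```

Note how the Passives count plays no part in the final subtraction – it only dilutes the two percentages.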

And, guess what…a positive score is desirable…and, over the long term, a likely necessity if you want to stay in business.

Okay, so I’ve done the up-front explanatory bit and regular readers of this blog are probably now ready for me to go on and attempt to tear ‘NPS’ apart.

I’m not particularly bothered by the score – it might be of some interest…though exceedingly limited in its usefulness.

Rather, I’m bothered by:

  1. what use it is said to be; and
  2. what use it is put to.

I’ve split my thoughts into two posts. This post deals with the second ‘bother’, and my next one will go back to consider the first.

Qualitative from Quantitative – trying to ‘make a wrong thing righter’

The sane manager, when faced with an NPS score and a ‘strategic objective’ to improve it, wants to move on from the purely quantitative score and ‘get behind it’ – they want to know why a score of x was given.

Reichheld’s NPS method covers this obvious craving by encouraging a second open-ended question requesting the respondent’s reasoning behind the rating just given – a ‘please explain’ comments box of sorts. The logic being that this additional qualitative data can then be provided to operational management for analysis and follow up action(s).

Reichheld’s research might suggest that NPS provides an indicator of ‘customer loyalty’, but…and here’s the key bit…don’t believe it to be a particularly good tool to help you improve your system’s performance.

There are many limitations with attempting to study the reasons for your system’s performance through a delayed, incomplete and second-hand ‘the horse has bolted’ method such as NPS.

  • Which subjects (e.g. customers) were surveyed?
  • What caused you to survey them?
  • Which subjects chose to respond…and which didn’t?
  • What effort from the respondent is likely to go into explaining their scoring?
  • Does the respondent even know their ‘why’?
  • Can they put their (potentially hidden) feelings into words?…and do they even want to?

If you truly want to understand how your system works and why, so that you can meaningfully and sustainably improve it, wouldn’t it just be soooo much better (and simpler) to jump straight to (properly5) studying the system in operation?!

A lagging indicator vs. Operational measures

One of my very early posts on this blog covered the mad, yet conventional, idea of ‘management by results’ and subsequent posts have delved into ‘cause and effect’ in more detail (e.g. ‘Chain beats Triangle’).

My ‘cause and effect’ post ends with the key point that:

“Customer Purpose (which, by definition, means quality) comes first…which then delivers growth and profitability, and NOT the other way around!”

Now, if you read up on what Reichheld has to say about NPS, he will tell you that it is a leading measure, whereas I argue that it is a lagging one. The difference is because we are coming from opposite ends of the chain:

  • Reichheld appears to be concerned with growth and profitability, and argues that NPS predicts what is going to happen to these two financial measures (I would say in the short term);

  • I am concerned with customer purpose, and an organisation’s capability at delivering against its customers’ needs. This means that I want to know what IS happening, here and now so that I can understand and improve it …which will deliver (for our customers, for the organisation, for its stakeholders) now, and over the long term.

You might read the above and think I am playing with semantics. I think not.

I want operational measures on the actual demands coming in the door, and how my processes are actually working. I want first hand operational knowledge, rather than attempting to reverse engineer this from partial and likely misleading secondary NPS survey evidence.

“Managers learn to examine results, outcomes. This is wrong. The manager’s concern should be with processes….the concentration of a manager should be to make his processes better and better. To do so, he needs information about the performance of the process – the ‘voice of the process’. “ [‘Four Days with Dr Deming’]

Deming’s clear message was ‘focus on the process and the result will come’ and, conversely, you can look at results all you like but you’d be looking in the wrong place!

NPS thinking fits into the ‘remote control’ school of management. Don’t survey and interrogate. ‘Go to the gemba’ (the place where the work occurs).

“But what about the Lean Start-up, Steve?”

Some readers familiar with Eric Ries’ Lean Start-up movement might respond “but Eric advocates the use of customer data!” and yes, he does.

But he isn’t trying to get a score from them, he is trying to deeply engage with a small number of them, understand how they think and behave when experiencing a product or service, and learn from this…and repeat this loop again and again.

This fits with studying demand, where it comes in, and as it flows.

The Lean Startup movement is about observing and reflecting upon what is actually happening at the point of customer interaction, and not about surveying them afterwards.

To close – some wise words

After writing this post I remembered that John Seddon had written something about NPS…so I searched through my book collection to recover what he had to say…and he didn’t disappoint:

“Even though NPS is completely useless in helping service organisations improve, on our first assignment [e.g. as system improvement interventionists] we say nothing about it, because we know the result of redesigning the system will be an immediate jump in the NPS score…and because when this is reported to the board our work gets the directors’ attention.

It makes it easy to see why NPS is a waste of time and money. First, it is what we call a ‘lagging measure’ – as with all customer satisfaction measures, it assesses the result of something done in the past. Since it doesn’t help anyone understand or improve performance in the present, it fails the test of a good measure5 – it can’t help to understand or improve performance.” [Seddon, ‘The Whitehall Effect’]

Seddon goes on to illuminate a clear and pernicious ‘red herring’ triggered by the use of NPS:  the simple question of ‘would you recommend this service to a friend’ mutates to a hunt for the person who delivered the particular instance of service currently under the microscope. Management become “concerned with the behaviour of people delivering the service” as opposed to the system that makes such behaviour highly likely to occur!

I have experience of this exact management behaviour in full flow, with senior management contacting specified members of staff directly (i.e. those who handled the random transaction in question) to congratulate or interrogate/berate them, following the receipt of particularly outstanding6 NPS responses.

This is to focus on the 5% (the people) and ignore the 95% (the system that they are required to operate within). NPS “becomes an attractive device for controlling them”.

Indeed.

The title of this post follows from Seddon’s point that if you focus on studying, understanding and improving the system then, guess what, the NPS will improve – usually markedly. Not Particularly Surprising.

My next post, ‘How good is that one number?’, contains the second part of my NPS critique.

Footnotes

1. This post, as usual, comes from having a most excellent conversation with a friend (and ex-colleague) …and she bought me lunch!

I should add that the title image (the pH scale) is a light-hearted satire of the various NPS images I found i.e. smiley, neutral and angry faces arranged on a coloured and numbered scale.

2. Reichheld has written a number of books on customer loyalty, with one of his more recent ones trying to relabel ‘NPS’ from Net Promoter Score to Net Promoter System (of management) …which, to put it mildly, I am not a fan of.

It reminds me of the earlier ‘Balanced Scorecard’ attempting to morph into a system of management. See ‘Slaughtering the Sacred Cow’.

Yet another ‘management idea’ expanding beyond its initial semblance of relevance, in the hands of book sellers and consultants.

Sorry, but that’s how I feel about it.

NPS is linked to the ‘Balanced Scorecard’ in that it provides a metric for the customer ‘quadrant’ of the scorecard …but, as with financial measures, it is still an ‘outcome’ (lagging) measure of an organisation’s people and processes.

3. The original NPS focused on customers, but this has subsequently been expanded to consider other subjects, particularly employees.

4. Being British (i.e. somewhat subdued), I find the labelling of a 7 or 8 score as ‘Passive’ to be hilarious. A score of 7 from me would be positively gushing in praise! What a great example of the variety inherent within customers…and which NPS cannot reveal.

5. For the ‘tests of a good measure’, please see an earlier post titled ‘Capability what?’

6. Where ‘outstanding’ means particularly low, as well as high.

The notion of ‘Leadership’

I’ve been re-reading a book on leadership by Elliott Jaques1 and, whilst I’m not smitten with where he took his ‘Requisite Organisation’ ideas2, I respect his original thinking and really like what he had to say about the notion of leadership. I thought I’d try and set this out in a post…but before I get into any of his work:

“When I grow up I want to be a leader!“

Over the years I’ve spoken with graduate recruits/ management trainees in large organisations about their aspirations, and I often hear that ‘when they grow up’ they ‘want to lead people’.

And I think “Really? Lead who? Where? Why?”

Why is it that (many) people think that ‘to lead’ is the goal? Perhaps it is because ‘modern management’ has rammed the ‘being labelled a leader IS success’ idea down our throats.

It seems strange to me that people feel the need ‘to lead’ per se. For me, whether I would want to lead (or not) absolutely depends…on things like:

  1. Is there a set of people (whether large or small) that needs leading?
    • If they don’t, then I shouldn’t be attempting to force myself upon them.
  2. Am I passionate about the thing that ‘we’ want to move towards? (the purpose)
    • If not, then I’m going to find it rather hard to genuinely inspire people to follow. I would be faking it.
  3. Do I (really) care about those that need leading?
    • If I don’t, then this is likely to become obvious through my words and deeds.
    • ‘really’ caring means constantly putting myself in their shoes – to understand them – and acting on what I find.
  4. Do I think I have the means to lead in this scenario? (e.g. the necessary cognitive capacity/ knowledge/ skills/ experience)
    • If someone else in the group (or close by) is better placed to lead in this scenario (for all the reasons above), then I should welcome this, and even seek them out3 – and not ‘fight them for it’.

I think we need to move on from the simplistic ‘I’m a leader!’ paradigm.

So, turning to what Jaques had to say…

Defining leadership

Jaques noted that “the concept of leadership is rarely defined with any precision”. He wrote that:

“Good leadership is one of the most valued of all human activities. To be known as a good leader is a great accolade… It signifies the talent to bring people together…to work effectively together to meet a common goal, to co-operate with each other, to rely upon each other, to trust each other.”

I’d ask you to pause here, and have a think about that phrase “to be known as a good leader…”

How many ‘good leaders’ have you seen?

I’d suggest that, given the number of people we come across in (what have been labelled as) ‘leadership’ positions, it is rare for us to mentally award the ‘good leader’ moniker.

We don’t give out such badges easily – we are rather discerning.

Why? Because being well led really matters to us. It has a huge impact upon our lives.

The ‘personality’ obsession

We (humans) seem to have spent much time over the last few decades trying to create a list of the key personality characteristics that are said to determine a good leader.

There have been two methods used to create such lists, which Jaques explains as follows:

“Most of the descriptions of leadership have focused on superiority or shortcomings in personal qualities in people and their behaviour. Thus, much has been written:

  • about surveys that describe what executives do who are said to be good at leadership; or 
  • about the lives of well-known individuals who had reputations as ‘good leaders’ as though somehow emulating such people can help.”

Jaques believed that ‘modern management’ places far too much4 emphasis on personality make-up.

If you google ‘the characteristics of a good leader’ you will be bombarded with list upon list of what a leader should supposedly look like – with many claiming legitimacy from ‘academic exercises’ that sought out a set of people who appear to have ‘done well’, collected a myriad of attributes about them, and searched for commonality (perhaps even using some nice statistics)…and, voila, that’s ‘a leader’ right there!

If you are an organisation desiring ‘leaders’, then all you then need do is find people like this. Perhaps, in time, you could pluck ‘a leader’ off a supermarket shelf.

If you ‘want to be a leader’, then all you need do is imitate the list of characteristics. After all, don’t you just ‘fake it to make it’ nowadays?

Mmm, if only leadership were so simple.

In reality, there is a huge range of personalities that will be able to successfully lead and, conversely, there will be circumstances where someone with (supposedly) the most amazing ‘leadership’ personality fit won’t succeed5. This will come back to leading the ‘who’, to ‘where’ and ‘why’.

Further, some of those ‘what makes a good leader’ lists contain some very opaque ‘characteristics’…such as that you must be ‘enthusiastic’, ‘confident’, ‘purposeful’, ‘passionate’ and ‘caring’. These are all outcomes (effects) from those earlier ‘it depends’ four questions (causes), not things that you can simply be!

Personally, I’ll be enthusiastic and purposeful about, say, reducing plastic waste in our environment but I won’t be enthusiastic and purposeful about manufacturing weapons! I suppose that Donald Trump and Kim Jong-un might be different.

Jaques wrote that:

“It is the current focus upon psychological characteristics and style that leads to the unfortunate attempts within companies to change the personalities of individuals or to maintaining procedures aimed at ‘getting a correct balance’ of personalities in working groups…

…our analysis and experience would suggest that such practises are at best likely to be counterproductive in the medium and long term…

…attempts to improve leadership by psychologically changing our ‘leaders’ serve mainly as placebos or band-aids which, however well-intentioned they may be, nevertheless obscure the grossly undermining effects of the widespread organisational shortcomings and destructive defects”

It really won’t matter what personality you (attempt to) adopt if you continue to preside over a system that:

  • lives a false purpose; and
  • attempts to:
    • command through budgets, detailed implementation plans, targets and cascaded objectives; and
    • control through rules, judgements and contingent rewards.

Conversely, if you help lead your organisation through meaningfully and sustainably changing the system, towards better meeting its (customer) purpose, then you will have achieved a great thing! And the people around you (employees, customers, suppliers… society) will be truly grateful – and hold you in high regard – even if they can’t list a set of ‘desirable’ traits that you displayed along the way.

Peter Senge, in his systems thinking book ‘The Fifth Discipline’ writes that:

“Most of the outstanding leaders I have had the privilege to know possess neither striking appearance nor forceful personality. What distinguishes them is the clarity and persuasiveness of their ideas, the depth of their commitment, and the extent of their openness to continually learning more.

They do not ‘have the answer’, but they seem to instil confidence in those around them that, together, ‘we can learn whatever we need to learn in order to achieve the results we truly desire’.”

To close the ‘personality’ point – Jaques believed that:

“The ability to exercise leadership is not some great ‘charismystery’ but is, rather, an ordinary quality to be found in Everyman and Everywoman so long as the essential conditions exist

…Charisma is a quality relevant only to cult leadership”.

We should stop the simplistic labelling of “this one here is a leader, and that one over there is not”.

Manager? Leader? Or are we confusing the two?!

So, back to that ‘Manager or Leader’ debate.

It feels to me that many an HR department hit upon the ‘leader’ word, say 10 years ago, and, considering it highly desirable, decided that it would be a good idea to do a ‘find and replace’ throughout their organisation’s lexicon: find wherever the word ‘Manager’ is used and replace it (i.e. in their eyes ‘upgrade’) with the word ‘Leader’.

And so we got ‘Team Leaders’ instead of ‘Team Managers’ and ‘Senior Leadership’ instead of ‘Senior Management’….and on and on.

And this changed everything, and nothing.

Jaques explained that:

“Leadership is not a free-standing activity: it is one function, among many, that occurs in some but not all roles.”

“Part of the work of the role [of a manager] is the exercise of leadership, but it is not a ‘leadership role’ any more than it would be called a telephoning role because telephoning is also a part of the work required.”

Peter Senge writes that:

“we encode a broader message when we refer to such people as the leaders. That message is that the only people with power to bring about change are those at the top of the hierarchy, not those further down. This represents a profound and tragic confusion.”

And so to three important leadership concepts: Accountability, Authority and Responsibility:

Accountability

Put simply, the occupant of a role is accountable:

  • for achieving what has been defined as requirements of the role; and
  • to the person or persons who have established that role.

Jaques writes that “management without leadership accountability is lifeless…leadership accountability should automatically be an ordinary part of any managerial role.”

Such leadership isn’t bigger than, or instead of, management – it is just a necessary part within. As such, it doesn’t make sense to say that “he/she is a good manager, but not a good leader”.

Authority

Authority is that which enables someone to carry out the role that they are accountable for.

“In order to discharge accountability, a person in a role must have appropriate authority; that is to say, authority with respect to the use of materials or financial resources or with respect to other people making it reasonably possible to do what needs to be done.”

Jaques goes on to split authority into ‘authority vested’ and ‘authority earned’.

“Role-vested authority by itself, properly used, should be enough to produce a minimal satisfactory result, by means of [people] doing what they are role bound to do. What it cannot do is to release the full and enthusiastic co-operation of others…

…personally earned authority is needed if people are to go along with us, using their full competence in a really willing and enthusiastic way; it carries the difference between a just-good-enough result and an outstanding or even scintillating one.”

In short, managers have to (continually) earn the trust and respect of their people.

(You might like to revisit an earlier post, People and Relationships, which explained Scholtes’ excellent diagram on trust.)

Responsibility

Let’s suppose that you are at the scene of a traffic accident. If you are on your own you will likely take on the social responsibility of doing the best you can in the circumstances. If others are there (say there is a crowd), you will likely assess whether you have special knowledge that is not already present:

  • is anyone attending to the injured? If not, what can you do?
  • If first aid is underway, do (you believe that) you know more than they appear to? Can you be of assistance to what they are already doing?
  • If the police are not yet there, what can you do to secure the safety of others, such as warning other traffic?

In such circumstances, nobody carries the authority to call you to account (unless you knowingly do something illegal).

Jaques explains this as the general leadership responsibility and does so to:

“show how deeply leadership notions are embedded in the most general issues of social conscience, social morality, and the general social good.”

“General leadership responsibility must apply even where a person’s role does not carry leadership accountability…[employees] must strive to carry leadership responsibility, even towards their managers, whenever they consider it to be for everyone’s good for them to do so.”

The understanding of the difference between leadership accountability and general leadership responsibility (for the good of society, or a sub-set within) makes clear that it is never a case of “I’m the leader and you’re not.”

Jaques went on to write that:

“The effective and sensible discharge of general leadership responsibility is one sign of a healthy collaborative organisation.”

…and finally, to react to a likely critique:

“You’re so naïve Steve!”

Many of you reading this post may think me naïve.  You may reply that there are, and will always be, people out there who want to feel the power and ego (self-importance) of being labelled as ‘a leader’…and yet (regardless of their words) don’t actually care about the ‘who, where and why’ of leading. You might cite a large swathe of politicians and senior corporate executives as evidence.

Yep, I’d agree that there will be people out there like this who will ‘play the game’ and work their way into (supposed) ‘leadership’ positions…but I don’t believe that such “I’m a leader!” people are likely to make ‘good leaders’ (in the sense of what Jaques defines as leadership). Sure, they can play the ‘leader’ game, but what really counts is whether a system (such as an organisation, or a community) meaningfully and sustainably moves towards its true purpose, for the good of society.

Senge writes that:

“the term ‘leader’ is generally an assessment made by others. People who are truly leading seem rarely to think of themselves in that way. Their focus is invariably on what needs to be done, the larger system in which they are operating, and the people with whom they are creating – not on themselves as ‘leaders’. Indeed, if it is otherwise, this is probably a problem. For there is always the danger, especially for those [installed into] leadership positions, of becoming ‘heroes in their own minds’.”

In summary:

If there is a need, and a person really cares about the purpose and the people, and they have the means then they will likely lead well – regardless of their personality type whilst doing so.

Conversely, it doesn’t matter what ‘an amazing person’ someone might (appear to) be if the conditions for ‘leading’ aren’t there.

‘Winning’ at becoming ‘the leader’ shouldn’t be the goal.

Footnotes

1. Elliott Jaques (1917 – 2003) was a Canadian psychoanalyst and organizational psychologist. 

2. Requisite Organisation: Jaques wrote a book called the Requisite Organisation, which puts many of his ideas together. Personally, I find the ideas interesting but ‘of a time’ and/or of a particular ‘hierarchical’ mindset.

3. Seeking out the person best placed to lead: This would be a sign that you cared more about the purpose and the people than leading.

4. Regarding ‘far too much emphasis on personality’: Notice that Jaques says ‘too much’ but he doesn’t say that personality is irrelevant. But, rather than come up with what qualities ‘we’ should have, he turns it the other way around. A managerial leader should have:

“The absence of abnormal temperamental or emotional characteristics that disrupt the ability to work with others.”

This is nice. It presumes that, so long as we aren’t ‘abnormal’ then any of us can lead given the necessary conditions.

5. Winston Churchill is often used as an example of a great leader – and he was, under certain circumstances…but many historians have written about how this didn’t carry through to every situation (such as running a country in peacetime).

 

Double Trouble

There’s a lovely idea which I’ve known about for some time but which I haven’t yet written about.

The reason for my sluggishness is that the idea sounds so simple…but (as is often the case) there’s a lot more to it. It’s going to ‘mess with my head’ trying to explain – but here goes:

[‘Heads up’: This is one of my long posts]

Learning through feedback

We learn when we (properly) test out a theory, and (appropriately) reflect on what the application of the theory is telling us i.e. we need to test our beliefs against data.

“Theory by itself teaches nothing. Application by itself teaches nothing. Learning is the result of dynamic interplay between the two.” (Scholtes)

Great. So far, so good.

Single-loop learning vs. Double-loop learning

Chris Argyris (1923 – 2013) clarified that there are two levels to this learning, which he explained through the phrases ‘single-loop’ and ‘double-loop’1.

Here are his definitions to start with:

Single-loop learning: learning that changes strategies of action (i.e. the how)…in ways that leave the values of a theory of action unchanged (i.e. the why).

Double-loop learning: learning that results in a change in the values of theory-in-use (i.e. the why), as well as in its strategies and assumptions (i.e. the how).

That’s a bit of a mouthful – and (with no disrespect meant) not much easier to comprehend when you read his book!2

If you look up ‘double loop learning’ on the wonders of Google Images, you will find dozens of (very similar) diagrams3, showing a visualisation of what Argyris was getting at.

Here’s my version4 of such a diagram:

Double loop 1

You can think about this diagram as it relates either to an individual (e.g. yourself) or at an organisational level (how you all work together).

Start at the box on the left. Whether we like it or not, we (at a given point in time) think in a certain way. This thinking comes about from our current beliefs and assumptions about the world (and, for some, what might lie beyond).

Our thinking guides our actions (what we do), and these actions heavily influence5 our performance (what we get).

And so to the ‘error’ bit:

“Organisations [are] continually engaged in transactions with their environments [and, as such] regularly carry out inquiry that takes the form of detection and correction of error.” (Argyris & Schon)

We are continually observing, and inquiring into, our current outcomes – asking ourselves whether we are ‘on track’, or everything is ‘as we would expect’ or perhaps whether we could do better. Such inquiry might range from:

  • subconscious and unstructured (e.g. just part of daily work); right through to
  • deliberate and formal (such as a major review producing a big fat report).

Argyris labels this constant inquiry as the ‘detection of error’. The error is that we aren’t where we would want to be, and the correction is to do something about this.

Okay, so we’ve detected an error and we want to make a corrective change. The easiest thing to do is to revisit our actions (and the strategies that they are derived from), and assess and develop new action strategies whilst keeping our underlying thinking (our beliefs and assumptions) steadfastly constant. This is ‘single-loop’ learning i.e. new actions, borne from the same thinking.


I reflect that the phrase ‘the more things change, the more they stay the same’ fits nicely here:

If the reason for the ‘error’ is within your thinking, then your single-loop learning, and the resultant change, won’t work. Worse, you will re-observe that error as it ‘comes round again’, and probably quicker this time…and so you make another ‘action’ change….and that error keeps on coming around. You have merely been making changes within the system, rather than changing the system.

A previous post called ‘making a wrong thing righter’ demonstrates this loop through the example of short term incentive schemes, and their constant revision “to make them even better”.


So, the final piece of the diagram…that green line. Many ‘errors’ will only be corrected through inquiry into, and modification of, our thinking…and, if this meaningfully occurs, then this would result in ‘double-loop’ learning – you would have changed the system itself.
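One way to make the single- vs double-loop distinction concrete is Argyris’s often-cited thermostat analogy: a thermostat that simply switches the heat on when the room falls below its target temperature is a single-loop learner; a device (or person) that also asks whether the target temperature is the right one in the first place is learning double-loop. Here’s a minimal Python sketch of that analogy – the function names, temperatures and the ‘comfort’ signal are mine, purely illustrative:

```python
def single_loop(temp, target):
    """Single-loop: correct the *action* (heat on/off) against a fixed
    target. The governing variable (the target) is never questioned."""
    return "heat on" if temp < target else "heat off"


def double_loop(temp, target, occupants_comfortable):
    """Double-loop: first inquire into the governing variable itself
    (is the target actually right for the people in the room?), revise
    it if not, and only then fall back to single-loop correction."""
    if not occupants_comfortable:
        target = target - 2  # change the 'why' (the system), not just the action
    return target, single_loop(temp, target)


# Single-loop learning: same thinking, new action
print(single_loop(18, 20))

# Double-loop learning: the target itself is revised; the action follows
print(double_loop(18, 20, occupants_comfortable=False))
```

The point of the sketch is that no amount of single-loop correction can fix an error that lives in the target (the thinking) itself – only the second, outer loop can reach it.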

Right, so that’s me finished explaining the difference between single-loop and double-loop learning…which I hope is clear and makes sense.

You may now be thinking “great, let’s do double-loop learning from now on!”

…because this is how most (if not all) of those Google Image diagrams make it look. I mean, now you know about it, why wouldn’t you?

But you can’t!

The bit that’s missing…

Unfortunately, there’s a wall. Worse still, this wall is (currently) invisible. Here’s the diagram again, but altered accordingly:

Double loop 2

Right, I’d better try and explain that wall. Argyris & Schon wrote that:

“People learn collectively to maintain patterns of thoughts and action that inhibit productive learning.”

What are they on about?

Imagine that, through some form of inquiry, an error (as explained above) has been detected and a team of relevant people commence a conversation to talk about it:

  • The hierarchically senior person begins with a ‘take charge’ attitude (assuming responsibility, being persuasive, appealing to larger pre-existing goals);
    • it is typical within organisations that, once goals have been decided, changing them is seen as a sign of weakness.

  • He/she requests a ‘constructive dialogue’, thereby stifling the expression of negative (yet real) feelings by themselves, and by everyone else involved…and yet acts as if this is not happening;
    • “each person in the group is therefore being asked to suppress their feelings – to experience them privately, censor them from the group, and act as if they are not doing so.”

  • He/she takes a rational approach and asks the group to develop a ‘credible plan’ (which becomes the objective) to respond to the error…and so has skipped the necessary organisational self-reflection for double-loop learning to occur.
    • Coming up with a plan is ‘jumping into solution mode’ before you’ve properly studied the current condition and asked ‘why’.

So how does this affect the group dynamics?

“The participants experience an interest in solving the business problem, but their ways of crafting their conversation, combined with their self-censorship, [will lead] to a dialogue that [is] defensive and self-reinforcing.” (Argyris & Schon)

Given that this approach will hide so much, we can expect lots of private conversations (pre-meetings to prepare for meetings, post-meetings about what was/wasn’t said in meetings, meetings about what meetings aren’t happening…). Does this describe what you sometimes see in your organisation? I think that it is often labelled as ‘politics’…. which would be evidence of that wall.

Taken together, Argyris and Schon label the above as primary inhibitory loops.

Argyris sets out a (non-exhaustive) list of conditions that trigger and, in turn, reinforce, such defensive and dysfunctional behaviour. Here’s the list of conditions, together with how they should be combated:

Condition → Corrective response

  • Vagueness → Specify
  • Ambiguity → Clarify
  • Un-test-ability → Make testable
  • Scattered information → Concert (arrange, co-ordinate)
  • Information withheld → Reveal
  • Un-discuss-ability → Make discussable
  • Uncertainty → Inquire
  • Inconsistency/incompatibility → Resolve

“[such] conditions…trigger defensive reactions…these reactions, in turn, reduce the likelihood that individuals will engage in the kind of organisational inquiry that leads to productive learning outcomes.” (Argyris & Schon)

i.e. If you’ve got defensive behaviour, look for these conditions… and work on correcting them. Otherwise you will remain stuck.

Unfortunately, primary loops lead to secondary inhibitory loops. That is, they lead to second-order consequences, and these become self-reinforcing.

  • Managers begin to (privately) judge their staff poorly, whilst the staff, ahem, ‘return the compliment’, with “both views becoming embedded in the organisational norms that govern relationships between line and staff”;

  • Sensitive issues of inter-group conflict become undiscussable. “Each group sees the other as unmovable, and both see the problem as un-correctable.”
    • A classic example of this is the constant conflict in many organisations between ‘IT’ and ‘The business’.

  • The organisation creates defensive routines intended to protect individuals from experiencing embarrassment or threat, “with the unintended side effect that this then prevents the identification of the causes of the embarrassment or threat in order to correct the relevant problems.”

From this we get organisational messages that:

  • are inconsistent (in themselves and/or with other messages);
  • act as if there is no inconsistency; and
  • make the inconsistency undiscussable.

“The message is made undiscussable by the very naturalness with which it is delivered and by the absence of any invitation or disposition to inquire about it.”

Do you receive regular messages from, say, those ‘above you’ in the hierarchy? Perhaps a weekly or monthly Senior Manager communication?

  • How often are you amazed (in an incredulous way) about what they have written or said?
  • Do you feel welcome to point this inconsistency out? Probably not.

We end up with people giving others advice to reinforce the status quo: ‘Be careful what you say’, ‘You’ll get yourself into trouble’, ‘I wouldn’t say that if I were you’, ‘Remember what happened last time’…etc.

In short, there are powerful forces* at work in most organisations that are preventing (or at least seriously impeding) productive learning from taking place, despite the ability and intrinsic desire of those within the organisation to do so.

(* Note: Budgets – as in fixed performance contracts – are a classic ‘single-loop reinforcing’ management instrument. Conversely, Rolling forecasts can be a ‘double-loop’ enabler.)

So what to do instead?

Right, here’s my third (and last) diagram:

 

Double loop 3

It looks very similar to the last diagram, but this time there’s a ladder! But where do we get one of those from?

“For double-loop learning to occur and persist at any level in the organisation, the self-fuelling processes must be interrupted. In order to interrupt these processes, individual theories-in-use [how we think] must be altered.” (Argyris & Schon)

Oooh, exciting stuff! They go on to write:

“An organisation with a [defensive] learning system is highly unlikely to learn to alter its governing variables, norms and assumptions [i.e. thinking] because this would require organisational inquiry into double-loop issues, and [defensive] systems militate against this…we will have to create a new learning system as a rare event.”

There are two places to go from here:

  • What would a productive learning system look like? and
  • How might we jolt the system to see the wall, and then attempt to climb the ladder?

If I can begin to tease these two out, then BINGO, this blog post is ready for print. Right, nearly there…

A Productive learning system

Argyris and Schon identify three values necessary for a productive learning system:

  • Valid information;
  • Free and informed choice; and
  • Internal commitment to the choice, including constant monitoring of its implementation.

Sounds lovely…but such a learning system requires the fundamental altering of conventional social virtues that have been taught to us since early in our lives. The following table ‘compares and contrasts’ the conventional with the productive:

Social virtue: Instead of… → Work towards…

  • Help and Support: giving approval and praise to others, and protecting their feelings → increasing others’ capacity to confront their own ideas, and to face what they might find.
  • Respect for others: deferring to others, and avoiding confronting their actions and reasoning → attributing to others the capacity for self-reflection and self-examination.
  • Strength: advocating your position in order to ‘win’, and holding firm in the face of advocacy → advocating your position, whilst encouraging inquiry of it and self-reflection.
  • Honesty: not telling lies, or (the opposite) telling others all you think and feel → encouraging yourself, and others, to reveal what they know yet fear to say; minimising distortion and cover-up.
  • Integrity: sticking to your principles, values and beliefs → advocating them in a way that invites inquiry into them, and encouraging others to do likewise.

There’s a HUGE difference between the two.

The consequences will be an enhancement of the conditions necessary for double-loop learning – with current thinking being surfaced, publicly confronted, tested and restructured – and therefore increasing long-term effectiveness.

You’d likely liberate6 a bunch of great people, and create a purpose-seeking organisation.

Intervention

The first task is for you to see yourself – you have to become aware of the wall…and Argyris & Schon are suggesting that you may (likely) require an intervention (a shake) to do this. Your current defensive learning system is getting in the way.

Let’s be clear on what would make a successful intervention possible, and what would not.

An interventionist would locate themselves in your system and help you (properly) see yourselves…and coach you through contemplating what you see and the new questions that you are now asking…and facilitate you through experimenting with your new thinking and making this the ‘new normal’. This is ‘action learning’.

This ‘new normal’ isn’t version 2 of your current system. It would be a different type of system – one that thinks differently.

Conversely, you will not change the nature of your system if you attempt to ‘get someone in to do it to you’.

Why not?

“Kurt Lewin pointed out many years ago that people are more likely to accept and act on research findings if they helped to design the research, and participate in the gathering and analysis of data.

The method he evolved was that of involving his subjects as active, inquiring participants in the conduct of social experiments about themselves.” (Argyris & Schon)

In short: It can’t be done to you.

That ladder? That would be a skilled interventionist, helping you see and change yourselves through ‘action learning’.

To Close

Next time someone shows you that lovely (as in ‘simple’) double-loop learning diagram, I hope you can tell them about the wall…and the ladder.

Footnotes

1. Chris Argyris is known as one of the co-founders of ‘Organisation Development’ (OD) – the study of successful organizational change and performance. Argyris notes that he borrowed the distinction between single-loop and double-loop from the work of W. Ross Ashby. For blog readers, we met Ashby in an earlier post on requisite variety.

2. Book: ‘Organizational Learning II: Theory, Method, and Practice’ (1996) by Chris Argyris and Donald A. Schon.

3. Diagrams: Many of the diagrams stay true to what Argyris wrote about. Some attempt to build upon it. Others (in my view) bastardise it completely!

4. Language: I should note that Argyris used different language to my diagram. Here’s a table that compares:

My diagram → Argyris and Schon

  • Thinking (our beliefs and assumptions) → Values, norms and assumptions
  • Action → Action strategies
  • Performance → Performance, effectiveness
  • Defensive learning system → Model O-I
  • Productive learning system → Model O-II

5. Influence: I haven’t used the bolder ‘cause’ word because there’s a lot going on that is outside the system (e.g. the external environment).

6. Liberate: You don’t need to bring in ‘new’ people, most of what you need are already with you – they just need liberating from the system that they work within.

7. Kurt Lewin: often referred to as ‘the founder of social psychology’. Much of my writing in this blog is based around Lewin’s equation, B = ƒ(P, E), or, in plain English, that behaviour is a function of the person in their environment.