Work in progress. Thinking piece at the moment
Or “What does Management need to understand as they start an organizational agile journey”
Your brain is making you a poorer manager / knowledge worker. Managing shouldn't be so hard. Knowledge work shouldn't be so hard. For most of us it is all about figuring out what the customer wants, and then figuring out how to get it to them in the fastest possible time, preferably without killing ourselves in the process.
Or, if you are a “Lean” person, the aim is to deliver value with “the shortest sustainable lead time”.
Yet most managers and knowledge workers demonstrably do not do this well. In many ways, Agile is about rethinking the problem of value delivery, which is very different from the way you'd traditionally think about the problem.
And here is the problem: your brain is getting in the way. Not only is it hard to learn new tricks (see Why Is Agile So Hard?), but our brains are also wired in a way that actually gets in the way of being successful.
Most of us believe that we apply logic to the problems we face. In fact, if I were to ask a group of people about their main approach to business problems, most would say “we use logic to make decisions, especially key decisions”.
In reality we are hard-wired and have inbuilt biases that actually get in the way of making good decisions. These biases cause us to misinterpret information and often push us to decisions which really do not make sense when looked at objectively, and which we may even regret.
Most of us have heard of these “biases” that affect our thinking. Here are some biases that affect management and knowledge work in particular.
You start with an answer, and then search for evidence to back it up.
If you believe that big upfront plans are important for success, you'll probably read lots of blogs and follow twitter feeds by those who share the same view, and read books about how to improve your plans and planning process. If you are convinced that all plans and planning are evil, you'll probably search for others with similar opinions. Neither helps you separate emotions from reality.
A better approach is to adopt the intellectual strategy of Charles Darwin, who regularly tried to disprove his own theories, and was especially skeptical of his own ideas that seemed most compelling. The same logic should apply to management and knowledge work ideas.
Letting recent events skew your perception of the future.
When a project goes south you think it'll last forever and that you'll never recover. Rarely is that actually the case – it's usually the other way around – but it's what feels right when memories are fresh in our minds.
When presented with information that goes against your viewpoints, you not only reject challengers, but double down on your original view.
Voters often view the candidate they support more favorably after the candidate is attacked by the other party. This applies to ideas as well as people. For example, many technical people get very attached to a technical product or solution, to the point where that solution is the “hammer” to every “nail” (as in “if all you have is a hammer, everything looks like a nail”). Increasing contrary evidence just makes these people double down on their preferred solution.
Letting one piece of irrelevant information govern your thought-process.
Estimating offers a lot of examples of this. If you present a requirement and someone says “it'll be about 5 days”, most of the estimates will come in around 5 days, even if the person saying it (for example, the person who wants the capability) has no actual information about how long it will take.
Reacting differently to the same information depending on how it's presented.
The classic example is where an experiment is set up as follows:
| Framing | Treatment A | Treatment B |
| --- | --- | --- |
| Positive | “Saves 200 lives” | “A 33% chance of saving all 600 people, 66% possibility of saving no one.” |
| Negative | “400 people will die” | “A 33% chance that no people will die, 66% probability that all 600 will die.” |
Treatment A was chosen by 72% of participants when it was presented with positive framing (“saves 200 lives”) dropping to only 22% when the same choice was presented with negative framing (“400 people will die”).
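The striking part is that the two treatments are statistically identical, so the preference swing is pure framing. A quick expected-value check, using the 600-person numbers from the experiment above (with 33%/66% taken as the exact 1/3 and 2/3), makes this plain:

```python
# Expected lives saved under each treatment in the 600-person scenario.
total = 600

# Treatment A: a certain outcome ("Saves 200 lives" / "400 people will die").
expected_a = 200

# Treatment B: a gamble with a 1/3 chance of saving everyone, 2/3 of saving no one.
p_save_all = 1 / 3
expected_b = p_save_all * total + (1 - p_save_all) * 0

print(expected_a)  # 200
print(expected_b)  # 200.0
```

Both expected values are 200 lives, so any shift in preference between the positive and negative framing is driven entirely by the wording, not the odds.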
Need a better example than this.
The bottom line is that a lot of organizations are risk averse, and so framing things in a negative way will lead to a disproportionately negative response.
When education and training causes confidence to increase faster than ability.
There is an old saying: “In theory, theory and practice are the same thing. In practice …” Many people believe they know more about what is happening because of training and education, but in reality it's the people who are working the day-to-day issues that really know what is going on.
In fact it was Peter Drucker who defined a knowledge worker as “a person who knows more about the job than their boss”. And the Japanese say “no important invention happens in the office – you have to go and see”. But most of our approaches to knowledge work are focused on thinking rather than doing (e.g. the big upfront plan), and on assuming that the manager knows more than the people doing the work.
Hindsight bias is the inclination, after an event has occurred, to see the event as having been predictable, despite there having been little or no objective basis for predicting it.
Our reaction to project plans that don't work out as expected is an example of this. In these situations our reaction is to ask “what is wrong with our plan” or “what was wrong with our estimates” and, looking back, we can probably find something to work on. The problem is that the next time we do this same type of work we are looking forward, not backward, with the result that we cannot predict what might happen. And we keep doing this despite the evidence to the contrary.
Underestimating the odds of something going right.
Many organizations will always try to reduce the downside of something going wrong. They do this by making early decisions in an effort to eliminate risk in the plan. The problem with this approach is that you are unable to take advantage of something positive that might happen. For example, leaving the decision to a later time, when you have more information, could result in totally new ways of solving the problem.
“If we see a person first in a good light, it is difficult subsequently to darken that light,” writes the Economist.
And somehow we always attribute the “success” to one identifiable person, the hero, ignoring everyone else's contribution to the result. And once someone is seen that way, success will be automatically attributed to that person.
This is especially interesting in relation to most work we do today. In a modern, reasonably sized organization, no one person delivers value to the customer. It takes a village. But we still tend to attribute positive (and negative) results to one person.
Thinking that your decisions and skill led to a desired outcome, when luck was likely a big factor.
Most systems we work with today are “complex adaptive systems”: systems made up of many interacting agents that adapt to each other, with the result that exact outcomes cannot be predicted.
Today, both the products we develop and the systems we develop them in are complex adaptive systems. And since exact outcomes cannot be predicted, how many of the upfront decisions we make actually caused the successful outcome? In reality a lot of success is related to luck …
The “sunk cost fallacy”.
“We have to spend more on this project as we have already spent so much.”
Doubling down on a failing project, not because you believe in its future, but because you feel the need to make back your losses.
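The rational test deliberately ignores money already spent and compares only the remaining cost against the expected return. A minimal sketch, using hypothetical numbers:

```python
def should_continue(remaining_cost, expected_value, sunk_cost=0):
    """Decide whether to keep funding a project.

    sunk_cost is deliberately unused: money already spent cannot be
    recovered whichever way we decide, so it must not influence the
    decision. Only what we still have to spend, versus what we expect
    to get back, matters.
    """
    return expected_value > remaining_cost

# Hypothetical project: 900 already spent, 300 still needed,
# expected to return 250.
print(should_continue(remaining_cost=300, expected_value=250, sunk_cost=900))
# False: walk away, no matter how much has already been spent.
```

The sunk cost fallacy is, in effect, letting that unused third argument sway the answer.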
Assuming perpetual doom, that problems will never be fixed, and that all hope is lost.
How many times do we not take the time to do the right thing because something else is more urgent? Many organizations fall into this trap, and then wonder why their people are depressed.
Ignoring reality when it's screaming in your face, usually in an attempt to rationalize a certain viewpoint.
How many times do you have to see that estimation is not going to give you an accurate result before you stop asking for ever more accurate estimates? Perhaps there is a limit to how accurate an estimate can be (there is), and perhaps we should build plans based on that knowledge.
Attempting to eliminate one risk, but exposing yourself to another, potentially more harmful, risk.
From "You Are Not So Smart": “The Misconception: You should focus on the successful if you wish to become successful. The Truth: When failure becomes invisible, the difference between failure and success may also become invisible.”
Survivorship Bias is the logical error of concentrating on the people or things that made it past some selection process and overlooking those that did not, typically because of their lack of visibility. This can lead to false conclusions in several different ways.
XKCD on Survivorship Bias: “Every inspirational talk should start with a disclaimer on survivorship bias”. We have been successful, so we attribute the success to our actions, when it is not clear those actions had any effect on the success or failure.
This combines with “attribution bias”: “The self-serving bias can be seen in the common human tendency to take credit for success but deny responsibility for failure. It may also manifest itself as a tendency for people to evaluate ambiguous information in a way that is beneficial to their interests.”
“… and where all the children are above average.” – Lake Wobegon
Need to equate these to business issues and culture issues directly.
Idea - we talk about the negative biases. Is there a way to leverage the biases for positive outcomes?