Why is it So Difficult to Change to an Agile Approach?

Work in progress. Thinking piece at the moment

Or “What does Management need to understand as they start an organizational agile journey”

Your brain is making you a poorer manager / knowledge worker.

Premise

Managing shouldn't be so hard. Knowledge work shouldn't be so hard. For most of us it is all about figuring out what the customer wants, and then figuring out how to get it to them in the fastest possible time, preferably without killing ourselves in the process.

Or if you are a “Lean” person, the aim is to deliver value in “the shortest sustainable lead time”.

Yet most managers and knowledge workers demonstrably do not do this well. In many ways Agile is about rethinking the problem of value delivery, in ways that are very different from how you'd traditionally think about the problem.

And here is the problem - your brain is getting in the way. Not only is it hard to learn new tricks (see “Why Is Agile So Hard?”), but our brains are also wired in a way that actually gets in the way of being successful.

Most of us believe that we apply logic to the problems we face. In fact, if I were to ask a group of people about their main approach to business problems, most would say “we use logic to make decisions, especially key decisions”.

In reality we are hard-wired with inbuilt biases that actually get in the way of making good decisions. These biases cause us to misinterpret information and often push us to decisions that really do not make sense when looked at objectively, and that we may even regret.

Most of us have heard of these “biases” that affect our thinking. Here are some biases that affect management and knowledge work in particular.

Confirmation Bias

You start with an answer, and then search for evidence to back it up.

If you believe that big upfront plans are important for success, you'll probably read lots of blogs and follow twitter feeds by those who share the same view, and read books on how to improve your plans and planning process. If you are convinced that all plans and planning are evil, you'll probably search for others with similar opinions. Neither helps you separate emotions from reality.

A better approach is to adopt the intellectual strategy of Charles Darwin, who regularly tried to disprove his own theories, and was especially skeptical of his own ideas that seemed most compelling. The same logic should apply to management and knowledge work ideas.

Recency Bias

Letting recent events skew your perception of the future.

When a project goes south you think it'll last forever and that you'll never recover. Rarely is that actually the case – it's usually the other way around – but it's what feels right when memories are fresh in our minds.

Backfire Effect

When presented with information that goes against your viewpoint, you not only reject it, but double down on your original view.

Voters often view the candidate they support more favorably after the candidate is attacked by the other party. This applies to ideas as well as people. For example, many technical people get very attached to a technical product or solution, to the point where that solution is the “hammer” for every “nail” (as in “if all you have is a hammer, everything looks like a nail”). Mounting evidence to the contrary just makes these people double down on their preferred solution.

Anchoring

Letting one piece of irrelevant information govern your thought process.

Estimating offers a lot of examples of this. If you present a requirement and someone says “it'll be about 5 days”, most of the estimates will come in around that 5 days, even if the person saying it (for example, the person who wants the capability) has no actual information about how long it will take.

Framing Bias

Reacting differently to the same information depending on how it's presented.

The classic example is where an experiment is set up as follows:

Framing  | Treatment A           | Treatment B
Positive | “Saves 200 lives”     | “A 1/3 chance of saving all 600 people, and a 2/3 chance of saving no one.”
Negative | “400 people will die” | “A 1/3 chance that nobody will die, and a 2/3 chance that all 600 will die.”

Treatment A was chosen by 72% of participants when it was presented with positive framing (“saves 200 lives”), dropping to only 22% when the same choice was presented with negative framing (“400 people will die”).
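
To see why this is pure framing, note that the two treatments are mathematically identical. Here is a minimal sketch (using the numbers from the table above) that works out the expected value of each:

  # Both treatments have the same expected number of lives saved, so any
  # preference between them comes from the framing, not from the math.

  def expected_lives_saved(outcomes):
      # outcomes: a list of (probability, lives_saved) pairs
      return sum(p * lives for p, lives in outcomes)

  treatment_a = [(1.0, 200)]             # "saves 200 lives" for certain
  treatment_b = [(1/3, 600), (2/3, 0)]   # 1/3 chance of saving all 600

  print(expected_lives_saved(treatment_a))   # 200.0
  print(expected_lives_saved(treatment_b))   # 200.0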

Need a better example than this.

The bottom line is that a lot of organizations are risk averse, and so framing things in a negative way will lead to a disproportionately negative response.

Skill Bias

When education and training cause confidence to increase faster than ability.

There is an old saying: “In theory, theory and practice are the same thing. In practice …” Many people believe they know more about what is happening because of training and education, but in reality it's the people who are working the day-to-day issues who really know what is going on.

In fact it was Peter Drucker who defined a knowledge worker as “a person who knows more about the job than their boss”. And the Japanese say “no important invention happens in the office - you have to go and see”. But most of our approaches to knowledge work are focused on thinking rather than doing (e.g. the big upfront plan), and on assuming that the manager knows more than the people doing the work.

Hindsight Bias

Hindsight bias is the inclination, after an event has occurred, to see the event as having been predictable, despite there having been little or no objective basis for predicting it.

Our reaction to project plans that don't work out as expected is an example of this. In these situations our reaction is to ask “what was wrong with our plan” or “what was wrong with our estimates” and, looking back, we can probably find something to work on. The problem is that the next time we do this same type of work, we are looking forward, not backward, with the result that we cannot predict what might happen. And we keep doing this despite the evidence to the contrary.

Pessimism Bias

Underestimating the odds of something going right.

Many organizations will always try to reduce the downside of something going wrong. They do this by making early decisions in an effort to eliminate risk from the plan. The problem with this approach is that you are unable to take advantage of something positive that might happen. For example, leaving the decision to a later time, when you have more information, could result in totally new ways of solving the problem.
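
As a rough illustration of why deferring can pay off (the numbers here are invented for the sketch, not taken from any study), here is a simple simulation where deciding early locks in a “safe” option, while waiting lets you choose the better of two options:

  # Illustrative model only: an early decision locks in option A; waiting
  # until more is known lets you choose the better of A and a later option B.
  import random

  random.seed(1)
  trials = 100_000
  early_total = 0.0
  late_total = 0.0
  for _ in range(trials):
      a = random.uniform(50, 100)    # payoff of the early, "safe" choice
      b = random.uniform(0, 150)     # payoff of an option that emerges later
      early_total += a
      late_total += max(a, b)        # deferring keeps the upside open

  print(early_total / trials)   # ~75: expected payoff of deciding early
  print(late_total / trials)    # noticeably higher: the option to wait has value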

Halo Effect

“If we see a person first in a good light, it is difficult subsequently to darken that light,” writes The Economist.

And somehow we always attribute “success” to one identifiable person, the hero, ignoring everyone else's contribution to the result. And once someone is seen that way, success will automatically be attributed to that person.

This is especially interesting in relation to most work we do today. In a modern, reasonably sized organization, no one person delivers value to the customer. It takes a village. But we still tend to attribute positive (and negative) results to one person.

Illusion of Control

Thinking that your decisions and skill led to a desired outcome, when luck was likely a big factor.

Most systems we work with today are “complex adaptive systems”. Complex adaptive systems have the following characteristics:

  1. Emergence: The system manifests new behaviors that none of its components have
  2. Non-determinism: Exact outcomes cannot be predicted
  3. Non-linearity: Very small changes can lead to dramatic outcomes (see the sketch below)

Today, both the product we develop and the system we develop it in are complex adaptive systems. And since exact outcomes cannot be predicted, how can we know which of our upfront decisions actually caused the successful outcome? In reality a lot of success is related to luck …
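
As a minimal sketch of the non-linearity point, here is the logistic map, a textbook stand-in for a complex adaptive system (this example is illustrative, not specific to agile):

  # Two runs that start almost identically end up in completely different
  # places -- exact outcomes cannot be predicted from the starting point.

  def logistic_map(x0, r=4.0, steps=40):
      x = x0
      for _ in range(steps):
          x = r * x * (1 - x)   # a simple non-linear update rule
      return x

  print(logistic_map(0.300000))   # one outcome...
  print(logistic_map(0.300001))   # ...a wildly different one, from a change
                                  # of one part in 300,000 in the start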

Escalation of Commitment

The “sunk cost fallacy”.

“We have to spend more on this project as we have already spent so much.”

Doubling down on a failing project, not because you believe in its future, but because you feel the need to make back your losses.

Negativity Bias

Assuming perpetual doom, that problems will never be fixed, and that all hope is lost.

How many times do we not take the time to do the right thing because something else is more urgent? Many organizations fall into this trap, and then wonder why their people are depressed.

Ostrich Bias

Ignoring reality when it's screaming in your face, usually in an attempt to rationalize a certain viewpoint.

How many times do we have to see that estimation is not going to give an accurate result, yet still ask for the accuracy of estimates to be increased? Perhaps there is a limit to how accurate an estimate can be (there is), and perhaps we should build plans based on this knowledge.

Risk Perception Bias

Attempting to eliminate one risk, but exposing yourself to another, potentially more harmful, risk.

Survivorship Bias

From "You Are Not As Smart As You Think" “The Misconception: You should focus on the successful if you wish to become successful. The Truth: When failure becomes invisible, the difference between failure and success may also become invisible.”

Survivorship Bias is the logical error of concentrating on the people or things that made it past some selection process and overlooking those that did not, typically because of their lack of visibility. This can lead to false conclusions in several different ways.

See also XKCD on Survivorship Bias: “Every inspirational talk should start with a disclaimer on survivorship bias”. We have been successful, so we attribute our actions to that success, when it is not clear they had any effect on the success or failure.

This often combines with “attribution bias”.
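
A minimal simulation (with made-up numbers) shows how survivors can look skilled by pure chance:

  # 1000 "teams" succeed or fail at random each quarter. After 8 quarters a
  # handful have a perfect record purely by luck -- and if we only ever study
  # the survivors, their "practices" look like the cause of their success.
  import random

  random.seed(42)
  teams, quarters = 1000, 8
  survivors = sum(
      all(random.random() < 0.5 for _ in range(quarters))
      for _ in range(teams)
  )
  print(f"{survivors} of {teams} teams have a perfect record by pure chance")
  # Expect roughly 1000 / 2**8, i.e. about 4 such "unbeatable" teams.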

Self-Serving Bias

The self-serving bias can be seen in the common human tendency to take credit for success but deny responsibility for failure. It may also manifest itself as a tendency for people to evaluate ambiguous information in a way that is beneficial to their interests.

See http://ices.gmu.edu/wp-content/uploads/2010/07/Spring-08_Sandra-Ludwig.pdf for experiments.

“… and where all the children are above average.” – Lake Wobegon

Want to Know More

  • Original idea for this article (and some of the original list) came from a Motley Fool article, “15 Biases That Make You a Bad Investor”
  • Other ideas: Backward bicycle as bias - “knowledge” is not the same as “understanding”, and it is hard to unlearn
  • Other ideas: Three things AI finds hard. Idea that emotions drive decision making and that this is a good thing.
    • Hyperbolic Discounting (Going for an immediate payoff instead of a delayed larger one). E.g., not writing that automated test you should.
    • IKEA Effect (Overvaluing your own solutions to a problem, and thus in contrast undervalue other solutions). The IKEA effect is a cognitive bias in which consumers place a disproportionately high value on products they partially created. If you’ve ever worked for a company that used a dumb internal tool rather than a better out-of-the-box solution, you know what I’m talking about.
    • Premature Optimization (Optimizing before you know that you need to). For example, the perfectly coded prototype.
    • Planning Fallacy (Optimistically underestimating the time required to complete a task). We think in “ideal” time, and this comes apart in the real world. Also the old aphorism, “The first 90 percent of the code accounts for the first 90 percent of the development time. The remaining 10 percent of the code accounts for the other 90 percent of the development time.”
    • Recency Bias (Placing higher value on recent events than ones that occurred further in the past). “Do you find yourself using the same design patterns over and over again? If yes, you’re probably looking at different problems from the same lens.”
  • From "Teeing yourself up for investment success". “A few months ago, I came across an article1 in The Wall Street Journal that made me think twice about two important aspects of my life: my clients’ investments and my golf game. The article explained that most golfers overestimate their abilities when they use specialized clubs designed to optimize distance and control. According to researchers at Arccos Golf, poor club selection was the reason 40% of approach shots landed short of the green. Why? Golfers chose their club based on how far the ball would travel if they hit their best shot. Golfers aren’t the only ones who overestimate their abilities—investors allow inflated performance expectations to influence their behavior too. According to the Journal of Economic Perspectives,2 investors attribute strong portfolio performance and high returns to their skills, which leads to self-assurance. When the same investors experience poor performance and low returns, they attribute it to bad luck. The result: persistent overconfidence.”
  • Additional biases: Zeigarnik Effect; Question / Behavior Effect
  • Poster of biases to potentially use in workshop

To Do

Need to equate these to business issues and culture issues directly.

Idea - we talk about the negative biases. Is there a way to leverage the biases for a positive outcome?
