
Weekly 3: Make the most of your errors


May 12 · Issue #86

We combine 3 ideas to help you think differently and be more creative.

Summary: Make a sandwich. Pay attention to your blind side. See failure as a teacher. (~4 min read)
Note: We believe there’s value in looking at familiar things with a fresh perspective. Ideas #1, #2, and #3 are each from previously published issues, and we’ve combined them here to explore a new core theme: making the most of errors.

#1. Take your mistakes seriously, but not personally
Daniel Coyle writes in The Little Book of Talent that most of us are allergic to mistakes. When we make one, our instinct is to look away and pretend it didn’t happen. 
But that’s a problem because mistakes are our “guideposts for improvement.” 
Brain-scan studies show that right after we make a mistake, there’s a “vital instant,” a fraction of a second when we have 2 choices: we can look hard at the mistake, or we can ignore it.
Effective practice is about finding and fixing mistakes, and one way to help make sure that you don’t repeat a mistake in the future is what Coyle calls the “sandwich technique.”
Here’s how to use it in 3 steps:
  1. Make the correct move
  2. Make the incorrect move
  3. Make the correct move again
The goal is to reinforce the correct move, but also to put a spotlight on the mistake, so that you can prevent it “from slipping past undetected and becoming wired into your circuitry.”
#2. What would you see if you were wrong?
Business executive and entrepreneur Margaret Heffernan writes in her book Beyond Measure that information wants to be different: “If everyone brings the same knowledge, then why have five people in the room when you could just have one?”
As Heffernan points out, unanimity is a sign that participation isn’t wholehearted.
You can have more effective discussions and reach better decisions by seeking out disconfirming information and perspectives.
One way to do this is to ask: What would you see if you were wrong?
Heffernan tells the story of Herb Meyer, who worked for the Central Intelligence Agency (CIA) and used this approach to become one of the first people in the world to accurately predict the fall of the Soviet Union.
Meyer was responsible for producing the US National Intelligence Estimate, but he grew increasingly uncomfortable with the information he received because it only confirmed the prevailing wisdom: that the Cold War was still going strong, and that the Soviet Union was as powerful as ever.
Meyer then made a list of all the things that might happen if the prevailing wisdom were wrong and the Soviet Union was actually collapsing, and sent it to the spy networks.
It was a low-cost experiment: if they saw nothing, then the prevailing wisdom was accurate.
But one of the first data points that came back was news that a weekly meat train had been hijacked, with all of the meat stolen. The Soviet army had been contacted, but the country’s ruling party told the army to stand down and not tell anyone.
This is how Meyer himself recounted the events: “Well, that’s not what happens when everything in the economy’s just fine, is it? … So that started to tell us something. And then there was more like that.”
#3. Learn from “intelligent failures” to increase your organization’s odds of success
Being innovative means, by definition, working in an uncertain environment.
Columbia Business School Professor Rita Gunther McGrath writes in the Harvard Business Review that while “failure is inevitable” in such a context, treating failures as intelligent failures can teach you useful lessons.
She recommends 7 practical principles to help your organization better plan for, manage, and learn from failure:
Principle 1: At the very beginning of a project, ensure that everyone agrees on the same definition of success.
Principle 2: Make your assumptions explicit, and design small experiments to test and revise them where possible.
Principle 3: Fail quickly – “quick, decisive failures” have several benefits, from limiting the resources that could be lost to shrinking the time needed to establish cause-and-effect relationships.
Principle 4: Fail cheaply – similar to Principle 3, reduce your downside risk by testing a prototype or adopting 3M’s philosophy of “make a little, sell a little.”
Principle 5: Limit uncertainties “at any particular decision point” whenever possible.
Principle 6: Build a culture that encourages and celebrates intelligent risk-taking, and doesn’t punish people for any resulting failures.
Principle 7: Capture the lessons you learn, and share them with others in your organization.
Quote of the Week
“Being right keeps you in place. Being wrong forces you to explore.”
- Author Steven Johnson in his book Where Good Ideas Come From
Idea Journal