World Monitor Mag WM_June 2018 web | Page 94

In that context, the task of a strategic leader is to do what Bailey did: relabel those messages (becoming aware that they're simply messages, not reality), reframe them (substituting new, more wholesome, and more accurate messages), and refocus leaders' attention, again and again, on the new and more accurate messages until they, too, become second nature and part of the culture.
This difficult task is made a little bit easier because many deceptive organizational messages are prevalent in multiple organizations. Below are four of the most common categories.
1. Misperceptions of Risk
"Again and again," wrote economists Carmen Reinhart and Kenneth Rogoff, "countries, banks, individuals, and firms take on excessive debt in good times without enough awareness of the risks that will follow when the inevitable recession hits." The title of their book, This Time Is Different: Eight Centuries of Financial Folly (Princeton University Press, 2009), was a reference to the deceptive message voiced during the buildup to the financial crisis of 2008, and before similar crises throughout history: "We are doing things better, we are smarter, we have learned from past mistakes," wrote Reinhart and Rogoff, paraphrasing mistaken assessments of risk. "The old rules of valuation no longer apply."
Overconfident exceptionalism of this sort, in which executives underestimate the riskiness of their activity, has led many companies into complacency, and then to failure. We don't have to worry about losing customers, executives say when faced with an upstart competitor. They have nowhere else to go. Sometimes this type of deceptive message arises around a narcissistically heroic leader. Our CEO takes chances and always comes out on top. If the exceptionalism extends to the entire company, then managers get into the habit of overstepping boundaries or fudging numbers, growing bolder and bolder until the risks catch up with them.

The flip side of overconfident exceptionalism is excessive risk aversion. This can be equally debilitating, especially when it becomes a way of life. We must prevent — or at least prepare for — every possible failure. Excessive risk aversion often takes the form of accumulating as much support for a decision as possible before granting approval. It looks OK to me, but we can't take any chances. You'd better ask these other two people as well. It can also show up as "analysis paralysis" — refusal to move forward without considering every possibility in detail. As a result, decision makers shut down entrepreneurial decisions and forgo valuable opportunities — including the opportunity to learn from risky situations and build up their own capacity for judgment. Excessively risk-averse companies unintentionally take the greatest risk of all: being left behind because of the time spent in collective rumination.
As they did at Transpacific Industries, these two deceptive messages can coexist. Underlying them both is the perception that the decision maker's comfort level is an accurate indicator of risk. In reality, comfort levels are problematic indicators: They are derived from past experience with success (which might not continue) or painful failure (which need not happen again). Though the skill of risk assessment is fundamental to strategy, it is difficult to develop in the face of these deceptive organizational messages, especially when they aren't recognized as such.
2. Misperceptions of Value
Deceptive messages involving value provide a misleading idea of the potential worth of current endeavors. Often, these misperceptions are manifested as perfectionism, or all-or-nothing thinking: It should be completely flawless, or it won't have any value. A functional team might decide not to propose an interesting idea because they fear it isn't good enough. A research group might second-guess an innovation, drag it down with extra features, and delay it until it's eclipsed by rival offerings. A supervisor, considering promotions for the staff, might oscillate between extremes — treating a direct report as a star one year, but deeming that person a total screwup the next.
The opposite of all-or-nothing thinking is "ticking the box": accepting suboptimal work, as long as it complies with specifications. It's close enough for government work is a deceptive message of this sort. This type of message leads people to under-promise so that they can under-deliver without penalty, to dismiss improvement efforts as not worth the cost, and to look the other way when their colleagues cut corners.
Misperceptions of value often reflect a perspective that Stanford psychologist Carol Dweck calls the fixed mind-set. If everyone's basic worth is fixed in place by the time they come of age, limited by the talent, intelligence, and circumstances they have inherited or acquired as children, then static judgments of value make sense. As Dweck points out, a more accurate view is the "growth mind-set," or the idea that people can change habits, transcend limits, and expand their capabilities throughout their lives. Indeed, people continually do this through self-directed neuroplasticity. They focus their attention, over and over, in a way that builds new habits by etching new neural pathways in the brain. If you believe in the growth