This plays out in meetings, too, where
diversity goals can be undermined by
these messy proxies to the extent that
we rely on proxies that hinder particular
groups: Height gives men and people
from certain nations (whose populations
tend to be taller) an advantage, while
loud personalities put introverts at a
disadvantage, which similarly affects
people from cultural backgrounds that
tend to foster a soft-spoken nature. This
phenomenon applies to both psychological
and demographic diversity.
People are not naturally skilled
at figuring out who they should
be listening to. But by combining
organizational and social psychology
with neuroscience, we can get a
clearer picture of why we’re so
habitually and mistakenly deferential,
and then understand how we can work
to prevent that from happening.
How Proxies Play Out in the Brain
The brain uses shortcuts to manage
the vast amounts of information that
it processes every minute in any given
social situation. These shortcuts allow
our nonconscious brain to deal with
sorting the large volume of data while
freeing up capacity in our conscious
brain for dealing with whatever
cognitive decision making is at hand.
This process serves us well in many
circumstances, such as having the
reflex to, say, duck when someone
throws a bottle at our head. But it can
be harmful in other circumstances,
such as when shortcuts lead us to fall
for false expertise.
At a cognitive level, the biases that
lead us to believe false expertise
are similarity (“People like me are
better than people who aren’t like
me”); experience (“My perceptions of
the world must be accurate”); and
expedience (“If it feels right, it must
be true”). These shortcuts cause us
to evaluate people on the basis of
proxies — characteristics such as height,
extroversion, and gender that have no
bearing on expertise — rather than on
more meaningful measures.
The behavioral account of this
pattern was first captured by
breakthrough research from Daniel
Kahneman and the late Amos Tversky,
which eventually earned Kahneman the
Nobel Memorial Prize in Economic
Sciences and informed his bestseller
Thinking, Fast and Slow. Their
distinction between so-called
System 1 thinking, a “hot” form
of cognition involving instinct, quick
reactions, and automatic responses,
and System 2 “cool” thinking, or
careful reflection and analysis, is very
important here. System 1 thinking
can be seen as a sort of autopilot. It’s
helpful in certain situations involving
obvious, straightforward decisions
— such as the ducking-the-bottle
example. But in more complicated
decision-making contexts, it can
cause more harm than good — for
instance, by allowing the person
with the highest rank in the meeting
to decide the best way forward,
rather than the person with the best
idea.
Taking Steps to Combat Your Own Decision-Making Bias
Given the extent to which Western
business culture puts a premium
on individualism and fast decision
making, it’s understandable that
so many people have been trained
to go their own way as quickly and
confidently as possible. The good
news is that with the right systems
in place, people can be trained
to approach problem solving in a
different, less bias-ridden way.
Although we humans may have biased brains, we also have the capacity to nudge ourselves toward more rational thinking.
Although we cannot block a biased
assumption of which we are unaware,
we can consciously make an effort
to direct our attention to the specific
information we need to evaluate, and
to weigh it consciously. Just about
any sort of decision can get hijacked
by mental shortcuts, so it’s useful to
have a few tools to nudge yourself
and others toward more reflective,
rigorous, and objective thinking.
Set up “if-then” plans.
To redirect attention away from
these proxies of expertise, you can
formulate “if-then” plans, which help
the anterior cingulate cortex — a
brain region that allows us to detect
errors and flag conflicting information
— find differences between our actual
behavior and our preferred behavior.
By incorporating this type of bias-
mitigation plan before we enter into
a situation where we know a decision
will be made, we increase our chances
of making optimal decisions.
For example, you can say to yourself:
“If I catch myself agreeing with
everything a dominant, charismatic