Climbing Biases

Although philosophers have been thinking about, writing about, and addressing fallacies since Socrates in the Ancient World, the conception of fallacies as cognitive biases was not introduced to psychology until the early 1970s by psychologists Amos Tversky and Daniel Kahneman.  Since then, their paper ‘Judgment Under Uncertainty: Heuristics and Biases’ has influenced an enormous wave of research into the role of biases in human decision making.

Like logical fallacies, cognitive biases have no agreed-upon exhaustive list.  Some lists include over 200 biases, while others reduce them all to as few as five.  Whatever the count, understanding some biases can help us recognize our own blind spots.  It can also raise our awareness of how others come to conclusions that radically diverge from our own, especially in politics, religion, and other deeply held personal convictions.

Never wanting to avoid controversy, we decided to take a look at some common biases and useful examples as a means to investigate some divisive climbing heuristics.

ANCHORING EFFECT: Relying too much on the initial piece of information offered when making decisions. (It’s a challenge to re-evaluate a belief once we accept it.)

I started out as a boulderer; I just can’t get into sport climbing.

AVAILABILITY HEURISTIC: Overestimating the importance and likelihood of events given the greater availability of information. (Escaping a conviction when our personal and digital algorithms provide no alternative information is nearly impossible.)

Granite is the only real climbing there is.  Growing up around Yosemite, all my climbing partners agree that this is the only type of rock that matters.

BANDWAGON EFFECT: Uptake of beliefs and ideas increases the more they have already been adopted by others.  (There is no way so many people would believe this if it wasn’t true.)

Why do all these dirtbags wear Patagonia jackets?  Does it make them better climbers?

BLIND SPOT BIAS: Viewing oneself as less biased than others. (“I don’t really have any biases, trust me, I would know”)

If you ever want an accurate description of the difficulty of a route, feel free to ask me!

CONFIRMATION BIAS: Focusing only on information that confirms existing preconceptions.  (Seeing a poll where 50% of the population agrees with your position: “I knew I was right!” Yeah, but doesn’t that also mean that 50% disagree?)

This is the easiest 5.11 in the gym; half the people who tried it sent it!

COURTESY BIAS: Giving an opinion/conclusion that is viewed as more socially acceptable so as to avoid causing offense/controversy. (Kindness is important in most interactions, but we shouldn’t let fear of controversy get in the way of the truth.)

Despite not liking any of the new setter’s routes, Kat said she enjoyed a particular route in order to avoid hurting their feelings.

ENDOWMENT EFFECT: The tendency for people to ascribe more value to things merely because they already own/have them. (Mine is better than yours, simply because it’s mine.)

I know we have the same brand of shoes, but mine stick a little better because of how I break them in!

STEREOTYPING: Assuming a person has certain characteristics because they are a member of a group. (In logic we call this a Hasty Generalization, and it can be dangerous to reason from a small sample to a whole group.)

Oh, you’re a climber, how many beanies do you own?

STATUS QUO BIAS: Preferring the current state of affairs over change. (Change can be scary and unknowns can be debilitating; there is comfort in the familiar, even when the familiar is also holding us back.)

I can’t say that I am a fan of the new policy changes in the gym; what was wrong with the way things were?

RISK COMPENSATION: Taking bigger risks when perceived safety increases; being more careful when perceived risks increase.  (This is the reason that people are less likely to wear a seatbelt when they are closer to home.)

I’d feel much safer climbing this runout if you used a GriGri instead of the Pilot.

REACTIVE DEVALUATION: Devaluing an idea because it originated from an adversary.  (How could someone from the other side possibly know anything?)

Why would I trust a Speed Climber who offers beta?  What they do barely counts as climbing!

POST-PURCHASE RATIONALIZATION: The tendency to retroactively ascribe positive attributes to an option one has selected.  (Of course the side I’m on or the thing I picked is the best; if it wasn’t, I wouldn’t have picked it.)

I have the best rope, the best harness, the best shoes, I climb at the best gym, and I know the best crags.  I only pick these things because they are the best.

OSTRICH EFFECT: Avoiding negative information by pretending it doesn’t exist.  (It’s easy to avoid criticism as long as you always look the other way.)

Yeah, I heard a pop in my finger, but I’ll be fine, so there’s no need to worry, and certainly no need to see a doctor.

HYPERBOLIC DISCOUNTING: Preferring a smaller, sooner payoff over a larger, later payoff.  (Taking immediate gratification even at the expense of long-term deleterious consequences.)

I think it’s fine to develop this section of virgin rock.  Yeah, there are negative effects on the environment, but look at the potential to climb; we can worry about future problems in the future!

GAMBLER’S FALLACY: Believing that the future probabilities are altered by past events, when in fact they are unchanged.  (As philosopher George Santayana said: “Those who cannot remember the past are condemned to repeat it”.)

The last time I climbed in Vegas, I didn’t bring enough water and got really dehydrated.  I packed the same amount of water this time, but I’m sure I’ll be fine!

And so, I’m pretty sure I’m committing some bias by only listing 15 examples, but hopefully the point still stands.  We all suffer from blind spots, and sometimes our biases can have massive negative consequences, so it’s important to be open to the possibility that the beliefs we hold and the decisions we make are not quite as rational as we may presume. Further, those with diverging beliefs may be subject to scrutiny, but it’s important to remember that even those we cannot relate to suffer from the same limits of rationality.  Even in times of great disagreement, it does no good to discredit a person when we should instead discredit the process.  As philosopher Ice T once said: “Don’t hate the player, hate the game”.

Carrot