Psych 101

Critical Thinking & Heuristics


Date: Aug 16, 2020

How to apply a cognitive science standpoint to the world of social media and practice being a good scientist

To be a good scientist, it’s healthy to have a touch of amiable skepticism. What is amiable skepticism? It’s a trait that combines openness to new ideas and scientific findings with wariness when good evidence and reasoning don’t seem to support them.

To practice amiable skepticism, you have to be a critical thinker. Critical thinking involves looking for holes in an argument, using logic and reasoning to see if information makes sense, and considering alternative explanations.

Without it, behaviors like confirmation bias pop up, especially in today’s world of “fake” news. Confirmation bias is the tendency to latch onto information that corroborates your existing beliefs.

To avoid this bias and keep an open mind, you can check yourself with these questions:

If your answer to the last question is yes, then science has helped you through psychological reasoning.

Psychological reasoning examines how people think and aims to understand when, and explain why, people are likely to draw erroneous conclusions. People want to make sense of the world and the events they’re involved in, so the brain works in efficient ways to find patterns and make connections between ideas. Sometimes, though, patterns appear where they do not really exist. For example, when looking at clouds, you may see faces, images, or animals. When playing music backwards, you may hear satanic or hidden messages, or maybe you think big events happen in threes. (“I’m not superstitious, but I am a little stitious.” – Michael Scott)

None of this is to say that, armed with these reasoning skills, people stop making errors and drawing biased conclusions. In fact, research has shown that we are often wrong, but wrong in predictable ways. And that’s not all bad: these errors can still lead to new discoveries and help advance society. Besides confirmation bias (ignoring evidence unless it fits your current beliefs), what are the other major biases, the “predictable ways we are wrong,” discussed in psychology?

Here’s a running list of cognitive biases, with examples of each:

1. Failing to accurately judge source credibility: Who can you trust?

This bias deals with appeals to authority: when sources refer to their expertise rather than to the evidence or the facts. Take the two viral videos that have run amok on social media since April (example 1 and example 2). Yes, they are doctors. Should you believe them just because of who they say they are and how many years they have been in the profession? A scientist with amiable skepticism would question their arguments, especially as evidence accumulates against them. Posed a different way: should you believe everything I have to say here just because I am a PhD candidate studying Cognitive Science? Since I’ve had 7+ years of experience in behavioral science, my arguments may carry more weight than those of your friend who took a high school psych class, but it’s your appeal-to-authority bias that decides that.

2. Seeing relationships that do not exist: Making something out of nothing

Just because two facts or trends correlate does not mean one caused the other. For example, global temperatures have been increasing while the number of pirates on the high seas has decreased. These two trends are correlated, but do you really think one caused the other?
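The pirates example can be made concrete with a few lines of code. Any two trends that move steadily in opposite directions over time will show a strong negative correlation, causal link or not. The numbers below are made up purely for illustration, not real climate or pirate data:

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient, computed straight from the definition."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Made-up values: temperatures drift up while pirate counts fall.
years = [1880, 1920, 1960, 2000]
global_temp = [13.8, 14.0, 14.2, 14.5]      # rising
pirate_count = [40000, 20000, 10000, 5000]  # falling

r = pearson_r(global_temp, pirate_count)    # strongly negative
```

The correlation comes out strongly negative (around -0.9) even though neither trend has anything to do with the other; both simply change over time.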

3. Using relative comparisons: Now that you put it that way

Relative comparison is when people use comparisons to judge the inherent value of something. For example, you probably feel a little better about getting an 85 on an exam after learning the class average was 75 rather than 95.

4. Accepting after-the-fact explanations: I can explain!

Feeling like you could have explained a situation after the fact is called hindsight bias. In other words, hindsight bias occurs when people come up with an explanation for why events happened, even when their information is incomplete.

(“I knew you were trouble when you walked in,” sings Taylor Swift, in typical hindsight fashion.)

5. Misunderstanding or not using statistics: Going with your gut

I’m one of the few natives of Vegas, so sometimes gambling feels like it’s in my blood. From experience, I know the machines and tables have the odds stacked against you, but people still take the risk for the chance at a bigger payout. Sometimes with facts and ideas, people take similar risks; there’s still the question of whether you’ll end up with the big payout or not.
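The “odds stacked against you” claim is easy to check with basic expected-value arithmetic. Here’s a small sketch using the standard math for a $1 single-number bet in American roulette (38 pockets, a win pays 35 to 1); this is textbook casino math, not anything specific to this post:

```python
def expected_value(outcomes):
    """Sum of payoff * probability over a list of (payoff, probability) pairs."""
    return sum(payoff * prob for payoff, prob in outcomes)

# $1 single-number bet in American roulette: 38 pockets, win pays 35 to 1.
single_number_bet = [
    (35, 1 / 38),    # hit your number: win $35
    (-1, 37 / 38),   # miss: lose your $1 stake
]

ev = expected_value(single_number_bet)  # about -$0.053 per dollar bet
```

The expected value works out to -2/38, roughly a nickel lost per dollar, every spin, forever. Going with your gut doesn’t change the math.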

6. Taking mental shortcuts: Keeping it simple

Every decision is made under some degree of risk, and quite often, decision-making involves heuristics. Heuristics are fast and efficient strategies that people use to make these decisions, like common mental shortcuts, rules of thumb, or informal guidelines. The thing to keep in mind about heuristics is that they often operate unconsciously, meaning we’re usually not aware we’re taking these mental shortcuts. Here are a few examples of heuristics:

Availability heuristic
  • The availability heuristic is the general tendency to make a decision based on the answer that comes most easily to mind. When we think or make decisions, we rely on the information that is easy to retrieve.
  • Think about it in the context of this question: Is the letter R more commonly the first letter in a word, or the third letter? How would you determine the answer?

The answer is the third letter, btw.
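Words starting with R come to mind easily, while words with R third (“car,” “word,” “park”) don’t, which is exactly why the gut answer is wrong. You could settle the question by brute-force counting instead of recall. Here’s a sketch of that counting logic over a tiny made-up word list; a real test would run it over a full dictionary file:

```python
def letter_position_counts(words, letter="r"):
    """Count how often `letter` appears as the first vs. the third letter."""
    first = sum(1 for w in words if len(w) >= 1 and w[0].lower() == letter)
    third = sum(1 for w in words if len(w) >= 3 and w[2].lower() == letter)
    return first, third

# Tiny illustrative sample, not a representative corpus.
sample = ["road", "rain", "car", "care", "born", "word", "three",
          "fire", "more", "large", "park", "arrange", "rest"]

first, third = letter_position_counts(sample)  # third-position R wins here
```

Counting replaces the availability heuristic (what’s easy to retrieve) with the actual frequencies.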

Representativeness heuristic
  • The true definition of this heuristic is “a tendency to place a person or object in a category if the person or object is similar to our own prototype for that category.” In other words, when we are presented with a new situation, our past experiences play a role in finding some similarities in this new experience in order to guide our decisions.
  • This is a heuristic I apparently LOVE crashing into on Instagram. When users see professional headshots, polished photos, or someone who appears to have it all together (because perception is not always reality), they think “IG influencer” or “model.” I am neither, but once that photo pops up in their feed, people automatically lump it into the influencer/model category, since that’s what such photos most often represent on Instagram, and don’t consider the alternative: that this photo represents what a scientist looks like.
Affective forecasting heuristic
  • How many times have you taken a risk or made a decision based on the belief it will make you happy? Or avoided one because it could make you sad? Guess what: you’ve applied affective forecasting by predicting how you will feel about things in the future. This behavior helps explain why we often overestimate how happy we’d be after positive events, like getting married or winning a competition, and likewise overestimate the extent to which a negative event, like a breakup or losing a job, will affect us in the future.

7. Failing to see our own inadequacies (self-serving bias): Everyone is better than average

As humans, we are positive-thinking creatures and are motivated to feel good about ourselves; however, this motivation can affect how you think and highlights why people have difficulty seeing their own weaknesses. For example, about 90% of drivers think they are better than average. Reality check: only half of drivers can be above the median on any given dimension.

Now that I’ve listed these biases with examples, have you seen yourself using any of these biases?

I know I have, which is why I have worked hard to maintain an open mindset versus a closed mindset. If you close yourself off, then you fall directly into the self-serving bias.

I hope this gives you some insight into how and what others are thinking when discussing ideas about complex human behavior.

Resource: Gazzaniga, M., Heatherton, T., and Halpern, D. (2016). Psychological Science. 5th Ed. (New York: Norton).
