
The seven deadly sins of reading the news

What are the logical biases and cognitive fallacies we have when absorbing information?

One of my podcast subscriptions is The Art of Charm which covers topics related to social science and cognitive psychology.

One of their recent episodes is an interview with Alex Kouts, a startup guy based on the West Coast of the US.

He has been researching logical biases and cognitive fallacies in the news for the past several years and categorised them into what he calls The Seven Deadly Sins of Reading the News.

It’s one of the best podcasts I’ve listened to, and certainly one of the best The Art of Charm has done, so I took notes to refer back to later.

See also: Media manipulation and how to protect yourself.

We live in a world now where complicated facts and information are made into bite-sized chunks or what Kouts calls “content snacks” and we use all kinds of cognitive biases when it comes to consuming information.

Below are the notes I’ve put together. From here on, these are words I’ve transcribed from the show.

The amount of complex issues that are dominating the news cycle is mind-boggling. Things that are so complicated, so multifaceted, so deep and nuanced are often reduced to insane context-stripped content snacks.

You’re taking a very complex issue and stripping away all the context around it to turn it into a visceral feeling.

And that’s happening all the time. The news cycle is dominated by these content snacks that represent visceral feelings on complex issues, stripping away the requisite complexity for people to actually understand what’s going on.

Over the past couple of decades in addition to our political climate, we’ve seen a massive unchecked explosion of information that comes into us. But the crime of the information age is that this unchecked explosion of information has far outpaced our ability to process that information.

We’re getting more and more content on a regular basis with a lower and lower ability, proportionately speaking, to process that data.

This is causing the Seven Deadly Sins of Reading the News. These are:

1. Reasoning by proxy

Reasoning by proxy is offloading the cognitive work of making up our minds on complex things to other people or organisations. Whatever that person or organisation believes, we believe too.

The number of proxies that have been made available to us has exploded. There are bloggers writing on every topic you can imagine. There are people everywhere telling you their points of view and why you should believe something, and we’ve lowered the bar for what it takes [to be influenced by a proxy] in response to the incredible number of proxies available to us.

2. Confirmation bias

This is the most common cognitive bias to betray our ability to understand complex issues: the tendency to search for or interpret data in a way that confirms our existing beliefs. We gravitate towards things that agree with us. You see this constantly in the way that people consume content. Most folks want to be pissed off or validated by the news.

People usually don’t have a greater kind of virtue to be informed. They either want to be pissed off or feel validated.

This is similar to the filter bubble. One of the concerns with social media today is that if I’m a product manager at Facebook, my entire goal is to get you to interact with content on the site. From Facebook’s perspective this makes a lot of sense: they want to give you content that you’re likely to engage with, and that means they’re more likely to give you content you’ll agree with, which confirms your pre-existing notions about things and feeds you more content like it.

Facebook is addressing this issue but Facebook is only one place it happens. We do it with newspapers we buy, websites we frequent and friends we choose to have conversations about politics with. If I have a friend who I know has a different view of politics than me I’m less likely to speak with him about it because I don’t want to get into an argument.

3. Selection bias

This is picking and choosing selective data and then drawing conclusions based on it and then extrapolating that conclusion to a larger set of things. So it’s using a small amount of data to get an understanding of a bigger complex system when in reality that small data may not actually predict how that large system operates.

For example, if my car gets broken into in my neighbourhood, I think the neighbourhood is going to shit and so is the whole country. I’ve taken one isolated incident that happened to my car and said the neighbourhood has problems; then I’ve said the entire country is going to hell. I’ve extrapolated from a small sample, which is intellectually irresponsible and is a bias we use constantly.

We see it all the time in the news and we saw it a ton in the election where you had each candidate taking individual stories of folks and isolating them as victims of a crime and suggesting that that particular crime is happening all over.

The most common area where politicians do this is crime. Crime is where we bastardise small bits of data to extrapolate to and influence perceptions of the larger system. So the average American believes that crime is going up in America, but over the past 25 years crime has been decreasing steadily.

Negative news spreads faster, has a bigger emphasis and sells more than positive news.

“The sound of one tree falling is louder than a thousand trees growing.”

4. Bandwagon effect

The more other people are saying something the more likely I am to want to say that. The more people believe something the more likely I want to believe it too.

Why? Because I want to be on the bandwagon. I want to belong to that social group. I want to be part of that movement.

Which is great. There is something about humans that makes us want to come together in communities and behave sympathetically, so if you believe something or say something, I’m inclined to agree with you.

At the same time, that tendency makes it difficult to stop and say, “Wait a second, do I believe this because it’s true, or because I’m overridden by this social zeitgeist of wanting to belong?”

There are five major types of social proof:

  1. Geographic
  2. Crowds
  3. Friends
  4. Celebrities
  5. Experts

Take the ALS ice bucket challenge, which I think is a wonderful thing as it raised so much money for ALS.

But ALS is not a very well understood disease and I had lots of friends who did the ice bucket challenge without being able to really tell you what ALS is or what the symptoms are.

5. Strawman fallacy

Misrepresenting someone else’s argument to make it easier to attack. Jon Stewart, Stephen Colbert and John Oliver all do exactly that.

They take complex arguments and figure how to simplify them and draw satire based on them. It’s a logical fallacy and you see it a lot with political punditry and the vast majority of TV shows that talk about politics.

They’re all misrepresenting someone else’s argument to make it easy to attack. The weird thing that happens based on that is, when you misrepresent an argument and attack a point of view and someone else argues back against that, they’re arguing against your misrepresentation of an argument as opposed to the actual argument.

And it goes back and forth so that they’re effectively abstracting the conversation away from the actual viewpoint to the point where they’re spinning out of orbit.

The conversation becomes more emotional and less fact-based with every rhetorical volley that goes back and forth.

6. Appeal to emotion fallacy

This is one of the more common logical fallacies, where people use emotion instead of facts to win an argument. Kouts went to the RNC and DNC that year, both political conventions back-to-back. It was the most emotionally taxing two weeks of his life.

What you see at both conventions is parade after parade of emotional argument furthering some type of political viewpoint. Both political parties used the same emotional ploys for everything.

You sit there and parent after parent of a slain child comes on stage. You can’t argue with a parent whose child was slain, but you can say, “I understand what’s happening here.”

They are picking this person because they’re trying to use an emotional ploy, the appeal to emotion fallacy, to push my opinion in one direction.

“The war that we’re fighting is a war inside ourselves and being able to develop our own opinion.”

This happens in the news cycle as well. Repeal of Obamacare is one of the great examples of this. It’s not “What should the government’s role in healthcare be?” it’s “Do you want people to die?” and that’s the argument and quite frankly that’s brilliant PR.

It’s visceral, immediate, gets people to read, share and take action because they don’t want to have people die and they don’t want to be seen as complicit in helping people die. We have to be very conscious of appeal to emotion fallacy.

7. False cause fallacy

You see this constantly: people imply causation between two particular effects when there isn’t one. Most commonly this is confusing correlation with causation.

Take studies related to coffee that say it’s either good or bad for you because a group of people who drank coffee for a while experienced ‘this thing’. The study fails to give a scientific explanation of why ‘this thing’ happened, but people take it as causation.

A lot of the time when studies like that get pushed, people confuse correlation with causation. We see a lot of this in politics.

For example, the idea that a border wall will solve all border security problems when in reality it may solve some of them but a lot of people look at it as a panacea.

A wall doesn’t address the root causes of the immigration crisis because there are a lot of other factors involved. It’s much more complex and nuanced.

How do we solve the problem?

  • Recognise you have a problem. Be mindful of when you have an emotional reaction to a certain point of view and ask yourself why.
  • Take the time to think about it. Be less trusting of other people’s opinions. The phrase I like the best is by Bill Bullard, “Opinions are the lowest form of human knowledge. They require no accountability or understanding.” Don’t trust someone because they have an opinion. Trust someone if they have a reason to say the right thing.
  • Avoid attachment to our ideas. We want to strive to have strong opinions that are loosely held. This is one of the biggest virtues when reading the news.
  • You have to be comfortable enough to say you don’t know enough to have an opinion. Ask someone a question on a complex issue and they’ll struggle to shit out some kind of half-baked ridiculous opinion, when in reality the responsible thing is to say, “I just don’t know enough to have an opinion yet. What do you think?”
  • Entertain the possibility that somebody who is principled, logical and intelligent believes something that makes your blood boil. People assume that if someone is on the opposite side of an argument there is something wrong with them, that they have a defect. In reality, we are all shaped by our personal experiences.

The big takeaway from all of this is that there are reasons why these logical fallacies and cognitive biases exist.

They’re not just defects; they actually provide value to us. They do betray our ability to understand things rationally in many cases, but falling victim to the seven sins can actually improve your quality of life.

Imagine all the mental cycles you save by abstaining from the cognitive drag of thinking critically about some social issue. Ignorance is bliss. There’s a reason why we do it.

As extreme as it is in its efficiency, it’s equally extreme in its moral and intellectual irresponsibility, and that’s the key issue for us. Ignorance is bliss, but it’s an irresponsible bliss.

Comments

  1. Fascinating. Worth pointing out too that these are well-established tools of rhetoric – used since forever to persuade. It’s the dark currency of the news media, and it will never change because there is no such thing as objective truth – have a look at Chomsky & Herman’s Propaganda Model, which is as relevant now as it’s ever been. We need to develop better bullshit detectors and (even basic) skills of critical analysis, so we can understand and gain a balanced perspective. I think WikiTribune is going to go some way to addressing that.

  2. This is a great podcast episode! I’ve listened to it twice and it’s my favourite podcast on AoC so far. With the onset of fake news, these biases and fallacies have become even more relevant. While listening to the episode, and after reading your notes, I actually realised how many times I’ve been guilty of these fallacies and biases. A podcast episode that everyone should listen to.

    • Couldn’t agree more. I’ve listened to it more than once and there’s so much in there I had to make notes. Always good to re-listen to it (or re-read it) to remind yourself.
