  • Each week, 20 Minutes invites a personality to comment on a social phenomenon in its "20 Minutes with…" feature.

  • In just a few years, Thomas C. Durand and his accomplice Vled Tapas have gathered 228,000 subscribers on their YouTube channel. On the program: critical thinking applied to "bullshit".

  • The trained biologist has just published La Science des balivernes with éditions Humensciences. For 20 Minutes, he explains why we believe certain received ideas.

The WHO supposedly recommends taking 10,000 steps a day.

The Great Wall of China is said to be the only building visible to the naked eye from the moon.

These received ideas are what the biologist and writer Thomas C. Durand has chosen to call "nonsense".

In the book La Science des balivernes*, released a few days ago, he asks why we firmly believe certain ideas that we sometimes know to be false.

For 20 Minutes, Thomas C. Durand, who also co-hosts "La Tronche en biais", a channel very popular on YouTube, looks back at this "nonsense" and at cognitive biases, the mental shortcuts our brains take that shape our daily lives without our always being aware of it.

Why do some ideas permeate our brain more than others?

You cite the example of the Great Wall of China, supposedly the only building visible to the naked eye from the moon, or the claim that "we only use 10% of our brain"…

Be careful, most of the things we hear and repeat are true.

But there are others that we repeat because they make a good story; it's pleasant to believe them.

The nonsense about the Great Wall of China, for example.

Well, Thomas Pesquet took another photo a few days ago. He says he can't see it with the naked eye!

The nonsense is refuted, but it makes a great story, so we will hear it again.

There is a lot of "nonsense" about health.

You take the example of semi-precious stones, which supposedly emit energies beneficial to our health, or the belief that there are more births on full-moon nights…

The health sector is particularly rich in nonsense.

We love to believe things about health, no doubt because it is a very important area in which we often feel powerless.

Believing certain things about health allows us to imagine that we have some control over things.

Nonsense also serves to reassure us and limit stress, and from that point of view, it works.

So you don't necessarily have to pass a negative judgment on these beliefs.

Except when health is at stake, in which case it's best to believe true things.

Some nonsense is repeated over and over.

Shouldn't this be seen as a failure of critical thinking?

We can see it like that, but again, most of the things we repeat are true: "2 + 2 = 4", "Paris is the capital of France".

My book shouldn't be another brick used to sort people into "good guys" and "bad guys"; on the contrary!

We are all equal when faced with statements.

If we practice, we can all resist misleading statements a little better.

There is never any guarantee that one is "bullshit"-proof.

We all have wrong things in our heads that we don't want to question.

Most of the time, that's okay.

Critical thinking is full of holes, but it exists in everyone!

You devote a large part of the book to what is called cognitive biases.

In everyday life, is it important to know that we are reasoning with these biases?

Is it important to act against them or with them? 

We see them in others.

The purpose of the book is to remind everyone that these biases exist in everyone.

Bias is not proof that the brain is screwed up.

These are just mental shortcuts that in most situations lead us to conclusions that are close to reality, close enough for the reasoning to be considered valid.

These biases are multiple.

We are not aware of them on a daily basis…

At present, we do not yet know how we can improve people's critical thinking.

But presenting the biases is a good lead.

We have to be aware of biases because they are exploited by marketing.

If you really don't want to be a sheep, you have to know them.

Marketing has recipes, whether marketers know it or not, that tap precisely into the blind spots of people's rationality to get them to say yes where they would have said no had they truly weighed their personal interests.

People who are a bit conspiratorial, who reject official theories, have an interest in knowing about cognitive biases.

They are in a process where they do not want to be manipulated.

They confuse this desire with the idea: "It's fine, since I don't want to be manipulated, I can't be." But as soon as we are sure of ourselves, it's a deadly trap, because we stop being careful.

In the book, you explain that these biases are found in many situations, even where one might expect impartiality.

For example, you cite a scientific study carried out on court decisions in Spain.

Certain decisions show a bias linked to the indictment or, in the context of an appeal, to previous convictions, and the final judgment is influenced by it…

In most situations, humans have an interest in watching how other humans behave and modeling their own behavior on it.

If people are running and screaming in the street, it's generally not for nothing. Maybe it's a hoax, but betting on that is costly.

It's better to run, because maybe there's a wild beast running wild or some madman shooting people.

On the other hand, when you are a judge, when you have rules to follow and should be objective, it turns out that you are not, because human beings are not objective.

Even magistrates will be a little conservative in their judgments and will tend to follow the judgments of others.

You use the expression “think against yourself”.

Is this the method you recommend to free yourself from cognitive biases?

There is no silver bullet.

You will not get rid of the biases.

The best proof is optical illusions: even when you know what is in front of you, sometimes you can't see reality as it is.

The cognitive biases are going to stay because the brain is made like that.

There are situations where we arrive at a judgment that is false, even when we know it is false: there are professionals, in hospitals, who believe that there are more births on full-moon days.

Even if the data says otherwise, they feel they have seen it, that they have experienced it, that the logical connection is real.

Can we still have warning signals?

The warning signs are other people, those around you.

That's why we have to be interested in people who don't think like us.

We can't just stay in circles where we all agree.

Having a little bit of disagreement with our friends is the best thing we can do.

The solution is also not to believe that you are immune.

Reading books on cognitive biases will not prevent you from having them!

You say you don't want to be the white knight of rationalism ...

We are seen like that, we zeteticians [those who apply scientific methods to fields outside science's usual scope], the skeptics, especially because some people who identify with this community adopt a righter-of-wrongs attitude.

They are motivated by good feelings and they are a bit too sharp in their judgment.

Our job is not to wage war on people; it is to do prevention work and popular education, and to help people ask themselves why they think what they think, because most people don't want to believe nonsense.

Do you have plans for the YouTube channel?

We have a format called "The Little Shop of Errors", where we present "fallacies" [misleading arguments] in a slightly funny way, with fictional characters: a master and his disciple who hunt the bizarre creatures that fallacies are.

We're going to shoot this summer and it'll come back in September.

And at least once a month we do a two-hour livestream where experts come to talk about different subjects.

We are going to talk about ball lightning, cancer and alternative medicine, and especially the received ideas around them and how people arrive at such conclusions.

* La Science des balivernes was published on May 25 by éditions Humensciences.

