This is an interesting idea and the essay raises many good points, but I'm not sure how much I agree with the main conclusion. Another recent Aeon essay I read was in a very similar vein, concerned more with pseudoscience than politics, but the idea is much the same: that people with radical beliefs are just as rational as the rest of us, but have different standards of trust (or for various other reasons simply trust different sources).

We don’t have to attribute a complete disinterest in facts, evidence or reason to explain the post-truth attitude. We simply have to attribute to certain communities a vastly divergent set of trusted authorities. Listen to what it actually sounds like when people reject the plain facts – it doesn’t sound like brute irrationality. One side points out a piece of economic data; the other side rejects that data by rejecting its source. They think that newspaper is biased, or the academic elites generating the data are corrupt.

Certainly as regards pseudoscience, I strongly disagree. If you actually go and engage with people, you will find that they generally are being wholly irrational, and all too often it sounds very clearly like brute, outright stupidity. In my experience hardly any of them are really more concerned with being objective than the rest of us. Flat Earthers, in particular, tend to be religiously motivated and, like Creationists, will automatically disregard anything from any source whatsoever that contradicts their existing view. Even their own senses cannot be trusted, because, to be blunt, "God did it". Such reasoning is by its very nature not objective in the slightest (the most extreme of all, who include some very intelligent people, simply do not accept the existence of an external, objective reality).

More generally, other crackpot viewpoints tend to be of a similar ilk, albeit with superficially different motivations: they tend to try to alter the facts to fit their views rather than the other way around - thinking that what they know is a fact, so anything that contradicts it must be wrong by definition. They cherry-pick to an absurd degree, holding whatever supports their view to be unimpeachable but anything else to be obviously wrong, with little consistency as to which sources count as supporting or disproving them. There's no higher standard of objectivity at work there.

Now, as regards both alternative scientific theories and political viewpoints, it is certainly possible to reach radically different conclusions based on the same data; I don't mean to suggest that anyone deviating from the scientific consensus (or indeed my own strongly left-liberal political leanings) is being a moron. Far from it. For example, of the people who regularly engage on my threads, I think just about every single damn one of you has got at least one opinion on which I think you're being really quite thick. This doesn't mean I don't like ya, because, you see, the reverse is also true: I don't think I entirely disagree with any of you. Everyone's got something useful to say about something. And yes, I'm sure quite a few of you think that on at least a few issues I'm behaving moronically - and of course some fraction of this will be correct. You know what? That's fine. We don't have to "agree to disagree". We can simply disagree, and still be friends. Unless you're a Nazi, that is.

But:

And, in many ways, echo-chamber members are following reasonable and rational procedures of enquiry. They’re engaging in critical reasoning. They’re questioning, they’re evaluating sources for themselves, they’re assessing different pathways to information. They are critically examining those who claim expertise and trustworthiness, using what they already know about the world. It’s simply that their basis for evaluation – their background beliefs about whom to trust – are radically different. They are not irrational, but systematically misinformed about where to place their trust.

Well, not exactly. They may well have the capacity to process information, and to form and evaluate conclusions, in a logical manner, and I'm sure at least some of them really are, in effect, simply victims of chronic misinformation. But I think it would be a terrible mistake - or at best an oversimplification - to infer that this is what happens in the majority of cases. Those who aren't interested in challenging their own views are not really being rational at all (or if you prefer, they are not engaging in critical thinking). If you go around only seeking out sources which support your existing view, and insist that others must be discredited purely on the grounds that their conclusions are obviously wrong, without examining the reasoning behind those conclusions, you are hardly being rational. You have already formed a viewpoint and are being an evangelical activist. You are not interested in the truth unless you actively examine contrasting viewpoints and try your best to give them a fair hearing.

Now this doesn't mean everyone is going to eventually come to the same conclusions, or hold no moral convictions or ideals whatsoever; this, as the article points out, would be "more than we could reasonably expect of anybody". You can't expect everybody to hold rational views about everything unless you're a total plonker. But it's good practice to remember that some people disagree because they too are critical, rational thinkers who've taken the time and trouble to examine issues in detail - which doesn't oblige you to agree with them. All that can be reasonably expected of a genuine truth seeker is that, circumstance and time permitting, they engage in a sincere effort to find the truth and honestly question their existing views. For example, I've got followers (and in turn follow people) with whom I profoundly disagree regarding, say, Brexit, gun control, abortion, religion, capitalism, gender equality, UFOs, free speech, etc. I don't think most of these people are idiots stubbornly refusing to see reason; I just disagree with them.

That said, there are some views and combinations of views which are red flags. I'm unlikely to befriend someone who disagrees with me on all of the above issues, and there are some views so utterly stupid that I would seriously question the intelligence of anyone holding them, just as I would if someone told me that water isn't wet*. I think it's both reasonable and unavoidable to have some beliefs that just won't budge.

* Let's have no smart-alecky BS about wetness being perception, thankyouverymuch.

Another, somewhat different quibble I have is with how the article - quite properly - distinguishes between the different sorts of "bubble" people can become trapped in:

An epistemic bubble is when you don’t hear people from the other side. An echo chamber is what happens when you don’t trust people from the other side... An ‘epistemic bubble’ is an informational network from which relevant voices have been excluded by omission... An ‘echo chamber’ is a social structure from which other relevant voices have been actively discredited. Where an epistemic bubble merely omits contrary views, an echo chamber brings its members to actively distrust outsiders.

I'd have had it the other way around. If you go into a physical echo chamber, you go in there to hear your own voice. You don't go in there to discredit other people; you go there to hear what you want to hear, excluding rather than attacking dissenting opinions. This distracting nomenclature aside, however, the article makes the further valuable point that one of these - the bubble that excludes other views - is relatively easy to burst, whereas the one that actively attacks other views is much harder.

They are isolated, not by selective exposure, but by changes in who they accept as authorities, experts and trusted sources. They hear, but dismiss, outside voices. Their worldview can survive exposure to those outside voices because their belief system has prepared them for such intellectual onslaught. In fact, exposure to contrary views could actually reinforce their views... anybody who criticises him is doing it at the behest of a secret cabal of evil elites, which has already seized control of the mainstream media. His followers are now protected against simple exposure to contrary evidence. In fact, the more they find that the mainstream media calls them out for inaccuracy, the more their predictions will be confirmed.
[Lightly edited to make a more general point]

This, then, might be why the backfire effect sometimes takes hold, sometimes doesn't, and can sometimes be overcome with sufficient evidence. If you've found someone who's just been chronically misinformed, you can easily persuade them with evidence. But if they've experienced a deeper ideological shift, it's going to be much harder: they're actively viewing you as a threat to be attacked, not someone to engage with. They have in a sense been "inoculated" against your views, just as people in the mainstream are now actively seeking such inoculation as protection from fringe groups.

Later, the article has some very nice commentary about changing the opinions of those deep in the grip of either kind of bubble:

Imagine, for instance, that somebody has been raised and educated entirely inside an echo chamber. That child has been taught the beliefs of the echo chamber, taught to trust the TV channels and websites that reinforce those same beliefs. It must be reasonable for a child to trust in those that raise her. So, when the child finally comes into contact with the larger world – say, as a teenager – the echo chamber’s worldview is firmly in place. That teenager will distrust all sources outside her echo chamber, and she will have gotten there by following normal procedures for trust and learning... It certainly seems like our teenager is behaving reasonably. She might be intellectually voracious, seeking out new sources, investigating them, and evaluating them using what she already knows. She is not blindly trusting; she is proactively evaluating the credibility of other sources, using her own body of background beliefs.

I dunno, I'd quibble a lot with "behaving reasonably". She might be processing information logically, but seeking out information for the sole purpose of discrediting it is hardly reasonable.

There is at least one possible escape route, however. Notice that the logic of the echo chamber depends on the order in which we encounter the evidence. An echo chamber can bring our teenager to discredit outside beliefs precisely because she encountered the echo chamber’s claims first. Here is the real source of irrationality in lifelong echo-chamber members – and it turns out to be incredibly subtle. Those caught in an echo chamber are giving far too much weight to the evidence they encounter first, just because it’s first. Rationally, they should reconsider their beliefs without that arbitrary preference.

Which is of course tremendously difficult :

Our teenager would have to do something much more radical than simply reconsidering her beliefs one by one. She’d have to suspend all her beliefs at once, and restart the knowledge-gathering process, treating all sources as equally trustworthy. This is a massive undertaking; it is, perhaps, more than we could reasonably expect of anybody.

Let’s call the modernised version of Descartes’s methodology the social-epistemic reboot. The social reboot might seem rather fantastic, but it is not so unrealistic. Such a profound deep-cleanse of one’s whole belief system seems to be what’s actually required to escape. Look at the many stories of people leaving cults and echo chambers. Take, for example, the story of Derek Black in Florida – raised by a neo-Nazi father, and groomed from childhood to be a neo-Nazi leader. Black left the movement by, basically, performing a social reboot. He completely abandoned everything he’d believed in, and spent years building a new belief system from scratch... It was the project of years and a major act of self-reconstruction, but those extraordinary lengths might just be what’s actually required to undo the effects of an echo-chambered upbringing.

Is there anything we can do, then, to help an echo-chamber member to reboot? We need to attack the root, the systems of discredit themselves, and restore trust in some outside voices... accounts of people leaving echo-chambered homophobia rarely involve them encountering some institutionally reported fact. Rather, they tend to revolve around personal encounters – a child, a family member, a close friend coming out. These encounters matter because a personal connection comes with a substantial store of trust.

We don’t simply trust people as educated experts in a field – we rely on their goodwill. And this is why trust, rather than mere reliability, is the key concept. Reliability can be domain-specific. The fact, for example, that somebody is a reliable mechanic sheds no light on whether or not their political or economic beliefs are worth anything. But goodwill is a general feature of a person’s character. If I demonstrate goodwill in action, then you have some reason to think that I also have goodwill in matters of thought and knowledge. So if one can demonstrate goodwill to an echo-chambered member – as Stevenson did with Black – then perhaps one can start to pierce that echo chamber.

As an aside, I think one more piece of the puzzle is missing: evangelism, or evangelists. It's not at all the case that people can seal themselves inside bubbles and build their chambers entirely on their own. In every such community I've come to know, there was always a majority of moderate, withdrawn, less active people, and usually a few hyperactive individuals evangelising the New Idea, working on the punditry and often building the walls. It's not about simply explaining the facts, but about consistently building a structure that encloses everyone who wants to be respected within a given community. This was usually accompanied by Bolshevik vigilance and a convert's zeal. I think many of these people did it for free, purely out of intellectual or emotional need, for the company, or to feel better about themselves. Often, after a while, they would disappear, leaving their followers in the chamber, and move on to other media, where they would build the next one.

I don't know how to avoid this effect. The only way I've come up with is simply this: keep people you disagree with in your circle of friends. Unfortunately this is very difficult - not because reading things you don't respect is tiring. It's difficult because people who disagree with you often don't want to talk to you, so they fall silent...