I Don’t Want to Be Right

9 years 2 months ago #175497 by
This is the most mind-blowing thing I read last year. Basically, studies have shown that people hold beliefs based not on rational or convincing evidence, but almost entirely on what holding those beliefs makes them appear to be. For example, if you're a conservative Christian you might believe that global warming is a hoax, not based on evidence, but based on the fact that that's what conservative Christians believe. This has made me question everything I know: what beliefs might I hold based on ideology rather than rational evidence?


I Don’t Want to Be Right
BY MARIA KONNIKOVA
Source: I Don’t Want to Be Right

Last month, Brendan Nyhan, a professor of political science at Dartmouth, published the results of a study that he and a team of pediatricians and political scientists had been working on for three years. They had followed a group of almost two thousand parents, all of whom had at least one child under the age of seventeen, to test a simple relationship: Could various pro-vaccination campaigns change parental attitudes toward vaccines? Each household received one of four messages: a leaflet from the Centers for Disease Control and Prevention stating that there had been no evidence linking the measles, mumps, and rubella (M.M.R.) vaccine and autism; a leaflet from the Vaccine Information Statement on the dangers of the diseases that the M.M.R. vaccine prevents; photographs of children who had suffered from the diseases; and a dramatic story from the Centers for Disease Control and Prevention about an infant who almost died of measles. A control group did not receive any information at all. The goal was to test whether facts, science, emotions, or stories could make people change their minds.
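A rough way to picture the design: households are randomly assigned to one of the four message conditions or to a no-information control group, and self-reported intent to vaccinate is then compared across groups. The sketch below is only a hypothetical illustration in Python, not the study's actual code or data; the condition names and the 1-to-6 intent scale are assumptions.

    import random

    # Hypothetical illustration of a randomized messaging experiment
    # (not the actual study materials or data).
    CONDITIONS = [
        "autism_correction_leaflet",   # CDC leaflet: no vaccine-autism link
        "disease_danger_leaflet",      # dangers of the diseases MMR prevents
        "sick_child_photos",           # photographs of children with the diseases
        "measles_narrative",           # dramatic story of an infant with measles
        "control",                     # no information at all
    ]

    def assign(households):
        """Simple random assignment of each household to one condition."""
        return {h: random.choice(CONDITIONS) for h in households}

    def mean_intent_by_condition(assignments, intent_scores):
        """Average self-reported intent to vaccinate (assumed 1-6 scale) per group."""
        totals, counts = {}, {}
        for h, cond in assignments.items():
            totals[cond] = totals.get(cond, 0) + intent_scores[h]
            counts[cond] = counts.get(cond, 0) + 1
        return {cond: totals[cond] / counts[cond] for cond in totals}

    # Example run with made-up scores:
    households = [f"household_{i}" for i in range(2000)]
    assignments = assign(households)
    fake_scores = {h: random.randint(1, 6) for h in households}
    print(mean_intent_by_condition(assignments, fake_scores))

Comparing group means against the control group is what reveals whether a message moved intent at all, or (as in the backfire effect described next) moved it in the wrong direction.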

The result was dramatic: a whole lot of nothing. None of the interventions worked. The first leaflet—focussed on a lack of evidence connecting vaccines and autism—seemed to reduce misperceptions about the link, but it did nothing to affect intentions to vaccinate. It even decreased intent among parents who held the most negative attitudes toward vaccines, a phenomenon known as the backfire effect. The other two interventions fared even worse: the images of sick children increased the belief that vaccines cause autism, while the dramatic narrative somehow managed to increase beliefs about the dangers of vaccines. “It’s depressing,” Nyhan said. “We were definitely depressed,” he repeated, after a pause.


Nyhan’s interest in false beliefs dates back to early 2000, when he was a senior at Swarthmore. It was the middle of a messy Presidential campaign, and he was studying the intricacies of political science. “The 2000 campaign was something of a fact-free zone,” he said. Along with two classmates, Nyhan decided to try to create a forum dedicated to debunking political lies. The result was Spinsanity, a fact-checking site that presaged venues like PolitiFact and the Annenberg Public Policy Center’s factcheck.org. For four years, the trio plugged along. Their work was popular—it was syndicated by Salon and the Philadelphia Inquirer, and it led to a best-selling book—but the errors persisted. And so Nyhan, who had already enrolled in a doctoral program in political science at Duke, left Spinsanity behind to focus on what he now sees as the more pressing issue: If factual correction is ineffective, how can you make people change their misperceptions? The 2014 vaccine study was part of a series of experiments designed to answer the question.

Until recently, attempts to correct false beliefs haven’t had much success. Stephan Lewandowsky, a psychologist at the University of Bristol whose research into misinformation began around the same time as Nyhan’s, conducted a review of misperception literature through 2012. He found much speculation, but, apart from his own work and the studies that Nyhan was conducting, there was little empirical research. In the past few years, Nyhan has tried to address this gap by using real-life scenarios and news in his studies: the controversy surrounding weapons of mass destruction in Iraq, the questioning of Obama’s birth certificate, and anti-G.M.O. activism. Traditional work in this area has focussed on fictional stories told in laboratory settings, but Nyhan believes that looking at real debates is the best way to learn how persistently incorrect views of the world can be corrected.

One thing he learned early on is that not all errors are created equal. Not all false information goes on to become a false belief—that is, a more lasting state of incorrect knowledge—and not all false beliefs are difficult to correct. Take astronomy. If someone asked you to explain the relationship between the Earth and the sun, you might say something wrong: perhaps that the sun revolves around the Earth, rising in the east and setting in the west. A friend who understands astronomy may correct you. It’s no big deal; you simply change your belief.

But imagine living in the time of Galileo, when understandings of the Earth-sun relationship were completely different, and when that view was tied closely to ideas of the nature of the world, the self, and religion. What would happen if Galileo tried to correct your belief? The process isn’t nearly as simple. The crucial difference between then and now, of course, is the importance of the misperception. When there’s no immediate threat to our understanding of the world, we change our beliefs. It’s when that change contradicts something we’ve long held as important that problems occur.

In those scenarios, attempts at correction can indeed be tricky. In a study from 2013, Kelly Garrett and Brian Weeks looked to see if political misinformation—specifically, details about who is and is not allowed to access your electronic health records—that was corrected immediately would be any less resilient than information that was allowed to go uncontested for a while. At first, it appeared as though the correction did cause some people to change their false beliefs. But, when the researchers took a closer look, they found that the only people who had changed their views were those who were ideologically predisposed to disbelieve the fact in question. If someone held a contrary attitude, the correction not only didn’t work—it made the subject more distrustful of the source. A climate-change study from 2012 found a similar effect. Strong partisanship affected how a story about climate change was processed, even if the story was apolitical in nature, such as an article about possible health ramifications from a disease like the West Nile virus, a potential side effect of climate change. If information doesn’t square with someone’s prior beliefs, he discards the beliefs if they’re weak and discards the information if the beliefs are strong.

Even when we think we’ve properly corrected a false belief, the original exposure often continues to influence our memory and thoughts. In a series of studies, Lewandowsky and his colleagues at the University of Western Australia asked university students to read the report of a liquor robbery that had ostensibly taken place in Australia’s Northern Territory. Everyone read the same report, but in some cases racial information about the perpetrators was included and in others it wasn’t. In one scenario, the students were led to believe that the suspects were Caucasian, and in another that they were Aboriginal. At the end of the report, the racial information either was or wasn’t retracted. Participants were then asked to take part in an unrelated computer task for half an hour. After that, they were asked a number of factual questions (“What sort of car was found abandoned?”) and inference questions (“Who do you think the attackers were?”). After the students answered all of the questions, they were given a scale to assess their racial attitudes toward Aboriginals.
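As a rough mental model of how reliance on retracted information can be separated from simple memory failure in this kind of study, factual answers are scored against the report, while open-ended inference answers are scored for mentions of the retracted detail. The sketch below is a hypothetical illustration, not Lewandowsky's actual materials or coding scheme; the question, answer, and term list are made up.

    # Hypothetical scoring sketch for a retraction study:
    # factual recall is scored for accuracy, while inference answers
    # are scored for continued reliance on the retracted detail.
    RETRACTED_TERMS = {"aboriginal", "aborigine"}  # assumed retracted detail

    def factual_accuracy(answers, answer_key):
        """Fraction of factual questions answered correctly."""
        correct = sum(1 for q, a in answers.items()
                      if a.strip().lower() == answer_key[q].strip().lower())
        return correct / len(answer_key)

    def reliance_on_retracted(inference_answers):
        """Count inference answers that still invoke the retracted detail."""
        return sum(1 for a in inference_answers
                   if any(term in a.lower() for term in RETRACTED_TERMS))

    # Made-up example: recall is accurate, yet inferences still lean on race.
    facts = {"vehicle": "a white van"}
    key = {"vehicle": "a white van"}
    inferences = ["The attackers were probably Aboriginal.",
                  "The owner likely could not understand them."]
    print(factual_accuracy(facts, key))       # 1.0 -> the facts were remembered
    print(reliance_on_retracted(inferences))  # 1   -> but an inference still uses the retracted detail

The gap between those two scores is what the next paragraph describes: participants can report the retraction accurately and still reason as if it never happened.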

Everyone’s memory worked correctly: the students could all recall the details of the crime and could report precisely what information was or wasn’t retracted. But the students who scored highest on racial prejudice continued to rely on the racial misinformation that identified the perpetrators as Aboriginals, even though they knew it had been corrected. They answered the factual questions accurately, stating that the information about race was false, and yet they still relied on race in their inference responses, saying that the attackers were likely Aboriginal or that the store owner likely had trouble understanding them because they were Aboriginal. This was, in other words, a laboratory case of the very dynamic that Nyhan identified: strongly held beliefs continued to influence judgment, despite correction attempts—even with a supposedly conscious awareness of what was happening.

In a follow-up, Lewandowsky presented a scenario that was similar to the original experiment, except now, the Aboriginal was a hero who disarmed the would-be robber. This time, it was students who had scored lowest in racial prejudice who persisted in their reliance on false information, in spite of any attempt at correction. In their subsequent recollections, they mentioned race more frequently, and incorrectly, even though they knew that piece of information had been retracted. False beliefs, it turns out, have little to do with one’s stated political affiliations and far more to do with self-identity: What kind of person am I, and what kind of person do I want to be? All ideologies are similarly affected.

It’s the realization that persistently false beliefs stem from issues closely tied to our conception of self that prompted Nyhan and his colleagues to look at less traditional methods of rectifying misinformation. Rather than correcting or augmenting facts, they decided to target people’s beliefs about themselves. In a series of studies that they’ve just submitted for publication, the Dartmouth team approached false-belief correction from a self-affirmation angle, an approach that had previously been used for fighting prejudice and low self-esteem. The theory, pioneered by Claude Steele, suggests that, when people feel their sense of self threatened by the outside world, they are strongly motivated to correct the misperception, be it by reasoning away the inconsistency or by modifying their behavior. For example, when women are asked to state their gender before taking a math or science test, they end up performing worse than if no such statement appears, conforming their behavior to societal beliefs about female math-and-science ability. To address this so-called stereotype threat, Steele proposes an exercise in self-affirmation: either write down or say aloud positive moments from your past that reaffirm your sense of self and are related to the threat in question. Steele’s research suggests that affirmation makes people far more resilient and high performing, be it on an S.A.T., an I.Q. test, or at a book-club meeting.

Normally, self-affirmation is reserved for instances in which identity is threatened in direct ways: race, gender, age, weight, and the like. Here, Nyhan decided to apply it in an unrelated context: Could recalling a time when you felt good about yourself make you more broad-minded about highly politicized issues, like the Iraq surge or global warming? As it turns out, it would. On all issues, attitudes became more accurate with self-affirmation, and remained just as inaccurate without. That effect held even when no additional information was presented—that is, when people were simply asked the same questions twice, before and after the self-affirmation.

Still, as Nyhan is the first to admit, it’s hardly a solution that can be applied easily outside the lab. “People don’t just go around writing essays about a time they felt good about themselves,” he said. And who knows how long the effect lasts—it’s not as though we often think good thoughts and then go on to debate climate change.

But, despite its unwieldiness, the theory may still be useful. Facts and evidence, for one, may not be the answer everyone thinks they are: they simply aren’t that effective, given how selectively they are processed and interpreted. Instead, why not focus on presenting issues in a way that keeps broader notions out of it—messages that are not political, not ideological, not in any way a reflection of who you are?

Take the example of the burgeoning raw-milk movement. So far, it’s a relatively fringe phenomenon, but if it spreads it threatens to undo the health benefits of more than a century of pasteurization. The C.D.C. calls raw milk “one of the world’s most dangerous food products,” noting that improperly handled raw milk is responsible for almost three times as many hospitalizations as any other food-borne illness. And yet raw-milk activists are becoming increasingly vocal—and the supposed health benefits of raw milk are gaining increased support. To prevent the idea from spreading even further, Nyhan advises, advocates of pasteurization shouldn’t dwell on the misperceptions, lest they “inadvertently draw more attention to the counterclaim.” Instead, they should create messaging that self-consciously avoids any broader issues of identity, pointing out, for example, that pasteurized milk has kept children healthy for a hundred years.

I asked Nyhan if a similar approach would work with vaccines. He wasn’t sure—for the present moment, at least. “We may be past that point with vaccines,” he told me. “For now, while the issue is already so personalized in such a public way, it’s hard to find anything that will work.” The message that could be useful for raw milk, he pointed out, cuts another way in the current vaccine narrative: the diseases are bad, but people now believe that the vaccines, unlike pasteurized milk, are dangerous. The longer the narrative remains co-opted by prominent figures with little to no actual medical expertise—the Jenny McCarthys of the world—the more difficult it becomes to find a unified, non-ideological theme. The message can’t change unless the perceived consensus among figures we see as opinion and thought leaders changes first.

And that, ultimately, is the final, big piece of the puzzle: the cross-party, cross-platform unification of the country’s élites, those we perceive as opinion leaders, can make it possible for messages to spread broadly. The campaign against smoking is one of the most successful public-interest fact-checking operations in history. But, if smoking were seen as an issue just for Republicans or Democrats, change would have been far less likely. It’s only after ideology is put to the side that a message itself can change, so that it becomes decoupled from notions of self-perception.

Vaccines, fortunately, aren’t political. “They’re not inherently linked to ideology,” Nyhan said. “And that’s good. That means we can get to a consensus.” Ignoring vaccination, after all, can make people of every political party, and every religion, just as sick.

9 years 2 months ago #175499 by Breeze el Tierno
NPR did a similar article on Dr. Nyhan this year. It was both fascinating and chilling. Thanks for posting this.

9 years 2 months ago - 9 years 2 months ago #175612 by
Replied by on topic I Don’t Want to Be Right
do some scientific exploration on facebook and fear the psychotic beast that man is.


"Take the example of the burgeoning raw-milk movement. So far, it’s a relatively fringe phenomenon, but if it spreads it threatens to undo the health benefits of more than a century of pasteurization." wtf? even mainstream doctors are now saying homogenized and pasteurized milk is poison.

damn, all they are doing is trying to find out how to brainwash and manipulate "their" human livestock more effectively.

9 years 2 months ago #175613 by Edan
Replied by Edan on topic I Don’t Want to Be Right

ghost dog wrote: do some scientific exploration on facebook and fear the psychotic beast that man is.


"Take the example of the burgeoning raw-milk movement. So far, it’s a relatively fringe phenomenon, but if it spreads it threatens to undo the health benefits of more than a century of pasteurization." wtf? even mainstream doctors are now saying homogenized and pasteurized milk is poison.

damn, all they are doing is trying to find out how to brainwash and manipulate "their" human livestock more effectively.


I feel like I'm about to do a Gisteron.. but.. sources?

9 years 2 months ago - 9 years 2 months ago #175616 by
Replied by on topic I Don’t Want to Be Right
may the Source send The Force to have mercy. :(

there is science and then there is the religion of "Science" in which its adherents blindly believe.


Source: I Don’t Want to Be Right

9 years 2 months ago #175617 by Edan
Replied by Edan on topic I Don’t Want to Be Right

ghost dog wrote: may the Source send The Force to have mercy. :(

there is science and then there is the religion of "Science" in which its adherents blindly believe.


Source: I Don’t Want to Be Right


I was referring to your comment about milk.

9 years 2 months ago - 9 years 2 months ago #175618 by
Replied by on topic I Don’t Want to Be Right
i could tell you about the doctors i know personally and my family's doctor and provide you with links, but then what good would that do? we both feel like you are about to pull a gisteron and rest your case. i like gisteron by the way, sometimes he says some smart stuff.

if you do your own research on milk's effects on health i'm sure it would be much more beneficial to you. because as the above article states most people do not care about logic and reason. man is a beast that feels best congregating in herds even if the whole herd runs off of a cliff.

9 years 2 months ago - 9 years 2 months ago #175619 by Edan
Replied by Edan on topic I Don’t Want to Be Right

ghost dog wrote: i could tell you about the doctors i know personally and my family's doctor and provide you with links, but then what good would that do? we both feel like you are about to pull a gisteron and rest your case. i like gisteron by the way, sometimes he says some smart stuff.


This 'raw milk' thing is new to me... I am only asking how you know doctors are saying pasteurised milk is bad..

9 years 2 months ago - 9 years 2 months ago #175620 by
Replied by on topic I Don’t Want to Be Right
homogenized, pasteurized milk is not milk. it is milk that has been homogenized and pasteurized. it is poison. alcohol is a poison too, but at least it has spiritual and mental effects for better or worse.

if you read the above article including the parts about vaccines and pasteurized milk, and i don't know what, then i am an Aboriginal blowing smoke rings, raising smoke signals with my buffalo hide blanket, and vibrating with the didgeridoo. my sources are irrelevant. i am a primal beast, less than human, believing only in myth and harboring make believe superstitions. that is what this article is really about. how we as humans perceive "us and them". and it does not matter what is real or true, it only matters to be a part of the "right" group. the agenda behind their research is to learn how to better manipulate the herds of mankind and i bet it all on the fact that their intentions are not benign.

9 years 2 months ago #175621 by steamboat28
Raw milk is a preference, not a health food. Pasteurization helps reduce the spread of various contagions; it would cost far too much time and energy to do if it held no scientific benefit, and those benefits have been adequately monitored since its inception.

Fringe comments on anti-science snake oil are fine, but don't expect them to be taken as gospel.