PDA

View Full Version : Facts don't change people's minds


Quasimodo
16-08-10, 07:00
Grab some tea or coffee. It's a long read, but a good one.

How facts backfire
Researchers discover a surprising threat to democracy: our brains

By Joe Keohane | July 11, 2010

It’s one of the great assumptions underlying modern democracy that an informed citizenry is preferable to an uninformed one. “Whenever the people are well-informed, they can be trusted with their own government,” Thomas Jefferson wrote in 1789. This notion, carried down through the years, underlies everything from humble political pamphlets to presidential debates to the very notion of a free press. Mankind may be crooked timber, as Kant put it, uniquely susceptible to ignorance and misinformation, but it’s an article of faith that knowledge is the best remedy. If people are furnished with the facts, they will be clearer thinkers and better citizens. If they are ignorant, facts will enlighten them. If they are mistaken, facts will set them straight.

In the end, truth will out. Won’t it?

Maybe not. Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It’s this: Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.

This bodes ill for a democracy, because most voters — the people making decisions about how the country runs — aren’t blank slates. They already have beliefs, and a set of facts lodged in their minds. The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper.

“The general idea is that it’s absolutely threatening to admit you’re wrong,” says political scientist Brendan Nyhan, the lead researcher on the Michigan study. The phenomenon — known as “backfire” — is “a natural defense mechanism to avoid that cognitive dissonance.”

These findings open a long-running argument about the political ignorance of American citizens to broader questions about the interplay between the nature of human intelligence and our democratic ideals. Most of us like to believe that our opinions have been formed over time by careful, rational consideration of facts and ideas, and that the decisions based on those opinions, therefore, have the ring of soundness and intelligence. In reality, we often base our opinions on our beliefs, which can have an uneasy relationship with facts. And rather than facts driving beliefs, our beliefs can dictate the facts we choose to accept. They can cause us to twist facts so they fit better with our preconceived notions. Worst of all, they can lead us to uncritically accept bad information just because it reinforces our beliefs. This reinforcement makes us more confident we’re right, and even less likely to listen to any new information. And then we vote.

This effect is only heightened by the information glut, which offers — alongside an unprecedented amount of good information — endless rumors, misinformation, and questionable variations on the truth. In other words, it’s never been easier for people to be wrong, and at the same time feel more certain that they’re right.

“Area Man Passionate Defender Of What He Imagines Constitution To Be,” read a recent Onion headline. Like the best satire, this nasty little gem elicits a laugh, which is then promptly muffled by the queasy feeling of recognition. The last five decades of political science have definitively established that most modern-day Americans lack even a basic understanding of how their country works. In 1996, Princeton University’s Larry M. Bartels argued, “the political ignorance of the American voter is one of the best documented data in political science.”

On its own, this might not be a problem: People ignorant of the facts could simply choose not to vote. But instead, it appears that misinformed people often have some of the strongest political opinions. A striking recent example was an influential study done in 2000, led by James Kuklinski of the University of Illinois at Urbana-Champaign, in which more than 1,000 Illinois residents were asked questions about welfare — the percentage of the federal budget spent on welfare, the number of people enrolled in the program, the percentage of enrollees who are black, and the average payout. More than half indicated that they were confident that their answers were correct — but in fact only 3 percent of the people got more than half of the questions right. Perhaps more disturbingly, the ones who were the most confident they were right were by and large the ones who knew the least about the topic. (Most of these participants expressed views that suggested a strong antiwelfare bias.)

Studies by other researchers have observed similar phenomena when addressing education, health care reform, immigration, affirmative action, gun control, and other issues that tend to attract strong partisan opinion. Kuklinski calls this sort of response the “I know I’m right” syndrome, and considers it a “potentially formidable problem” in a democratic system. “It implies not only that most people will resist correcting their factual beliefs,” he wrote, “but also that the very people who most need to correct them will be least likely to do so.”

What’s going on? How can we have things so wrong, and be so sure that we’re right? Part of the answer lies in the way our brains are wired. Generally, people tend to seek consistency. There is a substantial body of psychological research showing that people tend to interpret information with an eye toward reinforcing their preexisting views. If we believe something about the world, we are more likely to passively accept as truth any information that confirms our beliefs, and actively dismiss information that doesn’t. This is known as “motivated reasoning.” Whether or not the consistent information is accurate, we might accept it as fact, as confirmation of our beliefs. This makes us more confident in said beliefs, and even less likely to entertain facts that contradict them.

New research, published in the journal Political Behavior last month, suggests that once those facts — or “facts” — are internalized, they are very difficult to budge. In 2005, amid the strident calls for better media fact-checking in the wake of the Iraq war, Michigan’s Nyhan and a colleague devised an experiment in which participants were given mock news stories, each of which contained a provably false, though nonetheless widespread, claim made by a political figure: that there were WMDs found in Iraq (there weren’t), that the Bush tax cuts increased government revenues (revenues actually fell), and that the Bush administration imposed a total ban on stem cell research (only certain federal funding was restricted). Nyhan inserted a clear, direct correction after each piece of misinformation, and then measured the study participants to see if the correction took.

For the most part, it didn’t. The participants who self-identified as conservative believed the misinformation on WMD and taxes even more strongly after being given the correction. With those two issues, the more strongly the participant cared about the topic — a factor known as salience — the stronger the backfire. The effect was slightly different on self-identified liberals: When they read corrected stories about stem cells, the corrections didn’t backfire, but the readers did still ignore the inconvenient fact that the Bush administration’s restrictions weren’t total.

It’s unclear what is driving the behavior — it could range from simple defensiveness, to people working harder to defend their initial beliefs — but as Nyhan dryly put it, “It’s hard to be optimistic about the effectiveness of fact-checking.”

It would be reassuring to think that political scientists and psychologists have come up with a way to counter this problem, but that would be getting ahead of ourselves. The persistence of political misperceptions remains a young field of inquiry. “It’s very much up in the air,” says Nyhan.

But researchers are working on it. One avenue may involve self-esteem. Nyhan worked on one study in which he showed that people who were given a self-affirmation exercise were more likely to consider new information than people who had not. In other words, if you feel good about yourself, you’ll listen — and if you feel insecure or threatened, you won’t. This would also explain why demagogues benefit from keeping people agitated. The more threatened people feel, the less likely they are to listen to dissenting opinions, and the more easily controlled they are.

There are also some cases where directness works. Kuklinski’s welfare study suggested that people will actually update their beliefs if you hit them “between the eyes” with bluntly presented, objective facts that contradict their preconceived ideas. He asked one group of participants what percentage of its budget they believed the federal government spent on welfare, and what percentage they believed the government should spend. Another group was given the same questions, but the second group was immediately told the correct percentage the government spends on welfare (1 percent). They were then asked, with that in mind, what the government should spend. Regardless of how wrong they had been before receiving the information, the second group indeed adjusted their answer to reflect the correct fact.

Kuklinski’s study, however, involved people getting information directly from researchers in a highly interactive way. When Nyhan attempted to deliver the correction in a more real-world fashion, via a news article, it backfired. Even if people do accept the new information, it might not stick over the long term, or it may just have no effect on their opinions. In 2007 John Sides of George Washington University and Jack Citrin of the University of California at Berkeley studied whether providing misled people with correct information about the proportion of immigrants in the US population would affect their views on immigration. It did not.

And if you harbor the notion — popular on both sides of the aisle — that the solution is more education and a higher level of political sophistication in voters overall, well, that’s a start, but not the solution. A 2006 study by Charles Taber and Milton Lodge at Stony Brook University showed that politically sophisticated thinkers were even less open to new information than less sophisticated types. These people may be factually right about 90 percent of things, but their confidence makes it nearly impossible to correct the 10 percent on which they’re totally wrong. Taber and Lodge found this alarming, because engaged, sophisticated thinkers are “the very folks on whom democratic theory relies most heavily.”

In an ideal world, citizens would be able to maintain constant vigilance, monitoring both the information they receive and the way their brains are processing it. But keeping atop the news takes time and effort. And relentless self-questioning, as centuries of philosophers have shown, can be exhausting. Our brains are designed to create cognitive shortcuts — inference, intuition, and so forth — to avoid precisely that sort of discomfort while coping with the rush of information we receive on a daily basis. Without those shortcuts, few things would ever get done. Unfortunately, with them, we’re easily suckered by political falsehoods.

Nyhan ultimately recommends a supply-side approach. Instead of focusing on citizens and consumers of misinformation, he suggests looking at the sources. If you increase the “reputational costs” of peddling bad info, he suggests, you might discourage people from doing it so often. “So if you go on ‘Meet the Press’ and you get hammered for saying something misleading,” he says, “you’d think twice before you go and do it again.”

Unfortunately, this shame-based solution may be as implausible as it is sensible. Fast-talking political pundits have ascended to the realm of highly lucrative popular entertainment, while professional fact-checking operations languish in the dungeons of wonkery. Getting a politician or pundit to argue straight-faced that George W. Bush ordered 9/11, or that Barack Obama is the culmination of a five-decade plot by the government of Kenya to destroy the United States — that’s easy. Getting him to register shame? That isn’t.
(source (http://www.boston.com/bostonglobe/ideas/articles/2010/07/11/how_facts_backfire?mode=PF))

QiX
16-08-10, 07:10
You can't convince anyone who doesn't want to be convinced, and the ones who do are so few :p

aktrekker
16-08-10, 08:02
This applies not just to politics. It applies to everything. I've been meaning to post a thread about the phenomenon myself because it appears to be so prevalent in these forums.
It sometimes destroys your ability to even comprehend what you read just because it contradicts your personal belief system. You quite literally don't understand the meaning of what you read. This causes a conflict in your mind and usually results in an angry outburst to cover up the uneasy feelings.

I couldn't find any information about it because I didn't know what it's called. I had no idea what to search on.

Thanks for the information. Maybe by being confronted with this "fact", people might be able to change the way they react to things, and be open to new ideas. But that's probably expecting too much.

Uzi master
16-08-10, 08:27
This applies not just to politics. It applies to everything. I've been meaning to post a thread about the phenomenon myself because it appears to be so prevalent in these forums.
It sometimes destroys your ability to even comprehend what you read just because it contradicts your personal belief system. You quite literally don't understand the meaning of what you read. This causes a conflict in your mind and usually results in an angry outburst to cover up the uneasy feelings.

I couldn't find any information about it because I didn't know what it's called. I had no idea what to search on.

Thanks for the information. Maybe by being confronted with this "fact", people might be able to change the way they react to things, and be open to new ideas. But that's probably expecting too much.



sounds a lot like my little sister, maybe there's something to this article.

Super Badnik
16-08-10, 09:42
Interesting read, I didn't quite read all of it though (too early in the morning). But I don't find it very clear. What exactly are these types of beliefs with contradicting facts? Because sometimes a fact doesn't really have any power over an opinion.

aktrekker
16-08-10, 09:45
Because sometimes a fact doesn't really have any power over an opinion.
That's what they are saying. People refuse to change their opinions (or beliefs) even when provable facts show they are wrong.

Ward Dragon
16-08-10, 10:20
Maybe I'm missing the point, but the examples from his study seem very flawed to me:

In 2005, amid the strident calls for better media fact-checking in the wake of the Iraq war, Michigan’s Nyhan and a colleague devised an experiment in which participants were given mock news stories, each of which contained a provably false, though nonetheless widespread, claim made by a political figure: that there were WMDs found in Iraq (there weren’t), that the Bush tax cuts increased government revenues (revenues actually fell), and that the Bush administration imposed a total ban on stem cell research (only certain federal funding was restricted).

For example, if someone believes that Saddam was a horrible dictator and needed to be taken out of power, then WMD's don't matter towards that opinion. If someone is interested in the boost to the economy provided by the Bush tax cuts, they don't care that government revenue is slightly lower than it could be otherwise. If someone is in favor of stem cell research to cure diseases, they'll see any restriction as an impediment. That's not really ignoring the facts, but rather the facts the study chose to test were irrelevant towards the actual reasons people have for believing what they do.

When it comes down to the core of the matter, it's usually a moral choice with no objective answer anyway. To take the fetal stem cell research example, are the potential benefits of curing diseases worth overlooking where the stem cells come from and how they are obtained? A person's answer to that question will probably be heavily based on their moral code and can't be objectively proven wrong. All that can be proven wrong concerns implementation and what would be most effective, and people are usually willing to accept the facts there because the facts can be practically applied to show results.

Or with the Bush tax cuts, people don't comprehend the sheer magnitude of government revenue or debt, but they understand that their annual taxes are a few thousand dollars cheaper. If that's their main concern, that's all that will really matter to them and it can't objectively be proven whether they should want to keep that extra money or give it to the government. Again, that's more of a morality/judgment call.

Overall I feel like this article is an attempt to take another cheap shot at political opponents or complain about talk radio :p That's just the impression I get from which topics they chose to "prove" and how they chose to present it to the people in their study.

[Xmas]
16-08-10, 11:46
You can't convince anyone who doesn't want to be convinced, and the ones who do are so few :p

This :tmb:

Dennis's Mom
16-08-10, 12:40
The media and teh internets feed this as well. One only goes to places where one's own opinions are validated by how a story is reported. If Keith Olbermann/Bill O'Reilly/insert agenda pusher here says it, it must be how it is. Gosh, it makes so much sense when you put it that way!

People no longer are challenged to "make up their minds" by looking for information. They are told what things are and what they mean up front.

Anyone (where's Eddie when I need him?) remember the old days of records, and the great scandals of backwards "subliminal messages" put on the records? Whenever these "messages" were found, you were always told what they said before they were played. So of course, you heard the message you were told you were going to hear.

Now people report factual findings--the President did this, e.g.,--but it's always reported in a way that tells you WHY he did it and HOW it's going to turn out according to the media outlet's agenda. "It's the greatest thing since sliced bread!" one will say. "It's an unmitigated disaster!" the other will say.

The next day people are spouting the message they were told to. :hea:

Super Badnik
16-08-10, 12:44
That's what they are saying. People refuse to change their opinions (or beliefs) even when provable facts show they are wrong.

I know, but certain opinions such as "Red is the best colour" can't really be disproven by facts.

interstellardave
16-08-10, 12:50
This is readily observable in the behavior of game-platform fanboys/girls.

jaywalker
16-08-10, 13:00
Know of this so well.. SO many times people would rather believe hype/rumour than an actual stated fact ;) tis funny :)

Encore
16-08-10, 14:51
I fail to see how any of this challenges the notion that knowledge and intelligence are the basis of democracy. If anything, it reinforces it.

just*raidin*tomb
16-08-10, 14:58
Even after reading this, they won't change their minds.

Dustie
16-08-10, 15:02
“The general idea is that it’s absolutely threatening to admit you’re wrong,” says political scientist Brendan Nyhan, the lead researcher on the Michigan study. The phenomenon — known as “backfire” — is “a natural defense mechanism to avoid that cognitive dissonance.”

This paragraph pretty much explains it all. The inborn fear of admitting you were wrong. The fear of all the effort you might need to make in order to get back on track.

I wonder how many other problems of mankind stem from this mechanism, that was once supposed to 'defend' us.

Mikky
16-08-10, 15:27
You can't convince anyone who doesn't want to be convinced, and the ones who do are so few :p

I'm like that. I'm very "persistent". :pi:

Admles
16-08-10, 15:39
This is readily observable in the behavior of game-platform fanboys/girls.
Or religious fanatics

Dennis's Mom
16-08-10, 15:44
Admitting you were wrong also means you know more now than you did before. It means you can learn.

What's most disturbing is how the "facts" are always the product of some conspiracy. When NOAA was saying global warming was a problem and should be dealt with, they were the Smartest and Most Ethical Scientists on the planet. Now that they're saying they can't find elevated levels of contaminants in Gulf seafood, they're corrupt and shills for the oil industry.

You can't have it both ways, and devising a conspiracy to deny any fact you don't like is just plain immature.

It reminds me of the "car races" my brother and I used to invent on long car rides. Any car that passed us was automatically disqualified so we were always winning. That crap flew when I was six, but frankly it's not worthy of "thinking adults."

snork
16-08-10, 16:07
Any car that passed us was automatically disqualified so we were always winning.

:vlol:
for small children, I think this is showing signs of intelligence and sovereignty. :D

However, there is also the other side : Welcoming or even longing for facts that prove you wrong.
When things "always" are as you expected them to be, it can get really boring.

Recently a German study on how healthy a life people lead, divided by social group / degree of education, showed that reality is the opposite of what they expected the study would show:
The ones with the lowest degree of education actually live the "healthiest". :p
And academics lead the least healthy lives: they had the highest percentage of smokers, alcohol consumption, and "lack of physical movement". :D

These small surprises are actually welcome.

robm_2007
16-08-10, 16:30
Is this article a fact? Because if it is, then my mind won't be changed.

Encore
16-08-10, 16:47
I think it's worth noting that "facts" are often very shifty concepts in themselves. It's easy for them to be manipulated or disproven by later discoveries. Therefore, to be instantly persuaded to favour a certain idea, after reading one study or one source, isn't exactly positive either. There's nothing wrong with doubting and questioning reality, that's the essence of the scientific method anyway - never take anything for granted. And sometimes it annoys me how people automatically label a doubter as a "conspiracy theorist".

Mad Tony
16-08-10, 16:52
And sometimes it annoys me how people automatically label a doubter as a "conspiracy theorist".

It can also work the other way around. So many times I have presented facts to so-called 9/11 "truthers" (or any conspiracy theorists, for that matter) and seen them simply dismiss them. You can't convince most of these people to see sense.

Super Badnik
16-08-10, 18:31
This is readily observable in the behavior of game-platform fanboys/girls.

But surely that's just another example of what I mentioned earlier, with it being preference and all.

It can also work the other way around. So many times I have presented facts to so-called 9/11 "truthers" (or any conspiracy theorists, for that matter) and seen them simply dismiss them. You can't convince most of these people to see sense.

Well, you can't really disprove conspiracy theories to people who strongly believe in them. Any evidence of them being false will always have been "planted" or manufactured by the government or whoever is supposedly covering it up.

Draco
16-08-10, 18:34
If facts did change people's minds, none of the current political body would be in office.

Dennis's Mom
16-08-10, 19:47
And sometimes it annoys me how people automatically label a doubter as a "conspiracy theorist".

No, but there is a difference between doubting and creating an elaborate conspiracy to justify your belief.

I have doubts about whether we (as people) are causing the earth to get warmer, however, I don't have to leap to a conspiracy that Al Gore and Hollywood have paid off every climatologist in the world to say we're all horrible people.

larafan25
16-08-10, 21:43
Know of this so well.. SO many times people would rather believe hype/rumour than an actual stated fact ;) tis funny :)

I'm scared.

Anywho, yeah, I try to be open to as much information as possible. I'm just not sure how to tell if something is a fact or not; who can I believe?

Catracoth
16-08-10, 22:07
That's what they are saying. People refuse to change their opinions (or beliefs) even when provable facts show they are wrong.

It's a pride thing, I think. People don't like to be wrong, especially when they hold their beliefs dear.

Alpharaider47
16-08-10, 22:17
This is readily observable in the behavior of game-platform fanboys/girls.

Sadly, I was quite like this about xbox until I was overwhelmed with PS3 info by friends. Needless to say I now have one and I love it as much as my 360 :D

I think in a way it's like people are attached to their beliefs, and maybe if they're wrong about one thing it opens up the possibility that they're wrong about so much more? For me, I try my best to stay open to new things, etc and I try to be flexible, but of course it doesn't always work out that way.

Quasimodo
17-08-10, 06:29
This is readily observable in the behavior of game-platform fanboys/girls.

Since you've brought it up...
The Psychology of Fanboyism (http://www.gamepro.com/article/features/216012/the-psychology-of-fanboyism/)

All gamers eventually encounter one. The fanboy and fangirl, who you can find lurking on message boards or can hear shrieking over headsets on Xbox Live, are by no means a recent phenomenon: the first recorded use of the term "fanboy" dates back nearly a century, to 1919. First used to describe passionate boxing fans, and later comic-book readers who prided themselves on knowing their cherished fictional universes inside and out, the word has since devolved into a description of immature and often obnoxious behavior in the world of video games.

But what makes fanboys tick? Why do so many take such a militant stance over their video-game console of choice, especially when the differences between consoles such as the PlayStation 3 and the Xbox 360 are minute when it comes to the average gamer's concerns? The vast majority of major game releases are multiplatform titles, and they offer essentially the same experience regardless of your console of choice. So why do so many fanboys develop a strong attachment to one game platform while rejecting the other with equal passion? With the help of psychologists and authorities on the subject, we seek to learn what motivates fanboy behavior and why it has become a pejorative term.

Video games did not spawn this type of zealous behavior; they're merely the latest, most visible host for this often vicious intellectual virus. Fanboy flame wars raged years before video-gaming ultranationalists took up arms in the Nintendo Entertainment System vs. Sega Genesis days, or even when the Intellivision stoked the fires of Atari 2600 loyalists. Of course, before video games, fanaticism of this kind often took the form of religious or sports debates.

Patrick Hanlon, author of the award-winning advertising psychology book Primal Branding: Create Zealots for Your Company, Your Brand, and Your Future, suggests that this kind of behavior has been a part of humanity since cavemen argued over which type of spearhead was best suited for taking down mammoths. Furthermore, he says the closer the community around a debate like this gets, the harder it becomes to quit.

"Whenever you bundle a group of people with similar beliefs and ideals together, it becomes harder for them to leave individually," says Hanlon, who has worked with Bungie on Halo's advertising campaigns. "If they stop, they lose the respect of the other members of the community. They feel like a member of a community there and nowhere else, and this exaggerated sense of belonging is the same as the communities that battle over Democrats vs. Republicans or Mac vs. PC." He adds that this perceived loss of camaraderie can cause people to remain part of a community against their best interests. "It's the same case with any kind of zealotry."

It's easy to see why people get up in arms about religion-few things are more important to a person's sense of identity than their faith. It gets fuzzier, though, when linking that behavior to a person's allegiance to something like a video-game console.

How can a 10-pound hunk of plastic and soldered silicon affect us in the same way as our creed or political party? Perhaps even more important, why do we see this kind of fervor in the debate over video-game consoles and not other types of products? After all, you're not likely to find flame wars between consumers of Count Chocula and Cocoa Puffs.

"Some types of products are 'low involvement.' They don't define one as an individual," says Laurence Minsky, a professor of advertising at Columbia College Chicago, a private arts and media institution. "They tend not to cost much, so the impact of purchasing is small. In other words: If the purchase is a mistake, no big deal. Other products, where the implications of purchase are greater due to price, need for research, and [their] ability to broadcast one's personality and beliefs, tend to be 'high involvement'."

"High involvement" is where video-game consoles fit in. Not only do they constitute a highly emotional purchasing decision, but their very nature expands their role in the lives of players.

"Consoles are so responsive and interactive, the technology blurs the line between animate and inanimate objects," says Dr. Nando Pelusi, a psychologist and expert in cognitive behavioral therapy. The beginning of fanboy habits seems to stem directly from the very joy of gaming itself. "Electronics engage you emotionally," Hanlon says. "They start the dopamine drip."

While the high-involvement theory explains some of the strong emotions exhibited by fanboys, it doesn't account for how gamers go about selecting which console is worthy of their loyalties. It's unlikely that something as simple as the $50 annual fee to access Xbox Live Gold membership-which is often brought up in "PS3 vs. Xbox 360" debates since Sony does not currently charge consumers for their PlayStation Network-would be enough to indoctrinate gamers so thoroughly that we have things like the Sony Defense Force, a website that claims it's on "a mission to give honest news about all things PlayStation" but often mercilessly rips anything to do with Xbox 360 and Nintendo Wii to shreds.

"Console fanboyism stems from gamers wanting their system of preference to be popular so it gets the most developer support," says Rob Foor, who runs Sony Defense Force. He singles out Sony when explaining the various reasons why gamers can become fanboys. "What factors make a console fanboy get behind a specific console? Games, price, loyalty, nostalgia, functionality, and friends. For example, a gamer could own a Wii, PS3 and Xbox 360 but still favor a PS3. Maybe they are used to the controller layout; maybe they love Sony's exclusive games; maybe their friends own a PS3 and they want to play online together; maybe they grew up with the original PlayStation and are comfortable with Sony branded systems; or maybe they just 'trust' Sony more than Microsoft or Nintendo."

When it comes to the argument that fanboyism derives from the fact that video-game consoles are expensive, making them "high-involvement" purchases, Foor believes the theory is too simplistic. "While I believe the high cost of entry into the console game space is a factor that leads to fanboyism, it is not the only factor nor does it tell the whole story."

In Primal Branding, Hanlon describes his theory of why consumers become so attached to certain brands while remaining apathetic to others, even similar ones. Strong brands, he argues, amount to belief systems, and he identifies seven characteristics they share: origin story, creed, icons, rituals, pagans (or "nonbelievers"), sacred words, and a leader. In essence, it's a formula for making people feel as though they are joining a team when they purchase a product, and it's not hard to see this list's relation to the most popular gaming corporations.

Many hardcore gamers wear their allegiance on their sleeve, too. Most know the leaders of the three console manufacturers by face and name; Blizzard fans gather together ritually for BlizzCon as part of the company's massively successful means of binding together their community with games like World of Warcraft; and Nintendo has maintained a fervently loyal following that appears to get stronger with each new generation of gamers.

Of all the console manufacturers, Nintendo also seems to exhibit the greatest number of the virtues Hanlon describes in his book. Nintendo's huge fan base embodies this; its devotees hold a commanding lead over Sony and Microsoft fans in ill-advised tattoos.

Most of the major video-game companies today, however, have strong communities, so these virtues don't fully explain why a gamer chooses one console over another for their undying loyalty. Instead, Pelusi suggests that in the face of multiple enticing choices, "chance and peers are the main ways we get imprinted with the one that feels right." "There is a significant effect of word-of-mouth," agrees Lars Perner, Ph.D., a professor of consumer psychology at the University of Southern California. "If a brand gains a loyal following, reviews and mentions by others will tend to be more credible than paid advertising."
Out-of-Control Emotions

It may seem like a large divide exists between being "imprinted" with an attachment to a particular game console and spewing hate speech on message boards. But in reality, we often see a deep personal connection to a video-game platform lead to odious behavior. One explanation for why some gamers react to attacks or criticisms of their favorite console as though they were personally offended is that they view the console as an extension of themselves.

"There is a phenomenon whereby an object becomes part of a person's 'extended self'," says Perner. "If an individual is really into gaming, for example, his or her console may become an important part of his or her identity."

The belief structure of the brand essentially becomes part of their identity. As with religion and political affiliation, it's this sense of identity that causes fanboys to lash out defensively when they feel the ideology comes under attack. "Commitment and passion often lead to irrationality," notes Pelusi. "Commitment also leads to defending your homestead with zeal."

Aside from a commitment to a sense of personal identity, it's also a commitment to a hefty price tag. "There's more thought given to the purchase decision," says Minsky, "more opportunity for buyer's remorse after the purchase, and therefore, a greater need for the purchaser to be reassured he or she made the right choice; in other words, these purchases matter financially and emotionally." While many instances of video-game fanboyism begin as something harmless, they can rapidly deteriorate into malicious behavior. And while these experts have given us a better understanding of the roots of this particular form of fanaticism, the phenomenon is nearly as old as humankind itself, so it's doubtful that we'll ever be free from the wrath of fanboys. This fanaticism is ingrained in human behavior. The best we can do to counter its most negative expressions is to try to understand and discourage them, because unfortunately, fanboys are something we'll most likely have to put up with as long as video games exist.

Yours,
Quasi
tl;dr quoter extraordinaire :p

QiX
17-08-10, 06:55
This thread reminds me of a study I heard about some years ago on the influence of advertising. I can't recall the exact numbers now or where I saw them, but the conclusion was that the vast majority of subjects believed that people in general are affected by advertising, yet when asked whether they themselves ever let ads influence their decisions, everyone would answer with a big round no.

It shows some of the general ideas people have about themselves versus the 'others', but on another level it leads us to reconsider how overrated the role of television and propaganda is. In the end, I think everyone has a general idea of what they want in life and filters everything according to their values. When someone buys a product after seeing an ad, they were probably already looking for something to spend their money on, and the ad just materialized something they already wanted.