Group polarization
https://en.wikipedia.org/wiki/Group_polarization
In social psychology, group polarization refers to the tendency for a group to make decisions that are more extreme than the initial inclination of its members. These more extreme decisions are towards greater risk if individuals' initial tendencies are to be risky and towards greater caution if individuals' initial tendencies are to be cautious.[1] The phenomenon also holds that a group's attitude toward a situation may change in the sense that the individuals' initial attitudes have strengthened and intensified after group discussion, a phenomenon known as attitude polarization.[2]
Overview
Group polarization is an important phenomenon in social psychology and is observable in many social contexts. For example, a group of women who hold moderately feminist views tend to demonstrate heightened pro-feminist beliefs following group discussion.[3] Similarly, studies have shown that after deliberating together, mock jury members often decided on punitive damage awards that were either larger or smaller than the amount any individual juror had favored prior to deliberation.[4] The studies indicated that when the jurors favored a relatively low award, discussion would lead to an even more lenient result, while if the jury was inclined to impose a stiff penalty, discussion would make it even harsher.[5] Moreover, in recent years, the Internet and online social media have presented opportunities to observe group polarization and compile new research. Psychologists have found that on social media outlets such as Facebook and Twitter, group polarization can occur even when a group is not physically together. As long as the group of individuals begins with the same fundamental opinion on the topic and a consistent dialogue is kept going, group polarization can occur.[6]
Research has suggested that well-established groups suffer less from polarization, as do groups discussing problems that are well known to them. However, in situations where groups are newly formed and tasks are unfamiliar, group polarization can have a more profound influence on decision-making.[7]
Attitude polarization
Attitude polarization, also known as belief polarization and polarization effect, is a phenomenon in which a disagreement becomes more extreme as the different parties consider evidence on the issue. It is one of the effects of confirmation bias: the tendency of people to search for and interpret evidence selectively, to reinforce their current beliefs or attitudes.[8] When people encounter ambiguous evidence, this bias can potentially result in each of them interpreting it as in support of their existing attitudes, widening rather than narrowing the disagreement between them.[9]
The effect is observed with issues that activate emotions, such as political "hot button" issues.[10] For most issues, new evidence does not produce a polarization effect.[11] For those issues where polarization is found, mere thinking about the issue, without contemplating new evidence, produces the effect.[11] Social comparison processes have also been invoked as an explanation for the effect, which is increased by settings in which people repeat and validate each other's statements.[12] This apparent tendency is of interest not only to psychologists, but also to sociologists[13] and philosophers.[14]
Confirmation bias
https://en.wikipedia.org/wiki/Confirmation_bias
Confirmation bias, also called confirmatory bias or myside bias,[Note 1] is the tendency to search for, interpret, favor, and recall information in a way that confirms one's preexisting beliefs or hypotheses.[1] It is a type of cognitive bias and a systematic error of inductive reasoning. People display this bias when they gather or remember information selectively, or when they interpret it in a biased way. The effect is stronger for emotionally charged issues and for deeply entrenched beliefs. People also tend to interpret ambiguous evidence as supporting their existing position. Biased search, interpretation and memory have been invoked to explain attitude polarization (when a disagreement becomes more extreme even though the different parties are exposed to the same evidence), belief perseverance (when beliefs persist after the evidence for them is shown to be false), the irrational primacy effect (a greater reliance on information encountered early in a series) and illusory correlation (when people falsely perceive an association between two events or situations).
A series of experiments in the 1960s suggested that people are biased toward confirming their existing beliefs. Later work re-interpreted these results as a tendency to test ideas in a one-sided way, focusing on one possibility and ignoring alternatives. In certain situations, this tendency can bias people's conclusions. Explanations for the observed biases include wishful thinking and the limited human capacity to process information. Another explanation is that people show confirmation bias because they are weighing up the costs of being wrong, rather than investigating in a neutral, scientific way.
Confirmation biases contribute to overconfidence in personal beliefs and can maintain or strengthen beliefs in the face of contrary evidence. Poor decisions due to these biases have been found in political and organizational contexts.[2][3][Note 2]
Types
Confirmation biases are effects in information processing. They differ from what is sometimes called the behavioral confirmation effect, commonly known as self-fulfilling prophecy, in which a person's expectations influence their own behavior, bringing about the expected result.[4]
Some psychologists restrict the term confirmation bias to selective collection of evidence that supports what one already believes while ignoring or rejecting evidence that supports a different conclusion. Others apply the term more broadly to the tendency to preserve one's existing beliefs when searching for evidence, interpreting it, or recalling it from memory.[5][Note 3]
Biased search for information
Experiments have found repeatedly that people tend to test hypotheses in a one-sided way, by searching for evidence consistent with their current hypothesis.[7][8] Rather than searching through all the relevant evidence, they phrase questions to receive an affirmative answer that supports their theory.[9] They look for the consequences that they would expect if their hypothesis were true, rather than what would happen if it were false.[9] For example, someone using yes/no questions to find a number he or she suspects to be the number 3 might ask, "Is it an odd number?" People prefer this type of question, called a "positive test", even when a negative test such as "Is it an even number?" would yield exactly the same information.[10] However, this does not mean that people seek tests that guarantee a positive answer. In studies where subjects could select either such pseudo-tests or genuinely diagnostic ones, they favored the genuinely diagnostic.[11][12]
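The informational equivalence of the two questions can be made concrete. Below is a minimal Python sketch, not drawn from the cited studies, which assumes for illustration that the secret number is known to lie between 1 and 10. Both questions split the candidates into the same two groups, so the expected uncertainty remaining after the answer is identical.

```python
import math

def expected_remaining_entropy(groups):
    """Expected Shannon entropy (bits) left after hearing the yes/no answer,
    assuming a uniform prior over the candidate numbers."""
    total = sum(len(g) for g in groups)
    return sum((len(g) / total) * math.log2(len(g)) for g in groups if g)

candidates = range(1, 11)  # assumption: the secret number lies in 1..10

# Positive test for the hypothesis "the number is 3": "Is it an odd number?"
odd_split = [[n for n in candidates if n % 2 == 1],
             [n for n in candidates if n % 2 == 0]]
# Negative test for the same hypothesis: "Is it an even number?"
even_split = [odd_split[1], odd_split[0]]

# Both questions induce the same partition, so they are equally informative.
print(expected_remaining_entropy(odd_split))   # ~2.32 bits
print(expected_remaining_entropy(even_split))  # ~2.32 bits, identical
```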
The preference for positive tests in itself is not a bias, since positive tests can be highly informative.[13] However, in combination with other effects, this strategy can confirm existing beliefs or assumptions, independently of whether they are true.[14] In real-world situations, evidence is often complex and mixed. For example, various contradictory ideas about someone could each be supported by concentrating on one aspect of his or her behavior.[8] Thus any search for evidence in favor of a hypothesis is likely to succeed.[14] One illustration of this is the way the phrasing of a question can significantly change the answer.[8] For example, people who are asked, "Are you happy with your social life?" report greater satisfaction than those asked, "Are you unhappy with your social life?"[15]
Even a small change in a question's wording can affect how people search through available information, and hence the conclusions they reach. This was shown using a fictional child custody case.[16] Participants read that Parent A was moderately suitable to be the guardian in multiple ways. Parent B had a mix of salient positive and negative qualities: a close relationship with the child but a job that would take him or her away for long periods of time. When asked, "Which parent should have custody of the child?" the majority of participants chose Parent B, looking mainly for positive attributes. However, when asked, "Which parent should be denied custody of the child?" they looked for negative attributes and the majority answered that Parent B should be denied custody, implying that Parent A should have custody.[16]
Similar studies have demonstrated how people engage in a biased search for information, but also that this phenomenon may be limited by a preference for genuine diagnostic tests. In an initial experiment, participants rated another person on the introversion–extroversion personality dimension on the basis of an interview. They chose the interview questions from a given list. When the interviewee was introduced as an introvert, the participants chose questions that presumed introversion, such as, "What do you find unpleasant about noisy parties?" When the interviewee was described as extroverted, almost all the questions presumed extroversion, such as, "What would you do to liven up a dull party?" These loaded questions gave the interviewees little or no opportunity to falsify the hypothesis about them.[17] A later version of the experiment gave the participants less presumptive questions to choose from, such as, "Do you shy away from social interactions?"[18] Participants preferred to ask these more diagnostic questions, showing only a weak bias towards positive tests. This pattern, of a main preference for diagnostic tests and a weaker preference for positive tests, has been replicated in other studies.[18]
Personality traits influence and interact with biased search processes.[19] Individuals vary in their abilities to defend their attitudes from external attacks in relation to selective exposure. Selective exposure occurs when individuals search for information that is consistent, rather than inconsistent, with their personal beliefs.[20] An experiment examined the extent to which individuals could refute arguments that contradicted their personal beliefs.[19] People with high confidence levels more readily seek out information that contradicts their personal position in order to form an argument, whereas individuals with low confidence levels do not seek out contradictory information and prefer information that supports their personal position. People generate and evaluate evidence in arguments that are biased towards their own beliefs and opinions.[21] Heightened confidence levels thus decrease preference for information that supports individuals' personal beliefs.
Another experiment gave participants a complex rule-discovery task that involved moving objects simulated by a computer.[22] Objects on the computer screen followed specific laws, which the participants had to figure out; they could "fire" objects across the screen to test their hypotheses. Despite making many attempts over a ten-hour session, none of the participants figured out the rules of the system. They typically attempted to confirm rather than falsify their hypotheses, and were reluctant to consider alternatives. Even after seeing objective evidence that refuted their working hypotheses, they frequently continued doing the same tests. Some of the participants were taught proper hypothesis-testing, but these instructions had almost no effect.[22]
Biased interpretation
Confirmation biases are not limited to the collection of evidence. Even if two individuals have the same information, the way they interpret it can be biased.
A team at Stanford University conducted an experiment involving participants who felt strongly about capital punishment, with half in favor and half against it.[24][25] Each participant read descriptions of two studies: a comparison of U.S. states with and without the death penalty, and a comparison of murder rates in a state before and after the introduction of the death penalty. After reading a quick description of each study, the participants were asked whether their opinions had changed. Then, they read a more detailed account of each study's procedure and had to rate whether the research was well-conducted and convincing.[24] In fact, the studies were fictional. Half the participants were told that one kind of study supported the deterrent effect and the other undermined it, while for other participants the conclusions were swapped.[24][25]
The participants, whether supporters or opponents, reported shifting their attitudes slightly in the direction of the first study they read. Once they read the more detailed descriptions of the two studies, they almost all returned to their original belief regardless of the evidence provided, pointing to details that supported their viewpoint and disregarding anything contrary. Participants described studies supporting their pre-existing view as superior to those that contradicted it, in detailed and specific ways.[24][26] Writing about a study that seemed to undermine the deterrence effect, a death penalty proponent wrote, "The research didn't cover a long enough period of time," while an opponent's comment on the same study said, "No strong evidence to contradict the researchers has been presented."[24] The results illustrated that people set higher standards of evidence for hypotheses that go against their current expectations. This effect, known as "disconfirmation bias", has been supported by other experiments.[27]
Another study of biased interpretation occurred during the 2004 U.S. presidential election and involved participants who reported having strong feelings about the candidates. They were shown apparently contradictory pairs of statements, either from Republican candidate George W. Bush, Democratic candidate John Kerry or a politically neutral public figure. They were also given further statements that made the apparent contradiction seem reasonable. From these three pieces of information, they had to decide whether or not each individual's statements were inconsistent.[28]:1948 There were strong differences in these evaluations, with participants much more likely to interpret statements from the candidate they opposed as contradictory.[28]:1951
In this experiment, the participants made their judgments while in a magnetic resonance imaging (MRI) scanner which monitored their brain activity. As participants evaluated contradictory statements by their favored candidate, emotional centers of their brains were aroused. This did not happen with the statements by the other figures. The experimenters inferred that the different responses to the statements were not due to passive reasoning errors. Instead, the participants were actively reducing the cognitive dissonance induced by reading about their favored candidate's irrational or hypocritical behavior.[28]:1956
Biases in belief interpretation are persistent, regardless of intelligence level. Participants in an experiment took the SAT test (a college admissions test used in the United States) to assess their intelligence levels. They then read information regarding safety concerns for vehicles, and the experimenters manipulated the national origin of the car. American participants rated whether the car should be banned on a six-point scale, where one indicated "definitely yes" and six indicated "definitely no". Participants first evaluated whether they would allow a dangerous German car on American streets and a dangerous American car on German streets. Participants believed that the dangerous German car on American streets should be banned more quickly than the dangerous American car on German streets. Intelligence level made no difference in how readily participants would ban a car.[21]
Biased interpretation is not restricted to emotionally significant topics. In another experiment, participants were told a story about a theft. They had to rate the evidential importance of statements arguing either for or against a particular character being responsible. When they hypothesized that character's guilt, they rated statements supporting that hypothesis as more important than conflicting statements.[29]
Biased memory
Even if people gather and interpret evidence in a neutral manner, they may still remember evidence selectively to reinforce their expectations. This effect is called "selective recall", "confirmatory memory", or "access-biased memory".[30] Psychological theories differ in their predictions about selective recall. Schema theory predicts that information matching prior expectations will be more easily stored and recalled than information that does not match.[31] Some alternative approaches say that surprising information stands out and so is memorable.[31] Predictions from both these theories have been confirmed in different experimental contexts, with no theory winning outright.[32]
In one study, participants read a profile of a woman which described a mix of introverted and extroverted behaviors.[33] They later had to recall examples of her introversion and extroversion. One group was told this was to assess the woman for a job as a librarian, while a second group were told it was for a job in real estate sales. There was a significant difference between what these two groups recalled, with the "librarian" group recalling more examples of introversion and the "sales" group recalling more extroverted behavior.[33] A selective memory effect has also been shown in experiments that manipulate the desirability of personality types.[31][34] In one of these, a group of participants were shown evidence that extroverted people are more successful than introverts. Another group were told the opposite. In a subsequent, apparently unrelated, study, they were asked to recall events from their lives in which they had been either introverted or extroverted. Each group of participants provided more memories connecting themselves with the more desirable personality type, and recalled those memories more quickly.[35]
Changes in emotional states can also influence memory recall.[36][37] Participants rated how they felt when they had first learned that O.J. Simpson had been acquitted of murder charges.[36] They described their emotional reactions and confidence regarding the verdict one week, two months, and one year after the trial. Results indicated that participants' assessments of Simpson's guilt changed over time. The more participants' opinion of the verdict had changed, the less stable were their memories of their initial emotional reactions. When participants recalled their initial emotional reactions two months and a year later, past appraisals closely resembled current appraisals of emotion. People demonstrate sizable myside bias when discussing their opinions on controversial topics.[21] Memory recall and construction of experiences undergo revision in relation to corresponding emotional states.
Myside bias has been shown to influence the accuracy of memory recall.[37] In an experiment, widows and widowers rated the intensity of their experienced grief six months and five years after the deaths of their spouses. Participants reported more intense grief at six months than at five years. Yet when asked after five years how they had felt six months after the death of their significant other, the intensity of grief participants recalled was highly correlated with their current level of grief. Individuals appear to use their current emotional states to infer how they must have felt when experiencing past events.[36] Emotional memories are reconstructed by current emotional states.
One study showed how selective memory can maintain belief in extrasensory perception (ESP).[38] Believers and disbelievers were each shown descriptions of ESP experiments. Half of each group were told that the experimental results supported the existence of ESP, while the others were told they did not. In a subsequent test, participants recalled the material accurately, apart from believers who had read the non-supportive evidence. This group remembered significantly less information and some of them incorrectly remembered the results as supporting ESP.[38]
Related effects
Polarization of opinion
When people with opposing views interpret new information in a biased way, their views can move even further apart. This is called "attitude polarization".[39] The effect was demonstrated by an experiment that involved drawing a series of red and black balls from one of two concealed "bingo baskets". Participants knew that one basket contained 60% black and 40% red balls; the other, 40% black and 60% red. The experimenters looked at what happened when balls of alternating color were drawn in turn, a sequence that does not favor either basket. After each ball was drawn, participants in one group were asked to state out loud their judgments of the probability that the balls were being drawn from one or the other basket. These participants tended to grow more confident with each successive draw: whether they initially thought the basket with 60% black balls or the one with 60% red balls was the more likely source, their estimate of the probability increased. Another group of participants were asked to state probability estimates only at the end of a sequence of drawn balls, rather than after each ball. They did not show the polarization effect, suggesting that it does not necessarily occur when people simply hold opposing positions, but rather when they openly commit to them.[40]
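The normative benchmark here is Bayesian updating. A minimal Python sketch, using the basket proportions from the description above and assuming a 50/50 prior, shows that an alternating sequence of draws leaves the rational posterior at exactly 0.5, so any growing confidence departs from what the evidence warrants.

```python
def posterior_basket_a(draws, p_black_a=0.6, p_black_b=0.4, prior_a=0.5):
    """Posterior probability that the balls come from basket A
    (60% black / 40% red) rather than basket B (40% black / 60% red)."""
    like_a, like_b = prior_a, 1.0 - prior_a
    for ball in draws:
        like_a *= p_black_a if ball == "black" else 1.0 - p_black_a
        like_b *= p_black_b if ball == "black" else 1.0 - p_black_b
    return like_a / (like_a + like_b)

alternating = ["black", "red"] * 10  # a sequence that favors neither basket
print(posterior_basket_a(alternating))  # 0.5: the rational estimate never moves
```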
A less abstract study was the Stanford biased interpretation experiment in which participants with strong opinions about the death penalty read about mixed experimental evidence. Twenty-three percent of the participants reported that their views had become more extreme, and this self-reported shift correlated strongly with their initial attitudes.[24] In later experiments, participants also reported their opinions becoming more extreme in response to ambiguous information. However, comparisons of their attitudes before and after the new evidence showed no significant change, suggesting that the self-reported changes might not be real.[27][39][41] Based on these experiments, Deanna Kuhn and Joseph Lao concluded that polarization is a real phenomenon but far from inevitable, only happening in a small minority of cases. They found that it was prompted not only by considering mixed evidence, but by merely thinking about the topic.[39]
Charles Taber and Milton Lodge argued that the Stanford team's result had been hard to replicate because the arguments used in later experiments were too abstract or confusing to evoke an emotional response. The Taber and Lodge study used the emotionally charged topics of gun control and affirmative action.[27] They measured the attitudes of their participants towards these issues before and after reading arguments on each side of the debate. Two groups of participants showed attitude polarization: those with strong prior opinions and those who were politically knowledgeable. In part of this study, participants chose which information sources to read, from a list prepared by the experimenters. For example, they could read the National Rifle Association's and the Brady Anti-Handgun Coalition's arguments on gun control. Even when instructed to be even-handed, participants were more likely to read arguments that supported their existing attitudes than arguments that did not. This biased search for information correlated well with the polarization effect.[27]
The backfire effect is a name for the finding that, given evidence against their beliefs, people can reject the evidence and believe even more strongly.[42][43] The phrase was coined by Brendan Nyhan and Jason Reifler.[44]
Persistence of discredited beliefs
Confirmation biases can be used to explain why some beliefs persist when the initial evidence for them is removed.[46] This belief perseverance effect has been shown by a series of experiments using what is called the "debriefing paradigm": participants read fake evidence for a hypothesis, their attitude change is measured, then the fakery is exposed in detail. Their attitudes are then measured once more to see if their belief returns to its previous level.[45]
A common finding is that at least some of the initial belief remains even after a full debriefing.[47] In one experiment, participants had to distinguish between real and fake suicide notes. The feedback was random: some were told they had done well while others were told they had performed badly. Even after being fully debriefed, participants were still influenced by the feedback. They still thought they were better or worse than average at that kind of task, depending on what they had initially been told.[48]
In another study, participants read job performance ratings of two firefighters, along with their responses to a risk aversion test.[45] This fictional data was arranged to show either a negative or positive association: some participants were told that a risk-taking firefighter did better, while others were told that the risk-taker did less well than a risk-averse colleague.[49] Even if these two case studies were true, they would have been scientifically poor evidence for a conclusion about firefighters in general. However, the participants found them subjectively persuasive.[49] When the case studies were shown to be fictional, participants' belief in a link diminished, but around half of the original effect remained.[45] Follow-up interviews established that the participants had understood the debriefing and taken it seriously. Participants seemed to trust the debriefing, but regarded the discredited information as irrelevant to their personal belief.[49]
The continued influence effect is the tendency to believe previously learned misinformation even after it has been corrected. Misinformation can still influence inferences one generates after a correction has occurred.[50]
Preference for early information
Experiments have shown that information is weighted more strongly when it appears early in a series, even when the order is unimportant. For example, people form a more positive impression of someone described as "intelligent, industrious, impulsive, critical, stubborn, envious" than when they are given the same words in reverse order.[51] This irrational primacy effect is independent of the primacy effect in memory, in which the earlier items in a series leave a stronger memory trace.[51] Biased interpretation offers an explanation for this effect: seeing the initial evidence, people form a working hypothesis that affects how they interpret the rest of the information.[46]
One demonstration of irrational primacy used colored chips supposedly drawn from two urns. Participants were told the color distributions of the urns, and had to estimate the probability of a chip being drawn from one of them.[51] In fact, the colors appeared in a prearranged order. The first thirty draws favored one urn and the next thirty favored the other.[46] The series as a whole was neutral, so rationally, the two urns were equally likely. However, after sixty draws, participants favored the urn suggested by the initial thirty.[51]
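Normatively, the posterior depends only on how many chips of each color were drawn, not on the order in which they appeared, so the first thirty draws deserve no special weight. The Python sketch below makes the order-invariance explicit; the 60/40 versus 40/60 urn compositions and the 50/50 prior are illustrative assumptions, since the text does not give the actual distributions used.

```python
import random

def posterior_urn_one(chips, p1=0.6, p2=0.4, prior=0.5):
    """Posterior that the chips come from urn 1, assumed to hold 60% of
    color 'A', versus urn 2, assumed to hold 40% of color 'A'."""
    l1, l2 = prior, 1.0 - prior
    for chip in chips:
        l1 *= p1 if chip == "A" else 1.0 - p1
        l2 *= p2 if chip == "A" else 1.0 - p2
    return l1 / (l1 + l2)

# First thirty draws favor urn 1; the next thirty favor urn 2.
sequence = ["A"] * 30 + ["B"] * 30
shuffled = sequence.copy()
random.shuffle(shuffled)

# Multiplication commutes, so the order of the draws cannot matter:
print(posterior_urn_one(sequence))  # 0.5
print(posterior_urn_one(shuffled))  # 0.5, identical
```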
Another experiment involved a slide show of a single object, seen as just a blur at first and in slightly better focus with each succeeding slide.[51] After each slide, participants had to state their best guess of what the object was. Participants whose early guesses were wrong persisted with those guesses, even when the picture was sufficiently in focus that the object was readily recognizable to other people.[46]
Illusory association between events
Illusory correlation is the tendency to see non-existent correlations in a set of data.[52] This tendency was first demonstrated in a series of experiments in the late 1960s.[53] In one experiment, participants read a set of psychiatric case studies, including responses to the Rorschach inkblot test. The participants reported that the homosexual men in the set were more likely to report seeing buttocks, anuses or sexually ambiguous figures in the inkblots. In fact the fictional case studies had been constructed so that the homosexual men were no more likely to report this imagery or, in one version of the experiment, were less likely to report it than heterosexual men.[52] In a survey, a group of experienced psychoanalysts reported the same set of illusory associations with homosexuality.[52][53]
Another study recorded the symptoms experienced by arthritic patients, along with weather conditions over a 15-month period. Nearly all the patients reported that their pains were correlated with weather conditions, although the real correlation was zero.[54]
This effect is a kind of biased interpretation, in that objectively neutral or unfavorable evidence is interpreted to support existing beliefs. It is also related to biases in hypothesis-testing behavior.[55] In judging whether two events, such as illness and bad weather, are correlated, people rely heavily on the number of positive-positive cases: in this example, instances of both pain and bad weather. They pay relatively little attention to the other kinds of observation (of no pain and/or good weather).[56] This parallels the reliance on positive tests in hypothesis testing.[55] It may also reflect selective recall, in that people may have a sense that two events are correlated because it is easier to recall times when they happened together.[55]
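The point about ignoring three of the four kinds of observation can be shown with a toy 2x2 contingency table; the counts below are invented for illustration. The pain-and-bad-weather cell is the largest and most salient, yet the phi coefficient over the full table is zero.

```python
import math

# Invented illustrative counts of days, classified two ways:
#                 bad weather   good weather
# pain                a=60          b=40
# no pain             c=60          d=40
a, b, c, d = 60, 40, 60, 40

# Phi coefficient: the correlation between the two binary variables.
phi = (a * d - b * c) / math.sqrt((a + b) * (c + d) * (a + c) * (b + d))
print(phi)  # 0.0: no association, despite 60 salient pain-plus-bad-weather days
```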