When people talk to one another, what happens? Do they compromise? Do they move toward moderation? The answer is now clear, and it is not what intuition would suggest: members of deliberating groups typically end up in a more extreme position, consistent with their tendencies before deliberation began. This is the phenomenon known as group polarization. Group polarization is the usual pattern with deliberating groups, having been found in hundreds of studies involving more than a dozen countries, including the United States, France, and Germany. It helps account for many terrible things, including terrorism, stoning, and “mobbing” in all its forms.
It follows that a group of people who think that immigration is a serious problem will, after discussion, think that immigration is a horribly serious problem; that those who approve of an ongoing war effort will, as a result of discussion, become still more enthusiastic about that effort; that people who dislike a nation’s leaders will dislike those leaders quite intensely after talking with one another; and that people who disapprove of the United States, and are suspicious of its intentions, will increase their disapproval and suspicion if they exchange points of view. Indeed, there is specific evidence of the latter phenomenon among citizens of France.
When like-minded people talk with one another, they usually end up thinking a more extreme version of what they thought before the conversation began. It should be readily apparent that enclaves of people, inclined to rebellion or even violence, might move sharply in that direction as a consequence of internal deliberations. Political extremism is often a product of group polarization.
In the United States, group polarization helped both Barack Obama and Donald Trump to ascend to the presidency. Speaking mostly with one another, Obama supporters and Trump supporters became intensely committed to their candidate. On Facebook and Twitter, we can see group polarization in action every hour, every minute of every day. As enclaves of like-minded people proliferate online, group polarization becomes inevitable. Sports fans fall prey to group polarization; so do companies deciding whether to launch some new product. It should be easy to see that group polarization is at work on university campuses and in feuds, ethnic and international strife, and war.
One of the characteristic features of feuds is that members of a group embroiled in a feud tend to talk only to one another, fueling and amplifying their outrage, and solidifying their impression of the relevant people and events. Many social movements, both good and bad, become possible through the heightened effects of outrage; consider the civil rights movement of the 1960s (and the contemporary #MeToo movement). Social enclaves are breeding grounds for group polarization, sometimes for better and sometimes for worse.
There is another point, of special importance for purposes of understanding extremism and tribalism: In deliberating groups, those with a minority position often silence themselves or otherwise have disproportionately little weight. The result can be “hidden profiles”—important information that is not shared within the group. Group members often have information but do not discuss it, and the result is bad decisions (or worse).
Consider a study of serious errors within working groups, both face-to-face and online.1 The purpose of the study was to see how groups might collaborate to make personnel decisions. Résumés for three candidates, applying for a marketing manager position, were placed before the groups. The attributes of the candidates were rigged by the experimenters so that one applicant was clearly the best for the job described. Packets of information were given to subjects, each containing a subset of information from the résumés, so that each group member had only part of the relevant information. The groups consisted of three people, some operating face-to-face, some operating online.
Two results were especially striking. First, group polarization was common, as groups ended up in a more extreme position in the same direction as the original thinking of their members. Second, almost none of the deliberating groups made what was conspicuously the right choice, because they failed to share information in a way that would permit the group to make an objective decision. Members tended to share positive information about the winning candidate and negative information about the losers, while also suppressing negative information about the winner and positive information about the losers. Their statements served to reinforce the movement toward a group consensus rather than to add new and different points or to promote debate.
This finding fits with the more general claim, backed by a lot of evidence, that groups tend to dwell on shared information and to neglect information that is held by few members. It should be unnecessary to emphasize that this tendency can lead to big blunders—in governments, in think tanks, and on the Left and the Right. To understand this particular point, it is helpful to explore the three mechanisms that produce group polarization: information, corroboration, and social comparison.
With respect to information, the simple point is that people usually respond to the arguments made by other people—and the “argument pool,” in any group with some initial disposition in one direction, will inevitably be skewed toward that disposition. A group whose members tend to think that Israel is the real aggressor in the Middle East conflict will tend to hear many arguments to that effect, and relatively few opposing views. It is almost inevitable that the group’s members will have heard some, but not all, of the arguments that emerge from the discussion. Having heard all of what is said, people are likely to move further in the anti-Israel direction. So too with a group whose members tend to oppose immigration: group members will hear a large number of arguments against immigration and a smaller number of arguments on its behalf. If people are listening, deliberation will leave them with a stronger conviction, in the same direction as the view with which they began. An emphasis on limited argument pools also helps to explain the problem of “hidden profiles” and the greater discussion of shared information during group discussion. It is simply a statistical fact that when more people have a piece of information, there is a greater probability that it will be mentioned. Hidden profiles are a predictable result, to the detriment of the ultimate decision.
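The statistical point can be made concrete with a minimal sketch, not drawn from the study cited below. Assume (hypothetically) that each member who holds a piece of information mentions it independently with some fixed probability p; the chance that the item surfaces at all is then 1 − (1 − p)^k, which rises with the number of holders k. The function name and the value of p here are illustrative assumptions.

```python
# Illustrative simulation (an assumption-laden sketch, not the cited study):
# an item of information is more likely to be mentioned at least once when
# more group members hold it, if each holder mentions it independently
# with the same probability p.
import random

def prob_mentioned(holders: int, p: float = 0.3, trials: int = 100_000) -> float:
    """Estimate the probability that at least one of `holders` members
    mentions the item, each mentioning it independently with probability p."""
    hits = sum(
        any(random.random() < p for _ in range(holders))
        for _ in range(trials)
    )
    return hits / trials

# With the hypothetical p = 0.3, an item held by one member surfaces roughly
# 30% of the time; an item held by all three members of a three-person group
# surfaces roughly 66% of the time (1 - 0.7**3).
for k in (1, 2, 3):
    print(k, round(prob_mentioned(k), 3))
```

On any such assumptions, widely shared items dominate the discussion and uniquely held items tend to stay hidden, which is the pattern the hidden-profile studies report.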
With respect to the power of corroboration, the intuition is simple: people who lack confidence, and who are unsure what they should think, tend to moderate their views. It is for this reason that cautious people, not knowing what to do, are likely to choose the midpoint between the relevant extremes. But if other people seem to share your view and corroborate your beliefs, you are likely to become more confident that you are correct—and hence to move in a more extreme direction. You might think that on a scale of one to ten, the likelihood that climate change is occurring is seven—but if most people in your group agree that climate change is occurring, you might move up to nine.
In a wide variety of experimental contexts, people’s opinions have been shown to become more extreme simply because their view has been corroborated and because they have become confident after learning of the shared views of others. The existence of confirmation from others will strengthen confidence and hence strengthen extremity. It can also make for mobs.
With respect to social comparison, the starting point is that people want to be perceived favorably by other group members, and also to perceive themselves favorably. Their views may, to a greater or lesser extent, be a function of how they want to present themselves. Once people hear what others believe, they adjust their positions in the direction of the dominant position, to hold onto their preferred self-presentation. They may want to signal that they are politically correct, whatever that means in their group. For example, they might want to show that they are not cowardly or cautious, perhaps in an entrepreneurial group that disparages these characteristics, and hence they will frame their position so they do not appear as such by comparison to other group members. And after they hear what other people think, they might find they occupy a somewhat different position, in relation to the group, from what they hoped, and they shift accordingly.
If people believe they are somewhat more opposed to immigration than most people, they might shift a bit after finding themselves in a group of people who are strongly opposed to immigration, to maintain their preferred self-presentation. This phenomenon occurs all the time. People may wish, for example, not to seem too enthusiastic—or too restrained in their enthusiasm—for affirmative action, feminism, or an increase in expenditures on national defense; hence their views shift when they see what other group members think. The result is to press the group’s position toward one or another extreme, and also to induce shifts in individual members.
Note that an emphasis on social comparison gives a new and perhaps better explanation for the existence of hidden profiles and the failure to share certain information within a group. People might emphasize shared views and information, and downplay unusual perspectives and new evidence, simply from a fear of group rejection and a desire for general approval. In political institutions and in companies, there is an unfortunate implication: group members who care about one another’s approval, or who depend on one another for material or nonmaterial benefits, might well suppress highly relevant information.
Group polarization is not a social constant. It can be increased, decreased, or even eliminated by certain features of group members or their situation.
First, extremists are especially prone to polarization. It is more probable that they will shift, and it is probable that they will shift more. When they start out at an extreme point and are placed in a group of like-minded people, they are likely to move especially far in the direction with which they started. There is a lesson here about the sources of terrorism and political violence in general. And because there is a link between confidence and extremism, the confidence of particular members also plays an important role; confident people are both more influential (the “confidence heuristic”) and more prone to polarization.
Second, if members of the group think they have a shared identity and a high degree of solidarity, there will be heightened polarization. One reason is that if people feel united by some factor (for example, politics or religious convictions), dissent will be dampened. If individual members tend to perceive one another as friendly, likeable, and similar to them, the size and likelihood of the shift will increase. The existence of affective ties reduces the number of diverse arguments and also intensifies social influences on choice.
One implication is that mistakes are likely to be increased when group members are united mostly through bonds of affection and not through concentration on a particular task; it is in the former case that alternative views will be less likely to find expression. Another implication is that people are less likely to shift if the point of view or direction advocated is being pushed by unlikeable or unfriendly group members. A sense of “group belongingness” affects the extent of polarization. In the same vein, physical spacing tends to reduce polarization; a sense of common fate and intra-group similarity tends to increase it, as does the introduction of a rival “outgroup.”
Over time, group polarization can be fortified because of “exit,” as members leave the group because they reject the direction in which things are heading. If exit is pervasive, the tendency to extremism will be greatly aggravated. The group will end up smaller, but its members will be both more like-minded and more willing to take extreme measures, and that very fact will mean that internal discussions will produce more extremism still. If the strongest loyalists are the only people who stay, the group’s median member will be more extreme, and deliberation will produce increasingly extreme movements.
We live in an era in which groups of people—on the Left, on the Right, in university departments, in religious institutions—often end up in a pitch of rage, seeing fellow members of the human species not as wrong but as enemies. Such groups may even embark on something like George Orwell’s Two Minutes Hate. When that happens, or when people go to extremes, there are many explanations. But group polarization unifies seemingly diverse phenomena. Extremism and mobbing are not so mysterious. On the contrary, they are predictable products of social interactions.
Reference:
1. See R. Hightower and L. Sayeed, “The Impact of Computer-Mediated Communication Systems on Biased Group Discussion,” Computers in Human Behavior 11, no. 1 (1995).