This article was downloaded by: [Monash University Library], on 14 September 2013, at 15:00. Publisher: Routledge. Informa Ltd, registered in England and Wales, Registered Number: 1072954. Registered office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK.

Critical Review: A Journal of Politics and Society. Publication details, including instructions for authors and subscription information: http://www.tandfonline.com/loi/rcri20

To cite this article: Arie W. Kruglanski & Lauren M. Boyatzi (2012) The Psychology of Closed and Open Mindedness, Rationality, and Democracy, Critical Review: A Journal of Politics and Society, 24(2), 217-232, DOI: 10.1080/08913811.2012.711023

To link to this article: http://dx.doi.org/10.1080/08913811.2012.711023

Published online: 20 Sep 2012.
Arie W. Kruglanski and Lauren M. Boyatzi
THE PSYCHOLOGY OF CLOSED AND OPEN
MINDEDNESS, RATIONALITY, AND DEMOCRACY
ABSTRACT: Charles Taber and Milton Lodge provide compelling evidence that
people’s minds may be closed to information that is inconsistent with their prior
beliefs. Such closed mindedness has often been termed ‘‘irrational.’’ However,
recent research suggests that being open or closed minded is not an unchanging
variable but depends on one’s goals, including one’s need for closure, which vary
from person to person and situation to situation. In this vein, as Taber and Lodge
suggest, those who have more political information may, after having been open
minded enough to acquire the information, become closed minded by virtue of
having acquired it. At the same time, being more open minded to inaccurate
information might lead one in the wrong direction; hence, open mindedness does not
necessarily enhance the accuracy of one’s judgments or the quality of one’s decisions.
We may need to rethink the concept of an enlightened, well-informed electorate, to
the extent that it assumes the unqualified benefits of open mindedness.
Democracies grant the power to vote only to those who are deemed
minimally competent to exercise the power rationally. For that reason,
children and the insane are denied the right to vote. In some states during
the late eighteenth and nineteenth centuries, women, African Americans,
and Native Americans were not allowed to vote on the grounds that they
were less capable of analytic and dispassionate judgment, and hence that they
should not be granted a voice in matters of important public concern.
Arie W. Kruglanski, [email protected], Department of Psychology, University of Maryland, College Park, MD 20742, is the author, inter alia, of The Psychology of Closed Mindedness (Psychology Press, 2004). Lauren M. Boyatzi, [email protected], is a graduate student at the Department of Psychology, University of Maryland, College Park, MD 20742.
Critical Review 24(2): 217–232. ISSN 0891-3811 print, 1933-8007 online. © 2012 Critical Review Foundation. http://dx.doi.org/10.1080/08913811.2012.711023
In itself, however, rationality is insufficient for rendering
mature, well-considered decisions. The decision maker must also have
adequate information to assess the consequences of the choices before
her. Thomas Jefferson is reputed to have said that ‘‘whenever the people
are well-informed, they can be trusted with their own government,’’
because ‘‘whenever things get so far wrong as to attract their notice, they
may be relied on to set them right’’ (quoted in Padover 1939, 88).
The notion of an informed electorate implies not only that people
should be provided with information but also that they be receptive to,
and willing to consider, the information they are provided. But are they?
Two experiments by Charles Taber and Milton Lodge (2006) cast this
supposition in doubt. In these experiments, participants who had strong
opinions on affirmative action and gun control read pro and con
arguments on these topics from partisan sources. Both before and after
these readings, Taber and Lodge measured the strength and position of
participants’ attitudes on these two issues. The experiments revealed a
prior-attitude effect, such that participants rated arguments congruent with
their attitudes as stronger and more compelling than attitude-incongruent
arguments. Furthermore, participants listed more thoughts, typically counterarguments, about attitude-inconsistent arguments than about attitude-consistent arguments, evincing a disconfirmation bias. And participants tended to choose more arguments from the attitude-consistent
category than from the attitude-inconsistent category, attesting to a
confirmation bias. Taber and Lodge also found that politically sophisticated
participants reported significantly more extreme attitudes after considering the pro and con arguments than before doing so, manifesting a
polarization effect. Finally, since the polarization effect was found for
participants with strong but not weak prior attitudes, it attests to an
attitude-strength effect.
Taber and Lodge (2006, 767) summarize their results by noting that
‘‘people are often unable to escape the pull of their prior attitudes and
beliefs, which guide the processing of new information in predictable and
sometimes insidious ways.’’ They maintain that (from one perspective)
‘‘the average citizen would appear to be both cognitively and motivationally incapable of fulfilling the requirements of rational behavior in a democracy’’ (ibid.). For ‘‘far from the rational calculator portrayed in Enlightenment prose . . . homo politicus would seem to be a creature of simple likes and prejudices that are quite resistant to change’’ (ibid.).
Setting aside eighteenth-century assumptions about citizens’ cognitive
capabilities, Taber and Lodge ask whether the kind of behavior observed
in their studies meets widespread contemporary standards of rationality.
They aptly note that a thin line separates, on the one hand, rigid enslavement by one’s prior attitudes and, on the other hand, healthy
skepticism grounded in the assumption that one’s prior attitudes are based
on extensive reflection and therefore should not be altered precipitously.
Finally, they observe that social science cannot resolve the normative
question of what is the rational response to new information that
contradicts one’s prior attitudes. As the authors pithily put it: ‘‘Research
can explore the conditions under which persuasion occurs (as social
psychologists have for decades), but it cannot establish the conditions
under which it should occur’’ (Taber and Lodge 2006, 768); they conclude
that ‘‘it is, of course, the latter question that needs answering if we are
to resolve the controversy over the rationality of motivated reasoning’’
(ibid.).
We shall address several fundamental issues raised by Taber and
Lodge’s provocative paper. First, we consider the conditions under
which people are likely to be ‘‘open’’ or ‘‘closed’’ to new information,
particularly information inconsistent with their prior beliefs. Second, we
consider whether motivational biases are rational or irrational in light
of what influential social scientists have assumed that rationality means.
We finally consider whether more open mindedness is necessarily better
for judgment and decision making, and the implications this has for the
notion of an ‘‘enlightened electorate.’’
Motivated Cognition and Closed Mindedness
We begin with the assumption that the process of judgment, that is, the process of forming beliefs or opinions, has a motivational basis. In other
words, thinking is purposive. It is ‘‘for doing,’’ as Susan Fiske (1992) has
aptly proclaimed (recalling William James [(1890) 1983]). We form
judgments in the service of the cognitive goals to which we subscribe.
Social cognitive theory and research suggest that if one’s current beliefs
satisfactorily serve those goals, then one would be more likely to resist
new information that contradicts those beliefs (and hence undermines
the goals in question). Conversely, however, where one’s current beliefs
are incompatible with one’s goals, one will readily abandon those beliefs
and welcome novel information that contradicts them.
The explicit motivation in forming any judgment is to get to the truth
on a given topic; that is, to render an accurate judgment or arrive at a
correct opinion. By definition, we believe that any opinion that we hold
is correct. Thus, when one attempts to form an opinion, one avowedly
strives to form an accurate opinion. That is the intrinsic goal of judgment formation; to hold a judgment that one believes to be false is
philosophically incoherent. Yet a variety of extrinsic goals may also bias
judgments toward desired conclusions (Dunning 1999; Jervis 1976;
Kruglanski 1989 and 2004; Kunda 1990; Vertzberger 1990). One might
not be conscious of such extrinsic goals, and of the biases they introduce,
and therefore one may continue to feel that one’s sole concern in the
judgmental process, one’s exclusive underlying goal, is accuracy. Nonetheless, various extrinsic goals may importantly determine whether the
individual will be closed or open to new information.
What might be the extrinsic goals of judgment, beyond the intrinsic
goal of accuracy? In our own work, we have emphasized the extrinsic
need for nonspecific closure and for various specific types of closures
(Kruglanski 1989 and 2004; Kruglanski and Webster 1996; Kruglanski,
Pierro, Mannetti, and DeGrada 2006). The need for nonspecific closure
is a desire for an assured opinion on a topic, as opposed to uncertainty
and ambiguity. The need for nonspecific closure varies stably across
individuals, and a scale (now translated into many languages) may be used
to assess any person’s need for it. In addition, situational circumstances
can induce the need for nonspecific closure. Thus, the need to move on
to another task may induce the goal of closure concerning the object
of one’s current opinion formation (Webster 1993). Closure may also
seem desirable in circumstances that make it difficult to process new
information, such as ambient noise, fatigue, being under a tight deadline,
or being in the nadir of one’s circadian cycle (Kruglanski and Webster
1996; Kruglanski, Pierro, Mannetti, and DeGrada 2006; Pierro and
Kruglanski 2008).
In principle, any cognitive goal would induce open mindedness
toward information that promised to advance the goal’s attainment, and
closed mindedness toward information that threatened to undermine
it. To test this prediction, Arie W. Kruglanski, Donna M. Webster, and
Adena Klem (1993) exposed participants with and without committed
opinions to someone who contradicted the opinionated participants’
beliefs. Orthogonally, we manipulated the goal of cognitive closure by
raising the ambient noise in the room. We found that when exposed
to noise, the opinionated subjects resisted the other person’s opinion,
reflecting closed mindedness. Participants lacking an opinion, however,
were significantly more persuaded when they were exposed to noise than
when they were not, reflecting open mindedness. In other words, for
opinionated participants, the goal of closure was already satisfied, so they
resisted information that was inconsistent with their beliefs to a greater
extent when the physical environment elevated their need for closure.
Conversely, the ‘‘opinionless’’ participants had yet to attain closure, so
they embraced information that promised to do so more quickly when
the environmental noise elevated their desire for closure.
Kruglanski et al. 1993 demonstrates that even people with a heightened need for closure can, in some circumstances, be (temporarily) open minded while ultimately seeking closure. Such temporary open mindedness has been known as ‘‘seizing’’ on closure-promising information, which is to say accepting it uncritically (Kruglanski and Webster 1996). Once closure has been attained, however, people with a heightened need for closure typically ‘‘freeze’’ their judgment and become
impervious to subsequent information. In support of these conclusions,
research on the nonspecific need for closure (whether induced by a
dispositional need for closure or by situational manipulations of this
motivation) has found that those with a relatively high need for
closure form impressions more quickly and on the basis of more limited
evidence (Kruglanski and Freund 1983); tend more to base social
judgments on widespread stereotypes (Dijksterhuis, Van Knippenberg,
Kruglanski, and Schapper 1996; Jamieson and Zanna 1989); and tend
to exhibit correspondence bias (or the ‘‘fundamental attribution error’’)
in certain circumstances (Webster 1993). Further, they generate fewer
hypotheses while engaged in problem solving (Mayseless and Kruglanski
1987); tend to fixate on their own perspective and fail to attune their
communications to audience characteristics (Richter and Kruglanski
1999); and are less empathetic to the predicaments of people whose
perspectives differ from their own (Webster-Nelson, Klein, and Irvin
2003).
It thus seems fair to conclude that not all people in all situations
equally resist new information. Those with a higher need for closure are
more resistant than those with a lower need for closure, when the
former already have an opinion on a given topic. But in the absence of
an opinion, those with a high need for closure are in fact more open to
new information than are those with a low need for closure (Kruglanski
et al. 1993).
However, closed and open mindedness are general phenomena, not restricted solely to the need for nonspecific closure. Any information-processing goal may induce either open or closed mindedness to new information, depending on whether the information promises to advance the goal or to undermine it. Consider the various needs for specific closure, for example, the needs for self-esteem, control, and other desired outcomes. These have been studied extensively in the domains
of cognitive-dissonance phenomena related to maintaining self-esteem
(Aronson 1992; Cooper and Fazio 1984; Harmon-Jones and Mills 1999;
Steele 1988, 261-302) and through motivated-reasoning research that addressed a broader variety of specific, motivationally desirable conclusions (Dunning 1999; Kruglanski 1999; Kunda 1990; Kunda and Sinclair
1999). In accordance with the present analysis, people with specific
closure needs would freeze on their prior knowledge and be closed
minded if such knowledge were congruent with their motives (representing the attainment of goals that such motives define), yet they
would be quite ready to unfreeze their beliefs if those were incongruent
with, or undesirable from the perspective of, their goals. Consistent with
this analysis, Peter H. Ditto and David F. Lopez (1992) obtained evidence that people curtail the processing of further information, attesting
to closed mindedness, if the previous information has supported their
pre-existing conclusions (e.g., in the domain of health treatments), and
that they prolong the processing of information, suggesting open
mindedness, if the initial information was inconsistent with their prior
conclusions or was undesirable from the perspective of their current
cognitive goals.
Research by Lisa Sinclair and Ziva Kunda (1999) further suggests that
needs for specific closure may activate from memory information likely
to support individuals’ preferred judgments and, to the contrary, may
inhibit available knowledge inconsistent with desirable judgments. For
instance, if the actions of a black doctor offended the perceiver in some
way, she would tend to activate the black stereotype (suggesting incompetence) and inhibit the doctor stereotype (suggesting competence),
to better disparage the black doctor. By contrast, if pleased, the perceiver
would activate the doctor stereotype so as to applaud him. Overall
then, extant research findings suggest that both closed and open
mindedness are pervasive and should be seen not as unvarying personality traits but as situational: They can be triggered when the new
information confronting the perceiver promises to support her or his
cognitive goal in the specific situation (inducing open mindedness)
or, contrarily, when the new information threatens to undermine it
(inducing closed mindedness).
The Temporal Dimension of Goal Pursuit
Goal-related phenomena also have an important temporal dimension. As
recent decades of extensive goal research in social cognition have shown
(for reviews see Kruglanski and Kopetz 2009a and 2009b; Fishbach and
Fergusson 2007; Moskowitz and Grant 2009), goals can be activated
or inhibited by features of the environment and by other goals. That
means that information that at time T1 might have been desirable and welcome might, at T3, become undesirable and unwelcome if, at T2, a new goal was activated with which the T1 information was inconsistent. Relatedly, Fishbach, Shah, and Kruglanski 2004 showed that
high-calorie foods were rated as tasty when the goal of food enjoyment
was activated but were rated less tasty when the dieting goal was
activated. One may surmise that if one’s dieting goal was activated one
might resist (or be closed minded to) information suggesting that high-calorie foods are tasty, and likewise, if one’s eating enjoyment goal was
activated, one might welcome such information while resisting informa-
tion that high-calorie foods are fattening.
Such processes of forming judgments likely apply to all types of goals
and information, including those that are politically relevant. For instance, if an individual’s goal is to affirm the importance of individual
rights, she might strongly object to ‘‘enhanced’’ interrogation procedures
seen as compromising those rights. This cognitive preference or goal
might prompt her to resist (and hence, be closed minded to) arguments
or information implying that human rights are secondary to security
concerns. However, if her security concerns were suddenly aroused, such as by a major terrorist attack in her vicinity, this might activate,
and increase the perceived importance of, the goal of maximizing
security. She may then be more open minded to the argument that
security concerns trump human-rights concerns.
Thus, although individuals may occasionally behave like ‘‘motivated
skeptics,’’ as Taber and Lodge (2006) clearly demonstrate, it would be
excessive to conclude that motivated skepticism constitutes a general
phenomenon, and that people invariably resist information inconsistent
with their current beliefs. Often, our beliefs can be contrary to our
desires (such as the belief that the economy is bad, that our health is in
jeopardy, that our career is stagnant); in those cases we may welcome
contradictory information and quickly accept it. Thus, our ‘‘informational skepticism’’ and ‘‘informational gullibility’’ depend on the relation
between the new information at hand and one’s cognitive goals at the
moment.
The Rationality Issue
An intriguing point implicit in Taber and Lodge’s argument is that
‘‘motivated skepticism’’ betokens people’s irrationality, and that it is
therefore inimical to the democratic ideal of an informed, fully rational
electorate. In the preceding sections, we have suggested that motivated
skepticism and closed mindedness are not necessarily general phenomena, and that often people can be open minded. But even when people
are closed minded, does that imply that they are irrational? It pays to
consider the concept of rationality and how it has been interpreted by
previous scholars.
In a recent analysis, Kruglanski and Edward Orehek (2009) have
noted that rationality has typically been interpreted in relation to both
means and ends. This approach assumes that a means is rational if it serves
the actor’s objective and is irrational if it does not. From the same
perspective, a goal may be considered rational if it is attainable by some
means and irrational if it is generally unattainable. Framing rationality
in terms of means and ends is compatible with grades of rationality,
whereby a means that serves the end better than another means is,
therefore, considered more ‘‘rational.’’ For instance, if a politician’s goal
is to be elected, making frequent appearances before a targeted audience
might be considered (by some pundit) a more rational means than merely
running TV ads geared at that audience, if the pundit assumes that the
former is more likely to lead to the politician being elected than the
latter.
In contrast, in Émile Durkheim’s work, individual rationality pertained to the
means used to achieve personal or selfish interests while social rationality
referred to means that serve the public good and larger societal interest.
Thus, according to Durkheim, religion is rational because it serves the
societal ends of cohesion and solidarity. ‘‘Religions are able by means
of ceremonies to assemble individuals, put them in direct contact with
each other, and bring forth the same ideas or sentiments amplified by
their reciprocal influences’’ (Segre 2008, 116). Similarly, for Max Weber
([1918] 1946, 117), only means can be rational, if they are instrumental
to ends; ends, however, are akin to faiths, neither rational (they cannot
be justified) nor irrational (they cannot be refuted). Herbert Simon
(1978, 2), too, viewed means as the locus of rationality: ‘‘Fundamental
(to the conception of rationality) are assumptions about adaptation of
means to ends, of actions to goals.’’ According to Simon (ibid., 3),
the concept of rationality in the social sciences is closely related to
functionality: ‘‘Behaviors are functional if they contribute to certain goals
where these goals may be the pleasure or satisfaction of an individual
or the guarantee of food and shelter for the members of a society.’’
A prominent example of such functionalism is Freud’s psychoanalytic
theory, which explains ‘‘the patient’s illness in terms of the functions it
performs for him’’ (ibid). Crucially, the concept of functionality does not
require the actor’s conscious awareness of the degree to which her or his
behaviors and feelings are means to certain (possibly inchoate) ends. For
instance, in Freudian terms a patient may be unaware that her or his
‘‘hysterical’’ symptoms afford secondary gains and are actually means of
getting attention or avoiding blame.
Finally, rational-choice theory, which has greatly affected several
fields of the social sciences, is also committed to the instrumentalist
conception of rationality (Kruglanski and Orehek 2009). Rational-choice models are premised on the assumption that individuals act in
their own best interests according to stable preferences and constraints
(March 1994). In other words, individuals’ choices are assumed to reflect
their striving to maximize benefits and minimize costs. In turn, an
action’s costs and benefits can be gauged in terms of an individual’s
various ends. A rational choice is one that maximizes outcomes by
choosing a means (i.e., making a decision) that leads to the most
important goal while simultaneously keeping alive possible alternative
goals. Rational-choice models often assume, unrealistically, that individuals possess perfect information (March 1994), or at least accurate
probabilistic notions about the outcomes of any given choice, and that
they have the time and the cognitive ability to weigh every choice
against every other choice.
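The instrumental conception described above can be sketched in a few lines of code: a ‘‘rational’’ choice is simply the option with the highest expected utility, given the actor’s goals. The options, probabilities, and payoffs below are invented for illustration (loosely echoing the politician example earlier), not taken from any rational-choice model in the literature.

```python
def expected_utility(outcomes):
    """outcomes: a list of (probability, payoff) pairs for one option."""
    return sum(p * u for p, u in outcomes)

def rational_choice(options):
    """Return the option name with the highest expected utility."""
    return max(options, key=lambda name: expected_utility(options[name]))

# Hypothetical decision problem: each option lists (probability, payoff)
# outcomes. All numbers are made up for the sketch.
options = {
    "campaign_appearances": [(0.6, 100), (0.4, -20)],  # EU = 52
    "tv_ads":               [(0.3, 100), (0.7, -20)],  # EU = 16
}

best = rational_choice(options)
```

Note how much the sketch presupposes: a fixed set of options, known probabilities, and known payoffs. This is exactly the perfect-information idealization the text identifies as unrealistic.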
Is Closed Mindedness Irrational?
Against this backdrop, we may now revisit the question of whether
closed mindedness is irrational. The answer would seem to depend on
whom one asks, and when one poses the question. Any behavior may be
considered ‘‘locally’’ and momentarily, from a first-person, intrapersonal perspective, to be instrumentally rational. When one makes a
decision, one does so because, in that moment, it appears to gratify one’s
most salient or dominant goal. In the foregoing sense, then, closed
mindedness as well as open mindedness may each be considered locally
rational in that they both serve the perceiver’s momentarily active
objectives. This view of rationality, which encompasses all actions, is
tautological, divesting the term irrationality of all meaning. Irrationality
claims can make sense, however, if one transcends the moment or the
actor’s momentary perspective. Specifically, from a third-person, interpersonal perspective (including the perspective of the decision maker
reflecting on her former self’s decision), a decision can be deemed
‘‘irrational.’’ Thus, were a critic to assume that the perceiver’s goal was,
or should have been, X (whereas in fact it was Y), he would judge the
perceiver’s action, Z, as inappropriate or irrational.
For instance, the critic might regard accuracy as the objective that the perceiver had, or should have had, and assume further that accuracy
is best served by openness to new information. From that perspective,
the critic might consider the actor’s defensive closed mindedness as
irrational. Alternatively, the critic might agree that Y was the goal but
adjudge action Za as an ineffective means compared to action Zb, and
hence as less rational than was possible, although the actor herself might
disagree that this is the case.
Finally, the actor herself might later regret her actions as ‘‘irrational,’’
that is, incongruous with an alternative, more important, objective that
she might have suppressed at the moment; or she may regard her
previous actions as ineffective (hence less rational) means to reaching her
objective.
Is Open Mindedness Rational?
If closed mindedness can be considered subjectively rational (at least
momentarily so), a companion question is whether open mindedness is
necessarily rational and whether it invariably contributes to the quality of
one’s outcomes. An affirmative answer to this question is fundamental
to the concept of an enlightened democracy, whereby a well-informed
electorate is seen to be superior to a less-informed one. Indeed, the view
that more information is better, in the sense that more information leads to a more accurate judgment, has been pervasive in the philosophy
of knowledge and in the cognitive sciences (for a recent review see
Kruglanski and Gigerenzer 2011). Rudolf Carnap (1947), for example, proposed the ‘‘principle of total evidence,’’ which advises one to use all of the available evidence to estimate the probability of a specific
outcome. Many theories of cognition (e.g., the Bayesian model and
prospect theory) similarly assume that all information is, or should be,
integrated into one’s final judgment (but see Gigerenzer and Brighton
2009). Leslie Zebrowitz McArthur and Reuben M. Baron’s (1983) suggestion that active perceivers are typically more accurate than passive perceivers could be interpreted as resting on the claim that active
exploration leads to the accumulation of greater amounts of information
than passive receptivity. Too, these authors’ notion of ‘‘sins of omission’’
refers to cases in which one overlooks crucial data because of attentional
selectivity or because the stimulus array is impoverished. In this view,
errors are due to limited information, caused by one’s inability to search
for it. Research in the ‘‘biases and heuristics’’ tradition in the judgment-
and-decision-making literature similarly assumes that using simple
heuristics is generally inferior to more extensive information processing
embodied in normative (e.g., Bayesian) or deliberative processes (e.g.,
Kahneman 2003; Kahneman and Tversky 1973; Tversky and Kahneman
1974; but see Gigerenzer and Goldstein 1996).
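The ‘‘integrate all the evidence’’ ideal that Carnap and the Bayesian tradition share can be illustrated with a minimal sketch of sequential Bayesian updating, in which every new datum adjusts the probability of a binary hypothesis. The prior and likelihood values below are invented numbers, not drawn from any study cited here.

```python
def update(prior, p_e_given_h, p_e_given_not_h):
    """One application of Bayes' rule for a binary hypothesis H,
    given one piece of evidence e."""
    numerator = prior * p_e_given_h
    return numerator / (numerator + (1 - prior) * p_e_given_not_h)

# Start undecided (prior = 0.5) and integrate three pieces of evidence,
# each twice as likely under H as under not-H (invented likelihoods).
p = 0.5
for _ in range(3):
    p = update(p, p_e_given_h=0.8, p_e_given_not_h=0.4)

# Each datum doubles the odds in favor of H: 1:1 -> 8:1, so p = 8/9.
```

On this normative picture no datum is ever ignored; the ‘‘less is more’’ findings discussed next question precisely that assumption.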
However, recent theoretical analysis and empirical findings suggest
that the view that ‘‘more (information) is better (for accuracy)’’ is not
unfailingly correct.1 First, information is generally mediated. Thus, the
new information one receives necessarily reflects someone’s framing,
which is inevitably subject to some kind of (motivational or cognitive)
bias. For all we know, our initial judgment might have been veridical
and the new information to which we might be open minded might lead
us astray (Kruglanski 1989; Kruglanski and Gigerenzer 2011). In other
words, ‘‘curiosity’’ or openness to new (potentially pernicious) information might result in highly negative outcomes.
There is growing evidence, too, that in some cases ‘‘frugal’’ heuristics
may outperform extensive and systematic aggregation of information. For
instance, following the pioneering work of Robyn M. Dawes (1979;
Dawes and Corrigan 1974), Jean Czerlinski, Gerd Gigerenzer, and Daniel
G. Goldstein (1999) used twenty studies to compare the ‘‘tallying heuristic’’ (where one tallies up information and assigns equal weight to each
datum, as opposed to assigning more or less weight to certain data) with
multiple linear regression (where one is concerned with the relationship
between different data, as opposed to their particular ‘‘weighting’’). The
authors found that, averaged across all studies, tallying achieved higher
predictive accuracy than multiple regression. This doesn’t mean that
tallying will outperform multiple regression in all circumstances. The
challenge for researchers is to delineate the tasks, or ‘‘informational
ecologies’’ (Fiedler 2007), under which each of these inferential rules
produces the more accurate predictions (see Einhorn and Hogarth
1975).
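The tallying rule is simple enough to sketch in a few lines of code. The fragment below is our own illustration, not drawn from the twenty studies cited: the cue values and direction signs are invented. It shows the essential difference from regression: every cue receives the same unit weight, and only the sign of its relation to the criterion is used.

```python
# Illustrative sketch of the "tallying" heuristic (hypothetical data).
# Each option is described by binary cues; directions[i] is +1 if cue i
# is positively related to the criterion, -1 if negatively related.

def tally(cues, directions):
    """Unit-weight prediction: sum the cues, each weighted only by its
    direction sign. No weights are estimated from data, in contrast to
    multiple regression, which fits a separate weight per cue."""
    return sum(d * c for c, d in zip(cues, directions))

# Two hypothetical options compared on three cues.
a = [1, 0, 1]
b = [0, 1, 0]
directions = [+1, +1, -1]  # assumed cue-criterion signs

score_a = tally(a, directions)  # 1 + 0 - 1 = 0
score_b = tally(b, directions)  # 0 + 1 - 0 = 1
prediction = "b" if score_b > score_a else "a"
print(prediction)  # -> b
```

Because tallying estimates nothing from the sample, it cannot overfit noisy data, which is one reason it can out-predict regression in small or noisy "informational ecologies."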
Various theories of social cognition and perception (Funder 1987;
McArthur and Baron 1983; Swann 1984) highlight the notion that simple
rules can yield accurate inferences in appropriate ecologies. The
ecological approach emphasizes that in their natural environments, humans
and animals generally draw accurate inferences, where accuracy is
defined in pragmatist terms as "that which works" (James 1907 and
1909), and where "less is more." To argue that heuristics necessarily lead
to errors, and that complex statistical rules are necessarily aligned with
rational judgments, is to miss the ecological nature of judgment. It also
misinterprets the adaptive use of less effortful rules and frugal heuristics as
signs of limited capacities and even irrationality.
Ecological judgment is particularly important in the realm of politi-
cal behavior. Research by Howard Lavine, Marco Steenbergen, and
Christopher Johnston (forthcoming) suggests that politically sophisticated
individuals are, if anything, more likely to reach biased judgments in
defense of their original position than their less-sophisticated counter-
parts, which parallels Taber and Lodge’s findings. Presumably, political
sophistication stems from having been exposed to greater amounts of
political information, suggesting that the political sophisticates were
open minded enough to have been receptive to that information. In
short, openness to information does not free one from bias.
228 Critical Review Vol. 24, No. 2
Given strong
enough motivations (i.e., goals of sufficient magnitude), any amount
of information can be twisted, reinterpreted, suppressed, or "spun,"
culminating in bias. Thus, sophisticated philosopher-kings can be as
vulnerable to motivational biases as the common folk, if not more so.
* * *
Essentially, then, we agree with Taber and Lodge that the idea of an
enlightened electorate is unrealistic. However, we disagree that this is
because people's minds are generally closed to new information; oftentimes,
they are not. Rather, as we have seen, the problem is that openness
to new information ensures neither the elimination nor even the
reduction of bias, and it does not guarantee objectively superior outcomes.
NOTE
1. One might object that more information means more accurate information,
where accuracy is defined as information that leads to better outcomes. Such
framing, however, renders tautological (or definitional) the assertion that more
information leads to better outcomes.
REFERENCES
Aronson, Elliot. 1992. "The Return of the Repressed: Dissonance Theory Makes a Comeback." Psychological Inquiry 3: 303–11.
Carnap, Rudolf. 1947. "On the Application of Inductive Logic." Philosophy and Phenomenological Research 8: 133–48.
Cooper, Joel, and Russell H. Fazio. 1984. "A New Look at Dissonance Theory." In Advances in Experimental Social Psychology, ed. Leonard Berkowitz. Hillsdale, N.J.: Lawrence Erlbaum.
Czerlinski, Jean, Gerd Gigerenzer, and Daniel G. Goldstein. 1999. "How Good Are Simple Heuristics?" In Simple Heuristics That Make Us Smart, ed. Gerd Gigerenzer, Peter M. Todd, and the ABC Research Group. New York: Oxford University Press.
Dawes, Robyn M. 1979. "The Robust Beauty of Improper Linear Models in Decision Making." American Psychologist 34: 571–82.
Dawes, Robyn M., and Bernard Corrigan. 1974. "Linear Models in Decision Making." Psychological Bulletin 81: 95–106.
Dijksterhuis, Ap, Ad van Knippenberg, Arie W. Kruglanski, and Carel Schaper. 1996. "Motivated Social Cognition: Need for Closure Effects on Memory and Judgment." Journal of Experimental Social Psychology 32: 254–70.
Ditto, Peter H., and David F. Lopez. 1992. "Motivated Skepticism: Use of Differential Decision Criteria for Preferred and Nonpreferred Conclusions." Journal of Personality and Social Psychology 63: 568–84.
Dunning, David. 1999. "A Newer Look: Motivated Social Cognition and the Schematic Representation of Social Concepts." Psychological Inquiry 10: 1–11.
Einhorn, Hillel J., and Robin M. Hogarth. 1975. "Unit Weighting Schemes for Decision Making." Organizational Behavior and Human Performance 13: 171–92.
Fiedler, Klaus. 2007. "Information Ecology and the Explanation of Social Cognition and Behavior." In Kruglanski and Higgins 2007.
Fishbach, Ayelet, and Melissa J. Ferguson. 2007. "The Goal Construct in Social Psychology." In Kruglanski and Higgins 2007.
Fishbach, Ayelet, James Y. Shah, and Arie W. Kruglanski. 2004. "Emotional Transfer in Goal Systems." Journal of Experimental Social Psychology 40: 723–38.
Fiske, Susan T. 1992. "Thinking Is for Doing: Portraits of Social Cognition from Daguerreotype to Laserphoto." Journal of Personality and Social Psychology 63: 877–89.
Funder, David C. 1987. "Errors and Mistakes: Evaluating the Accuracy of Social Judgment." Psychological Bulletin 101: 75–90.
Gigerenzer, Gerd, and Henry Brighton. 2009. "Homo Heuristicus: Why Biased Minds Make Better Inferences." Topics in Cognitive Science 1: 107–43.
Gigerenzer, Gerd, and Daniel Goldstein. 1996. "Reasoning the Fast and Frugal Way: Models of Bounded Rationality." Psychological Review 103: 650–69.
Harmon-Jones, Eddie, and Judson Mills. 1999. Cognitive Dissonance: Progress on a Pivotal Theory in Social Psychology. Washington, D.C.: American Psychological Association.
James, William. [1890] 1983. The Principles of Psychology, ed. George A. Miller. Cambridge, Mass.: Harvard University Press.
James, William. 1907. Pragmatism: A New Name for Some Old Ways of Thinking. New York: Longmans, Green & Co.
James, William. 1909. The Meaning of Truth. New York: Longmans, Green & Co.
Jamieson, David W., and Mark P. Zanna. 1989. "Need for Structure in Attitude Formation and Expression." In Attitude Structure and Function, ed. Anthony R. Pratkanis, Steven J. Breckler, and Anthony G. Greenwald. Hillsdale, N.J.: Lawrence Erlbaum.
Jervis, Robert. 1976. Perception and Misperception in International Politics. Princeton: Princeton University Press.
Kahneman, Daniel. 2003. "Maps of Bounded Rationality: Psychology for Behavioral Economics." American Economic Review 93: 1449–75.
Kahneman, Daniel, and Amos Tversky. 1973. "On the Psychology of Prediction." Psychological Review 80: 237–51.
Kruglanski, Arie W. 1989. "The Psychology of Being 'Right': On the Problem of Accuracy in Social Perception and Cognition." Psychological Bulletin 106: 395–409.
Kruglanski, Arie W. 1999. "Motivation, Cognition, and Reality: Three Memos for the Next Generation of Research." Psychological Inquiry 10: 54–58.
Kruglanski, Arie W. 2004. "The Quest for the Gist: On Challenges of Going Abstract in Social-Personality Psychology." Personality and Social Psychology Review 8: 156–63.
Kruglanski, Arie W., and T. Freund. 1983. "The Freezing and Un-Freezing of Lay-Inferences: Effects on Impressional Primacy, Ethnic Stereotyping and Numerical Anchoring." Journal of Experimental Social Psychology 19: 448–68.
Kruglanski, Arie W., and Gerd Gigerenzer. 2011. "Intuitive and Deliberate Judgments Are Based on Common Principles." Psychological Review 118: 97–109.
Kruglanski, Arie W., and E. Tory Higgins, eds. 2007. Social Psychology: Handbook of Basic Principles, 2nd ed. New York: Guilford Press.
Kruglanski, Arie W., and Catalina E. Kopetz. 2009a. "The Role of Goal-Systems in Self-Regulation." In Oxford Handbook of Human Action, ed. Ezequiel Morsella, John A. Bargh, and Peter M. Gollwitzer. Oxford: Oxford University Press.
Kruglanski, Arie W., and Catalina E. Kopetz. 2009b. "What Is So Special (and Non-Special) about Goals? A View from the Cognitive Perspective." In Moskowitz and Grant 2009.
Kruglanski, Arie W., and Edward Orehek. 2009. "Toward a Relativity Theory of Rationality." Social Cognition 27: 639–60.
Kruglanski, Arie W., Antonio Pierro, Lucia Mannetti, and Eraldo De Grada. 2006. "Groups as Epistemic Providers: Need for Closure and the Unfolding of Group Centrism." Psychological Review 113: 84–100.
Kruglanski, Arie W., and Donna M. Webster. 1996. "Motivated Closing of the Mind: 'Seizing' and 'Freezing.'" Psychological Review 103: 263–83.
Kruglanski, Arie W., Donna M. Webster, and Adena Klem. 1993. "Motivated Resistance and Openness to Persuasion in the Presence or Absence of Prior Information." Journal of Personality and Social Psychology 65: 861–76.
Kunda, Ziva. 1990. "The Case for Motivated Reasoning." Psychological Bulletin 108: 480–98.
Kunda, Ziva, and Lisa Sinclair. 1999. "Motivated Reasoning with Stereotypes: Activation, Application, and Inhibition." Psychological Inquiry 10: 12–22.
Lavine, Howard, Marco Steenbergen, and Christopher Johnston. Forthcoming. The Ambivalent Partisan: How Critical Loyalty Promotes Democracy. Oxford: Oxford University Press.
McArthur, Leslie Zebrowitz, and Reuben M. Baron. 1983. "Toward an Ecological Theory of Social Perception." Psychological Review 90: 215–38.
March, James G. 1994. A Primer on Decision Making. New York: Free Press.
Mayseless, Ofra, and Arie W. Kruglanski. 1987. "What Makes You So Sure? Effects of Epistemic Motivations on Judgmental Confidence." Organizational Behavior and Human Decision Processes 39: 162–83.
Moskowitz, Gordon B., and Heidi Grant, eds. 2009. The Psychology of Goals. New York: Guilford Press.
Padover, Saul K. 1939. Thomas Jefferson on Democracy. New York: Appleton-Century-Crofts.
Pierro, Antonio, and Arie W. Kruglanski. 2008. "'Seizing and Freezing' on a Significant Person Schema: Need for Closure and the Transference Effect in Social Judgment." Personality and Social Psychology Bulletin 34: 1492–1503.
Richter, Linda, and Arie W. Kruglanski. 1999. "Motivated Search for Common Ground: Need for Closure Effects on Audience Design in Interpersonal Communication." Personality and Social Psychology Bulletin 25: 1101–14.
Segre, Sandro. 2008. "Durkheim on Rationality." Journal of Classical Sociology 8: 109–44.
Simon, Herbert A. 1978. "Rationality as Process and as Product of Thought." American Economic Review 68: 1–16.
Sinclair, Lisa, and Ziva Kunda. 1999. "Reactions to a Black Professional: Motivated Inhibition and Activation of Conflicting Stereotypes." Journal of Personality and Social Psychology 77: 885–904.
Steele, Claude M. 1988. "The Psychology of Self-Affirmation: Sustaining the Integrity of the Self." In Advances in Experimental Social Psychology, ed. Leonard Berkowitz. Hillsdale, N.J.: Lawrence Erlbaum.
Swann, William B., Jr. 1984. "Quest for Accuracy in Person Perception: A Matter of Pragmatics." Psychological Review 91: 457–77.
Taber, Charles S., and Milton Lodge. 2006. "Motivated Skepticism in the Evaluation of Political Beliefs." American Journal of Political Science 50: 755–69.
Tversky, Amos, and Daniel Kahneman. 1974. "Judgment under Uncertainty: Heuristics and Biases." Science 185: 1124–31.
Vertzberger, Yaacov. 1990. The World in Their Minds: Information Processing, Cognition, and Perception in Foreign Policy Decisionmaking. Stanford: Stanford University Press.
Weber, Max. [1918] 1946. "Politics as a Vocation." In From Max Weber: Essays in Sociology, ed. H. H. Gerth and C. Wright Mills. New York: Oxford University Press.
Webster, Donna M. 1993. "Motivated Augmentation and Reduction of the Over-Attribution Bias." Journal of Personality and Social Psychology 65: 261–71.
Webster-Nelson, Donna, Cynthia T. F. Klein, and Jennifer E. Irvin. 2003. "Motivational Antecedents of Empathy: Inhibiting Effects of Fatigue." Basic and Applied Social Psychology 25: 37–50.